Store 100,000 actions per day in SQL Server!

  • In a previous application I stored about 100,000 actions per day (coming from an external system) in an Access database. To keep performance acceptable I created a new Access database every day, and each month I automatically copied all of the databases into a new 'month' folder as a backup.

    How is this done in SQL Server?

    Do I have to use a different kind of method?

  • It depends. What is your actual goal for the data? I have used several different methods, including two input tables with a single stored procedure that flips nightly so I can move the data to its ultimate location. I have also used replication where I disabled deletes. It just depends on your ultimate goal.

  • My application is used for storing actions from an external system and viewing these actions in a grid. A kind of history, to see what happened with the external system on a specific day of the week. Because I'm receiving 100,000 actions each day, I'm afraid the application will be too slow if all the actions for one month are in one database.

  • OK, then here is what I have done when I want no interference and there is no real-time need (a rough T-SQL sketch of the whole setup follows at the end of this post).

    I have 2 tables in my capture database

    CaptureTable1

    CaptureTable2

    I chose to use a Stored Procedure but you can use other methods.

    StoredProcedure1 is set up to insert all incoming data into CaptureTable1.

    StoredProcedure2 is set up to read all the data from CaptureTable1.

    StoredProcedure3 is set up to truncate CaptureTable1.

    Just before midnight a job runs that changes StoredProcedure1 to use CaptureTable2.

    Then the job has a step with a DTS package to move all the data from CaptureTable1 using StoredProcedure2.

    The third step is to run StoredProcedure3.

    The last step is to change StoredProcedure2 and StoredProcedure3 to point to CaptureTable2.

    The flip is controlled by a management table, which is used to note that each task has been completed and where the process is overall.

    The next night the process begins again but flips SP1, SP2 and SP3 to use CT1 instead of CT2, and so on.

    This setup allows me to handle over 200k transactions a day in a multi-switch call center to report on users' status throughout the day.
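
    Purely as illustration, here is a minimal T-SQL sketch of that kind of flip. The CaptureTable and StoredProcedure names come from the description above, but the columns (ActionId, ActionTime, Payload), the ActionHistory destination table, the CaptureFlip management table and the NightlyFlip procedure are placeholders I have made up, and repointing the procedures with dynamic ALTER PROCEDURE is only one of several ways to do the switch:

    -- Two identical capture tables plus a destination table for the moved data.
    -- Column names are placeholders, not from the original post.
    CREATE TABLE dbo.CaptureTable1 (
        ActionId   INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        ActionTime DATETIME NOT NULL DEFAULT GETDATE(),
        Payload    VARCHAR(4000) NOT NULL
    );
    CREATE TABLE dbo.CaptureTable2 (
        ActionId   INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        ActionTime DATETIME NOT NULL DEFAULT GETDATE(),
        Payload    VARCHAR(4000) NOT NULL
    );
    CREATE TABLE dbo.ActionHistory (      -- ultimate location of the data
        ActionId   INT NOT NULL,
        ActionTime DATETIME NOT NULL,
        Payload    VARCHAR(4000) NOT NULL
    );
    GO

    -- StoredProcedure1: the external system calls this for every action.
    CREATE PROCEDURE dbo.StoredProcedure1 @Payload VARCHAR(4000)
    AS
        INSERT INTO dbo.CaptureTable1 (Payload) VALUES (@Payload);
    GO

    -- StoredProcedure2: reads the captured day so it can be moved.
    CREATE PROCEDURE dbo.StoredProcedure2
    AS
        SELECT ActionId, ActionTime, Payload FROM dbo.CaptureTable1;
    GO

    -- StoredProcedure3: empties the capture table after the move.
    CREATE PROCEDURE dbo.StoredProcedure3
    AS
        TRUNCATE TABLE dbo.CaptureTable1;
    GO

    The job that runs just before midnight could then be a single procedure (or separate job steps) driven by the management table. Again just a sketch; the post above uses a DTS package for the move step rather than a plain INSERT ... SELECT:

    -- Management table recording which capture table is currently active.
    CREATE TABLE dbo.CaptureFlip (
        ActiveTable SYSNAME  NOT NULL,
        LastFlip    DATETIME NOT NULL
    );
    INSERT INTO dbo.CaptureFlip VALUES (N'CaptureTable1', GETDATE());
    GO

    -- Nightly flip, run as a SQL Agent job just before midnight.
    CREATE PROCEDURE dbo.NightlyFlip
    AS
    BEGIN
        DECLARE @old SYSNAME, @new SYSNAME, @sql NVARCHAR(MAX);

        SELECT @old = ActiveTable FROM dbo.CaptureFlip;
        SET @new = CASE @old WHEN N'CaptureTable1' THEN N'CaptureTable2'
                             ELSE N'CaptureTable1' END;

        -- Step 1: repoint the insert procedure so new actions go to the other table.
        SET @sql = N'ALTER PROCEDURE dbo.StoredProcedure1 @Payload VARCHAR(4000)
                     AS INSERT INTO dbo.' + QUOTENAME(@old === @new ? @new : @new) + N' (Payload) VALUES (@Payload);';
        EXEC sys.sp_executesql @sql;

        -- Step 2: move the finished day to its ultimate location.
        SET @sql = N'INSERT INTO dbo.ActionHistory (ActionId, ActionTime, Payload)
                     SELECT ActionId, ActionTime, Payload FROM dbo.' + QUOTENAME(@old) + N';';
        EXEC sys.sp_executesql @sql;

        -- Step 3: empty the table that was just copied.
        SET @sql = N'TRUNCATE TABLE dbo.' + QUOTENAME(@old) + N';';
        EXEC sys.sp_executesql @sql;

        -- Step 4: repoint the read/truncate procedures and record the flip.
        SET @sql = N'ALTER PROCEDURE dbo.StoredProcedure2
                     AS SELECT ActionId, ActionTime, Payload FROM dbo.' + QUOTENAME(@new) + N';';
        EXEC sys.sp_executesql @sql;
        SET @sql = N'ALTER PROCEDURE dbo.StoredProcedure3
                     AS TRUNCATE TABLE dbo.' + QUOTENAME(@new) + N';';
        EXEC sys.sp_executesql @sql;

        UPDATE dbo.CaptureFlip SET ActiveTable = @new, LastFlip = GETDATE();
    END
    GO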

  • Thanks!

    But won't I have performance problems? After one month I will have 3,000,000 actions, after one year 36,000,000, and so on.

  • You should not see an issue on the collection side (as long as the collection side is on another server). You might, though, on the reporting side, and that is where you have to decide what is acceptable for that server. I would consider what you have to retain for reporting and trending and remove anything you won't actively need regularly. It could also be just a hardware issue. Without knowing a lot more detail about the reporting side of the process, I have no way to tell you what should work best.
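
    As a rough illustration of that kind of pruning, a retention job on the reporting table could delete old rows in batches so it doesn't hold locks for long. The ActionHistory table, the 13-month window and the batch size are my assumptions, not from the thread, and the DELETE TOP syntax needs SQL Server 2005 or later:

    -- Sketch of a retention purge: remove rows older than 13 months
    -- in small batches to limit blocking and log growth.
    DECLARE @cutoff DATETIME;
    DECLARE @rows INT;

    SET @cutoff = DATEADD(MONTH, -13, GETDATE());
    SET @rows = 1;

    WHILE @rows > 0
    BEGIN
        DELETE TOP (10000) FROM dbo.ActionHistory
        WHERE ActionTime < @cutoff;

        SET @rows = @@ROWCOUNT;
    END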

  • There's no issue with a table with 36M rows, or even 12 * 36M rows. However, if you want easy reporting, you might look at partitioning and switching out partitions every month to help reporting, if you report on a time basis.
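
    A hedged sketch of that monthly partitioning idea (SQL Server 2005 or later table partitioning). It reuses the ActionHistory placeholder from the earlier sketch, the boundary dates are just examples, and a real layout would usually spread the partitions over separate filegroups instead of putting everything on PRIMARY:

    -- Partition the history table by month so a whole month can be
    -- switched out almost instantly instead of being deleted row by row.
    CREATE PARTITION FUNCTION pfActionMonth (DATETIME)
    AS RANGE RIGHT FOR VALUES ('20080101', '20080201', '20080301');

    CREATE PARTITION SCHEME psActionMonth
    AS PARTITION pfActionMonth ALL TO ([PRIMARY]);

    CREATE TABLE dbo.ActionHistory (
        ActionId   INT           NOT NULL,
        ActionTime DATETIME      NOT NULL,
        Payload    VARCHAR(4000) NOT NULL
    ) ON psActionMonth (ActionTime);

    -- Empty table with the same structure, used as the switch target.
    CREATE TABLE dbo.ActionHistoryArchive (
        ActionId   INT           NOT NULL,
        ActionTime DATETIME      NOT NULL,
        Payload    VARCHAR(4000) NOT NULL
    ) ON [PRIMARY];

    -- Move, for example, the second month out of the reporting table;
    -- this is a metadata-only operation, so it is effectively instant.
    ALTER TABLE dbo.ActionHistory SWITCH PARTITION 2 TO dbo.ActionHistoryArchive;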
