Problems filling a load-testing database

  • Hello!

    I would like some input on the following:

    We are trying to do some load testing on one of our applications, so we are building a massive database like the one some of our customers might have after many years of use.

    We have several tables. In two of them we will have close to 2 million rows, and in two others closer to 120 million rows.

    We have written a simple C# application to insert the data into Microsoft SQL Server. In each loop we insert about 600 rows in total (these rows are spread throughout the database, with only a few going into each of the tables above). A stripped-down sketch of the loop is at the end of this post.

    We use GUIDs as primary keys, and they are indexed (though not with clustered indexes).

    Our problem is that we only complete about 7,000 of these loops ("posts") per hour, and at that rate inserting 2 million of them will take far too long.

    Any ideas?
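
    For reference, the loop is shaped roughly like this, cut down to a single table; the table, column, and connection details here are made up for this post:

        // Rough shape of our loader, simplified to one table. In the real
        // application the ~600 rows per loop are spread over several tables.
        using System;
        using System.Data.SqlClient;

        class LoaderSketch
        {
            static void Main()
            {
                using (SqlConnection conn = new SqlConnection(
                    "Server=.;Database=LoadTest;Integrated Security=true"))
                {
                    conn.Open();

                    for (int loop = 0; loop < 7000; loop++)   // one "post" per loop
                    {
                        for (int row = 0; row < 600; row++)   // ~600 rows per post
                        {
                            using (SqlCommand cmd = new SqlCommand(
                                "INSERT INTO dbo.Orders (OrderId, Customer) " +
                                "VALUES (@id, @name)", conn))
                            {
                                // GUID primary key, generated client-side
                                cmd.Parameters.AddWithValue("@id", Guid.NewGuid());
                                cmd.Parameters.AddWithValue("@name", "customer " + row);
                                cmd.ExecuteNonQuery();
                            }
                        }
                    }
                }
            }
        }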

  • Use your app to generate flat files, then use BCP or BULK INSERT to import them.

    Should be faster than a very fast thing

    Also, drop the indexes, foreign keys, primary keys, etc. before the insert for maximum load speed, then re-add them after population is complete; see the sketch below.
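
    Something along these lines, for example. The table name, constraint name, file path, and connection string are all placeholders for whatever your schema actually uses, and note that the file path is read by the SQL Server service, so it has to be visible from the server machine:

        // Sketch only: generate a delimited flat file, drop the key,
        // bulk load, then re-add the key. All object names are placeholders.
        using System;
        using System.Data.SqlClient;
        using System.IO;

        class BulkLoadSketch
        {
            static void Main()
            {
                // 1. Write the rows to a flat file instead of INSERTing
                //    them one at a time.
                using (StreamWriter w = new StreamWriter(@"C:\load\orders.dat"))
                {
                    for (int i = 0; i < 1000000; i++)
                    {
                        w.WriteLine(Guid.NewGuid() + "|customer " + i);
                    }
                }

                using (SqlConnection conn = new SqlConnection(
                    "Server=.;Database=LoadTest;Integrated Security=true"))
                {
                    conn.Open();

                    // 2. Drop the primary key (and any other indexes) so the
                    //    load does not have to maintain them row by row.
                    Exec(conn, "ALTER TABLE dbo.Orders DROP CONSTRAINT PK_Orders");

                    // 3. Bulk load the file. TABLOCK takes a table lock and
                    //    allows minimal logging where the recovery model permits.
                    Exec(conn, @"BULK INSERT dbo.Orders
                                 FROM 'C:\load\orders.dat'
                                 WITH (FIELDTERMINATOR = '|',
                                       ROWTERMINATOR = '\n',
                                       TABLOCK)");

                    // 4. Re-add the key once, after all the data is in.
                    Exec(conn, @"ALTER TABLE dbo.Orders
                                 ADD CONSTRAINT PK_Orders
                                 PRIMARY KEY NONCLUSTERED (OrderId)");
                }
            }

            static void Exec(SqlConnection conn, string sql)
            {
                using (SqlCommand cmd = new SqlCommand(sql, conn))
                {
                    cmd.CommandTimeout = 0; // these can run for a long time
                    cmd.ExecuteNonQuery();
                }
            }
        }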

  • Thanks for the advice.

    I tried BULK INSERT with a little test application, and it looks like I will manage to insert my data in hours instead of weeks.

    /Joel

  • Be wary, though, if you drop all the indexes and primary keys as suggested: the time you save up front inserting data may be spent recreating what you dropped.

    Besides, your original timing already looks pretty respectable: 7,000 loops at 600 inserts per loop works out to 4,200,000 rows inserted per hour. No matter how you slice it, big is big.

    Regards,
    Rudy Komacsar
    Senior Database Administrator
    "Ave Caesar! - Morituri te salutamus."
