Lots of inserts... best way to insert them?

  • I have a question on the best way to do something.

    I'm monitoring some metrics and want to keep a history, probably no more than a day's worth, but I'm checking every 2-5 seconds and there could be 50-100 metrics.

    I'm thinking an insert per metric every 2 seconds could be quite inefficient; I'd be doing 100 or so INSERTs every couple of seconds.

    So I'm trying to think of the best way around this. I could UNION the inserts together, since they're all going into the same history table anyway:

    INSERT INTO mytable
    SELECT 'blah blah' UNION ALL
    SELECT 'blah blah' UNION ALL
    etc.

    But it just doesn't seem to me that it'd be any better.

    Or I could save up the inserts for a minute and burst them out, either as straight inserts or whatever; it'd only run once a minute, but with many more values each time... ugh.

    FYI: I'm using a .NET app to collect the metrics and do the inserts.

    Any ideas on how to do this better would be appreciated. I'm sure there's something easier, but I've been out in the sun today and I think I've cooked my brain a little.

    --------------------------

    I long for a job where my databases dont have any pesky users accessing them 🙂

  • It won't be the inserts that are inefficient; querying that data is what's most likely to present problems...

    SQL Server, even on modest hardware, can handle lots of simple inserts; I'd just do them individually. Make sure your data and/or index files have enough room so that you aren't trying to grow the file(s) as part of the insert transaction, and I suspect you'll be fine.
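    For example, here's a minimal sketch of pre-growing the files up front; the database and logical file names are assumptions, so check sys.database_files for yours:

    -- Pre-grow the data and log files once, so autogrow doesn't fire
    -- in the middle of an insert transaction (names are hypothetical).
    ALTER DATABASE MetricsDB MODIFY FILE (NAME = MetricsDB, SIZE = 2GB);
    ALTER DATABASE MetricsDB MODIFY FILE (NAME = MetricsDB_log, SIZE = 512MB);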

    /*****************

    If most people are not willing to see the difficulty, this is mainly because, consciously or unconsciously, they assume that it will be they who will settle these questions for the others, and because they are convinced of their own capacity to do this. -Friedrich August von Hayek

    *****************/

  • I'd collect the data in a comma-separated list (comma at a fixed position, e.g. every 10th character) up to a size of approx. 4K (resulting in about 400 values), pass that list to a stored procedure, use a Tally-table-based split function to get the list back into a table structure, and insert the values from there.

    References: Tally table & string split
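    A rough sketch of the idea, assuming the fixed-width layout described above; the procedure, table, and column names (dbo.InsertMetricBatch, dbo.MetricHistory, MetricValue) are made up for illustration:

    -- Split a fixed-width, comma-separated list (one value per 10 characters,
    -- comma always in the 10th position) and insert all values in one statement.
    CREATE PROCEDURE dbo.InsertMetricBatch
        @List varchar(4000)
    AS
    BEGIN
        SET NOCOUNT ON;

        WITH E1(N) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) t(N)),
             Tally(N) AS (SELECT TOP (LEN(@List) / 10 + 1)
                                 ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1
                          FROM E1 a, E1 b, E1 c)  -- 1,000 rows, plenty for ~400 values
        INSERT INTO dbo.MetricHistory (MetricValue)
        SELECT RTRIM(SUBSTRING(@List, N * 10 + 1, 9))  -- 9 chars of value, then the comma
        FROM Tally
        WHERE N * 10 < LEN(@List);
    END;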



    Lutz
    A pessimist is an optimist with experience.

    How to get fast answers to your question
    How to post performance related questions
    Links for Tally Table, Cross Tabs and Dynamic Cross Tabs, Delimited Split Function

  • If storing the values to a file is an option, I'd set it up so that a new file is created every x minutes or when it gets to x bytes. Then you could use SSIS to stuff the data into the database.

    Obviously, if you took this approach, you'd need to build some sort of file maintenance and tracking into your SSIS application, but that's pretty easy to do.
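    If SSIS feels like more machinery than you want, the same rolled-over files could also be loaded with a plain BULK INSERT; the table name and path below are placeholders:

    -- Load one completed metrics file in a single bulk operation.
    BULK INSERT dbo.MetricHistory
    FROM 'C:\metrics\metrics_batch_001.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);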

    /*****************

    If most people are not willing to see the difficulty, this is mainly because, consciously or unconsciously, they assume that it will be they who will settle these questions for the others, and because they are convinced of their own capacity to do this. -Friedrich August von Hayek

    *****************/
