Execute a large number of insert statements

  • I'm trying to insert thousands of rows into a table and don't want to use the SSMS query window because it will cause a lot of blocking.

    I thought about using bcp or BULK INSERT, but the file I'm trying to execute is a SQL file, not a flat file, and is not comma- or tab-delimited.

    The format is as follows, repeated for thousands of records: INSERT INTO table_1 (field1, field2) values ('a', 'b')

    Is there another solution to run more efficiently without using SSMS query window?

    Thanks in advance

  • Why do you say using the SSMS window will cause lots of blocking?

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • I'd probably insert the thousands of rows into a table variable first, then insert the whole set in one shot. Either way it should take no more than a second or two, unless there is a heavy-duty insert trigger.
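A minimal sketch of that approach (table_1 and its columns come from the first post; the variable name and column types are assumptions):

```sql
-- Stage the incoming rows in a table variable first.
DECLARE @staging TABLE (field1 varchar(10), field2 varchar(10));

INSERT INTO @staging (field1, field2) VALUES ('a', 'b');
INSERT INTO @staging (field1, field2) VALUES ('c', 'd');
-- ... the remaining thousands of rows ...

-- Then touch the live table once, with a single set-based insert,
-- so any locks on it are held only briefly.
INSERT INTO table_1 (field1, field2)
SELECT field1, field2 FROM @staging;
```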

  • FWIW, a few thousand rows shouldn't cause much blocking any way you do it.

    If the file is really just a bunch of insert statements that you're trying to run, I would be inclined to use find/replace to alter the inserts so that they target not (what I'm assuming is) your presentation table, but a temp table instead (e.g. #myStagingTable). That way the work involved in putting all the rows together is done in tempdb, not on the live table.

    From there, you have several options. You could simply insert directly from that table, or do some sort of batching over the staged data set, inserting row by row or in moderately larger batches.
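One way to sketch the batched option, assuming the #myStagingTable created by the find/replace step (the batch size of 5,000 is an arbitrary choice):

```sql
-- Staging table in tempdb, same shape as the live table.
CREATE TABLE #myStagingTable (field1 varchar(10), field2 varchar(10));

-- (the find/replaced INSERT statements run here, against #myStagingTable)

-- Move the staged rows across in moderate batches, so no single
-- transaction against the live table runs for long.
WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM #myStagingTable
    OUTPUT deleted.field1, deleted.field2
    INTO table_1 (field1, field2);

    IF @@ROWCOUNT = 0 BREAK;  -- staging table is empty
END;
```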

    Executive Junior Cowboy Developer, Esq.

  • I've found that files containing INSERT INTO ... statements are painfully slow in SSMS once they hit, say, 40K rows or more, stretching simple inserts into minutes of waiting.

    I'd prefer BULK INSERT over insert statements for anything substantial.

    I'd also go with the previous recommendations of a staging table first, before inserting into the live table, but that's because I'd like to review or clean the data before inserting into the destination, and a staging/temp table provides that ability.
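For the BULK INSERT route, the data would first need to be in a flat file rather than a script of insert statements; a sketch, where the file path and all the options are illustrative assumptions:

```sql
-- Assumes the values have been exported to a comma-delimited file.
BULK INSERT #myStagingTable
FROM 'C:\data\rows.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 10000,  -- commit every 10,000 rows
    TABLOCK                   -- allows minimal logging where eligible
);
```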

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement for that table, to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • A few thousand rows? Just use the INSERT statements.

    A hundred thousand rows? BCP or BULK INSERT - that many INSERT statements at once often hit memory issues (even using sqlcmd instead of SSMS).

    Definitely put things into a #temp table first, apply your check constraints/unique indexes/etc. and other validation processes, then put the good data into final tables.
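Sketching that validation step (the constraint, index, and check conditions are placeholders; the real ones would mirror the rules on the final table):

```sql
-- Load into a temp table first, same shape as the final table.
CREATE TABLE #staging (field1 varchar(10), field2 varchar(10));

-- ... BULK INSERT or the INSERT script populates #staging ...

-- Apply the same rules the final table enforces, so bad rows fail here.
ALTER TABLE #staging ADD CONSTRAINT ck_field1_notblank CHECK (field1 <> '');
CREATE UNIQUE INDEX ux_staging ON #staging (field1, field2);

-- Only validated data reaches the final table.
INSERT INTO table_1 (field1, field2)
SELECT field1, field2 FROM #staging;
```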

  • this query:

    create table #orders (ordernumber int, orderdate datetime)

    insert #orders
    select ordernumber, orderdatetime
    from tblOrderHeader
    where CatalogYear = 2013

    inserted 134,057 rows in 1 second from SSMS. Does it need to be much faster than that?

    Wallace Houston
    Sunnyland Farms, Inc.

    "We must endeavor to persevere."

  • Do you mean you want to insert data into a SQL table from another SQL table in the same database?

    if so:

    Insert table#1 (field1, field2)
    select field1, field2 from table#2

    BCP or bulk copy is used to get data into or out of a database from (or to) a flat file. If you are inserting into a table in the same database, there is no better way than the INSERT statement.

    Wallace Houston
    Sunnyland Farms, Inc.

    "We must endeavor to persevere."
