How to bulk insert (or another way) into a SQL table from a DataTable inside a stored procedure

  • I passed a .NET DataTable from a .NET app to a stored procedure.

    From inside this stored procedure, how do I code a bulk insert (or use another method) into a SQL table?

  • Here's an example:

    public void WriteDataTable(
        String ServerInstance,
        String Database,
        String TableName,
        DataTable Data,
        int BatchSize,
        int QueryTimeout,
        int ConnectionTimeout)
    {
        String ConnectionString = "Server={0};Database={1};Integrated Security=True;Connect Timeout={2}";
        ConnectionString = String.Format(ConnectionString, ServerInstance, Database, ConnectionTimeout);

        using (SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(ConnectionString,
            SqlBulkCopyOptions.KeepIdentity |
            SqlBulkCopyOptions.FireTriggers |
            SqlBulkCopyOptions.CheckConstraints |
            SqlBulkCopyOptions.TableLock))
        {
            bulkCopy.DestinationTableName = TableName;
            bulkCopy.BatchSize = BatchSize;
            bulkCopy.BulkCopyTimeout = QueryTimeout;
            bulkCopy.WriteToServer(Data);
        }
    }

    -- Gianluca Sartori

  • Thank you for the help, but I think this code runs in the app, not in a SQL stored procedure.

    My app already creates the DataTable and passes it to the SQL stored procedure.

    I need the stored procedure to insert the data from that DataTable into a SQL table, so that the end user can use it.

  • If you already managed to pass a whole datatable to a stored procedure, I guess you are using a Table-Valued Parameter.

    Simply insert from the TVP to the destination table:

    INSERT INTO destinationTable
    SELECT SomeColumns
    FROM @TableValuedParameter

    It sounds too simple to be the answer you are looking for... can you please share some more info on what you're after? Some portions of the stored procedure maybe?
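
    To make that concrete, here is a minimal sketch of the TVP pattern (every object and column name here is hypothetical):

    -- Table type matching the shape of the DataTable being passed in
    CREATE TYPE dbo.ImportRowType AS TABLE
    (
        Column1 varchar(100),
        Column2 varchar(100)   -- ...one entry per DataTable column
    );
    GO

    -- Stored procedure that receives the TVP and inserts into the destination
    CREATE PROCEDURE dbo.usp_ImportRows
        @Rows dbo.ImportRowType READONLY
    AS
    BEGIN
        SET NOCOUNT ON;

        -- An explicit column list maps the TVP's generic columns
        -- onto differently named destination columns
        INSERT INTO dbo.DestinationTable (RealName1, RealName2)
        SELECT Column1, Column2
        FROM @Rows;
    END;

    On the .NET side, the DataTable is passed as a SqlParameter with SqlDbType.Structured whose TypeName matches the table type.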

    -- Gianluca Sartori

  • The real table has 300 columns with different column names and data types, but the DataTable has 300 columns named column1, column2 ... column300 (imported from a CSV file).

    Is it possible to use bulk insert?

  • There's a BULK INSERT command in SQL Server. You can read about it here: http://msdn.microsoft.com/en-us/library/ms188365.aspx

    To add functionality using format files, you can read the following article:

    http://www.sqlservercentral.com/articles/BCP+(Bulk+Copy+Program)/105867/
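
    For example, a plain CSV load looks like the sketch below (paths and table names are hypothetical; the file must be readable from the SQL Server machine). When the file layout doesn't match the table, a format file can be supplied via the FORMATFILE option instead of the terminator settings:

    BULK INSERT dbo.StagingTable
    FROM 'C:\import\data.csv'
    WITH
    (
        FIELDTERMINATOR = ',',   -- column delimiter in the file
        ROWTERMINATOR   = '\n',  -- row delimiter
        FIRSTROW        = 2,     -- skip the header row
        TABLOCK                  -- allows minimal logging where eligible
    );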

    -- Luis C.
  • Luis Cazares (11/4/2014)


    There's a BULK INSERT command in SQL Server. You can read about it in here: http://msdn.microsoft.com/en-us/library/ms188365.aspx

    To add functionality using format files, you can read the following article:

    http://www.sqlservercentral.com/articles/BCP+(Bulk+Copy+Program)/105867/

    This functionality is pretty powerful and you might not even have to get .NET involved. The throughput of this approach is pretty incredible.

    When I do this, I normally import into a staging table, where I validate the data and make any necessary adjustments. This isolates your incoming data from what's in production and you won't be hitting your production tables with a bunch of scans and updates while users are trying to use it. It also gives you an "out" without ever touching production if something necessitates that you abort. Once you're done, you can insert directly into your production tables.
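
    A rough sketch of that staging workflow (all names hypothetical; TRY_CONVERT requires SQL Server 2012 or later):

    BEGIN TRY
        BEGIN TRANSACTION;

        -- Validation step: drop rows that would violate production rules
        DELETE FROM dbo.StagingTable
        WHERE TRY_CONVERT(int, Column1) IS NULL;   -- e.g. Column1 must be an int

        -- Promote the clean rows into the production table
        INSERT INTO dbo.ProductionTable (Id, Name)
        SELECT TRY_CONVERT(int, Column1), Column2
        FROM dbo.StagingTable;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;   -- the "out": abort without touching production
        THROW;
    END CATCH;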

  • Ed Wagner (11/4/2014)



    +1000

    --Jeff Moden


