April 14, 2008 at 7:59 am
First of all, I'll admit that this may be something better addressed to a VB forum, but I've always found great help and support with SQL-related issues here. That being said, here goes...
I have been assigned to a project where we receive a monthly spreadsheet from several vendors containing update, addition, and deletion information about their customers. The data in these spreadsheets rarely comes in the same format (column headings, number of columns, etc. change all the time).
I have created a VB application that reads the CSV file into a data grid, and displays the results on a form.
Once the data is in the grid, I read the columns to create a script that generates a SQL table, then read the rows of the grid, building a SQL script to insert them into the newly created table.
No problem to this point, except that the SQL database is located on a linked server, and inserting 2,500 or more rows in this fashion can be rather slow.
I would like to modify this process to use an INSERT INTO destTable (field1, field2, ...) SELECT field1, field2, ... FROM [myTable], which I assume would be a much quicker method of moving the data into the SQL table (right?).
My DataGridView has been created at runtime and does not have a DataSet, TableAdapter, or DataTable associated with it. I cannot use the DataGrid object, as it's not truly a DataTable that can be referred to in the SQL script. Has anyone uncovered a method to generate a DataSet, DataTable, etc. based on my DataGridView object?
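The best I've come up with so far is walking the grid cell-by-cell into a new DataTable, roughly like the sketch below (myGrid stands in for my runtime-built DataGridView), but I don't know whether that's the right approach:

Dim dt As New DataTable
' One DataColumn per grid column, named from the header text
For Each col As DataGridViewColumn In myGrid.Columns
    dt.Columns.Add(col.HeaderText)
Next
' One DataRow per grid row, skipping the empty "new row" placeholder
For Each gridRow As DataGridViewRow In myGrid.Rows
    If Not gridRow.IsNewRow Then
        Dim dr As DataRow = dt.NewRow()
        For i As Integer = 0 To myGrid.Columns.Count - 1
            dr(i) = gridRow.Cells(i).Value
        Next
        dt.Rows.Add(dr)
    End If
Next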
April 15, 2008 at 5:28 am
If the linked server is not on a fast connection, it's the latency that makes it slow - each insert waits for a reply. So, you can insert the data into a local server first and then transfer the whole table to the linked server in a single set-based statement.
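For example, something along these lines - the server, database, and table names here are made up, so substitute your own:

' Load the staging table locally (fast), then push everything across
' the linked server in one round trip instead of one per row.
Dim sql As String = _
    "INSERT INTO LinkedSrv.VendorDb.dbo.Customers (Field1, Field2) " & _
    "SELECT Field1, Field2 FROM dbo.ImportStaging;"
Using conn As New SqlConnection("your local connection string")
    conn.Open()
    Using cmd As New SqlCommand(sql, conn)
        cmd.ExecuteNonQuery()
    End Using
End Using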
April 15, 2008 at 11:30 am
The function below moves 26,000 records at around 6 MB into a table on our server in less than 2 seconds, using ADO.NET's bulk copy object. The destination table contains all varchar fields in the same order as those in the .csv file. After the data has been imported, I run a stored proc to clean it up and cast fields to the proper type as I move the data to a live table. I scraped this single function together from different tiers in my app, but it should run okay as it is below. All of our .csv data is in the format: "xxxxx","xxxxxxx","xxxxx"....
What you could do is create a generic import table for your .csv imports, then process the generic table with the appropriate stored procedure depending upon the type of spreadsheet sent by your vendor.
Hope this helps.
Imports System.Data
Imports System.Data.SqlClient
Imports System.IO

Public Sub BulkCopy()

    Dim myConnectionString As String = "your connection string"
    Dim myConnection As SqlConnection = New SqlConnection(myConnectionString)
    Dim csvInputFile As String = "DataToLoad.csv"
    Dim dt As New DataTable

    ' Get the contents of the file
    Dim sr As New StreamReader(csvInputFile)
    Dim fullFileStr As String = sr.ReadToEnd()
    sr.Close()
    sr.Dispose()

    ' Lines in my .csv file are terminated with line feeds
    Dim tmpcsvLines As String() = fullFileStr.Split(ControlChars.Lf)

    ' The first line in the csv file contains headers - don't want it.
    ' Also, the last character in the last line of input is a line feed,
    ' which has created an extra item in the tmpcsvLines array with a value
    ' of Nothing. Get rid of this also. So copy the array to the working
    ' array, minus the first and last items from the original.
    Dim csvLines(tmpcsvLines.Length - 3) As String
    Array.ConstrainedCopy(tmpcsvLines, 1, csvLines, 0, tmpcsvLines.Length - 2)

    ' Some fields have embedded commas, so replace the "," delimiter
    ' (including its surrounding quotes) with "|". Each field keeps a quote
    ' at either end; the cleanup stored proc strips those later.
    For i As Integer = 0 To csvLines.Length - 1
        csvLines(i) = csvLines(i).Replace(""",""", """|""")
    Next

    ' Add a column in the DataTable for each field, using the first
    ' data line to determine the field count
    Dim tmpCols As String() = csvLines(0).Split("|"c)
    For Each tmpStr As String In tmpCols
        dt.Columns.Add(New DataColumn())
    Next

    ' Now add a row for each item in the csvLines array and fill it
    ' with a record's data
    Dim dr As DataRow
    For Each tmpLine As String In csvLines
        dr = dt.NewRow
        dr.ItemArray = tmpLine.Split("|"c)
        dt.Rows.Add(dr)
    Next

    ' Hand the DataTable & name of the destination table over to BulkCopy
    myConnection.Open()
    Dim bc As New SqlBulkCopy(myConnection)
    bc.BatchSize = dt.Rows.Count
    bc.DestinationTableName = "DestinationTable"
    bc.WriteToServer(dt)
    bc.Close()
    myConnection.Close()

End Sub
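If it helps, the cleanup step afterwards is just a one-line call; the proc name below is made up, so substitute whichever proc matches the vendor's spreadsheet:

' Run the cleanup proc that casts the varchar staging data and
' moves it into the live table (proc name is hypothetical)
Using conn As New SqlConnection("your connection string")
    conn.Open()
    Using cmd As New SqlCommand("dbo.usp_CleanVendorImport", conn)
        cmd.CommandType = CommandType.StoredProcedure
        cmd.ExecuteNonQuery()
    End Using
End Using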
April 16, 2008 at 5:48 am
Charlie,
Thanks for your posting!
With very little modification, the procedure will do the trick for me.
I tried the SqlBulkCopy method to the destination linked server and found it to perform extremely well. 😀
I knew that someone here would have an answer for me.
This is one great resource!!!