February 4, 2003 at 7:28 am
I have to process *.csv files and put the modified information into a SQL table.
Instead of doing the "insert" from the VB side, is it possible to pass the entire array as a parameter to a procedure, and let T-SQL do all the inserts on the server? Some of the files have 10,000+ records, and I have over 1,000 files to process per cycle (3 cycles per day).
Or any suggestions on a more efficient method?
Thanks,
Joe
February 4, 2003 at 8:15 am
Unless there's something you really don't like about VB, I'd recommend making a stored procedure for the insert, opening the file in VB, cutting it up in VB, then executing the insert stored procedure using an ADO Command (adExecuteNoRecords). The advantage of doing it this way is that you standardize the way a record is inserted (using the stored procedure) and keep processing time on the database minimal (as it doesn't need to handle the cutting up of the files). Additionally, if the files change format or, possibly more likely, you need to support additional formats, you can adapt the VB program without changing the SQL code. Just a suggestion. There are some articles out there which explain cutting a delimited string into either a TABLE data type or a temp table (2000 and 7.0, respectively); the sketch below shows the general idea. However, I would investigate doing this through a VB program as well. Let's see what other people suggest...
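To give a flavor of that split idea, here's a minimal sketch in SQL 2000 syntax. The function name, column sizes, and single-character delimiter are just illustrative choices, not from any particular article:

CREATE FUNCTION dbo.fnSplit (@list varchar(8000), @delim char(1))
RETURNS @items TABLE (item varchar(8000))
AS
BEGIN
    DECLARE @pos int
    SET @pos = CHARINDEX(@delim, @list)

    -- Peel off one item per pass until no delimiter remains
    WHILE @pos > 0
    BEGIN
        INSERT INTO @items (item) VALUES (LEFT(@list, @pos - 1))
        SET @list = SUBSTRING(@list, @pos + 1, 8000)
        SET @pos = CHARINDEX(@delim, @list)
    END

    -- Whatever follows the last delimiter is the final item
    INSERT INTO @items (item) VALUES (@list)
    RETURN
END

A stored procedure could then take a whole line as a varchar parameter and do SELECT item FROM dbo.fnSplit(@line, ',') to work with the fields as a rowset.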
Hope this helps,
Jay
February 4, 2003 at 8:21 am
Thanks for the suggestion. I plan on doing the cutting up of the file(s) on the VB side.
I was looking for a more efficient way of processing the records into the SQL Server table.
I think what you are saying is that it is more efficient to pass the parameters for the insert to SQL Server than to do the "insert ... into ..." from VB.
Is this correct?
I was hoping to find an "array" type in T-SQL, so I could pass an entire file's worth of data to SQL Server to be processed.
Thank you,
Joe
February 4, 2003 at 8:24 am
As an alternative, you can use OPENXML to insert multiple rows with one procedure call.
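Roughly along these lines; the table, columns, and XML shape below are made up for the example, but sp_xml_preparedocument, OPENXML, and sp_xml_removedocument are the real calls involved:

-- Expects XML shaped like <rows><row Col1="..." Col2="..."/></rows>
CREATE PROCEDURE dbo.InsertFromXml
    @doc ntext
AS
BEGIN
    DECLARE @hDoc int

    -- Parse the XML text into a document handle SQL Server can query
    EXEC sp_xml_preparedocument @hDoc OUTPUT, @doc

    -- OPENXML exposes the document as a rowset, so one INSERT covers the whole batch
    INSERT INTO dbo.Import (Col1, Col2)
    SELECT Col1, Col2
    FROM OPENXML(@hDoc, '/rows/row', 1)
         WITH (Col1 varchar(50), Col2 varchar(50))

    -- Release the handle; parsed documents hold server memory until removed
    EXEC sp_xml_removedocument @hDoc
END

Your VB code builds the XML string from the CSV rows and passes it as the @doc parameter, so each file becomes one procedure call.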
Tim C.
//Will write code for food
One Windows to rule them all, One Windows to find them,
One Windows to bring them all and in the darkness bind them
In the Land of Microsoft where the Shadows lie.
February 4, 2003 at 8:32 am
quote:
As an alternative, you can use OPENXML to insert multiple rows with one procedure call.
Indeed, that is a fine way of handling it. That way, your VB program will handle converting the file into an XML document, and the stored procedure can handle the batch insert based on the OPENXML call. Look up OPENXML for instructions/pointers on how to do this effectively. Great suggestion, Tim!
February 4, 2003 at 8:35 am
Thanks,
So, I process the *.csv file into an XML document, which can then be passed to a SQL stored procedure that uses OPENXML.
I will try this. If anyone has code snippets (VB and/or T-SQL) they can offer, I would appreciate it. I will be looking this up and giving it a try.
Joe
February 4, 2003 at 1:16 pm
I'll go with the VB clan.
Use Split to get the fields into arrays, insert into a local recordset, and do a batch insert to SQL.
How about DTS?
February 4, 2003 at 2:32 pm
Thank you all for your help. I have a solution coded and working. I went with reading the CSV into a recordset, processing the information into a temp table, then using the temp table in a stored procedure.
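In case it helps anyone searching later, the T-SQL half looks roughly like this; the table and column names are placeholders, not my real ones:

-- Run once on the server; the procedure reads the caller's temp table
CREATE PROCEDURE dbo.ProcessStaging
AS
BEGIN
    -- One set-based statement moves the whole batch
    INSERT INTO dbo.Import (Col1, Col2)
    SELECT Col1, Col2
    FROM #Staging
END
GO

-- Per file, from the VB connection:
CREATE TABLE #Staging (Col1 varchar(50), Col2 varchar(50))
-- ... VB inserts the processed records into #Staging ...
EXEC dbo.ProcessStaging   -- one call instead of one INSERT per record
DROP TABLE #Staging

Because the temp table is created on the same connection that calls the procedure, the procedure can see it.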
thanks again,
Joe