November 23, 2009 at 12:34 pm
Hi,
I have CSV files that I'd like to load into a table using the function uftReadFileAsTable (http://xpns.net/Examples/uftReadFileAsTable.txt). I've been able to read the first line, but am unsure how to move to the next line of the CSV file. Has anyone used this function to loop through a file one line at a time?
Thanks,
Steve
November 23, 2009 at 1:11 pm
I think it would be much easier to create a DTS/SSIS package. There is so much more that you can do with a comma-delimited text file by using SSIS.
Andrew SQLDBA
November 23, 2009 at 1:19 pm
Or use BULK INSERT in T-SQL. If the file is a CSV, this should be relatively easy to accomplish.
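A minimal BULK INSERT sketch, assuming a staging table already exists; the table name and file path here are placeholders, not from the original post:

```sql
-- Hypothetical staging table and file path, for illustration only.
BULK INSERT dbo.ScheduleStaging
FROM 'C:\import\schedule.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter
    ROWTERMINATOR   = '\n',  -- row delimiter
    FIRSTROW        = 1      -- no header row assumed
);
```

This simple form assumes the field delimiter never appears inside a field value.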
November 23, 2009 at 1:50 pm
I looked at BULK INSERT, but unfortunately some of the comma-separated fields are also enclosed in double quotes, like this...
...MWF,0227813,"Miller,Steve", 1,PI, 0,Y,UWCOL,BRB1501,BRB Administration,...
Is there a way to specify two delimiters, similar to Oracle, where you can specify that the data is optionally enclosed by '"'?
Thanks,
Steve
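As a neutral illustration of the problem (using Python's standard csv module rather than any of the T-SQL options discussed in this thread), a quote-aware parser treats '"' as an optional enclosure, much like Oracle's OPTIONALLY ENCLOSED BY. The sample row below is adapted from the data in the post, with the surrounding ellipses and stray spaces dropped:

```python
import csv
import io

# Sample row adapted from the post: "Miller,Steve" is quoted because
# the value itself contains the field delimiter.
line = 'MWF,0227813,"Miller,Steve",1,PI,0,Y,UWCOL,BRB1501,BRB Administration'

# csv honors '"' as an optional enclosure by default, so the embedded
# comma does not split the name into two fields.
fields = next(csv.reader(io.StringIO(line)))
print(fields[2])   # Miller,Steve
print(len(fields)) # 10
```

A naive split on ',' would yield 11 fields here and break the name in two; that is exactly why a plain single-character FIELDTERMINATOR falls short.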
November 23, 2009 at 2:53 pm
BULK INSERT includes the ability to use a bcp-style format file. The format file could in theory define a different delimiter for each column in your input file.
Format files are a discussion unto themselves, but should be easy enough to find in BOL. SSIS would probably be easier.
--SJT--
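To make the format-file idea concrete, a non-XML bcp format file can handle a quoted column by folding the quote characters into the terminators. A sketch covering just the first three columns of the sample row (version 9.0 is SQL Server 2005; column names, lengths, and the collation are hypothetical):

```
9.0
3
1  SQLCHAR  0  20  ","     1  Days   SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  20  ",\""   2  EmpID  SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  50  "\","   3  Name   SQL_Latin1_General_CP1_CI_AS
```

Here the second column's terminator consumes the opening quote and the third column's terminator consumes the closing quote. The caveat: terminators are fixed strings, so this only works if that column is quoted on every row; a truly *optional* enclosure can't be expressed this way.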
November 23, 2009 at 3:21 pm
What is the source of the data? Instead of a comma delimited file, could you get a pipe delimited file?
November 23, 2009 at 9:45 pm
smithsp17 (11/23/2009)
Hi,I have CSV files which I'd like to load into a table using the function uftReadfileAsTable. (http://xpns.net/Examples/uftReadFileAsTable.txt). I've been able to read the first line, but am unsure how to move to the next line of the CSV file. Has anyone used this function to loop thru a file a line at a time?
Thanks,
Steve
You don't need to loop through it. You give the function the correct parameters (especially the number of lines to read) and you use the function in a FROM clause as if it were a table because it's a multiline table valued function.
That being said, the function uses the ever slothful sp_OA* procedures... I agree with the others... BCP, Bulk Insert, or even OPENROWSET are much better choices. I don't use it but you could also take a crack at SSIS... some say it's pretty good.
--Jeff Moden
Change is inevitable... Change for the better is not.
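A sketch of the OPENROWSET route mentioned above, which lets you inspect or transform the rows in a SELECT before inserting (file and format-file paths are hypothetical):

```sql
-- Hypothetical paths; requires a bcp-style format file describing the columns.
INSERT INTO dbo.Schedule
SELECT src.*
FROM OPENROWSET(
    BULK 'C:\import\schedule.csv',
    FORMATFILE = 'C:\import\schedule.fmt'
) AS src;
```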
November 23, 2009 at 10:07 pm
You don't need to loop through it. You give the function the correct parameters (especially the number of lines to read) and you use the function in a FROM clause as if it were a table because it's a multiline table valued function.
That being said, the function uses the ever slothful sp_OA* procedures... I agree with the others... BCP, Bulk Insert, or even OPENROWSET are much better choices. I don't use it but you could also take a crack at SSIS... some say it's pretty good.
Please listen to the advice above. As an aside, I think SSIS works great, except when it doesn't (which happens much more often than some would like to admit). T-SQL code is much more reliable IMO. I'd use bcp or BULK INSERT to get the job done and be pretty certain you won't get a phone call at 03:00 AM because your SSIS package failed and the business process is stalled in Hong Kong.
--SJT--
November 23, 2009 at 10:16 pm
A theoretical example, the Hong Kong thing.
November 24, 2009 at 1:19 pm
Thank you all for your input and help on this. I'm going to use BULK INSERT, as we've had some problems with the SSIS package that's in place now.
Steve
Viewing 10 posts - 1 through 10 (of 10 total)