October 28, 2009 at 5:57 am
Hi,
I have discovered the wonders of OPENROWSET for extracting data out of flat files, but to date I have only been using files with several thousand records.
My task is to open the rowset and insert the contents of the text file into a DB table.
i.e.
INSERT INTO TableA (a, b, c, d)
SELECT a, b, c, d
FROM OPENROWSET('MSDASQL', ..........)
If the file I am opening has millions of records, at what point will the insert commit? That is, will it commit only once all records have been processed, or is it possible to do a periodic commit?
Thanks
October 28, 2009 at 6:04 am
October 28, 2009 at 9:00 am
Thanks for that. I'll investigate SSIS.
October 28, 2009 at 8:45 pm
Ok, I'll be honest, as well. If I could avoid SSIS, I would. A simple BULK INSERT (possibly using a BCP "Format File") will blow the doors off most other things. Yeah, I know... SSIS has a BULK INSERT task but why bother?
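As a minimal sketch of what I mean (the file path, terminators, and batch size here are just placeholders you'd adjust for your file):

```sql
-- Minimal BULK INSERT sketch; path, terminators, and batch size are assumptions.
-- BATCHSIZE is the key to the original question: SQL Server commits every
-- N rows instead of wrapping the whole file in one transaction.
BULK INSERT dbo.TableA
FROM 'C:\data\bigfile.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 50000,   -- commit every 50,000 rows
    TABLOCK                    -- allows minimal logging where eligible
);
```

With BATCHSIZE set, a failure partway through only rolls back the current batch, and the rows committed in earlier batches stay in the table.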
--Jeff Moden
Change is inevitable... Change for the better is not.
October 29, 2009 at 3:16 am
Without knowing the ins and outs of BULK INSERT, I originally used linked servers, but was then told by the DBA that I would require a higher privilege to use them, so we moved on to OPENROWSET. The reason being: I will need to load x number of files with an unknown number of columns, but knowing that y of the columns will always be the same in the files, so I only load those y columns from each file.
Oh, and I have to automate it via shell scripts.
I have the whole process working in an automated manner, but then I was asked to see if this would work with files containing a few million records, which it did, but then the concerns in my original question arose.
I've started with the SSIS tutorial then will see if I can auto load files using my problem above.
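For the "load only the y known columns" requirement, a non-XML bcp format file can skip fields by setting their server column order to 0. A hypothetical sketch, assuming a 5-field comma-delimited file where the third field should be ignored (version number, field widths, and names are assumptions for your actual layout):

```
10.0
5
1  SQLCHAR  0  50  ","      1  a        SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  50  ","      2  b        SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  50  ","      0  ignored  ""
4  SQLCHAR  0  50  ","      3  c        SQL_Latin1_General_CP1_CI_AS
5  SQLCHAR  0  50  "\r\n"   4  d        SQL_Latin1_General_CP1_CI_AS
```

You would then point BULK INSERT at it with `WITH (FORMATFILE = 'C:\data\TableA.fmt', BATCHSIZE = 50000)`, or pass it to bcp with `-f` from your shell scripts.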