November 8, 2008 at 8:52 pm
Hi all,
I have a situation where I need to read thousands of flat files from a folder to populate my fact table. Please suggest the best way to do this, considering that the source files contain millions of rows.
It would be very helpful if someone could do a walk-through of the solution.
Your help is appreciated.
Thanks
Arif
November 9, 2008 at 7:15 am
You can loop through the files and, using variables, set the connection string for the source to perform a bulk insert. If you need to perform transformations on the data, then you will have to use a different method. Not sure what your requirements are for the data. Here are some links to help guide you in your decision:
SSIS: Enumerating files in a Foreach loop
Flat File Bulk Import methods speed comparison in SQL Server 2005
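To make the "loop over the files" idea concrete, here is a minimal sketch outside of SSIS that enumerates the flat files in a folder and builds one BULK INSERT statement per file. The folder path, table name, and delimiters are hypothetical placeholders; actually executing the statements would also need a SQL Server driver such as pyodbc, which is deliberately not shown.

```python
# Sketch: enumerate flat files and generate a T-SQL BULK INSERT per file.
# Folder, table name, and delimiters are placeholder assumptions.
from pathlib import Path

def build_bulk_insert_statements(folder, table, pattern="*.txt"):
    """Return one T-SQL BULK INSERT statement per flat file in the folder."""
    statements = []
    for path in sorted(Path(folder).glob(pattern)):
        statements.append(
            f"BULK INSERT {table} FROM '{path}' "
            "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n');"
        )
    return statements
```

In SSIS the same pattern is the Foreach Loop container with a File enumerator: the loop variable feeds the flat file connection string, and the data flow (or a bulk insert task) runs once per file.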
----------------------------------------------------------------------------------------
Dan English - http://denglishbi.wordpress.com
November 9, 2008 at 7:16 am
Hi,
The fastest way to get data into SQL Server is using the T-SQL command "BULK INSERT" (or possibly INSERT ... SELECT * FROM OPENROWSET). One reason for this is that everything will be done in-process with the SQL Server process (so no shuffling the data between processes).
If your data is ready to be loaded without any additional cleaning, BULK INSERT is the way to go. You will find a description of the syntax (very straightforward) in Books Online.
Next, read "Guidelines for Optimizing Bulk Import" in Books Online, because there are optimizations you can apply to load that data even faster (locking behaviour, logging, multiple load streams, etc.)
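As a hedged illustration of those optimizations, the sketch below builds a BULK INSERT statement carrying two of the real options discussed in "Guidelines for Optimizing Bulk Import": TABLOCK (which, together with the right recovery model, allows minimally logged loads) and BATCHSIZE (which commits the load in chunks). The table and file names are placeholders, not anything from the original posts.

```python
# Sketch: attach bulk-load hints (TABLOCK, BATCHSIZE) to a generated
# BULK INSERT statement. Table/file names and delimiters are placeholders.
def bulk_insert_with_hints(table, file_path, batch_size=100000):
    """Build a BULK INSERT statement with TABLOCK and a batch size."""
    return (
        f"BULK INSERT {table} FROM '{file_path}' "
        f"WITH (TABLOCK, BATCHSIZE = {batch_size}, "
        "FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n');"
    )
```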
HTH!
/Elisabeth
elisabeth@sqlserverland.com
MCITP | MCT
http://sqlblog.com/blogs/elisabeth_redei/
http://linkedin.com/in/elisabethredei