August 16, 2012 at 10:00 am
I came across this blog post when looking for a quicker way of importing data from a DB2 database to SQL Server 2008.
http://blog.stevienova.com/2009/05/20/etl-method-fastest-way-to-get-data-from-db2-to-microsoft-sql-server/
I'm trying to figure out how to achieve the following:
3) Create a BULK Insert task, and load up the file that the Execute Process task created. (Note: you have to create a .FMT file for fixed-width import. I created a .NET app that loads the FDF file (the transfer description) and auto-creates a .FMT file for me, as well as a SQL CREATE statement – saving time and tedious work.)
I've got the data in a TXT file and a separate FDF with the details of the table structure. How do I combine them to create a suitable .FMT file?
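For what it's worth, a non-XML format file for a fixed-width file is just a small text file you could write by hand from the FDF. Below is a minimal sketch assuming SQL Server 2008 (format version 10.0) and made-up column names and widths; substitute the real names and lengths from your FDF. Each line lists the host field order, data type, prefix length, fixed width, terminator, target column order, target column name, and collation; fixed-width fields use an empty terminator, and the last field carries the row terminator:

10.0
3
1   SQLCHAR   0   10   ""       1   CustomerID     ""
2   SQLCHAR   0   30   ""       2   CustomerName   SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   8    "\r\n"   3   OrderDate      ""

Saved as customers.fmt, the Bulk Insert Task (or plain T-SQL) can then reference it, for example:

BULK INSERT dbo.Customers
FROM 'C:\data\customers.txt'
WITH (FORMATFILE = 'C:\data\customers.fmt', TABLOCK);

The table name and paths above are placeholders.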
August 25, 2012 at 6:59 pm
The Bulk Insert Task will work, but I prefer using a Flat File Source with an OLE DB Destination set up with FastLoad.
The former removes the need for a format file, because you define the structure of the flat file in the source's connection manager.
The latter operates on the database using the same API as the Bulk Insert Task, namely the bulk load API.
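If it helps, these are the OLE DB Destination custom properties I would double-check once fast load is selected; the values shown are only illustrative, not a recommendation for every load:

AccessMode                  = OpenRowset Using FastLoad   (the "fast load" choice in the editor)
FastLoadOptions             = TABLOCK,CHECK_CONSTRAINTS
FastLoadMaxInsertCommitSize = 100000   (rows committed per batch)
FastLoadKeepIdentity        = False
FastLoadKeepNulls           = False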
Here is an article I found that covers most areas of this topic well.
Using SQL Server Integration Services to Bulk Load Data by Robert Sheldon
The only other thing not covered in the article that I would recommend looking into for maximum performance is tuning the buffers:
Integration Services: Performance Tuning Techniques > Buffer Sizes
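As a rough sketch, the two data flow properties that matter there are DefaultBufferMaxRows and DefaultBufferSize. You can change them in the data flow task's property grid, or override them for a single run with dtexec's /SET switch. The package and task names below are placeholders, and the values are just a starting point to test against your own row widths:

dtexec /F "LoadFromDB2.dtsx" ^
  /SET "\Package\Data Flow Task.Properties[DefaultBufferMaxRows];50000" ^
  /SET "\Package\Data Flow Task.Properties[DefaultBufferSize];52428800"

Larger buffers mean fewer, bigger memory allocations, which usually helps with wide rows coming off a fast source, but keep an eye on memory on the server while you test.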
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato