June 3, 2004 at 8:07 am
This one's been giving me a headache for a couple of days now...
I'm attempting to import multiple text files into multiple tables (we're talking in excess of 100 files and tables). Using the example on SQLDTS I've created a looping DTS package. I've been able to update the code to pull the table names and change the source and destination objects; however, when trying to run it, it fails because the transformations are not defined.
The transformations themselves are very basic and consist of a straight column-to-column copy. I thought I could be clever and add the column names to the beginning of each import text file, but alas this cannot be done, as you can only append to the end of a file (and the files are realistically too large to copy the data over to new files).
Any thoughts on how I could accomplish this?
PS: I thought about using format files and the Bulk Insert task, but I'd rather use a data pump.
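For what it's worth, the "transformations not defined" error can usually be cured by having the looping script tear down and rebuild the Transform Data task's transformations for each new file/table pair. A minimal VBScript sketch follows; the task name "DTSTask_DTSDataPumpTask_1" is the default for the first Transform Data task, and the "ColumnCount" and "DestCol1", "DestCol2", ... global variables are assumptions — stand-ins for however your loop looks up the destination table's columns:

```vbscript
Option Explicit

Function Main()
    Dim oPkg, oPump, oTransform, i

    Set oPkg = DTSGlobalVariables.Parent
    ' Default name of the first Transform Data task -- check your package
    Set oPump = oPkg.Tasks("DTSTask_DTSDataPumpTask_1").CustomTask

    ' Remove the transformations left over from the previous file
    For i = oPump.Transformations.Count To 1 Step -1
        oPump.Transformations.Remove i
    Next

    ' Re-create one copy-column transformation per column.
    ' Text file source columns default to Col001, Col002, etc.
    For i = 1 To DTSGlobalVariables("ColumnCount").Value
        Set oTransform = oPump.Transformations.New("DTSPump.DataPumpTransformCopy")
        oTransform.Name = "Copy_" & i
        oTransform.SourceColumns.AddColumn "Col" & Right("00" & i, 3), 1
        oTransform.DestinationColumns.AddColumn DTSGlobalVariables("DestCol" & i).Value, 1
        oPump.Transformations.Add oTransform
    Next

    Main = DTSTaskExecResult_Success
End Function
```

You'd run this step in the loop before each execution of the data pump, after the source and destination have been repointed.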
June 4, 2004 at 12:44 pm
I don't understand all the requirements, but have a look at this...
In your DTS package, add an ActiveX Script task and a text file source connection.
Use a Scripting.FileSystemObject to walk your source directory.
Once you've targeted the file you want to import, declare a connection object and set a reference to the DTS source file connection:
Dim o_conn
' "SOURCE_FILE_OBJECT" is the name of the text file connection in the package
Set o_conn = DTSGlobalVariables.Parent.Connections("SOURCE_FILE_OBJECT")
' Point the connection at the file found by the directory walk
o_conn.DataSource = PATH_TO_SOURCE_FILE
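Putting the directory walk and the connection repointing together, a sketch of the ActiveX Script step might look like this — the folder path "C:\ImportFiles" and the connection name "SOURCE_FILE_OBJECT" are assumptions to adapt to your package:

```vbscript
Option Explicit

Function Main()
    Dim oFSO, oFolder, oFile, oConn

    ' Walk the source directory with the FileSystemObject
    Set oFSO = CreateObject("Scripting.FileSystemObject")
    Set oFolder = oFSO.GetFolder("C:\ImportFiles")   ' assumed path

    For Each oFile In oFolder.Files
        If LCase(oFSO.GetExtensionName(oFile.Name)) = "txt" Then
            ' Point the text file connection at the current file
            Set oConn = DTSGlobalVariables.Parent.Connections("SOURCE_FILE_OBJECT")
            oConn.DataSource = oFile.Path
            Exit For   ' one file per package execution in this sketch
        End If
    Next

    Main = DTSTaskExecResult_Success
End Function
```

In a real looping package you'd track which files have already been processed (e.g. in a global variable or by moving imported files) rather than always taking the first match.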
Hope this helps you get started...
June 4, 2004 at 12:51 pm
I have all that, with it dynamically setting the source and destination.
The problem is that the source files are all different and they don't have the column names in the first row. As a consequence, SQL is unable to map the transformations correctly in the data pump and it fails (the transformations default to Col001, Col002, and so on).
I've not been able to find a way around this and so it looks like I am going to have to resort to the bulk insert task.
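If you do fall back to the Bulk Insert task, it can still be driven from the same loop. A sketch, assuming the task keeps its default name "DTSTask_DTSBulkInsertTask_1" and that the loop populates the hypothetical global variables "CurrentFile", "CurrentFormatFile" and "CurrentTable":

```vbscript
Option Explicit

Function Main()
    Dim oPkg, oBulk

    Set oPkg = DTSGlobalVariables.Parent
    ' Default name of the first Bulk Insert task -- adjust to match
    Set oBulk = oPkg.Tasks("DTSTask_DTSBulkInsertTask_1").CustomTask

    ' Repoint the task at the current file, format file and table
    oBulk.DataFile = DTSGlobalVariables("CurrentFile").Value
    oBulk.FormatFile = DTSGlobalVariables("CurrentFormatFile").Value
    oBulk.DestinationTableName = DTSGlobalVariables("CurrentTable").Value

    Main = DTSTaskExecResult_Success
End Function
```

The one-format-file-per-table requirement is the real cost here, but with 100+ tables the format files could themselves be generated from the destination schemas.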