July 9, 2004 at 3:19 am
Thank you for your prompt reply. Could you please expand upon
"YIKES, including the in-line SQL"?
As for breaking the original file down beforehand, that just seems like an additional step we could avoid by using this script. Because we are not just inserting data but also updating and deleting it, it seems simpler (if lengthy in code) to keep all of the processing in one place.
July 9, 2004 at 3:30 am
You know your data best, of course.
My flat file looks like this:
ACHIKSFIUNY 87398 ASF9 DKS
BDSFI9SDK KDIFAS KJDSJFSKFK
BDFSDUOUOOOSD
C12
C15
C19
C9
D4
ACHIDSKFI 83942 SDKF FO
B, B, C, D, D etc. etc.
Each line starts with A, B, C, or D. There's a one-to-many relationship of B, C, and D lines to any A line, and only the A line carries the key value. So I had two options: import the whole thing into one table in SQL Server and then use a script to parse through the rows like you did, or break the flat file down into four individual files (one each for A, B, C, and D) before it ever reached the SQL Server stage.
Seeing how VBScript and the programming environment for the script are lacking at best, I went with the second option. I wrote a C#/.NET program (though any language would do) that reads through the file, keeping track of the key from each A line and writing it into each B, C, and D record. Once I have my four files (nice and clean), I invoke a DTS package which imports each file into its own table and then performs my data integration/changes.
Since it's more productive for me as a programmer to work with C# in a real IDE than VBScript in lame-brain DTS, I gained productivity by using the best tool for the job. There's nothing "faster" about having SQL Server run VBScript to read a file line by line compared to any other program. My original big flat file is about 25 MB, and the C# program takes about 2 seconds to run; in fact, it takes longer for me to start the program than the program spends reading the big file and creating the four new ones.
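For illustration, here is a minimal sketch of that splitter. The file names are placeholders, and the key-extraction rule (the token between the 'A' prefix and the first space) is an assumption; adjust it to the real record layout.

using System;
using System.IO;

class FlatFileSplitter
{
    static void Main()
    {
        string currentKey = "";

        using (StreamReader input = new StreamReader("bigfile.txt"))
        using (StreamWriter fileA = new StreamWriter("fileA.txt"))
        using (StreamWriter fileB = new StreamWriter("fileB.txt"))
        using (StreamWriter fileC = new StreamWriter("fileC.txt"))
        using (StreamWriter fileD = new StreamWriter("fileD.txt"))
        {
            string line;
            while ((line = input.ReadLine()) != null)
            {
                if (line.Length == 0) continue;

                switch (line[0])
                {
                    case 'A':
                        // Remember the parent key so it can be stamped
                        // onto the child records that follow it.
                        int space = line.IndexOf(' ');
                        currentKey = (space > 0)
                            ? line.Substring(1, space - 1)
                            : line.Substring(1);
                        fileA.WriteLine(line);
                        break;
                    case 'B':
                        // Prepend the key (tab-delimited) so DTS can
                        // import it as its own column.
                        fileB.WriteLine(currentKey + "\t" + line);
                        break;
                    case 'C':
                        fileC.WriteLine(currentKey + "\t" + line);
                        break;
                    case 'D':
                        fileD.WriteLine(currentKey + "\t" + line);
                        break;
                }
            }
        }
    }
}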
July 23, 2004 at 2:43 am
We are starting to test the bulk insert section of this import process and have hit an unexpected issue. We are importing the text file into a temp table in order to then do the inserts, updates, etc. We need the data to stay in every-other-line order, as stated in previous posts, and the txt file is in the right format, but the bulk insert seems to reorder the rows on import. This means the table name in record 1 may be followed by the wrong data row, or several table-name rows and data rows can end up next to each other! Is there some way of maintaining the order of the text file lines when importing into the table?
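One workaround we have seen suggested (sketched below with placeholder table, file, and format-file names; we have not confirmed it yet) is to add an IDENTITY column to the staging table and use a format file so the bulk insert skips that column, letting the identity values record the original line order:

-- Staging table with an IDENTITY column to capture arrival order
CREATE TABLE ImportStaging (
    LineNo   INT IDENTITY(1,1) NOT NULL,
    LineText VARCHAR(8000) NULL
)

-- flatfile.fmt (SQL Server 2000 format file) maps the single field
-- in the text file to column 2, leaving LineNo to be generated:
--   8.0
--   1
--   1  SQLCHAR  0  8000  "\r\n"  2  LineText  ""

BULK INSERT ImportStaging
FROM 'C:\import\flatfile.txt'
WITH (FORMATFILE = 'C:\import\flatfile.fmt')

-- Process the rows in original file order
SELECT LineNo, LineText
FROM ImportStaging
ORDER BY LineNo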
Any help on this would be appreciated
Martin