July 1, 2003 at 3:28 pm
I have an ASCII fixed-length text file that I'm having trouble moving into SQL Server 2000 through DTS. It's a file that I FTP from our mainframe MVS system. I've already written the DTS package to periodically FTP this file and move it into our SQL Server. The only problem is that this flat file sometimes doesn't contain data in the last few fields on the first couple of records. This throws the DTS package off completely, and for some reason it imports only every other record.
Fixed-Length ASCII File Details
1. Records are fixed-length with {CR}{LF} as the record delimiter. Basically, each record is on its own line.
2. Total record length is 447 characters
3. The last few fields (bytes 431-447) of this 447-character record may contain only spaces, which represents no data for those fields.
Here is my problem. When creating the Text File (Source) object in the DTS Designer, I'm unable to chop the record into its full 447-character fixed length. The GUI automatically ends at 430 because the first record in this file contains spaces from 431-447. This will happen the majority of the time due to the way the data is stored.
I had to manually type in junk data such as "XXXXXX" to fill in the gap from 431-447 for only the first record in this file so that I could chop up the fields correctly. I was then able to see and access these additional fields, represented as ColXXX, in my ActiveX transformation script.
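For reference, my transformation script does something along these lines (the column and destination names here are made up for illustration; the real script maps many more fields):

Function Main()
    ' DTS exposes the source fields as Col001, Col002, ... in file order.
    ' The trailing fields (431-447) only appear here once the column
    ' boundaries have been defined out to the full 447 characters.
    DTSDestination("LastField") = Trim(DTSSource("Col027"))
    Main = DTSTransformStat_OK
End Function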
The DTS package successfully imported 100% of the records when the first record had data such as "XXX..." in columns 431-447. However, in production the first record usually doesn't have data in 431-447, and then only 50% of the records are imported successfully. I'm a bit puzzled by this.
BTW, this text file is about 300MB. It's not possible for me to manually go in and add "XXX" every time this DTS package is supposed to run. The package runs Mon-Fri and should be completely automated, so I'd hate to be typing in "XXX..." by hand every time just to make it work properly.
I can't believe that the GUI won't let you define fields from 431-447 when there is no data there in the first record. It just ends completely at 430 with that red line when you are chopping up the fields.
Does anyone know a way to overcome this limitation? Thanks.
July 2, 2003 at 4:34 am
All you need to do is have data, not spaces, in columns 431-447 the first time you define the columns and transformations.
Subsequent executions of the DTS package will just chop up the record based on your previously defined column settings, whether there are spaces or not.
Just ensure that every record is always 447 characters in length.
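If you want to verify that before each run, a quick check along these lines would flag any short records (run it with cscript; the file name is just an example):

Const FIXED_LEN = 447

Dim fso, f, n, rec
Set fso = CreateObject("Scripting.FileSystemObject")
Set f = fso.OpenTextFile("C:\feeds\extract.txt", 1)   ' 1 = ForReading
n = 0
Do While Not f.AtEndOfStream
    rec = f.ReadLine
    n = n + 1
    If Len(rec) <> FIXED_LEN Then
        WScript.Echo "Record " & n & " is " & Len(rec) & " characters, not " & FIXED_LEN
    End If
Loop
f.Close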
Hope this helps
Phill Carter
--------------------
Colt 45 - the original point and click interface
July 3, 2003 at 11:25 am
That is what I thought too. However, subsequent executions (using real production data with blank spaces in 431-447) import only every other record. I've verified that it only picks up records #1, 3, 5, 7, etc. (all odd-numbered records). I can't figure out why it's doing this. I've never encountered this weird situation before.
I've written a program that modifies this input text file and adds an "X" in column 448 so that the total fixed length is 448 for every single line. I don't use this 448th column at all in my DTS; it's just an extra field that doesn't get used. This modified text file works perfectly in DTS: 100% of the records were imported.
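The program is just a small VBScript along these lines (the paths here are examples; the real one runs as a step in the package before the import):

Const REC_LEN = 447

Dim fso, fIn, fOut, rec
Set fso = CreateObject("Scripting.FileSystemObject")
Set fIn = fso.OpenTextFile("C:\feeds\extract.txt", 1)           ' 1 = ForReading
Set fOut = fso.CreateTextFile("C:\feeds\extract_448.txt", True) ' True = overwrite
Do While Not fIn.AtEndOfStream
    rec = fIn.ReadLine
    ' Pad any short record out to 447, then append the dummy "X" as column 448
    If Len(rec) < REC_LEN Then rec = rec & Space(REC_LEN - Len(rec))
    fOut.WriteLine rec & "X"
Loop
fIn.Close
fOut.Close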
However, it would be nice to find out why DTS isn't picking up every single line in this text file. In this case the workaround above was a solution, but if the text file were comma-delimited or variable-length, I would really have been in a hole if it only picked up the odd-numbered records.
July 3, 2003 at 7:02 pm
Now that is strange, because I tested this on my server here. I created a text file in Notepad with 5 lines of varying lengths and made sure the first one was 447 characters. I then duplicated these lines 20 times to give me 100 records.
The destination table was just a pile of varchar fields with no primary key, so I was able to run my import a number of times. Each run produced 100 additional records in the table.
Are you sure the records aren't being missed for some other reason?
Have you tried setting the exception file properties so that source records in error get output to a text file?
Maybe that will help solve your problem.
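It's on the Options tab of the Transform Data Task properties. You can also set it through the DTS object model, something like the sketch below (the package and file names are made up, and I'm going from memory on the property names, so check Books Online):

Dim pkg, tsk
Set pkg = CreateObject("DTS.Package")
pkg.LoadFromStorageFile "C:\dts\MainframeImport.dts", ""
For Each tsk In pkg.Tasks
    If tsk.CustomTaskID = "DTSDataPumpTask" Then
        ' Rows that fail the transformation get written here
        ' instead of disappearing silently
        tsk.CustomTask.ExceptionFileName = "C:\feeds\import_errors.txt"
    End If
Next
pkg.SaveToStorageFile "C:\dts\MainframeImport.dts"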
Hope this helps
Phill Carter
--------------------
Colt 45 - the original point and click interface