May 26, 2005 at 9:39 am
I'm using SQL Server 2000 and I get no errors. The value of the data in the XLS is 'D51SEVO'. The field in the destination table is varchar(50), which allows nulls. Another record with a value of '1501' in the XLS does get populated to the table. How do I fix this, and is there a way to have DTS show me each field/row of the source XLS that did not get populated into the destination table?
Thanks for your help; I'm new and clueless.
May 26, 2005 at 10:41 am
By default DTS samples the first few rows of data (eight, via the Jet driver's TypeGuessRows setting) and from those decides the type it expects for each column. If it then sees a value of a different type, it ignores it and brings in NULL. Very annoying, I know from experience.
This article, if followed correctly, works a treat:
http://www.sqldts.com/default.aspx?254
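Broadly, the fix is to open the workbook with IMEX=1 in the Jet connection string, which forces mixed-type columns through as text. A rough illustration using OPENROWSET (the file path and sheet name are placeholders):

SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\data\source.xls;HDR=YES;IMEX=1',
                'SELECT * FROM [Sheet1$]')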
May 26, 2005 at 11:34 am
Thanks, Jonathon, for replying so fast!
As another option, could I save the XLS file as a text file and use that as input? And would it then be reasonable to assume that if the data looks good in several random records, the data is being imported correctly to the table?
Thanks for your help!
May 27, 2005 at 2:29 am
As far as I know, text files are treated differently, and DTS imports varying data types in the same column with no problem. I have never had this problem with them, so I can see it working OK for you.
Try out an import with a text file and see how it goes.
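For what it's worth, a minimal sketch of pulling a saved-as-text file into a table with BULK INSERT; the table name, path, and delimiters are assumptions for illustration:

BULK INSERT dbo.RawImport
FROM 'C:\data\source.csv'
WITH (
    FIELDTERMINATOR = ',',  -- comma-delimited file
    ROWTERMINATOR = '\n',   -- one record per line
    FIRSTROW = 2            -- skip the header row
)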
May 27, 2005 at 6:08 am
I would save your XLS as an MS-DOS CSV file. Use an ActiveX transform and format each source column into the form expected at the destination:

Function Main()
    '** Force text values from column one: coerce NULL to an empty string,
    '** trim, and cap at the destination's varchar(50)
    DTSDestination("foo") = Left(Trim("" & DTSSource("Col001")), 50)
    '** Force a non-nullable numeric field, defaulting to zero, from column two
    If IsNumeric(Trim("" & DTSSource("Col002"))) Then
        DTSDestination("ID") = CInt(DTSSource("Col002"))
    Else
        DTSDestination("ID") = 0
    End If
    '** ... repeat for the remaining columns ...
    Main = DTSTransformStat_OK
End Function
-Mike Gercevich
May 30, 2005 at 1:47 am
You might find this a bit slow if you have lots of data. Generally I prefer to see the data imported into varchar columns in a raw table, then manipulated and copied into a stage table which has the right types for its columns, as this is faster and (IMHO) easier to debug.
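A minimal sketch of that raw-to-stage pattern (the table and column names are invented for illustration):

-- Raw table: every column is varchar, so nothing gets rejected on import
CREATE TABLE dbo.RawImport (PartCode varchar(50) NULL, Qty varchar(50) NULL)

-- Stage table: the real column types
CREATE TABLE dbo.StageImport (PartCode varchar(50) NULL, Qty int NOT NULL)

-- Validate and cast while copying across
INSERT INTO dbo.StageImport (PartCode, Qty)
SELECT LEFT(LTRIM(RTRIM(PartCode)), 50),
       CASE WHEN ISNUMERIC(Qty) = 1 THEN CAST(Qty AS int) ELSE 0 END
FROM dbo.RawImport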
Bill.
June 1, 2005 at 7:50 am
Back in the late 90's, when I was still primarily a programmer, I wrote a program that would import data from any source and put it into our system's standard format. It worked great, except with zip codes. It seemed that whenever we got a foreign (read: Canadian) postal code, with its letter-number-letter format, it completely blew up the engine. Guess what engine I was using? Yep, the SQL engine.
My solution, after months of searching, was to ban Excel files. I imported from data sources as varied as any ODBC driver would allow, but completely disallowed any kind of Excel file.
To this day I will NOT use Excel for data import, because I know about this issue. Whenever I receive an Excel file I immediately export it to a delimited file, which I then import. Import, transformation, and staging tables have become a way of life, and the fastest approach is to bulk-copy into an import table, then run a process to transform and insert the data into your staging area.
Other than that, my best advice is not to accept Excel files at all.
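That bulk-copy step might look something like this with the bcp utility (server, database, table, and file names are placeholders):

bcp MyDB.dbo.RawImport in C:\data\source.csv -c -t, -r\n -S MYSERVER -T

Here -c loads in character mode, -t and -r set the field and row terminators, and -T uses a trusted connection.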
Tim Blum
Senior Manager, Outsourced Data Services
Learn2Live at Ureach.com