June 29, 2009 at 6:50 am
Greetings
Sorry if this has been asked before, but I couldn't find the answer with the search function, so links on a postcard if anyone knows of a reference that can help me.
I'm upgrading some of our legacy SSIS imports, and one of the requirements is to store each line of data from an Excel sheet as a blob of data (preferably pipe-delimited), along with error codes, in an error table, so that any import failures can be re-run from the database rather than picked out of the error spreadsheet.
I realise that I could write a query against the data source to concatenate all the fields together, but it doesn't seem the most sensible thing to do, and I'm aware that JET may have different requirements when it comes to data conversion, which could cause problems. This part of the process cannot fail; otherwise we drop valuable records without any convenient way of reloading them.
Any opinions on this, or interesting alternatives, would be much appreciated.
Thanks
Laurence
June 29, 2009 at 11:21 am
Couldn't you do something in Excel to save the file as pipe-delimited and then import the rows into the database?
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
June 30, 2009 at 4:21 am
Hmm, interesting idea, but I'm not sure this would be any more efficient than concatenating the fields from the data source. It would involve me either changing the files manually (not going to happen), coding something through the interop assemblies, or outputting a line at a time as a new flat file and then re-importing it, all of which seems like a bit of overkill for something that should be quite simple :S
I suppose I'll try both and see which produces the more satisfactory results, but if you can think of a less labour-intensive way of doing this (I could potentially be designing a lot of these) I'd be interested to hear it.
Cheers
L
June 30, 2009 at 4:35 am
Using the Excel file as the source, a Script Component should be able to build the output you want. It will take some dev work, but you may be able to code it to take as many columns as are present in the input and produce a single output column.
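Off the top of my head, something along these lines might do it (an untested sketch only -- Input0Buffer, UserComponent and the output buffer are the designer-generated types, "BlobOutput"/"RowBlob" are made-up names for an asynchronous output with one wide string column, and the reflection loop is just one way of staying column-count agnostic):

```csharp
using System;
using System.Reflection;
using System.Text;

public class ScriptMain : UserComponent
{
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        StringBuilder blob = new StringBuilder();

        // One property per selected input column is generated on Input0Buffer,
        // so walking them keeps the script column-count agnostic. Property
        // order is not guaranteed to match column order, so sort if you care.
        foreach (PropertyInfo prop in typeof(Input0Buffer).GetProperties(
            BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly))
        {
            // Skip the generated <Column>_IsNull companion properties.
            if (prop.Name.EndsWith("_IsNull", StringComparison.OrdinalIgnoreCase))
                continue;

            PropertyInfo isNullProp =
                typeof(Input0Buffer).GetProperty(prop.Name + "_IsNull");
            bool isNull = isNullProp != null && (bool)isNullProp.GetValue(Row, null);

            if (blob.Length > 0)
                blob.Append('|');

            if (!isNull)
                blob.Append(Convert.ToString(prop.GetValue(Row, null)));
        }

        // "BlobOutput" is assumed to be an asynchronous output with a single
        // wide string column called RowBlob -- rename to whatever you use.
        BlobOutputBuffer.AddRow();
        BlobOutputBuffer.RowBlob = blob.ToString();
    }
}
```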
But concatenation sounds easier to me!
Phil
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
June 30, 2009 at 4:47 am
Yeah, I was thinking that too, and it would be a good idea if I could make it generic somehow. I wrote a script to dig the metadata out of the DTS framework about eight months ago, which I've since lost, and Google doesn't seem to be able to enlighten me as to how to do it again, which is pretty frustrating.
Do you know of a resource that touches on data pipeline metadata for a script task? I've always found this area very lacking in documentation.
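From memory, the lost script was doing something roughly like this inside the designer-generated ScriptMain (a rough reconstruction only -- the 100 interfaces are the SSIS 2008 versions, 2005 uses the 90 ones, and "ColumnDump" is just a label I've made up):

```csharp
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

public class ScriptMain : UserComponent
{
    public override void PreExecute()
    {
        base.PreExecute();

        // Only the columns ticked on the component's Input Columns page
        // show up in this collection.
        IDTSInput100 input = ComponentMetaData.InputCollection[0];
        foreach (IDTSInputColumn100 column in input.InputColumnCollection)
        {
            // Log each column's name, SSIS data type and length as an
            // information event so it turns up in the progress/log output.
            bool fireAgain = true;
            ComponentMetaData.FireInformation(0, "ColumnDump",
                string.Format("{0} ({1}, length {2})",
                    column.Name, column.DataType, column.Length),
                string.Empty, 0, ref fireAgain);
        }
    }
}
```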
L