CSV source files keep changing

  • Hello All

    I am a newbie to SSIS; however, I am fairly experienced with SQL Server and with DTS.

    Currently I am working on importing a handful of CSV files into SQL Server 2008.

    I am using BULK INSERT with XML format files.
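
    To give an idea, here is a simplified sketch of one of those loads (the table name and file paths are placeholders, not the real ones):

        -- Hypothetical staging load; the XML format file maps CSV fields to table columns
        BULK INSERT dbo.StagingOrders
        FROM 'C:\Imports\orders.csv'
        WITH (
            FORMATFILE = 'C:\Imports\orders.xml',  -- must stay in sync with the CSV layout
            FIRSTROW   = 2,                        -- skip the header row
            TABLOCK
        );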

    The files are loaded into staging tables, where I do some data conversions, create primary keys and foreign keys, etc.
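
    The conversion step then looks roughly like this (again, all object names are placeholders):

        -- Hypothetical load from a varchar-only staging table into the typed table,
        -- converting types and resolving a foreign key along the way
        INSERT INTO dbo.Orders (OrderNumber, OrderDate, Amount, CustomerID)
        SELECT s.OrderNumber,
               CONVERT(date, s.OrderDate, 103),     -- style 103 = dd/mm/yyyy; match the source
               CONVERT(decimal(10, 2), s.Amount),
               c.CustomerID                         -- foreign key resolved via a lookup join
        FROM dbo.StagingOrders AS s
        JOIN dbo.Customers AS c
          ON c.CustomerCode = s.CustomerCode;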

    The problem I have is that the CSV source files are likely to change: more fields will be added to them, and of course those new fields will need to be added to the tables. I have noticed that when this happens, in most cases I have to recreate the Data Flow from scratch.

    Am I missing something here? Is there a way around it, or a way to prevent this?

    Many thanks in advance

  • Any thoughts on this yet?

  • SSIS is fantastic for building data pump architectures (static column setup, changeable row volume and values), but if the column information changes, it does grind everything to a nasty halt.

    I'd have to ask why the columns are changing so often; the schema may need looking at. It might be better to switch to a name-value pair layout, so that additions to the schema become additional rows instead of columns. That allows your metadata to stay the same, though it does have downsides around data types, referential integrity, and domain integrity.
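
    A minimal sketch of the idea, with made-up names: a new source field becomes a new row in staging, and you pivot the fields you care about back into columns at load time.

        -- Hypothetical name-value staging table: a new source field is just a new row
        CREATE TABLE dbo.StagingAttributes (
            RowID      int           NOT NULL,  -- groups the attributes of one source record
            FieldName  varchar(128)  NOT NULL,
            FieldValue varchar(4000) NULL
        );

        -- Pivot the fields you know about back into columns when loading the typed table
        SELECT RowID,
               MAX(CASE WHEN FieldName = 'OrderNumber' THEN FieldValue END) AS OrderNumber,
               MAX(CASE WHEN FieldName = 'OrderDate'   THEN FieldValue END) AS OrderDate
        FROM dbo.StagingAttributes
        GROUP BY RowID;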

    I did notice you said they are XML files? If that is indeed the case, then using the XML data type and its SQL functions could improve your lot here. Not something I have much experience with, though.
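
    From what I understand, shredding the XML would look something like this (sample data and names made up):

        -- Hypothetical example of shredding an XML document with nodes() and value()
        DECLARE @doc xml = N'<Orders>
                               <Order Number="1001" Date="2012-01-15" Amount="19.99" />
                             </Orders>';

        SELECT o.value('@Number', 'int')            AS OrderNumber,
               o.value('@Date',   'date')           AS OrderDate,
               o.value('@Amount', 'decimal(10, 2)') AS Amount
        FROM @doc.nodes('/Orders/Order') AS t(o);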

    Something else that might be of use to you is the BIML language, which you can use to generate dtsx packages... Check out BIDS Helper on CodePlex; it has added support for BIML in its recent releases.
