November 16, 2010 at 8:24 am
Hi,
Please have a look at the below:
Client 1 has the columns empid, empname, address, pin, phone, city, mobile, gender
Client 2 has the columns empid, empname, address, phone, city, pin, mobile, gender
Client 3 has the columns empid, empname, address, phone, city, pin, mobile
Client 4 has the columns empid, empname, address, phone, city, pin, country
.......
Client 65 has the columns empid, empname, address, phone, city, state, country, pin
The above are the inputs I get from the different clients. They supply the data in text files, each with its own delimiter (comma, semicolon, double quote, or <TAB>). What I have to do is push the data from each file into my table, which has all of these columns. For columns not present in a given file I want to insert NULL, and I need to hardcode the client id (1, 2, 3, etc.) before finally loading the table.
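For reference, a destination table covering the union of all the client columns might look like the sketch below; only the column names come from the files above, and the data types are assumptions:

```sql
CREATE TABLE Employee (
    ClientId int          NOT NULL,  -- hardcoded per source file (1..65)
    empid    int          NULL,
    empname  varchar(100) NULL,
    address  varchar(200) NULL,
    phone    varchar(20)  NULL,
    city     varchar(50)  NULL,
    pin      varchar(10)  NULL,
    mobile   varchar(20)  NULL,
    gender   char(1)      NULL,
    state    varchar(50)  NULL,
    country  varchar(50)  NULL
);
```

Every column except ClientId is nullable, since no single client file supplies all of them.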
In my package the source I have chosen is a Flat File Source, one per client from 1 to 65, and I have chosen only one destination [OLE DB Destination].
How do I process this? What transformations do I need in order to achieve this?
Is the Flat File Source the right choice here, or is there another way?
All the files are on an FTP path; for each client there is a folder [Client1, Client2, etc.] containing the text files, and I will be taking them from there. The file names are Client1_todaysdate, Client2_todaysdate, and so on. I need to load this week's files on Monday, and next week there will be new files with different data to load.
One thing is sure: once a client has given the format [delimiter], they will not change it.
Please advise me on this, since it is my first implementation.
Is there any other (simpler) way it can be done?
Thanks
Parthi
November 16, 2010 at 9:00 am
How big are the files?
If they are smallish the easiest way to deal with this would be to bulk insert into a staging table with a single column.
Parse the data into the constituent columns (use a CTE?).
Have a metadata table to show the structure of the data depending on the file name mask - probably just need the column names + any data that you need to add.
Insert from there into the destination table.
Encapsulate it all in an SP and pass in the filename.
It means you only have to write everything once and the changes to deal with the different structures are dealt with in the metadata table.
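The steps above can be sketched in T-SQL roughly as follows. This is an illustration, not a tested implementation: the object names (Staging, ClientFileFormat, Employee) are assumptions, and the split below hardcodes a comma delimiter, whereas in practice you would read the delimiter from the metadata table for the file name passed in.

```sql
-- 1. Staging table: bulk insert each file as one raw line per row
CREATE TABLE Staging (RawLine varchar(4000));

-- 2. Metadata table: one row per client describing its file format
CREATE TABLE ClientFileFormat (
    ClientId     int,
    FileNameMask varchar(50),   -- e.g. 'Client1_%'
    Delimiter    char(1),       -- e.g. ',' or ';' or tab
    ColumnList   varchar(500)   -- e.g. 'empid,empname,address,phone,city,pin,mobile'
);

-- 3. Split each raw line into positional tokens with a recursive CTE
--    (comma assumed here; take the delimiter from ClientFileFormat)
;WITH Split AS (
    SELECT RawLine,
           1 AS Pos,
           CAST(LEFT(RawLine, CHARINDEX(',', RawLine + ',') - 1) AS varchar(4000)) AS Token,
           CAST(STUFF(RawLine, 1, CHARINDEX(',', RawLine + ','), '') AS varchar(4000)) AS Rest
    FROM   Staging
    UNION ALL
    SELECT RawLine, Pos + 1,
           CAST(LEFT(Rest, CHARINDEX(',', Rest + ',') - 1) AS varchar(4000)),
           CAST(STUFF(Rest, 1, CHARINDEX(',', Rest + ','), '') AS varchar(4000))
    FROM   Split
    WHERE  Rest <> ''
)
SELECT RawLine, Pos, Token
FROM   Split;

-- 4. Match Pos against the position of each name in ColumnList, then insert
--    into the destination, supplying NULL for columns the file lacks, e.g.
--    for a client whose file has no gender or country:
-- INSERT INTO Employee (ClientId, empid, empname, address, phone, city, pin, mobile, gender, country)
-- SELECT 3, empid, empname, address, phone, city, pin, mobile, NULL, NULL
-- FROM   ParsedStaging;
```

Wrapping steps 1-4 in a stored procedure that takes the file name as a parameter, and looking up FileNameMask to find the right ClientId, delimiter, and column list, gives the write-once structure described above.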
Cursors never.
DTS - only when needed and never to control.
November 17, 2010 at 5:57 am
nigelrivett (11/16/2010)
How big are the files? If they are smallish the easiest way to deal with this would be to bulk insert into a staging table with a single column.
Parse the data into the constituent columns (use a CTE?).
Have a metadata table to show the structure of the data depending on the file name mask - probably just need the column names + any data that you need to add.
Insert from there into the destination table.
Encapsulate it all in an SP and pass in the filename.
It means you only have to write everything once and the changes to deal with the different structures are dealt with in the metadata table.
Thanks for your reply. This needs to be done in SSIS only.
Is there any way we can do this?
Thanks
Parthi
November 18, 2010 at 1:35 am
parthi-1705 (11/16/2010)
Please advise me on this, since it is my first implementation.
parthi-1705 (11/17/2010)
Thanks for your reply. This needs to be done in SSIS only.
Why does it need to be done in SSIS only? If this is your first implementation, what makes you so sure that it is the best method?
John