August 6, 2009 at 3:48 am
Hi,
My task is:
Step 1: create a ##temp table.
Step 2: insert data into the ##temp table from a flat file.
Step 3: retrieve the data from the ##temp table and insert it into another table.
When I execute the package, it throws this error:
Error at Get Data from Temp Table to PayerDB [DTS.Pipeline]: "component "OLE DB Source" (576)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
Because the temp table does not exist in the database, the OLE DB Source has no metadata to validate against. If I create the ##temp table manually before executing the package, the package works fine.
So how do I handle this?
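For illustration, this is the kind of statement that makes the package validate when the table is created up front (PayerID and PayerName are placeholder column names, not the real ones):

CREATE TABLE ##Temp
(
    PayerID   INT          NULL,  -- placeholder key column
    PayerName VARCHAR(100) NULL   -- placeholder data column
);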
August 6, 2009 at 5:07 am
Hi,
Why are you inserting into a temp table and then into the main table? Are you able to perform the transforms in the data flow and insert directly into the main table?
Kindest Regards,
Frank Bazan
August 6, 2009 at 5:42 am
Hi,
Yes, I'm doing that in a Data Flow Task only.
Creating the temp table is done in an Execute SQL Task.
Inserting into the temp table from the flat file is in a Data Flow Task.
Inserting into the main table from the temp table is in another Data Flow Task.
This is the flow.
August 6, 2009 at 6:06 am
Try setting the "RetainSameConnection" property to True on the connection manager you are using.
If you elaborate a little, we may be able to help you find a way to avoid the temp table.
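As a rough sketch of why this helps (using the placeholder columns from the first post): with RetainSameConnection = True the package reuses a single session, so a ##temp table created in Execute SQL Task 1 still exists when the later tasks run. To get past the validation error itself, also set DelayValidation = True on the tasks that reference ##Temp, or ValidateExternalMetadata = False on the OLE DB Source, so the metadata check happens at run time once the table exists.

-- OLE DB Source query in Data Flow Task 2; validated at run time
-- because the table only exists once Execute SQL Task 1 has run.
SELECT PayerID, PayerName
FROM ##Temp;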
HTH
Kindest Regards,
Frank Bazan
August 6, 2009 at 6:20 am
Hi,
I have a flat file.
Temp table name: ##Temp
Main table name: Main_Table
This is the flow:
Execute SQL Task 1 ---> Data Flow Task 1 ---> Execute SQL Task 2 ---> Data Flow Task 2
Execute SQL Task 1: creates the ##Temp table.
Data Flow Task 1: reads data from the flat file and inserts it into ##Temp.
Execute SQL Task 2: after the insert, I run pre-validation in a stored procedure so that ##Temp holds only valid data (a sketch of this step is below).
Data Flow Task 2: inserts data into Main_Table from the ##Temp table.
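A minimal sketch of what that pre-validation might do, assuming the same placeholder columns as above (the real rules live in the stored procedure):

-- Execute SQL Task 2: remove rows that fail validation so that
-- ##Temp holds only valid data for Data Flow Task 2.
DELETE FROM ##Temp
WHERE PayerID IS NULL
   OR LEN(LTRIM(RTRIM(PayerName))) = 0;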
What is the drawback of using temp tables?
How can I avoid the temp table?
August 6, 2009 at 6:40 am
1. By using a temp table you are making unnecessary trips to the database.
2. If you lose your connection for some reason, then your base data is lost.
3. SSIS is perfect for most types of validation. You can use a derived column (or even a script transformation if the validation logic is more complex) with an expression that checks the data in the pipeline is valid, and in doing so you won't sacrifice performance. Then you simply use this new column in a conditional split: records that pass the validation go straight into your main table; records that don't, you send to an error log or handle in whatever way suits (see the expression sketched below).
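For example, a Derived Column named IsValid (DT_BOOL) could use an SSIS expression along these lines, with the same placeholder column names as in the posts above:

!ISNULL(PayerID) && LEN(LTRIM(RTRIM(PayerName))) > 0

In the Conditional Split, rows where IsValid is true go to Main_Table, and the default output goes to your error log.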
I guess it depends on how much time the current execution takes, and how much time and freedom you have to work on your solution.
Kindest Regards,
Frank Bazan
August 7, 2009 at 3:16 pm
Frank is right on with his advice. SSIS is an ETL tool. You can do all of your validations and transformations inside the data flow before loading the data. You don't need to load the data into a staging table, run your transformations/validations, and then re-load the data into your destination. That would be an Extract Load Transform Reload (ELTL) tool!