November 13, 2007 at 2:57 am
Queries regarding SSIS:
1) Source table contains 4 columns and target table contains 5 columns;
the fifth column needs to be a hard-coded string value.
How can I achieve this ETL logic?
2) How do I sequence data flow tasks in a package? I.e., the second data flow task should run only after the first one completes. Also, is it possible to run only the second data flow task in a package, or do I have to create a separate package for each data flow task?
3) Is it possible to run a SQL statement before and after running a package that loads the data into the destination table? Before loading the data into the final table, I have to enable identity insert:
Before running the package: set identity_insert TBL_TEST on
After running the package: set identity_insert TBL_TEST off
4) I have written an AFTER INSERT/UPDATE trigger on a table that loads data into an audit table. But when I run a package with a data flow (E: Flat File Source -> T: Copy Column -> L: OLE DB Destination), the data gets inserted into the destination table correctly, but nothing gets inserted into the audit table. What could be the problem?
5) I am loading data from a CSV file into a table using the Bulk Insert task. The file has only two columns, Id and Descr; there is no overflow and the data is perfectly alright.
For the other columns, I am using identity and default values.
But it gives the following error:
[Bulk Insert Task] Error: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.The bulk load failed. The column is too long in the data file for row 1, column 2. Verify that the field terminator and row terminator are specified correctly.Bulk load data conversion error (overflow) for row 1, column 1 (Id)."
November 14, 2007 at 7:51 pm
Hello
1) In the source of the data flow, use a SELECT command instead of the table name. In the SELECT statement you can pass column names *and* a quoted string as an expression, for example: select col1, col2, 'this is a hard-coded string' from MyTable
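A minimal sketch with hypothetical table and column names (the alias Col5 is an assumption), so the literal shows up as a named fifth column you can map in the destination:

-- dbo.SourceTable and Col5 are hypothetical names
SELECT col1, col2, col3, col4,
       'hard-coded value' AS Col5   -- constant string, mapped to the target's fifth column
FROM dbo.SourceTable;

(Inside the data flow itself, a Derived Column transform with a string-literal expression achieves the same thing.)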
2) Use precedence constraints between the data flow tasks. Yes, you can use conditional logic (expressions) on the constraints.
3) Yes, you can do that with two Execute SQL Tasks sandwiched around the data flow task. You might have to set the connection manager's RetainSameConnection property to True, because SET IDENTITY_INSERT only lasts for the session that issued it.
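A minimal sketch of the two task statements, using the TBL_TEST name from the question:

-- Execute SQL Task before the data flow:
SET IDENTITY_INSERT dbo.TBL_TEST ON;

-- Execute SQL Task after the data flow:
SET IDENTITY_INSERT dbo.TBL_TEST OFF;

Both tasks and the data flow need to share one connection manager with RetainSameConnection = True; otherwise each task runs in its own session and the setting is lost before the load starts.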
4) Might be the fact that the trigger fires once per statement, not once per row.
5) No idea. I would google the error message.
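That said, the error text itself points at the field/row terminators. As a sketch (the file path below is hypothetical), the T-SQL equivalent of the Bulk Insert Task makes both settings explicit:

BULK INSERT dbo.TBL_TEST
FROM 'C:\data\source.csv'        -- hypothetical path
WITH (
    FIELDTERMINATOR = ',',       -- must match the delimiter actually in the file
    ROWTERMINATOR   = '\n'       -- use '\r\n' for Windows-style line endings
);

Note that without a format file, a bulk insert expects a field for every column of the table, so a two-column file loaded into a wider table can produce exactly this kind of terminator/overflow error.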
November 14, 2007 at 8:23 pm
Hi Mike,
Thanks a lot for the reply.
Following are my comments:
1. Source table contains 4 columns and target table contains 5 columns;
the fifth column needs to be a hard-coded string value.
How can I achieve this ETL logic?
→ Use a Script Component transform.
2. How do I sequence data flow tasks in a package? I.e., the second data flow task should run only after the first one completes. Also, is it possible to run only the second data flow task in a package, or do I have to create a separate package for each data flow task?
→ Use precedence constraints with conditional expressions, or a Sequence Container in the Control Flow.
3. Is it possible to run a SQL statement before and after running a package that loads the data into the destination table? Before loading the data into the final table, I have to enable identity insert:
Before running the package: set identity_insert TBL_TEST on
After running the package: set identity_insert TBL_TEST off
→ In the Control Flow, add an Execute SQL Task before and another after the data flow.
4. I have written an AFTER INSERT/UPDATE trigger on a table that loads data into an audit table. But when I run a package with a data flow (E: Flat File Source -> T: Copy Column -> L: OLE DB Destination), the data gets inserted into the destination table correctly, but nothing gets inserted into the audit table. What could be the problem?
→ A bulk-insert behaviour of SSIS: SSIS loads the data in bulk, which may not fire the trigger.
Solution: insert the rows individually so the trigger fires (e.g. via an OLE DB Command transform), or write the trigger in a different way (per statement, etc.), as sketched below.
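A set-based rewrite of the audit trigger would look something like this; the audit table and its columns are assumptions, and the point is that it reads the whole inserted pseudo-table, so a multi-row statement audits every row rather than one:

CREATE TRIGGER trg_TBL_TEST_Audit
ON dbo.TBL_TEST
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- 'inserted' contains all rows affected by the statement, so one
    -- bulk statement of N rows produces N audit rows.
    INSERT INTO dbo.AuditTable (Id, Descr, AuditDate)  -- hypothetical audit table
    SELECT i.Id, i.Descr, GETDATE()
    FROM inserted AS i;
END;

Also note that the OLE DB Destination's fast-load path uses a bulk insert that skips triggers by default; adding FIRE_TRIGGERS to the destination's FastLoadOptions property should make them fire.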
5. I am loading data from a CSV file into a table using the Bulk Insert task. The file has only two columns, Id and Descr; there is no overflow and the data is perfectly alright.
For the other columns, I am using identity and default values.
But it gives the following error:
[Bulk Insert Task] Error: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.The bulk load failed. The column is too long in the data file for row 1, column 2. Verify that the field terminator and row terminator are specified correctly.Bulk load data conversion error (overflow) for row 1, column 1 (Id)."
→ Use a Data Flow Task instead, since it lets you map only the columns the file supplies.
Let me know your views on the same.