May 11, 2012 at 5:23 am
Hi Gurus,
I am totally new to SSIS packages. In my package I want to insert, update and delete rows in a destination table.
E.g., whenever any DML operation happens on the source table, it needs to be replicated to the destination table when the package runs.
What should I do?
How do I achieve this?
Please help me with this... :-)
Thanks in Advance.
Pradeep
May 11, 2012 at 9:24 am
How many rows in the source table?
Since you need to track deletes you may need to copy all contents of the table from the source into a staging table in the destination using a Data Flow Task. Then issue three DML statements (one each to update, insert and delete) to sync the tables using an Execute SQL Task. If you were on SQL 2008 or above I would have recommended MERGE to avoid having three separate DML statements.
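A minimal sketch of what those three statements might look like, assuming a hypothetical staging table dbo.Staging_MyTable (a full copy of the source), a destination table dbo.MyTable, a key column id and a single data column col1 (all of these names are placeholders, not something from the thread):

-- 1) UPDATE rows that exist in both tables but whose values differ
UPDATE d
SET    d.col1 = s.col1
FROM   dbo.MyTable AS d
JOIN   dbo.Staging_MyTable AS s ON s.id = d.id
WHERE  d.col1 <> s.col1;

-- 2) INSERT rows present in staging but missing from the destination
INSERT INTO dbo.MyTable (id, col1)
SELECT s.id, s.col1
FROM   dbo.Staging_MyTable AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.MyTable AS d WHERE d.id = s.id);

-- 3) DELETE destination rows that no longer exist in the source (i.e. not in staging)
DELETE d
FROM   dbo.MyTable AS d
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Staging_MyTable AS s WHERE s.id = d.id);

Note that the <> comparison in the UPDATE will not catch changes to or from NULL; you would need IS NULL checks or a checksum for that.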
Another option is to do a complete replacement of the table where you copy the source table to the destination database into a staging table, drop the destination table and then rename the staging table to have the same name as the destination table you just dropped.
If the table is too large to copy in its entirety then the recommendations above will not be relevant and you'll have an entirely different type of problem on your hands.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
May 13, 2012 at 10:40 pm
Hi Friend,
The source table keeps growing day by day, so we can't predict the number of rows.
We also can't drop or truncate the destination table, because it is in use 24/7.
I'm using SQL Server 2005.
In the Control Flow tab in BIDS I used only three Execute SQL Tasks (Insert, Update, Delete), and those are the DML operations applied to the destination table (without using a Data Flow Task).
How can I do this with a Data Flow Task (Data Flow Task + Execute SQL Task)?
Thanks in Advance....
Pradeep
May 14, 2012 at 9:41 am
pradeep.mohan (5/13/2012)
In the Control Flow tab in BIDS I used only three Execute SQL Tasks (Insert, Update, Delete), and those are the DML operations applied to the destination table (without using a Data Flow Task).
The Execute SQL Task does not work that way. The data has to be copied to the destination database instance before you can use an SQL statement that interacts with both the source and destination data. In other words, there is nothing built into SSIS that will manage copying that data to the destination prior to using it; you have to manage that yourself.
How can I do this with a Data Flow Task (Data Flow Task + Execute SQL Task)?
In light of the other parts of your response regarding the size of the source data and the nature of the data at the destination, I would do the following:
1. Keep track of which rows in the source need to be applied to the destination to bring it up to date, either manually in a custom parameter table or by deriving it from a date/time or identity value in the destination (see the sketch after this list).
2. Use a Data Flow Task to transfer the rows from the source to the destination that need to be applied to the destination.
3. Use an Execute SQL Task to apply the changes to the destination.
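A rough sketch of how the parameter-table approach from Step 1 could work, assuming a hypothetical control table dbo.LoadControl, a source table dbo.SourceTable with a ModifiedDate column, and placeholder columns id and col1 (none of these names are given in the thread; they are illustrative only):

-- Hypothetical control table that remembers how far the previous load got
CREATE TABLE dbo.LoadControl
(
    TableName    sysname  NOT NULL PRIMARY KEY,
    LastLoadedAt datetime NOT NULL
);

-- Query used by the OLE DB Source in the Data Flow Task:
-- pull only the source rows changed since the last successful load
SELECT s.id, s.col1, s.ModifiedDate
FROM   dbo.SourceTable AS s
WHERE  s.ModifiedDate > (SELECT LastLoadedAt
                         FROM   dbo.LoadControl
                         WHERE  TableName = 'SourceTable');

-- After the sync statements in Step 3 succeed, move the watermark forward
-- (ideally to the MAX(ModifiedDate) actually loaded rather than GETDATE())
UPDATE dbo.LoadControl
SET    LastLoadedAt = GETDATE()
WHERE  TableName = 'SourceTable';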
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
May 14, 2012 at 10:36 pm
So can I use it like this in BIDS:
In the Control Flow tab: Execute SQL Task (Insert)
Execute SQL Task (Update)
Execute SQL Task (Delete)
Data Flow Task (all four components hooked up with green precedence constraints)
In the Data Flow tab: OLE DB Source
OLE DB Destination
Pradeep
May 14, 2012 at 10:38 pm
pradeep.mohan (5/14/2012)
In the Control Flow tab: Execute SQL Task (Insert)
Execute SQL Task (Update)
Execute SQL Task (Delete)
Data Flow Task (all four components hooked up with green precedence constraints)
In the Data Flow tab:
OLE DB Source
OLE DB Destination
Is it possible to use it this way?
Pradeep
May 15, 2012 at 12:12 am
Not really. Please re-read my earlier post.
The Control Flow would look something like this:
And your data flow would look something like this:
The second Execute SQL Task is where you would issue the INSERT, UPDATE and DELETE statements to apply the data from the staging table you loaded in the Data Flow Task to the destination table.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
May 15, 2012 at 12:19 am
Sorry to say I did not understand your point.
Can you explain it once again, briefly?
Pradeep
May 15, 2012 at 12:26 am
pradeep.mohan (5/15/2012)
Sorry to say I did not understand your point. Can you explain it once again, briefly?
Which part are you having trouble understanding?
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
May 15, 2012 at 12:33 am
Does "staging table" mean the destination table?
If yes, we can't clear the staging table; it is being accessed 24/7.
I'm using SQL 2005, so it's not possible to use a MERGE query in a single Execute SQL Task.
What should I do?
Pradeep
May 15, 2012 at 12:31 pm
pradeep.mohan (5/15/2012)
Does "staging table" mean the destination table?
The staging table to which I am referring is a new table you will create in the destination database. Its sole purpose will be to hold the "new" data from the source. This is Step 2 from my earlier post.
In the image from another earlier post, the first Execute SQL Task clears that staging table.
Then the Data Flow Task copies data from the source into the destination staging table.
The third task, another Execute SQL Task, will issue an INSERT, an UPDATE and a DELETE to sync the destination table with the source data. To carry DELETEs over to the destination you will need to track DELETEs separately from the source table.
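As a rough illustration (none of these names come from the thread), the deleted keys could be captured in a small log table on the source, copied into a staging table of their own by the Data Flow Task, and then applied like this:

-- dbo.Staging_DeletedKeys is a placeholder for the staged copy of the source's delete log
DELETE d
FROM   dbo.MyTable AS d
JOIN   dbo.Staging_DeletedKeys AS k ON k.id = d.id;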
I am wondering if Transactional Replication might be an option for you.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
May 15, 2012 at 2:40 pm
To clarify further: NO ONE and NOTHING but your package accesses Staging. It is your "work table"; its sole purpose is to serve your data flow.
May 15, 2012 at 2:45 pm
I probably am not understanding something here :-) but wouldn't a trigger on the source table which wrote data to the new destination table accomplish the same thing? Your destination table would look exactly like your source table, but would include additional columns like update datetime, updated by, row action ('A' = add, 'U' = update, 'D' = delete), etc. This results in a kind of 'snapshot' table that, for any point in time, will show you the changes to the source table.
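A minimal sketch of such a trigger, assuming a single key column id, one data column col1, and placeholder table names dbo.SourceTable and dbo.DestSnapshot (none of these names appear in the thread):

-- Hypothetical audit trigger that appends one row per change to a snapshot table
CREATE TRIGGER trg_SourceTable_Audit
ON dbo.SourceTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Adds: rows in inserted that have no match in deleted
    INSERT INTO dbo.DestSnapshot (id, col1, row_action, update_datetime, update_by)
    SELECT i.id, i.col1, 'A', GETDATE(), SUSER_SNAME()
    FROM   inserted AS i
    WHERE  NOT EXISTS (SELECT 1 FROM deleted AS d WHERE d.id = i.id);

    -- Updates: rows present in both inserted and deleted
    INSERT INTO dbo.DestSnapshot (id, col1, row_action, update_datetime, update_by)
    SELECT i.id, i.col1, 'U', GETDATE(), SUSER_SNAME()
    FROM   inserted AS i
    JOIN   deleted  AS d ON d.id = i.id;

    -- Deletes: rows in deleted that have no match in inserted
    INSERT INTO dbo.DestSnapshot (id, col1, row_action, update_datetime, update_by)
    SELECT d.id, d.col1, 'D', GETDATE(), SUSER_SNAME()
    FROM   deleted AS d
    WHERE  NOT EXISTS (SELECT 1 FROM inserted AS i WHERE i.id = d.id);
END;

Keep in mind that the trigger runs inside the source transaction, so if the snapshot table lives on another server (a linked-server write) the overhead and failure modes can be significant.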
May 15, 2012 at 2:52 pm
atfalatitkb (5/15/2012)
I probably am not understanding something here :-) but wouldn't a trigger on the source table which wrote data to the new destination table accomplish the same thing? Your destination table would look exactly like your source table, but would include additional columns like update datetime, updated by, row action ('A' = add, 'U' = update, 'D' = delete), etc. This results in a kind of 'snapshot' table that, for any point in time, will show you the changes to the source table.
If adding a trigger is an option, that would be one way to handle it. Depending on the data volume it could be way down my list. Tracking deletes with a trigger may be a necessity though, as I alluded to in a previous post, depending on whether physical deletes are allowed on the table or logical deletes are implemented (an is_deleted flag or the like). Transactional Replication would be another way. Change Data Capture would be another, but the OP posted in a SQL 2005 forum so I assume that is out of the question. There are several ways to handle the case.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
May 15, 2012 at 10:59 pm
opc.three (5/15/2012)
pradeep.mohan (5/15/2012)
Does "staging table" mean the destination table?

The staging table to which I am referring is a new table you will create in the destination database. Its sole purpose will be to hold the "new" data from the source. This is Step 2 from my earlier post.
In the image from another earlier post, the first Execute SQL Task clears that staging table.

We can't clear the staging table; it is being accessed 24/7.
Then the Data Flow Task copies data from the source into the destination staging table.
The third task, another Execute SQL Task, will issue an INSERT, an UPDATE and a DELETE to sync the destination table with the source data. To carry DELETEs over to the destination you will need to track DELETEs separately from the source table.
I'm using SQL 2005, so it's not possible to use a MERGE query in a single Execute SQL Task. How can I use insert, update and delete in a single Execute SQL Task? :-)
I am wondering if Transactional Replication might be an option for you.
Pradeep