December 14, 2011 at 9:45 am
I'm guessing the answer is no, but I figured I'd ask anyway:
I have several tables for which the process that I'm trying to run is the same.
1. Get the count of the rows for a specific date from a local table
2. Get the count of the rows for a specific date from a remote table
3. Check if the counts are equal
4. If yes, goto end
5. If no, take all the records from the remote table and load them into the local table
The problem is that the tables are not the same at all. The loading of the data, though, is always done the same way: take all the rows from the remote table and put them into the local table. The fields are the same between each source and its destination table; they just differ from one source to the next.
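To make it concrete, steps 1 through 4 boil down to something like this in T-SQL (the table, linked server, and column names are placeholders I've made up for illustration):

-- Placeholders: dbo.SalesLocal is the local table, REMOTESRV.RemoteDb.dbo.Sales the remote one
DECLARE @TargetDate DATE = '20111214';
DECLARE @LocalCount INT, @RemoteCount INT;

-- 1. Count of rows for the date in the local table
SELECT @LocalCount = COUNT(*) FROM dbo.SalesLocal WHERE LoadDate = @TargetDate;

-- 2. Count of rows for the same date in the remote table, via a linked server
SELECT @RemoteCount = COUNT(*) FROM REMOTESRV.RemoteDb.dbo.Sales WHERE LoadDate = @TargetDate;

-- 3./4. If the counts match, skip the load
IF @LocalCount = @RemoteCount
    PRINT 'Counts match - nothing to do';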
As it stands, there are two ways I can see that would work. The first is to have a separate Data Flow Task for each table being loaded. The second is to have an Execute SQL Task that does the loading, since it is just an INSERT INTO ... SELECT ... FROM query.
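For reference, the query each Execute SQL Task would run is just the same pattern with only the table name and column list changing (placeholder names again, and assuming a full reload is acceptable):

-- Step 5 as a single statement per table; only the names change
TRUNCATE TABLE dbo.SalesLocal;

INSERT INTO dbo.SalesLocal (LoadDate, CustomerId, Amount)  -- column list differs per table
SELECT LoadDate, CustomerId, Amount
FROM REMOTESRV.RemoteDb.dbo.Sales;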
The problem with the Execute SQL Task idea, though, is that A) it's going to be slower, and B) more importantly, it's going to be a lot more difficult to maintain.
Anyone see a way to get this to work better?
December 14, 2011 at 9:56 am
The wording in your question demonstrates that you know the answer, as far as I can see.
Have you considered CozyRoc software? From what I've read, it includes a dynamic data source component that would make it possible to do what you want. I am not recommending anything here - I have not even used the component.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
December 14, 2011 at 9:58 am
Have you considered the possibility of transactional replication?
Steve Jimmo
Sr DBA
“If we ever forget that we are One Nation Under God, then we will be a Nation gone under." - Ronald Reagan
December 14, 2011 at 10:11 am
The basic fact is that the schema of the source and/or destination of a data flow cannot change at run time with the delivered components. There is a company called CozyRoc that sells a component that can, but it is not free.
With that said, much of this can be done with copy/paste/modify. I would probably use the Import/Export Wizard to generate a basic package, build up one table, then copy/paste/modify the rest of them.
To check the row counts, there are several methods:
1. Two Execute SQL Tasks that each write a count to a variable
2. A Data Flow Task that contains two sources, one for each side, and a Script Component to take the pipeline data and push it into variables
3. Get row counts for all objects on both sides and assign them to variables at the start, then use them to control execution
Once you have the counts, you can control the execution using expressions on the precedence constraint to the next component. The expression for the "go ahead and run it" path would be something like @[User::Cnt1] != @[User::Cnt2], and the path around it would be @[User::Cnt1] == @[User::Cnt2].
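For option 1, each Execute SQL Task would run a scalar count with the result set mapped to a package variable - something like this (the table name is a placeholder, and the ? parameter syntax assumes an OLE DB connection):

-- ResultSet = Single row; the Cnt column is mapped to @[User::Cnt1]
SELECT COUNT(*) AS Cnt
FROM dbo.SalesLocal
WHERE LoadDate = ?;   -- ? supplied through the task's Parameter Mapping

-- A second task runs the same count against the remote table and maps it to @[User::Cnt2]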
I hope this helps.
CEWII
December 14, 2011 at 11:32 am
Yeah Elliott, that's what I figured - I had basically already started on that anyway, since I was fairly sure there wasn't a better solution. CozyRoc software would be nice, but I'm not going to be able to convince my company to buy it just for this.
December 14, 2011 at 11:38 am
I think that is a common source of resistance. It's hard to get money to buy tools, especially small ones. A fair amount of the stuff I do along those lines ends up on CodePlex.
CEWII
December 15, 2011 at 4:06 pm
kramaswamy (12/14/2011)
Yeah Elliott, that's what I figured - I had basically already started on that anyway, since I was fairly sure there wasn't a better solution. CozyRoc software would be nice, but I'm not going to be able to convince my company to buy it just for this.
Hi,
I know from your posts that you are a knowledgeable guy. You have listed a bunch of major points at the start of the discussion which, in my opinion, could easily be passed to your upper management for consideration. You are going to save time and money with CozyRoc's solution. And your savings will continue to add, add ... a lot in the future.
The Data Flow Task Plus was not a small feat to accomplish, and it is really innovative. Some customers have recommended selling it separately for 3x the amount we ask for the whole library. What we ask is a small amount for the considerable time and effort we have invested to get it to the level where it is now. In the upcoming 1.6 version, you will also have the ability to have dynamic data flow transformations (remember what I said about add, add ... 😉)