February 21, 2020 at 4:07 pm
Great ideas and much appreciated... having the articles in place prevents the dropping of tables. The issue is not knowing which table is going to get dropped... then when someone attempts to drop or truncate a table, they get an error saying it can't be done because the table is replicated.
I have a set of scripts in place which add new tables as articles as they are created and sync the data... it's just not the most lovable solution, lol. It also required dynamic SQL... a part of me died implementing it. Having said that, it works as long as no columns are added or removed... and in those cases (which are fortunately rare), the breaks are easily spotted and corrected.
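For anyone landing here later, the "add new tables as articles and sync" step those scripts automate looks roughly like this. This is a hedged sketch, not the poster's actual code; the publication, subscriber, and table names are all made-up placeholders:

```sql
-- Sketch: add a newly created table to an existing transactional publication.
-- 'MyPublication', 'dbo.NewTable', 'SubscriberServer', and 'SubscriberDB'
-- are placeholders, not names from this thread.
EXEC sp_addarticle
    @publication   = N'MyPublication',
    @article       = N'NewTable',
    @source_owner  = N'dbo',
    @source_object = N'NewTable',
    @force_invalidate_snapshot = 1;

-- Push the new article out to the existing subscriber.
EXEC sp_addsubscription
    @publication    = N'MyPublication',
    @article        = N'NewTable',
    @subscriber     = N'SubscriberServer',
    @destination_db = N'SubscriberDB';

-- Regenerate the snapshot so the new article's data gets synced.
EXEC sp_startpublication_snapshot @publication = N'MyPublication';
```

The snapshot regeneration at the end is the painful part for large publications, which is exactly the trade-off discussed below.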
I had thought about SSIS and also about building a .NET solution. I actually wrote a more robust solution in C#, but could not optimize it enough to beat the performance of the dynamic SQL (it's always best to handle data where the data lives, *sigh*).
Seriously, I appreciate y'all's input though; it's given me a lot more to consider and think about for a long-term solution.
February 21, 2020 at 4:17 pm
OK - what if there are 200 tables (I made that number up) that need replicating? Wow, SQL Agent is going to get busy and run out of threads quite soon (been there, seen it, done it).
I'm going to agree in advance that creating a snapshot for 200 tables would be horrendous, so finding the right number of articles per publication is trial and error.
SSIS wouldn't give you real-time updates (or even near that), and personally SSIS makes me want to do violent things (usually involving a keyboard and a window).
I think grouping the replication into little "puddles" rather than one big "lake" is maybe a good idea... then reinitialise just the affected puddle when the DDL trigger fires.
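A rough sketch of what that DDL-trigger hook could look like, assuming a database-level trigger and a hypothetical queue table (`dbo.ReplReinitQueue`) that an agent job polls to reinitialise only the publication containing the dropped table. None of these names come from the thread:

```sql
-- Sketch: database-level DDL trigger that records dropped tables so a
-- follow-up job can reinitialise only the affected "puddle" publication.
-- dbo.ReplReinitQueue is a hypothetical logging table.
CREATE TRIGGER trg_ReinitPuddle
ON DATABASE
FOR DROP_TABLE
AS
BEGIN
    DECLARE @tbl sysname =
        EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]', 'sysname');

    -- Queue the table name; an agent job maps it to its publication
    -- and runs the reinitialisation out-of-band.
    INSERT INTO dbo.ReplReinitQueue (TableName, DroppedAt)
    VALUES (@tbl, SYSUTCDATETIME());
END;
```

Note this only works once the article has been dropped from the publication; while the article exists, the DROP TABLE itself is still blocked by replication, which is the catch the original poster describes.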
Again, just spitting ideas out there.
I'd be interested in the original poster's thoughts
MVDBA
February 21, 2020 at 4:20 pm
Have you considered using INSTEAD OF triggers?
Instead of "drop table":
remove article
do stuff
add article
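One caveat: SQL Server doesn't support INSTEAD OF triggers for DDL events, so in practice the sequence above would be a script run in place of a bare DROP TABLE rather than a trigger intercepting it. A hedged sketch of that remove/do-stuff/re-add flow, with placeholder publication and table names:

```sql
-- Sketch of the "remove article, do stuff, add article" sequence,
-- run instead of a bare DROP TABLE. All names are placeholders.
EXEC sp_dropsubscription
    @publication = N'MyPublication',
    @article     = N'SomeTable',
    @subscriber  = N'all';

EXEC sp_droparticle
    @publication = N'MyPublication',
    @article     = N'SomeTable',
    @force_invalidate_snapshot = 1;

-- "do stuff": with the article gone, replication no longer blocks DDL.
DROP TABLE dbo.SomeTable;
-- ... recreate or modify the table as needed ...

-- Re-add the article once the table exists again.
EXEC sp_addarticle
    @publication   = N'MyPublication',
    @article       = N'SomeTable',
    @source_owner  = N'dbo',
    @source_object = N'SomeTable';
```

As the next reply notes, this only helps if you control the code issuing the DROP, which isn't the case with a vendor product.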
MVDBA
February 21, 2020 at 4:24 pm
@mvdba Mike Vessey,
First, lol... love your opinion of SSIS, haha! I'd love to introduce you to this product's database; it would make you love SSIS by contrast...
Regarding grouping into smaller publications, the problem is that I cannot affect production and cannot reliably reproduce the issue in a separate environment. So if an article exists on one of the affected tables, it will cause an error.
...As for your second post, I would love for it not to drop tables... but this is a vendor product, unfortunately. They do everything in SQL that you shouldn't do... and if something can't be done, they find a way and implement that too.
This has created a unique situation for me to approach... my current solution resides on a different server and pulls the data differentially.