December 14, 2023 at 10:45 am
I want to set up SQL Server transactional replication from a main SQL Server to an archive SQL Server. The archive will hold historical data with a much longer retention period than the main server. Data is purged by application code on both servers, with the archive configured to keep it for longer.
I need deletes/updates/inserts to be replicated to the archive server. I can't simply skip replicating deletes, since deletes happen as part of normal processing. However, I can differentiate purge deletes from regular processing deletes: normal processing deletes always occur above a particular row ID threshold. So I could put custom logic in the DELETE replication stored procedure that says "only perform this delete if the id value is above the threshold." All purge deletes happen below the threshold and can be ignored.
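Roughly, the guard I have in mind inside the delete proc (the table name, key column, and threshold value below are placeholders, not my real schema):

-- Sketch of the guard inside the replication delete proc.
-- dbo.MyTable, id, and the threshold are placeholders.
ALTER PROCEDURE dbo.sp_MSdel_dboMyTable
    @pkc1 int  -- primary key value passed in by the distribution agent
AS
BEGIN
    -- Normal processing deletes always land above the threshold;
    -- purge deletes land below it and should be ignored here.
    IF @pkc1 > 1000000  -- placeholder threshold
        DELETE FROM dbo.MyTable WHERE id = @pkc1;
    -- Skipped rows return without error so the distribution agent
    -- keeps going.
END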
I understand that it's possible to modify the sp_MSdel_dboTableName stored procedures to customize the behavior of replication. However, I have two concerns:
1. Whether my modifications would survive a snapshot regeneration or a SQL Server update/upgrade.
2. How to keep the modified procedures in sync with the table when the schema changes (e.g. a column is added).
I was also worried about how to deal with restoring backups, but there seems to be quite a lot of information about that; it looks like resynchronizing the replica after the main is restored simply requires some manual intervention.
December 15, 2023 at 9:58 pm
My question would be: do you need to use replication for this?
What I mean is, could you modify the stored procedure that updates the primary to ALSO update the secondary for insert, update, and delete operations? The advantage is immediate sync between the two systems, at the cost of double the write operations per stored procedure call. The disadvantage is that if DB2 goes down for any reason, the SP fails even with the primary up.
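As a rough sketch of what I mean (ArchiveServer is a hypothetical linked server; the table and columns are made up):

-- Dual-write sketch. ArchiveServer is an assumed linked server and
-- dbo.Orders is a made-up table; adjust to your schema.
CREATE PROCEDURE dbo.InsertOrder
    @OrderId int,
    @Amount  money
AS
BEGIN
    SET XACT_ABORT ON;  -- abort everything if either write fails
    BEGIN TRANSACTION;  -- cross-server, so this escalates to MSDTC

    INSERT INTO dbo.Orders (OrderId, Amount)
    VALUES (@OrderId, @Amount);

    -- Second write against the archive via the linked server.
    -- If the archive is down, this fails and the whole call fails.
    INSERT INTO ArchiveServer.ArchiveDb.dbo.Orders (OrderId, Amount)
    VALUES (@OrderId, @Amount);

    COMMIT TRANSACTION;
END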
Failing that, do you need the data synced immediately, or could you use something like SSIS to move the data changes across on a schedule? The advantage here is that the data is eventually synced to the replica, and you can build in restrictions on what data does and doesn't move across. The downside is that it isn't near real time.
Alternately, could you use Service Broker to push the data changes across? A trigger on the table would send the changes to Service Broker, which would then ensure eventual data consistency; your purge process could disable the trigger before the purge, or the delete trigger could check the ID before sending the message to Service Broker. The advantage here is eventual consistency: if DB2 goes down, the messages just queue up until it comes back. The downside is that Service Broker can be hard to troubleshoot.
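The filtering version of that trigger might look something like this (the Service Broker objects and the threshold are placeholders and would need to be created first; the receiving side would process the messages and end the conversation):

-- Hypothetical delete trigger. The //Archive/... Service Broker
-- objects and the threshold are placeholders for illustration only.
CREATE TRIGGER trg_MyTable_Delete
ON dbo.MyTable
AFTER DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Only forward "real" deletes; purge deletes sit at or below the
    -- threshold and are skipped here.
    DECLARE @payload xml =
        (SELECT d.id
         FROM deleted AS d
         WHERE d.id > 1000000  -- placeholder threshold
         FOR XML PATH('row'), ROOT('deletes'), TYPE);

    IF @payload IS NOT NULL
    BEGIN
        DECLARE @h uniqueidentifier;

        BEGIN DIALOG CONVERSATION @h
            FROM SERVICE [//Archive/SenderService]
            TO SERVICE '//Archive/DeleteService'
            ON CONTRACT [//Archive/DeleteContract]
            WITH ENCRYPTION = OFF;

        -- If the archive is down, the message just queues up.
        SEND ON CONVERSATION @h
            MESSAGE TYPE [//Archive/DeleteMessage] (@payload);
    END
END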
I know this doesn't answer your question, but unfortunately I don't have a good answer for you. I'm not familiar with updating replication-related stored procedures, but modifying system stored procedures has always been a red flag for me, as updates or upgrades MAY blow out my changes.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
December 15, 2023 at 11:43 pm
You don't change the system-generated SP - what you do is set up a custom SP and register it to be used instead of the default one that is generated for each table.
When you need to change the article, you will also need to update your SP and register it again - see https://learn.microsoft.com/en-us/sql/relational-databases/replication/transactional/transactional-articles-regenerate-to-reflect-schema-changes?view=sql-server-ver16 for this bit.
With a custom SP you can likely do what you need.
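Registering a custom delete proc on an existing article looks roughly like this (publication, article, and proc names are placeholders; the custom proc itself has to already exist in the subscriber database):

-- Point an existing article's delete command at a custom proc.
-- All names here are placeholders.
EXEC sp_changearticle
    @publication = N'MyPublication',
    @article     = N'MyTable',
    @property    = N'del_cmd',
    @value       = N'CALL [dbo].[sp_custom_del_MyTable]',
    @force_invalidate_snapshot = 1,
    @force_reactivate_subscription = 1;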
December 16, 2023 at 11:27 am
SSIS
I wouldn't be able to do the sproc modification to also write to DB2, because the latency would be too high, aside from the other issue you mentioned of losing events if DB2 is down. SSIS is something I hadn't thought of; I know very little about it, but it could be a good option since I can tolerate delays in getting data to the archive. Reading a bit about Service Broker, something I know even less about, it looks like it could work as well, but my main table already has a bunch of triggers on it and I'm not sure I want to add to that. I'm definitely going to read more about SSIS and Service Broker, thank you!
You don't change the system-generated SP - what you do is set up a custom SP and register it to be used instead of the default one that is generated for each table.
Cool, this is the kind of thing I'm struggling to find out just by reading the docs. I had also gone down a rabbit hole trying to understand how to make the insert/update procs auto-update if a column is added. But I actually don't need custom ins/upd procs in any case; it's only deletes where I need special logic. I'm going to try out this custom del proc approach - thanks a lot for the links!