January 21, 2004 at 7:01 am
Hi All,
we develop software based on SQL Server, and I have to maintain many SQL Server 2000 installations at our customers' sites. So I am trying to find a good way to optimize the merge replication maintenance process. The current problem is the following:
When I run our update script against a database that already has a merge publication set up, it creates a new table containing around 30,000 rows of changes, and several other tables are related to this new table.
For example:
I have a customer with 10 branches, and each branch has its own SQL Server. That gives me 10 copies of this table, each with over 30,000 rows of different data. If I then start the merge replication it takes a lot of time, and I only have a single night in which to apply the changes. When I calculated it, the time needed is simply too long.
So what is the best way to minimize or optimize the traffic to the branches? The problem is that we only have ISDN, which means less than 64 Kbit/s.
If we do it the classic way, I will need a lot of time to roll all of this out.
I look forward to your answers. Thanks a lot.
BR
Christian
January 25, 2004 at 1:04 pm
The profiles for each subscriber's Merge Agent are often used for optimisation, but this will not reduce the volume of data to be moved from your publisher to the subscribers. In this case you might want to look at having alternate partners to synchronise with. This would mean that the publisher sends data to perhaps five subscribers, and they then synchronise with each of the remaining subscribers. This reduces the volume of traffic emanating from the publication server by half.
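To give an idea of what enabling this looks like, here is a minimal T-SQL sketch (run at the publisher in the publication database; 'AppPublication' is just a placeholder name, so confirm the exact options against Books Online for your build):

-- Allow subscribers of this merge publication to synchronise with an
-- alternate partner rather than always going back to the central publisher.
EXEC sp_changemergepublication
    @publication = N'AppPublication',        -- placeholder publication name
    @property    = N'allow_synctoalternate',
    @value       = N'true'
GO

-- Each "first tier" branch server is then registered as an alternate
-- synchronisation partner with sp_addmergealternatepublisher (see Books
-- Online for its full parameter list), after which the remaining branches
-- point their Merge Agents at that branch instead of at head office.

On top of that, the predefined 'Slow link agent profile' is worth assigning to the Merge Agents that run over the ISDN lines, as it uses smaller batch sizes better suited to low bandwidth.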
HTH,
Paul (http://www.replicationanswers.com)
Paul Ibison
Paul.Ibison@replicationanswers.com