February 18, 2009 at 10:01 am
Hello,
I have thousands of tables to replicate from SQL 2005 to SQL 2005, with at least 50,000 transactions per minute. Is replication still a good solution? Some tables are around 40 GB and 30 GB; is replication still a good fit for those? Also, how can I add just one article and subscribe to that one article without reinitializing everything? Thanks for your help.
February 18, 2009 at 11:32 am
50,000 per minute is 833.33 transactions/sec, which is respectable if sustained.
Replication can handle it if correctly configured so that distribution does not become a bottleneck.
I have replicated billion-plus-row tables without a problem, but it is all about planning.
You can add an article (a table), then run the Snapshot Agent job, and it should generate snapshot data for only that table, not all of them.
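Roughly along these lines; this is only a sketch, and all the names (PubDB, MyPub, dbo.NewTable, SUBSRV, SubDB) are placeholders you would replace with your own. The main catch is that the publication needs allow_anonymous and immediate_sync set to false, otherwise the Snapshot Agent regenerates every article instead of just the new one:

-- Run at the Publisher, in the publication database.
USE PubDB;
GO

-- 1) Make sure the publication will snapshot only new articles.
--    allow_anonymous must be turned off before immediate_sync can be.
EXEC sp_changepublication @publication = N'MyPub',
     @property = N'allow_anonymous', @value = N'false';
EXEC sp_changepublication @publication = N'MyPub',
     @property = N'immediate_sync', @value = N'false';

-- 2) Add the new article (a table).
EXEC sp_addarticle
     @publication   = N'MyPub',
     @article       = N'NewTable',
     @source_owner  = N'dbo',
     @source_object = N'NewTable';

-- 3) Register the existing Subscriber for just this article.
EXEC sp_addsubscription
     @publication       = N'MyPub',
     @article           = N'NewTable',
     @subscriber        = N'SUBSRV',
     @destination_db    = N'SubDB',
     @subscription_type = N'pull';

-- 4) Start the Snapshot Agent; it should pick up only the new article.
EXEC sp_startpublication_snapshot @publication = N'MyPub';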
* Noel
February 18, 2009 at 11:46 am
Thanks for your reply; it gives me confidence. I will tune the Distributor. I am doing pull transactional replication, and the Distributor-to-Subscriber step takes a long time, 7 to 8 hours. I am new to replication and am using all default configuration.
February 18, 2009 at 12:06 pm
Joe (2/18/2009)
Thanks for your reply; it gives me confidence. I will tune the Distributor. I am doing pull transactional replication, and the Distributor-to-Subscriber step takes a long time, 7 to 8 hours. I am new to replication and am using all default configuration.
Here are a couple of hints:
- Try to set up a separate Distributor if you suspect I/O limitations on the Publisher.
- Increase the CommitBatchSize for the Distribution Agent by at least one order of magnitude (see the sketch after this list).
- For bulk changes you can use stored procedure execution replication.
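For example, one way to raise CommitBatchSize for a Distribution Agent is a custom agent profile at the Distributor. This is only a sketch; the profile name and the values (1000 / 10000, up from the defaults of 100 / 1000) are illustrative, and you could instead append -CommitBatchSize directly to the agent's job step command.

-- Run at the Distributor.
DECLARE @profile_id INT;

-- agent_type 3 = Distribution Agent
EXEC sp_add_agent_profile
     @profile_id   = @profile_id OUTPUT,
     @profile_name = N'HighThroughputDistribution',   -- name is arbitrary
     @agent_type   = 3,
     @description  = N'Larger commit batches for high transaction volume';

-- Default CommitBatchSize is 100; raise it an order of magnitude.
EXEC sp_add_agent_parameter
     @profile_id      = @profile_id,
     @parameter_name  = N'CommitBatchSize',
     @parameter_value = N'1000';

-- Usually raised together with CommitBatchSize (default 1000).
EXEC sp_add_agent_parameter
     @profile_id      = @profile_id,
     @parameter_name  = N'CommitBatchThreshold',
     @parameter_value = N'10000';

-- Then assign the new profile to your pull Distribution Agent
-- (for example from Replication Monitor) and restart the agent.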
If you want to get a feel for Best Practices, go to: http://msdn.microsoft.com/en-us/library/ms151762(SQL.90).aspx
Hope it helps
* Noel
February 18, 2009 at 2:37 pm
Thanks again; I will go through your tips.