August 27, 2009 at 9:41 am
Hi all -
I am looking at a process that loads data from tables in one database to identical tables in another database.
I am running SQL Server 2000 SP4...
Currently the process is just a series of similar SQL statements:
USE myTargetDB
GO
TRUNCATE TABLE product_product
INSERT INTO myTargetDB.DBO.product_product
SELECT * FROM mySourceDB.DBO.product_product
There are additional steps in my SQL Agent job for additional tables. There could be record counts of up to 20-30k for each table.
So my question is: are there any issues I should look out for when putting together processes like these, specifically to keep the impact on the DB server as low as possible? In particular, keeping the impact on (and growth of) my transaction logs in check, and keeping tempdb usage in check. Should I be taking a more transactional approach so that records are committed every x number of records?
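One way to keep an eye on log growth between load steps is DBCC SQLPERF, which works on SQL Server 2000. This is just a monitoring sketch; the backup path below is a placeholder, not a real location:

```sql
-- Report per-database log size and percent used (SQL Server 2000 compatible)
DBCC SQLPERF(LOGSPACE)

-- If myTargetDB is in Simple recovery, the log truncates on checkpoint.
-- In Full recovery, back up the log between large loads to keep it from growing:
-- BACKUP LOG myTargetDB TO DISK = 'D:\backups\myTargetDB_log.bak'  -- placeholder path
```

Running this before and after each TRUNCATE/INSERT step shows how much log each table load actually generates.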
Thanks for your advice!
David
August 27, 2009 at 2:38 pm
My suggestions would be:
1. You could use replication (snapshot or transactional).
2. Use bcp; it will be fast for small tables if your transfer is not very frequent (e.g., once a day) and involves fewer records.
But replication would be good.
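If you go the bcp route, the transfer is a two-step export/import from the command line. A minimal sketch, assuming a trusted connection (-T), native format (-n), and a placeholder server name MYSERVER:

```shell
REM Export from the source database in native format
bcp mySourceDB.dbo.product_product out product_product.dat -n -T -S MYSERVER

REM Import into the (pre-truncated) target table, committing every 10,000 rows (-b)
bcp myTargetDB.dbo.product_product in product_product.dat -n -T -S MYSERVER -b 10000
```

The -b batch-size switch commits in chunks, which addresses the original poster's transaction-log concern directly.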
HTH,
Vinay
http://rdbmsexperts.com/Blogs/
http://vinay-thakur.spaces.live.com/
http://twitter.com/ThakurVinay
August 28, 2009 at 7:10 am
Normally I'd recommend batching, but it depends on the amount of data and other activity on the database(s). 20-30k rows really isn't that many.
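For reference, batching on SQL Server 2000 is usually done with SET ROWCOUNT, since the TOP (n) syntax in DML came later. A sketch of committing every 10,000 rows; the product_id key column and WHERE filter are assumptions for illustration, since copying into a just-truncated table needs some way to pick up only uncopied rows:

```sql
-- Limit each INSERT to 10,000 rows (SQL Server 2000 idiom)
SET ROWCOUNT 10000

DECLARE @rows int
SET @rows = 1

WHILE @rows > 0
BEGIN
    BEGIN TRAN
    INSERT INTO myTargetDB.dbo.product_product
    SELECT s.*
    FROM mySourceDB.dbo.product_product s
    WHERE NOT EXISTS (SELECT 1
                      FROM myTargetDB.dbo.product_product t
                      WHERE t.product_id = s.product_id)  -- assumed key column
    SET @rows = @@ROWCOUNT
    COMMIT TRAN
END

SET ROWCOUNT 0  -- always reset, or it affects later statements on the connection
```

Each committed batch lets the log truncate (in Simple recovery) instead of holding one large open transaction.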
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question