January 15, 2003 at 10:26 am
Scenario. Data is stored in an Adaptive Server Anywhere db. Need to replicate to SQL Server 2000 on an hourly basis.
Currently, we do the replication using Delphi's TBatchMove component. Using the AppendUpdate option, it appends any new rows and updates any existing rows, based on the index. It works fairly well, except that while it is doing its updates, other users who are only trying to read data get deadlock errors because the Delphi program is locking the tables.
Two questions: is there a better way to do this replication between these two databases? And is there a way to change the locking so others can read the db at the same time?
January 15, 2003 at 4:22 pm
A good first step is to figure out why the deadlocking happens. Maybe you need better indexes, smaller batches, etc. You could also set the deadlock priority to low, so that if a deadlock does occur, it's the replication process that gets picked as the victim.
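Something like this at the top of the replication batch (just a sketch; the commented line stands in for whatever the batch actually does):

-- If a deadlock occurs, this session is chosen as the victim
-- instead of one of the readers.
SET DEADLOCK_PRIORITY LOW

-- ...run the batch inserts/updates here...

-- Optional; the setting only lasts for the session anyway.
SET DEADLOCK_PRIORITY NORMAL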
Andy
January 21, 2003 at 2:58 pm
Andy,
Thanks for the reply. I believe the problem is happening because it is a fairly large batch. What I am doing now is creating a mirror of that table in a separate database, which will be used for the uploading. Once the data gets there, is there an easy way to replicate it to the "real" table? Keep in mind, the data will likely already be there but will need to be updated.
January 21, 2003 at 5:53 pm
So you'd drop/recreate your staging table each time? If so, replication is not a good answer; you're probably better off just writing a set of update procs to apply the changes. It's about the same as doing it to the live table directly, but with the advantage that everything would run on the server and you could index to make it go better. You're only posting changes, I guess, not the entire data set?
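An update proc could look something like this (a rough sketch; the Golden/Staging table names and columns are just placeholders, not your actual schema):

CREATE PROCEDURE dbo.ApplyStagedChanges
AS
BEGIN
    -- Update rows that already exist in the live table.
    UPDATE g
    SET    g.col1 = s.col1,
           g.col2 = s.col2
    FROM   dbo.Golden g
    JOIN   dbo.Staging s ON s.id = g.id

    -- Insert rows that are new.
    INSERT INTO dbo.Golden (id, col1, col2)
    SELECT s.id, s.col1, s.col2
    FROM   dbo.Staging s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Golden g2 WHERE g2.id = s.id)
END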
Andy
January 22, 2003 at 7:46 am
I am removing all data from the staging table each time. Everything in the staging table will be copied to the "golden" table, not just the changes. I think I will take your advice on writing some update stored procs to do this. I believe it will run quite a bit quicker, since the load into the staging table should be all inserts with no comparison (the table is empty), and then the new update process should look like the following.
Process (a rough sketch of the SQL follows the list):
1. Staging table is updated from the remote site.
2. Delete from the golden table where the id is in the list of staging ids (since the new data is now the golden data).
3. Insert into the golden table from the staging table.
4. Delete everything from the staging table.
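Something like this for steps 2 through 4, wrapped in a transaction so readers never see the golden table half-updated (again, Golden/Staging and the column names are just placeholders):

BEGIN TRANSACTION

-- Step 2: remove the rows that are about to be replaced.
DELETE g
FROM   dbo.Golden g
WHERE  EXISTS (SELECT 1 FROM dbo.Staging s WHERE s.id = g.id)

-- Step 3: copy the fresh rows in.
INSERT INTO dbo.Golden (id, col1, col2)
SELECT id, col1, col2
FROM   dbo.Staging

-- Step 4: empty the staging table for the next load.
DELETE FROM dbo.Staging

COMMIT TRANSACTION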
Thoughts? My only worry is this: how will I know when the remote update to the staging table is complete? It is scheduled to run every 2 hours, so if I schedule the update to run on the opposite hour, I should be ok... but it doesn't make me feel too comfortable. In fact, none of this gives me the warm and fuzzies.
Thank you for your help.
January 22, 2003 at 1:53 pm
Set a flag someplace. Either a row in a table, a file on disk, whatever. That way you can run the server-side job more often (sooner, really); if it's not okay to proceed, it can just exit. I don't think you're in bad shape, just test it well.
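For example, the first thing the server-side job does could be something like this (LoadStatus here is a hypothetical one-row table that the remote loader sets when it finishes):

-- Exit quietly if the remote load hasn't finished yet; the job will
-- simply try again on its next scheduled run.
IF NOT EXISTS (SELECT 1 FROM dbo.LoadStatus WHERE IsComplete = 1)
    RETURN

-- ...apply the staged changes to the golden table here...

-- Clear the flag so the next remote load starts from a known state.
UPDATE dbo.LoadStatus SET IsComplete = 0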
Andy