August 14, 2014 at 6:09 am
Hi,
We have a situation where a lot of our clients use SQL Server 2005, and we want to export only the updates from certain tables of their DB to our Azure DB. As we are only a third-party vendor and don't own their databases, we can't really make any alterations to them. Currently we use a C# app that sits on the client, connects to their DB and to our Azure DB, and runs a series of SELECT statements to export to Azure. This process was OK but took a long time for some big clients.
Now we wish to make the data update as close to real time as possible and so are looking at alternatives.
So I've looked into Azure Data Sync, the Sync Framework and change tracking. None of these currently look good for us, as they either add tracking tables to the client DB and/or only work on SQL Server 2008 upwards or Vista upwards.
So I'm wondering whether we can regularly copy the client DB, or just the tables we need, into a temp database (still on their SQL Server 2005 instance) and then run the Sync Framework against that (so adding tracking tables won't matter), or maybe back up / copy their DB and export it as a file to Azure, where we can do our stuff on it.
I was wondering if that is a good approach: would it affect the performance of their software / their database, how best would we tackle it, would it take too long, are there too many potential pitfalls, and how would we set up some kind of automated, scripted process to copy or back up their database?
Or are those ideas rubbish and are there better alternatives?
thanks,
August 14, 2014 at 8:29 am
You have a few choices for copying specific tables.
1. Build an SSIS package to export those tables.
2. Build a PoSh/bcp script to export those tables.
3. Replicate those tables to a separate database (on the client) and take a backup of that database periodically.
There are other things like Sync or CDC, but as you mentioned, they get to be a pain. I'd choose #1 or #2 as those are more easily set up, maintained, and deployed. They're also simple.
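As a very rough sketch of what #2 might look like (every name here - servers, database, tables, login - is just a placeholder, and it assumes matching staging tables already exist in Azure and a reasonably recent bcp.exe sits on the client box):

# Placeholder names throughout; not tested, just to show the shape of it.
$tables    = @('dbo.Table1', 'dbo.Table2')         # the tables you want to push
$localSrv  = 'CLIENTSQL2005'                       # client's SQL 2005 instance
$localDb   = 'ClientDb'
$azureSrv  = 'yourserver.database.windows.net'
$azureDb   = 'YourAzureDb'
$azureUser = 'youruser@yourserver'
$azurePwd  = 'yourpassword'

foreach ($t in $tables) {
    $file = Join-Path $env:TEMP ($t.Replace('.', '_') + '.dat')

    # export from the client's instance (trusted connection, character format)
    & bcp.exe "$localDb.$t" out $file -S $localSrv -T -c

    # load the file into the staging table of the same name in Azure
    & bcp.exe "$azureDb.$t" in $file -S $azureSrv -U $azureUser -P $azurePwd -c
}

You'd wrap that in a scheduled task and add logging/error handling, but that's the idea.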
August 14, 2014 at 8:42 am
Hi Steve, thanks for your reply
So would SSIS or PowerShell be able to export directly from the local DB to the Azure DB? (Both are technologies I know little about as yet, but I have just noticed in the latest SQL email a discussion on setting up automated backups/restores... kind of similar.)
The big question with exporting using SSIS or PowerShell is whether it would be the whole table or just the updates to the table. It would be OK to run some kind of sync process once the tables are on Azure, but thinking about how long it takes...
1) Periodically (as often as possible) export the 12 tables from the client to Azure temp tables, clearing/dropping the existing temp tables in Azure first (technology: SSIS / PowerShell).
2) Once the export is complete, run some one-way sync process between the temp tables and the production tables (technology: ?).
What are your thoughts?
August 14, 2014 at 8:51 am
For PoSh, for sure. I'm not sure how complex it would get, but it has piping, and I know I can run PS locally and connect to Azure databases. I assume SSIS can as well.
In terms of syncing things from the staging tables in Azure to production tables in Azure, you need a process. Not sure what Azure offers here. I know worker roles could do this. I suppose you could have a local process from PoSh that uploads, and then calls something else to move data.
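For the "move data" piece, something as simple as a MERGE run from the same PoSh process once the staging load finishes might do it. A rough sketch only - the connection string, table and column names below are all invented for illustration:

# Sketch of the staging -> production step; run after the upload finishes.
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = 'Server=tcp:yourserver.database.windows.net;Database=YourAzureDb;User ID=youruser@yourserver;Password=yourpassword;Encrypt=True;'
$conn.Open()

$merge = @"
MERGE dbo.Orders AS target
USING staging.Orders AS source
    ON target.OrderId = source.OrderId
WHEN MATCHED THEN
    UPDATE SET target.Status   = source.Status,
               target.Modified = source.Modified
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderId, Status, Modified)
    VALUES (source.OrderId, source.Status, source.Modified);
"@

$cmd = $conn.CreateCommand()
$cmd.CommandText    = $merge
$cmd.CommandTimeout = 600
[void]$cmd.ExecuteNonQuery()
$conn.Close()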
What's the extent of the moves? Complex ETL?
August 14, 2014 at 9:38 am
I'm not sure of the average size of the tables, really. Currently the ETL is a C# app that exports some tables whole and, for other tables, just the new entries based on datetime stamps.
If we went for exporting, say, 12 whole tables every time, then it would maybe be 200 MB, as often as possible.
I guess I'll have to look into the PowerShell and SSIS options.
I could maybe get the C# app to do a backup/restore (to a different name) of the database / certain tables, then run some Sync Framework code against Azure, and when that completes, start the process again.
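Just to get my head around the PoSh route, I'm guessing the "only rows changed since the last run" part our C# app does could carry over to something roughly like this (every server/table/column/file name below is invented, and I haven't tested it):

# Sketch only: pull rows stamped since the last run, push them to an Azure staging table.
$watermarkFile = 'C:\ExportJob\Orders.lastrun.txt'
$lastRun       = [datetime](Get-Content $watermarkFile)   # e.g. 2014-08-14T06:00:00
$thisRun       = Get-Date

$src = New-Object System.Data.SqlClient.SqlConnection
$src.ConnectionString = 'Server=CLIENTSQL2005;Database=ClientDb;Integrated Security=True;'
$dst = New-Object System.Data.SqlClient.SqlConnection
$dst.ConnectionString = 'Server=tcp:yourserver.database.windows.net;Database=YourAzureDb;User ID=youruser@yourserver;Password=yourpassword;Encrypt=True;'
$src.Open(); $dst.Open()

# only the rows stamped since the last successful run
$cmd = $src.CreateCommand()
$cmd.CommandText = 'SELECT * FROM dbo.Orders WHERE ModifiedDate > @since'
[void]$cmd.Parameters.AddWithValue('@since', $lastRun)
$reader = $cmd.ExecuteReader()

# push them straight into the Azure staging table
$bulk = New-Object System.Data.SqlClient.SqlBulkCopy -ArgumentList $dst
$bulk.DestinationTableName = 'staging.Orders'
$bulk.BulkCopyTimeout      = 600
$bulk.WriteToServer($reader)

$reader.Close(); $src.Close(); $dst.Close()
$thisRun.ToString('s') | Set-Content $watermarkFile   # remember the new watermark

Does that look like a sensible direction?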