March 30, 2015 at 8:55 am
Hi,
I have our production database and our development database. All SQL 2014.
Our developers need to have the production data on their development server quickly. The problem is that the production database is 300 GB, so a traditional backup and restore takes too long.
I need a way to update development quickly with the production data. What is the best way to accomplish this?
March 30, 2015 at 3:06 pm
Why would developers need live data to develop against?
But, if you have to, the technologies available are somewhat limited, because they all prevent changes to the copied database. For example, you could set up Availability Groups so that you have a read-only secondary. But note the phrase: read-only. The developers can't edit the data. You could use mirroring, but again, no editing the data. You could use log shipping, but that's going to have intermittent periods where you can't connect AND you can't modify the data. You could look at replication, but again, no data mods.
All that leaves is manually building a replication engine, maybe through SSIS. BUT, you're going to have to figure out how you deal with the conflicts: if a row gets updated or deleted in dev, what happens when the same row changes in production? How about rows that get added in dev? What happens to the rows that were added in production?
In short, I don't recommend we develop this way.
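To make the conflict problem concrete, a home-grown single-table refresh might look something like this sketch (the table and column names are hypothetical); notice that the MERGE is forced to pick an answer for every one of those conflict cases:

```sql
-- Sketch only: refresh dev.dbo.Accounts from a staged copy of production.
-- Table and column names are hypothetical.
MERGE dev.dbo.Accounts AS tgt
USING staging.dbo.Accounts AS src
    ON tgt.AccountID = src.AccountID
WHEN MATCHED THEN                        -- row exists in both: production wins,
    UPDATE SET tgt.Balance  = src.Balance,   -- clobbering any dev edits
               tgt.Modified = src.Modified
WHEN NOT MATCHED BY TARGET THEN          -- row added in production
    INSERT (AccountID, Balance, Modified)
    VALUES (src.AccountID, src.Balance, src.Modified)
WHEN NOT MATCHED BY SOURCE THEN          -- row exists only in dev (added there,
    DELETE;                              -- or deleted in production): discard it
```

Every one of those WHEN clauses is a policy decision someone has to own, which is exactly why this approach is painful.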
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
March 30, 2015 at 5:30 pm
Thanks Grant.
Long story, but it's financial data and I'm new there so can't complain..yet.
Kinda figured those were my only options. Shame I can't use my high avail...without read only.
Think I will build package to refresh data two or three times daily.
Thanks again!
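For what it's worth, the core of such a refresh package, run from an Agent job two or three times a day, could be as simple as one block like this per table (the linked server, database, and table names here are hypothetical):

```sql
-- Sketch: refresh one dev table from production over a linked server.
-- [PRODSRV], ProdDB, and dbo.Trades are hypothetical names.
TRUNCATE TABLE dbo.Trades;   -- fails if the table is referenced by a foreign key;
                             -- use DELETE, or disable the FKs first, in that case

INSERT INTO dbo.Trades WITH (TABLOCK)   -- TABLOCK enables minimal logging under
SELECT TradeID, Symbol, Amount          -- the SIMPLE or BULK_LOGGED recovery models
FROM [PRODSRV].ProdDB.dbo.Trades;       -- hypothetical four-part linked-server path
```

For a 300 GB database, per-table SSIS data flows (or BCP out/in) will generally move data faster than linked-server INSERTs, but the load pattern is the same.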
March 30, 2015 at 6:37 pm
krypto69 (3/30/2015)
Thanks Grant. Long story, but it's financial data and I'm new there so can't complain..yet.
Kinda figured those were my only options. Shame I can't use my high avail...without read only.
Think I will build package to refresh data two or three times daily.
Thanks again!
If it's all on a SAN and the particular SAN is capable, then "SAN snapshots" may be what you need. 300 GB takes literally just seconds when correctly set up. The databases also turn out to be "writeable", with the understanding that any and all data on such copies can and will be overwritten on the next snapshot, which should happen at least once per night.
I've never personally set such a thing up, but I've seen the results when someone who knows what they're doing does it, and it's a joy.
--Jeff Moden
Change is inevitable... Change for the better is not.
March 30, 2015 at 7:35 pm
Jeff Moden (3/30/2015)
krypto69 (3/30/2015)
Thanks Grant. Long story, but it's financial data and I'm new there so can't complain..yet.
Kinda figured those were my only options. Shame I can't use my high avail...without read only.
Think I will build package to refresh data two or three times daily.
Thanks again!
If it's all on a SAN and the particular SAN is capable, then "SAN snapshots" may be what you need. 300 GB takes literally just seconds when correctly set up. The databases also turn out to be "writeable", with the understanding that any and all data on such copies can and will be overwritten on the next snapshot, which should happen at least once per night.
I've never personally set such a thing up, but I've seen the results when someone who knows what they're doing does it, and it's a joy.
Oh yeah, and VM snapshots might be an option too. Thanks Jeff.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning