June 11, 2010 at 7:04 am
We currently have a development, user test, and production server. Keeping the data up-to-date on dev and test is a real pain. It involves a restore of the database and the running of all the permission scripts. We are further hampered by needing to run in 6.5 compatibility until some older (important) apps are redone.
What is the best practice for keeping the data up-to-date easily?
June 11, 2010 at 7:57 am
cfry-1152891 (6/11/2010): What is the best practice for keeping the data up-to-date easily?
Are Development and Test Production-size databases?
Test should be Production-size to allow for realistic testing, while Development, depending on Production's size, may be Production-size or just a fraction of it. Needless to say, the latter option adds complexity to the refresh process.
The refresh process usually includes:
1- Saving whatever objects/security are expected to survive the refresh.
2- Refreshing the target database - a restore of a full backup will do.
3- Applying the saved objects/security to the target database.
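A minimal sketch of step 2, assuming hypothetical database names, logical file names, and paths:

-- Hypothetical names/paths; adjust to your environment.
-- Restores the latest Production full backup over the Test database.
RESTORE DATABASE TestDB
FROM DISK = N'\\BackupShare\ProdDB_Full.bak'
WITH MOVE N'ProdDB_Data' TO N'D:\SQLData\TestDB.mdf',
     MOVE N'ProdDB_Log'  TO N'L:\SQLLogs\TestDB.ldf',
     REPLACE,
     STATS = 10;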
I like to have a pre-approved annual calendar for refreshing Dev and Test. In some cases the process has been 100% scripted.
Hope this helps.
_____________________________________
Pablo (Paul) Berzukov
Author of Understanding Database Administration available at Amazon and other bookstores.
Disclaimer: Advice is provided to the best of my knowledge, but no implicit or explicit warranties are given. Since the advisor explicitly encourages testing any and all suggestions in a non-production test environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
June 11, 2010 at 7:59 am
Every time I do a restore I have to remove and re-add the permissions on the restored tables. It's a pain. How do you deal with permissions after a restore?
June 11, 2010 at 8:25 am
I run two scripts...
The first one drops all privileges that came over from Production.
The second one creates the privileges we want to grant in Test.
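For example, the first script can be generated from the catalog views (SQL Server 2005 and later, so it may not apply to the 6.5-compatibility case; names here are illustrative):

-- Generates a REVOKE statement for every object-level permission that
-- came over with the Production restore; review the output, then run it.
SELECT 'REVOKE ' + p.permission_name
     + ' ON '   + QUOTENAME(s.name) + '.' + QUOTENAME(o.name)
     + ' FROM ' + QUOTENAME(pr.name) + ';'
FROM sys.database_permissions AS p
JOIN sys.objects AS o ON o.object_id = p.major_id
JOIN sys.schemas AS s ON s.schema_id = o.schema_id
JOIN sys.database_principals AS pr
     ON pr.principal_id = p.grantee_principal_id
WHERE p.class = 1;  -- object/column permissions only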
_____________________________________
Pablo (Paul) Berzukov
Author of Understanding Database Administration available at Amazon and other bookstores.
Disclaimer: Advice is provided to the best of my knowledge, but no implicit or explicit warranties are given. Since the advisor explicitly encourages testing any and all suggestions in a non-production test environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
June 11, 2010 at 11:50 am
We set up an automated process that captures the security on the target system, restores the database, cleans out the old security, cleans up any sensitive data, then applies the security from the target system. The time spent automating the process pays for itself a thousand times over.
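The "cleans up any sensitive data" step is typically just a set of UPDATEs run right after the restore. A minimal sketch, assuming a hypothetical Customers table and columns:

-- Hypothetical table/columns: mask personally identifiable data so
-- developers never see real customer details on the refreshed copy.
UPDATE dbo.Customers
SET Email = 'user' + CAST(CustomerID AS varchar(10)) + '@example.com',
    Phone = '555-0100',
    SSN   = NULL;  -- assumes the column allows NULLs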
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
June 11, 2010 at 11:59 am
Please feel free to share what you can!
June 11, 2010 at 12:45 pm
It's not rocket science. Just script the steps you do manually. That's how we built ours: put it together over time, save the script, and you'll be up and running in no time.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
June 11, 2010 at 3:25 pm
If you have a decent SAN, learn how to use the SAN "clone" or "snapshot" feature. We used it to "replace" a terabyte copy of the production databases every night... it caused no delays or outages on production, and the copy was brought up to date in less than 2 minutes.
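On the SQL Server side, a SAN-clone refresh usually amounts to a detach/attach around the volume swap. A rough sketch with hypothetical names; the actual clone or snapshot step happens in the SAN tooling, not in T-SQL:

USE master;
-- Take the reporting copy down and detach it.
ALTER DATABASE ReportDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
EXEC sp_detach_db @dbname = N'ReportDB';

-- ...SAN tooling swaps in the freshly cloned volume here...

-- Re-attach the files from the cloned volume.
CREATE DATABASE ReportDB
ON (FILENAME = N'E:\SQLData\ProdDB.mdf'),
   (FILENAME = N'F:\SQLLogs\ProdDB.ldf')
FOR ATTACH;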
--Jeff Moden
Change is inevitable... Change for the better is not.
June 11, 2010 at 4:52 pm
Any time you have a manual process that you want to script, you can use Profiler. Profiler will capture the commands you are passing to the server as you go through your manual process.
Or, maybe even better, if you are working in SSMS, use the 'Script' button. It will output code equivalent to what you are accomplishing in the interface.
June 14, 2010 at 6:18 am
Jeff Moden (6/11/2010)
If you have a decent SAN, learn how to use the SAN "clone" or "snapshot" feature. We used it to "replace" a terabyte copy of the production databases every night... it caused no delays or outages on production, and the copy was brought up to date in less than 2 minutes.
Not to hijack the thread, but we're going to Cuba....
Is that working well for you? We've been looking at the equivalent in our SAN but the "transaction aware" nature of it seems suspect.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
June 17, 2010 at 4:51 am
Grant Fritchey (6/14/2010)
Jeff Moden (6/11/2010)
If you have a decent SAN, learn how to use the SAN "clone" or "snapshot" feature. We used it to "replace" a terabyte copy of the production databases every night... it caused no delays or outages on production, and the copy was brought up to date in less than 2 minutes.
Not to hijack the thread, but we're going to Cuba....
Is that working well for you? We've been looking at the equivalent in our SAN but the "transaction aware" nature of it seems suspect.
It worked just fine for us, although I'm not sure what you mean by the "transaction aware" nature of it unless you're speaking about the deep internals of the system. Once set up, we could copy production to "reporting" virtually anytime in less than 2 minutes without an outage on the production server. The "reporting" server was offline for the duration but, again, that was less than 2 minutes.
--Jeff Moden
Change is inevitable... Change for the better is not.
June 18, 2010 at 6:31 am
One method I use is SSIS. I have a production database with multiple schemas, so I create an SSIS package for each schema. In it, I truncate all of the tables in a schema on test, then load the data from production. I also set up a job in test to run the package. That way, a developer can refresh a specific schema from production anytime they need to.
I don't have to restore the whole database, and the package can also restrict the amount of data and skip copying sensitive data. The truncate step is sketched below.
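The truncate step can be a single Execute SQL Task. A sketch, assuming a hypothetical 'Sales' schema; note that TRUNCATE will fail on tables referenced by foreign keys, which would need DELETE instead:

-- Build and run TRUNCATE statements for every table in one schema.
DECLARE @sql nvarchar(max);
SET @sql = N'';

SELECT @sql = @sql + N'TRUNCATE TABLE '
            + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N';' + CHAR(10)
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE s.name = N'Sales';  -- hypothetical schema name

EXEC sys.sp_executesql @sql;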
The developers don't need access to production to run the package, because the SQL Server Agent has access, so that keeps the auditors happy.
You could probably do all the tables in a database that way. SSIS is pretty flexible, so you could even set it up to create tables in test that exist in production, and then copy the data.
You could also load from multiple production databases into a single test database, if that is what you want.
June 18, 2010 at 7:06 am
My problem is that the main database is in 6.5 compatibility mode. It just wouldn't work.
June 18, 2010 at 9:35 am
It might. I have a 7.0 server and I extract from it all the time. I have a 2008 server that I set up as my SSIS server; I run all of my SSIS from it (OK, I actually have two, one prod and one test). I use the OLE DB connector to extract from my 7.0 server and move the data to other servers, from 2000 through 2008. I'm not sure if that would work on 6.5, though. You could also set up a DTS package to do something similar.
Another approach I used in the past, for Sybase, was bcp. I had scripts set up to truncate my test tables and then bcp the data in.
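In T-SQL terms, the load half of that bcp approach looks like BULK INSERT (hypothetical table and file path; the bcp command-line utility does the same job from the shell):

-- Reload a test table from a flat file exported from production.
TRUNCATE TABLE dbo.Orders;  -- hypothetical table

BULK INSERT dbo.Orders
FROM 'D:\Extracts\Orders.dat'
WITH (FIELDTERMINATOR = '\t',
      ROWTERMINATOR   = '\n',
      TABLOCK);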
The dump-and-load method is perfectly valid; it can just be a pain to do the cleanup (security, sensitive data, etc.) once the database is loaded.
June 18, 2010 at 9:42 am
My ultimate goal would be to import the new data on a daily basis to both tables.