February 29, 2016 at 2:01 pm
The development team is asking me to refresh data from Prod to Test without overwriting any stored procedures they might have on Test. So before I tell them no, is there a reasonable way to do this?
I've thought about trying to set up a job using sp_MSForEachTable to truncate the tables in Test, then another sp_MSForEachTable pass to SELECT INTO to copy the data from Prod to Test.
Anyone have ideas or a script they are willing to share?
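Something along these lines is what I had in mind. This is only a rough sketch: it assumes a linked server named PRODSRV and a database named ProdDB (both hypothetical names), identical table structures on both sides, and no foreign keys or identity columns in the way. Note that sp_MSForEachTable is undocumented and unsupported.

```sql
-- Rough sketch only: PRODSRV/ProdDB are made-up names, and this ignores
-- foreign keys and identity columns entirely.
EXEC sp_MSForEachTable 'TRUNCATE TABLE ?';

-- INSERT...SELECT rather than SELECT INTO, so the existing Test tables
-- (indexes, constraints, permissions) stay in place. The ? placeholder
-- expands to [schema].[table], giving a four-part name on the Prod side.
EXEC sp_MSForEachTable 'INSERT INTO ? SELECT * FROM PRODSRV.ProdDB.?';
```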
Thanks,
Norman
February 29, 2016 at 2:23 pm
Can you generate the scripts from development for all of the procedures, functions, and so forth?
Then, you could restore the DB from production, and run the scripts to update the objects.
I'm thinking it's probably easier, and faster, to "fix" the objects as opposed to the data.
Also, they probably are not taking into consideration schema changes. For example, a column has been added into development that has not been put into production. Depending upon the nature of this column, whatever data load you may write may cause more issues.
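If generating those scripts by hand from development is a pain, the current definitions can also be pulled straight out of the Test database's catalog views before the restore. A sketch, run against the Test database:

```sql
-- Dump the definition of every procedure, function, and view in Test
-- so they can be saved to a file and reapplied after the restore.
SELECT m.definition
FROM sys.sql_modules AS m
JOIN sys.objects AS o
  ON o.object_id = m.object_id
WHERE o.type IN ('P', 'FN', 'IF', 'TF', 'V')  -- procs, functions, views
ORDER BY o.name;
```

The output would still need batch separators between objects before it can be executed as one script.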
Michael L John
If you assassinate a DBA, would you pull a trigger?
To properly post on a forum:
http://www.sqlservercentral.com/articles/61537/
February 29, 2016 at 2:41 pm
I've thought about that as well, but there are dozens, maybe several dozen, SPs in some of the databases, and most of the SPs will not have changed. Capturing all of this would be a lot of effort (at least I think it would be).
I think what they want is to just update the data in the tables in Test with a fresh copy of the data in Prod. (Hey! How hard can it be?)
And you're right, there are often new columns in Test that aren't in Prod yet, so I'm not really sure how that will affect any kind of automated update process.
Like I said, I'm looking for something that might work before I tell them no...
Thanks for thinking about this,
Norman
February 29, 2016 at 11:50 pm
I have two suggestions:
1) Use a linked server. During a break or over the weekend, you can delete all the data in all the tables with a cursor over sys.tables, then insert all the data from the linked production server. I have something like this for moving data from production to a historical database, with a transaction and logging in case of problems (but within one instance); here you would have to write a little more code.
or
2)
- Script out the stored procedures on Test
- Back up the production database and restore it on the Test server
- Delete all the procedures on the restored copy with a script
- Execute the script you created, with all the procedures in one file
I am not sure whether you can use ALTER for this; if so, you can skip the delete step.
Hope it helps, or maybe someone else will have better ideas.
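The delete step in option 2 can be generated from the catalog views. A sketch (review the generated statements before running them):

```sql
-- Generate DROP statements for every stored procedure in the freshly
-- restored database; copy the output into a query window and execute it.
SELECT 'DROP PROCEDURE ' + QUOTENAME(SCHEMA_NAME(schema_id))
       + '.' + QUOTENAME(name) + ';'
FROM sys.procedures
ORDER BY name;
```

On the ALTER question: SQL Server 2016 SP1 and later support CREATE OR ALTER PROCEDURE, which would let you skip the drop step entirely; on older versions the drop-then-create approach is needed.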
March 1, 2016 at 8:20 am
Thanks, your idea 2 is probably what I'll have to do, unless I can get the developers to make sure their in-process stored procedures are scripted and can be re-installed.
Has anyone done something like this before? Would you be willing to share any code?
Thanks for the suggestions,
Norman
March 1, 2016 at 2:51 pm
Consider using SSIS by selecting Import/Export Wizard in SSMS. It will copy only data without disturbing stored procedures or permissions like a restore would.
Greg
March 1, 2016 at 5:10 pm
I suggest they start using SSDT and allow you to re-publish their stored procedures from the database project, or use the compare functionality to synchronize to their SSDT project.
It's possible, and much easier, with the new SSDT tooling.
March 1, 2016 at 8:59 pm
n.heyen (2/29/2016)
The development team is asking me to refresh data from Prod to Test without overwriting any stored procedures they might have on Test. So before I tell them no, is there a reasonable way to do this? I've thought about trying to set up a job using sp_MSForEachTable to truncate the tables in Test, then another sp_MSForEachTable to SELECT INTO to copy the data from Prod to Test.
Any one have ideas or a script they are willing to share?
Thanks,
Norman
You can't truncate tables that might have FKs involved. You'll also have to work out dependencies.
One "easy" way to do what you want would be to not have any "development" procs in the same database that gets restored. Put the development procs in another database and build synonyms to point to the data in the other database. That way, you could do restores to refresh the data with relative impunity, even if the restore contains code.
I believe that functions might even work doing it this way.
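The synonym idea looks roughly like this (all database, table, and procedure names here are made up for illustration):

```sql
-- DevCode holds the in-progress procedures; AppData is the database that
-- gets restored from production. Synonyms in DevCode point at the tables
-- in AppData, so a restore of AppData never touches the dev procs.
USE DevCode;
GO
CREATE SYNONYM dbo.Customers FOR AppData.dbo.Customers;
GO
-- Procs in DevCode reference the synonym as if it were a local table:
CREATE PROCEDURE dbo.GetCustomer @Id int
AS
    SELECT * FROM dbo.Customers WHERE CustomerID = @Id;
```

After a restore of AppData, the synonyms resolve against the fresh data automatically; nothing in DevCode needs to be redeployed.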
--Jeff Moden
Change is inevitable... Change for the better is not.
March 1, 2016 at 11:16 pm
I do not speak English, so I use Google Translate.
I have developed a tool that does what you want, but the help is in Spanish (README InsertBulkBestia.txt).
If you have the patience to do the translation, I'm sure it will help.
Basically, it connects to the source and destination, disables triggers, removes foreign keys, disables constraints, truncates the tables, and disables nonclustered indexes. It then copies the tables and performs the reverse process.
March 2, 2016 at 1:52 am
This is one of many reasons why you need some form of source control for your database objects. I know RedGate has tools that allow you to just restore prod to test, then redeploy all changes that are tracked in source control. (And if you don't have source control, an alternative is to use SQLCompare to generate [but not execute!] a script to make prod equal to test, save the script, restore prod to test, then load the script, change all database references to point to test instead of prod, and then execute it.)
The tool fgrodriguez mentions sounds useful and dangerous at the same time. I looked at the attachment and noticed that it's a compiled .NET project, not a pure T-SQL solution, so I could not investigate the code - and I did not try to read the Spanish documentation. Concerns I would have are:
(1) If it disables triggers before copying, how do I know that validations done in those triggers are not violated?
(2) Since it disables and later re-enables nonclustered indexes, what happens if the data copied over violates a nonclustered unique index?
(3) Since it disables and later re-enables check and foreign key constraints, how does it re-enable them? Does it use the WITH CHECK option? If not, you end up with non-trusted constraints, which may have a huge impact on performance - invalidating all the performance testing you do. If it does use the WITH CHECK option, then once more the question is what happens if the data copied over violates a constraint.
Such automated tools may appear to be nice and easy, but the problem is that they need to use a one-size-fits-all approach for all tables - and in most real databases, one size does not fit all. That is already a problem if you build such a tool yourself (which, in all honesty, is not even that hard). It becomes an even bigger problem when you use a tool created by someone else, and a huge problem if you cannot verify the code it is running.
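For reference, the trusted/non-trusted distinction shows up directly in the catalog views. A generic example (the table and constraint names are illustrative, not from the tool):

```sql
-- Two ways to re-enable a foreign key after a bulk load:
ALTER TABLE dbo.Orders CHECK CONSTRAINT FK_Orders_Customers;
    -- enabled, but NOT revalidated: the constraint is left non-trusted
ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT FK_Orders_Customers;
    -- revalidates existing rows, so the constraint becomes trusted again
    -- (this fails if the copied data violates the constraint)

-- List any non-trusted foreign keys; the optimizer cannot rely on these:
SELECT name, is_not_trusted
FROM sys.foreign_keys
WHERE is_not_trusted = 1;
```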
March 2, 2016 at 2:16 am
I'm sorry for my English.
I understand your distrust. It's just a suggestion intended to help. You can simply test it, and if you find that it works and helps, I'm glad. If it is not useful to you, perhaps it can serve someone else.
The .NET code only controls certain stored procedures that are created, run, and deleted. At any time during execution of the application you can make a copy of the stored procedures and study them.
The tool assumes that you want to copy data from one database to another whose structure is virtually identical to the source, so data validations have already been made at the source. As for re-enabling the constraints, it is done with the same conditions they had before: if they were trusted (WITH CHECK) they will stay that way, and if they were not, they will stay that way too.
I use it to copy 500GB databases from production to preproduction and development. And production is ONLINE, with people working in it.
March 2, 2016 at 3:10 am
Thanks for the explanation, frodriguez.
Also note that "distrust" is too strong a word. I think "careful" would be a better description.
And I guess I forgot to say in my original post that I really appreciate that you make your efforts available to the community.
March 2, 2016 at 3:27 am
I reiterate my apologies for my English. I use Google Translate and I am sorry for not knowing the meaning of certain words.
I made this contribution with the intention of giving back for all the contributions from which I learned a lot and which, among other things, helped me develop this small application.
If anyone needs a more detailed explanation, it will have to be in Castilian (Spanish, for the rest of the world), but I think the comments included in the stored procedures (in Castilian) should be sufficient.
Sorry for my ignorance of English.
March 2, 2016 at 5:37 am
I have Redgate SQL Compare, but we do not (unfortunately) use source control for SQL objects.
I have a methodology for this situation that might help, assuming you also have SQL Compare.
In my case, on the Test server, I create a new empty database named DatabaseName_SchemaOnly.
Then I use Redgate SQL Compare to sync the differences from Test's database to that new SchemaOnly database.
Then I restore production's copy of the db to Test.
Then I use SQL Compare to move the changes from SchemaOnly to the newly restored database.
After that, I can drop the SchemaOnly database, or leave it as a reference, so when more changes are made, I can compare the changes.
Lowell
March 2, 2016 at 2:02 pm
That does seem to be about the only way that I've seen, short of buying another tool or spending a lot of time trying to build something.
But it has been really slow for me when there is much data involved.
Norman