October 16, 2006 at 9:45 am
Hi All,
I am considering some scenarios for supporting our software validation team. This team is responsible for creating end-user validation tests. We need to prepare datasets in the database so we can create expected outcomes for the tests.
So I need a way to let the team create datasets with ease. Some of the members are not fluent in SQL, which complicates things.
I have looked at using DTS, Replication, and Bulk Copy, so there's a myriad of solutions that could be used. I won't list all the scenarios I have analyzed, to keep this email from getting too long. I'd like to know if any of you have looked at a similar problem and which way you went.
Thanks in advance for sharing your ideas.
Sylvain
October 17, 2006 at 6:54 am
The absolute quickest method, and the one that gets you as close to the production database as possible, is the following:
Detach the production database, copy the .mdf file to your development machine, reattach the production database, and attach the copy under a new name. You'll probably have to adjust some things afterward.
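A minimal sketch of that detach/copy/attach sequence, assuming SQL Server 2000/2005; the database name (ProdDB) and file paths below are purely illustrative:

```sql
-- Hypothetical names/paths: ProdDB, D:\Data\..., E:\Data\... are illustrative.
USE master;

-- 1) Detach the production database (requires no active connections).
EXEC sp_detach_db @dbname = 'ProdDB';

-- 2) Copy ProdDB.mdf (and the .ldf) to the development machine at the
--    OS level (e.g. xcopy) while the database is detached.

-- 3) Reattach production so it is available again.
EXEC sp_attach_db @dbname = 'ProdDB',
    @filename1 = 'D:\Data\ProdDB.mdf',
    @filename2 = 'D:\Log\ProdDB_log.ldf';

-- 4) On the development server, attach the copy under a new name.
EXEC sp_attach_db @dbname = 'ProdDB_Test',
    @filename1 = 'E:\Data\ProdDB.mdf',
    @filename2 = 'E:\Log\ProdDB_log.ldf';
```

The "adjust some things" step typically includes remapping orphaned database users to server logins on the test box (sp_change_users_login in that era of SQL Server).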
Things inevitably get missed when you test only 10% of the production database, or when the fake data is so artificial that it doesn't reflect real life, so having 100% of the volume is a testing dreamland. I realize this is not always possible.
If it's not practical or feasible, use DTS. That's what it's there for. It makes copying and refreshing data a snap, and you don't have to take production off-line. I've used it often for setting up test databases.
October 19, 2006 at 12:27 pm
First off... make sure that it is "permissible" in your industry to share "real" data that way. In the health care industry, it is not. I can use real data for my own testing purposes as long as I 1) do not share it and 2) do not print it. Other than that, it just is not allowed.
Now, that said... I agree that the quickest way is to detach the DB and grab the files. I have been trying replication recently, per a developer's request, and the main goal was to avoid making the production DB unavailable - not even long enough to grab the files. For a 3 GB DB the snapshot took 15 minutes to generate, and another 70 minutes to push... and my servers were VMware virtual instances on a single physical server. My current concern, as I review the job steps for the process, is the entry that says the following:
"Locking Publisher tables while generating snapshot"... with a timestamp of 12:05:38.657... and a second entry that says it's ready at 12:18:11.580.
The entire database was LOCKED (at some level, not sure how) for 13 minutes.
If you are doing nightly backups, perhaps to a network device, you may be able to simply restore one to your test server to get what you need.
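If you go that backup/restore route, the restore is a single statement; the database name, logical file names, and paths below are made up for illustration:

```sql
-- Hypothetical names/paths. WITH MOVE relocates the data and log files so
-- the restored copy does not collide with other databases on the test server.
-- (Run RESTORE FILELISTONLY against the .bak first to get the logical names.)
RESTORE DATABASE ProdDB_Test
FROM DISK = '\\backupshare\nightly\ProdDB.bak'
WITH MOVE 'ProdDB_Data' TO 'E:\Data\ProdDB_Test.mdf',
     MOVE 'ProdDB_Log'  TO 'E:\Log\ProdDB_Test.ldf',
     REPLACE;
```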
Good luck! and keep us posted here as to your solution.
David
Thank-you,
David Russell
Any Cloud, Any Database, Oracle since 1982
October 19, 2006 at 12:42 pm
Unfortunately, I am working in a Pharma-related environment, and we are not able to copy production data to development or validation environments.
So we need our own datasets. The datasets need to be created by the validation team. I need to load the data into a new database; then the validation tests are performed.
The part of the solution I am currently trying to solve is how to load the data. I know how to load data myself, but I need a way for people who are not fluent in SQL to load it.
I have looked at the BCP utility with XML format files; that is one option. Then I looked at having the team create Excel files and loading them through a DTS package (this was the most user-friendly approach). And so on.
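One low-friction variant of the Excel route (a sketch only; the table name and path are made up) is to have the team save each sheet as CSV and load it with BULK INSERT, so they never touch SQL themselves:

```sql
-- Hypothetical table/path. The validation team fills in TestCustomers.xls
-- and saves it as CSV (File > Save As > CSV); a scheduled job or a DTS
-- package then runs this load on their behalf.
BULK INSERT ValidationDB.dbo.TestCustomers
FROM 'C:\Datasets\TestCustomers.csv'
WITH (
    FIELDTERMINATOR = ',',   -- CSV columns
    ROWTERMINATOR   = '\n',  -- one row per line
    FIRSTROW        = 2      -- skip the header row from Excel
);
```

The team works entirely in Excel, and the column order in the spreadsheet just has to match the target table (or a bcp/XML format file can map the columns explicitly).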
How do you handle data loads by non-SQL users ?
Thanks, Sylvain