April 16, 2014 at 2:50 pm
Hi guys, does anyone have a good idea (or code) to reduce the size of live data so it can be used for dev? I have restored live data on a test server, but all of it, and it's making development slow. I want to reduce the amount of data in each table; there are over 3,000 tables, maybe down to 10,000 rows each. Is there a way to do it in SQL?
I can always do it for each table I am using, but there are about 200.
PS: it's dev, so referential integrity for now is not an issue.
Thanks, guys
Ian Cockcroft
MCITP BI Specialist
April 16, 2014 at 3:01 pm
Do the tables have some sort of ID (like an identity)?
If they do, you could write a delete statement using a modulo operator that keeps only 1/5th of the rows for example.
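Something like this, as a rough sketch; the table name `dbo.SomeTable` and identity column `Id` are placeholders you would substitute per table:

```sql
-- Keeps roughly 1 row in 5 by deleting every row whose identity value
-- is not a multiple of 5. Assumes an integer identity column named Id.
DELETE FROM dbo.SomeTable
WHERE Id % 5 <> 0;
```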
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
April 16, 2014 at 3:08 pm
I have code; I was thinking of using this:
-- Delete in batches of 1,000 until only ~10,000 rows remain.
-- dbo.TargetTable is a placeholder; this still needs to loop through tables.
WHILE (SELECT COUNT(*) FROM dbo.TargetTable) > 10000
BEGIN
    DELETE TOP (1000)
    FROM dbo.TargetTable;
END
But what I need is a way to loop through the target tables easily.
Ian Cockcroft
MCITP BI Specialist
April 16, 2014 at 3:11 pm
You could use sp_msforeachtable.
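For example, something along these lines; note that sp_MSforeachtable is undocumented and unsupported, and the `?` placeholder is replaced with each table's name, so test it on a throwaway copy first:

```sql
-- Sketch: trim every user table in the database down to roughly 10,000 rows,
-- deleting in batches of 1,000 to keep the transaction log manageable.
EXEC sp_MSforeachtable '
    WHILE (SELECT COUNT(*) FROM ?) > 10000
        DELETE TOP (1000) FROM ?;
';
```

Since referential integrity is not a concern in dev, the arbitrary rows that DELETE TOP picks should be fine.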
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
April 17, 2014 at 12:03 am
Excellent, thanks Koen.
Ian Cockcroft
MCITP BI Specialist
April 17, 2014 at 11:03 am
Ian Cockcroft (4/16/2014)
Hi guys, does anyone have a good idea (or code) to reduce the size of live data so it can be used for dev? I have restored live data on a test server, but all of it, and it's making development slow. I want to reduce the amount of data in each table; there are over 3,000 tables, maybe down to 10,000 rows each. Is there a way to do it in SQL? I can always do it for each table I am using, but there are about 200.
PS: it's dev, so referential integrity for now is not an issue.
Thanks, guys
10,000 rows is practically nothing. I suspect you did something like use the Import Data wizard to copy tables from production to development, but you will now need to script out and create the indexes.
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
April 17, 2014 at 11:46 am
Yes Eric, I just moved it to a dev environment; now I'm building a staging database and data warehouse. All I need for now is the structure of the source to make dev faster; running ETL queries against all the data is taking too long.
I don't have rights to the source, so building indexes on it is not an option.
Ian Cockcroft
MCITP BI Specialist