February 4, 2011 at 9:30 am
So I have 20K tables and I can do it.
You need to get the data for each table and then order by rowcount and / or size to see what are the big tables.
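One way to do that (a minimal sketch, not the script linked later in this thread) is to query the sys.dm_db_partition_stats DMV, which avoids touching each table individually:

```sql
-- Sketch: list every user table ordered by row count, with reserved space.
-- Counts only the heap/clustered index (index_id 0 or 1) so rows aren't double-counted.
SELECT  s.name AS SchemaName,
        t.name AS TableName,
        SUM(p.row_count) AS [RowCount],
        SUM(p.reserved_page_count) * 8 / 1024.0 AS ReservedMB  -- pages are 8 KB
FROM    sys.dm_db_partition_stats AS p
JOIN    sys.tables  AS t ON t.object_id = p.object_id
JOIN    sys.schemas AS s ON s.schema_id = t.schema_id
WHERE   p.index_id IN (0, 1)
GROUP BY s.name, t.name
ORDER BY [RowCount] DESC;
```

Run it once in the target database; the top rows are the tables worth looking at first.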
February 4, 2011 at 10:16 am
Ninja's_RGR'us (2/4/2011)
So I have 20K tables and I can do it.
You need to get the data for each table and then order by rowcount and / or size to see what are the big tables.
Can you explain why we need to go through each table? In my previous responses I said I plan to delete most of the data.
So do I do a rowcount after deleting the data? Can you explain the purpose behind this test?
February 4, 2011 at 10:20 am
Go read the articles and then come back if you have questions.
February 4, 2011 at 10:25 am
mw112009 (2/4/2011)
Michael Valentine Jones (2/4/2011)
You should probably take a step back and look at how much space the database files are using, and find the size of the individual tables in the database. You can find that information using the script on the link below.
Script to analyze table space usage
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=61762
If you need to shrink the database data files, you can use this script:
Shrink DB File by Increment to Target Free Space
Hello
I have 5212 tables. I don't think I can analyze them table by table?
Apparently you didn't even bother to look at the script or try to run it.
How do you know you actually deleted most of the data? If the backup is large, that is an indication that the database has a lot of data in it. The size of the backup depends on the amount of data in the database, so you need to see how much space you are using.
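A quick way to see how much space the database is actually using (standard system procedure and catalog view; the database name here is an assumption taken from the file names later in the thread):

```sql
-- Overall allocated vs. used space for the database:
USE NextGen;          -- assumed database name
EXEC sp_spaceused;

-- Size and used space per file (size and SpaceUsed are in 8 KB pages):
SELECT  name,
        size * 8 / 1024.0                              AS SizeMB,
        FILEPROPERTY(name, 'SpaceUsed') * 8 / 1024.0   AS UsedMB
FROM    sys.database_files;
```

If UsedMB is still close to SizeMB after your deletes, the data didn't actually go anywhere.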
February 4, 2011 at 10:39 am
Let me explain:
I am not in a place where I have access to this intermediate server at the moment. When I get to that location (after 6 PM Eastern time) I will respond back with all the answers.
The way I did the first backup did not include some of the steps that you mentioned earlier.
I will get back with you once I complete the steps you suggested and report the results. Until then
my questions will be about planning this process.
Thanks for your input.
February 4, 2011 at 10:44 am
How about you run it against ANY other db to see how it works?
February 4, 2011 at 10:48 am
Ok ran the first script --- here are the results
File Size   Used      Unused    Type  DBFileName
================================================
3550.19     295.85    3254.34   Log   NextGen_Log
207.13      176.94    30.19     Data  NextGen_System_Data
4911.50     4505.19   406.31    Data  NextGen_Core_Data_1
5235.69     4431.13   804.56    Data  NextGen_Index_1
13904.51    9409.11   4495.40   *** Total for all files ***
There is also a second set of results which is too large to paste here.
February 4, 2011 at 10:51 am
Hope this output is very clear (in the earlier response the numbers got mixed...)

File Size   Used      Unused    Type  DBFileName
3550.19     304.26    3245.93   Log   NextGen_Log
207.13      176.94    30.19     Data  NextGen_System_Data
4911.50     4505.19   406.31    Data  NextGen_Core_Data_1
5235.69     4431.13   804.56    Data  NextGen_Index_1
13904.51    9417.52   4486.99   *** Total for all files ***
February 4, 2011 at 10:52 am
mw112009 (2/4/2011)
Ok ran the first script --- here are the results

File Size   Used      Unused    Type  DBFileName
================================================
3550.19     295.85    3254.34   Log   NextGen_Log
207.13      176.94    30.19     Data  NextGen_System_Data
4911.50     4505.19   406.31    Data  NextGen_Core_Data_1
5235.69     4431.13   804.56    Data  NextGen_Index_1
13904.51    9409.11   4495.40   *** Total for all files ***

There is also a second set of results which is too large to paste here.
So basically you can't spare 14 GB of HD space on your dev server??? 1 TB external drives are under $100. I'd start there unless you feel you need to learn this stuff.
February 4, 2011 at 10:54 am
mw112009 (2/4/2011)
Hope this output is very clear (in the earlier response the numbers got mixed...)

File Size   Used      Unused    Type  DBFileName
3550.19     304.26    3245.93   Log   NextGen_Log
207.13      176.94    30.19     Data  NextGen_System_Data
4911.50     4505.19   406.31    Data  NextGen_Core_Data_1
5235.69     4431.13   804.56    Data  NextGen_Index_1
13904.51    9417.52   4486.99   *** Total for all files ***
Sorry this looks equally bad...
The attached JPG file is a little bit better...
February 4, 2011 at 10:57 am
mw112009 (2/4/2011)
mw112009 (2/4/2011)
Hope this output is very clear (in the earlier response the numbers got mixed...)

File Size   Used      Unused    Type  DBFileName
3550.19     304.26    3245.93   Log   NextGen_Log
207.13      176.94    30.19     Data  NextGen_System_Data
4911.50     4505.19   406.31    Data  NextGen_Core_Data_1
5235.69     4431.13   804.56    Data  NextGen_Index_1
13904.51    9417.52   4486.99   *** Total for all files ***
Sorry this looks equally bad...
The attached JPG file is a little bit better...
Same difference... just buy more space or delete unneeded files. It's a small db to restore in the first place.
February 4, 2011 at 11:00 am
Sorry, I should have mentioned this earlier.
I just ran the script against another database to see how it worked.
This is not the real database that is giving me the problem.
Just allow me a few hours and I will get back and post the values that I get.
February 4, 2011 at 11:03 am
mw112009 (2/4/2011)
Sorry, I should have mentioned this earlier.
I just ran the script against another database to see how it worked.
This is not the real database that is giving me the problem.
Just allow me a few hours and I will get back and post the values that I get.
You should be all set. I hope you have something that cascade deletes the data... or you'll have a mountain of work just to get this cleaning to work.
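For anyone unfamiliar with cascade deletes: a foreign key declared with ON DELETE CASCADE makes SQL Server remove the child rows automatically when the parent row is deleted. A hypothetical parent/child pair (table names are illustrative only, not from this database):

```sql
-- Hypothetical schema showing ON DELETE CASCADE:
CREATE TABLE dbo.Parent (ParentID int PRIMARY KEY);
CREATE TABLE dbo.Child
(
    ChildID  int PRIMARY KEY,
    ParentID int NOT NULL
        REFERENCES dbo.Parent (ParentID) ON DELETE CASCADE
);

-- Deleting a parent row also deletes its Child rows:
DELETE FROM dbo.Parent WHERE ParentID = 1;
```

Without cascading FKs, you have to delete from every child table yourself, in dependency order, which is the "mountain of work" referred to above.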
February 4, 2011 at 12:02 pm
Before I get home I need to be familiar with the next script which shrinks the database file sizes.
So before we move on... here is the output from the first script.
Although the numbers don't match my database the file names are the same.
File Size   Used      Unused    Type  DBFileName
3550.19     317.72    3232.47   Log   NextGen_Log
207.13      176.94    30.19     Data  NextGen_System_Data
4911.50     4505.25   406.25    Data  NextGen_Core_Data_1
5235.69     4431.19   804.50    Data  NextGen_Index_1
13904.51    9431.10   4473.41   *** Total for all files ***
Moving on now to the Shrink Database Script.
-- Shrink_DB_File.sql
/*
This script is used to shrink a database file in
increments until it reaches a target free space limit.
Run this script in the database with the file to be shrunk.
1. Set @DBFileName to the name of database file to shrink.
2. Set @TargetFreeMB to the desired file free space in MB after shrink.
3. Set @ShrinkIncrementMB to the increment to shrink file by in MB
4. Run the script
*/
declare @DBFileName sysname
declare @TargetFreeMB int
declare @ShrinkIncrementMB int
-- Set Name of Database file to shrink
set @DBFileName = 'MyDatabaseFileName'
-- Set Desired file free space in MB after shrink
set @TargetFreeMB = 1000
-- Set Increment to shrink file by in MB
set @ShrinkIncrementMB = 50
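The body of the linked script is not pasted above; a minimal sketch of the incremental-shrink idea it describes might look like the following (continuing from the three variables already set, and using the same pages-to-MB conversion as the space report):

```sql
-- Sketch only: shrink the file in @ShrinkIncrementMB steps until the
-- free space inside it is down to roughly @TargetFreeMB.
declare @CurrentSizeMB int
declare @UsedMB int

select  @CurrentSizeMB = size * 8 / 1024,
        @UsedMB = FILEPROPERTY(name, 'SpaceUsed') * 8 / 1024
from    sys.database_files
where   name = @DBFileName

-- Keep shrinking while another full increment still leaves the target free space
while @CurrentSizeMB - @ShrinkIncrementMB >= @UsedMB + @TargetFreeMB
begin
    set @CurrentSizeMB = @CurrentSizeMB - @ShrinkIncrementMB
    dbcc shrinkfile (@DBFileName, @CurrentSizeMB)
end
```

Shrinking in small increments rather than one big DBCC SHRINKFILE call keeps each operation short and interruptible on a busy server.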
---------------------
MY QUESTIONS
1.) Do I have to run this script 4 times, once for each file (1 transaction log, 1 index file, 2 data files)?