June 16, 2009 at 9:34 am
Hi,
I’m using the script below to get the size of the data file, the log file, and how much free space is left. I have scheduled a daily job that appends the results to an output file, and from there I analyze database growth.
sp_msforeachdb 'USE [?]
select db_name() as db_name,
    name,
    convert(float, size) * (8192.0/1048576) as File_Size,
    convert(float, fileproperty(name, ''SpaceUsed'')) * (8192.0/1048576) as MB_Used,
    (convert(float, size) - convert(float, fileproperty(name, ''SpaceUsed''))) * (8192.0/1048576) as MB_Free
from sysfiles
order by fileproperty(name, ''IsLogFile'')'
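For daily growth monitoring (question 1 below), it is usually easier to analyze the history if the results land in a table rather than a text file. A minimal sketch, assuming a central admin database and table that you would create yourself (`AdminDB` and `dbo.FileSizeHistory` are illustrative names, not part of the original script):

```sql
-- Hypothetical history table; names are illustrative.
CREATE TABLE dbo.FileSizeHistory (
    CaptureDate  datetime NOT NULL DEFAULT GETDATE(),
    DatabaseName sysname  NOT NULL,
    FileName     sysname  NOT NULL,
    File_Size_MB float,
    MB_Used_MB   float,
    MB_Free_MB   float
);

-- Same calculation as the script above, appended to the history table
-- instead of an output file (AdminDB is an assumed central database).
EXEC sp_msforeachdb 'USE [?];
INSERT INTO AdminDB.dbo.FileSizeHistory
    (DatabaseName, FileName, File_Size_MB, MB_Used_MB, MB_Free_MB)
SELECT db_name(), name,
    CONVERT(float, size) * (8192.0/1048576),
    CONVERT(float, FILEPROPERTY(name, ''SpaceUsed'')) * (8192.0/1048576),
    (CONVERT(float, size) - CONVERT(float, FILEPROPERTY(name, ''SpaceUsed''))) * (8192.0/1048576)
FROM sysfiles'
```

Scheduled from the same daily job, this lets you query growth trends per database and per file with a simple GROUP BY on CaptureDate.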
Sample Results:

db_name  name      File_Size   MB_Used      MB_Free
ABC      Abc_data  16350       14572.625    1777.375
ABC      Abc_log   2000        31.65625     1968.34375
Xyz      xyz_data  16650       14788.4375   1861.5625
Xyz      Xyz_log   7405.9375   55.023       7350.91
I have a couple of questions here:
1. Can we use the above script for database growth monitoring on a daily basis?
2. In the above results, the log file has 7350.91 MB free. Can we reduce that size? The log size stays the same, but the data file usage (MB_Used) is increasing rapidly, and I'm growing the data file by 2 GB every week.
3. With this script I can track database size growth, but how can I find out which tables are growing within a particular database?
Please advise.
thanks
June 16, 2009 at 9:40 am
There are some good scripts for database and table size monitoring on this site. Please navigate to the Scripts section to find them.
June 16, 2009 at 10:04 am
Try this for table growth. It will give you the number of rows and the table size in each database.
sp_msforeachdb @command1 = 'USE [?];
DECLARE @top int
DECLARE @include_system_tables bit
SET @top = 10
SET @include_system_tables = 0
BEGIN
IF @top > 0
    SET ROWCOUNT @top
SELECT cast(db_name() as varchar(15)) as dbname,
    cast([Table Name] as varchar(25)) as TableName,
    (SELECT rows FROM sysindexes s WHERE s.indid < 2 AND s.id = OBJECT_ID(a.[Table Name])) AS [Row count],
    [Total space used (MB)]
FROM
(
    SELECT QUOTENAME(USER_NAME(o.uid)) + ''.'' + QUOTENAME(OBJECT_NAME(i.id)) AS [Table Name],
        CONVERT(numeric(15,2), (((CONVERT(numeric(15,2), SUM(i.reserved)) * (SELECT low FROM master.dbo.spt_values (NOLOCK) WHERE number = 1 AND type = ''E'')) / 1024.) / 1024.)) AS [Total space used (MB)]
    FROM sysindexes i (NOLOCK)
    INNER JOIN sysobjects o (NOLOCK)
        ON i.id = o.id
        AND ((@include_system_tables = 1 AND o.type IN (''U'', ''S'')) OR o.type = ''U'')
        AND ((@include_system_tables = 1) OR (OBJECTPROPERTY(i.id, ''IsMSShipped'') = 0))
    WHERE indid IN (0, 1, 255)
    GROUP BY QUOTENAME(USER_NAME(o.uid)) + ''.'' + QUOTENAME(OBJECT_NAME(i.id))
) as a
ORDER BY [Total space used (MB)] DESC
SET ROWCOUNT 0
END'
I use this weekly just to check what's happening where.
To reduce the log size, search for DBCC SHRINKFILE and DBCC SHRINKDATABASE in BOL.
Maninder
www.dbanation.com
June 17, 2009 at 3:44 am
You say the used space for the transaction logs never grows, but you are only running this script once a day.
I have scripts that monitor log space, but I run them every hour so I can see the log space grow and shrink throughout the day. You might find, if you ran it more frequently, that some of the log space is being used. Monitor it for a month and then shrink the log to a size more appropriate for the actual activity.
By the way, yes, you can reduce the size of log files. The best method is DBCC SHRINKFILE, but only do this once you think you know what the correct log size should be.
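As a concrete illustration of the advice above (database and file names are made up for the example, and the 2 GB target is just a placeholder you would derive from your own monitoring):

```sql
-- Illustrative only: shrink the log of a hypothetical database Abc
-- down to 2048 MB, a target chosen after monitoring actual log usage.
USE Abc;
DBCC SHRINKFILE (Abc_log, 2048);  -- second argument is the target size in MB
```

The key point stands: pick the target size from observed peak log activity, not from whatever happens to be free today, or the log will simply grow back.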
June 17, 2009 at 10:05 am
I have scripts that monitor the log space but I run these every hour so you can see the log space grow and shrink throughout the day, you might find if you ran it more frequently that some of the log space is used, monitor it for a month and then shrink the log to a size more appropriate for the actual activity.
Can I have that script, please?
June 18, 2009 at 2:33 am
The script is quite complicated, as it does a lot more than just get the log size (the other parts generate alerts, populate tables, log other metrics and so on), and it would need a lot of explaining that I haven't got time for at the moment.
The easiest way to get log size stats is to run DBCC SQLPERF(LOGSPACE), add the results to a table, and then generate an alert when a log gets over a certain percentage full. The stats you gather in that table let you look back historically to see when the files grow, and you can also use them for capacity planning.
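The approach described above can be sketched as follows. The table name and the 80% threshold are illustrative choices, not part of the original poster's setup; DBCC SQLPERF(LOGSPACE) returns one row per database with the log size, percentage used, and a status column.

```sql
-- Sketch: capture DBCC SQLPERF(LOGSPACE) output into a history table.
-- Table name and alert threshold are assumptions for the example.
CREATE TABLE dbo.LogSpaceHistory (
    CaptureDate     datetime NOT NULL DEFAULT GETDATE(),
    DatabaseName    sysname,
    LogSizeMB       float,
    LogSpaceUsedPct float,
    Status          int
);

-- Run hourly: INSERT ... EXEC captures the DBCC result set.
INSERT INTO dbo.LogSpaceHistory (DatabaseName, LogSizeMB, LogSpaceUsedPct, Status)
EXEC ('DBCC SQLPERF(LOGSPACE)');

-- Example alert query: logs more than 80% full in the last capture hour.
SELECT DatabaseName, LogSizeMB, LogSpaceUsedPct
FROM dbo.LogSpaceHistory
WHERE CaptureDate >= DATEADD(hour, -1, GETDATE())
  AND LogSpaceUsedPct > 80;
```

A month of rows in a table like this is exactly what the earlier post suggests using to pick a sensible log size before shrinking.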
Hope this helps.
June 18, 2009 at 6:24 pm
If you have text data in your tables, there is a good chance that table size will not be reflected properly by the script posted above.
June 19, 2009 at 12:45 am
barsuk (6/18/2009)
if you have text data in your table, there is a good chance that table size will not be reflected properly under the script posted above
Can you provide more information on this? Do you have a link to an article or a KB article? And if text data gives an inaccurate size, how would you measure transaction log size?
June 19, 2009 at 5:23 pm
It is based on my personal experience. I used a similar procedure for some time. Once one of the tables started getting more and more text data, the report started providing increasingly inaccurate figures compared to simply running sp_spaceused against that table, so I ended up re-writing my procedure to use sp_spaceused.
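For reference, the per-table check described above looks like this (the table name is made up for the example; the @updateusage option asks SQL Server to correct stale space counts before reporting):

```sql
-- Report reserved, data, index and unused space for one table.
-- 'dbo.MyTextTable' is a hypothetical name; @updateusage = 'true'
-- fixes inaccurate usage counts first, which is what makes this more
-- reliable than summing sysindexes.reserved for text-heavy tables.
EXEC sp_spaceused @objname = 'dbo.MyTextTable', @updateusage = 'true';
```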
June 19, 2009 at 9:54 pm
Can you provide more information on how to produce the database names, database owners and sizes in a single output?