Monitoring Database Growth

  • Hi all

    I am currently dealing with a database that is heavily fragmented, both on the indexes and on the physical disk. I am going to put something in place to rebuild the indexes and defrag the disk; however, I believe the cause of the issue is that the database is set to auto-grow in 1 MB increments. Seeing as the database is 5 GB, this is probably not an ideal growth amount.

    I want to set the initial size of the database to be larger, but I need to gather stats on how often the database grows and by how much, so I can expand the database accordingly.

    Can anyone suggest a best practice for gathering these kinds of stats?

    Thanks

    There are quite a few topics where people have asked similar questions. Depending on your backup strategy, you can look at previous backups over a certain period of time and work out the growth. That is a pretty simplistic approach, but it will give you a basic idea (a quick sketch is below).

    Waiting for others to post links to the other topics I cannot find 🙂
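
    As a rough sketch of that approach (the database name is a placeholder, and backup_size reflects the data backed up rather than the physical file size), something like this against msdb gives you a size trend from full-backup history:

    SELECT  b.database_name,
            b.backup_start_date,
            CAST(b.backup_size / 1048576.0 AS dec(15,2)) AS backup_size_mb
    FROM    msdb.dbo.backupset AS b
    WHERE   b.database_name = 'MyDatabase'   -- placeholder database name
            AND b.type = 'D'                 -- full backups only
    ORDER BY b.backup_start_date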


    Try this query; it returns all the auto-growth events from the last day recorded in the default trace.

    SELECT  CAST(TE.[Name] AS varchar(20))          AS [Name],
            CAST(I.LoginName AS varchar(20))        AS LoginName,
            CAST(I.SessionLoginName AS varchar(20)) AS SessionLoginName,
            CAST(I.DatabaseName AS varchar(20))     AS DatabaseName,
            CAST(I.[FileName] AS varchar(20))       AS [FileName],
            CAST(I.ApplicationName AS varchar(20))  AS ApplicationName,
            I.StartTime,
            I.EndTime,
            I.Duration
    FROM    sys.traces AS T
    CROSS APPLY sys.fn_trace_gettable(T.[path], T.max_files) AS I
    JOIN    sys.trace_events AS TE ON I.EventClass = TE.trace_event_id
    WHERE   T.is_default = 1
            AND TE.[Name] LIKE '%Auto Grow%'
            AND I.StartTime > GETDATE() - 1
    ORDER BY I.StartTime DESC

    Adjust it according to your own situation.
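
    One thing to keep in mind is that the default trace files roll over, so if you want history beyond a few days you could copy the events into your own table on a schedule. A rough sketch, with an invented table name:

    -- One-time setup: table to hold auto-grow history (name is illustrative)
    CREATE TABLE dbo.AutoGrowHistory
    (
        EventName    varchar(40),
        DatabaseName sysname,
        [FileName]   sysname,
        StartTime    datetime,
        Duration     bigint
    )

    -- Scheduled step: append any events newer than the last one captured
    INSERT INTO dbo.AutoGrowHistory (EventName, DatabaseName, [FileName], StartTime, Duration)
    SELECT  TE.[Name], I.DatabaseName, I.[FileName], I.StartTime, I.Duration
    FROM    sys.traces AS T
    CROSS APPLY sys.fn_trace_gettable(T.[path], T.max_files) AS I
    JOIN    sys.trace_events AS TE ON I.EventClass = TE.trace_event_id
    WHERE   T.is_default = 1
            AND TE.[Name] LIKE '%Auto Grow%'
            AND I.StartTime > ISNULL((SELECT MAX(StartTime) FROM dbo.AutoGrowHistory), '19000101')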

    I've used this query in a daily job to populate a database size history table that keeps 30 days of size data, so I can get an idea of growth rates. It will give you some variables you can use to insert a record into a history table (a sketch of that insert follows the script). It's just cut down from sp_spaceused.

    -- Cut down from sp_spaceused: gathers reserved/data/index/unused sizes (in KB)
    declare @id           int           -- The object id of @objname.
    declare @type         character(2)  -- The object type.
    declare @pages        int           -- Working variable for size calc.
    declare @dbname       sysname
    declare @dbsize       dec(15,0)
    declare @logsize      dec(15)
    declare @bytesperpage dec(15,0)
    declare @pagesperMB   dec(15,0)
    declare @iReserved    int
    declare @iData        int
    declare @iIndexSize   int
    declare @iUnUsed      int

    dbcc updateusage(0) with no_infomsgs

    create table #spt_space
    (
        rows     int     null,
        reserved dec(15) null,
        data     dec(15) null,
        indexp   dec(15) null,
        unused   dec(15) null
    )

    -- Data file size, in pages (status bit 64 off = data file)
    select @dbsize = sum(convert(dec(15), size))
    from dbo.sysfiles
    where (status & 64 = 0)

    -- Log file size, in pages (status bit 64 on = log file)
    select @logsize = sum(convert(dec(15), size))
    from dbo.sysfiles
    where (status & 64 <> 0)

    -- Bytes per page (8192)
    select @bytesperpage = low
    from master.dbo.spt_values
    where number = 1
      and type = 'E'

    select @pagesperMB = 1048576 / @bytesperpage

    print ' '

    /*
    ** Now calculate the summary data.
    ** reserved: sum(reserved) where indid in (0, 1, 255)
    */
    insert into #spt_space (reserved)
    select sum(convert(dec(15), reserved))
    from sysindexes
    where indid in (0, 1, 255)

    /*
    ** data: sum(dpages) where indid < 2
    **     + sum(used) where indid = 255 (text)
    */
    select @pages = sum(convert(dec(15), dpages))
    from sysindexes
    where indid < 2

    select @pages = @pages + isnull(sum(convert(dec(15), used)), 0)
    from sysindexes
    where indid = 255

    update #spt_space
    set data = @pages

    /* index: sum(used) where indid in (0, 1, 255) - data */
    update #spt_space
    set indexp = (select sum(convert(dec(15), used))
                  from sysindexes
                  where indid in (0, 1, 255))
                 - data

    /* unused: sum(reserved) - sum(used) where indid in (0, 1, 255) */
    update #spt_space
    set unused = reserved
               - (select sum(convert(dec(15), used))
                  from sysindexes
                  where indid in (0, 1, 255))

    -- Convert pages to KB (low = bytes per page)
    select @iReserved  = ltrim(str(reserved * d.low / 1024., 15, 0)),
           @iData      = ltrim(str(data * d.low / 1024., 15, 0)),
           @iIndexSize = ltrim(str(indexp * d.low / 1024., 15, 0)),
           @iUnUsed    = ltrim(str(unused * d.low / 1024., 15, 0))
    from #spt_space, master.dbo.spt_values d
    where d.number = 1
      and d.type = 'E'

    drop table #spt_space

    select @iReserved, @iData, @iIndexSize, @iUnUsed
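
    For the history table itself, something along these lines would work at the end of the script above; the table and column names here are just an example, not the exact ones from my job:

    -- One-time setup: history table (names are illustrative)
    create table dbo.DatabaseSizeHistory
    (
        CaptureDate  datetime default getdate(),
        DatabaseName sysname,
        ReservedKB   int,
        DataKB       int,
        IndexKB      int,
        UnusedKB     int
    )

    -- Daily step, run at the end of the script above (the @i... values are in KB)
    insert into dbo.DatabaseSizeHistory (DatabaseName, ReservedKB, DataKB, IndexKB, UnusedKB)
    select db_name(), @iReserved, @iData, @iIndexSize, @iUnUsed

    -- Keep roughly 30 days of history
    delete from dbo.DatabaseSizeHistory
    where CaptureDate < dateadd(day, -30, getdate())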

  • In yesterday's thread on this topic, I posted a couple of scripts I use to look at overall growth:

    http://www.sqlservercentral.com/Forums/Topic781348-146-1.aspx

    Hi, thanks for the replies.

    Running that script, I see 36 auto-grow events for the last day, which seems a little excessive. I am also going to monitor the database size nightly to see what the difference is, and hopefully I will be able to work out a decent size for my database.
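
    For the nightly check, something like this is what I have in mind (the database name is just a placeholder and the query is only a sketch):

    -- Nightly snapshot of file sizes; size is stored in 8 KB pages, so /128 gives MB
    select  getdate()            as CaptureDate,
            db_name(database_id) as DatabaseName,
            type_desc            as FileType,
            name                 as LogicalFileName,
            size / 128           as SizeMB
    from    sys.master_files
    where   database_id = db_id('MyDatabase')   -- placeholder database name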

    Thanks

    Keep in mind when setting the size that you want enough free space to accommodate at least three months or so of growth without an auto-grow.

    If you are currently growing 36 MB a day (36 auto-grows at 1 MB each), then over three months you are looking at roughly 3 GB of additional space (36 MB × ~90 days ≈ 3.2 GB). That is assuming your growth rate remains static.

    Jason...AKA CirqueDeSQLeil
