January 25, 2011 at 8:32 am
Dear All,
At present our database size is more than 10 GB. Autogrowth is set to 5% of the database size, although fixed growth is preferred by Microsoft.
Because the autogrow event is happening very often during the day, we are facing performance issues. So we were thinking of adding a growth step during the night.
We'd like to add this step after the shrink in the maintenance plan.
Could you please help with this? How do we create it?
January 25, 2011 at 8:35 am
First things first: Remove the shrink step in the maintenance plan.
That'll help more than just about anything else you can do.
The database needs to be a certain size for its day-to-day use. It's autogrowing up to that. Shrinking it nightly (or at all) just means it has to grow again. That's what you're running into.
Every day that it does that, your files are getting more and more fragmented, your performance is getting worse, and you're making the indexes fragment more and more as well.
Get rid of the shrink step in the maintenance plan, and then start looking at how big the database really needs to be.
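Also, while you're at it: a 5% growth increment on a 10 GB database means every autogrow event is bigger than the last. A fixed increment is more predictable. Something like this (I'm using MyDB, MyDB_Data, and MyDB_Log as placeholder names - check sys.database_files for your actual logical file names):

```sql
-- Switch both files from percentage growth to a fixed increment.
ALTER DATABASE MyDB
    MODIFY FILE (NAME = MyDB_Data, FILEGROWTH = 200MB);
ALTER DATABASE MyDB
    MODIFY FILE (NAME = MyDB_Log, FILEGROWTH = 60MB);
```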
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
January 25, 2011 at 9:02 am
Regular SHRINK of a database is a very bad thing - see my article, and the lively discussion that followed: http://www.sqlservercentral.com/articles/SHRINKFILE/71414/ and Paul Randal's blog entry: http://www.sqlskills.com/BLOGS/PAUL/post/A-SQL-Server-DBA-myth-a-day-(930)-data-file-shrink-does-not-affect-performance.aspx.
January 26, 2011 at 2:59 am
Thank you so much. Your support is appreciated.
After checking the disk usage report, we think we should grow the DATA file by 200 MB and the LOG file by 60 MB.
So we would like to create a job that runs at midnight.
Could you please tell me the script for creating this job step.
January 26, 2011 at 3:56 am
bhargavgampala (1/26/2011)
Thank you so much. Your support is appreciated. After checking the disk usage report, we think we should grow the DATA file by 200 MB and the LOG file by 60 MB.
So we would like to create a job that runs at midnight.
Could you please tell me the script for creating this job step.
http://stackoverflow.com/questions/592540/how-do-i-grow-a-database-file
(I found this URL after a quick Google search)
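The gist of it is an ALTER DATABASE ... MODIFY FILE with a new SIZE. A rough sketch, using MyDB and MyDB_Data as placeholder names (note that SIZE is the new total size of the file, not the amount added, and it must be larger than the current size):

```sql
-- Grow the data file to a new total size of 10.5 GB, for example.
-- SIZE is the target size of the file, not an increment.
ALTER DATABASE MyDB
    MODIFY FILE (NAME = MyDB_Data, SIZE = 10752MB);
```

You can put that statement in a SQL Agent job step and schedule it for midnight.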
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
January 26, 2011 at 3:23 pm
bhargavgampala (1/26/2011)
Thank you so much. Your support is appreciated. After checking the disk usage report, we think we should grow the DATA file by 200 MB and the LOG file by 60 MB.
So we would like to create a job that runs at midnight.
Could you please tell me the script for creating this job step.
Did you not read the message from GSquared? Remove the shrink step from your maintenance plan. That is what is causing all of your problems.
Once that has been removed, the database will grow the next day and shouldn't need to grow again. The same goes for the log file, as long as you have frequent transaction log backups or your database is in the simple recovery model (read the article I link to in my signature).
Once you know how large the database needs to be, manually grow the file a bit larger so you have space in the data file for several months of growth. Then monitor that usage, and when free space gets low, schedule a time to manually add more space.
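To monitor that usage, something like this shows used vs. allocated space per file (run it in the context of the database itself):

```sql
-- FILEPROPERTY returns 8 KB pages; divide by 128 to get MB.
SELECT name,
       size / 128                                     AS allocated_mb,
       FILEPROPERTY(name, 'SpaceUsed') / 128          AS used_mb,
       (size - FILEPROPERTY(name, 'SpaceUsed')) / 128 AS free_mb
FROM sys.database_files;
```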
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
January 26, 2011 at 4:51 pm
Remove the shrink step in the maintenance plan, as the experts have already mentioned. Shrinking causes a lot of external fragmentation and slows down performance.
Thank You,
Best Regards,
SQLBuddy