May 5, 2008 at 8:54 am
My db file is 169 MB, but the log file for that db is 5.56 GB! I want to reduce the size of my log file considerably. I have tried DBCC SHRINKDATABASE and DBCC SHRINKFILE, but nothing in the way of size reduction seems to happen. What can I do?
May 5, 2008 at 8:59 am
As long as the DB is in full recovery mode, the log file will stay that size (and more) until a full backup has completed...
You could also change the recovery model (look in BOL for more on that); just make sure you know that you will not need to recover from the logs.
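For example, something along these lines will show the current recovery model and, if you are certain you do not need point-in-time recovery, switch to SIMPLE (a minimal sketch; 'MyDatabase' is a placeholder, and sys.databases assumes SQL Server 2005 or later):

-- 'MyDatabase' is a placeholder; substitute your own database name.
SELECT name, recovery_model_desc   -- on SQL 2000, use DATABASEPROPERTYEX('MyDatabase', 'Recovery')
FROM sys.databases
WHERE name = 'MyDatabase'

-- Only if point-in-time recovery is not required:
ALTER DATABASE MyDatabase SET RECOVERY SIMPLE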
-- Cory
May 5, 2008 at 9:11 am
You may have to run this procedure more than once, because SQL Server can only shrink the log up to the last active log portion (the snippet after the script shows how to check how much of the log is actually in use).
Source URLs are included.
Play it safe: take a full backup up front and afterwards!
---------------------------
-- Shrink_TrxLog.SQL
--INF: How to Shrink the SQL Server 7.0 Transaction Log
-- SQL7 http://support.microsoft.com/support/kb/articles/q256/6/50.asp?id=256650&SD
-- SQL7 http://www.support.microsoft.com/kb/256650
-- SQL2000 http://support.microsoft.com/kb/272318/en-us
-- SQL2005 http://support.microsoft.com/kb/907511/en-us
-- select db_name()
-- select * from sysfiles
-- THIS SCRIPT IS NOT INTENDED FOR SCHEDULED USAGE !! READ BOL and the urls !!
SET NOCOUNT ON

DECLARE @LogicalFileName sysname,
        @MaxMinutes INT,
        @NewSize INT

-- *** MAKE SURE TO CHANGE THE NEXT 3 LINES WITH YOUR CRITERIA. ***
SELECT @LogicalFileName = 'logicalname',  -- Use sp_helpfile to identify the logical file name that you want to shrink.
       @MaxMinutes = 10,                  -- Limit on time allowed to wrap log.
       @NewSize = 100                     -- in MB

-- Setup / initialize
DECLARE @OriginalSize int

SELECT @OriginalSize = size  -- in 8K pages
FROM sysfiles
WHERE name = @LogicalFileName

SELECT 'Original Size of ' + db_name() + ' LOG is ' +
       CONVERT(VARCHAR(30), @OriginalSize) + ' 8K pages or ' +
       CONVERT(VARCHAR(30), (@OriginalSize * 8 / 1024)) + 'MB'
FROM sysfiles
WHERE name = @LogicalFileName

CREATE TABLE DummyTrans
    (DummyColumn char(8000) not null)

-- Wrap log and truncate it.
DECLARE @Counter INT,
        @StartTime DATETIME,
        @TruncLog VARCHAR(255)

SELECT @StartTime = GETDATE(),
       @TruncLog = 'BACKUP LOG ' + db_name() + ' WITH TRUNCATE_ONLY'

-- Try an initial shrink.
DBCC SHRINKFILE (@LogicalFileName, @NewSize)
EXEC (@TruncLog)

-- Wrap the log if necessary.
WHILE @MaxMinutes > DATEDIFF(mi, @StartTime, GETDATE())  -- time has not expired
      AND @OriginalSize = (SELECT size FROM sysfiles WHERE name = @LogicalFileName)  -- the log has not shrunk
      AND (@OriginalSize * 8 / 1024) > @NewSize  -- The value passed in for new size is smaller than the current size.
BEGIN  -- Outer loop.
    SELECT @Counter = 0

    WHILE ((@Counter < @OriginalSize / 16) AND (@Counter < 50000))
    BEGIN  -- update
        INSERT DummyTrans VALUES ('Fill Log')  -- Because it is a char field it inserts 8000 bytes.
        DELETE DummyTrans
        SELECT @Counter = @Counter + 1
    END  -- update

    EXEC (@TruncLog)  -- See if a trunc of the log shrinks it.
END  -- outer loop

SELECT 'Final Size of ' + db_name() + ' LOG is ' +
       CONVERT(VARCHAR(30), size) + ' 8K pages or ' +
       CONVERT(VARCHAR(30), (size * 8 / 1024)) + 'MB'
FROM sysfiles
WHERE name = @LogicalFileName

DROP TABLE DummyTrans

PRINT '*** Perform a full database backup ***'
SET NOCOUNT OFF
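A quick way to see how full the log actually is, and why SQL Server may be refusing to release space, is the following (a sketch; DBCC SQLPERF works on SQL 2000 and later, the sys.databases query assumes SQL 2005 or later):

-- Percentage of each database's log that is currently in use.
DBCC SQLPERF (LOGSPACE)

-- On SQL 2005 and later: what is preventing log truncation
-- (LOG_BACKUP, ACTIVE_TRANSACTION, REPLICATION, ...).
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases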
Johan
May 5, 2008 at 9:19 am
If you are in full recovery mode, and you should be if this data matters, you need to be running regular log backups in addition to full (data) backups.
You should work out what size log you need based on the frequency of your log backups, and then keep the log at that size with a little padding. Avoid shrinks if you can.
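For example, the core of a scheduled log backup is just this (a sketch only; the database name and path are placeholders, and in practice each backup file would get a unique, timestamped name from a SQL Agent job or maintenance plan):

-- Placeholder database name and path; schedule this to run, say, every 15-60 minutes.
BACKUP LOG MyDatabase
TO DISK = N'X:\SQLBackups\MyDatabase_log.trn'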
May 5, 2008 at 10:49 pm
Al,
that script you published is one I spent months getting rid of from our production servers. You really need to add a line that categorically states that it must NOT be run as a job, if you want to maintain any kind of recovery path on your database (or your career, if you run it on a production database). I hope you're not the original author, because I really came to hate that script, and as team lead I banished it from our site. The concept of using a BACKUP LOG ... WITH TRUNCATE_ONLY command on a production database... sends shivers down my spine as a professional DBA. MS finally got rid of that option in 2008.
Dapetrella,
what you really need to do is run T-Log backups regularly, even if you don't need the recoverability (in which case you could also set the database to Simple Recovery mode). Full Backups don't truncate the Log, only T-Log backups do.
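On SQL Server 2008 and later, where WITH TRUNCATE_ONLY no longer exists, the usual one-off equivalent is to drop to SIMPLE recovery, shrink, and then go back to FULL and restart the backup chain (a sketch only; the database name, logical log file name, target size and path are placeholders):

-- One-off only; this breaks the log backup chain, so the full backup afterwards is mandatory.
ALTER DATABASE MyDatabase SET RECOVERY SIMPLE
CHECKPOINT
DBCC SHRINKFILE (MyDatabase_log, 1024)   -- logical log file name, target size in MB
ALTER DATABASE MyDatabase SET RECOVERY FULL
BACKUP DATABASE MyDatabase TO DISK = N'X:\SQLBackups\MyDatabase_full.bak'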
May 5, 2008 at 11:54 pm
BACKUP LOG MYDATABASE WITH TRUNCATE_ONLY
DBCC SHRINKFILE ('Log', 10)
In the DBCC statement, 'Log' is the logical file name of the log file (it is often something like 'MYDATABASE_log'; check with sp_helpfile), and 10 is the target size in MB (change it according to your needs).
Sometimes files end up with large amounts of unused internal space, and then we need to shrink the file down to a specific size.
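A quick way to look up the correct logical name before shrinking (a sketch; the database name is a placeholder, and sys.database_files assumes SQL 2005 or later, with sp_helpfile giving the same information on older versions):

USE MyDatabase   -- placeholder database name
GO
-- Logical file names and current sizes in MB.
SELECT name, size * 8 / 1024 AS size_mb
FROM sys.database_files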
May 6, 2008 at 1:26 am
Simon Facer (5/5/2008)
Al, that script you published is one I spent months getting rid of from our production servers. You really need to add a line that categorically states that it must NOT be run as a job, if you want to maintain any kind of recovery path on your database
...
Indeed, that script is not intended to be scheduled.
I didn't pay attention to the fact that the OP is a new SSC user. :blush:
I'll add the comment to the script.
As you can see, the source URLs are contained in the script itself.
It is not advisable to shrink files on a regular basis.
SQL Server has a reason to extend files.
So one should only shrink files for specific (non-recurring) reasons.
One should read BOL regarding backup and recovery options before messing around with the db or its files.
I'm still hoping people use a little grain of common sense
and always test scripts before launching them into production.
Johan
May 6, 2008 at 8:09 am
Dapetrella,
How often are you performing transaction log and full backups? Do you perform differentials? If you have a scheduled transaction log backup, you should run it more frequently. If you do not perform transaction log backups, then you will want to create a scheduled backup. The frequency should be determined by business needs.
If I might suggest: before you shrink the log file, you should determine the size needed. Frequent shrinking of the log file will cause fragmentation and hurt performance. If you are doing frequent log backups, the log file does not need to be as large. If the log has grown because a log backup hasn't been performed in a while, then you might divide the log size by the number of days since the last backup and add about 20% to that figure. There are other formulas that are more specific.
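As a rough worked example of that rule of thumb (the numbers are only illustrative), taking the 5.56 GB log from the original post and assuming two weeks without a log backup:

-- Illustrative only: estimate a starting log size from the current size
-- and the number of days since the last log backup.
DECLARE @CurrentLogMB int, @DaysSinceLogBackup int
SELECT @CurrentLogMB = 5693,      -- roughly 5.56 GB
       @DaysSinceLogBackup = 14   -- assumed days since the last log backup
SELECT (@CurrentLogMB / @DaysSinceLogBackup) * 1.2 AS SuggestedLogMB   -- daily log churn plus ~20% headroom

With daily log backups that suggests a starting size of roughly 500 MB, scaled up if the backups are less frequent.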
Q
Please take a number. Now serving emergency 1,203,894
February 3, 2009 at 12:24 am
I have a database called Link. The data file is 443 MB and the log file is 104 GB, so restoring cannot be done. I want to reduce my log size from 104 GB to 40 GB. What can I do for that? Please send me an example.
February 3, 2009 at 12:31 am
kiranmca24 (2/3/2009)
I have a database called Link. The data file is 443 MB and the log file is 104 GB, so restoring cannot be done. I want to reduce my log size from 104 GB to 40 GB. What can I do for that? Please send me an example.
Did you read and test the script I posted above?
Johan