August 7, 2006 at 8:59 am
I need some help with setting up a job to truncate the log file.
I've inherited the SQL database at my job, and my database recovery model is set to Full so that I can run transaction log backups. However, my log file keeps growing until it locks me up. I need a way to automatically truncate the log file....I think!
Thanks for any help
August 7, 2006 at 1:12 pm
The only correct way to do this is to regularly back up the transaction log. Increase the frequency of backups, or increase the size of your transaction log if it can't handle the number of transactions in a reasonable time.
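As a minimal sketch of what such a scheduled backup step might run (the database name and path here are placeholders, not from the original post):

```sql
-- Back up the transaction log on a schedule; NOINIT appends to the
-- existing backup set instead of overwriting it.
BACKUP LOG MyDB TO DISK = 'f:\Backups\MyDB_log.bak' WITH NOINIT
```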
August 8, 2006 at 3:55 pm
DUMP TRANSACTION base_name with no_log
GO
DBCC SHRINKDATABASE (N'base_name', 0,TRUNCATEONLY)
this shrinks your log and frees up the file space.
and after any heavy workload, try:
backup log base_name with TRUNCATE_ONLY
this frees up the log
regards,
August 9, 2006 at 3:11 am
Norman
I don't really agree with what Andrés suggests. Basically what he's saying is that you should run the first script to free space and shrink the log, then after any heavy load you should run the second script.
Much better, as Wesley says, to back up the log frequently. Set the log to the largest size it will reach and leave it. Continual shrinking causes fragmentation and consumes server resources unnecessarily. If you need to be able to recover to a point in time or later than your last full or differential backup then you need to back up the log to disk instead of just truncating it. If you are short of disk space then you can look at ways of getting the dump files on to tape or another server as soon as they are made. You can also set up a job that will dump or truncate the log when it gets to a certain percentage full (80%, for example). If you need help with that, let me know.
John
August 9, 2006 at 6:58 am
Norm
This is how you set up the job to back up the log automatically:
(1) In Enterprise Manager, expand Management, SQL Server Agent, right-click on Jobs and choose New Job
(2) Give the job a name then click on the Steps tab and click on New.
(3) Give the step a name, verify that the database is master and type in the command, something like BACKUP LOG MyDB TO DISK='f:\MyFolder\MyBackup.bak' WITH NOINIT
(4) Click on OK then on the Schedules tab
(5) Click on New Alert
(6) Give the alert a name and change the Type to SQL Server performance condition alert
(7) Change the Object to SQLServer:Databases and the Counter to Percent Log Used. Change Instance to your database name and change Alert if counter to rises above
(8) Choose a number to use as your threshold in Value - we usually use 80.
(9) Click on OK and OK to save the job
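The steps above can also be scripted instead of clicking through Enterprise Manager. This is only a sketch using the msdb job and alert procedures; the job name, database name, and backup path are placeholders:

```sql
USE msdb
GO
-- Create the job and its single backup step
EXEC sp_add_job @job_name = N'Backup MyDB log'
EXEC sp_add_jobstep @job_name = N'Backup MyDB log',
    @step_name = N'Back up log',
    @subsystem = N'TSQL',
    @database_name = N'master',
    @command = N'BACKUP LOG MyDB TO DISK=''f:\MyFolder\MyBackup.bak'' WITH NOINIT'
EXEC sp_add_jobserver @job_name = N'Backup MyDB log'
-- Fire the job when the log passes 80 percent full
EXEC sp_add_alert @name = N'MyDB log 80 percent full',
    @performance_condition = N'SQLServer:Databases|Percent Log Used|MyDB|>|80',
    @job_name = N'Backup MyDB log'
```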
If disaster ever does strike, your first action should be to attempt to back up the transaction log. If that succeeds, you have a good chance of recovering your database without any data loss at all.
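A rough sketch of that tail-of-the-log backup (again, the database name and path are placeholders):

```sql
-- NO_TRUNCATE allows the log to be backed up even when the data files
-- are damaged, which is what makes zero-data-loss recovery possible.
BACKUP LOG MyDB TO DISK = 'f:\MyFolder\MyDB_taillog.bak' WITH NO_TRUNCATE
```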
John
August 9, 2006 at 12:25 pm
I set transaction log backups to occur every hour, and in EM, set SQL to delete backups older than 3 days. I have a Scheduled Task on the server which runs hourly, copying the backups to a large disk system where they are archived. (We must save the logs as part of HIPAA compliance, which is a health information law.)
Use XCOPY to do this with switches to only copy files that are new:
xcopy "D:\MSSQL\Data\Backup\*.*" "\\server\backupfolder\*.*" /D /S /Y /C
(obviously, change the paths to what is appropriate for your network.)
So long, and thanks for all the fish,
Russell Shilling, MCDBA, MCSA 2K3, MCSE 2K3
August 16, 2006 at 8:53 am
Hi,
I've been using the following to reduce the size of the log.
CHECKPOINT
GO
BACKUP LOG <log_name> WITH TRUNCATE_ONLY
GO
DBCC SHRINKFILE(2) -- 2 is the file ID of the log file
GO
The problem is that I need to do this for every log. Does anyone know of a way to create a single job for all database logs?
Thanks
Bruno Sousa