March 28, 2014 at 10:42 am
New Born DBA (3/28/2014)
Jeff Moden (3/28/2014)
New Born DBA (3/28/2014)
Come to think of it, I don't know how setting the initial size to 1000 MB and the growth to 1000 MB will help. The auto-growth of the log file is greater than 30GB and it is still getting filled.

Hold the phone... let's slow down a minute. According to the .png graphic you attached, your log file growth is set to 10%, not 30GB. And your log file was only 121MB.
You also said that the DBAs recently changed the Recovery Model to "simple", which means that it's impossible to back up the log files now.
As someone else stated, the initial settings for both the database and the log file are horrible and need to be changed.
All of that brings up some serious questions but the biggest question that I have is... why are YOU doing this type of stuff? The DBAs should be doing this for you.
I agree with you, but what if I told you we don't have any professional DBAs? One of my co-workers and I became DBAs just by learning on our own and reading different blogs.
Ah! Got it! Whole 'nuther story then. Thanks. I'll try to post tonight or tomorrow (I'm just checking from work right now) on how to fix some of the "damage" done by using the default settings for a database, and how/why to set up the MODEL database so you don't have to worry about such a thing so much in the future.
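As a rough preview of the MODEL piece: new databases inherit their file settings from the model database, so fixing model prevents the tiny-default problem going forward. A sketch (the sizes here are only placeholders; pick values that fit your workload):

```sql
-- New databases copy their file settings from model,
-- so sensible values here mean sensible defaults for every new database.
-- modeldev and modellog are model's default logical file names.
ALTER DATABASE model
MODIFY FILE (NAME = modeldev, SIZE = 1024MB, FILEGROWTH = 256MB);

ALTER DATABASE model
MODIFY FILE (NAME = modellog, SIZE = 512MB, FILEGROWTH = 256MB);
```

Note this only affects databases created after the change; existing databases still need their settings fixed individually.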
--Jeff Moden
Change is inevitable... Change for the better is not.
March 28, 2014 at 10:47 am
JoshDBGuy (3/28/2014)
Initial log file size is 121 MB and the autogrowth is 10 percent which currently equates to about 12 MB. Where do you see 30 GB?
I am sorry; what I meant was that the log file grows to 30 GB in 25 minutes. It happens every other day, or sometimes 2 or 3 times a week. I don't think we have that much space to allocate to the log file.
"He who learns for the sake of haughtiness, dies ignorant. He who learns only to talk, rather than to act, dies a hypocrite. He who learns for the mere sake of debating, dies irreligious. He who learns only to accumulate wealth, dies an atheist. And he who learns for the sake of action, dies a mystic."
March 28, 2014 at 11:02 am
New Born DBA (3/28/2014)
samalex (3/28/2014)
One issue is the potential of one log or data file filling up the entire drive, preventing the other data and log files from growing.

We already created a partition and put all the other mdf and ldf files on a separate drive.
Bear in mind, when people talk about putting the MDF and LDFs on separate drives, they're talking about separate *physical* drives, not "logical" drives.
Partitioning one big physical drive into two smaller logical drives won't provide any benefit, as the heads still have to bounce around to read the MDF and LDF.
Having two physical drives, one with the MDF and one with the LDF, will improve performance (how much? It depends...) because the drive with the LDF can now read/write independently of the drive with the MDF.
March 28, 2014 at 11:18 am
Thank you so much, guys, for helping me out here. I have one more question. I could start a new thread, but it's regarding the Tlog, so I thought why not ask here.
I take a full backup every day at 11 PM and then a Tlog backup every hour (no differential backups). I also delete the full backup that is 1 day old, and delete any Tlog backup that is more than 1 hour old. At any given time, I have one full backup and one Tlog backup. I don't know if I should keep the Tlog backups for 24 hours or delete them every hour. If I delete them every hour (which I am), can I still perform point-in-time recovery if, let's say, something breaks at 10 PM?
March 28, 2014 at 11:40 am
You want to maintain a consistent set of T-Log backups from your last full backup. So if you back up at, say, 6:00 PM every day, you're going to want all of the T-Log backups from 6:00 PM until 6:00 PM the next day. If you don't, and you only keep, say, the 5:00 PM backup, you're breaking the log chain and aren't going to be able to restore. Remember that each T-Log backup only contains the log records since the previous T-Log backup. Don't get confused between T-Logs and differentials: a differential backup contains all changes between the previous full backup and the current differential.
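To make the chain concrete, here is what a point-in-time restore looks like (a sketch; the database name, file paths, and times are placeholders). Every log backup from the full backup up to the failure point must still exist:

```sql
-- Restore the last full backup without recovering the database.
RESTORE DATABASE MyDB
FROM DISK = N'D:\Backups\MyDB_Full_2300.bak'
WITH NORECOVERY;

-- Apply every hourly log backup taken since that full backup, in order.
RESTORE LOG MyDB FROM DISK = N'D:\Backups\MyDB_Log_0000.trn' WITH NORECOVERY;
RESTORE LOG MyDB FROM DISK = N'D:\Backups\MyDB_Log_0100.trn' WITH NORECOVERY;
-- ...continue through the remaining hourly log backups...

-- Stop inside the final log backup, just before the failure, and recover.
RESTORE LOG MyDB
FROM DISK = N'D:\Backups\MyDB_Log_2200.trn'
WITH STOPAT = '2014-03-28 21:55:00', RECOVERY;
```

If any one of those hourly files has been deleted, the chain is broken at that point, and the restore can only go as far as the last file before the gap. That's why keeping only the most recent Tlog backup defeats point-in-time recovery.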
March 28, 2014 at 11:42 am
Thanks a lot.
March 28, 2014 at 11:44 am
New Born DBA (3/28/2014)
JoshDBGuy (3/28/2014)
Initial log file size is 121 MB and the autogrowth is 10 percent, which currently equates to about 12 MB. Where do you see 30 GB?

I am sorry; what I meant was that the log file grows to 30 GB in 25 minutes. It happens every other day, or sometimes 2 or 3 times a week. I don't think we have that much space to allocate to the log file.
So, have you been shrinking the log file and that's how it got back down to 121 MB?
--Jeff Moden
Change is inevitable... Change for the better is not.
March 28, 2014 at 11:46 am
Jeff Moden (3/28/2014)
So, have you been shrinking the log file and that's how it got back down to 121 MB?
I am not, but yes, they are being shrunk by another accidental DBA.
March 28, 2014 at 11:48 am
Grant Fritchey (3/28/2014)
JoshDBGuy (3/28/2014)
New Born DBA (3/28/2014)
JoshDBGuy (3/28/2014)
If it's simple recovery, after a transaction has been fully committed the space used should be reused. If it's in full recovery, you need to run T-log backups for the log to truncate. Now that it's in simple recovery, you should be fine unless the log file is 1 MB.

I am running a T-log backup every hour, but this job fills up the log file in half an hour. I don't know if setting up the T-log backup every 15 minutes would prevent it.

We did put the DB into Simple Recovery Mode, but now we are risking data loss because we won't be able to perform point-in-time recovery.
In environments with a lot of transactions I prefer to run my T-Log backup every 10 or 15 minutes. If you did run the log backup every 15 minutes and it takes about 30 minutes for the batches to complete, you should be fine as long as you tweak your log file size and autogrowth settings accordingly. What's the initial size of your log file?
Unless it's just a huge transaction. In which case, you either have to find a way to reduce the size of the transaction, or you have to have enough space in the log to support it.
Yep, based on his last post, it sounds like they are running a massive batch over a short time period. Probably a very inefficient transaction.
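One common way to cap the log impact of a job like that (a sketch; the table name, key column, and cutoff are made up, since we haven't seen the actual archive script) is to break the work into small batches, so no single transaction has to be held in the log:

```sql
-- Delete in small batches so each transaction stays small.
-- In SIMPLE recovery, the log space used by each batch becomes
-- reusable once that batch commits, so the log never has to hold
-- the whole 30 GB of work at once.
DECLARE @rows int = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (5000)
    FROM dbo.ArchiveSource
    WHERE CreatedDate < DATEADD(YEAR, -1, GETDATE());

    SET @rows = @@ROWCOUNT;
END;
```

The batch size (5000 here) is a tuning knob: smaller batches mean less log pressure per transaction but a longer total run.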
March 28, 2014 at 11:49 am
New Born DBA (3/28/2014)
Jeff Moden (3/28/2014)
So, have you been shrinking the log file and that's how it got back down to 121 MB?

I am not, but yes, they are being shrunk by another accidental DBA.
This isn't really the best way to handle this kind of process. Based on your current log and data file settings, I foresee some major problems. I see that you are running a batch; how many records are you archiving? Does the data contain blobs? (Look for text fields.)
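For reference, the current size and growth settings for every file in a database can be checked with a query like this (run it in the database in question; `sys.database_files` reports sizes in 8 KB pages):

```sql
-- Show each file's current size and autogrowth setting for the current database.
-- size and growth are stored in 8 KB pages; is_percent_growth = 1 means
-- the growth value is a percentage rather than a page count.
SELECT name,
       type_desc,
       size * 8 / 1024 AS size_mb,
       CASE WHEN is_percent_growth = 1
            THEN CAST(growth AS varchar(10)) + ' %'
            ELSE CAST(growth * 8 / 1024 AS varchar(10)) + ' MB'
       END AS growth_setting
FROM sys.database_files;
```

This is a quick way to confirm whether a file is still on the 10%-growth default that's causing trouble here.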
March 28, 2014 at 12:03 pm
JoshDBGuy (3/28/2014)
I see that you are doing a batch, how many records are you archiving? Does the data contain blobs? (look for text fields).
I don't know how many records are being archived and I don't think the data contains any blobs.
March 28, 2014 at 12:11 pm
New Born DBA (3/28/2014)
JoshDBGuy (3/28/2014)
I see that you are doing a batch; how many records are you archiving? Does the data contain blobs? (Look for text fields.)

I don't know how many records are being archived, and I don't think the data contains any blobs.
For it to be 30 GB requires quite a bit of data, I would say that you need to look at the script to see what it's doing.
March 28, 2014 at 12:22 pm
JoshDBGuy (3/28/2014)
For it to be 30 GB requires quite a bit of data, I would say that you need to look at the script to see what it's doing.
That's exactly what we need to figure out. Someone suggested running SQL Profiler to see what's going on, but I have heard that it's not good practice to run Profiler in a prod environment. Most DBAs recommend a server-side trace.
March 28, 2014 at 12:28 pm
A trace is easy. Start Profiler, pick the options you want, and then save it as a trace script file. Then execute that script on the server to run the trace.
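The script Profiler exports is built from the `sp_trace_*` procedures. A minimal hand-written version looks like this (a sketch; the file path is a placeholder, and a real scripted trace will register many more events and columns):

```sql
-- Create a server-side trace that writes to a file instead of the Profiler GUI.
-- sp_trace_create appends .trc to the path automatically.
DECLARE @TraceID int;
DECLARE @maxfilesize bigint = 100;  -- MB per trace file
DECLARE @on bit = 1;

EXEC sp_trace_create @TraceID OUTPUT, 0, N'E:\Traces\ArchiveJob', @maxfilesize;

-- Capture TextData (column 1) for SQL:BatchCompleted (event 12) as one example.
EXEC sp_trace_setevent @TraceID, 12, 1, @on;

-- Start the trace (status 1 = start, 0 = stop, 2 = close and delete the definition).
EXEC sp_trace_setstatus @TraceID, 1;
```

Running it this way keeps the capture on the server, so you avoid the extra load the Profiler GUI puts on a production box.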