December 4, 2009 at 2:49 pm
I recently had a MAJOR performance issue a couple of weeks ago. I was able to start some server-side traces and compile some data, and I've just been able to start going through it now that I've cleaned up the mess caused by the poor performance, and my brain's fried. In other words, I probably can't see the forest for the trees at this point.
I've been able to improve performance by identifying the slowest procs (they're run only about 1% of the time, but they're responsible for about 20% of the slowness) and adding the WITH RECOMPILE option to them. I know this tells SQL Server not to use whatever plan it has cached for the proc and instead compile a fresh one on every execution. What I'm worried about is:
1. How am I paying for this? This can't be the best solution, because if it were, all procs would be designed to recompile, and I don't particularly like the idea of NOT using a cached plan. So there's a tradeoff somewhere.
2. WHY does this improve performance? I've got indexes being rebuilt during nightly maintenance, but this weekend I'm going to rebuild them myself, manually, one by one. I'm not looking forward to it, but I think it has to be done.
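For anyone following along, the change looks roughly like this (proc, table, and column names are simplified stand-ins for ours, not the real code):

```sql
-- What we did: proc-level recompile. Every EXEC throws away the
-- cached plan and compiles a fresh one.
CREATE PROCEDURE dbo.usp_GetOrders
    @CustomerID INT
WITH RECOMPILE
AS
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = @CustomerID;
GO

-- A narrower alternative available in SQL Server 2005: recompile
-- only the statement that suffers from plan reuse, so the rest of
-- the proc can still use a cached plan.
CREATE PROCEDURE dbo.usp_GetOrders2
    @CustomerID INT
AS
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = @CustomerID
OPTION (RECOMPILE);
GO
```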
I identified the problem and the workaround when I noticed that the procs were generating PAGEIOLATCH_SH waits when they ran, but when I ran the code inside the proc directly (without EXECing the proc itself), I didn't get the waits and the code finished MUCH faster.
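In case it helps anyone reproduce this, here's roughly how I watched the waits while a proc was running in another session (standard SQL 2005 DMVs; the session_id filter is just a rough way to skip system sessions):

```sql
-- Check what currently running requests are waiting on.
SELECT r.session_id,
       r.status,
       r.wait_type,        -- e.g. PAGEIOLATCH_SH while reading pages from disk
       r.wait_time,        -- ms spent in the current wait
       r.last_wait_type
FROM sys.dm_exec_requests AS r
WHERE r.session_id > 50;   -- user sessions only (heuristic)
```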
Our DB isn't all that large, relatively speaking: about 50 GB over 5 files. All files, including the log, are hosted on our SAN. I wanted to separate the log file from the data files and put it on its own LUN, but our sysadmins have told me it isn't necessary... I'm not convinced.
We're running SQL Server 2005 x64, Standard Edition, SP3 CU4, with pages locked in memory.
Any advice, thoughts, input, whatever anyone can offer, I'd appreciate it. I know I'm missing something, so I'm open to being put in my place.
December 4, 2009 at 3:06 pm
Without seeing the code for the stored procs, I'd say it sounds like a good case of parameter sniffing. By forcing a recompile of the procedures each time they are run, SQL Server is generating a new plan based on the current parameters passed to the procedure.
If you could post the code of one of the procedures, we may be able to help with some possible changes. It would be a start, and I recommend you read the first two articles I reference below in my signature block.
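As a sketch of what a parameter-sniffing workaround might look like (all names here are hypothetical, just to show the pattern, not your actual code):

```sql
-- Option 1: copy the parameter into a local variable. The optimizer
-- can't "sniff" the local's value at compile time, so it falls back
-- to average-density estimates instead of a plan tuned to one value.
CREATE PROCEDURE dbo.usp_OrdersByCustomer
    @CustomerID INT
AS
DECLARE @LocalID INT;
SET @LocalID = @CustomerID;

SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = @LocalID;
GO

-- Option 2: tell the optimizer which value to build the plan for,
-- regardless of what is actually passed in at runtime.
CREATE PROCEDURE dbo.usp_OrdersByCustomer2
    @CustomerID INT
AS
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = @CustomerID
OPTION (OPTIMIZE FOR (@CustomerID = 42));
GO
```

Which option is right depends on whether a "typical" plan is good enough for all parameter values, so posting the proc and its plans will help.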
December 7, 2009 at 6:29 am
In addition to posting the code as Lynn suggests, also try capturing an execution plan or two (more than one if they're changing between executions).
To answer part of your question: there's no such thing as a free lunch. You're paying for these performance improvements with CPU, memory, and possibly a bit of I/O, to compile an execution plan on every execution of these procedures. This may be a trivial cost, or it might be extremely expensive. You'd have to capture data (through trace events, or by watching STATISTICS IO & TIME in a query) to know for sure.
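For example, a quick way to see the compile cost on a single run (the proc name is just a placeholder for one of yours):

```sql
-- Turn on per-statement I/O and timing output for this session.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;   -- the "parse and compile time" line shows
                          -- the CPU cost of the forced recompile

EXEC dbo.usp_GetOrders @CustomerID = 42;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

Compare the parse/compile time against the execution time: if compilation is a small fraction, WITH RECOMPILE may be a cheap fix; if not, a targeted solution is worth the effort.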
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning