November 16, 2010 at 10:39 am
Hi,
We currently have two SQL Server 2005 Enterprise (version 9.0.4053) servers running on Windows Server 2003 R2 Enterprise SP2. Both servers are identical HP Proliant DL380 G5 servers with Intel Xeon E5440 processors, 4GB RAM, two Dual Port 15k 72GB SAS drives (Mirrored) and three Dual Port 10K 146GB SAS drives (RAID 5). SQL Server and all databases are installed on the RAID 5 and the O/S and page file (4092MB) are on the mirror for both servers.
SQL Server 1 has our Transport software installed on it, using 26 different databases with around 54 users; SQL Server 2 runs our Storage Warehouse software, using 6 databases with around 8 users.
My problem is that SQL Server 1 outperforms SQL Server 2 by a mile and I cannot figure out why. Looking a little further into it, I have noticed that on SQL Server 2 the sqlservr.exe process never goes above 25% CPU usage and its memory usage sits at 1,697,904K, whereas on SQL Server 1 it uses up to 90% CPU and nearly 2GB of memory.
These two servers are used purely for SQL Server, and the Storage Warehouse software company is not being very helpful in giving me a hand figuring out why SQL Server 2 is not running at 110%.
It would be much appreciated if someone could give me some pointers as to what should be done to improve performance on SQL Server 2, and whether there is anything else I should be doing to get the most out of the other server.
Thanks
November 16, 2010 at 12:14 pm
Offhand, I'd say your 8 users on Server2 are doing a lot of ad-hoc queries with ordering and the like going on.
Drop a server-side trace on Server2 catching Batch Completed and SP Completed. See what they're running most commonly. See what is giving you CPU spikes, and then see what you can do to optimize them.
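If it helps as a starting point, here's a rough sketch of what that trace definition could look like; the output path and the 50MB rollover size are just placeholders you'd want to change:

-- Options value 2 = file rollover; the trace engine adds the .trc extension itself
DECLARE @TraceID int, @maxfilesize bigint;
SET @maxfilesize = 50;  -- MB per rollover file (placeholder)
EXEC sp_trace_create @TraceID OUTPUT, 2, N'E:\Traces\Server2_Batches', @maxfilesize, NULL;

-- Event 12 = SQL:BatchCompleted, event 10 = RPC:Completed
-- Columns: 1 = TextData, 13 = Duration, 16 = Reads, 18 = CPU
DECLARE @on bit; SET @on = 1;
EXEC sp_trace_setevent @TraceID, 12, 1, @on;
EXEC sp_trace_setevent @TraceID, 12, 13, @on;
EXEC sp_trace_setevent @TraceID, 12, 16, @on;
EXEC sp_trace_setevent @TraceID, 12, 18, @on;
EXEC sp_trace_setevent @TraceID, 10, 1, @on;
EXEC sp_trace_setevent @TraceID, 10, 13, @on;
EXEC sp_trace_setevent @TraceID, 10, 16, @on;
EXEC sp_trace_setevent @TraceID, 10, 18, @on;

EXEC sp_trace_setstatus @TraceID, 1;  -- start the trace

-- Read the results back later with something like:
-- SELECT TextData, Duration, CPU, Reads
-- FROM fn_trace_gettable(N'E:\Traces\Server2_Batches.trc', DEFAULT)
-- ORDER BY CPU DESC;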
If you have to go deeper because of a lot of ad-hocs, you CAN capture STATISTICS XML. It's the actual execution plan. I'd start with the first, though, and only go to the second if you can't puzzle out the missing components just from the T-SQL being run regularly.
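For the ad-hoc route, the capture itself is simple (the query below is a made-up placeholder, substitute whatever statement you're chasing):

SET STATISTICS XML ON;
GO
-- run the suspect ad-hoc statement; table and column names are placeholders
SELECT TOP 10 * FROM dbo.SomeWarehouseTable ORDER BY SomeDateColumn DESC;
GO
SET STATISTICS XML OFF;

The extra result set it returns is the actual execution plan as XML, which Management Studio can display.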
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA
November 17, 2010 at 4:21 am
Thanks for the information, Craig.
I have stuck it through the DETA and some interesting figures have come out.
The number of indexes recommended to be created is 6, and I was wondering what the best way is to go about creating them?
Also, the number of statistics recommended to be created is 194 (I'm not 100% sure what these are?).
Thanks
November 17, 2010 at 4:24 am
I would take what DETA has to say with a very large grain of salt.
The best bet for indexes is to design them yourself based upon the types of queries that will be hitting the table.
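For example, once a plan shows a scan that a narrow index would turn into a seek, the statement itself is straightforward (table and column names below are made up purely for illustration):

CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_OrderDate
    ON dbo.Orders (CustomerID, OrderDate)
    INCLUDE (TotalDue);  -- included column so the query doesn't have to go back to the clustered index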
Have a look at the execution plans from some of the worst-performing queries and see where indexing may help.
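If you don't already have a favourite way of finding those, something along these lines (using the SQL 2005 DMVs; the figures are cumulative since each plan was cached) will list the heaviest statements together with their plans:

SELECT TOP 20
       qs.total_worker_time / qs.execution_count AS avg_cpu,
       qs.total_logical_reads / qs.execution_count AS avg_reads,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset END
                   - qs.statement_start_offset) / 2) + 1) AS statement_text,
       qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY avg_cpu DESC;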
November 17, 2010 at 5:16 am
Something else I have just noticed on SQL Server 1 is that the sqlservr.exe process has its Priority set to AboveNormal. If I try to change this in Task Manager I get Access Denied, so I was wondering where I could change it?
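(For what it's worth, I'm guessing this comes from the 'priority boost' server option rather than anything set through Task Manager, though I'm not certain that's what maps to AboveNormal. If it is, it would be checked and changed through sp_configure, and a change only takes effect after a service restart:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'priority boost';        -- run_value 1 means it's enabled
-- EXEC sp_configure 'priority boost', 0;  -- to turn it off; takes effect after a restart
-- RECONFIGURE;
)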