March 31, 2008 at 12:37 pm
Hi All,
I would appreciate hearing about your experiences.
I am working on a server (Windows 2003 R2 Standard x64 + SP2) with SQL Server (Standard + SP2, built for the Intel x86 platform - 32-bit, right?).
In the PerfMon counters I am seeing that Mem Usage, Peak, and VM Size are sitting around 1,790,184K. This doesn't seem to change when I put a load on it during the day - like running the tuning wizard with an hour's trace log.
The application slows down and users start to complain - I know ... I did this deliberately to find out where the threshold was.
SQL Server CPU usage climbs upwards of 32% on top of its "normal" 49-52% that I have seen.
Now I am wondering if I can get more out of the system by using /PAE and/or AWE and increasing physical RAM.
I know I will also have to upgrade SQL Server if I want to go beyond 4GB.
I am also, at the moment, doing all the usual stuff - collecting cache hit ratios / % usage etc. and determining where the housekeeping routines are ... hence the tuning wizard.
An increase in RAM is a possibility I need to start thinking about along with the others.
Oh ... the application is web based (the web server is NOT on this machine, but is on another with RS! - a problem for another day), written in VB.NET, 2-tier straight to the db server.
thanks
March 31, 2008 at 1:40 pm
SQL Server CPU usage climbs upwards of 32% on top of its "normal" 49-52% that I have seen.
So are you saying that your CPU is at 80-90%? If so (and even if it is consistently above 50%), I would recommend finding out what is using up all of the CPU cycles. Run Profiler and look at the heavy-hitting queries in terms of CPU. It looks like you're configured for 2GB of memory, and I'll agree that that is not much, but until you have solid proof that there is a memory bottleneck (low page life expectancy, low cache hit ratio, high lazy writes, etc.), I would focus on finding out where your CPU is going.
March 31, 2008 at 2:21 pm
John, thanks for the prompt reply.
No, the CPU is not normally that high, but it was while I ran the trace; memory, however, did not increase. And yes, the system is at its default limits in terms of SQL Server.
I am trying to address the other factors you mentioned. I have about 18GB worth of traces - SQL Server and PerfMon - to go through, and I expect I will be able to recommend new indexes, statistics, and a disk reorganisation and defrag.
What I'm trying to get input on is: once all this is done, what else can I do in the memory space?
What I didn't mention is that the SQL Server service (sqlservr.exe) runs under an account other than admin, NT local, etc.
Now, what is also weird is that the web application uses this same account to access the db for all incoming threads, so it's very hard to identify individual users.
Would this have a bearing on using up memory and depriving SQL Server?
I know what I am trying to do is covering all the bases instead of tackling this one at a time. It is, in terms of the information gathering, but the implementation will be one at a time.
Also, this is an Opteron 285 machine @ 2.61GHz. I am not familiar with this CPU, so I am looking for some pointers.
thanks
March 31, 2008 at 3:32 pm
John,
There are a couple of things I'd like to point out here. Just as a note, I am not trying to sound condescending; rather, I'm not familiar with your level of SQL Server knowledge and I want to be sure I get my point across. With that said ...
First, as you've already made known, you are in for a SQL Server upgrade if you plan to take advantage of AWE memory. You should also know that unless you plan to upgrade to 2005, your Windows Server 2003 edition will also require an upgrade to use AWE. Here's a good reference for you to read: http://www.sql-server-performance.com/tips/awe_memory_p1.aspx.
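For reference, once you are on an AWE-capable combination (Enterprise Edition, plus the right OS and boot settings), the server-side enablement is a short config exercise. This is a sketch of the commonly documented steps, not something that applies to the current Standard Edition build, and the 6144MB figure is just an illustrative ceiling for a hypothetical 8GB box:

```sql
-- Enable AWE (requires Enterprise Edition and the "Lock Pages in Memory"
-- right for the service account; on 32-bit Windows with >4GB RAM, /PAE
-- must also be set in boot.ini)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;
RECONFIGURE;
-- With AWE on, SQL Server will not release memory dynamically,
-- so set an explicit ceiling, e.g. 6GB on an 8GB box:
EXEC sp_configure 'max server memory', 6144;
RECONFIGURE;
-- On SQL Server 2000, the 'awe enabled' change takes effect only
-- after a service restart.
```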
In the perfmon counters I am seeing that mem usage, peak and VM Size is sitting around the 1,790,184K size. This doesn't seem to change when I put a load on it during the day
however memory did not increase.
Second, if your SQL instance is configured to use 2GB of memory, what you are seeing here really is all that will be made available to SQL Server. Once SQL Server has 'claimed' this memory, you will not see this value fluctuate with transaction volume. In other words, once SQL Server has grabbed all of the memory it can, it keeps it and uses it as effectively as it can. It is quite normal to see from the Total Server Memory / Target Server Memory counters that SQL Server has grabbed all of the available memory.
In order to determine whether your system could benefit from additional memory, you need to monitor the instance and look for signs of a memory bottleneck. Sure, all database servers could benefit from additional memory - they'll take as much as they can get - but why spend the time and resources if it is not truly needed?
What I'm trying to get input into is that once all this is done what else can I do in the memory space.
As I've said, a DB server can usually benefit from extra memory, but prove your case before moving forward and spending time and money on it.
What I didn't mention is that the SQL Service.exe is run as under an account other than admin,ntlocal etc.
Now what is also weird is that the web application is also using this account to access the db for all incoming threads, so its very hard to identify individual users.
Would this have a bearing on using up memory and depriving sql server?
I don't think this is going to cause you memory problems, but I do agree that it makes it hard to know where the threads are originating. I would be more concerned here with security. If your SQL instance is running under a domain account and your application service is also running under that same account, I, for one, would have issues with this. It is the same thing as giving your users the SA account. I would recommend running SQL Server under its own account. Beware that changing this now could have application functionality impacts, so I would make the change in a test environment prior to production.
The bottom line is that if your system is running slowly, find the reason why before considering upgrades. Instead of throwing hardware at it, find out what is using the existing resources. Most of the time, you can tune the code and nothing more. Also, if the system is running slowly and users are complaining while you are running your analysis ... stop running your analysis against the production DB during business hours! If you must use your production DB, do your work after hours.
April 1, 2008 at 2:42 pm
John, thanks for the reply. No problems with what you have said - I guess I'm in transition, having 15 years with another database and trying to translate what I know to this one and how it behaves.
I prefer a clearer explanation, hence my questions may sometimes seem a bit generic.
I have learned in the past that if you ask leading questions, you sometimes get responses with leads you may never have thought of.
I had already made the decision to run with a set of housekeeping tasks and new indexes/statistics, which came out of evaluating the most-run and longest-running queries, before any hardware changes. The logic behind this was to try and use any monies saved from this server towards getting a better one.
Once again thanks
April 3, 2008 at 9:17 am
Hi John
SQL Server 2000 Standard Edition has a hard-wired memory limit of 2GB, whether it's running under an x64 or x86 OS. SQL Server 2000 Enterprise Edition does not have this constraint.
The reason you see 1.7GB is that SQL Server "reserves" ~256MB of memory at startup, which is not available to SQL Server's "Buffer Manager" (the data & procedure cache controller). This ~256MB reservation is commonly referred to as "Memory To Leave" (MTL) and is termed such because the "Buffer Manager" cannot use it for data caching. MTL is used for a variety of purposes, including connection memory, CLR, extended stored procedures (xp_*), SQL Server's stack & threads, etc.
This is why SQL Server 2000 Std Edn installations rarely go above 1.7GB of memory - they have a hard-wired limit of 2GB and they reserve ~256MB as MTL at startup.
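The arithmetic behind that ceiling can be sketched roughly as follows (a back-of-the-envelope illustration only - the actual MTL reservation varies with the -g startup switch, thread count, and stack sizes):

```python
# Rough arithmetic behind the ~1.7GB ceiling described above.
# The 2GB cap and ~256MB MTL figure are the approximations from this
# thread; real numbers vary by configuration.
EDITION_CAP_MB = 2048          # SQL Server 2000 Standard buffer pool cap
MEMORY_TO_LEAVE_MB = 256       # approximate MTL reservation at startup

buffer_pool_mb = EDITION_CAP_MB - MEMORY_TO_LEAVE_MB
print(f"Approx. usable buffer pool: {buffer_pool_mb} MB "
      f"(~{buffer_pool_mb / 1024:.1f} GB)")

# The original poster's observed working set of 1,790,184K (~1748 MB)
# lands right in this neighbourhood.
observed_mb = 1_790_184 / 1024
print(f"Observed working set: {observed_mb:.0f} MB")
```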
The easiest way to determine whether you have a shortage of memory is to collect the "Buffer Manager: Page Life Expectancy" (PLE) counter. If it falls much below 300 seconds, your cache memory is being "churned" too frequently (you're not getting much "caching"). This usually also manifests as an increased level of read-oriented disk I/O. Systems where PLE gets much below 100 are under fairly critical memory pressure and need upgrading.
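Those thresholds can be summed up in a small sketch (the 300-second and 100-second cut-offs are the rules of thumb from this post, not hard limits):

```python
def classify_ple(ple_seconds: float) -> str:
    """Interpret the 'Buffer Manager: Page Life Expectancy' counter
    using the rule-of-thumb thresholds from this thread."""
    if ple_seconds < 100:
        return "critical memory pressure - upgrade needed"
    if ple_seconds < 300:
        return "cache churn - likely memory bottleneck"
    return "healthy"

# Sample readings (seconds a page survives in the buffer cache):
for sample in (1200, 250, 60):
    print(sample, "->", classify_ple(sample))
```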
HTH
Cheers,
Greg Linwood
April 3, 2008 at 1:47 pm
Greg - thanks for the information.
I have been using Processor: % Usage, Buffer Cache Hit Ratio, and Cache Hit Ratio to try and determine if I'm low on memory.
I am also using the recompile counter, but all seems good: the processor is below 50%, the buffer cache hit and cache hit ratios are both above 98%, and I'm not seeing a lot of recompiles.
So, bearing that in mind, I'm concentrating on I/O and applying the usual DBA stuff to the databases and tables.
Once again thanks
Time and time again it keeps coming back to what the DBA does or does not do - regardless of the database.
July 31, 2008 at 10:07 am
Hi Greg,
I have a customer who is currently running Windows Server 2003 Standard with SQL 2000 Std and is having memory issues. He is planning to upgrade to Windows 2003 EE with SQL 2000 Std - do you think he could use up to 4GB of memory?
If yes, can you please suggest the steps?
Thx,
Amit Sood
July 31, 2008 at 10:19 am
a_sood (7/31/2008)
Hi Greg, I have a customer who is currently running Windows Server 2003 Standard with SQL 2000 Std and is having memory issues. He is planning to upgrade to Windows 2003 EE with SQL 2000 Std - do you think he could use up to 4GB of memory?
If yes, can you please suggest the steps?
Thx,
Amit Sood
As has been stated before, SQL Server 2000 Standard is hard-wired to use no more than 2GB. You cannot get past that without upgrading to Enterprise Edition or SQL Server 2005.
I would recommend upgrading to SQL Server 2005 Standard SP2 (CU6/7) instead. Since this is going to be new hardware, you can test the upgrade and plan for any changes required to be compatible with 2005.
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs