July 8, 2011 at 9:38 pm
Still not convinced "significantly" always means there's pressure, though. A new server staged for growth may not need all the RAM allocated yet, so it won't take it. I'm pretty sure that's what ours is doing: it's taking more and more every day, but still doesn't need the whole 32 GB.
September 9, 2011 at 10:42 am
The server log should tell you if you have memory pressure.
We have:
"AppDomain 9 (mssqlsystemresource.sys[runtime].8) is marked for unload due to memory pressure."
and
Total = Target
So I have come to the conclusion that Total = Target = Max Memory does indicate memory pressure in our case.
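For reference, messages like that can be pulled straight from the error log with the (undocumented but widely used) xp_readerrorlog procedure; a minimal sketch, assuming the message is still in the current log file:

```sql
-- Search the current SQL Server error log for memory-pressure messages.
-- Parameters: log file number (0 = current), log type (1 = SQL Server error log),
-- and a string to filter on.
EXEC xp_readerrorlog 0, 1, N'memory pressure';
```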
September 12, 2011 at 7:00 am
Where do you find that log info?
September 13, 2011 at 2:01 am
To clear the air on the basic question:
Target Memory = the memory available for SQL Server to consume; this sets the upper limit. This in turn means that server memory should be at least 2 GB higher so the OS can function normally without any memory issues.
Total Memory = the memory that is readily available for SQL Server to consume and use as it sees fit. This is the baseline memory SQL Server uses for its regular query processing and job execution.
So, if my understanding is correct:
Total Memory > Target Memory = problem
Total Memory < Target Memory = happy
Is this the final conclusion?
September 13, 2011 at 7:32 am
Take a look at my first post a few posts up. Unfortunately, things are not as clear-cut as your conclusion. If you adjust the max memory setting, the values of target and total are not as easy to draw a conclusion from. If you leave the setting at the default maximum (essentially NOT controlling the value), then you can start drawing conclusions based on the values. I put this in there so people don't assume a certain conclusion; it really depends on the settings in place beforehand.
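For anyone who wants to watch the two counters themselves, a minimal sketch against the performance-counter DMV (values are reported in KB):

```sql
-- Target vs. Total Server Memory, straight from the performance-counter DMV.
-- LIKE is used because counter_name values can carry trailing spaces.
SELECT counter_name,
       cntr_value AS memory_kb
FROM sys.dm_os_performance_counters
WHERE counter_name LIKE 'Target Server Memory%'
   OR counter_name LIKE 'Total Server Memory%';
```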
September 13, 2011 at 8:24 am
Thanks Shifty1981. I did go through this thread and multiple others to see if this process makes sense. Arguably, Max memory should be the upper limit: no matter what, SQL Server (in all true logical thinking) should not use more memory than that. If Target memory is just a number denoting the upper limit, then why is it defined in the first place? (I'm inquisitive about this.)
Or is it that Target memory and Total memory are two entities that adjust to each other regularly based on need?
September 13, 2011 at 8:31 am
prithvi.m (9/13/2011)
Thanks Shifty1981..I did go through this thread and multiple other's to see if this process makes sense. Arguably, Max memory should be the upper limit. No matter how much we set Total memory SQL (in all true logical thinking) should not use more memory than that. If Target memory is just a number to denote upper limit, then why is that defined in first case (I m inquisitive about this).Or is it that Target memory and Total memory are 2 sets of entities which compromise each other regularly based on the need per se ?
There's also the fact that max memory does not refer to all the memory SQL Server will use, only the buffer pool part. Other parts of SQL Server use memory as well. So even if you set the value to be 10 GB below the total RAM in the server, the other memory pools may bring usage close to all of it. This is the part that is causing us issues: we run .NET tools to massage our databases, which use up lots of memory, but then SQL Server itself is using more than just that max memory setting.
September 22, 2011 at 8:57 am
You wrote: "Total Memory means how much memory SQL is using"
Just saw this on MSDN:
Target Server Memory (KB) - Total amount of dynamic memory the server can consume. Performance Monitor describes this as "Total amount of dynamic memory the server is willing to consume".
Total Server Memory (KB) - The committed memory from the buffer pool (in kilobytes). Note: this is not the total memory used by SQL Server. Performance Monitor describes it as "Total amount of dynamic memory the server is currently consuming".
http://msdn.microsoft.com/en-us/library/ms190924.aspx
September 22, 2011 at 8:59 am
shifty1981 (9/12/2011)
Where do you find that log info?
In SQL Server's error logs (under the Management node in SSMS)
October 21, 2011 at 2:13 pm
OK, a more experienced SQL DBA might be able to explain. I will be the first to say performance is not my area of expertise (yet).
My general understanding of SQL Server memory and tables is that it will bring the entire table into memory, assuming you have enough. It will then hold that in memory until something else requires the memory (maybe a cleanup or timeout, I'm not sure). There are many other aspects to memory consumption, but again, I am no expert.
My quick test when reviewing this post: a brand-new server with 4 databases, and I am the only one connecting.
The starting point:
Target 2,048,000 KB (2 GB)
Total 140,000 KB (~140 MB)
I have a table with approx. 342 MB used space. Did a "select * from bla":
Target 2,048,000 KB (2 GB)
Total 489,000 KB (~489 MB)
I have a table with approx. 202 MB used space. Did a "select * from bla":
Target 2,048,000 KB (2 GB)
Total 690,000 KB (~690 MB)
I did this for a few tables and did not take it to the max. My point is that the total will eventually reach the target, and that would mean, for this to be a true gauge of memory pressure, you would need more RAM than disk space? I could be completely wrong here.
Please enlighten me 🙂
I am trying to figure out how to properly baseline a SQL Server and figure out when it requires more memory. Sure, I can collect statistics all day long; I just don't know how to interpret them and turn that into justification.
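If it helps with the baselining, you can also see what is actually sitting in the buffer pool per database; a sketch using sys.dm_os_buffer_descriptors (each buffer page is 8 KB), which would show how much of your test tables stayed cached:

```sql
-- Approximate buffer pool usage per database, in MB.
SELECT DB_NAME(database_id) AS database_name,
       COUNT(*) * 8 / 1024  AS cached_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY cached_mb DESC;
```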
May 27, 2012 at 5:10 am
What is Target Server Memory (KB)?
http://mssqlwiki.com/2012/05/27/what-is-target-server-memory-kb/
May 29, 2012 at 7:34 am
karthick_pk83 (5/27/2012)
What is Target Server Memory (KB)?http://mssqlwiki.com/2012/05/27/what-is-target-server-memory-kb/
Seems like you haven't gone through this thread; please go through it, and by then you should know the difference between target and total memory :).
June 12, 2012 at 4:20 pm
I think some context is missing from the statement.
The statement says that if Total Memory is significantly and consistently less than Target Memory then there is memory pressure. This would be true if there is sufficient activity on the instance that the Total Memory should be reached but isn't. This would indicate to me that data that could be added to the buffer pool can't get there.
For example, if I had 8 GB of buffer pool memory and I did a table scan on a 16 GB table, I would expect to see Total and Target memory both reporting 8 GB (yes, I know it isn't quite that simplistic, but you get the idea). If the Total Memory was only 4 GB, then I have a pretty good idea something is not right with memory.
On the other hand, if I had 8 GB in the buffer pool but a database that only contained 2 GB of data, I would never reach the Target Memory, yet there would not be memory pressure.
June 13, 2012 at 8:29 am
I wouldn't believe anything from just one source. Troubleshooting memory bottlenecks means either you are encountering some performance issues, or you have server problems that analysis has associated with a lack of memory. If I track target memory vs. total memory over time (and I do), I can get a general idea of memory consumption versus memory need. But if I tracked more counter details and dug deeper, I might confirm whether memory is indeed the issue or not...
--Any Free Pages below 500
SELECT *
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Free pages'
  AND [object_name] LIKE '%BUFFER MANAGER%';

--Lazy Writes/sec above 20
--Memory Grants Pending above 0
--Page life expectancy below 300
--Excessive full scans
DECLARE @SVRTYPE NVARCHAR(30);
SET @SVRTYPE = CASE
                   WHEN @@SERVICENAME = 'MSSQLSERVER' THEN 'SQLServer:'
                   ELSE 'MSSQL$' + @@SERVICENAME + ':'
               END;

SELECT
    [object_name],
    counter_name,
    instance_name,
    cntr_value,
    cntr_type,
    CURRENT_TIMESTAMP AS collection_time
FROM sys.dm_os_performance_counters
WHERE ([object_name] = @SVRTYPE + 'Access Methods' AND counter_name = 'Full Scans/sec')
   OR ([object_name] = @SVRTYPE + 'Buffer Manager' AND counter_name = 'Lazy Writes/sec')
   OR ([object_name] = @SVRTYPE + 'Buffer Manager' AND counter_name = 'Page life expectancy')
   OR ([object_name] = @SVRTYPE + 'Memory Manager' AND counter_name = 'Memory Grants Pending');
The code given represents some of the detail I would dig into, along with verifying memory pressure from file I/O stats: if memory is short and the database engine has to pull data from disk rather than memory, I should see signs there. I should also see indicators in wait stat information and in buffer pool and cache analysis. There are many decent memory troubleshooting articles and code out there to monitor things yourself, and tools that do it.
http://www.simple-talk.com/sql/performance/a-performance-troubleshooting-methodology-for-sql-server/
October 25, 2012 at 1:12 am
I have the same confusion when almost all memory is occupied by SQL Server. How can we identify how much memory SQL Server is actually using, and how do we plan for future memory increases?
Example: 96 GB total RAM, with 94 GB occupied all the time by SQL Server (not releasing below that). How do we then find out how much memory SQL Server is actually using, and how do we plan for a future memory upgrade?
Regards,
Sachin
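On SQL Server 2008 and later, sys.dm_os_process_memory reports what the process is actually using at the OS level, which can help answer this; a minimal sketch:

```sql
-- Actual memory in use by the SQL Server process (SQL Server 2008+).
SELECT physical_memory_in_use_kb / 1024  AS physical_memory_mb,
       locked_page_allocations_kb / 1024 AS locked_pages_mb,
       memory_utilization_percentage
FROM sys.dm_os_process_memory;
```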