November 25, 2013 at 10:05 pm
Dear Gurus,
I have 384 GB of physical memory and 24 CPUs.
Please suggest what the minimum and maximum server memory settings should be for SQL Server 2012.
Thanks
Adeel Imtiaz
November 26, 2013 at 12:31 am
That question is hard to answer without more information.
This article gives you some pointers:
How much memory does my SQL Server actually need?
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
November 26, 2013 at 2:10 am
https://www.simple-talk.com/books/sql-books/troubleshooting-sql-server-a-guide-for-the-accidental-dba/ - Chapter 4
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
November 26, 2013 at 3:18 am
Hi,
These are the settings I configured on a client machine with 128 GB of RAM:
max server memory set to 112 GB
min server memory set to 64 GB
My suggestion for your max server memory is 336 GB.
For min server memory, it depends on whether there is other OS activity on the server, what kind, and what its requirements are. If you often copy folders containing many files, or other applications are running, then the OS will use more of the RAM and encroach on SQL Server's buffer pool. If you rarely perform such operations, then the OS will not need much extra RAM. 384 GB is quite a large amount nowadays.
This setting is up to you, but do not leave it at the default value of 0. The suggested book (Chapter 4) above is a good reference.
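For reference, here is a minimal T-SQL sketch of applying settings like these with sp_configure (values are in MB; the 336 GB / 64 GB figures below are simply the numbers from this thread, so adjust them to your own environment). Both options take effect after RECONFIGURE without a restart.

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'min server memory (MB)', 65536;   -- 64 GB, only if you decide to set a floor
EXEC sp_configure 'max server memory (MB)', 344064;  -- 336 GB
RECONFIGURE;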
Regards,
IgorMi
Igor Micev,My blog: www.igormicev.com
November 26, 2013 at 3:21 am
IgorMi (11/26/2013)
Hi, these are the settings I configured on a client machine with 128 GB of RAM:
max server memory set to 112 GB
min server memory set to 64 GB
My suggestion for your max server memory is 336 GB.
For min server memory, it depends on whether there is other OS activity on the server, what kind, and what its requirements are. If you often copy folders containing many files, or other applications are running, then the OS will use more of the RAM and encroach on SQL Server's buffer pool. If you rarely perform such operations, then the OS will not need much extra RAM. 384 GB is quite a large amount nowadays.
This setting is up to you, but do not leave it at the default value of 0. The suggested book (Chapter 4) above is a good reference.
Regards,
IgorMi
Also beware of other services running on the same machine, for example Analysis Services Tabular. For large data sets, SSAS Tabular can consume a lot of memory.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
November 26, 2013 at 3:48 am
Koen Verbeeck (11/26/2013)
Also beware of other services running on the same machine, for example Analysis Services Tabular. For large data sets, SSAS Tabular can consume a lot of memory.
Correct, of course all similar factors should be taken into consideration. I put the emphasis on file usage/copying because I've encountered a situation where an application performed its backup/restore based on files (that is how that system had been designed; it was not SQL Server's backup/restore, which was strange at first glance), and it was significantly eating into the server's memory.
Igor Micev,My blog: www.igormicev.com
November 26, 2013 at 6:45 am
I generally recommend leaving min server memory at 0 unless there's a good reason otherwise. Good reasons may include over-committed virtual machines and multi-instance clusters where it is possible for multiple instances to end up on the same node after server failures.
On a dedicated SQL machine, it shouldn't be necessary to set it.
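If you want to check what an instance is currently configured to, a simple query against sys.configurations shows both the configured and in-use values (in MB):

SELECT name, value, value_in_use
FROM sys.configurations
WHERE name IN ('min server memory (MB)', 'max server memory (MB)');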
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
November 26, 2013 at 7:02 am
Do you plan on using replication on this server? If so, setting min server memory may be something to consider (http://technet.microsoft.com/en-us/library/ms151791.aspx). I am guessing this is a physical server because of the specs, so over-committed VMs may not be an issue for you. Basically, every server is a bit different depending on your environment. The goal is to dedicate as much of the memory as possible to SQL Server without suffocating the operating system, which would cause paging. Start by reserving 5-7% of your memory. For example, using 7% you would set your max memory setting to around 357 GB. Keep an eye on your page usage and available physical memory, and you can either dedicate a little more to SQL Server or take a little away.
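As a rough sketch of keeping that eye on things from T-SQL rather than Perfmon, the OS memory DMVs show available physical memory, whether Windows is signalling memory pressure, and how much physical memory the SQL Server process is using:

SELECT
    sm.total_physical_memory_kb / 1048576   AS total_physical_memory_gb,
    sm.available_physical_memory_kb / 1024  AS available_physical_memory_mb,
    sm.system_memory_state_desc,
    pm.physical_memory_in_use_kb / 1024     AS sql_memory_in_use_mb
FROM sys.dm_os_sys_memory AS sm
CROSS JOIN sys.dm_os_process_memory AS pm;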
Derek
November 26, 2013 at 7:11 am
7% reserved is low for a server this size. The OS needs memory just to map and manage that much memory. I'd go with Jonathan's recommendations (in the book and blog post referenced above) and set max server memory no higher than 330-335 GB to start with. It can always be adjusted upwards if Available MBytes stays high.
You don't want the OS and SQL Server in a permanent low-memory/high-memory dance, and you definitely do not want the OS paging SQL Server's buffer pool out to disk.
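One quick way to sanity-check that "adjust upwards" decision from inside SQL Server (alongside the OS Available MBytes counter in Perfmon) is to compare Total Server Memory against Target Server Memory; if Total has reached Target and the OS still has plenty free, there may be room to raise max server memory:

SELECT counter_name, cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Total Server Memory (KB)', 'Target Server Memory (KB)');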
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
November 26, 2013 at 8:12 am
Great point, Gail! The more memory you have, the more memory the OS needs just to map and manage it. Appreciate the clarification. That is a great book from Jonathan and Ted.
Derek
November 27, 2013 at 2:59 am
Don't forget that when you move to SQL Server 2014 and start to use memory-optimised tables, your max memory calculation will need to be revised.
The link suggested by Koen and the book suggested by Gail are good for SQL 2012 and below, but the world moves on and new recommendations will be needed as SQL 2014 becomes more widely used.
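As a forward-looking sketch only: on SQL Server 2014 you should be able to see how much memory the In-Memory OLTP engine is holding via its memory clerks. The 'XTP' clerk type used in the filter below is an assumption from memory, so verify it on your build:

SELECT [type], name, SUM(pages_kb) / 1024 AS memory_mb
FROM sys.dm_os_memory_clerks
WHERE [type] LIKE '%XTP%'
GROUP BY [type], name;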
Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017 2016, 2014, 2012, 2008 R2, 2008 and 2005.
When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara