February 2, 2017 at 8:38 pm
In our production server, memory usage is around 95% even though it has been capped at 8 GB. Total memory is 12 GB.
When I run a script, I get a database, let's say DB 1, which has the highest cached_pages_count (574.198).
And when I run another script, I get a table, let's say Table 1, which has the highest cached_pages_count (514.201).
I really have no idea what to do with that table.
Should I just lower the memory setting?
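For reference, the kind of buffer-pool scripts being described can be sketched like this (an assumption on my part; the actual scripts may differ), using the sys.dm_os_buffer_descriptors DMV, where each cached page is 8 KB:

```sql
-- Cached pages per database (the database-level script):
SELECT DB_NAME(database_id) AS database_name,
       COUNT(*)             AS cached_pages_count,
       COUNT(*) / 128.0     AS cached_mb        -- 128 pages of 8 KB = 1 MB
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY cached_pages_count DESC;

-- Cached pages per table in the current database (the table-level script):
SELECT OBJECT_NAME(p.object_id) AS table_name,
       COUNT(*)                 AS cached_pages_count
FROM sys.dm_os_buffer_descriptors AS bd
JOIN sys.allocation_units AS au
  ON au.allocation_unit_id = bd.allocation_unit_id
JOIN sys.partitions AS p
  ON p.hobt_id = au.container_id
WHERE bd.database_id = DB_ID()
  AND au.type IN (1, 3)   -- in-row and row-overflow data
GROUP BY p.object_id
ORDER BY cached_pages_count DESC;
```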
cheers
February 2, 2017 at 9:54 pm
No... 12GB is too little for a database server. Buy some more memory for it.
--Jeff Moden
Change is inevitable... Change for the better is not.
February 2, 2017 at 10:15 pm
Jeff Moden - Thursday, February 2, 2017 9:54 PM: No... 12GB is too little for a database server. Buy some more memory for it.
Thanks for the reply!
Hmm, how do I measure the amount of physical memory the server needs?
February 3, 2017 at 12:25 am
If you sum up the "Size" column of sys.master_files for all of your MDF/NDF files and divide by 128, what do you get for a result?
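Jeff's check can be written out like this (a sketch; restricting to data files via the WHERE clause is my reading of his MDF/NDF wording):

```sql
-- size is stored in 8 KB pages, so SUM(size) / 128 gives megabytes
SELECT SUM(size) / 128.0 AS total_data_mb
FROM sys.master_files
WHERE type_desc = 'ROWS';   -- data files (MDF/NDF) only, excluding logs
```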
--Jeff Moden
Change is inevitable... Change for the better is not.
February 3, 2017 at 8:14 am
For some context, my laptop has 16GB of memory, my development desktop has 32 GB.
I set up a new server for a client yesterday, 512 GB memory.
My laptop should not have more memory than your production server. Upgrade that thing, and not to the minimum you think you can get away with. SQL loves memory, it uses memory to cache data and plans to avoid the cost of fetching/regenerating them.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
February 6, 2017 at 3:44 pm
Thanks for the valuable input, guys 🙂 I will upgrade it!
February 6, 2017 at 3:54 pm
Jeff Moden - Friday, February 3, 2017 12:25 AM: If you sum up the "Size" column of sys.master_files for all of your MDF/NDF files and divide by 128, what do you get for a result?
I get 84641.9375. What does that mean?
February 7, 2017 at 4:57 am
WhiteLotus - Monday, February 6, 2017 3:54 PM: I get 84641.9375. What does that mean?
It means you need a SQL Server with at least 100 GB of RAM in order to be able to cache all your data. 🙂 If you want to avoid memory pressure, and you want to avoid waiting for data to be read into memory, then you need a server with enough memory to hold all your SQL Server data, plus the SQL Server overhead, plus the OS overhead.
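To spell out the arithmetic: the result of Jeff's query is in megabytes (size is counted in 8 KB pages, and 128 of those make 1 MB), so:

```sql
-- 84641.9375 MB of data files, converted to GB:
SELECT 84641.9375 / 1024.0 AS data_size_gb;   -- roughly 82.7 GB
```

Rounding up for growth and the overheads mentioned above is how you land at an "at least 100 GB" figure.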
Thomas Rushton
blog: https://thelonedba.wordpress.com
February 7, 2017 at 8:10 am
ThomasRushton - Tuesday, February 7, 2017 4:57 AM: It means you need a SQL Server with at least 100 GB of RAM in order to be able to cache all your data. 🙂 If you want to avoid memory pressure, and you want to avoid waiting for data to be read into memory, then you need a server with enough memory to hold all your SQL Server data, plus the SQL Server overhead, plus the OS overhead.
'zactly. 🙂 It certainly means that 12GB isn't going to hack it.
--Jeff Moden
Change is inevitable... Change for the better is not.
February 7, 2017 at 9:16 am
You might also want to check the memory limitations for your edition and version before you start throwing too much memory at the problem:
https://msdn.microsoft.com/en-us/library/ms143685(v=sql.105).aspx
Although obviously any increase over 12 GB isn't going to hurt...
February 7, 2017 at 10:52 am
ThomasRushton - Tuesday, February 7, 2017 4:57 AM: If you want to avoid memory pressures, and you want to avoid waiting for data to be cached into memory, then you need a server with enough memory to hold all your SQL Server data plus the SQL Server overhead, plus the OS overhead.
I'm going to disagree there. While the entire DB is 86 GB, we don't know how much of that is hot data, how much is cold data that's never touched, how much is used during business hours, how much is only used overnight, etc.
12 is almost certainly not enough (my laptop has 16), but 100GB may be far too much.
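One way to gauge whether the hot set fits in the memory you already have is to watch Page Life Expectancy over time (a sketch; what counts as "low" depends entirely on the workload):

```sql
-- How long, on average, a page survives in the buffer pool.
-- A persistently low and falling value suggests memory pressure.
SELECT counter_name, cntr_value AS seconds
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Page life expectancy'
  AND object_name LIKE '%Buffer Manager%';
```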
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
February 7, 2017 at 5:07 pm
GilaMonster - Tuesday, February 7, 2017 10:52 AM: 12 is almost certainly not enough (my laptop has 16), but 100GB may be far too much.
How much do you think, Gail?
February 7, 2017 at 11:59 pm
WhiteLotus - Tuesday, February 7, 2017 5:07 PM: How much do you think, Gail?
More than 12.
I have no metrics or stats that would guide me to a better value.
Buy what your company can afford, plan for future growth, and remember that more is better, within reason.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
February 8, 2017 at 8:12 pm
GilaMonster - Tuesday, February 7, 2017 11:59 PM: More than 12. I have no metrics or stats that would guide me to a better value. Buy what your company can afford, plan for future growth and the more the better within reason.
Apparently only 2 or 3 GB would be added
February 9, 2017 at 12:50 am
WhiteLotus - Wednesday, February 8, 2017 8:12 PM: Apparently only 2 or 3 GB would be added
So do you want my laptop to run your production DB? It's got more memory. Hell, my gaming and development desktop is running 32GB.
These days, I wouldn't go for less than 32GB for a new production DB server, mostly because memory is so incredibly cheap that there's no reason to skimp.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability