July 11, 2018 at 2:00 pm
Comments posted to this topic are about the item Find the memory consumption by each database
July 17, 2018 at 6:17 pm
It was so good I had to script it in PowerShell:
# Provide SQL Server name
$SQLServer = "localhost"
# Provide database name
$DatabaseName = "master"
# Prompt for user credentials
$credential = Get-Credential
$Query = "
-- Memory used by each database
SELECT
    DB_NAME(database_id) AS DB_Name,
    COUNT(*) * 8 / 1024 AS MB_Used
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY COUNT(*) * 8 / 1024 DESC
"
# Start the process here
do {
    Invoke-Sqlcmd -ServerInstance $SQLServer -Database $DatabaseName -Credential $credential -Query $Query | Format-Table
    Start-Sleep -Seconds 1
    Clear-Host
} while ($true)
July 18, 2018 at 5:30 am
It's a brilliant idea, SSC Veteran.
July 22, 2018 at 9:17 pm
I respect anyone that will take the time to publish knowledge and, no... this isn't a slam. Just some helpful hints for your next post.
I think your script would have gotten better marks if you remembered the special but simple handling for the ResourceDB (which is what MS called it in one of their sample scripts) and if you had done your ORDER BY based on the column alias instead of repeating the formula. Also, you can easily convert page counts to megabytes simply by dividing by 128. Last but not least, you might want to do a spell check before you submit your post.
For example:
SELECT [DB_Name] = CASE WHEN database_ID = 32767 THEN 'ResourceDB' ELSE DB_NAME(database_id) END
,MB_Used = COUNT(*)/128.0
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY MB_Used DESC
;
--Jeff Moden
Change is inevitable... Change for the better is not.
July 22, 2018 at 9:18 pm
p.s. I'm not sure why someone would write a PoSh script for this when it's so easy to do in T-SQL and requires no special handling.
--Jeff Moden
Change is inevitable... Change for the better is not.
July 22, 2018 at 10:25 pm
Thanks for your input...
July 23, 2018 at 5:42 am
francesco.mantovani - Sunday, July 22, 2018 9:49 PM
Hello @jeff, I was thinking this was a good idea for monitoring SQL Server in real time, but maybe I'm wrong.
Do you think my PowerShell script might be too heavy because it keeps running every second in an infinite loop?
Thank you
First and again, just to be sure because it can be a sensitive subject.... in no way am I slamming you for the code.
To answer your question, since it does run once a second, it does seem a bit heavy-handed. Of course, that can be easily changed. Also, running once per second may be totally necessary when trying to troubleshoot a problem or when trying to establish some sort of pattern.
I'm just not sure why so many people resort to PoSh for such things when it's actually simpler to do such a thing from T-SQL. Also, in my case, having such an output go to the screen would be a bit overwhelming to review simply because of the number of databases present on my server. It would be much better if it were logged to a semi-permanent scratch table with a sample date/time, and to contain the code in a DBA utility stored procedure where you could tell it not only what the sample rate should be but also how long it should sample, with a decent set of defaults. And, if necessary, the stored procedure could easily be scheduled as a one-time or regularly scheduled job to assist with late night troubleshooting.
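A minimal T-SQL sketch of the approach described above. All object and parameter names here (the dbo.BufferUsageLog table, the dbo.DBA_SampleBufferUsage procedure, @SampleSeconds, @TotalMinutes) are hypothetical illustrations, not anything from the original posts:

```sql
-- Hypothetical scratch table to hold timestamped samples
CREATE TABLE dbo.BufferUsageLog
(
    SampleDT DATETIME2(0)  NOT NULL DEFAULT SYSDATETIME(),
    DB_Name  SYSNAME       NOT NULL,
    MB_Used  DECIMAL(12,2) NOT NULL
);
GO
-- Hypothetical DBA utility procedure: sample rate and duration
-- are parameters with reasonable defaults
CREATE PROCEDURE dbo.DBA_SampleBufferUsage
    @SampleSeconds INT = 10,  -- how often to sample
    @TotalMinutes  INT = 5    -- how long to keep sampling
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @StopAt DATETIME2(0) = DATEADD(MINUTE, @TotalMinutes, SYSDATETIME());
    -- Convert the sample interval to the hh:mm:ss format WAITFOR DELAY expects
    DECLARE @Delay  CHAR(8) = CONVERT(CHAR(8), DATEADD(SECOND, @SampleSeconds, 0), 108);

    WHILE SYSDATETIME() < @StopAt
    BEGIN
        INSERT dbo.BufferUsageLog (DB_Name, MB_Used)
        SELECT  [DB_Name] = CASE WHEN database_id = 32767
                                 THEN 'ResourceDB'
                                 ELSE DB_NAME(database_id)
                            END,
                MB_Used   = COUNT(*)/128.0
        FROM    sys.dm_os_buffer_descriptors
        GROUP BY database_id;

        WAITFOR DELAY @Delay;
    END;
END;
```

It could then be run interactively or scheduled as a one-time Agent job, e.g. EXEC dbo.DBA_SampleBufferUsage @SampleSeconds = 10, @TotalMinutes = 60; and the results reviewed later from dbo.BufferUsageLog at leisure.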
Yes, most of that can certainly be done in PoSh but why?
To make a play on words similar to what I'm frequently told, "Just because you can do something in T-SQL, doesn't mean you SHOULDN'T". 😀
--Jeff Moden
Change is inevitable... Change for the better is not.