September 20, 2017 at 12:04 pm
We're looking at installing a new COTS tool, and the installation instructions request a pretty robust hardware config (8 CPUs, 64 GB RAM). Based on the size of our shop and the existing applications we're running, we're nowhere near this config in our other SQL Server environments, so I'd like to start smaller and save some money.
The users in this case are the Network Team, and they're distrustful of a smaller initial setup unless I can track performance over time. My past experience has been with targeting individual T-SQL processes that we have control over and fixing them. In this case, it's watching the overall instance. In these cases, do you use a 3rd-party tool to monitor performance, or is there a good set of existing (aka free) tools that I can learn to get a regular, overall reading of system performance?
Thanks,
--=Chuck
September 20, 2017 at 12:29 pm
chuck.forbes - Wednesday, September 20, 2017 12:04 PM
You can store file IO stalls and wait stats in a table and analyze/trend to your heart's content. There is a small wrinkle with wait stats in that they can be reset by users, but this can be dealt with. That's plenty for a high-level look-see like you desire.
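The reset wrinkle is manageable if you snapshot the wait stats on a schedule and diff consecutive snapshots: when a counter went *down*, a reset happened in between, so the current value is itself the delta since the reset. A rough Python sketch of that diff logic (the counter names and numbers are invented for illustration):

```python
def wait_deltas(prev, curr):
    """Diff two snapshots of sys.dm_os_wait_stats, given as dicts of
    wait_type -> cumulative wait_time_ms. If a counter decreased, the
    stats were reset, so the current value IS the delta since the reset."""
    deltas = {}
    for wait_type, curr_ms in curr.items():
        prev_ms = prev.get(wait_type, 0)
        deltas[wait_type] = curr_ms - prev_ms if curr_ms >= prev_ms else curr_ms
    return deltas

prev = {"PAGEIOLATCH_SH": 1200, "WRITELOG": 800}
curr = {"PAGEIOLATCH_SH": 1500, "WRITELOG": 300}   # WRITELOG was reset in between
print(wait_deltas(prev, curr))   # {'PAGEIOLATCH_SH': 300, 'WRITELOG': 300}
```

Store the deltas (not the raw cumulative values) in your trending table and resets stop mattering.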
Steve Stedman has a freebie: http://databasehealth.com/
Redgate (sponsor/owner of this site IIRC) has a good one, SQL Monitor (http://www.red-gate.com/products/dba/sql-monitor/).
There are many others.
BTW, I can COMPLETELY TRASH a database application with that server size simply by screwing up the thing that was NOT mentioned in your spec: IO capability. That is VERY OFTEN under-provisioned by clients I help (sometimes even those with SAN-based SSDs).
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
September 22, 2017 at 12:21 pm
chuck.forbes - Wednesday, September 20, 2017 12:04 PM
Windows Performance Monitor is free and has a lot of counters for monitoring overall system performance. It includes counters for SQL Server as well, so it can give you information on the operating system, server resources, and SQL Server itself.
Sue
September 22, 2017 at 1:09 pm
Thanks Sue. I've been looking over perfmon since the day I put up this post. It's got some good information coming out of it; now I'm trying to figure out how to manage it. There are some instructions for older OSes which let you send its data to a CSV, but I haven't been able to find anything like that for Win2012 onward, so that's my next challenge. I need to be able to get a historical perspective on performance, so I've got to be able to handily compare values over time.
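Once the counters are in a CSV, comparing values over time is easy to script. A minimal Python sketch, assuming a perfmon-style CSV where the first column is the timestamp and the remaining columns are numeric counter samples (the file name and layout are assumptions, not perfmon gospel):

```python
import csv
from statistics import mean

def counter_averages(csv_path):
    """Average each counter column in a perfmon-style CSV export.
    The first column is assumed to be the timestamp; blank cells
    (gaps in collection) are skipped."""
    with open(csv_path, newline="") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    averages = {}
    for i, name in enumerate(header[1:], start=1):
        samples = [float(r[i]) for r in data if r[i].strip()]
        if samples:
            averages[name] = mean(samples)
    return averages
```

Run it against exports from different weeks and you have the before/after comparison the Network Team is asking for.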
--=Chuck
September 22, 2017 at 1:51 pm
Sue
September 22, 2017 at 3:41 pm
Maybe you can help, then. That step 6), "click on Properties. Select source and select log file option and browse folder where the file is saved." I think that's where I'm supposed to be able to write directly to a CSV file, but this isn't available on the properties menu I'm able to pull up. I can't find a "log file" option in this menu (right-click on the Data Collector Set, pick Properties):
Nor in this menu, which is what I get when I right-click on the item in the performance pane, while I have my Data Collector Set highlighted:
September 22, 2017 at 3:45 pm
Ah, I can get to that as an option, but only for the real-time performance monitor, not for a Data Collector Set.
--=cf
September 22, 2017 at 4:06 pm
chuck.forbes - Friday, September 22, 2017 3:45 PM
You have to run the one you configured and then stop it. You want to get enough data for it to actually report on, though...so run it for a bit. Then go to the report - it's created automatically down below the collector sets. The name of the report should be the same as your Data Collector Set, in the User Defined folder. Open the report, right-click on it as it displays on the right, and select Save Data As. There is a drop-down for blg, csv and something else...can't remember. It has a regular file save-as dialog box so you can save it wherever.
If you don't get enough data, just restart it; it will create another report and file and append incrementing numbers to the end of the names. When you stop it, the report will also get generated with the numbers at the end of the name.
Sue
September 22, 2017 at 4:09 pm
Got it.
I'll let this run on a development server over the weekend and then come back & see what kind of data was generated.
--=Chuck
September 22, 2017 at 4:16 pm
chuck.forbes - Friday, September 22, 2017 4:09 PM
You can stop it at a particular time or at a max file size if you want - right-click on the Data Collector Set and select Properties. You can use a stop condition to stop based on file size or duration. You can also schedule it from that same window on the Schedule tab.
Sue
September 22, 2017 at 6:10 pm
Perfmon can work fine, but the thing I'd note is that you'll spend time building a system and managing the data, as well as tweaking it over and over across time. There's a cost to your effort. The COTS tools are designed to save you that time so you can do other work.
If that's worth it to you, then SQL Monitor from Redgate or other tools are worth a look. If you have the time, then play with Perfmon. Keep in mind you need to not only start/stop, store, and aggregate the information, but also archive/remove it over time.
Disclosure: I work for Redgate.
September 23, 2017 at 4:06 pm
For monitoring I do recommend SQL Monitor - I've been using it for years and can't go without it.
It has a fully functional 14-day evaluation, which may be good enough for you to prove the current server usage.
As for using perfmon and saving to a file/SQL Server table, have a look at https://www.sqlservercentral.com/Forums/FindPost1897712.aspx - although it is talking about SSAS, I also include scripts to get perfmon counters and the process to write them to a CSV and import it into a SQL table.
You will need to identify the counters required for SQL Server, but those you can find on the net yourself.
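The CSV-to-table half of that workflow can be sketched in a few lines of Python. This uses the standard library's sqlite3 purely as a stand-in for a real SQL Server table (with SQL Server you'd do the same insert via pyodbc or bcp), and assumes a perfmon-style CSV whose first column is the timestamp:

```python
import csv
import sqlite3

def load_counters(csv_path, db_path="perfmon.db"):
    """Load a perfmon-style CSV export into a long-format table for
    trending. sqlite3 stands in for SQL Server here; blank cells
    (collection gaps) are skipped."""
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS counter_samples (
                        sample_time TEXT, counter_name TEXT, value REAL)""")
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        for row in reader:
            for name, cell in zip(header[1:], row[1:]):
                if cell.strip():
                    conn.execute(
                        "INSERT INTO counter_samples VALUES (?, ?, ?)",
                        (row[0], name, float(cell)))
    conn.commit()
    return conn
```

Long format (one row per timestamp/counter pair) keeps the table schema stable even when you add or drop counters later.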
September 24, 2017 at 11:20 am
You can also play at monitor.sqlservercentral.com. Live (delayed) data from this site.