September 5, 2012 at 8:48 am
Hi,
I work at a software house that builds ERP software (human resources, accounting, sales, purchases, ...). I've recently changed job functions and am now responsible for the database.
The database carries 10 years of legacy and plenty of "bad" practices.
Is there any good document, BOL topic or similar on analysing database performance by profiling an active database at a customer site? Things like: indexes that are used and not used, duplicate indexes, number of active connections, deadlocks and their sources, tables with large row counts that could be partitioned, the most-used queries/stored procedures, the top CPU-consuming queries, etc.
Thanks in advance,
Pedro
September 5, 2012 at 8:51 am
Where to begin? You're talking about performance tuning from every angle. Best suggestion would be to start with a specific problem or two and work those out. What are the biggest pain points your users are facing when using the application? What parts of the system are most problematic? Don't try to fix everything at once.
September 5, 2012 at 9:01 am
One of the biggest problems is the time some stored procedures take to execute. An example is an SP that inserts a sales document: it inserts into the sales tables, updates product stock (via triggers on the sales details), updates the customer account, ... and does other things.
This SP causes deadlocks (since it takes so long to run, other sales documents can't be created in the meantime) and has long-running queries (the stock updates), ...
In this case I have almost every problem, except for the table partitioning...
Can SQL Server DMVs help me with this? If so, is there any good documentation available with samples/examples?
Thanks,
Pedro
September 5, 2012 at 9:06 am
I'd start by profiling that process or at the very least that sproc. Look at the execution plan and start optimizing there. What about it is costly? Could indexes help? Does it use cursors? RBAR? Does it reference nested views?
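For example, something along these lines lets you see per-statement timing, I/O and the actual plan when you run it from SSMS (the proc name and parameters below are made up, just to show the shape):
-- per-statement CPU/elapsed time and logical reads
SET STATISTICS IO ON;
SET STATISTICS TIME ON;
-- returns the actual execution plan alongside the results
SET STATISTICS XML ON;

EXEC dbo.usp_InsertSalesDocument @CustomerId = 123, @DocumentDate = '20120905';

SET STATISTICS XML OFF;
SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;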
Does this get you started? http://msdn.microsoft.com/en-us/library/ms188246.aspx
I'm sure there are some forum posts and articles here about deadlocks but I don't have them handy.
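For the deadlocks specifically, if you're on 2008 or later the default system_health Extended Events session already records deadlock graphs, so something like this (rough sketch, ring buffer target assumed) pulls the recent ones without setting up a trace:
-- recent deadlock graphs from the built-in system_health session (2008+)
SELECT xed.event_data.query('(data/value/deadlock)[1]') AS deadlock_graph
FROM ( SELECT CAST(st.target_data AS XML) AS target_data
       FROM sys.dm_xe_session_targets AS st
       JOIN sys.dm_xe_sessions AS s
         ON s.address = st.event_session_address
       WHERE s.name = 'system_health'
         AND st.target_name = 'ring_buffer' ) AS t
CROSS APPLY t.target_data.nodes
    ('RingBufferTarget/event[@name="xml_deadlock_report"]') AS xed(event_data);
On 2005 you'd fall back to trace flag 1222 (DBCC TRACEON (1222, -1)) to write deadlock details to the error log.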
September 5, 2012 at 9:27 am
Scott D. Jacobson (9/5/2012)
I'd start by profiling that process or at the very least that sproc. Look at the execution plan and start optimizing there. What about it is costly? Could indexes help? Does it use cursors? RBAR? Does it reference nested views? Does this get you started? http://msdn.microsoft.com/en-us/library/ms188246.aspx
I'm sure there are some forum posts and articles here about deadlocks but I don't have them handy.
Scott and I are at the same starting point.
In order, these are the pain points I'd start looking at first.
1. Does this return any data? If it does, determine whether any of those cursors touch data... if they are cursors that just build commands from metadata, I'd think no problem, but anything selecting/updating/inserting has to be reviewed.
Identify the most-used procs and change them from cursors to set-based code:
--procedures with cursors in them
SELECT OBJECT_NAME(object_id) from sys.sql_modules WHERE definition like '%CURSOR%'
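(To find the most-used ones first, on 2008+ something like this gives a rough ranking from the plan cache; the counts only cover plans currently cached, so they reset on eviction or restart:)
-- procedures ranked by execution count (plan cache only, 2008+)
SELECT TOP 20
       OBJECT_NAME(ps.object_id, ps.database_id) AS proc_name,
       ps.execution_count,
       ps.total_worker_time  AS total_cpu_microseconds,
       ps.total_elapsed_time AS total_elapsed_microseconds
FROM sys.dm_exec_procedure_stats AS ps
WHERE ps.database_id = DB_ID()
ORDER BY ps.execution_count DESC;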
Views that reference other views are often very costly.
--views that reference other views
SELECT OBJECT_NAME(modz.object_id),vwz.name
from sys.sql_modules modz
LEFT OUTER JOIN sys.views vwz
ON modz.definition like '%' + vwz.name +'%'
WHERE OBJECT_NAME(modz.object_id) <> vwz.name
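(On 2008 and later, sys.sql_expression_dependencies is a cleaner way to get the same list and avoids false hits from the LIKE match; rough sketch:)
-- views that reference other views, via the dependency catalog view (2008+)
SELECT OBJECT_NAME(d.referencing_id) AS referencing_view,
       d.referenced_entity_name      AS referenced_view
FROM sys.sql_expression_dependencies AS d
JOIN sys.views AS v1 ON v1.object_id = d.referencing_id
JOIN sys.views AS v2 ON v2.object_id = d.referenced_id;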
2. Use the DMVs to find the top 20 most costly queries. Look at the execution plans of those queries and see whether they can be improved by making them SARGable or by adding better indexes. Fix these 20, then repeat until the most costly queries are not all that costly.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
SELECT
TOP 20
CAST(( qs.total_worker_time ) / 1000000.0 AS DECIMAL(28, 2)) AS [Total CPU time (s)],
CAST(qs.total_worker_time * 100.0 / qs.total_elapsed_time AS DECIMAL(28, 2)) AS [% CPU],
CAST(( qs.total_elapsed_time - qs.total_worker_time ) * 100.0 / qs.total_elapsed_time AS DECIMAL(28, 2)) AS [% Waiting],
qs.execution_count,
CAST(( qs.total_worker_time ) / 1000000.0 / qs.execution_count AS DECIMAL(28, 2)) AS [CPU time average (s)],
SUBSTRING (qt.text, ( qs.statement_start_offset / 2 ) + 1, ( ( CASE
WHEN qs.statement_end_offset = -1
THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
ELSE qs.statement_end_offset
END - qs.statement_start_offset ) / 2 ) + 1) AS [Individual Query],
qt.text AS [Parent Query],
DB_NAME(qt.dbid) AS DatabaseName,
qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.DM_EXEC_SQL_TEXT(qs.sql_handle) AS qt
CROSS APPLY sys.DM_EXEC_QUERY_PLAN(qs.plan_handle) qp
WHERE qs.total_elapsed_time > 0
ORDER BY
[Total CPU time (s)] DESC
Lowell
September 6, 2012 at 9:30 am
Are there any tools to analyse the database structure?
For example:
- tell me that 2 indexes have the same columns, or that 2 indexes could be merged into 1 to save space (something like the sketch further down);
- detect queries with UNIONs;
- if a table has data, check whether each column's data type matches the data actually inserted (for example, a float column that only ever holds integers);
- other useful things that can be detected at the structure level.
DMVs can help with the "runtime" side of execution, but these "basic" problems at the structure level aren't analysed.
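For the duplicate-index case, this is roughly the kind of check I mean (sketch only: it compares key column lists per table and ignores included columns, so the output is a list of candidates to review, not definite duplicates):
-- candidate duplicate indexes: same table, same key columns, same order
;WITH IndexKeys AS (
    SELECT i.object_id,
           i.index_id,
           i.name AS index_name,
           key_cols = STUFF((SELECT ',' + c.name
                             FROM sys.index_columns AS ic
                             JOIN sys.columns AS c
                               ON c.object_id = ic.object_id
                              AND c.column_id = ic.column_id
                             WHERE ic.object_id = i.object_id
                               AND ic.index_id  = i.index_id
                               AND ic.is_included_column = 0
                             ORDER BY ic.key_ordinal
                             FOR XML PATH('')), 1, 1, '')
    FROM sys.indexes AS i
    WHERE i.index_id > 0
)
SELECT OBJECT_NAME(a.object_id) AS table_name,
       a.index_name,
       b.index_name AS possible_duplicate,
       a.key_cols
FROM IndexKeys AS a
JOIN IndexKeys AS b
  ON a.object_id = b.object_id
 AND a.key_cols  = b.key_cols
 AND a.index_id  < b.index_id;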
Has anyone taken a look at the Red Gate book on DMVs, "SQL Server DMV Starter Pack"?
Thanks,
Pedro
September 6, 2012 at 9:53 am
For performance, maybe start here:
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability