March 2, 2022 at 5:20 pm
Memory - 32 GB
MAXDOP, cost threshold -- Default
Index and stats are current...
It's a VM with a backend SAN and no issues with latency.
March 2, 2022 at 6:01 pm
Index and stats are current...
How do you know this? Can you describe your process for maintenance?
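One way to verify rather than assume is a quick check on statistics age (a sketch; sys.dm_db_stats_properties needs SQL Server 2008 R2 SP2 or later, and the table name here is just the one from this thread -- swap in your own):

```sql
-- When were statistics on the design table last updated, and how many
-- row modifications have accumulated since?
SELECT
    OBJECT_NAME(s.object_id)  AS TableName,
    s.name                    AS StatName,
    sp.last_updated           AS LastUpdated,
    sp.rows,
    sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID('dbo.design')
ORDER BY sp.last_updated;
```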
MAXDOP, cost threshold -- Default
There are a lot of articles and advice related to MAXDOP. How many cores does this machine have?
https://www.sentryone.com/blog/is-maxdop-configured-correctly
I'm betting the cost threshold should be raised. This query will show you the queries with parallel execution plans; I got it from Jonathan Kehayias. https://www.sqlskills.com/blogs/jonathan/tuning-cost-threshold-for-parallelism-from-the-plan-cache/
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
WITH XMLNAMESPACES
(DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT
    query_plan AS CompleteQueryPlan,
    n.value('(@StatementText)[1]', 'VARCHAR(4000)') AS StatementText,
    n.value('(@StatementOptmLevel)[1]', 'VARCHAR(25)') AS StatementOptimizationLevel,
    n.value('(@StatementSubTreeCost)[1]', 'VARCHAR(128)') AS StatementSubTreeCost,
    n.query('.') AS ParallelSubTreeXML,
    ecp.usecounts,
    ecp.size_in_bytes
FROM sys.dm_exec_cached_plans AS ecp
CROSS APPLY sys.dm_exec_query_plan(plan_handle) AS eqp
CROSS APPLY query_plan.nodes('/ShowPlanXML/BatchSequence/Batch/Statements/StmtSimple') AS qn(n)
WHERE n.query('.').exist('//RelOp[@PhysicalOp="Parallelism"]') = 1
ORDER BY n.value('(@StatementText)[1]', 'VARCHAR(4000)');
-- Alternate sort orders: ecp.usecounts DESC, or n.value('(@StatementSubTreeCost)[1]', 'VARCHAR(128)')
It's a VM with a backend SAN and no issues with latency.
A little more info would really help. How are you determining latency, and if it's good or bad?
VMware has a very good article on configuring a VM for SQL Server. Memory and CPU should be reserved in the VM. You can add up to 4 SCSI adapters to a VM; the VMware Paravirtual adapter should be used for data, logs, and tempdb.
Are these drives formatted with an 8 KB allocation unit size, or 64 KB?
You may want to read this Stairway article https://www.sqlservercentral.com/stairways/stairway-to-sql-server-virtualization
And this one https://www.sqlservercentral.com/stairways/stairway-to-sql-server-indexes
Some guy named Jeff Moden did an exhaustive set of research related to indexes. His eye-opening findings are in a series of videos. Google "Jeff Moden black arts index maintenance".
Some other guy named Grant Fritchey wrote some books and articles about a lot of the things you are struggling with. I think this is a good place to start https://www.red-gate.com/simple-talk/books/
And, please help us help you.
Michael L John
If you assassinate a DBA, would you pull a trigger?
To properly post on a forum:
http://www.sqlservercentral.com/articles/61537/
March 2, 2022 at 6:04 pm
Hope I caught all the replies. I'm going with having the ORM code fixed, as suggested.
It looks like someone is generating ORM code but doing it incorrectly. ORM tools can be configured to use the correct data types. I'd get with the developers to fix that right now, ahead of the performance issues. Get your ORM code correct or you're always, ALWAYS, going to have pain.
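To make the problem concrete, here's a minimal sketch (the table and names are invented for illustration, and the exact plan impact depends on your version and collation) of what a string-typed ORM parameter does to a varchar column:

```sql
-- An ORM that defaults every string parameter to nvarchar(4000) produces
-- this shape against a varchar column:
CREATE TABLE dbo.DemoDesign
(
    design   varchar(40)  NOT NULL PRIMARY KEY,
    attValue varchar(255) NULL
);

DECLARE @design nvarchar(4000) = N'MMVERIFY.001';

-- nvarchar outranks varchar in data type precedence, so SQL Server applies
-- CONVERT_IMPLICIT to the COLUMN side, and the index on design can no
-- longer be used for a plain seek.
SELECT attValue FROM dbo.DemoDesign WHERE design = @design;

-- With the parameter typed to match the column, the seek stays clean:
DECLARE @design2 varchar(40) = 'MMVERIFY.001';
SELECT attValue FROM dbo.DemoDesign WHERE design = @design2;
```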
March 2, 2022 at 6:11 pm
Hope I caught all the replies. I'm going with having the ORM code fixed, as suggested.
You didn't. And while I agree that ORM code frequently needs to be fixed, it's probably not going to fix the problem you originally posted.
And on your comment about indexes and stats being up to date... yeah... not on the table in question. 😉
NP. Moving on. 😉
--Jeff Moden
Change is inevitable... Change for the better is not.
March 3, 2022 at 4:38 pm
All problems won't be fixed in the code, but I'd start there. Get that right, then figure out how the queries really behave. From there, you can make some good index choices.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
March 3, 2022 at 7:58 pm
Thanks... that's the plan of attack.
March 4, 2022 at 1:19 pm
I do have another question as I dig into this further to help uncover code that needs fixing. When I ran the provided query to uncover IMPLICIT_CONVERSION queries (thanks), I started looking at the execution plan it captured. I see some of the conversions happening, but I don't see all of them?
Example:
(@1 nvarchar(4000),@2 nvarchar(4000),@3 nvarchar(4000),@4 tinyint,@5 tinyint)
UPDATE [design] set [attValue] = @1
WHERE [attribute]=@2 AND [design]=@3 AND [designAlternative]=@4 AND [version]=@5
In the Index Seek (predicate) I can see it doing an IMPLICIT_CONVERSION on @4, but not on the other values, even though none of the table's data types match @1, @2, @3, or @5.
March 4, 2022 at 1:53 pm
Effectively, a varchar(50) and a varchar(4000) are treated the same, so you won't see an implicit conversion there. It's the varchar-to-int conversions that you're going to see the most.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
March 4, 2022 at 2:06 pm
Thx,
Do we pay any performance hit when the script reports back that it's seeing:
Table - Design
Column - design
ConvertFrom - nvarchar
ConvertFromLength - 40
ConvertToLength - 240
March 4, 2022 at 5:13 pm
Help me understand this scenario... I'm trying to TEST a fix for an implicit conversion on a single update.
Sample 1 does an int to nvarchar(255) conversion.
In sample 2 I try a CAST to the correct column type. --Fix
I don't see any difference in the query execution plan.
-- Sample 1
UPDATE design SET attvalue = 10
WHERE design ='MMVERIFY.001'
AND designAlternative = 0 AND version =0
AND position =10 AND class ='process'
AND attribute ='bomposition' AND layerCode =10
--sample 2
UPDATE design SET attvalue = CAST(10 AS NVARCHAR)
WHERE design ='MMVERIFY.001'
AND designAlternative = 0 AND version =0
AND position =10 AND class ='process'
AND attribute ='bomposition' AND layerCode =10
March 4, 2022 at 7:09 pm
Help me understand this scenario... I'm trying to TEST a fix for an implicit conversion on a single update. Sample 1 does an int to nvarchar(255) conversion. In sample 2 I try a CAST to the correct column type. --Fix
I don't see any difference in the query execution plan.
-- Sample 1
UPDATE design SET attvalue = 10
WHERE design ='MMVERIFY.001'
AND designAlternative = 0 AND version =0
AND position =10 AND class ='process'
AND attribute ='bomposition' AND layerCode =10
--sample 2
UPDATE design SET attvalue = CAST(10 AS NVARCHAR)
WHERE design ='MMVERIFY.001'
AND designAlternative = 0 AND version =0
AND position =10 AND class ='process'
AND attribute ='bomposition' AND layerCode =10
Size your nvarchar at the proper size, or simply make the value an nvarchar literal by prefixing the quotes with N, such as N'10'. (Note that CAST(10 AS NVARCHAR) with no length defaults to nvarchar(30), so always size it explicitly.)
As far as the script reporting different sizes as an implicit conversion, think about it: if a column is sized at 50 and the variable is sized at 255, SQL Server needs to validate that the data in the variable will fit in the column. That's not too costly when you're updating or inserting one row, but across many rows, or many executions, it adds up.
The ones to really focus on are int to nvarchar, and nvarchar to varchar.
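A minimal illustration of the int-to-nvarchar case (the demo table is invented; the column name mirrors this thread):

```sql
CREATE TABLE dbo.DemoAttr
(
    attValue nvarchar(255) NULL   -- numbers stored as text, as in the design table
);
INSERT INTO dbo.DemoAttr (attValue) VALUES (N'10');

-- int outranks nvarchar in data type precedence, so every row's attValue
-- is implicitly converted to int to satisfy this predicate (and the query
-- errors outright if any row holds a non-numeric string):
SELECT * FROM dbo.DemoAttr WHERE attValue = 10;

-- Matching the literal to the column type avoids the conversion entirely:
SELECT * FROM dbo.DemoAttr WHERE attValue = N'10';
```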
Michael L John
If you assassinate a DBA, would you pull a trigger?
To properly post on a forum:
http://www.sqlservercentral.com/articles/61537/
March 4, 2022 at 8:14 pm
Thanks for the response. I'm going to try one int-to-nvarchar fix in tonight's processing and see what happens. -- A test, of course...
March 6, 2022 at 3:11 pm
I found another implicit conversion happening on this statement, and I'm looking for suggestions on how to improve the query, or just the IS NOT NULL predicate:
WHERE d1.attvalue IS NOT NULL -- this is where it's converting from INT to NVARCHAR(255).
Thanks.
INSERT INTO bom
SELECT DISTINCT d1.design, d1.version, d1.attvalue AS BOMitem,
       d1.designAlternative, d2.attValue AS BOMdesignAlternative
FROM design d1
INNER JOIN design d2
    ON  d1.design = d2.design
    AND d1.version = d2.version
    AND d1.designAlternative = d2.designAlternative
    AND d1.layerCode = d2.layerCode
    AND d1.position = d2.position
    AND d1.operation = d2.operation
WHERE d1.attvalue IS NOT NULL
  AND d1.attribute = N'BOMitem'
  AND d2.attribute = N'BOMdesignAlternative'
  AND d1.design = N'PRY_CABLE_LV_5X16SQMM'
  AND d1.designAlternative = 4
  AND d1.version = 0
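For what it's worth, if the conversion is coming from the nvarchar attvalue values landing in int columns in bom, one way is to make the conversion explicit. This is only a sketch: it assumes SQL Server 2012+ for TRY_CAST, and it assumes BOMitem and BOMdesignAlternative are the int-typed targets -- check your actual bom definition first.

```sql
INSERT INTO bom
SELECT DISTINCT
    d1.design,
    d1.version,
    TRY_CAST(d1.attvalue AS int) AS BOMitem,               -- explicit; NULL instead of an error on bad data
    d1.designAlternative,
    TRY_CAST(d2.attValue AS int) AS BOMdesignAlternative
FROM design AS d1
INNER JOIN design AS d2
    ON  d1.design = d2.design
    AND d1.version = d2.version
    AND d1.designAlternative = d2.designAlternative
    AND d1.layerCode = d2.layerCode
    AND d1.position = d2.position
    AND d1.operation = d2.operation
WHERE d1.attvalue IS NOT NULL
  AND d1.attribute = N'BOMitem'
  AND d2.attribute = N'BOMdesignAlternative'
  AND d1.design = N'PRY_CABLE_LV_5X16SQMM'
  AND d1.designAlternative = 4
  AND d1.version = 0;
```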
March 6, 2022 at 3:29 pm
I did want to mention that this machine is a 2-node (8 cores per node) machine and the cost threshold for parallelism is set to 50.
MAXDOP at the server level is still 0.
I reset my stats last night before the process ran so I could get a clear picture of what's happening. I ran a query that shows the Top 10 CPU-intensive queries, and this query was top of my list. The table has fewer than 1,000 records:
(@1 nvarchar(4000),@2 nvarchar(4000),@3 tinyint,@4 nvarchar(4000),@5 nvarchar(4000),@6 tinyint,@7 tinyint)
DELETE [designError]
WHERE [design]=@1
AND [subDesign]=@2
AND [designAlternative]=@3
AND [layerCode]=@4
AND [object]=@5
AND [position]=@6
AND [operation]=@7
March 7, 2022 at 2:59 am
I did want to mention that this machine is a 2-node (8 cores per node) machine and the cost threshold for parallelism is set to 50. MAXDOP at the server level is still 0.
I reset my stats last night before the process ran so I could get a clear picture of what's happening. I ran a query that shows the Top 10 CPU-intensive queries, and this query was top of my list. The table has fewer than 1,000 records:
(@1 nvarchar(4000),@2 nvarchar(4000),@3 tinyint,@4 nvarchar(4000),@5 nvarchar(4000),@6 tinyint,@7 tinyint)
DELETE [designError]
WHERE [design]=@1
AND [subDesign]=@2
AND [designAlternative]=@3
AND [layerCode]=@4
AND [object]=@5
AND [position]=@6
AND [operation]=@7
Check the data types of all those parameters, especially the NVARCHAR(4000) stuff. If they're playing against VARCHAR() columns, you've just found your first major performance problem.
Next, check for triggers, the number of indexes, and the number of foreign key relationships that this table has with other tables.
And MAXDOP is probably not correct for your number of CPUs and workload. I have a seriously mixed workload of large batch code and a shedload of OLTP. I have half a TB of RAM and 32 cores... and got a serious improvement on the large batch stuff by changing MAXDOP from 8 to 4. YMMV, so test it.
IMHO, most code doesn't need more than a quarter of the total cores in the machine if it was written correctly, and if it wasn't, more cores won't help anyway. "Scale smart instead of up and out". 😀
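If it helps, checking and changing those settings looks like this (the value 4 is only an example from the discussion above -- test against your own workload before committing to it):

```sql
-- Current server-level settings:
SELECT name, value_in_use
FROM sys.configurations
WHERE name IN ('max degree of parallelism', 'cost threshold for parallelism');

-- Example change (requires the ALTER SETTINGS permission):
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure 'max degree of parallelism', 4;  -- example value only
RECONFIGURE;
```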
--Jeff Moden
Change is inevitable... Change for the better is not.