March 7, 2022 at 1:12 pm
I checked this table: it has no FKs or triggers, and the character fields are defined as nvarchar. There are NO indexes or PK set on this table.
Are you thinking I could try MAXDOP at 4 for my configuration?
Thanks for the response.
March 7, 2022 at 3:31 pm
I ran this query to show me the execution count of the nightly process. The query that was number 1 on the list was:
(@1 nvarchar(4000),@2 nvarchar(4000))
DELETE [bom]
WHERE [PlantId]=@1 AND [ParentPart]=@2
The table def has PlantId as nchar(9) and ParentPart as nvarchar(40).
I didn't notice any implicit conversions in the query plan, but it has high "Total Logical Reads". There is a non-clustered index on PlantId and ParentPart.
Thanks.
SELECT
    qs.execution_count,
    qt.text AS query_text,
    qt.dbid,
    DBName = DB_NAME(qt.dbid),
    qt.objectid,
    qs.total_rows,
    qs.last_rows,
    qs.min_rows,
    qs.max_rows
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
ORDER BY qs.execution_count DESC;
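For reference, the keys of that non-clustered index can be confirmed straight from the catalog views. A minimal sketch, assuming the table is dbo.bom as in the DELETE above:

SELECT i.name AS index_name,
       i.type_desc,
       c.name AS column_name,
       ic.key_ordinal
FROM sys.indexes AS i
JOIN sys.index_columns AS ic
  ON ic.object_id = i.object_id
 AND ic.index_id  = i.index_id
JOIN sys.columns AS c
  ON c.object_id = ic.object_id
 AND c.column_id = ic.column_id
WHERE i.object_id = OBJECT_ID(N'dbo.bom')
ORDER BY i.name, ic.key_ordinal;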
March 7, 2022 at 4:45 pm
I ran this query to show me the execution count of the nightly process. The query that was number 1 on the list was:
(@1 nvarchar(4000),@2 nvarchar(4000)) DELETE [bom] WHERE [PlantId]=@1 AND [ParentPart]=@2
The table def has PlantId as nchar(9) and ParentPart as nvarchar(40).
I didn't notice any implicit conversions in the query plan, but it has high "Total Logical Reads". There is a non-clustered index on PlantId and ParentPart.
Thanks.
SELECT
    qs.execution_count,
    qt.text AS query_text,
    qt.dbid,
    DBName = DB_NAME(qt.dbid),
    qt.objectid,
    qs.total_rows,
    qs.last_rows,
    qs.min_rows,
    qs.max_rows
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
ORDER BY qs.execution_count DESC;
The actual execution plan would be helpful, as well as the structure of the table and indexes.
Michael L John
If you assassinate a DBA, would you pull a trigger?
To properly post on a forum:
http://www.sqlservercentral.com/articles/61537/
March 7, 2022 at 5:42 pm
I checked this table: it has no FKs or triggers, and the character fields are defined as nvarchar. There are NO indexes or PK set on this table.
Are you thinking I could try MAXDOP at 4 for my configuration?
Thanks for the response.
The vast majority of your tables should have a clustered index, generally on the column or columns that are most frequently used to select the data. Also, a primary key should be on every table (again, some exceptions, but exceptions should be exceptional).
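As a minimal sketch of what that could look like for the table in the DELETE above, assuming PlantId plus ParentPart uniquely identifies a row (only your data can confirm that, and the constraint name here is made up):

ALTER TABLE dbo.bom
    ADD CONSTRAINT PK_bom_PlantId_ParentPart
    PRIMARY KEY CLUSTERED (PlantId, ParentPart);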
Why MAXDOP 4? You're set to 0, the default. Guessing, but you're probably also set to the default Cost Threshold for Parallelism of 5 (which is, without a doubt, too low; it is on every system I've ever worked with). Instead of poking at MAXDOP for no reason at all, examine the plan. Find out why it didn't go parallel (it's in the properties of the first operator). That will tell you more about your need, or not, for parallelism.
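A quick way to see both of those instance settings before changing anything, using the standard configuration catalog view:

SELECT name, value_in_use
FROM sys.configurations
WHERE name IN (N'max degree of parallelism', N'cost threshold for parallelism');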
Don't just try things to try things. Understand what you're doing and why. It'll make troubleshooting performance easier.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
March 7, 2022 at 7:14 pm
I see 3 pieces of information in the query plan for this delete operation.
Delete -- Cost 0%
(@1 nvarchar(4000),@2 nvarchar(4000))
DELETE [mx_bom] WHERE [PlantId]=@1 AND [ParentPart]=@2
Table Delete -- Cost 86%
No real text here except the table name.
Index Seek (NonClustered) (index_name) -- Cost 14%
Just info in the Seek Predicates: Scalar Operator @1 and Scalar Operator @2.
March 7, 2022 at 8:16 pm
I don't think anyone will be able to help until we can see the actual execution plan. Here is one way of providing that information: https://www.brentozar.com/pastetheplan/instructions/
With that said - looking at the top executing query isn't necessarily going to identify that query as something that needs to be addressed. All it indicates is that this query is called a lot - but if that execution is sub-second each time then you probably don't have an issue.
The high total logical reads is cumulative across all executions, so by itself it's not really a good indicator of bad performance. Again, looking at the execution plan would help confirm whether that is the case.
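To separate "called a lot" from "expensive per call", the same DMV used earlier in the thread can report per-execution averages. A sketch:

SELECT TOP (20)
       qs.execution_count,
       qs.total_logical_reads,
       avg_logical_reads = qs.total_logical_reads / qs.execution_count,
       avg_elapsed_ms    = qs.total_elapsed_time / qs.execution_count / 1000,
       qt.text AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
ORDER BY qs.total_logical_reads DESC;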
As for your implicit conversions - you need to look at the WHERE clause. If the implicit conversion is on a column, an index seek cannot be used and that query will scan the clustered index or heap. If the implicit conversion is on the parameter, it can and will be performed one time on that parameter. In either case, an implicit conversion may very well cause an issue with the cardinality estimator, which could cause that query to generate a less-than-optimal execution plan (it may not - just a warning that it could).
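For illustration, here is the difference, using an nvarchar column like the attValue column that comes up later in the thread (a self-contained sketch; the table is made up):

CREATE TABLE #t (attValue nvarchar(255) NOT NULL);
CREATE INDEX IX_t_attValue ON #t (attValue);

-- Conversion on the column side: not SARGable, so no index seek on attValue
SELECT attValue FROM #t WHERE CONVERT(int, attValue) = 0;

-- Comparison in the column's own type: SARGable, a seek is possible
-- (the two predicates are not always logically identical, e.g. N'00' also converts to 0)
SELECT attValue FROM #t WHERE attValue = N'0';

DROP TABLE #t;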
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
March 7, 2022 at 9:45 pm
All it indicates is that this query is called a lot - but if that execution is sub-second each time then you probably don't have an issue.
Ah, careful now. Been there and done that. Your worst queries are usually not your longest-running queries. They're usually very short-running queries that run thousands of times per hour, where milliseconds suddenly do matter.
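One hedged way to surface those "death by a thousand cuts" queries is to rank the plan cache by cumulative cost rather than per-execution cost (same DMVs as earlier in the thread):

SELECT TOP (20)
       qs.execution_count,
       qs.total_worker_time,   -- cumulative CPU time, microseconds
       qs.total_elapsed_time,  -- cumulative duration, microseconds
       qt.text AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
ORDER BY qs.total_worker_time DESC;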
--Jeff Moden
Change is inevitable... Change for the better is not.
March 7, 2022 at 11:36 pm
I found the offending query when I looked at the query plan.
Type conversion in expression (CONVERT_IMPLICIT(int,[d2].[attValue],0)) may affect "CardinalityEstimate" in query plan choice.
How do I go about fixing this?
Thanks.
INSERT INTO bom SELECT distinct d1.design, d1.version,
d1.attvalue BOMitem, d1.designAlternative, d2.attValue BOMdesignAlternative
FROM design d1 INNER JOIN design d2 ON d1.design = d2.design AND d1.version=d2.version
AND d1.designAlternative = d2.designAlternative AND d1.layerCode = d2.layerCode
AND d1.position = d2.position AND d1.operation = d2.operation
WHERE d1.attvalue IS NOT NULL AND d1.attribute = N'BOMitem' AND d2.attribute = N'BOMdesignAlternative'
AND d1.design=N'CORE/FILLED' AND d1.designAlternative = 0 AND d1.version = 0
March 8, 2022 at 3:51 am
I found the offending query when I looked at the query plan.
Type conversion in expression (CONVERT_IMPLICIT(int,[d2].[attValue],0)) may affect "CardinalityEstimate" in query plan choice.
How do I go about fixing this?
Thanks.
INSERT INTO bom SELECT distinct d1.design, d1.version,
d1.attvalue BOMitem, d1.designAlternative, d2.attValue BOMdesignAlternative
FROM design d1 INNER JOIN design d2 ON d1.design = d2.design AND d1.version=d2.version
AND d1.designAlternative = d2.designAlternative AND d1.layerCode = d2.layerCode
AND d1.position = d2.position AND d1.operation = d2.operation
WHERE d1.attvalue IS NOT NULL AND d1.attribute = N'BOMitem' AND d2.attribute = N'BOMdesignAlternative'
AND d1.design=N'CORE/FILLED' AND d1.designAlternative = 0 AND d1.version = 0
My 1st suggestion would be to always use the 2-part naming convention for all objects.
2nd, figure out why you need a DISTINCT and try to get rid of it. You shouldn't need it.
3rd, you folks really need to adopt a formatting standard to make your life easier for inline documentation and troubleshooting, like you need to do for this problem. Here's your code using what I refer to as the "River Format"...
INSERT INTO dbo.bom
SELECT DISTINCT
d1.design
,d1.version
,BOMitem = d1.attvalue
,d1.designAlternative
,BOMdesignAlternative = d2.attValue
FROM dbo.design d1
JOIN dbo.design d2
ON d1.design = d2.design
AND d1.version = d2.version
AND d1.designAlternative = d2.designAlternative
AND d1.layerCode = d2.layerCode
AND d1.position = d2.position
AND d1.operation = d2.operation
WHERE d1.attvalue IS NOT NULL
AND d1.attribute = N'BOMitem'
AND d2.attribute = N'BOMdesignAlternative'
AND d1.design = N'CORE/FILLED'
AND d1.designAlternative = 0
AND d1.version = 0
;
Since you're doing a self-join, the datatypes between the d1 and d2 aliases are automatically identical, so there's no need to check anything in the FROM clause. That's pretty easy to see now that the code is in an easily readable format.
That leaves only the column datatypes compared to the literals in the WHERE clause, and the datatypes between the d1/d2 columns and the columns of the bom table in the SELECT list.
Once you've checked those and have fixed them (especially if they have to do with the Unicode literals in the code playing against VARCHAR() columns), you should also seriously dig into why you need a DISTINCT, because that's covering up the fact that your code is generating more rows than you need.
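A hedged way to check whether the self-join is what's multiplying rows (a diagnostic sketch built from the same join keys and filters; it is not part of the original post):

SELECT  d1.design, d1.version, d1.designAlternative,
        d1.layerCode, d1.position, d1.operation,
        matches = COUNT(*)
FROM    dbo.design d1
JOIN    dbo.design d2
  ON    d1.design = d2.design
  AND   d1.version = d2.version
  AND   d1.designAlternative = d2.designAlternative
  AND   d1.layerCode = d2.layerCode
  AND   d1.position = d2.position
  AND   d1.operation = d2.operation
WHERE   d1.attvalue IS NOT NULL
  AND   d1.attribute = N'BOMitem'
  AND   d2.attribute = N'BOMdesignAlternative'
  AND   d1.design = N'CORE/FILLED'
  AND   d1.designAlternative = 0
  AND   d1.version = 0
GROUP BY d1.design, d1.version, d1.designAlternative,
        d1.layerCode, d1.position, d1.operation
HAVING  COUNT(*) > 1;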
You should also use a target column list for the bom table right after the INSERT. Not having one is a major accident just waiting to happen if column changes are ever made to the bom table. With some really bad luck, you might not even notice for a while, until the boss asks you why the numbers are wrong.
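For the column list, a fragment-only sketch using the bom columns from the DDL posted later in the thread, and assuming the positional mapping in the original INSERT is what was intended:

-- Fragment only: the FROM/JOIN/WHERE portion is unchanged from the query above.
INSERT INTO dbo.bom
        (design, version, subdesign, designAlternative, subDesignAlternative)
SELECT  d1.design,
        d1.version,
        d1.attvalue,            -- goes to bom.subdesign
        d1.designAlternative,
        d2.attValue             -- goes to bom.subDesignAlternative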
--Jeff Moden
Change is inevitable... Change for the better is not.
March 8, 2022 at 4:45 am
Thanks for the update. I'm checking on the why for the DISTINCT.
How do I fix this in the query plan:
Type conversion in expression (CONVERT_IMPLICIT(int,[d2].[attValue],0)) may affect "CardinalityEstimate" in query plan choice.
Design table:
attValue nvarchar(255)
attribute nvarchar(60), part of PK
design nvarchar(40), part of PK
designAlternative int, part of PK
version int, part of PK
March 8, 2022 at 7:23 am
It's the code. I know you've said you can't affect that, but that's where the problem is. You have to use the correct data types. That's how you fix these conversions. It's not complicated at all; it's really simple: INT for INT, VARCHAR(50) for VARCHAR(50), DATETIME for DATETIME, etc. Without that, you get what you're dealing with. And fixing the indexes, updating the stats - some of that may help in some situations, but it fundamentally won't fix the core issue, the code.
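As a sketch of what matching types looks like for the DELETE posted earlier, using the column types the OP listed (PlantId nchar(9), ParentPart nvarchar(40)); the parameter values here are made up:

DECLARE @PlantId    nchar(9)     = N'PLANT0001';
DECLARE @ParentPart nvarchar(40) = N'PART-123';

DELETE dbo.bom
WHERE  PlantId    = @PlantId
AND    ParentPart = @ParentPart;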
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
March 8, 2022 at 11:06 am
I removed the DISTINCT and tried a cast on attValue, but I'm still getting this warning in the query plan.
Type conversion in expression (CONVERT(int,[d2].[attValue],0)) may affect "CardinalityEstimate" in query plan choice, The query memory grant detected "ExcessiveGrant", which may impact the reliability. Grant size: Initial 8360 KB, Final 8360 KB, Used 40 KB.
INSERT INTO dbo.bom
SELECT
d1.design
,d1.version
,BOMitem = cast(d1.attvalue as nvarchar)
,d1.designAlternative
,BOMdesignAlternative = cast(d2.attValue as int)
FROM dbo.design d1
JOIN dbo.design d2
ON d1.design = d2.design
AND d1.version = d2.version
AND d1.designAlternative = d2.designAlternative
AND d1.layerCode = d2.layerCode
AND d1.position = d2.position
AND d1.operation = d2.operation
WHERE d1.attvalue IS NOT NULL
AND d1.attribute = N'BOMitem'
AND d2.attribute = N'BOMdesignAlternative'
AND d1.design = N'CORE/FILLED'
AND d1.designAlternative = 0
AND d1.version = 0
March 8, 2022 at 1:34 pm
I removed the DISTINCT and tried a cast on attValue, but I'm still getting this warning in the query plan.
Type conversion in expression (CONVERT(int,[d2].[attValue],0)) may affect "CardinalityEstimate" in query plan choice, The query memory grant detected "ExcessiveGrant", which may impact the reliability. Grant size: Initial 8360 KB, Final 8360 KB, Used 40 KB.
INSERT INTO dbo.bom
SELECT
d1.design
,d1.version
,BOMitem = cast(d1.attvalue as nvarchar)
,d1.designAlternative
,BOMdesignAlternative = cast(d2.attValue as int)
FROM dbo.design d1
JOIN dbo.design d2
ON d1.design = d2.design
AND d1.version = d2.version
AND d1.designAlternative = d2.designAlternative
AND d1.layerCode = d2.layerCode
AND d1.position = d2.position
AND d1.operation = d2.operation
WHERE d1.attvalue IS NOT NULL
AND d1.attribute = N'BOMitem'
AND d2.attribute = N'BOMdesignAlternative'
AND d1.design = N'CORE/FILLED'
AND d1.designAlternative = 0
AND d1.version = 0
Can you post the entire execution plan for this query, as well as the exact table structure?
As Jeff said, use aliases properly. And as I said, size your nvarchar. Declared as a variable without a length, it holds only 1 character; in a CAST without a length it defaults to 30, so cast(d1.attvalue as nvarchar) will silently truncate anything longer.
Run this to prove the variable case:
DECLARE @n nvarchar;
SET @n = 'aa';
SELECT @n;   -- returns 'a': the value was truncated to 1 character
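And the CAST side of it, with a sized version for comparison (a small stand-alone demo; the length 120 is just an example, matching bom.subdesign in the DDL posted below):

SELECT CAST(REPLICATE(N'x', 60) AS nvarchar)      AS unsized_cast,  -- truncated to 30 characters
       CAST(REPLICATE(N'x', 60) AS nvarchar(120)) AS sized_cast;    -- all 60 characters kept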
Michael L John
If you assassinate a DBA, would you pull a trigger?
To properly post on a forum:
http://www.sqlservercentral.com/articles/61537/
March 8, 2022 at 2:10 pm
Table defs attached.
CREATE TABLE [dbo].[bom](
[design] [nvarchar](120) NOT NULL,
[version] [int] NOT NULL,
[subdesign] [nvarchar](120) NOT NULL,
[designAlternative] [int] NOT NULL,
[subDesignAlternative] [int] NOT NULL,
CONSTRAINT [PK_bom] PRIMARY KEY CLUSTERED
(
[design] ASC,
[version] ASC,
[designAlternative] ASC,
[subdesign] ASC,
[subDesignAlternative] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[bom] ADD DEFAULT ((0)) FOR [version]
GO
ALTER TABLE [dbo].[bom] ADD DEFAULT ((0)) FOR [designAlternative]
GO
ALTER TABLE [dbo].[bom] ADD DEFAULT ((0)) FOR [subDesignAlternative]
GO
-----
CREATE TABLE [dbo].[design](
[design] [nvarchar](40) NOT NULL,
[version] [int] NOT NULL,
[layergroup] [nvarchar](40) NULL,
[layercode] [int] NOT NULL,
[material] [nvarchar](40) NULL,
[operation] [int] NOT NULL,
[class] [nvarchar](24) NOT NULL,
[object] [nvarchar](60) NOT NULL,
[attribute] [nvarchar](60) NOT NULL,
[sequenc_] [int] NULL,
[attValue] [nvarchar](255) NULL,
[attCalc] [nvarchar](255) NULL,
[locked] [int] NULL,
[formulaUsed] [int] NULL,
[tableUsed] [int] NULL,
[uses] [nvarchar](20) NULL,
[position] [int] NOT NULL,
[outofdate] [nvarchar](3) NULL,
[ruleChange] [nvarchar](3) NULL,
[lookupOptions] [nvarchar](255) NULL,
[updateDate] [datetime] NULL,
[lookupTable] [int] NULL,
[designAlternative] [int] NOT NULL,
CONSTRAINT [PK_design] PRIMARY KEY CLUSTERED
(
[design] ASC,
[version] ASC,
[layercode] ASC,
[operation] ASC,
[class] ASC,
[object] ASC,
[attribute] ASC,
[position] ASC,
[designAlternative] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[design] ADD DEFAULT ((0)) FOR [version]
GO
ALTER TABLE [dbo].[design] ADD DEFAULT ((0)) FOR [layercode]
GO
ALTER TABLE [dbo].[design] ADD DEFAULT ((0)) FOR [operation]
GO
ALTER TABLE [dbo].[design] ADD DEFAULT ((0)) FOR [position]
GO
ALTER TABLE [dbo].[design] ADD DEFAULT ((0)) FOR [designAlternative]
GO
March 9, 2022 at 12:46 am
Still can't get past this warning:
Type conversion in expression (CONVERT(int,[d2].[attValue],0)) may affect "CardinalityEstimate" in query plan choice, The query memory grant detected "ExcessiveGrant", which may impact the reliability. Grant size: Initial 8360 KB, Final 8360 KB, Used 40 KB.