October 5, 2017 at 8:41 am
TheCTEGuy - Thursday, October 5, 2017 4:33 AM
Here is a better solution, I guess:
create function dbo.split(@str varchar(100), @delimiter varchar(1))
returns @t table (itemnumber int identity(1,1), val varchar(100))
begin
    declare @pos int = charindex(@delimiter, @str);
    while @pos > 0
    begin
        insert @t select substring(@str, 1, @pos - 1); -- token before the delimiter
        set @str = substring(@str, @pos + 1, len(@str));
        set @pos = charindex(@delimiter, @str);
    end;
    insert @t select @str; -- last (or only) token
    return;
end;
go
SELECT Job_Prod, Job_ID, CONVERT(DATE, Actual_Start, 101) AS TrDate, S1.val AS FirstDateField, S2.val AS SecondDateField
FROM TABLEA AS TA
CROSS APPLY dbo.split(TA.actual_Track, ';') AS S1
CROSS APPLY dbo.split(TA.pieces_track, ';') AS S2
WHERE S1.ItemNumber = 4
AND S2.ItemNumber = 4
ORDER BY Job_Prod;
You need to identify how that's "better". Execution plans are great for seeing what's going on, but they must not be used to pick the winner for performance, because even actual execution plans are fraught with mere estimates. I've actually seen two pieces of code tested where code "A" was reported as "0% of batch" and code "B" was reported as "100% of batch" and yet, when measured for actual performance, precisely the opposite was true.
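One way to get actual numbers instead of plan-cost percentages is to capture real duration and I/O around each candidate. A minimal sketch (the commented placeholder is where a candidate query would go):

```sql
-- Measure real elapsed time and reads rather than trusting "% of batch" estimates.
set statistics time, io on;   -- per-statement CPU time, elapsed time, and read counts

declare @start datetime2 = sysdatetime();

-- ... candidate query "A" goes here ...

select datediff(millisecond, @start, sysdatetime()) as elapsed_ms;
```

Run each candidate several times against a realistic row count and compare the measured figures, not the plan's relative cost.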
I can also tell you that a scalar function (which is frequently a performance issue to begin with) that contains a RBAR loop will probably tank when it comes to performance.
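For contrast, a set-based inline table-valued function avoids the loop entirely. This is only a sketch in the spirit of the well-known DelimitedSplit8K technique, not the tuned 8K version (the function name dbo.split_inline is illustrative):

```sql
create function dbo.split_inline(@str varchar(100), @delimiter char(1))
returns table
as
return
    -- number positions 1..len(@str) with a small inline tally
    with tally(n) as (
        select top (isnull(datalength(@str), 0))
               row_number() over (order by (select null))
        from sys.all_columns
    ),
    starts(n) as (
        select 1                                        -- first token starts at 1
        union all
        select n + 1 from tally                         -- each later token starts
        where substring(@str, n, 1) = @delimiter        -- just after a delimiter
    )
    select itemnumber = row_number() over (order by n),
           val = substring(@str, n,
                 isnull(nullif(charindex(@delimiter, @str, n), 0) - n, 8000))
    from starts;
```

Because it is an inline TVF containing a single set-based query, the optimizer can fold it into the calling plan instead of treating it as an opaque row-by-row call.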
--Jeff Moden
Change is inevitable... Change for the better is not.
October 5, 2017 at 10:12 am
Jeff Moden - Thursday, October 5, 2017 8:41 AM
{snip - quoted post above}
In addition to Jeff's comments... Scalar and multi-statement functions are black boxes to the optimizer's cardinality estimator... It simply isn't able to incorporate their cost into the query plan, so it treats them as if they have zero expense... So, of course, they look better when you look at the execution plan.
Also... If you insist on using a cursor, at least declare it LOCAL FAST_FORWARD...
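For reference, those cursor options go in the DECLARE CURSOR statement. A generic sketch (the table and column names are illustrative):

```sql
declare @val varchar(100);

declare c cursor local fast_forward  -- LOCAL: scoped to this batch; FAST_FORWARD: forward-only, read-only, optimized
for
    select val from dbo.SomeTable;   -- dbo.SomeTable is a placeholder

open c;
fetch next from c into @val;
while @@fetch_status = 0
begin
    -- ... process @val ...
    fetch next from c into @val;
end;
close c;
deallocate c;
```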