October 21, 2015 at 10:13 pm
Comments posted to this topic are about the item Dynamic Pivot, Passing Temp Table Variables to Dynamic SQL
October 21, 2015 at 11:43 pm
Nice one, this is quite useful in many situations. Thanks. :-)
October 22, 2015 at 1:45 am
Nice article. Although I still prefer Jeff Moden's dynamic crosstab approach:
DECLARE @sql NVARCHAR(MAX) = '';

-- Build one aggregate column per distinct year (the dynamic crosstab pattern).
SELECT @sql = @sql
    + ', AVG(CASE WHEN YEAR(CreatedDate) = ' + CONVERT(NVARCHAR(4), yr)
    + ' THEN RecordCount END) AS ' + QUOTENAME(CONVERT(NVARCHAR(4), yr)) + CHAR(10)
FROM (SELECT DISTINCT YEAR(CreatedDate) AS yr FROM @TempTable) t;

-- Wrap the generated column list in the rest of the statement.
SET @sql = 'SELECT TableName' + CHAR(10) + @sql
    + 'FROM @TempTable' + CHAR(10)
    + 'GROUP BY TableName' + CHAR(10)
    + 'ORDER BY TableName;';

EXEC sp_executesql @sql, N'@TempTable dbo.TableType READONLY', @TempTable;
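For anyone who wants to actually run this (or the variants below): it assumes a user-defined table type already exists. A minimal sketch of what dbo.TableType might look like, with the column definitions inferred from the queries rather than taken from the article:

-- Assumed shape, inferred from the columns the queries reference;
-- adjust names and types to your own schema.
CREATE TYPE dbo.TableType AS TABLE
(
    TableName   SYSNAME  NOT NULL,
    CreatedDate DATETIME NOT NULL,
    RecordCount INT      NOT NULL
);

DECLARE @TempTable dbo.TableType;
INSERT @TempTable (TableName, CreatedDate, RecordCount)
VALUES ('Orders', '2014-06-01', 120),
       ('Orders', '2015-06-01', 150);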
October 22, 2015 at 7:44 am
The task I have is to identify where in my daily sql I can use this.
October 22, 2015 at 7:59 am
Hi Iwas,
Most analytics/transformation tasks will require this kind of twisty SQL/T-SQL code. As one of the readers commented above, this is clearly only one way to do it; I saw the crosstab solution too, and it makes great sense and looks more elegant.
SQL/T-SQL wasn't really meant for these kinds of tasks, which is why new features keep being added. I happened to use this one in a recent project, so I shared it here. Thanks, SSC.
Thanks for your comment!
Jambu
October 22, 2015 at 8:24 am
It's great to know that you're sharing your real-life experience. As commented above, the crosstab approach has been around since well before PIVOT and is more flexible. That doesn't mean you didn't make a very good effort, and you showed what's needed when using a table variable (even if most people, like me, won't advocate their use).
I'd like to encourage you to use ORDER BY in your queries even when DISTINCT is used (which needs a sort by itself) to define the real order of your columns. I'd also suggest that you use the FOR XML PATH concatenation approach, which allows for a more secure and flexible ordering of the columns. I've seen the columns in a pivot get messed up because the ORDER BY was ignored in the concatenation method you used.
This is the approach I usually follow for dynamic pivots.
DECLARE @sql NVARCHAR(MAX);

SELECT @sql = 'SELECT TableName ' + CHAR(13)
    + (SELECT CHAR(9) + ', AVG(CASE WHEN YEAR(CreatedDate) = ' + calyear
              + ' THEN RecordCount END) AS ' + QUOTENAME(calyear) + CHAR(13)
       FROM @TempTable t
       CROSS APPLY (SELECT CONVERT(CHAR(4), YEAR(CreatedDate))) y(calyear)
       GROUP BY calyear
       ORDER BY calyear
       FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)')
    + CHAR(9) + ', AVG(RecordCount) AS Total ' + CHAR(13)
    + 'FROM @TempTable ' + CHAR(13)
    + 'GROUP BY TableName ' + CHAR(13)
    + 'ORDER BY TableName; ';

EXEC sp_executesql @sql, N'@TempTable dbo.TableType READONLY', @TempTable;
October 22, 2015 at 9:05 am
Thanks Luis,
Your explanation and approach make a lot of sense, and the next time I come across such situations I will use the suggested approaches. It looks like the crosstab is the best way to go from what I see in the feedback. I wish I had known that earlier.
Thanks again for the exhaustive feedback.
Jambu
October 22, 2015 at 9:28 am
I have searched the internet for this situation to no avail, but we believe that starting in some version of SQL 2008+, if you define the @columns variable as anything other than nvarchar(max), the SQL will fail intermittently.
Of course this means the @sql variable must be nvarchar(max) also.
It took a lot of trial and error to stumble on a solution. We had several dynamically pivoting stored procedures that executed in daily jobs for years with no issues. Then suddenly one day they began failing intermittently. Renaming the variable in the stored procedure would occasionally fix the issue, but only temporarily; it would run for a while and then fail again.
Once we set the variables to nvarchar(max), no more failures.
Perhaps someone else has some insight or extra details to share about this issue.
Here are more particulars:
The procedure contained a variable @SCols varchar(1000)
This is the error that would be reported intermittently: "Warning: Null value is eliminated by an aggregate or other SET operation. [SQLSTATE 01003] (Message 8153) An expression of non-boolean type specified in a context where a condition is expected, near 'S'. [SQLSTATE 42000] (Error 4145)."
October 22, 2015 at 9:41 am
farfel (10/22/2015)
I have searched the internet for this situation to no avail, but we believe that starting in some version of SQL 2008+, if you define the @columns variable as anything other than nvarchar(max), the SQL will fail intermittently. Of course this means the @sql variable must be nvarchar(max) also.
It took a lot of trial and error to stumble on a solution. We had several dynamically pivoting stored procedures that executed in daily jobs for years with no issues. Then suddenly one day they began failing intermittently. Renaming the variable in the stored procedure would occasionally fix the issue, but only temporarily; it would run for a while and then fail again.
Once we set the variables to nvarchar(max), no more failures.
Perhaps someone else has some insight or extra details to share about this issue.
Here are more particulars:
The procedure contained a variable @SCols varchar(1000)
This is the error that would be reported intermittently: "Warning: Null value is eliminated by an aggregate or other SET operation. [SQLSTATE 01003] (Message 8153) An expression of non-boolean type specified in a context where a condition is expected, near 'S'. [SQLSTATE 42000] (Error 4145)."
If @SCols is declared as varchar(1000), then when it is concatenated with the rest of the SQL statement, the intermediate result is still a non-MAX string, so the statement can be silently truncated before it is cast to nvarchar(max) for the assignment to @sql. A statement cut off mid-token would explain the intermittent syntax errors. I hope that makes sense.
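A quick repro sketch of that truncation mechanism (made-up variables, nothing from the original procedure):

-- Both operands are non-MAX, so the concatenation result is capped at
-- 8000 characters *before* the assignment casts it to nvarchar(max).
DECLARE @part1 VARCHAR(8000) = REPLICATE('a', 6000);
DECLARE @part2 VARCHAR(8000) = REPLICATE('b', 6000);
DECLARE @sql   NVARCHAR(MAX) = @part1 + @part2;
SELECT LEN(@sql) AS TruncatedLen;   -- 8000, not 12000: the tail was lost

-- Starting the expression with a MAX-typed operand keeps the whole string.
DECLARE @sql2 NVARCHAR(MAX) = CAST('' AS NVARCHAR(MAX)) + @part1 + @part2;
SELECT LEN(@sql2) AS FullLen;       -- 12000

Which is why declaring every piece as nvarchar(max) up front, as you ended up doing, makes the problem go away.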
October 22, 2015 at 9:53 am
Yes it does, thank you. Regardless though, we have found it best practice to make those variables nvarchar(max). Lazy, but effective. :-)
October 22, 2015 at 9:55 am
farfel (10/22/2015)
Yes it does, thank you. Regardless though, we have found it best practice to make those variables nvarchar(max). Lazy, but effective. :-)
I wouldn't call it lazy if it's the right thing to do. 😉
August 4, 2017 at 11:19 pm
Iwas Bornready - Thursday, October 22, 2015 7:44 AM
The task I have is to identify where in my daily sql I can use this.
High-performance reporting is the normal usage. It can also be used to turn "records" that have been split column by column into an EAV table (such as you might find for audit tables or "flexible" user tables) back into whole "records".
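A minimal sketch of that EAV reassembly (the #EAV table and its columns are made up for illustration):

-- Hypothetical EAV layout: one row per (RecordID, attribute, value).
CREATE TABLE #EAV (RecordID INT, ColName SYSNAME, ColValue VARCHAR(100));
INSERT #EAV (RecordID, ColName, ColValue)
VALUES (1, 'FirstName', 'Ann'), (1, 'LastName', 'Lee'),
       (2, 'FirstName', 'Bob'), (2, 'LastName', 'Ray');

-- The crosstab folds the attribute rows back into whole "records".
SELECT RecordID,
       MAX(CASE WHEN ColName = 'FirstName' THEN ColValue END) AS FirstName,
       MAX(CASE WHEN ColName = 'LastName'  THEN ColValue END) AS LastName
FROM #EAV
GROUP BY RecordID;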
--Jeff Moden
Change is inevitable... Change for the better is not.
August 5, 2017 at 11:03 am
farfel - Thursday, October 22, 2015 9:28 AM
I have searched the internet for this situation to no avail, but we believe that starting in some version of SQL 2008+, if you define the @columns variable as anything other than nvarchar(max), the SQL will fail intermittently. Of course this means the @sql variable must be nvarchar(max) also.
I have not seen this happen; thank you for mentioning it, I'll do some experiments.
My article about this technique (written in 2007) is here:
http://spacefold.com/lisa/post/2007/08/15/Matrix-Rebuilt-More-non-standard-fun-with-T-SQL.aspx
If folks are enjoying this technique, they might find it useful to see a different explanation.
I used a WHILE loop instead of a COALESCE(...), determining two parts of the eventual SQL statement separately so that I could add generated column aliases to the result. Under the circumstances (the number of iterations is the number of columns) it seemed cleaner and more expressive, with no real performance difference.
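The loop half looks roughly like this (a sketch against this thread's @TempTable, not the code from my article; the "Yr" alias prefix is made up):

-- Walk the distinct years one at a time, building the column list;
-- the loop makes it easy to attach a generated alias to each column.
DECLARE @cols NVARCHAR(MAX) = N'', @yr INT;

SELECT @yr = MIN(YEAR(CreatedDate)) FROM @TempTable;
WHILE @yr IS NOT NULL
BEGIN
    SET @cols = @cols
        + N', AVG(CASE WHEN YEAR(CreatedDate) = ' + CONVERT(NVARCHAR(4), @yr)
        + N' THEN RecordCount END) AS '
        + QUOTENAME(N'Yr' + CONVERT(NVARCHAR(4), @yr));   -- generated alias
    SELECT @yr = MIN(YEAR(CreatedDate)) FROM @TempTable
    WHERE YEAR(CreatedDate) > @yr;
END;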
Of course now I have to try it Jeff's way <sigh>.