June 2, 2015 at 8:54 am
Jason A. Long (6/1/2015)
TheSQLGuru (6/1/2015)
Thanks for that Jason!

No problem. Thanks for kicking me in the backside to get it done... with the data partition and the select output. 😀
When I want to test the server performance of a large set I usually do it thusly:
declare @var1, @var2, @var3, @var4, @var5 (appropriate definitions here)
SELECT
@var1 = rt.TransactionID,
@var2 = rt.DateTimeStamp,
@var3 = rt.TransactionDay,
@var4 = rt.TransactionAmount,
@var5 = SUM(rt.TransactionAmount) OVER (PARTITION BY rt.TransactionDay ORDER BY rt.DateTimeStamp ROWS UNBOUNDED PRECEDING)
FROM
dbo.RunningTotal rt
Now there is ZERO overhead for transferring data back to the client (even if it is just to be discarded). We are truly testing server-side query performance.
I'm not familiar with that particular syntax... Are you defining the variables as table or scalar variables?
I like the "Discard results..." option simply because it's an easy toggle in SSMS, it doesn't require any alteration to the code being tested and doesn't have any impact on either the estimated or actual plans...
That said, I'm always up for learning something new and/or finding a better way of doing things. Do you have a link to a working example?
Thanks,
Jason
The variables are declared to be EXACTLY the same type as the fields you shove into them. So in this example it would start with:
DECLARE @var1 int, --since TransactionID is (I presume) an integer data type
Remember, it can still take substantial time to spool large result sets to SSMS when testing, even with the "Discard results" option enabled. This method lets you test pure server performance.
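Putting the two posts together, a complete version of the pattern might look like the sketch below. The column types are assumptions based on the column names; match them EXACTLY to the actual table definition, as noted above.

```
-- Sketch only: one scalar variable per column, typed to match the
-- (assumed) column definitions.
DECLARE @var1 int,      -- TransactionID, assumed int
        @var2 datetime, -- DateTimeStamp, assumed datetime
        @var3 date,     -- TransactionDay, assumed date
        @var4 money,    -- TransactionAmount, assumed money
        @var5 money;    -- running total, same type as TransactionAmount

-- Assigning every column to a variable forces the server to do all the
-- work (scan, sort, window aggregate) while sending nothing to the client.
SELECT
    @var1 = rt.TransactionID,
    @var2 = rt.DateTimeStamp,
    @var3 = rt.TransactionDay,
    @var4 = rt.TransactionAmount,
    @var5 = SUM(rt.TransactionAmount) OVER (PARTITION BY rt.TransactionDay
                                            ORDER BY rt.DateTimeStamp
                                            ROWS UNBOUNDED PRECEDING)
FROM dbo.RunningTotal rt;
```

Note that a column alias like `AS RunningTotal` is not allowed on the windowed expression here: a SELECT that assigns to variables cannot also name output columns, which is exactly why the variable names should mirror the column names you would otherwise use.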
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
June 2, 2015 at 8:56 am
Never thought of this way of doing it. Absolutely clever - this one goes into my bag of tricks. Love this forum - learn something new every day.
Glad you liked it, and indeed these are wonderful forums!!
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
June 2, 2015 at 9:02 am
Jason A. Long (6/1/2015)
TheSQLGuru (6/1/2015)
I'm not familiar with that particular syntax... Are you defining the variables as table or scalar variables?
I like the "Discard results..." option simply because it's an easy toggle in SSMS, it doesn't require any alteration to the code being tested and doesn't have any impact on either the estimated or actual plans...
That said, I'm always up for learning something new and/or finding a better way of doing things. Do you have a link to a working example?
Thanks,
Jason
With this syntax, you're assigning values to variables. To minimize code changes, you would name your variables the same way you want to name/alias your columns. If you align the variables as shown here, you can hold Alt+Shift to make a vertical (column) selection with the arrow keys or the mouse, and then delete all of the @ prefixes at once.
All the variables are scalar as they'll only store a single value at a time.
June 3, 2015 at 6:16 am
Ok... just to kick in a bit here, because it seems that we keep trying to reinvent the wheel for test data. What's wrong with the million row test data table from the running total article I wrote? Just use that.
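For anyone who doesn't have the article handy, a typical pattern for building that kind of million-row test table is sketched below. This is NOT the exact script from the article; the table and column names match the earlier posts, and the date range and amounts are arbitrary assumptions.

```
-- Hypothetical sketch: a cross join of system views yields plenty of rows,
-- and each row gets a pseudo-random timestamp and amount.
SELECT TOP (1000000)
    TransactionID     = IDENTITY(int, 1, 1),
    DateTimeStamp     = DATEADD(second,
                            ABS(CHECKSUM(NEWID())) % 31536000, -- seconds in a year
                            '2015-01-01'),
    TransactionAmount = CAST(ABS(CHECKSUM(NEWID())) % 1000000 / 100.0 AS money)
INTO dbo.RunningTotal
FROM sys.all_columns ac1
CROSS JOIN sys.all_columns ac2;

-- TransactionDay can then be added as a computed column:
ALTER TABLE dbo.RunningTotal
    ADD TransactionDay AS CAST(DateTimeStamp AS date);
```

The point stands either way: build one good test table once and reuse it, rather than inventing new test data for every experiment.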
--Jeff Moden
Change is inevitable... Change for the better is not.