October 30, 2014 at 7:36 am
I have a stored procedure, SPone. I have optimized it and saved the result as SPone_Optimized. I would like to test both SPs' execution time to find out how much better the optimized one fares.
I planned to test it as follows:
declare @starttime datetime, @endtime datetime
declare @i int = 0
select @starttime = getdate()
while (@i < 10000)
begin
execute SPone_Optimized @param = 'value1'
end
select @endtime = getdate()
select datediff(ms, @starttime, @endtime) 'total_exec_time'
----- for the SP that is before optimize
declare @starttime datetime, @endtime datetime
declare @i int = 0
select @starttime = getdate()
while (@i < 10000)
begin
execute SPone @param = 'value1'
end
select @endtime = getdate()
select datediff(ms, @starttime, @endtime) 'total_exec_time'
Please tell me whether this approach makes sense, or whether there is a better way to do the same thing. Thanks in advance.
October 30, 2014 at 7:48 am
This was removed by the editor as SPAM
October 30, 2014 at 7:55 am
Thanks, Stewart... I would like the cumulative execution time in order to compare the two SPs.
October 30, 2014 at 8:15 am
In addition to the above advice, you should check out http://statisticsparser.com, which will summarize the output for you. It's really handy when you're using STATISTICS IO to check your logical reads, but it also helps with the kind of thing you're looking at.
October 30, 2014 at 10:16 pm
Thanks a lot sqlslacker...
October 31, 2014 at 2:44 am
One minor issue with your code: you don't seem to be incrementing the @I counter, so it will run indefinitely (possibly a cut/paste error).
The problem you will have is that after the first run your results will be skewed, as the cache will have been warmed by the previous execution, so you could see a significant drop in execution time, and over a thousand runs you will experience a false positive.
One way around that would be to vary the input parameter, in order to replicate the true running of the SP, unless every execution will have the same value.
This will give a more natural view of how performant the SP is.
It wouldn't take much to do: simply create a table with 10 values and then use a Mod 10 on the loop counter to get the value for that run.
The other issue you will have with the STATISTICS options is that they produce data for each statement once they are switched on, and it can be confusing, when trawling through a complex SP, to work out which stat relates to which part of the code.
Hope this helps.
_________________________________________________________________________
SSC Guide to Posting and Best Practices
October 31, 2014 at 3:49 am
Thanks, Jason... With the statistics turned off, I tried to execute the SP 5000 times... as it is a 'select' SP, I am getting an out-of-memory exception after some time of execution... I have understood your point except for the usage of the mod function... with what value should I take the mod? It would be great if you could give some more hints.
October 31, 2014 at 4:03 am
OK, got it... Mod 10 using the counter. Thanks a lot, Jason... I am storing the result set in a temporary table in order to avoid the 'out of memory' exception caused by the results grid.
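[Editor's note: a minimal sketch of capturing the SP's output in a temp table with INSERT ... EXEC, so SSMS never has to render the results grid. The column list here is an assumption — it must match SPone's actual result set exactly:]

```sql
-- Assumed shape of SPone's result set; adjust the columns to match.
CREATE TABLE #Results (Col1 INT, Col2 VARCHAR(100));

-- INSERT ... EXEC captures the procedure's result set server-side,
-- so the client never renders 10,000 grids.
INSERT INTO #Results (Col1, Col2)
EXECUTE SPone @param = 'value1';

DROP TABLE #Results;
```

Note that a procedure consumed this way cannot itself contain an INSERT ... EXEC statement, and the temp table's columns must line up with the result set one-for-one.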
October 31, 2014 at 4:10 am
The out-of-memory exception is likely to be client-side: the temporary file that SSMS uses for results is hitting its limits, especially with all those result sets being returned.
I wouldn't think you need to run it so many times; maybe 1000 runs would give a good estimate of how long it takes.
Sorry, I meant the modulus function (%). I would set it up something like this:
DECLARE @TableVar TABLE (ID SMALLINT, InputVar VARCHAR(30))
DECLARE @I INT = 0
DECLARE @Param VARCHAR(30)
INSERT INTO @TableVar
VALUES
(0,'Value1')
,(1,'Value2')
-- etc. up to
,(9,'Value10')
WHILE (@I < 10000)
BEGIN
SELECT @Param = InputVar FROM @TableVar WHERE ID = (@I % 10)
SET STATISTICS IO ON
SET STATISTICS TIME ON
EXEC <SP> @Param
SET STATISTICS TIME OFF
SET STATISTICS IO OFF
SET @I = @I + 1
END
You might have to check about the switching on and off of the statistics.
October 31, 2014 at 4:45 am
Thank you so much Jason!
October 31, 2014 at 5:22 am
Be wary if you have any scalar functions in the procedure. I've seen it give misleading times on occasion. A way of determining the run time for the whole batch is to do exactly what you have above when you read GETDATE() into a datetime variable. Then, when you're done, use the following to get the total ms elapsed between the two times for the whole run.
select datediff(ms, @starttime, @endtime);
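[Editor's note: putting the thread's suggestions together, a sketch of the complete timing loop. SPone and @param are taken from the original post; the counter increment and the variable names are corrected here, and the statistics options are left off so only the overall duration is measured:]

```sql
DECLARE @starttime DATETIME, @endtime DATETIME;
DECLARE @i INT = 0;

SELECT @starttime = GETDATE();

WHILE (@i < 10000)
BEGIN
    EXECUTE SPone @param = 'value1';
    SET @i = @i + 1;  -- without this the loop never terminates
END

SELECT @endtime = GETDATE();

-- Total elapsed time for the whole run, in milliseconds.
SELECT DATEDIFF(ms, @starttime, @endtime) AS total_exec_time;
```

The same loop run against SPone_Optimized gives the comparison figure; varying @param via the modulus trick above avoids measuring only the warm-cache case.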
October 31, 2014 at 8:47 am
SqlStarter (10/30/2014)
I have a stored procedure, SPone. I have optimized it and saved the result as SPone_Optimized. I would like to test both SPs' execution time to find out how much better the optimized one fares. I planned to test it as follows:
declare @starttime datetime, @endtime datetime
declare @i int = 0
select @starttime = getdate()
while (@i < 10000)
begin
execute SPone_Optimized @param = 'value1'
end
select @endtime = getdate()
select datediff(ms, @starttime, @endtime) 'total_exec_time'
----- for the SP that is before optimize
declare @starttime datetime, @endtime datetime
declare @i int = 0
select @starttime = getdate()
while (@i < 10000)
begin
execute SPone @param = 'value1'
end
select @endtime = getdate()
select datediff(ms, @starttime, @endtime) 'total_exec_time'
Please tell me whether this approach makes sense, or whether there is a better way to do the same thing. Thanks in advance.
Shifting gears a bit: what does SPone do, and why are you trying to process one row at a time rather than a set of rows? Things will be much easier and faster all around if you can process in sets rather than using RBAR.
--Jeff Moden
Change is inevitable... Change for the better is not.
November 2, 2014 at 10:46 pm
The whole idea behind the script is to find the performance gain, in terms of execution time, between the two SPs...
SPone contains a set of queries...
SPone_Optimized is the optimized version of SPone...