August 9, 2011 at 7:14 am
Hi All,
I need the total execution time in MILLISECONDS of a script that has a lot of batches. The script below is an example. I tried two different approaches, but neither is working for me, as described below. Part 1 is the sample code; Part 2 and Part 3 are the solutions I have tried.
--====================================
-- Part - 1
--====================================
USE tempdb
GO
IF OBJECT_ID('tempdb.dbo.#uTable1','U') IS NOT NULL
BEGIN
DROP TABLE tempdb.dbo.#uTable1
END
GO
IF OBJECT_ID('tempdb.dbo.#uTable2','U') IS NOT NULL
BEGIN
DROP TABLE tempdb.dbo.#uTable2
END
CREATE TABLE #uTable1
(
Name VARCHAR(100)
)
GO
CREATE TABLE #uTable2
(
Name VARCHAR(100)
)
GO
DECLARE @lCounter SMALLINT
SET @lCounter = 1
WHILE (@lCounter <= 10000)
BEGIN
INSERT INTO #uTable1(Name) VALUES(@lCounter)
SET @lCounter = @lCounter + 1
END
GO
DECLARE @lCounter SMALLINT
SET @lCounter = 1
WHILE (@lCounter <= 5000)
BEGIN
INSERT INTO #uTable2(Name) VALUES(@lCounter)
SET @lCounter = @lCounter + 1
END
GO
--------------------------------------
I tried the solution below, but it throws an error because of the many GO statements in between. The GO statements cannot be removed because, in the actual scenario, a lot of objects including procedures are created; the script will not run as a single execution without them.
--====================================
-- Part - 2
--====================================
---------------------------------------------------
DECLARE @lStartTime DATETIME, @lEndTime DATETIME
SET @lStartTime = CURRENT_TIMESTAMP
--// Add the part - 1 query here.
SET @lEndTime = CURRENT_TIMESTAMP
SELECT DATEDIFF(ms, @lStartTime, @lEndTime)
---------------------------------------------------
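The failure can be reproduced in isolation; a variable declared before a GO is gone in the next batch:

```sql
DECLARE @lStartTime DATETIME
SET @lStartTime = CURRENT_TIMESTAMP
GO
-- Fails with: Must declare the scalar variable "@lStartTime" --
-- the variable did not survive the batch boundary above.
SELECT DATEDIFF(ms, @lStartTime, CURRENT_TIMESTAMP)
```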
The solution below gives milliseconds for each batch, but I want the total milliseconds.
--====================================
-- Part - 3
--====================================
--------------------------------------
SET STATISTICS TIME ON
--// Add the part - 1 query here.
SET STATISTICS TIME OFF
--------------------------------------
Appreciate any help.
_____________________________________________
One ounce of practice is more important than tonnes of dreams
August 9, 2011 at 7:27 am
I personally prefer to also include Profiler, using the statement-completed events (SQL:StmtCompleted and SP:StmtCompleted). It gives a little more info, and you can do multiple runs and take averages, which is much better than relying on a single run.
You can also use this tool, built by an MVP, to test performance with concurrent users: http://www.datamanipulation.net/sqlquerystress/
August 9, 2011 at 8:28 am
You might modify your code to include:
IF OBJECT_ID('tempdb.dbo.#Ts','U') IS NOT NULL
BEGIN
DROP TABLE tempdb.dbo.#Ts
END
---------------------------------------------
CREATE TABLE #Ts (Id INT IDENTITY(1,1),Stime DATETIME,Etime DATETIME)
INSERT INTO #Ts(Stime)
SELECT CURRENT_TIMESTAMP
Further modify:
INSERT INTO #uTable1(Name) VALUES(@lCounter)
SET @lCounter = @lCounter + 1
END
INSERT INTO #Ts(Etime)
SELECT CURRENT_TIMESTAMP
And include similar inserts into #Ts after the insertion into your second table. In my short test I obtained the following:
Id  Stime                    Etime
1   2011-08-09 09:56:47.173  NULL
2   NULL                     2011-08-09 09:56:47.717
3   NULL                     2011-08-09 09:56:47.890
Now, unless you copy the data from #Ts into a permanent table, repeating the test does not let you gather data to arrive at an average time, and the inserts into #Ts of course take some time themselves, so they will slightly distort your results ...
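If you want a single number at the end, the rows in #Ts can be rolled up with one DATEDIFF (assuming the earliest Stime and the latest Etime bracket the whole script):

```sql
-- Total elapsed time of the whole script, in milliseconds
SELECT DATEDIFF(ms, MIN(Stime), MAX(Etime)) AS TotalMs
FROM #Ts
```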
August 11, 2011 at 12:23 am
Ninja, correct. But my actual script has almost 62 batches, so I tried Ron's solution. If I can get the result in a single execution, that would be good 😀
Ron, it is working for me. Thank you.
_____________________________________________
One ounce of practice is more important than tonnes of dreams
August 11, 2011 at 6:46 am
C.K.Shaiju (8/11/2011)
Ninja, correct. But my actual script has almost 62 batches, so I tried Ron's solution. If I can get the result in a single execution, that would be good 😀 Ron, it is working for me. Thank you.
They're both good, depends what you need to do.
P.S. You can also include the batch- or SP-completed event, which will then include the totals from the 62 batches.
STATISTICS IO output could also be parsed to get I/O per table. There might be a DMV for that, but I don't know.
August 11, 2011 at 7:02 am
C.K.Shaiju (8/11/2011)
Ninja, correct. But my actual script has almost 62 batches, so I tried Ron's solution. If I can get the result in a single execution, that would be good 😀 Ron, it is working for me. Thank you.
Thank you for the above. It is very satisfying to me to know that I was able to assist you.
August 11, 2011 at 7:11 am
Ok Ninja, I'll try that way also. Thank you.
Ron, I was trying to DECLARE a variable at the beginning and read it back at the end of the script. Because of the GO statements in between, the DECLAREd variables go out of scope and are not available in the next batch. Your temp-table solution solved the issue. Thank you.
_____________________________________________
One ounce of practice is more important than tonnes of dreams
August 22, 2011 at 2:34 am
I've always used:-
-- Capture the entire script start time (survives batches / GOs)
DECLARE @starttime datetime
DECLARE @binvar binary(128)
SET @starttime = getdate()
SELECT @binvar = CONVERT(binary(128), @starttime)
SET CONTEXT_INFO @binvar
-- Then to get it back again at a later date
SELECT @starttime = CONVERT(datetime, context_info) FROM sys.dm_exec_sessions WHERE session_id = @@SPID
Hope that helps.
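Put together (the final DATEDIFF report is just a sketch of how I'd use it), the whole flow looks like this:

```sql
-- Batch 1: stash the start time in CONTEXT_INFO, which survives GO
DECLARE @starttime datetime, @binvar binary(128)
SET @starttime = GETDATE()
SET @binvar = CONVERT(binary(128), @starttime)
SET CONTEXT_INFO @binvar
GO
-- ... all the batches of the real script run here ...
GO
-- Final batch: read the start time back and report the total milliseconds
DECLARE @starttime datetime
SELECT @starttime = CONVERT(datetime, context_info)
FROM sys.dm_exec_sessions
WHERE session_id = @@SPID
SELECT DATEDIFF(ms, @starttime, GETDATE()) AS TotalMs
```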