October 4, 2005 at 10:56 am
I have been asked if I could come up with any code that would check for a possible infinite loop caused by a Dynamic SQL query. (I know about The Curse and Blessings of Dynamic SQL.)
Basically, this group will have an ad hoc query web page (intranet only) for some of the business analysts. Although training will be involved, there is enough turnover and enough odd requests that problems could arise.
This is not my data, so I cannot be specific. The example I was given was if two dates were entered into various categories that generated a query that would loop infinitely.
Short of an amazingly complicated CASE statement, how would one prevent this from occurring?
I was thinking along the lines of grabbing the SPID from the Dynamic SQL and checking how long it is taking to process. If, for instance, it ran over five minutes, it would be KILLed [obviously a bulldozer approach]. Unfortunately, this check would have to fire on a regular basis, and if many users were running these reports, it could have serious repercussions.
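For what it's worth, a rough sketch of that bulldozer watchdog, assuming the ad hoc page connects under a dedicated login (the name 'AdHocUser' here is made up) and this runs from a scheduled job every minute or so:

```sql
-- Hypothetical watchdog: KILL any ad hoc SPID whose current batch
-- has been running five minutes or more. SQL 2000 style, using
-- master..sysprocesses. Test carefully before trusting it!
DECLARE @spid int, @cmd varchar(20)

DECLARE victims CURSOR FOR
    SELECT spid
    FROM master..sysprocesses
    WHERE loginame = 'AdHocUser'   -- hypothetical dedicated login
      AND status <> 'sleeping'
      AND DATEDIFF(minute, last_batch, GETDATE()) >= 5

OPEN victims
FETCH NEXT FROM victims INTO @spid
WHILE @@FETCH_STATUS = 0
BEGIN
    -- KILL will not accept a variable directly, so build it dynamically
    SET @cmd = 'KILL ' + CONVERT(varchar(10), @spid)
    EXECUTE( @cmd )
    FETCH NEXT FROM victims INTO @spid
END
CLOSE victims
DEALLOCATE victims
```

The obvious downside is the one already mentioned: it will also KILL a legitimate long-running report.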
Has anyone come across this situation or does anyone have a good idea as to how to approach this situation?
TIA.
I wasn't born stupid - I had to study.
October 4, 2005 at 11:04 am
There are two methods of limiting the resources used by a SQL statement. From BOL:
Using SET QUERY_GOVERNOR_COST_LIMIT applies to the current connection only and lasts the duration of the current connection.
Use the query governor cost limit option of sp_configure to change the server-wide query governor cost limit value.
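To illustrate the server-wide route (note the limit is in estimated cost units, and 'show advanced options' must be enabled first):

```sql
-- Server-wide query governor setting via sp_configure.
-- A value of 0 (the default) turns the governor off.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'query governor cost limit', 300
RECONFIGURE
```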
SQL = Scarcely Qualifies as a Language
October 4, 2005 at 11:44 am
Thank you. Is the value in seconds? I could not find any information on setting the value past zero (other than that it will be treated as an integer...).
I wasn't born stupid - I had to study.
October 4, 2005 at 11:49 am
Can't find anything in BOL either!!
Does that mean that Microsoft isn't perfect???
October 4, 2005 at 11:59 am
NO WAY!!!
I wasn't born stupid - I had to study.
October 4, 2005 at 12:10 pm
I am not sure about the infinite loop, but the performance part is covered by Erland on his web page; click on Dynamic SQL for more info. Hope this helps.
http://www.sommarskog.se/arrays-in-sql.html
Kind regards,
Gift Peddie
October 4, 2005 at 12:37 pm
The following returns 999,999 records... Not sure what setting the value to 20 means (I got the same results when I set it to 2)... Query Analyzer indicated 0:04:37.
SET QUERY_GOVERNOR_COST_LIMIT 20
DECLARE @sql varchar(2000)
SET @sql = 'DECLARE @Counter integer ' + CHAR(10) +
'SELECT @Counter = 1 ' + CHAR(10) +
'WHILE @Counter < 1000000 ' + CHAR(10) +
'BEGIN ' + CHAR(10) +
CHAR(9) + 'SELECT @Counter ' + CHAR(10) +
CHAR(9) + 'SELECT @Counter = @Counter + 1 ' + CHAR(10) +
'END '
--PRINT @sql
EXECUTE( @sql)
(Thanks for the link Gift Peddie - already tried that though... doesn't really answer this question at least as far as I could tell...).
I wasn't born stupid - I had to study.
October 4, 2005 at 1:34 pm
Farrell, that value is the cost in seconds (CPU seconds). To be able to use the SET command with Dynamic SQL, you probably need to include it in the @sql variable.
Now I think you may be shooting yourself in the foot if you try this as a server-wide setting, because you may need a heavy-lifting query at some point. There is also the issue that some DBCC commands and sqlmaint don't play well with that setting change.
If you decide then to go the session-by-session path, it may be better to just keep control of the time on the client side.
For SQL 2005, though, you may be able to get the "query plan" back to the client, and you could predict from the size of the largest rowcount whether to proceed or not, which is a nice feature (as of today you never know if it will make the RTM).
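A sketch of that SQL 2005 idea, assuming the SHOWPLAN_XML session option: request the estimated plan instead of executing, then let the client inspect the estimated rowcounts and costs in the returned XML before deciding to run the real thing.

```sql
-- Ask for the estimated XML plan; the SELECT is NOT executed,
-- it just returns the plan as XML. The client can then look at
-- EstimateRows / EstimatedTotalSubtreeCost before proceeding.
SET SHOWPLAN_XML ON
GO
SELECT * FROM products p1 CROSS JOIN products p2   -- candidate ad hoc query
GO
SET SHOWPLAN_XML OFF
GO
```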
Cheers,
* Noel
October 4, 2005 at 1:53 pm
Thanks Noel! I wholeheartedly agree about not setting this server-wide. That is why I want to use the session-level SET, but I still can't figure out how to actually use that number... What I could find on it stated that setting it to 0 turns the governor off.
I tried the following (moving the SET QUERY_GOVERNOR_COST_LIMIT 20 within the Dynamic SQL) but I ended up with the same results as above, [ 0:04:48 seconds]. I then tried it with 2 instead of 20, but that yielded no change.
DECLARE @sql varchar(2000)
SET @sql = 'SET QUERY_GOVERNOR_COST_LIMIT 20 ' + CHAR(10) +
'DECLARE @Counter integer ' + CHAR(10) +
'SELECT @Counter = 1 ' + CHAR(10) +
'WHILE @Counter < 1000000 ' + CHAR(10) +
'BEGIN ' + CHAR(10) +
CHAR(9) + 'SELECT @Counter ' + CHAR(10) +
CHAR(9) + 'SELECT @Counter = @Counter + 1 ' + CHAR(10) +
'END '
EXECUTE( @sql)
Maybe this is not the way to go, or someone else can explain to me how to better use that number...
I wasn't born stupid - I had to study.
October 4, 2005 at 2:27 pm
Farrell, I hadn't looked at the SQL code you were trying to execute.
The reason the governor never kicks in is that all you are doing is running a million queries, each with a very small cost!!!
The batch is not one query!
Now try this in Northwind (Have a look at Total Subtree Cost on the execution plan):
SET QUERY_GOVERNOR_COST_LIMIT 1
select * from products p1 cross join products p2 cross join products p3
-- This will fail and will indicate Actual Query Cost vs the setting
SET QUERY_GOVERNOR_COST_LIMIT 5
select * from products p1 cross join products p2 cross join products p3
-- This will go through
Hth
* Noel
October 4, 2005 at 2:38 pm
Perfect reply! Thank you. I assumed it was a total cost on the SPID, not on each query. That makes total sense. Thanks.
I wasn't born stupid - I had to study.
October 4, 2005 at 4:06 pm
Glad to help
* Noel