December 12, 2006 at 8:01 am
NO COUNT OFF --> NOT COUNT NOT --> NOT NOT Count() --> Count() --> Counting takes up more resources.
December 12, 2006 at 8:09 am
Sorry...I didn't really convey what I was thinking. And I put ROWCOUNT instead of NOCOUNT. Sorry for the confusion!
What's the scope of SET NOCOUNT? Is it per stored procedure, per transaction, per session, or global?
Say I want to use it in triggers... I SET NOCOUNT ON. How do I know whether the triggering insert/update script has it ON or OFF? Or does SQL Server revert the setting when the trigger returns?
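Roughly what I have in mind (trigger and table names below are just made up for illustration):

CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;   -- set inside the trigger
    INSERT INTO dbo.OrdersAudit (OrderID)
    SELECT OrderID FROM inserted;
END;
-- Does that setting leak out to the INSERT that fired the trigger,
-- or does SQL Server put it back the way the calling session had it?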
December 12, 2006 at 8:25 am
NO COUNT OFF --> NOT COUNT NOT --> NOT NOT Count() --> Count() --> Counting takes up more resources.
Thanks for the attempt - it now makes less sense though. I have shown the above to my workmate, who also agrees with what I say - and he is still confused.
Thanks
Dan
December 12, 2006 at 8:33 am
When you turn NOCOUNT OFF you are turning the count ON, thus slowing the results.
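In other words (table name is just a placeholder):

SET NOCOUNT OFF;
UPDATE dbo.SomeTable SET SomeCol = SomeCol;   -- client gets "(n row(s) affected)"

SET NOCOUNT ON;
UPDATE dbo.SomeTable SET SomeCol = SomeCol;   -- no count message is sent back
SELECT @@ROWCOUNT AS RowsTouched;             -- @@ROWCOUNT is still populated either way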
December 13, 2006 at 1:03 am
I think this is a pretty contrived example - a single procedure which loops thousands of times and does a single insert without a select will show dramatic NOCOUNT results, but that's not at all a common use case. If you're inserting thousands of rows and want to maximize your performance, you're much better off spending some time figuring out how to insert in one batch. If you're simulating repeated inserts from an application, your example isn't a valid test because you're ignoring the network communication cost of separate execution calls from the application server or webserver to the database. The reason NOCOUNT is so much faster in local looping scenarios is that the proc without NOCOUNT reports success for each statement while the NOCOUNT proc doesn't; that advantage goes away when an application is making separate calls.
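For example, rather than calling a single-row insert proc thousands of times, something along these lines (all names invented) does the same work in one statement and one round trip:

-- Looping version (repeated N times from the app):
-- EXEC dbo.usp_InsertOrderLine @OrderID = ..., @Qty = ...;
-- Set-based version:
INSERT INTO dbo.OrderLines (OrderID, Qty)
SELECT OrderID, Qty
FROM #StagedOrderLines;   -- rows staged in a single batch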
As others pointed out, NOCOUNT can also have some unanticipated effects on ADO.NET - specifically, it can confuse the SqlDataAdapter.Update() method.
http://weblogs.asp.net/jgalloway/archive/2006/08/30/How-NOCOUNT-affects-ADO.NET.aspx
December 13, 2006 at 9:35 am
Specifying the owner for objects will have a greater effect with greater concurrent usage when the user calling the object isn't the owner. The reason is that you see the SP:CacheMiss and then the secondary search for an execution plan. While it does this search it issues a [COMPILE] lock, which is an exclusive lock, on the object. So if you're running one process at a time you likely won't see a very noticeable difference, but if you've got many connections executing the same stored procedure you'll see a difference.
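In practice that just means owner-qualifying the call (procedure name here is only an example):

EXEC usp_GetOrders;      -- unqualified: a non-dbo caller triggers the cache-miss search and [COMPILE] lock
EXEC dbo.usp_GetOrders;  -- owner-qualified: resolves straight to the cached plan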
K. Brian Kelley
@kbriankelley
December 13, 2006 at 10:18 am
Thanks Brian. I knew that I was right about this.
ATB
Charles Kincaid
December 13, 2006 at 1:12 pm
Sorry Jon, but it is a common use case in a sales order system.
You are right that the example was contrived. I did say in the article that it was run on the actual server and not across the network. This is because I am writing and testing stuff on my home PC and not on my employer's equipment or in my employer's time.
I didn't know about the SqlDataAdapter thing but it doesn't surprise me. I think M$ favours ease of use over good sense. For example, being able to put non-alphabetic characters in SQL Server object names.
Is it really a good idea for .NET developers to be able to put something in place that restricts the database options open to database specialists?
December 13, 2006 at 1:16 pm
"I didn't know about the SqlDataAdapter thing but it doesn't surprise me. I think M$ favours ease of use over good sense. For example, being able to put non-alphabetic characters in SQL Server object names."
I would add that useful characters go beyond just letters. Numbers and the underscore can be very useful. But I agree that everything else could (should) be left out. But I guess we ain't gonna see that for a good while!
December 15, 2006 at 5:36 am
March 13, 2007 at 7:54 am
David -
I don't know if you have a sniffer available to you... If so, get it ready (I'll come back to this in a few).
One of the posts mentioned that they ran a report locally and it ran "fairly" quickly (I think it said 3 minutes - going from memory here, which is going). When run from a remote PC the time jumped significantly when not using "SET NOCOUNT ON".
One of the effects of using "SET NOCOUNT ON", as you indicate, is to suppress the "xxx rows affected" message. This is true. It is not where you get all of your performance boost, though. A secondary effect of using the statement is that it also suppresses what I call "engine chatter". When executing a stored procedure, the engine returns information to the client after the completion of every statement.
If you have that sniffer available - place it on the line, start a capture between a client and the server and execute the tests again. While not using "SET NOCOUNT ON" you will get a packet returned to the client after each statement that contains "DONE_IN_PROC". Depending on the complexity of your stored procedures / application - this can become very verbose... thus my term "chatter".
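To make that concrete, a rough example (procedure and table names invented): with NOCOUNT OFF, each statement in a proc like this sends a DONE_IN_PROC packet back to the client; with NOCOUNT ON that per-statement chatter is suppressed and only the results come back.

CREATE PROCEDURE dbo.usp_BuildReport
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE dbo.ReportStage SET Processed = 1 WHERE Processed = 0;
    DELETE FROM dbo.ReportStage WHERE CreatedOn < DATEADD(DAY, -30, GETDATE());
    SELECT * FROM dbo.ReportStage;
END;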
Cheers -
Eric P. Melillo
March 13, 2007 at 2:55 pm
Thanks for this Eric, do you know what is in the packets that are returned?
August 3, 2007 at 3:09 am
I think when SET NOCOUNT is ON, the value that this call returns should be -1 instead of 0.
[Original Post] In .NET applications an ExecuteNonQuery command returns the number of records affected by the operation. Set NOCOUNT ON means that the value that this call returns is always 0.
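If the client still needs the affected-row count with NOCOUNT ON, one option (sketch only, names made up) is to hand it back explicitly rather than relying on the ExecuteNonQuery return value:

CREATE PROCEDURE dbo.usp_DeactivateCustomer
    @CustomerID   INT,
    @RowsAffected INT OUTPUT
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE dbo.Customers SET IsActive = 0 WHERE CustomerID = @CustomerID;
    SET @RowsAffected = @@ROWCOUNT;   -- @@ROWCOUNT is unaffected by NOCOUNT
END;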
December 7, 2007 at 7:40 am
Try running the dbo/nondbo test again where there are objects with the same name owned by the non-dbo user. See if anything is different then.
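Something along these lines (proc names invented, and assuming a second user/schema called testuser already exists):

CREATE PROCEDURE dbo.usp_Test AS SELECT 1 AS WhichOne;
GO
CREATE PROCEDURE testuser.usp_Test AS SELECT 2 AS WhichOne;
GO
-- Then, running as testuser, compare the plan-cache behaviour of:
EXEC usp_Test;       -- unqualified: name resolution checks testuser's objects first
EXEC dbo.usp_Test;   -- owner-qualified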
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
December 7, 2007 at 7:43 am
I'd like to add a little dissent here to see what you all think.
I've never been in favor of using this universally just because it makes sense in some areas, for the following reason: sometimes you need the row count returned, and this statement makes that impossible, especially where updates/deletes are concerned. I think it is better to leave the possibility open and let the programmer control whether it is needed. There is an ADO property that tells the SP that the record count is not required, and that can be used instead. If the record count isn't required for a particular procedure, which is the most common case, the programmer sets that property; otherwise not. This allows messages such as "x records updated" or "x records deleted" to be returned to the user. It would be interesting to know what effect using that has on performance, although that's a wider test than just the database.
This article has convinced me, however, that SPs designed to affect one record can have this universally. Not only is the row count already known, but the overhead of the extra count message, relative to the size of what would otherwise be returned, is higher than when, say, returning multiple records.