October 20, 2012 at 11:26 pm
Hi,
I have a stored procedure that calculates data based on criteria; the results are a percentage and a duration. It takes 8-10 seconds before it displays the result.
The problem is that when more users try to generate the report, it seems to take more time.
My problem is that when I tried to run a simple update statement such as the one below, it resulted in a "time out expired" error, and it affects the entire process of the application because users cannot perform their tasks.
update table
set status = 1
where keyId = 6
I don't have any locking on my select or my update.
What is the best practice to resolve this?
I have 100 users nationwide aside from the report users. Please help.
Thanks in advance
Dabuskol
October 21, 2012 at 4:33 am
You always have locking on updates, and you will have locking on the select unless you are in one of the snapshot isolation levels.
Are you seeing blocking, or is the update just slow?
What's the entire process that the update is part of?
Got any triggers on that table?
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
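For context on the snapshot isolation option mentioned above, row versioning can be enabled at the database level so that readers see the last committed version instead of blocking behind writers. A minimal sketch, assuming a hypothetical database named MyAppDb (the database name and the maintenance-window caveat are assumptions, not from the thread):

```sql
-- Turning on read committed snapshot requires that no other connections
-- are active in the database; ROLLBACK IMMEDIATE forces them off.
-- Run this during a quiet period and test first.
ALTER DATABASE MyAppDb
    SET READ_COMMITTED_SNAPSHOT ON
    WITH ROLLBACK IMMEDIATE;

-- Verify the setting took effect:
SELECT name, is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'MyAppDb';
```

Note that row versioning adds load to tempdb, which is one reason to read up on the trade-offs before enabling it.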
October 21, 2012 at 4:59 am
No Triggers.
But I have a lot of Select where it exists on many different reports.
What I did for now is to add from my select statement on all reports
"WITH (NOLOCK)", dirty read is no problem at all.
Regards
Dabuskol
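For reference, the change described above would look something like this in a report query (table and column names here are hypothetical, purely for illustration):

```sql
-- NOLOCK (equivalent to READUNCOMMITTED) lets the select read rows
-- that other transactions have modified but not yet committed.
SELECT keyId, status
FROM dbo.RequestTracking WITH (NOLOCK)
WHERE status = 1;
```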
October 21, 2012 at 5:08 am
Bad, bad, bad idea. Unless your users are happy with their reports being potentially inaccurate (have you asked them if 'mostly correct' is acceptable? Have you asked your manager if making the reports 'mostly accurate' is an acceptable fix?).
See - http://blogs.msdn.com/b/davidlean/archive/2009/04/06/sql-server-nolock-hint-other-poor-ideas.aspx
I think you need to have a read through chapter 6 (at least) of http://www.simple-talk.com/books/sql-books/troubleshooting-sql-server-a-guide-for-the-accidental-dba/
Gail Shaw
October 21, 2012 at 5:17 am
He did say "dirty read is no problem at all"...so what you are asking for help with is the performance of the reports, is that correct?
If you are trying to determine who is blocking whom, click the link in my sig; I just posted a blocking script that could help you. If you're looking for something else, please give us some more context. How often are these tables updated? Why are you running that update during the production day? Can it be run off hours?
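As an illustration of what such a blocking check can look like (this is a generic sketch using the standard dynamic management views, not the script from the sig link):

```sql
-- List requests that are currently blocked, who is blocking them,
-- what they are waiting on, and the SQL text they are running.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time AS wait_time_ms,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;
```

Running this while a report is slow shows whether the report is genuinely blocked (non-zero blocking_session_id) or simply slow on its own.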
October 21, 2012 at 6:23 am
Thanks for the info.
Inserts and updates are common transactions, occurring every second considering the number of users I have.
The application is a tracking system where approval and confirmation are often needed in order to approve and implement a certain request.
My problem is that nowadays I often encounter "time out expired" errors, only to find out that one of my many reports is running slow; after some investigation I found that this one is causing the timeout.
I need to solve this issue because it even happens at night, when users are stuck and can't work.
That is the very reason I used NOLOCK, just to overcome the locking issue, but per best practice it is not recommended.
What is the best approach to this problem? Our administrator is not experienced enough to explain to me in detail what is written in the recommendations.
I have been trying to solve this problem for more than a month; although it happens in isolated cases, it affects the integrity and performance of the application.
Thanks in advance
Dabuskol
October 21, 2012 at 6:30 am
SQLWorks (10/21/2012)
He did say "dirty read is no problem at all"...
Yes, but in my experience most people who say that don't realise what it means. It's not just a case of the report showing old values for a row that's just been updated; that is the least of the anomalies that NOLOCK can cause.
I had an architect who maintained that position and insisted on NOLOCK on all reports, until I showed him a report that had 3 of a particular client's transactions duplicated, resulting in the total being out by around 40%.
Gail Shaw
October 21, 2012 at 6:45 am
SqlUser-369680 (10/21/2012)
What is the best approach to this problem?
Please read the book chapter that I referenced.
Gail Shaw