May 2, 2003 at 7:11 am
Our software application has two main sets of users:
1) Call center personnel performing customer maintenance (traditional OLTP type transactions)
2) Back office personnel performing massive updates across many customers (billing)
Is there any way to throttle a particular user, connection, or command so that it does not consume all of the server's resources?
We had a recent example where our old method of finding duplicate data was to read the data back to a client, hash it, and find the duplicates on the client. That process took 2 hours. The programmer was able to find the duplicates with a relatively simple GROUP BY query using a HAVING COUNT(*) > 1 clause, and the process finished in 10 minutes. The problem was that processing so much data crushed the SQL Server, and all the other connections ended up with terrible performance (timeouts left and right).
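For reference, a minimal sketch of that set-based duplicate check, assuming a hypothetical Customers table where duplicates are defined by matching LastName and Phone:

-- Set-based duplicate check: group on the columns that define a
-- duplicate and keep only groups with more than one row.
-- Table and column names here are hypothetical.
SELECT LastName, Phone, COUNT(*) AS DupCount
FROM Customers
GROUP BY LastName, Phone
HAVING COUNT(*) > 1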
It would be great if the one application could run longer but not crush the server.
Does anyone know of any settings to help here?
Robert F Knopf
May 2, 2003 at 11:45 am
You can set a query timeout, either on the server or on the command/connection running the query. Not the best solution, but sometimes better than nothing. For long-running processes, locks are usually the problem; adding NOLOCK hints can help in that regard. A more brute-force solution is to replicate the data to a different server and run decision-support type operations against it.
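To make those suggestions concrete, here is a hedged sketch. It reuses the hypothetical Customers duplicate query from above with a NOLOCK hint, and then shows one server-side knob in the spirit of a query timeout: the query governor cost limit, which rejects (rather than times out) queries whose estimated cost exceeds a threshold. The 300-second limit is an example value, not a recommendation; the per-command timeout itself is set on the client, e.g. the ADO Command object's CommandTimeout property.

-- Read without taking shared locks, so the long scan does not block
-- OLTP writers (at the cost of possible dirty reads).
SELECT LastName, Phone, COUNT(*) AS DupCount
FROM Customers WITH (NOLOCK)
GROUP BY LastName, Phone
HAVING COUNT(*) > 1

-- Server-wide query governor (an advanced option): refuses to start
-- any query whose estimated cost exceeds 300 seconds.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'query governor cost limit', 300
RECONFIGURE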
Andy