June 9, 2009 at 3:36 pm
In our software, we have an import/update process that is locking one of our tables and causing timeouts on other queries. The process runs in C# code and calls a single stored procedure many (15,000) times within one transaction (the transaction is opened on the C# side).
I proposed changing the process to a single stored procedure call that sends all the data to be modified as XML. We could extract the data from the XML into a temp table and then run the updates as a single, faster operation.
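Roughly what I have in mind, as a minimal sketch (the table, column, and XML element names here are made up for illustration):

-- Hypothetical sketch of the proposed batch-update procedure.
CREATE PROCEDURE dbo.BatchUpdateItems
    @Updates XML
AS
BEGIN
    SET NOCOUNT ON;

    -- Shred the XML into a temp table once, up front.
    SELECT
        x.value('(ItemID)[1]',    'INT')           AS ItemID,
        x.value('(ItemValue)[1]', 'NVARCHAR(100)') AS ItemValue
    INTO #Updates
    FROM @Updates.nodes('/Items/Item') AS T(x);

    -- One set-based update replaces 15,000 single-row calls.
    UPDATE tgt
    SET    tgt.ItemValue = src.ItemValue
    FROM   dbo.Items AS tgt
    JOIN   #Updates  AS src
           ON src.ItemID = tgt.ItemID;
END;

The C# side would then serialize the rows into one XML document and make a single call.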
So I have two questions:
1) Is this a worthwhile approach to the problem?
2) If so, I'm faced with the issue that the original stored procedure for single updates is used in other places and holds a large chunk of business logic, so we would end up maintaining the same logic in two places, and I think our design team will have a problem with that.
Has anyone dealt with this issue before, and were you able to find a satisfactory resolution?
Thanks...
June 10, 2009 at 7:26 am
Yes, getting that many updates into a single statement is the best mechanism. You should experiment with using XQuery directly on the XML versus loading the data into a temporary table (not a table variable), because you may find the XQuery is faster than the write and read the temp table approach requires. But, yeah, you're on the right track. And yes, this is how I've solved this problem in the past.
Just an FYI, though: single-row inserts/updates usually become much less performant than batch inserts/updates somewhere between 100 and 500 rows, depending on the data set, indexes, etc. So you might want to build the smarts into the client side so that updates past X number of rows take an alternate path. Conversely, the cost of shredding XML makes set-based operations below that 100-row threshold more expensive than a transaction of single-row inserts/updates. But your mileage may vary, so test, test, test.
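To make the comparison concrete, here's a minimal sketch of the XQuery-direct variant, reusing the hypothetical @Updates parameter and names from the earlier post (they're invented, so adjust to your schema):

-- Same @Updates XML parameter; no intermediate temp table.
UPDATE tgt
SET    tgt.ItemValue = src.ItemValue
FROM   dbo.Items AS tgt
JOIN  (SELECT x.value('(ItemID)[1]',    'INT')           AS ItemID,
              x.value('(ItemValue)[1]', 'NVARCHAR(100)') AS ItemValue
       FROM   @Updates.nodes('/Items/Item') AS T(x)) AS src
      ON src.ItemID = tgt.ItemID;

Whether this beats the temp table version depends on row counts and how many times the XML gets touched, which is why you need to test both.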
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
June 11, 2009 at 6:19 am
8kb,
I think Grant has the right ideas. I wish to comment on something you wrote:
...and I think our design team will have a problem with that.
I believe your design team has a problem already: calling an SP as you describe is very bad design to begin with. It's a bit strange to allow such an implementation and then worry about having to maintain similar logic in two places...
Just my 2c.
Greetz,
Hans Brouwer
June 11, 2009 at 6:32 am
I have to agree with Hans here. You may need to maintain logic in more than one location simply because you can't have a single solution for all problems. As you've found out, single row inserts across large transactions will cause lock escalation up to a table lock. That doesn't mean that single row inserts are bad. It means that lots of single row inserts can cause problems. So you need to be able to deal with the fact that sometimes, the solution you already have is the best approach, but sometimes it isn't. That's just a practical reality. Hopefully your design team can cope with that fact.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
June 11, 2009 at 10:09 am
First off, thanks for the replies. Your answers made me feel confident about the approach and helped me sell the concept to the team leads. I have begun working on the code and will be testing later this week. Luckily for us, we can very easily duplicate the blocking on our dev environment so we'll know immediately if this new method will correct the problem.
As for the duplicated business logic, it wasn't as much of a concern to the team as I thought it would be. Apparently it's a situation that already exists in the code, and its maintenance hasn't been an issue. But we do have some ideas on how to consolidate if needed.
June 11, 2009 at 10:20 am
If you can upgrade to SQL2008, the following may be worth looking at as an alternative to XML.
http://www.sqlservercentral.com/articles/SQL+Server+2008/66554/
It looked like an interesting idea but I have not yet got around to playing with it.
June 11, 2009 at 8:17 pm
[font="Verdana"]I faced this very self same issue with a client a little over a year ago, who were running SQL Server 2005. The situation there was exacerbated by triggers.
The solution I proposed was:
1. Create a gateway stored procedure that accepts an XML list of updates (preferably use typed XML with a common schema shared between the application and the database.)
2. That stored procedure would extract the XML into a table variable or temporary table
3. The table variable/temporary table would then be used to generate set-based inserts, updates and deletes
With 2008, you could vary this by passing a table-valued parameter instead of XML. However, it's probably easier for the external application to work with XML, so I'd be tempted to stay with that approach.
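For completeness, a minimal sketch of the table-valued parameter variant (SQL Server 2008 and later; the type, table, and column names are invented):

-- Define a table type once (SQL Server 2008+).
CREATE TYPE dbo.ItemUpdateList AS TABLE
(
    ItemID    INT           NOT NULL PRIMARY KEY,
    ItemValue NVARCHAR(100) NOT NULL
);
GO
-- The procedure takes the whole batch as one read-only parameter.
CREATE PROCEDURE dbo.BatchUpdateItemsTVP
    @Updates dbo.ItemUpdateList READONLY
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE tgt
    SET    tgt.ItemValue = src.ItemValue
    FROM   dbo.Items AS tgt
    JOIN   @Updates  AS src
           ON src.ItemID = tgt.ItemID;
END;

From C#, you'd pass a DataTable (or IEnumerable of SqlDataRecord) as a parameter of SqlDbType.Structured, which skips the XML shredding cost entirely.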
If that sounds a lot like what you're already doing... it is. 😀 So I concur that this is a better design approach than creating lots of single-row procedural gateways. No one said CRUD had to be limited to one row at a time.