December 5, 2005 at 4:22 am
Hi guys, I don't really understand how best to work with my stored procs from ADO.NET.
Here's an example:
Say I have an ADO.NET DataTable which contains records that the user wishes to delete. I have an SP on that table that takes the table's primary key and a timestamp as arguments. Here's where I need help: do I have to foreach through each record in this DataTable and call my SP for each record, or is there some way to pass my SP the PKs and timestamps for ALL my records in one go? They all need to be in the same transaction anyway.
I have lots of operations to perform with similar requirements. I will have a DataTable with potentially many records (no more than 20 or 30, though) containing inserted, modified and deleted records. Each SQL table has 4 SPs - one each for selects, inserts, updates and deletes. I extract DataTables with just the set of records I'm interested in (inserted, deleted and modified DataTables). How do I call my SPs and pass them the arguments they need for ALL the rows in my DataTable, rather than one at a time?
Cheers
December 5, 2005 at 4:37 am
I don't think this is a 'calling issue', but rather a proc design issue...
How are your procs written? Can they handle multiple-row deletes or are they designed for singleton deletes? You call your procs in whatever fashion they expect in order to do their work properly.
Remember that a procedure is just an encapsulation of a piece of code. What this code can or cannot do depends on how it's written.
/Kenneth
December 5, 2005 at 4:46 am
Perhaps you could comment on these SP's?
Create Procedure WorkType_Upd
@WorkTypeCode varchar(5),
@WorkTypeDesc varchar(50),
@ChangeStamp timestamp
AS
Update WorkType Set WorkTypeDesc = @WorkTypeDesc
Where ChangeStamp = @ChangeStamp and
WorkTypeCode = @WorkTypeCode
GO
Create Procedure WorkType_Del
@WorkTypeCode varchar(5),
@ChangeStamp timestamp
AS
Delete from WorkType
where WorkTypeCode = @WorkTypeCode and
ChangeStamp = @ChangeStamp
GO
Here's a sample update and a sample delete. I am picking up any errors in the ADO.NET code. The ADO.NET code currently calls these SPs once for each row that is being updated or deleted. It just seems inefficient to do it that way. I wanted to know whether it is possible to pass a whole table of updates/deletes/inserts to SQL and have it process them en bloc, rather than having traffic to and fro for each individual record. Obviously this is what happens when you write the SQL code in ADO.NET, but I haven't seen an example using SPs.
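For illustration (the values here are made up), the per-row pattern amounts to traffic like this - one call per deleted row, all inside the same ADO.NET transaction:
exec WorkType_Del @WorkTypeCode = 'MAINT', @ChangeStamp = 0x00000000000007D1
exec WorkType_Del @WorkTypeCode = 'ADMIN', @ChangeStamp = 0x00000000000007D2
-- ...and so on for every row in the deleted DataTable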
If these SPs are not appropriate for such an architecture, could you provide some pointers as to what would need changing to allow it, please?
Thanks
December 5, 2005 at 5:07 am
Yes, those are single-value statements. Since each variable can only contain one value, each delete or update handles only one SARG at a time. (In theory more than one row could match, but in practice this should be the PK, so at most one row qualifies.)
Whether it's 'efficient' or not is a matter of opinion. Don't forget that there's more to 'efficiency' than just counting the number of roundtrips over the wire. Above all, code should be robust. Many times that also means 'simple' - not overly complex or hard to read or understand. You must be able to trust that it does what's expected - nothing less and nothing more. The examples you have look simple and robust. There's nothing strange going on, so it's also easy to verify the results after they execute.
Since you mentioned that the max 'batch' would be 20-30 rows, I'm not sure I'd consider that worth the trouble of redesigning the functionality the procs provide. It's doable, but... it's not a one-stop shop. It's not enough to just rewrite the procs; you must make this an overall design decision and follow it through from top to bottom - else you risk that your final solution isn't 'robust' enough, and therefore maybe not as 'efficient' as intended...
...in the end, you might say that this is one of those 'it depends' thingys.. (again)
/Kenneth
December 5, 2005 at 5:24 am
I am most grateful for your response Kenneth.
I think in hindsight it is probably best to leave this system the way it is. However, I would very much like to see what the update SP above might look like were it re-engineered to accept multiple rows at a time. Do you have pointers to other examples I might look at, please?
Thanks
December 5, 2005 at 5:42 am
imo the most scalable 'arbitrary-multi-row-argument' method is to update (or delete) by joining the base table against another table holding the keys for the rows you want to update or delete. The problem is then how to seed this 'key' table with the keys for the rows in question.
I mean, there's nothing strange about doing an update like this:
update a
set a.col1 = b.col1,
    a.col2 = b.col2,
    ......
from myBaseTable a
join #myTempKeyTable b
  on a.PK = b.PK
#myTempKeyTable takes the role of placeholder instead of @myVariable...
The 'fun' starts when trying to decide on the best method of getting the keys into this table... and what happens if one fails? How do you error-handle it? The transaction scope may also be extended, since the 'seeding' of the key values may be considered part of the overall transaction... and it is indeed added database activity... and so on and so forth. It becomes more involved to keep the solution 'robust'. However, sometimes it's worth it - though generally not in ad-hoc situations, but rather in a batch context, where you may want to load or change large volumes of data.
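Just as a sketch (untested, and the names are my own invention), a multi-row delete built this way might look something like the following. The caller creates and seeds the temp table, then one proc handles all the rows in a single statement:
create table #WorkTypeKeys
(
    WorkTypeCode varchar(5) not null primary key,
    ChangeStamp binary(8) not null -- binary(8), since you can't insert values into a timestamp column
)
GO
Create Procedure WorkType_DelMulti
AS
-- note: #WorkTypeKeys must exist in the caller's session when this runs
declare @err int, @rows int
Begin Tran
-- one set-based delete covers every key in the temp table
Delete w
from WorkType w
join #WorkTypeKeys k
  on w.WorkTypeCode = k.WorkTypeCode
 and w.ChangeStamp = k.ChangeStamp
select @err = @@error, @rows = @@rowcount
If @err <> 0
Begin
    Rollback Tran
    Return 1
End
-- if @rows is less than the number of keys supplied, some rows failed the
-- timestamp check (changed by someone else) - yet another case to decide on
Commit Tran
GO
The seeding itself (one insert per key, for instance) is where the extra design work - and the added database activity - comes in.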
/Kenneth
December 5, 2005 at 7:53 am
Personally I do what you described by creating a proc that updates a single row using the key provided, and calling it from the app as many times as required. This is the simplest and easiest method, although as stated you have round-trip networking to consider.
Alternatively you can do as Kenneth suggests (which I believe is the best method for multiple-key input, as stated by other people on this site). This site has several threads detailing a UDF that parses a string into a table for use in a join.
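For example, a bare-bones version of such a function and its use might look like this (the names are made up, and the timestamp check is left out for brevity - search the site for the full discussions):
Create Function dbo.fnSplitCodes (@List varchar(8000))
Returns @Codes table (WorkTypeCode varchar(5) not null)
AS
Begin
    Declare @Pos int
    Set @Pos = CharIndex(',', @List)
    While @Pos > 0
    Begin
        -- take everything up to the next comma as one key
        Insert @Codes Select LTrim(RTrim(Left(@List, @Pos - 1)))
        Set @List = Substring(@List, @Pos + 1, 8000)
        Set @Pos = CharIndex(',', @List)
    End
    If Len(@List) > 0
        Insert @Codes Select LTrim(RTrim(@List)) -- the last (or only) key
    Return
End
GO
Create Procedure WorkType_DelList
    @CodeList varchar(8000)
AS
-- one call deletes every code in the comma-separated list
Delete w
from WorkType w
join dbo.fnSplitCodes(@CodeList) k
  on w.WorkTypeCode = k.WorkTypeCode
GO
-- usage: exec WorkType_DelList 'MAINT,ADMIN,TRAVL'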
Far away is close at hand in the images of elsewhere.
Anon.
December 6, 2005 at 8:18 am
Well, actually what I'm suggesting is to stick with things as they are - the light load doesn't seem to justify a remake of the design, I believe.
/Kenneth