June 2, 2004 at 3:51 pm
We have a situation where, in the application, a user can make multiple operations like add/update/delete and eventually decide to save or roll back all of the actions. Our UI team (a bunch of gurus) want to send each action of the user to the database as it happens, and finally tell us to commit or roll back the data. I fought my battles trying to convince them to cache the user actions in the front end and send me the whole batch of operations only if the user decides to save the changes, but not much luck there. So I was wondering if anyone has been in a similar situation and has some ideas here. My Add/Update/Delete are all separate SPs. Please let me know if you need me to elaborate.
Thanks for your time
June 2, 2004 at 4:17 pm
The problem with writing this to the database in a transaction and holding the transaction open is that it will effectively lock the affected rows or tables until the transaction is committed or rolled back.
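To make that concrete, here's a minimal T-SQL sketch of the blocking problem (table and column names are invented for illustration):

-- Connection 1: start a transaction, change a row, then sit idle
-- while the user fills out the form. The exclusive lock on the row
-- is held until COMMIT or ROLLBACK.
BEGIN TRANSACTION
UPDATE dbo.Orders SET Status = 'Pending' WHERE OrderID = 42
-- ...user wanders off to lunch; the transaction stays open...

-- Connection 2 (default READ COMMITTED): this blocks until
-- connection 1 commits or rolls back.
SELECT Status FROM dbo.Orders WHERE OrderID = 42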
We had a similar need in my department. What we came up with is VERY difficult to manage and VERY expensive resource-wise, but it works great for our use.

We have a publisher server, and each "user" has SQL Server installed on their workstation. The users have a merge subscription that they can do anything they want with, and when they deem the data ready they "check it in" via a custom, dynamically filtered merge replication scenario. We also have triggers that let us lock the data on the publisher, and our custom application checks for those locks before allowing the user to modify the data.

We then went further: we implemented the full recovery model for backups and removed the trig_backupset_delete trigger in the msdb database so that the rows in msdb's logmarkhistory table don't get deleted. All transactions against the database are marked transactions, via scripts or stored procedures using the WITH MARK option on the BEGIN TRANSACTION statement. Dropping the trigger means these marks are retained in logmarkhistory, and we can read that table from our front-end app. We then have a procedure that lets the user select the mark they want to roll the database back to. This restores the database to that point in time in the transaction log. Finally, we clean up the logmarkhistory table so it only shows the transactions before the last mark they restored to.
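For anyone who hasn't used marked transactions, here is a rough sketch of the mark-and-restore cycle described above (database names, file paths, mark names, and the example procedure are placeholders, not our actual setup):

-- Every change goes through a named, marked transaction.
BEGIN TRANSACTION UserCheckpoint WITH MARK 'Before batch 123'
EXEC dbo.spDoWork @prm1 = 23, @prm2 = 'newvalue'
COMMIT TRANSACTION UserCheckpoint

-- The mark is recorded in msdb.dbo.logmarkhistory, which the front
-- end reads to offer the user a list of rollback points.
SELECT mark_name, description, mark_time
FROM msdb.dbo.logmarkhistory

-- Rolling back to a mark: restore the full backup WITH NORECOVERY,
-- then replay the log, stopping just before the chosen mark.
RESTORE DATABASE MyDb FROM DISK = 'C:\Backups\MyDb.bak'
    WITH NORECOVERY
RESTORE LOG MyDb FROM DISK = 'C:\Backups\MyDb.trn'
    WITH STOPBEFOREMARK = 'UserCheckpoint', RECOVERY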
When "Checking In" the data we clean up the locking tables, the logmarkhistory table, and re-initialize the backup files by doing a full backup, diff backup and log backup after the merge replication has finished.
Finally, we have stored procedures that denormalize the data to a flat file so that we can diff two different databases to see what changes have been made and/or where we have problems with the data.
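The denormalizing procedure is nothing fancy; the idea is just a wide, deterministically ordered result set that can be dumped to a text file and compared with a diff tool. A hypothetical sketch (table and column names invented):

CREATE PROCEDURE dbo.spExportFlat
AS
SET NOCOUNT ON
-- One row per lowest-level detail record, with the parent keys
-- repeated on every row, ordered so two exports line up row for row.
SELECT c.CustomerID, c.CustomerName,
       o.OrderID, o.OrderDate,
       d.ProductID, d.Quantity
FROM dbo.Customers c
LEFT JOIN dbo.Orders o ON o.CustomerID = c.CustomerID
LEFT JOIN dbo.OrderDetails d ON d.OrderID = o.OrderID
ORDER BY c.CustomerID, o.OrderID, d.ProductID
GO
-- Dumped to disk from the command line, e.g.:
-- bcp "EXEC MyDb.dbo.spExportFlat" queryout flat.txt -c -S myserver -T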
As I stated, this is a very complex arrangement that I have to maintain daily. It's not something I would wish on someone else, but it works very well for us.
Gary Johnson
Microsoft Natural Language Group
DBA, Sr. DB Engineer
This posting is provided "AS IS" with no warranties, and confers no rights. The opinions expressed in this post are my own and may not reflect that of my employer.
June 2, 2004 at 4:24 pm
cbarus,
I agree with your first thought wholeheartedly. Here's some more ammo for your battle.
If you create a transaction in T-SQL for something that lasts a long period of time (e.g. waiting on a user to fill out a form), then you will lock whatever records/tables (depending on the locking method) are being updated. This would cause poor application performance, and your users would be very unhappy.
Depending on what you're developing this in, this might not work, but... I would suggest investigating ADO or ADO.NET and how you can create a transaction through that to the SQL Server. I've only read about it a little, but I don't believe it creates these locks on the SQL Server the way it does in T-SQL.
If that doesn't work, then I would suggest that the UI team also keep track of each transaction within the app, so that if the user decides to roll back they can undo it manually. Obviously this isn't the most elegant method, but the UI team made their bed, so let them lie in it.
Brad
June 3, 2004 at 2:05 am
I have two suggestions:
1. If the UI guys are using .NET, they can make use of a DataSet and perform all the changes in the DataSet, then either commit or reject all the changes made to it. The actions never hit the database until a save is done on the DataSet.
2. If they cannot cache the actions done by the user on the client, it is possible to cache them in the database. Have them send all the commands as strings, e.g. "exec spTable_Insert @prm1 = 23, @prm2 = 'newvalue'".
Store all the command strings in a table, and when a save is issued from the client, the database can loop through and execute all the command strings in a transaction (see the sketch at the end of this post).
(Charge them extra for doing their work.)
One issue: if the database has identity columns or computed columns, that may be a reason why they want to "chat" with the database, so that they can retrieve and use those values in their app.
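For what it's worth, here is a rough sketch of suggestion 2 (table and procedure names are made up, and the error handling is kept to the plain @@ERROR style):

-- Queue table: one row per deferred user action, in order.
CREATE TABLE dbo.PendingCommands (
    SessionID  int NOT NULL,
    Seq        int NOT NULL IDENTITY(1,1),
    CommandSQL nvarchar(4000) NOT NULL
)
GO

-- Runs when the user finally clicks Save: execute every queued
-- command for the session inside one transaction.
CREATE PROCEDURE dbo.spSaveSession
    @SessionID int
AS
SET NOCOUNT ON
DECLARE @cmd nvarchar(4000), @err int

BEGIN TRANSACTION

DECLARE cmds CURSOR LOCAL FAST_FORWARD FOR
    SELECT CommandSQL FROM dbo.PendingCommands
    WHERE SessionID = @SessionID
    ORDER BY Seq

OPEN cmds
FETCH NEXT FROM cmds INTO @cmd
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC (@cmd)
    SET @err = @@ERROR
    IF @err <> 0
    BEGIN
        -- Any failure rolls back the whole batch.
        CLOSE cmds
        DEALLOCATE cmds
        ROLLBACK TRANSACTION
        RETURN @err
    END
    FETCH NEXT FROM cmds INTO @cmd
END
CLOSE cmds
DEALLOCATE cmds

-- All commands succeeded: clear the queue and commit everything.
DELETE dbo.PendingCommands WHERE SessionID = @SessionID
COMMIT TRANSACTION
RETURN 0
GO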
/rockmoose
You must unlearn what You have learnt