August 2, 2008 at 11:49 am
Table1
Sr | Debit | Credit | Balance
 1 |   100 |      0 |     100
 2 |   100 |      0 |     200
 3 |     0 |     50 |     150
 4 |   100 |      0 |     250
We can only enter the values of Sr, Debit, and Credit; the Balance column should be updated automatically.
Balance = Debit - Credit + Balance (of last row)
Please guide me.
August 3, 2008 at 5:25 am
You can do this with a trigger.
CREATE TRIGGER insert_balance ON [dbo].[TEST]
FOR INSERT
AS
UPDATE Test
SET Balance = (inserted.Debit - inserted.Credit)
            + COALESCE((SELECT TOP 1 Balance
                        FROM Test
                        WHERE ID < (SELECT MAX([ID]) FROM Test)
                        ORDER BY ID DESC), 0)
FROM inserted
INNER JOIN Test
    ON Test.ID = inserted.ID
- Zahran -
August 3, 2008 at 9:54 am
That will work just fine, but only if a single row is inserted at a time. If there's a chance that more than one row can be inserted at a time, you'll need to take a different tack.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 3, 2008 at 10:12 am
Jeff Moden (8/3/2008)
That will work just fine, but only if a single row is inserted at a time. If there's a chance that more than one row can be inserted at a time, you'll need to take a different tack.
Can you post the solution for more than 1 row at the same time !?!
😎
August 3, 2008 at 10:23 am
Sorry, was in a hurry. I should have posted them (more than 1) without even being asked.
There are a couple of ways to do it... the method with the shortest amount of code looks set based, but actually uses a "triangular join" to get the job done. That short method can be thousands of times slower than a cursor.
The method with a longer amount of code is lightning fast... it can do a running total recalc on a million rows in about 7 seconds... imagine how fast it runs on just a couple of hundred...
Here's the link where both methods are discussed... it also has a link for the cursor method which I also don't recommend. I obviously recommend the faster code...
http://www.sqlservercentral.com/articles/Advanced+Querying/61716/
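For illustration only (this is my sketch, not code from the article): the "triangular join" pattern being warned against typically looks like a correlated subquery over all prior rows, using the table and column names from the original post:

```sql
-- Sketch of a triangular-join running total (the slow, set-based-looking way).
-- For each output row, the subquery re-scans every earlier row, so the work
-- grows roughly with the square of the row count.
SELECT t.ID,
       t.Debit,
       t.Credit,
       (SELECT SUM(prev.Debit - prev.Credit)
        FROM Test prev
        WHERE prev.ID <= t.ID) AS Balance
FROM Test t
ORDER BY t.ID;
```

It looks compact, but the hidden rescans are why it can lose to a cursor by orders of magnitude on large tables.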
--Jeff Moden
Change is inevitable... Change for the better is not.
August 3, 2008 at 12:01 pm
Jeff Moden (8/3/2008)
Sorry, was in a hurry. I should have posted them (more than 1) without even being asked. There are a couple of ways to do it... the method with the shortest amount of code looks set based, but actually uses a "triangular join" to get the job done. That short method can be thousands of times slower than a cursor.
The method with a longer amount of code is lightning fast... it can do a running total recalc on a million rows in about 7 seconds... imagine how fast it runs on just a couple of hundred...
Here's the link where both methods are discussed... it also has a link for the cursor method which I also don't recommend. I obviously recommend the faster code...
http://www.sqlservercentral.com/articles/Advanced+Querying/61716/
I would (personally) forget about any process that updates or tries to maintain this column using a trigger. I would just create it as a computed column or use a view with the calculation in it.
Much easier to maintain and always correct.
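A minimal sketch of the view idea, assuming the table is dbo.Test with an ID ordering column (note this correlated subquery is itself a triangular join, so the performance caveats discussed above still apply):

```sql
-- Hypothetical view that derives Balance instead of storing it.
-- Always correct, since it is computed from Debit/Credit on every read.
CREATE VIEW dbo.TestWithBalance
AS
SELECT t.ID,
       t.Debit,
       t.Credit,
       (SELECT SUM(x.Debit - x.Credit)
        FROM dbo.Test x
        WHERE x.ID <= t.ID) AS Balance
FROM dbo.Test t;
```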
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
August 3, 2008 at 4:34 pm
Jeffrey Williams (8/3/2008)
I would (personally) forget about any process that updates or tries to maintain this column using a trigger. I would just create it as a computed column or use a view with the calculation in it.
Perfect... with the idea that the running total may not be for consecutive ID's, would you mind posting the formula for the computed column and the view with the calculation in it? Thanks.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 3, 2008 at 6:37 pm
Jeff Moden (8/3/2008)
Perfect... with the idea that the running total may not be for consecutive ID's, would you mind posting the formula for the computed column and the view with the calculation in it? Thanks.
As you know - the formula would really depend upon the exact requirements.
Looking back at the original post, I have to say I missed the running total aspect and thought it was straight totalling across the rows.
So, to do this would require a function - and would require an identifier to group, unless it is a running total for the whole table. And, yes - would probably have a bit of a performance hit.
Note to self: read (and understand) the question before answering. 🙂
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
August 3, 2008 at 8:46 pm
Heh... thanks for the feedback... I've done the same thing, many times. Didn't know for sure if you were on to something that I wasn't aware of or not and wanted to see what you had.
If you haven't done so already, take a look at the article in the following link... it's amazingly fast. I don't even include running totals in most nightly runs now except for things like bank statements, and then only temporarily. Sure, for archived rows, I'll store running totals, but not for unarchived rows.
http://www.sqlservercentral.com/articles/Advanced+Querying/61716/
--Jeff Moden
Change is inevitable... Change for the better is not.
August 3, 2008 at 9:28 pm
Jeff Moden (8/3/2008)
Heh... thanks for the feedback... I've done the same thing, many times. Didn't know for sure if you were on to something that I wasn't aware of or not and wanted to see what you had. If you haven't done so already, take a look at the article in the following link... it's amazingly fast. I don't even include running totals in most nightly runs now except for things like bank statements, and then only temporarily. Sure, for archived rows, I'll store running totals, but not for unarchived rows.
http://www.sqlservercentral.com/articles/Advanced+Querying/61716/
Yeah, I have read it and as soon as I understood it was a running totals realized that any other approach would not perform as well.
The systems I work on generally don't require running totals - and for those that do, we have the data in the data warehouse already summarized.
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
August 4, 2008 at 5:08 am
I wonder how they do running totals in data warehouses in SQL Server?
--Jeff Moden
Change is inevitable... Change for the better is not.
August 4, 2008 at 8:03 am
Jumping in...
I assume each debit and credit is associated with a user identifier and has a post date time stamp. If that is the case, it seems to me that you should be able to do a view, grouping by the user identifier where you subtract the sum of the credits from the sum of the debits. You could get a point in time balance by including an ending date in a where clause.
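The suggestion above might look like this, assuming hypothetical UserID and PostDate columns on the transactions table (neither appears in the original post):

```sql
-- Point-in-time balance per user: sum debits minus credits up to a cutoff.
-- @AsOfDate is an assumed parameter for the "ending date" mentioned above.
SELECT UserID,
       SUM(Debit) - SUM(Credit) AS Balance
FROM dbo.Test
WHERE PostDate <= @AsOfDate
GROUP BY UserID;
```

Note this yields a single balance per user as of a date, not a row-by-row running total.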
August 4, 2008 at 10:20 am
Jeff Moden (8/4/2008)
I wonder how they do running totals in data warehouses in SQL Server?
Several ways, that I have seen. One is to calculate the running totals during the load and store the results. Another is to let Analysis Services (or Cognos, if using that) generate the running totals in the cubes.
But, I don't work on that system here - so, don't really know for sure.
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
August 4, 2008 at 11:37 am
Jeff Moden (8/4/2008)
I wonder how they do running totals in data warehouses in SQL Server?
Either in pre-processing (the Load step of the ETL) or in post-processing (the Analysis client). Data Warehouses don't have interactive DML going on, so they do not have the same issues.
If you do it in the Load step with SQL, a windowed SUM() function with the new records, (adding in the last prior record) would probably be sufficient.
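For reference, the windowed SUM() mentioned above would be written like this (this form requires ORDER BY inside the OVER clause, which SQL Server only supports for aggregates starting with SQL Server 2012):

```sql
-- Running total via windowed aggregate: one ordered pass, no triangular join.
SELECT ID,
       Debit,
       Credit,
       SUM(Debit - Credit) OVER (ORDER BY ID
                                 ROWS UNBOUNDED PRECEDING) AS Balance
FROM dbo.Test;
```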
-- RBarryYoung, (302) 375-0451
blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
August 4, 2008 at 1:35 pm
rbarryyoung (8/4/2008)
Jeff Moden (8/4/2008)
I wonder how they do running totals in data warehouses in SQL Server?Either in pre-processing (the Load step of the ETL) or in post-processing (the Analysis client). Data Warehouses don't have interactive DML going on, so they do not have the same issues.
If you do it in the Load step with SQL, a windowed SUM() function with the new records, (adding in the last prior record) would probably be sufficient.
Exactly what I was trying to say - thanks for the clarification.
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs