March 19, 2012 at 7:01 pm
"It isn't fraud. It is just math."
As a mathematician I strongly object to that. Math isn't the problem. Unreasonable requirements (violating basic mathematics) are.
Either the VAT should be calculated against the detail items, in which case it is rounded first to obtain the actual VAT amounts per item, which are then added and will match the total by definition; or it is calculated against the total, in which case the breakdown per item is a meaningless artefact (and if you need to specify it at all, it can be rounded and does not have to add up to the total). That's the math of it.
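To make the two options concrete, here is a minimal T-SQL sketch; the @InvoiceLine table, the 7% rate and the amounts are invented purely for illustration:

-- hypothetical invoice lines (illustration only)
DECLARE @InvoiceLine TABLE (LineID INT, NetAmt MONEY);
INSERT INTO @InvoiceLine VALUES (1, 33.33), (2, 33.33), (3, 33.33);
DECLARE @VATRate DECIMAL(5,4) = 0.0700;     -- assume 7% VAT

-- Option 1: round the VAT per detail line, then add; the lines match the total by definition
SELECT SUM(ROUND(NetAmt * @VATRate, 2)) AS VATFromLines      -- 6.99
FROM @InvoiceLine;

-- Option 2: calculate the VAT against the invoice total; any per-line breakdown is informational only
SELECT ROUND(SUM(NetAmt) * @VATRate, 2) AS VATFromTotal      -- 7.00
FROM @InvoiceLine;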
March 19, 2012 at 7:07 pm
If I had a penny for every time a business user came up with an "unreasonable requirement," I wouldn't need fudge rounding to be a millionaire by now!
My thought question: Have you ever been told that your query runs too fast?
My advice:
INDEXing a poor-performing query is like putting sugar on cat food. Yeah, it probably tastes better but are you sure you want to eat it?
The path of least resistance can be a slippery slope. Take care that fixing your fixes of fixes doesn't snowball and end up costing you more than fixing the root cause would have in the first place.
Need to UNPIVOT? Why not CROSS APPLY VALUES instead?
Since random numbers are too important to be left to chance, let's generate some!
Learn to understand recursive CTEs by example.
March 19, 2012 at 7:12 pm
"So is this fraud?"
There is a difference between rounding because of "technical restrictions" such as the ones you mention and fudging for no good reason other than to keep ignorant accountants and/or managers happy. Nevertheless, even this type of "inevitable" rounding would need to be agreed upon (e.g. how many decimal places), or I would also consider it fraud (or at least "wrong").
"Fraud is technically a legal term and I'd be surprised if this approach violates GAAP"
I can't and don't want to argue with that. But scientifically (and I think also morally) you are doing the wrong thing.
March 19, 2012 at 7:20 pm
There are many cases where penny rounding simply must be performed. Many times in billing applications, a fee will be calculated on a tiered fee schedule based on the aggregate assets of each account in a billing relationship (e.g. a husband's savings account, a wife's savings account and each of their 401(k) accounts). The resultant fee then has to be allocated back to the contributing accounts. Typically, the fee amount is large enough that nobody will look for the accounts where an amount was rounded up to account for whole pennies even though, mathematically, it should have been rounded down. Occasionally, you will have a setup where it stands out (e.g. a fee of $1.00 allocated across three accounts where each account contributed the same market value amount).
March 19, 2012 at 7:36 pm
"The resultant fee then has to be allocated back to the contributing accounts"
That's fine if everybody agrees on a necessarily arbitrary allocation method (or result) and then applies it. But wouldn't each of them have the (moral, perhaps not legal) right to disagree with an imposed random fudge, which is the context of the article?
March 19, 2012 at 7:44 pm
"Fudge" is a word I would try to avoid - it is hard to make it sound like a good thing. A well-designed penny-rounding method should minimize the difference between the final value and the most correct value mathematically. So, while I would expect cases of values being a penny more or a penny less than the actual rounded allocated amount, if the discrepancies were much larger, then I would expect some people to raise questions. I have seen cases of people questioning a difference of pennies for a total invoice of thousands or tens of thousands of dollars.
March 20, 2012 at 6:52 am
I worked for 14 years as a Programmer and BSA at a 401(k) recordkeeping company (third-party administrator). We had to use the "Penny Toggle" method. Basically, it meant figuring out which account is the last one to share and subtracting the accumulated split amount from the total split amount. In addition, it rounded one amount up, the next one down, and so on. Therefore, at the end you are never more than one penny up or down, and you use the subtraction just described to back into the last amount. It was used for any allocation situation, from splitting contribution amounts between investment elections within the same participant, to allocating earnings for a whole fund over all participants with a balance in that fund.
From what I understood, this was an acceptable method of allocation (from an accounting standpoint). There was a huge proof that went with it when allocating amounts over all participants, but I did not include it here.
Note that all dollar amounts were kept as integer numbers. For example, $100.99 would be kept as 10099.
Here is an example of the process:
! split the amounts by investment elections
!
DECLARE LONG PCT(MAX_FUNDS), TRANS_AMT(MAX_FUNDS), TOT_PCT, ACC_BASE, ACC_EARN, TOT_AMT, EARNINGS_AMOUNT
DECLARE GFLOAT ACC_UNRD
!
! split the earnings amount over the funds according to investment election
!
FOR X = 0% TO MAX_FUNDS
    TRANS_AMT(X), PCT(X) = 0%
NEXT X
!
TOT_PCT = 0
FOR X = 1% TO NUM_FUNDS                     ! NUM.FUNDS
    FUND = X                                ! FUNDS(X)
    PCT(FUND) = ???
    TOT_PCT = TOT_PCT + PCT(FUND)
NEXT X
! Check TOT_PCT to make sure it is 100
!
ACC_EARN, ACC_BASE, ACC_UNRD = 0
TOT_AMT = ?                                 ! total amount to split
!
FOR X = 1% TO NUM_FUNDS
    FUND = FUNDS(X)
    IF PCT(FUND) > 0 THEN                   ! MAKE SURE HE HAS A BASE IN THIS ACCOUNT AND FUND
        IF ACC_BASE + PCT(FUND) = TOT_PCT THEN
            ! THIS SITUATION IS THE LAST FUND TO SHARE, SO JUST SUBTRACT
            ! THE ACCUMULATED AMOUNT FROM THE TOTAL TO SPLIT
            EARNINGS_AMOUNT = TOT_AMT - ACC_EARN
        ELSE
            ACC_UNRD = ACC_UNRD + (TOT_AMT * PCT(FUND) * .01)
            EARNINGS_AMOUNT = ROUND(ACC_UNRD) - ACC_EARN
        END IF
        !
        ACC_EARN = ACC_EARN + EARNINGS_AMOUNT       ! ACCUMULATE TOT EARNINGS ALLOCATED
        ACC_BASE = ACC_BASE + PCT(FUND)             ! ACCUMULATE BASES USED
        !
        TRANS_AMT(FUND) = EARNINGS_AMOUNT           ! set this fund's amount
        TRANS_AMT(0) = TRANS_AMT(0) + EARNINGS_AMOUNT   ! sum them up for totals
    END IF                                  ! BASE > 0
NEXT X
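For anyone who would rather see the same idea in the T-SQL terms used elsewhere in this thread, here is a minimal sketch of the cumulative round-and-subtract step that makes the toggle self-correcting. The @Split table, the column names and the numbers are mine, invented for illustration; this is not the article's code or the original system's code.

-- allocate 10000 cents ($100.00) across three elections of roughly one third each
DECLARE @TotalCents INT = 10000;
DECLARE @Split TABLE (FundID INT, Pct DECIMAL(9,4));
INSERT INTO @Split VALUES (1, 33.33), (2, 33.33), (3, 33.34);

SELECT  s.FundID,
        s.Pct,
        -- each fund gets ROUND(cumulative share) minus what has already been handed out,
        -- so rounding errors never accumulate and the last fund absorbs the remainder
        CAST(ROUND(@TotalCents * c.CumPct / 100.0, 0)
           - ROUND(@TotalCents * (c.CumPct - s.Pct) / 100.0, 0) AS INT) AS AllocCents
FROM @Split s
CROSS APPLY (SELECT SUM(Pct) AS CumPct FROM @Split s2 WHERE s2.FundID <= s.FundID) c;
-- returns 3333, 3333, 3334 cents, which add back to exactly 10000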
March 20, 2012 at 7:16 am
My experience has been similar to Tom John's. Consider two printed reports/invoices that apply no penny rounding to the underlying data. One is a summary page and the other is a detail page. In a basic, even 1/3 split of $100 (33.33...% each), if you add all of the details by hand, you will get a total of $99.99 because the printed invoice goes down to the penny. On the other hand, the printed summary page has a total of $100.
Prior to implementing a penny-rounding approach, some of our more meticulous clients rejected or questioned invoices, even when the invoice totals were very large and the total delta was only a few pennies.
When you try to explain why it is occurring, some clients appreciate the reason and the solution; others don't. But at the end of the day, both want invoicing reports that are consistent with each other. IIRC, I thought it was odd that reports directly out of our accounting package never had this issue... which tells me that accounting software must include some type of penny-rounding feature as well, so that the sum of the details adds up to the header.
Note that this issue isn't limited to financial data; you can probably come up with a similar story in other contexts. It's just that money is the context in which many of us see it.
(edited for grammar)
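The discrepancy described above is easy to reproduce. A tiny sketch (the table and the numbers are made up):

-- three even thirds of $100.00, carried to six decimal places internally
DECLARE @Detail TABLE (LineID INT, Amt DECIMAL(19,6));
INSERT INTO @Detail VALUES (1, 33.333333), (2, 33.333333), (3, 33.333333);

SELECT  SUM(ROUND(Amt, 2)) AS DetailPageTotal,   -- 99.99: what the detail lines add up to
        ROUND(SUM(Amt), 2) AS SummaryPageTotal   -- 100.00: what the summary page prints
FROM @Detail;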
March 20, 2012 at 7:35 am
vikingDBA (3/20/2012)
... the "Penny Toggle" method. Basically, it meant figuring out which is the last one to share, and subtracting the accumulated split amount from the total split amount. In addition, it rounded one up, the next one down, and so on. Therefore, at the end, you are no more than 1 penny up or down, so you do the subtraction method just described to back into the last amount. It was used for any allocation situation, from splitting contribution amounts between investment elections within the same participant, to allocating earnings for a whole fund over all participants with a balance in that fund.
Interesting. Thanks for sharing vikingDBA! I'll have to keep this in mind as yet another option.
March 20, 2012 at 6:40 pm
"Fudge" is a word I would try to avoid - it is hard to make it sound like a good thing.
I don't like to use the term "penny rounding" for 3 reasons:
1. As the example shows, the algorithm can be applied to any precision of decimal digits. In my example it is dollars.
2. Penny rounding is a highly "US-centric" term. In Thailand, where I live, the currency units are baht/satang, with the smallest coin being 25 satang. So the algorithm can be adjusted to round to the nearest 25 satang (0.25 baht), like this (see the small sketch after this list):
ROUND(4 * CAST(h.Amount * d.Weight / w.Weight AS Money), 0) / 4 AS AllocAmt
3. Only data typing restricts the algorithm to currency units. The allocation could be on any physical unit.
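For what it's worth, a tiny illustration of the 25-satang rounding from point 2 (the amount is made up):

DECLARE @Amt MONEY = 123.37;                      -- baht
SELECT ROUND(4 * @Amt, 0) / 4 AS Nearest25Satang; -- 123.25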
Besides I like fudge. It's almost the only truly American confection and it is really delicious!
And if anyone has an issue with "Fudge Rounding", encapsulate it in a function you name "FR." Then if anyone asks you tell them that it means Financial Rounding. 😀
March 20, 2012 at 9:52 pm
Post deleted as wrong info provided.
March 21, 2012 at 2:33 am
Not sure what you mean by Weight in the Detail table. Is that the % of allocation, or the actual quantity to be allocated? And what do you mean by 33 in the detail table?
March 21, 2012 at 3:48 am
Weight (in the detail table) could be anything. Maybe an example will help.
Let's say you want to allocate a corporate Overhead charge to different departments based on their overall cost of doing business. Two departments:
Dept 1 - Total cost of business = 33
Dept 2 - Total cost of business = 45
To allocate an overhead charge of 100 to these two departments the calculation is as follows:
Dept 1 - 33 * 100 / 78 = 42
Dept 2 - 45 * 100 / 78 = 58 (+0 from Fudge Rounding)
33 was just an example number.
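Expressed in T-SQL (the table and column names here are just illustrative, not the article's exact schema), the proportional part of that calculation is:

-- made-up overhead allocation: 100 split across two departments by cost of business
DECLARE @Overhead MONEY = 100;
DECLARE @Dept TABLE (DeptID INT, CostOfBusiness MONEY);
INSERT INTO @Dept VALUES (1, 33), (2, 45);

SELECT  d.DeptID,
        d.CostOfBusiness AS Weight,
        ROUND(@Overhead * d.CostOfBusiness / t.TotalCost, 0) AS AllocAmt   -- 42 and 58
FROM @Dept d
CROSS JOIN (SELECT SUM(CostOfBusiness) AS TotalCost FROM @Dept) t;
-- here the rounded amounts already add back to 100, so the Fudge Rounding adjustment is 0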
March 21, 2012 at 5:47 am
Unfortunately this doesn't play as nicely when the sum of the allocations causes an overage. There are cases where you either need to subtract from a few, or subtract from all and add to a few (when the per-allocation decimal is > 0.5). Try 100 divided 6 ways: it fudges it to 17 for all items, resulting in a total of 102.
Add this to @Hdr:
UNION ALL SELECT 'MNO', 100
Add this to @Dtl:
UNION ALL SELECT 'MNO', '1', 16
UNION ALL SELECT 'MNO', '2', 16
UNION ALL SELECT 'MNO', '3', 16
UNION ALL SELECT 'MNO', '4', 16
UNION ALL SELECT 'MNO', '5', 16
UNION ALL SELECT 'MNO', '6', 16
I haven't come up with a nice CTE way to do this yet. I know it is possible. 🙂 I did it in a procedure with temp tables, but it wasn't as elegant.
March 21, 2012 at 6:41 am
Mr. edobrzel,
When I run with your MNO data I get this result set, which adds to 100:
key1   key2   Weight   AllocAmt
MNO    1      16.00    15.00
MNO    2      16.00    17.00
MNO    3      16.00    17.00
MNO    4      16.00    17.00
MNO    5      16.00    17.00
MNO    6      16.00    17.00