July 13, 2005 at 7:32 am
I have a column with a data type of real. Until now this column was filled in automatically by the application (not user-entered) and was rounded to 4 decimal places. The user has the option to override this value on the screen and would normally enter a number with 2 decimal places.
The business rules have now changed: the application will no longer auto-calculate - the user has to enter this value.
Is there a way to find the difference between a user-entered value and an auto-calculated entry?
For example, I changed the value in the application to 0.60, but when I look up the table, it shows as 0.6000002.
I have to clear all auto-calculated values in the table.
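What the poster is seeing is single-precision rounding: SQL Server's real is a 32-bit IEEE 754 float, and 0.6 has no exact binary representation. A short Python sketch reproduces the stored value (the to_real helper is just an illustration of the 32-bit round-trip, not anything SQL Server exposes):

```python
import struct

def to_real(x: float) -> float:
    """Round-trip a Python float through 32-bit IEEE 754,
    mimicking how SQL Server's real type stores a value."""
    return struct.unpack('f', struct.pack('f', x))[0]

# 0.6 cannot be represented exactly in binary; the nearest
# single-precision value is what the table actually holds:
print(to_real(0.6))  # 0.6000000238418579
```

Displayed to 7 significant digits that is the 0.6000002 the poster sees.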
July 13, 2005 at 7:49 am
First of all, use decimal(18,4) instead of real. Real is not a precise datatype (as you can see with 0.6, which is stored as 0.6000002).
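The same exact-vs-approximate distinction exists outside SQL Server; Python's decimal module makes the contrast easy to see, which is why decimal(18,4) would keep 0.60 as exactly 0.60:

```python
from decimal import Decimal

# Binary floating point accumulates representation error:
print(0.1 + 0.2)                        # 0.30000000000000004

# Decimal arithmetic keeps the exact digits that were entered:
print(Decimal('0.1') + Decimal('0.2'))  # 0.3
```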
Second, this is going to be hard, because a user might have overridden the calculated value with the same value, meaning that you could be erasing valid data. I think you'd be better off having the server recalculate every value, if possible, and leaving the users out of this one.
July 13, 2005 at 8:07 am
I did a SELECT (column - 0.60) and the result was 0. Should I not get 0.0000002?
July 13, 2005 at 8:16 am
No, because the values are APPROXIMATE. If you want the exact value to be kept in the database, you must use the DECIMAL datatype. Check Books Online for more details.
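The subtraction comes back as 0 because the literal 0.60 is converted to the same approximate type as the column before the subtraction, so both sides round to the identical single-precision value. A sketch of the two cases (again using a hypothetical to_real helper for the 32-bit round-trip):

```python
import struct

def to_real(x: float) -> float:
    """32-bit IEEE 754 round-trip, mimicking SQL Server's real type."""
    return struct.unpack('f', struct.pack('f', x))[0]

stored = to_real(0.60)         # what the real column actually holds

# Literal converted to real first: both operands are the same value.
print(stored - to_real(0.60))  # 0.0

# Literal kept at higher precision: the representation error shows up.
print(stored - 0.60)           # a tiny nonzero difference
```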
July 13, 2005 at 11:07 pm
Yep, both real and float can sometimes give weird decimal places.
Another option is the numeric type, which behaves the same as decimal.
July 13, 2005 at 11:27 pm
Also, be wary of any implicit conversion that may be going on as part of your calculation. It's always best to explicitly cast static data used in calculations.
--------------------
Colt 45 - the original point and click interface