May 14, 2004 at 1:40 pm
I have a UDF bound to a field in a table. The UDF calculates the value of the field, and the two fields it uses are defined as float with three places after the decimal. The problem is that the output of my UDF rounds off the decimals and displays only whole numbers (the left side of the decimal).
I defined the variables as numeric with a precision of 2, but the decimals are still rounded off when I query the calculated field. What is the right way to maintain the precision of the result? Is it defined in the CREATE TABLE command or in the UDF? At present, if depth1 is 4.1 and depth2 is 10, the result from the UDF is 6 (5.9 rounded off). I would like 5.9 displayed in the field.
Here is the field definition in the CREATE TABLE command:
[Elevation] AS ([dbo].calc_feat_elevation([depth1], [depth2])),
[depth1] [float] NULL,
[depth2] [float] NULL
Here is the UDF:
CREATE FUNCTION dbo.calc_feat_elevation (@depth1 numeric(9,2), @depth2 numeric(9,2))
RETURNS numeric(9) AS
BEGIN
RETURN (@depth2 - @depth1)
END
Thanks for any help to do this properly.
May 14, 2004 at 2:24 pm
A simple mistake, if you look closely at this line:
RETURNS numeric(9) AS
The problem is that you defined no scale, so the default scale of 0 is used, and the result is rounded to a whole number.
Instead, change it to something like this:
RETURNS numeric(9,2) AS
and all should be fine.
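Putting it together, the corrected function would look something like this (a sketch based on the fix above, reusing the (9,2) precision and scale from the original post):

```sql
-- Declaring both precision AND scale on the return type preserves the decimals.
CREATE FUNCTION dbo.calc_feat_elevation (@depth1 numeric(9,2), @depth2 numeric(9,2))
RETURNS numeric(9,2)  -- numeric(9) alone means numeric(9,0), which rounds to whole numbers
AS
BEGIN
    RETURN (@depth2 - @depth1)  -- e.g. 10.00 - 4.10 = 5.90
END
```

With this change, selecting the computed Elevation column for depth1 = 4.1 and depth2 = 10 should return 5.90 rather than 6.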