temp table level of precision

  • Hi everyone

    I have been working on improving the run time for a query.

    The query does math calculations, so it produces integers for some parts but decimals for most.

    version 1 - uses physical tables to store temp results

    version 2 - uses #temp tables to store temp results

    I was looking at improving the run time of version 1 (the original query).  To make sure I didn't break anything when I created version 2, I compared the numbers from version 2 against version 1.  They were different.  It turns out that version 2 was closer to being accurate than version 1.  This completely stumped me because all I did was change where the temp results were stored.  There was no change to the actual logic.  Then it occurred to me that it must be something SS does when #temp tables are created.  Here is how I created the #temp tables:

    select t1.field1,
           t1.field2
    into #temp1
    from myTable t1

    I never explicitly indicated the level of precision for #temp1, so I think SS used some sort of default.  Is that what happened?  If so, what is the default?  How would I find out what my copy of SS uses for defaults?

    Thank you

  • I suggest that you compare the schemas of the temp tables with those of the permanent tables and you'll see for yourself.

    Alternatively, create the temp tables explicitly (with CREATE TABLE ...) using the same schema as your permanent tables, and the answers should match.
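
    Something along these lines, for example (the field names and the decimal(18, 6) type are just placeholders based on the first post; use whatever your permanent table actually defines):

    create table #temp1
    (
        field1 decimal(18, 6),   -- match the precision/scale of the permanent table
        field2 decimal(18, 6)
    )

    insert into #temp1 (field1, field2)
    select t1.field1,
           t1.field2
    from myTable t1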

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Here is a script for getting the definition of temp tables:

    https://www.sqlservercentral.com/scripts/get-temp-table-definition
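
    Not the linked script itself, but the core of that approach is just a query against the tempdb catalog views.  A minimal sketch, pointed at the #temp1 from the first post, would be:

    select c.name       as column_name,
           t.name       as type_name,
           c.max_length,
           c.precision,
           c.scale
    from tempdb.sys.columns c
    join tempdb.sys.types   t on t.user_type_id = c.user_type_id
    where c.object_id = object_id('tempdb..#temp1')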

  • water490 wrote:

    The query does math calculations

    You should explicitly cast the results to the type you want - presumably not float.
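
    For example, if the calculation looks anything like the illustrative expression below (the column names, the maths and decimal(18, 6) are all made up), the cast is what makes SELECT ... INTO create a decimal column instead of inheriting whatever type the expression works out to:

    select cast(t1.field1 * t1.field2 / 100.0 as decimal(18, 6)) as calc_result
    into #results
    from myTable t1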

  • thank you everyone!

    I used the script that Jonathan provided.  It turns out that the temp table was using float where I had used numeric.  Once I made that change, the numbers match.

  • water490 wrote:

    thank you everyone!

    I used the script that Jonathan provided.  It turns out that the temp table was using float where I had used numeric.  Once I made that change, the numbers match.

    I can almost guarantee that was NOT caused by the use of SELECT INTO to create the table unless your original table was changed and the Temp Table was not.
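
    Easy enough to verify with a throwaway test (the names here are made up).  A plain column reference keeps its exact type when SELECT ... INTO copies it:

    create table #source (val decimal(10, 4) not null)

    select val
    into #copy
    from #source

    select t.name as type_name, c.precision, c.scale
    from tempdb.sys.columns c
    join tempdb.sys.types   t on t.user_type_id = c.user_type_id
    where c.object_id = object_id('tempdb..#copy')

    The last query reports decimal with precision 10 and scale 4 (not float), so a type change like yours has to come from the expressions being stored, not from SELECT INTO itself.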

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden wrote:

    water490 wrote:

    thank you everyone!

    I used the script that Jonathan provided.  It turns out that the temp table was using float where I had used numeric.  Once I made that change, the numbers match.

    I can almost guarantee that was NOT caused by the use of SELECT INTO to create the table unless your original table was changed and the Temp Table was not.

    I have never used float for table definitions.  I am not sure why the INTO command I used ended up with that data type.  To be honest, it did improve the accuracy so I am happy with that.  It is just a mystery to me how/why SS decided on float.

  • water490 wrote:

    I have never used float for table definitions.  I am not sure why the INTO command I used ended up with that data type.

    You did some calculations without thinking about scale and precision.
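
    For what it's worth, float usually creeps in through the expressions themselves: functions such as SQRT, LOG and EXP return float, and because float sits above decimal in data type precedence, a single float operand turns the whole expression (and therefore the SELECT ... INTO column) into float.  sp_describe_first_result_set will show the type each expression produces before any table is created; a self-contained illustration:

    exec sys.sp_describe_first_result_set
        N'select sqrt(cast(2 as decimal(10, 4)))                       as a,
                 cast(1 as decimal(10, 4)) / cast(3 as decimal(10, 4)) as b'

    Column a comes back as float (SQRT returns float); column b stays decimal, just with a precision and scale derived by the division rules.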

  • Ken McKelvey wrote:

    water490 wrote:

    I have never used float for table definitions.  I am not sure why the INTO command I used ended up with that data type.

    You did some calculations without thinking about scale and precision.

    When I first wrote the queries, I was a complete rookie.  Now I'm much more experienced, so I'm revisiting all the queries.
