SQL Function performance / design question

  • Hi there,

    I am trying to work out the best way of sorting out a performance problem with a set of procedures and functions I am working on.

    I have basically put most of the complicated processing in a set of SQL Functions which return tables.

    These are great in terms of maintainability and visibility. The only problem is they perform very badly.

    The reason behind this is that they process lots of data in temporary tables. I grab the data, insert it into a temporary table and then run a series of updates on it. This makes for neat code. However, because you can't use normal temporary tables (#tablename) inside a function, you have to use table variables (the table data type) instead, and the crunch is that these can't have indexes.
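
    Roughly the shape of what I am doing (the table and column names here are just placeholders, not my real schema):

    CREATE FUNCTION dbo.fn_GetOrderSummary (@CustomerId INT)
    RETURNS @Result TABLE
    (
        OrderId    INT,
        OrderTotal MONEY,
        Status     VARCHAR(20)
    )
    AS
    BEGIN
        -- grab the data into the table variable
        INSERT INTO @Result (OrderId, OrderTotal, Status)
        SELECT OrderId, OrderTotal, 'NEW'
        FROM dbo.Orders
        WHERE CustomerId = @CustomerId;

        -- then run a series of updates on it; no index is available on @Result
        UPDATE @Result
        SET Status = 'LARGE'
        WHERE OrderTotal > 1000;

        RETURN;
    END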

    I am now duplicating the functions as procs, but these are much more complicated to use because you can't query them directly, so the system ends up needing a major overhaul. DOH.
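
    To show what I mean (same made-up names as above), a function can be queried or joined like a table, whereas a proc's result set has to be captured first:

    -- the function can be used directly in a query:
    SELECT OrderId, OrderTotal, Status
    FROM dbo.fn_GetOrderSummary(42);

    -- the proc version has to be consumed with INSERT ... EXEC:
    CREATE TABLE #Summary (OrderId INT, OrderTotal MONEY, Status VARCHAR(20));
    INSERT INTO #Summary (OrderId, OrderTotal, Status)
    EXEC dbo.usp_GetOrderSummary @CustomerId = 42;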

    Can anyone see a better solution here?

    Thanks

    Matt

  • You're running into the basic problem with table variables: they don't have statistics. So, while coding with them looks very neat and clean, performance is a total bear. You're doing the right thing by working through stored procedures.

    If you're loading lots of data into temporary storage for processing, you might be hitting performance problems just because of that. For more detailed help you'd need to outline what it is you're trying to do.
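
    To sketch the difference (object names invented for the example), in a stored procedure you can use a real temporary table, which carries statistics and can be indexed, something along these lines:

    CREATE PROCEDURE dbo.usp_GetOrderSummary @CustomerId INT
    AS
    BEGIN
        CREATE TABLE #Result
        (
            OrderId    INT,
            OrderTotal MONEY,
            Status     VARCHAR(20)
        );

        CREATE INDEX IX_Result_OrderTotal ON #Result (OrderTotal);

        INSERT INTO #Result (OrderId, OrderTotal, Status)
        SELECT OrderId, OrderTotal, 'NEW'
        FROM dbo.Orders
        WHERE CustomerId = @CustomerId;

        -- the update can use the index, and the optimizer has statistics to work with
        UPDATE #Result
        SET Status = 'LARGE'
        WHERE OrderTotal > 1000;

        SELECT OrderId, OrderTotal, Status FROM #Result;
    END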

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Thanks Grant, I am glad I am going down the correct route now. The performance problems are easy to solve in procs, where you can use indexes, so I should be OK now. Thanks.
