"Per-user" temp table

  • Well, if you are not sure how they will search the data, what happens when they specify a non-selective condition? For example, something like "gender = 'F'" or "field LIKE 'a%'" would pull all the data from BigTable into a #tempTable that is also big, putting extra load on tempdb. Since it is a new table and it is big, you would also need to build indexes on it, or every search will be a scan. And which indexes? Since you have no idea what the user is going to do next, that can be a hard question.

    What I'm trying to say is that for an abstract task like this there are a lot of pitfalls, imho.

    Of course, you know your work patterns and data better, so if you don't see any problems, try implementing it using my suggestion from the first post: create the temp table outside the proc with a direct command (see the sketch below).

    It will also be interesting to hear how well this solution works in the real world.

    Good luck!
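
    Roughly, the direct-command approach looks like this - a minimal sketch only, and the table and proc names here are placeholders, not anything from your schema:

    -- Create the temp table in the session, outside the proc.
    -- Local temp tables are visible to any proc this session calls.
    CREATE TABLE #SearchResults
    (
        RecordID int NOT NULL PRIMARY KEY
    );

    -- A (hypothetical) proc fills the caller's #SearchResults with the
    -- first, user-restricted selection from BigTable.
    EXEC dbo.usp_InitialSearch @UserFilter = N'...';

    -- Back in the session, the narrowed set is ready for further probing.
    SELECT COUNT(*) FROM #SearchResults;

    -- When the connection closes, #SearchResults is dropped automatically,
    -- so nothing has to be purged by hand.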


    I am really sorry for my poor grammar. I hope the value of my answers will outweigh the harm to your eyes.
    Blog: http://somewheresomehow.ru
    Twitter: @SomewereSomehow

  • Lowell (8/9/2012)


    What about going ahead and using either a permanent or a global temp table, but adding an extra column with some identifier for the user or session, i.e. the value of @@SPID, or the login name, or something?

    That's a bit of a kludge, as you'd need to filter the results of the temp table on spid = @@SPID or something.

    Yes, that would work, but again - I'm trying to reduce the number of records to scan. The search criteria in these cases are usually so messy that SQL Server will wind up doing a table scan anyway, so I'm trying to give it a smaller table to scan, especially for repeated probes: "Okay, I've got my set down to 5,000 possibilities, now let's try...", where the user slowly investigates additional layers of filtering and, when something looks reasonable, applies it to the temp table, making it "... now 2,000 records, let's try filtering on ..." and so on (sketched below). If I stick everything into one big table, I'm not gaining anything over just scanning the source.

    And as before, there is no convenient and reliable mechanism for purging records when the user disconnects.
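
    To make the refinement loop concrete - a sketch only, with invented column names:

    -- Probe: ~5,000 candidates left; try another layer of filtering.
    SELECT * FROM #SearchResults WHERE LastName LIKE 'a%';

    -- That looks reasonable, so make it stick: delete everything that failed.
    DELETE FROM #SearchResults WHERE LastName NOT LIKE 'a%';

    -- The next probe scans only the ~2,000 survivors.
    SELECT * FROM #SearchResults WHERE Notes LIKE '%stuff%';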

  • SomewhereSomehow (8/9/2012)


    ... or every search will be a scan. ...

    Yes, it will almost always be a scan in these cases - that's the problem. I'm handling the edge cases here, when the user was unable to find what he needed with one of the standard fast searches. At that point the user is feeling his way around in the dark, trying all sorts of things. Since it will be a table scan regardless, I'm trying to cut down the number of records that must be scanned. Indexes would be pointless - they would be used once or twice, if at all, so building one is more time and trouble than it's worth - and "... WHERE fld1 LIKE '%stuff%' " is not going to benefit from an index anyway (see the sketch at the end of this post).

    Of course, you know your work patterns and data better, so if you don't see any problems, try implementing it using my suggestion from the first post: create the temp table outside the proc with a direct command.

    It will also be interesting to hear how well this solution works in the real world.

    Good luck!

    That does seem like the clearest path - thank you for the explanation of how to do it. I'll tack my results onto this thread when I see how it works out; it'll probably be a day or so.
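
    For completeness, this is the shape of predicate that defeats any index - with a leading wildcard there is no key range to seek on, so even an indexed column gets scanned, which is why shrinking the table is the only real lever (fld1 is the hypothetical column from above):

    -- Always a scan, indexed or not: the leading '%' rules out a seek.
    SELECT RecordID, fld1
    FROM #SearchResults
    WHERE fld1 LIKE '%stuff%';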

  • Have fun! 🙂

  • If you are using Access for the application side, why don't you pull the original dataset into Access and work to refine it there, instead of trying to track a temp table in SQL Server?

    Seems like that would give you the ability to limit access to the individual user.

    Just a thought, not knowing exactly how you are using Access.

  • jerry-621596 (8/9/2012)


    If you are using Access for the application side, why don't you pull the original dataset into Access and work to refine it there, instead of trying to track a temp table in SQL Server?

    Seems like that would give you the ability to limit access to the individual user.

    Just a thought, not knowing exactly how you are using Access.

    Hm, that's not a bad idea. Maybe do it for the initial population, where I restrict the first selection to the user's records (roughly the shape sketched below). Shoveling that many records across a network isn't usually the best plan, but all users here are one gigabit switch away from the server, and the network is quite lightly loaded. I'll look at that - thanks for the nudge.
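
    For the record, that initial pull might look something like this, issued from Access as a pass-through query - BigTable is from the thread, but OwnerLogin and the column list are invented for illustration:

    -- Runs server-side; only the user's own slice crosses the wire,
    -- and all further refinement happens locally in Access.
    SELECT RecordID, fld1, Notes
    FROM dbo.BigTable
    WHERE OwnerLogin = SUSER_SNAME();  -- current login; adjust to your ownership scheme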
