May 20, 2003 at 1:12 pm
Greetings all.
I have a search proc which returns n rows. Could be 1, could be 100,000,000.
I currently display 20 per page. The problem I face is that it is now starting to slow down and hit the SQL box badly when returning many rows.
How I do it is run the query and insert the results into a temp table.
If the user wants rows 1-20, I SELECT TOP 20 out of the table, then truncate and drop it.
If the user wants 20-40 (etc.), I run SET ROWCOUNT 19, DELETE FROM #Temp, then SET ROWCOUNT 20, SELECT * FROM #Temp.
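For anyone following along, here is a rough sketch of the approach described above. All table and column names (#Temp, SomeKey, BigTable) are illustrative, not from the original proc, and the exact ROWCOUNT for the delete depends on your page-boundary convention:

```sql
-- Fill the temp table with the full search result (illustrative names).
SELECT KeyCol, Col1
INTO #Temp
FROM BigTable
WHERE Col1 LIKE 'something%'   -- stand-in for the real search criteria
ORDER BY KeyCol

-- Page 2: throw away the rows before the page, then read the page itself.
SET ROWCOUNT 19
DELETE FROM #Temp              -- discard the rows already shown

SET ROWCOUNT 20
SELECT * FROM #Temp            -- return the next 20 rows

-- Reset so later statements are unaffected, then clean up.
SET ROWCOUNT 0
DROP TABLE #Temp
```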
This works fine for a couple hundred or thousand rows, but some of the searches use the last couple of pages. That causes anything from 8,000 to 100,000 rows to be moved into the temp table with only the last 20 used. VERY SLOW!
I tried copying the PK from the temp table into a second temp table and deleting from there, thinking it would be faster since SQL did not have to ditch so much data, then selecting out the 20 required and joining back onto the main temp table.
This did help a bit, but it's not ideal. It still pegs the processor and disks.
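The two-temp-table variant might look something like this. Again the names (#Temp, #Keys, SomeKey) are illustrative; the idea is that deletes against a narrow key-only table move far less data than deletes against the full result set:

```sql
-- Copy just the PK into a narrow second temp table.
SELECT SomeKey
INTO #Keys
FROM #Temp

-- Discard the keys for the pages already skipped (page 2 shown here).
SET ROWCOUNT 20
DELETE FROM #Keys

-- Take the next 20 keys and join back to the wide temp table.
SET ROWCOUNT 20
SELECT t.*
FROM #Temp t
JOIN #Keys k ON k.SomeKey = t.SomeKey

SET ROWCOUNT 0
```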
Any other ways of doing it?
Thanks
Cheers,
Crispin
May 20, 2003 at 2:30 pm
Hi,
We had a similar performance problem and ended up adding an identity column to the table so that we could get the "N"th set of rows directly from the base table...
I feel this sort of "adding columns just to make things easy and avoid taxing the brain" mentality is not very good, but I couldn't find a better alternative...
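A sketch of the identity-column idea, assuming the temp-table setup from the first post (names are illustrative; note that on SQL Server 2000, ORDER BY in a SELECT ... INTO does not strictly guarantee identity assignment order, so treat this as a rough outline):

```sql
-- Number the rows while filling the temp table, so any page can be
-- fetched directly by row-number range instead of deleting rows.
SELECT IDENTITY(INT, 1, 1) AS RowNum, KeyCol, Col1
INTO #Temp
FROM BigTable
WHERE Col1 LIKE 'something%'   -- stand-in for the real search criteria
ORDER BY KeyCol

-- Fetch page @Page at 20 rows per page.
DECLARE @Page INT
SET @Page = 3
SELECT *
FROM #Temp
WHERE RowNum BETWEEN (@Page - 1) * 20 + 1 AND @Page * 20

DROP TABLE #Temp
```

The win is that fetching the last page no longer requires deleting tens of thousands of rows first; it is a single range seek on RowNum.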