December 15, 2003 at 8:21 pm
I'm trying to "optimize" some code I have running in CF, where I'd either like to cut the number of trips to the SQL Server, or make multiple trips, since the SQL could be slow
Currently I'm using code like:
select fields,
    (select field
     from table2
     where statements) as FieldCheck
from table1
where statements
Is this a good or bad idea?
TIA
December 15, 2003 at 8:34 pm
Subqueries shouldn't make a difference. That query returns a single result set, not two result sets that have to be processed on the client.
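For comparison, a correlated subquery like that can often be rewritten as a join, which gives the optimizer more options. A sketch only, reusing the placeholder names from the original post (table1, table2, FieldCheck); the join condition is an assumption standing in for the real "where statements":

```sql
-- Hypothetical JOIN rewrite of the correlated subquery above.
-- key_column is an assumed join key, not the poster's actual schema.
select t1.fields,
       t2.field as FieldCheck
from table1 t1
left join table2 t2
    on t2.key_column = t1.key_column
where statements
```

A LEFT JOIN preserves rows of table1 that have no match in table2 (FieldCheck comes back NULL), which matches how a scalar subquery behaves when it finds no row.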
December 15, 2003 at 8:41 pm
I know it returns a single result set, but for speed purposes, are they a good idea?
I've not done much testing, as I know results can differ a lot on live data compared to the test data I'm using (plus load and such)
I was just wondering whether they are horrid for performance or pretty good
December 17, 2003 at 6:15 am
It can depend on the exact nature of the subquery: if you have a sort of some kind, for instance, it will completely change the query plan. Sometimes the plan is the same for the join as for the subquery. Very occasionally it's quicker to build a #table and index it!
The only way to be sure is to check your plan in Query Analyzer every time.
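The #table approach mentioned above can be sketched roughly like this; all object and column names here are illustrative assumptions, not the poster's schema:

```sql
-- Materialize the lookup once into a temp table, index it, then join.
select key_column, field
into #lookup
from table2
where statements

create index ix_lookup on #lookup (key_column)

select t1.fields, l.field as FieldCheck
from table1 t1
join #lookup l
    on l.key_column = t1.key_column
where statements

drop table #lookup
```

Whether this beats the subquery or the plain join depends entirely on the plan, which is why checking it in Query Analyzer each time is the only reliable test.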
Keith Henry
DBA/Developer/BI Manager
December 18, 2003 at 7:00 am
After some testing, I've taken a "sideways" step on this:
I've added fields to the main table, then run triggers on the sub-tables that update the main table.
The query is now about 10 times faster
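For the record, that kind of trigger-maintained denormalization might look something like the following. This is a minimal sketch assuming a single cached column and a simple join key; every name here is hypothetical:

```sql
-- Keep a cached FieldCheck column on table1 in sync whenever
-- table2 changes. Only inserts/updates are handled here; a delete
-- on table2 would need a separate branch using the deleted table.
create trigger trg_table2_sync on table2
after insert, update
as
begin
    set nocount on

    update t1
    set FieldCheck = i.field
    from table1 t1
    join inserted i
        on i.key_column = t1.key_column
end
```

The read query gets faster because the subquery disappears, but every write to table2 now pays the trigger's cost, so the trade-off depends on the read/write mix.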
December 29, 2003 at 12:50 pm
I think I would keep working at finding another solution. It may be that the impact of a de-normalized table and the overhead of the triggers doesn't affect you right now, but going this route rather than reworking the query seems excessive. I don't know your situation well enough to say; it's just that this is one of those "red flag" things...