June 5, 2007 at 2:26 am
Morning All,
I have a need to find the 5th and 6th lowest values in a row of numerical data so that I can perform a calculation on them.
Currently I transpose the data, i.e. turn the row into a column, and then, using a row count and ordering, I can get the values I need.
This works well, but if I could get rid of the transpose step it would obviously be quicker.
Does anybody have any ideas or pointers on how to go about it? If someone had a prewritten function/SP, that would be sensational.
Thanks
Will
June 5, 2007 at 3:26 am
Something like this?
CREATE TABLE #sillyexample (sillyint INT)
GO
INSERT INTO #sillyexample (sillyint)
SELECT 101
UNION SELECT 202
UNION SELECT 303
UNION SELECT 404
UNION SELECT 505
UNION SELECT 606
UNION SELECT 707
UNION SELECT 808
UNION SELECT 909
GO
SELECT sillyint FROM #sillyexample ORDER BY sillyint
-- Inner query keeps the 6 lowest values; the outer query then returns the
-- largest 2 of those, i.e. the 5th and 6th lowest overall.
SELECT TOP 2 sillyint
FROM
(SELECT TOP 6 sillyint FROM #sillyexample ORDER BY sillyint ASC) sillyinline
ORDER BY sillyint DESC
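If you're on SQL Server 2005, ROW_NUMBER() expresses the same idea more directly, and it doesn't depend on nesting two TOP clauses. A sketch against the same temp table, assuming you want the 5th and 6th lowest values:

```sql
-- Rank every value from lowest to highest, then keep ranks 5 and 6.
SELECT sillyint
FROM (
    SELECT sillyint,
           ROW_NUMBER() OVER (ORDER BY sillyint) AS rn
    FROM #sillyexample
) ranked
WHERE rn IN (5, 6)
```

The derived-table wrapper is needed because ROW_NUMBER() can't appear directly in a WHERE clause.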
David
If it ain't broke, don't fix it...
June 5, 2007 at 3:29 am
Oops, just realised you said a row.
Still, if you put the contents of the row into a temporary table, you can use the technique above to return the two values you require.
Is your data in separate columns, or is it in a single column?
David
If it ain't broke, don't fix it...
June 5, 2007 at 3:40 am
Thanks David, I was just about to point out that it's a row (or multiple rows) and you beat me to it.
What technique would you use to turn the rows into columns? I use 2 nested cursors (I know, shoot me): the outer to loop through the row values and the inner to add the row values as column values using dynamic SQL. The thing is, the rows are not of equal size: some have 500 fields, others 480, etc.
This works, but it's for a dynamic report, so I'm now trying to improve the response time.
All suggestions gratefully received.
W
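For the transpose itself, SQL Server 2005's UNPIVOT is a set-based alternative to the nested cursors. A minimal sketch, with hypothetical names (#widetable, and c1, c2, c3 standing in for the 500 value columns):

```sql
-- Turn one wide row per Id into narrow (Id, colname, value) rows.
SELECT Id, colname, value
FROM (SELECT Id, c1, c2, c3 FROM #widetable) w
UNPIVOT (value FOR colname IN (c1, c2, c3)) AS u
```

Since the rows are not of equal width (500 columns vs 480), the IN list would still need to be built with dynamic SQL from the table's column metadata, but a single UNPIVOT per table replaces the row-by-row cursor loop. Note that UNPIVOT silently drops NULL values, which may or may not be what you want for missing fields.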
June 7, 2007 at 6:00 am
500 COLUMNS????
--Jeff Moden
Change is inevitable... Change for the better is not.
June 7, 2007 at 8:01 am
Yes Jeff, 500 columns; that was my initial reaction too.
One column is an Id column and the other 500 are financial risk values for the last 2 years, generated by another system, that pertain to that Id. These need to be grouped and summed on the fly, and other operations performed on the vector of data.
I've gone back to my original solution and concentrated on improving my stored procs and indexes to enhance performance.
I'm happy enough with the solution that I have; hopefully it will meet the project's needs.
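For what it's worth, once the 500 values are unpivoted into one row each, the group-and-sum step becomes a plain aggregate. A minimal sketch, assuming a hypothetical #vector table of (Id, value) pairs produced by whatever transpose you settle on:

```sql
-- Sum the vector of risk values per Id.
SELECT Id, SUM(value) AS total_risk, COUNT(*) AS n_values
FROM #vector
GROUP BY Id
```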