August 1, 2011 at 5:42 pm
Hey guys, here is a pickle. Warning: it's not as straightforward as it looks 🙂
I have a temp table, created below:
CREATE TABLE #TEMP
(ID INT, VALUE INT)
INSERT INTO #TEMP
VALUES
(0,21),
(1,98),
(2,0),
(3,0),
(4,0),
(5,0)
SELECT * FROM #TEMP
How do I select the average of only the non-zero values, as well as the first ID, in the same SELECT statement?
Here is DDL for the expected result:
CREATE TABLE #Expected
(ID INT, VALUE decimal(4, 2))
INSERT INTO #Expected
VALUES (0,59.5)
SELECT * FROM #Expected
which gives (21 + 98) / 2 = 59.5, because those are the only two non-zero values.
Is this tricky, or is it straightforward?
August 1, 2011 at 5:57 pm
Nothing tricky. So simple it actually looks like homework.
What have you tried and why do you think it's not working?
August 1, 2011 at 7:48 pm
As a sidebar, I think it's really cool that you have 2K8. I also think it's really cool that you've learned to use the "new" INSERT/VALUES. But when you do so, remember that there's a whole world of SQL Ninjas who could probably help but don't have 2K8 yet, and they'll just pass your problem by because they don't have time to convert the test setup code into something that will work for them.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 1, 2011 at 7:53 pm
Homework? How do you mean?
I tried the following, but the GROUP BY makes it return every row and average each one separately.
select ID, AVG(value) 'avg' from #TEMP
group by ID
Then I tried this, but it returns the average of just the first row, as opposed to the average across the whole table wherever there is a non-zero value (2 rows in this case):
select ID, AVG(value) 'avg' from #TEMP
where ID = 0
group by ID
August 1, 2011 at 7:56 pm
Great point, Jeff. Here is DDL that should be fine for all versions:
CREATE TABLE #TEMP
(ID INT, VALUE INT)
INSERT INTO #TEMP
select 0, 21 union all
select 1, 98 union all
select 2, 0 union all
select 3, 0 union all
select 4, 0 union all
select 5, 0
SELECT * FROM #TEMP
August 2, 2011 at 1:54 am
I just couldn't resist...
SELECT MIN(CASE WHEN Value = 0 THEN NULL ELSE ID END),
       AVG(CASE WHEN Value = 0 THEN NULL ELSE 1E * Value END)
FROM   #Temp
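For anyone following along: against the sample data above, that should return a single row of (0, 59.5). Here is the same query with column aliases, purely for readability (no change in logic):
SELECT FirstID = MIN(CASE WHEN Value = 0 THEN NULL ELSE ID END),
       Average = AVG(CASE WHEN Value = 0 THEN NULL ELSE 1E * Value END)
FROM   #Temp
-- The CASE expressions turn the zero-valued rows into NULLs, which MIN and AVG then skip,
-- so only IDs 0 and 1 contribute; SQL Server also raises the "Null value is eliminated
-- by an aggregate or other SET operation" warning when it does this.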
N 56°04'39.16"
E 12°55'05.25"
August 2, 2011 at 6:15 am
Ignoring the <uselessly> complex query by Peso... so what did you come up with?
August 2, 2011 at 6:49 am
Ninja's_RGR'us (8/2/2011)
Ignoring the <uselessly> complex query by Peso... so what did you come up with?
Ummm... did you miss it, Remi?
I tried the following, but the GROUP BY makes it return every row and average each one separately.
select ID, AVG(value) 'avg' from #TEMP
group by ID
Then I tried this, but it returns the average of just the first row, as opposed to the average across the whole table wherever there is a non-zero value (2 rows in this case):
select ID, AVG(value) 'avg' from #TEMP
where ID = 0
group by ID
--Jeff Moden
Change is inevitable... Change for the better is not.
August 2, 2011 at 6:57 am
Nope, I was going with round 2.
I always wait 24 hours before actually posting anything, in case it's an exam, cert, or late homework.
I was hoping for some progress since yesterday!
August 2, 2011 at 6:58 am
johnsonchase7 (8/1/2011)
Great point, Jeff. Here is DDL that should be fine for all versions:
CREATE TABLE #TEMP
(ID INT, VALUE INT)
INSERT INTO #TEMP
select 0, 21 union all
select 1, 98 union all
select 2, 0 union all
select 3, 0 union all
select 4, 0 union all
select 5, 0
SELECT * FROM #TEMP
Perfect. Thank you for the effort.
The following will do what you've asked, and it'll do it without the warning message about NULLs being ignored by the aggregate (which may be interpreted as an error if a GUI is involved).
SELECT StartingID = MIN(ID),
Average = AVG(Value+0.0)
FROM #Temp
WHERE Value <> 0
The purpose of Peso's "1E*" and my "+0.0" is to simply convert the Value to something other than an INT so that you can get something other than a whole number for the average. I haven't actually done a performance test but I know enough about machine language at the cycle level to say that it will always take fewer cycles to do an ADD than it will to do a MULTIPLY. I suspect that it'll take millions of rows to really make a difference, though.
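To see why the conversion matters, here's a quick before/after against the #TEMP table from earlier in the thread:
-- AVG over an INT column returns an INT, so the .5 is lost:
SELECT AVG(Value) FROM #TEMP WHERE Value <> 0        -- 59
-- Converting to a non-integer type first keeps the fraction:
SELECT AVG(Value + 0.0) FROM #TEMP WHERE Value <> 0  -- 59.5 (decimal)
SELECT AVG(1E * Value) FROM #TEMP WHERE Value <> 0   -- 59.5 (float)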
--Jeff Moden
Change is inevitable... Change for the better is not.
August 2, 2011 at 7:13 am
Ninja's_RGR'us (8/2/2011)
Nope, I was going with round 2. I always wait 24 hours before actually posting anything, in case it's an exam, cert, or late homework.
I was hoping for some progress since yesterday!
Ah... understood.
I figure that anyone who takes the time to post readily consumable data and has posted what they tried (as you got the OP to do) deserves the leg up. Heh... I agree on the exam/cert/homework thing, though... when some of those folks post a laundry list of things, I tend to post notes about homework, etc., too.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 2, 2011 at 7:55 am
Jeff Moden (8/2/2011)
I haven't actually done a performance test but I know enough about machine language at the cycle level to say that it will always take fewer cycles to do an ADD than it will to do a MULTIPLY. I suspect that it'll take millions of rows to really make a difference, though.
Wrong on my machine. I gave this theory all the chance in the world but it just fails every test. Multiply is the winner on my prod server (under no load ATM).
Data:
USE tempdb
GO
CREATE TABLE dbo.REMI (a int NOT NULL)
INSERT INTO dbo.REMI (a)
SELECT TOP 10000000 CHECKSUM(NEWID()) % 10000 FROM sys.columns C1, sys.columns C2, sys.columns C3
Then, using SQLQueryStress:
Multiply =
Addition =
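The exact statements being stress-tested aren't shown above, but presumably they were something along these lines (a guess, not the original test code):
-- Multiply version (assumed)
SELECT AVG(1E * a) FROM dbo.REMI
-- Addition version (assumed)
SELECT AVG(a + 0.0) FROM dbo.REMI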
August 2, 2011 at 7:59 am
If it's really faster, it's 6 ms per 1M rows...
Yes, everything counts, but that's really squeezing every last bit out of the CPUs!
August 2, 2011 at 8:43 am
Thanks, guys. I assure you it was not an exam; it is actually a self-motivated project I am working on. I would have appreciated the answers even if they were delayed by 48 hours. Thanks again.
August 2, 2011 at 8:47 am
johnsonchase7 (8/2/2011)
Thanks, guys. I assure you it was not an exam; it is actually a self-motivated project I am working on. I would have appreciated the answers even if they were delayed by 48 hours. Thanks again.
It's nothing personal. We just don't want to give free passes to someone who will then become a bad employee... and then work under us.
Yes, it has happened!