October 21, 2016 at 6:20 am
For each of these doctor's office locations, I need to get the percentage of patients who speak each language.
create table #Test
(
PtLanguage varchar(100),
Abington int,
BelmontSt int,
Bridgewater int
)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('ALBANIAN',2,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('AMHARIC',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('ARABIC',5,0,3)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('ARMENIAN',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('ASL (AMER SIGN LANG)',1,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('BRAZILIAN',4,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('CAMBODIAN',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('CANTONESE',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('CAPE VERDEAN',11,7,2)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('CHINESE',0,0,2)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('CROATIAN',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('ENGLISH',2552,1375,6609)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('FILIPINO',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('FRENCH',4,4,8)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('GERMAN',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('GREEK',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('GUJARATI',1,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('HAITIAN',9,6,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('HINDI',2,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('HMONG',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('IBO-',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('ITALIAN',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('KOREAN',0,0,1)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('LAOTIAN',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('MANDARIN',0,0,3)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('OTHER LANGUAGE',9,1,2)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('PASHTU',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('PERSIAN',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('POLISH',0,1,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('PORTUGUESE',28,10,6)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('PUNJABI',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('RUSSIAN',3,1,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('SIGN LANGUAGE',2,1,1)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('SPANISH',19,14,10)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('SWAHILI',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('TAGALOG',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('TAIWANESE',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('THAI(LAOTIAN)',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('UNKNOWN',25,37,82)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('UPDATE NEEDED',2,3,2)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('URDU-',0,0,0)
insert into #Test(PtLanguage, Abington, BelmontSt, Bridgewater) values('VIETNAMESE',1,1,0)
October 21, 2016 at 6:25 am
What do the numbers for each location represent? I'm assuming they are the counts of patients speaking each language.
How to post a question to get the most help http://www.sqlservercentral.com/articles/Best+Practices/61537
October 21, 2016 at 6:46 am
If I understand your question correctly, this could give you the results you need:
select PtLanguage
, sum(Abington) as total_abington
, cast(sum(Abington) * 100.0 / max(grand_total.total_abington) as decimal(6,3)) as pct_abington
, sum(BelmontSt) as total_Belmont
, cast(sum(BelmontSt) * 100.0 / max(grand_total.total_Belmont) as decimal(6,3)) as pct_Belmont
, sum(Bridgewater) as total_bridgewater
, cast(sum(Bridgewater) * 100.0 / max(grand_total.total_bridgewater) as decimal(6,3)) as pct_Bridgewater
from #Test
cross join (select sum(Abington) as total_abington
, sum(BelmontSt) as total_Belmont
, sum(Bridgewater) as total_bridgewater
from #Test
) as grand_total
group by PtLanguage
I guess there are more efficient solutions possible, but this was the first I came up with 😉
October 21, 2016 at 6:51 am
Here is a fairly simple solution to your problem.
First a table is created which contains the totals for each column.
Then that table is used to calculate the percentages for all the columns.
Below are two solutions which are roughly equivalent. The first uses a WITH statement (a common table expression) as a temporary table/view for the totals; the totals are then used in the final part of the statement.
The second example does the same but actually builds a #Counted table for the totals.
--
-- T contains the totals for each column
--
;WITH T AS (
select SUM(Abington) S_A, SUM(BelmontSt) S_B, SUM(Bridgewater) S_C from #Test)
select *
,100.0*Abington/S_A Percent_A
,100.0*BelmontSt/S_B Percent_B
,100.0*Bridgewater/S_C Percent_C
from #Test cross join T
--
-- Alternative:
--
-- #Counted contains the totals for each column
--
select SUM(abington) S_A, SUM(belmontst) S_B, SUM(bridgewater) S_C into #Counted from #test
select *
,100.0*Abington/S_A Percent_A
,100.0*BelmontSt/S_B Percent_B
,100.0*Bridgewater/S_C Percent_C
from #Test cross join #Counted
drop table #Test
drop table #counted
There are other solutions which can calculate the percentages more directly, but the code for those is more advanced.
A weakness of the example and the solution is that the number of columns depends on the number of sites with patients; the code has to change whenever another site is added.
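The column-per-site weakness can be avoided by normalizing the data: store one row per language and site, so adding a site means inserting rows instead of altering the table. A sketch, using a hypothetical #TestNormalized table that is not part of the original example:

```sql
-- Hypothetical normalized layout: one row per (language, site).
create table #TestNormalized
(
PtLanguage varchar(100),
Site varchar(100),
PtCount int
)

-- Percentages per site without per-site columns; a new site needs
-- no schema change, only new rows.
select PtLanguage
, Site
, cast(PtCount * 100.0 / sum(PtCount) over (partition by Site) as decimal(6,3)) as pct
from #TestNormalized
```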
For the percentages there is the weakness that a site with no patients yet will cause an error because of the division by zero.
A question to others in the group:
With examples like these, how do we prevent a division by zero when a complete column contains only zeros? (The same applies when averaging fields that are all zero.)
Please tell me if this solution works for you, or helps you along.
Ben
October 21, 2016 at 7:17 am
ben.brugman (10/21/2016)
...Questions to others in the group:
With examples as these how do we prevent a division by zero. (Because the complete column does only contain zero's). (The same for averaging fields if there are only zero fields)...
The solution I use to prevent division by zero is to wrap it inside a CASE expression. In the query I posted above, the percentage column would change to:
, case when max(grand_total.total_Bridgewater) = 0
then 0
else cast(sum(Bridgewater) * 100.0 / max(grand_total.total_Bridgewater) as decimal(6,3))
end as pct_Bridgewater
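Alongside the CASE wrapper, a NULLIF-based guard is a common alternative (a sketch, not from the original posts): NULLIF turns a zero divisor into NULL, the division then yields NULL instead of raising an error, and ISNULL maps the NULL back to 0.

```sql
-- Sketch: NULLIF(total, 0) returns NULL when the total is 0,
-- and dividing by NULL yields NULL rather than an error;
-- ISNULL(..., 0) then reports 0 for an empty site.
, isnull(cast(sum(Bridgewater) * 100.0
              / nullif(max(grand_total.total_Bridgewater), 0)
         as decimal(6,3)), 0) as pct_Bridgewater
```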
October 21, 2016 at 8:11 am
Here's another way to do it
SELECT
PtLanguage
,Abington
,CAST(ROUND(CAST(Abington * 100 AS DECIMAL(9,3)) / SUM(Abington) OVER(), 3) AS DECIMAL(6,3)) AS Percent_Abington
,BelmontSt
,CAST(ROUND(CAST(BelmontSt * 100 AS DECIMAL(9,3)) / SUM(BelmontSt) OVER(), 3) AS DECIMAL(6,3)) AS Percent_BelmontSt
,Bridgewater
,CAST(ROUND(CAST(Bridgewater * 100 AS DECIMAL(9,3)) / SUM(Bridgewater) OVER(), 3) AS DECIMAL(6,3)) AS Percent_Bridgewater
FROM #Test
October 21, 2016 at 8:15 am
Thanx.