March 1, 2010 at 12:07 pm
mister.magoo (2/26/2010)
Agreed, the WHERE clause can combine both, but you can also do away with the @Account_Or_Group variable and its associated lookup code. You said the two lookups are mutually exclusive, so there is no real need to worry about them....
SELECT SUM(AmountSold) FROM #Sales WHERE Account = @param1
OR Account IN
(
SELECT Account FROM #AccountGroupMembers WHERE AccountGroup = @param1
)
With your test data, because you have the same number of records in #Accounts and #AccountGroupMembers, both forms of the query will perform about the same - your form has to query #Accounts to see whether it is an account lookup; this form queries #AccountGroupMembers but not #Accounts. Both have to read #Sales.
Yes, thank you both. I will do away with @Account_Or_Group and use the OR in the WHERE clause.
March 1, 2010 at 12:14 pm
RBarryYoung (2/26/2010)
Jeff Moden (2/26/2010)
Heh... poor ol' dynamic SQL... nobody loves it anymore.
It's sad when the cure is worse than the disease.
Oh well, I'll be killing swine and birds all weekend for the CDC... 😀
My reason for not wanting to use dynamic SQL is not security or performance, but simply that others on my team are not as familiar with it, and plain SQL may be easier for them to modify and troubleshoot.
March 1, 2010 at 12:37 pm
RBarryYoung (2/26/2010)
emily: Why do you want to cram two different functional tasks into one procedure? That's been bad design in software development for over forty years. This obviously belongs in two different routines, and that's how you should write it.
Hi Barry, I'll certainly consider your point, but it's not as obvious to me. I would certainly break this into two routines if I were paid per line of code. 🙂 That would also increase the quantity of my documentation. 🙂
The SELECT portion of my statement is quite a bit more complicated than the simplified version I presented. My thinking was, and is, that I don't want to write and revise the SELECT statement twice. I can also get rid of my account/group parameter.
I would consider this one functional task; that is how my business users see it.
I have seen code so concise that I couldn't even understand it with my mere-mortal T-SQL skills. And of course I have seen bloat. For me this seemed to be a reasonable area to consolidate. Thanks.
March 1, 2010 at 12:45 pm
emily-1119612 (3/1/2010)
I have seen code so concise that I couldn't even understand it with my mere-mortal T-SQL skills. And of course I have seen bloat. For me this seemed to be a reasonable area to consolidate. Thanks.
It's not just bloat and understandability that's in question here.
http://sqlinthewild.co.za/index.php/2009/09/15/multiple-execution-paths/
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
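For readers who don't follow the link, the issue it describes can be sketched like this (the procedure below is my illustration, not from the article or this thread; the table names are hypothetical):

```sql
-- Hypothetical procedure with two execution paths (illustration only).
-- SQL Server caches a plan per statement; each branch's plan is built
-- from the parameter value seen when that branch is first compiled, and
-- is then reused even on later calls where it is a poor fit.
CREATE PROCEDURE dbo.SalesTotal @Param varchar(10)
AS
BEGIN
    IF EXISTS (SELECT 1 FROM dbo.AccountGroups WHERE AccountGroup = @Param)
        SELECT SUM(s.AmountSold)
        FROM dbo.Sales s
        JOIN dbo.AccountGroupMembers m ON m.Account = s.Account
        WHERE m.AccountGroup = @Param;
    ELSE
        SELECT SUM(AmountSold)
        FROM dbo.Sales
        WHERE Account = @Param;
END
```

Splitting this into two procedures, as suggested above, avoids the mismatch; adding OPTION (RECOMPILE) to the branch statements is another common mitigation, at the cost of compiling on every call.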
March 1, 2010 at 1:40 pm
GilaMonster (3/1/2010)
emily-1119612 (3/1/2010)
I have seen code so concise that I couldn't even understand it with my mere-mortal T-SQL skills. And of course I have seen bloat. For me this seemed to be a reasonable area to consolidate. Thanks.
It's not just bloat and understandability that's in question here.
http://sqlinthewild.co.za/index.php/2009/09/15/multiple-execution-paths/
Thanks Gail, that was educational. This may be blasphemous, but I usually think about maintainability rather than performance, because I work with small data sets.
Given my example:
Accounts - 1000 records
Groups - 50 records
Sales - 5000 records
As a result I have bad habits related to performance that wouldn't be tolerated in shops with large data sets. :crying:
March 1, 2010 at 1:59 pm
Well, after all that, my specification has changed. My users would like to be able to specify multiple accounts, multiple groups, or a combination of groups and accounts.
I will be able to store these parameters in a table which I am calling #ReportParameters.
Revised data is below with some potential data for #ReportParameters.
I believe my included SELECT should work. Clearly I have been using so many subqueries that my join skills are fading...
CREATE TABLE #Accounts
(
Account varchar (10)
)
INSERT INTO #Accounts
SELECT 'Account1' UNION
SELECT 'Account2' UNION
SELECT 'Account3' UNION
SELECT 'Account4' UNION
SELECT 'Account5'
CREATE TABLE #AccountGroups
(
AccountGroup varchar (10)
)
INSERT INTO #AccountGroups
SELECT 'Group1' UNION
SELECT 'Group2'
CREATE TABLE #AccountGroupMembers
(
AccountGroup varchar (10),
Account varchar (10)
)
INSERT INTO #AccountGroupMembers
SELECT 'Group1','Account1' UNION
SELECT 'Group1','Account2' UNION
SELECT 'Group2','Account3' UNION
SELECT 'Group2','Account4' UNION
SELECT 'Group2','Account1'
CREATE TABLE #Sales
(
Account varchar (10),
AmountSold int
)
INSERT INTO #Sales
SELECT 'Account1', 3 UNION
SELECT 'Account1', 4 UNION
SELECT 'Account2', 6 UNION
SELECT 'Account2', 12 UNION
SELECT 'Account2', 1 UNION
SELECT 'Account4', 4 UNION
SELECT 'Account4', 9 UNION
SELECT 'Account5', 3
CREATE TABLE #ReportParameters
(
Account_or_Group varchar(15)
)
--INSERT INTO #ReportParameters
--SELECT 'Account1'
--INSERT INTO #ReportParameters
--SELECT 'Account1' UNION
--SELECT 'Account2'
--
--INSERT INTO #ReportParameters
--SELECT 'Account1' UNION
--SELECT 'Group1' UNION
--SELECT 'Account5'
INSERT INTO #ReportParameters
SELECT 'Group1' UNION
SELECT 'Group2'
SELECT
SUM(AmountSold)
FROM #Sales
WHERE Account IN
(
SELECT Account_or_Group FROM #ReportParameters
)
OR
Account IN
(
SELECT Account FROM #AccountGroupMembers WHERE AccountGroup IN
(
SELECT Account_or_Group FROM #ReportParameters
)
)
SELECT * FROM #Accounts
SELECT * FROM #AccountGroups
SELECT * FROM #AccountGroupMembers
SELECT * FROM #Sales
SELECT * FROM #ReportParameters
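(An aside on the query above, not from the original post: the OR of two IN subqueries can also be written as a single IN over a UNION, which keeps the parameter resolution in one place.)

```sql
-- Equivalent to the OR form above: resolve direct account parameters
-- and group parameters into one set of accounts, then sum once.
SELECT SUM(AmountSold)
FROM #Sales
WHERE Account IN
(
    SELECT Account_or_Group FROM #ReportParameters
    UNION
    SELECT agm.Account
    FROM #AccountGroupMembers agm
    JOIN #ReportParameters rp ON rp.Account_or_Group = agm.AccountGroup
)
-- With the Group1/Group2 parameters inserted above, both forms return 39.
```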
DROP TABLE #Accounts
DROP TABLE #AccountGroups
DROP TABLE #AccountGroupMembers
DROP TABLE #Sales
DROP TABLE #ReportParameters
March 1, 2010 at 2:33 pm
As a result I have bad habits related to performance that wouldn't be tolerated in shops with large data sets.
I wouldn't sweat that - you are certainly not alone! Many clients WITH large datasets have bad performance habits too. 🙂 But that is why guys like me make a decent living! 😎
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on Google's mail service
March 1, 2010 at 4:17 pm
TheSQLGuru (3/1/2010)
Jeff Moden (2/26/2010)
Heh... poor ol' dynamic SQL... nobody loves it anymore.
Not true!! I am actually a great fan of it and use it where appropriate, which is far more often than most people would expect. 😎
Heh... me too! I'm just amazed at how many people try to avoid it at any and all costs even when the code isn't user facing.
--Jeff Moden
Change is inevitable... Change for the better is not.
March 1, 2010 at 6:18 pm
But if you are using SSRS and the report needs to pull up-to-the-minute data, isn't dynamic SQL the only way to go?
March 1, 2010 at 8:02 pm
grnlt (3/1/2010)
But if you are using SSRS and the report needs to pull up-to-the-minute data, isn't dynamic SQL the only way to go?
Most of the SQL heavies here do not know SSRS very well, so you would have to explain the "why" of that a little bit more.
I do know that SSRS passes multi-value parameter lists as a string of comma-separated values. That doesn't require dynamic SQL, or even a cursor/WHILE loop, to split apart quickly, though many think that it does.
-- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
March 2, 2010 at 7:47 am
grnlt (3/1/2010)
But if you are using SSRS and the report needs to pull up-to-the-minute data, isn't dynamic SQL the only way to go?
No, there is no requirement to use dynamic SQL from SSRS for that reason.
MM
March 2, 2010 at 11:10 am
I do know that SSRS passes multi-value parameter lists as a string of comma-separated values. That doesn't require dynamic SQL, or even a cursor/WHILE loop, to split apart quickly, though many think that it does.
OK -- I'll bite. How do you split apart a string of comma-separated values quickly in T-SQL without a WHILE loop?
Rob Schripsema
Propack, Inc.
March 2, 2010 at 11:16 am
Rob Schripsema (3/2/2010)
I do know that SSRS passes multi-value parameter lists as a string of comma-separated values. That doesn't require dynamic SQL, or even a cursor/WHILE loop, to split apart quickly, though many think that it does.
OK -- I'll bite. How do you split apart a string of comma-separated values quickly in T-SQL without a WHILE loop?
Okay, check out the following code:
/****** Object: UserDefinedFunction [dbo].[DelimitedSplit] Script Date: 03/02/2010 11:15:25 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE function [dbo].[DelimitedSplit] (
@pString varchar(max),
@pDelimiter char(1)
)
returns table
as
return
with
a1 as (select 1 as N union all        -- 10 rows
select 1 union all
select 1 union all
select 1 union all
select 1 union all
select 1 union all
select 1 union all
select 1 union all
select 1 union all
select 1),
a2 as (select 1 as N                  -- 10 x 10 = 100 rows
from
a1 as a
cross join a1 as b),
a3 as (select 1 as N                  -- 100 x 100 = 10,000 rows
from
a2 as a
cross join a2 as b),
a4 as (select 1 as N                  -- 10,000 x 100 = 1,000,000 rows
from
a3 as a
cross join a2 as b),
Tally as (select top (len(@pString))  -- numbers 1 through len(@pString)
row_number() over (order by N) as N
from
a4),
ItemSplit(
ItemOrder,
Item
) as (
SELECT
N,
SUBSTRING(@pDelimiter + @pString + @pDelimiter,N + 1,CHARINDEX(@pDelimiter,@pDelimiter + @pString + @pDelimiter,N + 1) - N - 1)
FROM
Tally
WHERE
N < LEN(@pDelimiter + @pString + @pDelimiter)
AND SUBSTRING(@pDelimiter + @pString + @pDelimiter,N,1) = @pDelimiter --Notice how we find the delimiter
)
select
row_number() over (order by ItemOrder) as ItemID,
Item
from
ItemSplit
GO
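A quick usage sketch (my addition, not part of the post above), splitting a typical SSRS-style multi-value parameter string:

```sql
-- Split a comma-separated parameter string into one row per item.
SELECT ItemID, Item
FROM dbo.DelimitedSplit('Account1,Account2,Group1', ',');
-- Returns: (1, 'Account1'), (2, 'Account2'), (3, 'Group1')
```

The result set can then be joined to a table (or used in an IN subquery), with no dynamic SQL and no loop.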
March 2, 2010 at 11:34 am
Lynn,
Wow. Very cool, very clever... but with all of the SUBSTRINGing, CHARINDEXing, and concatenating going on, is this really more efficient than a simple WHILE loop? (Oh dear, I may have opened a can of very religious worms here....)
Rob Schripsema
Propack, Inc.
March 2, 2010 at 11:42 am
Rob Schripsema (3/2/2010)
Lynn,
Wow. Very cool, very clever... but with all of the SUBSTRINGing, CHARINDEXing, and concatenating going on, is this really more efficient than a simple WHILE loop? (Oh dear, I may have opened a can of very religious worms here....)
You'll have to run tests. I will tell you that the function I provided will scale much better than a WHILE loop as the strings get longer and more numerous.
Also, there may be other methods out there that are faster and don't use a WHILE loop or cursor.
You may also want to peruse this thread http://www.sqlservercentral.com/Forums/Topic695508-338-1.aspx.