SSRS report with 17 parameters times out

  • Hello,

    I am working on an SSRS report with multiple parameters (17). The report calls a stored procedure for the main query; however, for the other 17 datasets the SQL queries are embedded in the report.

    The report takes a long time to render and then fails with an OutOfMemory exception. However, when I run the code through SQL Server Management Studio, the data is displayed within seconds.

    Is this report running slowly due to the multiple parameters? Do I have to create the SQL query for each parameter as a stored procedure?

    Any inputs and suggestions will be appreciated. Thanks for your help.

    Regards,

    Paul

  • With 17 parameters, no doubt most of them are optional... and might have default values? Sounds like parameter sniffing: fast in SSMS with static values, slow once a compiled plan is reused.

    Search for "parameter sniffing" here on SSC; there are a LOT of posts on how to address it.

    Gail has some nice stuff on it as well:

    http://sqlinthewild.co.za/index.php/2007/11/27/parameter-sniffing/

    http://sqlinthewild.co.za/index.php/2008/02/25/parameter-sniffing-pt-2/

    http://sqlinthewild.co.za/index.php/2008/05/22/parameter-sniffing-pt-3/
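    As a quick local test before digging into those, a minimal sketch is to add a statement-level recompile hint to the suspect query inside the procedure (the table and column names here are made up):

    -- Hypothetical report query inside the proc; @BeginDate/@EndDate are its parameters.
    -- OPTION (RECOMPILE) compiles a fresh plan for the current parameter values
    -- instead of reusing the plan cached for some earlier caller's values.
    SELECT OrderID, OrderDate, Amount
    FROM dbo.Orders
    WHERE (@BeginDate IS NULL OR OrderDate >= @BeginDate)
      AND (@EndDate IS NULL OR OrderDate < @EndDate)
    OPTION (RECOMPILE);

    If the query is fast with the hint and slow without it, sniffing is the likely culprit.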

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement for that table to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • Those articles are a great read and I would strongly recommend you read them.

    However,

    I've been having a very similar issue in BIDS, where the local machine just isn't strong enough to build the report with the 2 GB of RAM I have left after the PC has booted up.

    One simple way to rule out parameter sniffing is to add the WITH RECOMPILE option at the procedure level. It takes two seconds, and then you can give the report another go:

    ALTER PROCEDURE [dbo].[RPT_Finances_Inventaire_Marges]
        @BeginDate DATETIME
        , @EndDate DATETIME
        -- ...long list of parameters
    WITH RECOMPILE
    AS
    SET NOCOUNT ON

    If the report then renders, you know you had a parameter sniffing issue. If it doesn't, the local box is not strong enough. To really confirm that, deploy the report to a prod server and see if it runs there. It most likely will.

    What I do to work around that, and still work on my dev machine, is to put a conditional TOP in the final SELECT so that I return only 5-10% of the data when the connection comes from the dev machine instead of prod.

    DECLARE @TopN INT

    SET @TopN = CASE WHEN UPPER(HOST_NAME()) = 'NAVTEST' THEN 15
                     ELSE 100
                END

    -- Then in the final select
    SELECT TOP (@TopN) PERCENT ... FROM ...

  • Another way to avoid parameter sniffing is to use local variables. Here's another example to consider:

    CREATE PROCEDURE AVOIDSNIFFING (@Param1 int = 0, @Param2 datetime = NULL)
    AS
    DECLARE @localParam1 int,
            @localParam2 datetime

    -- The locally declared variables are used in the query below; because of that
    -- construction, the optimizer ignores the input parameter values and builds a
    -- plan based on average statistics, instead of assuming the input parameters
    -- represent the most probable input.
    SET @localParam1 = @Param1;
    SET @localParam2 = @Param2;

    SELECT * FROM MyTable WHERE Col1 = @localParam1 AND Col2 = @localParam2

    Lowell



  • Lowell (6/28/2011)


    another way to avoid parameter sniffing is to use local variables as well...

    Agreed, but times 17 it really gets annoying for a simple test you might scrap right away. Also, with 12-17 optional parameters, RECOMPILE is not really overkill here ;-).

  • Ninja's_RGR'us (6/28/2011)


    Agreed, RECOMPILE is much easier; I just wanted to throw an example out there for reference. We think so much alike we should be twins.

    Lowell



  • Ninja's_RGR'us (6/28/2011)


    Those articles are a great read and I would strongly recommend you read them...

    Thanks for your reply... it was really helpful. Just three more questions:

    1. Do I really need to create a dataset for a parameter that gets its values 'From Query'? I ask because if I don't, I get an error like this:

    'The report parameter ‘Countryofreceipt’ has a DefaultValue or a ValidValue that depends on the report parameter “Countryofreceipt”. Forward dependencies are not valid.'

    2. If I do need to create datasets for all 17 parameters, would I need to include the WITH RECOMPILE option in each of the stored procedures associated with the parameters?

    3. Related to question 2: performance-wise, is it better to create a stored procedure than to embed the SQL code in the report itself?

    Regards,

    Paul

  • I'm really not sure what you mean about questions 1 & 2. What context are you referring to?

    Also, on #1, I've never had that error, so I can't really help beyond searching on Google.

    2 - You have to create datasets for parameters only if you need the parameter to be multi-select... or to have predefined values. It's not an absolute requirement. Also, I usually use table functions or views to return parameter list values, for two specific reasons: first, I'm working with a vendor application; they use tables and views but never functions, so I prefer to use functions and remove the chance of stepping on their ground. Second, functions are reusable across all reports, so that makes for a single point of fixing an issue.

    3 - I usually do it in two phases. I build the query out in SSMS, and when I have something that looks complete, I comment out the variable declarations (for the variables that will become parameters) and copy the whole query into the report (using the wizard to build it out). That usually saves a lot of time on the initial build of the report, because in a few clicks you have something fairly decent and usable for the end user.

    Once the report is built and the users only need small tweaks, I start to move to stored procedures. That allows one extra benefit during the dev phase: the user can play with the real report on the prod server while I keep changing the report. Since the data sources are "different", we don't step on each other's toes. (We don't have a full dev, test, QA and prod setup here... everything is pretty much done right in prod with the live data. I'd usually recommend against that, but since I'm only doing SELECTs we can live with that setup for the time being.)

    Then, when I have the stored procedure on the server, I can just edit it to fix any query mistakes that return bad data, without having to redeploy. Also, when I use the same query in many reports, all the reports are instantly corrected, rather than my having to find them all and redeploy.

    I know using stored procedures can improve compile time and performance, but since my reports usually run in 5 seconds or less, it doesn't really matter all that much in this context.

  • Ninja's_RGR'us (6/29/2011)


    I'm really not sure what you mean about questions 1 & 2...

    Thanks again!!

    The report I am working on has 14 multi-select parameters, for which I would need to create a separate dataset for each one. Do you know if I can still avoid creating datasets for the parameters?

    Could you please tell me how a table function can be used to return parameter values?

  • No, you can't avoid creating one dataset per input parameter.

    There's really nothing tricky about the function. It's basically a view moved into a function so that there's no conflict with the 3rd-party app.

    ALTER FUNCTION [dbo].[fnListeFamilles] ()
    RETURNS TABLE
    AS
    RETURN
    (
        SELECT
            imf.Code
            , imf.Description
        FROM
            dbo.[Groupe FORDIA Inc_$Item family] imf
        WHERE imf.Code > ''
    )
    GO
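
    The dataset behind the report parameter then just selects from the function, along these lines:

    -- SSRS dataset query supplying the parameter's available values
    SELECT Code, Description
    FROM dbo.fnListeFamilles()
    ORDER BY Code;

    Code is then mapped to the parameter's Value field and Description to its Label field.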

  • Ninja's_RGR'us (6/29/2011)


    No you can't avoid creating 1 dataset per input param.

    You can avoid creating a dataset if you specify the available list of values yourself, so if you have a really simple parameter like True/False it would probably be simpler to specify the values within the parameter creation UI rather than adding a dataset for it.

    However, if your parameters are cascading parameters (e.g. the available choice of values for parameter 2 is determined by the value selected for parameter 1), then you will need separate datasets for each parameter.
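
    For illustration, a cascading pair of parameter datasets might look like this (hypothetical tables and columns); SSRS refreshes the second dataset whenever the first parameter changes, because its query references @Country:

    -- Dataset backing parameter 1 (@Country)
    SELECT CountryCode, CountryName
    FROM dbo.Countries
    ORDER BY CountryName;

    -- Dataset backing parameter 2 (@City); the @Country reference creates the cascade
    SELECT CityCode, CityName
    FROM dbo.Cities
    WHERE CountryCode = @Country
    ORDER BY CityName;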

  • quan23 (7/1/2011)


    You can avoid creating a dataset if you specify the available list of values...

    Thanks for your reply, it was really helpful. However, it's a different problem I'm facing now. The report shows data, but only if I select single values for all 17 parameters. If I try to select multiple values for even a single parameter, the report blanks out and only shows the labels. But I know there is data that isn't showing up. Do I need to handle this differently so that the data is displayed for multiple selected values? The report has cascading parameters; could that be a problem?

    Thanks,

    Paul

  • The usual suspect is that you have used this type of syntax, which works in the SSRS query generator but then fails in the stored procedure:

    WHERE something IN (@MultiSelectParam)

    If you converted your data source to a stored procedure, then you need to split the values yourself. The best way for this type of report is to build a temp table (with a primary key) and dump the values into it; the PK will force statistics and give a better plan (see the sketch below). I've not tested simply adding stats instead of a PK... but on a 3-4 row table it won't make much of a difference perf-wise.
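
    As a sketch of that temp-table approach (hypothetical object names, and assuming a string splitter function such as Jeff Moden's DelimitedSplit8K, mentioned later in this thread, is available):

    -- Inside the stored procedure; @CountryList arrives from SSRS as one
    -- comma-delimited string, e.g. 'CA,US,MX'.
    CREATE TABLE #Countries (Code VARCHAR(20) NOT NULL PRIMARY KEY); -- the PK gives the optimizer stats

    INSERT INTO #Countries (Code)
    SELECT LTRIM(RTRIM(s.Item))
    FROM dbo.DelimitedSplit8K(@CountryList, ',') s;

    SELECT r.*
    FROM dbo.ReceiptData r -- hypothetical report table
    JOIN #Countries c ON c.Code = r.CountryOfReceipt;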

    The other option is that one of the filters just doesn't return anything. It's hard to tell which one it is without full access to your systems.

  • I don't think the cascade is the problem; otherwise you wouldn't be able to select values for all the parameters.

    Have you set up the parameters to allow multiple selection, and are you using IN rather than = in your SQL code?

  • Oops, that'll teach me for replying without reminding myself what the original question was about!

    The Ninja is right: SSRS handles the multi-value parameter correctly in its own queries, but it passes the values across to the stored procedure as a single string, so you will need to do the splitting within the procedure.

    There are quite a few articles on SSC with splitter functions you can snag for your purposes if you don't have one to hand. Jeff Moden has posted a lot of cool content about splitting with tally tables, so I'd use one of his functions.
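
    On the SSRS side, the multi-value selection is flattened into that single string in the dataset's parameter mapping, typically with an expression along the lines of =Join(Parameters!Countryofreceipt.Value, ",") (using the parameter name from your earlier error message); the splitter then turns it back into rows inside the procedure.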
