We have a request where a user can select hundreds of parameter values in our SSRS report. A user selects a General Manager, and all the managers under that General Manager appear. The user can then select Managers, and all the employees that report to those managers populate another parameter. All of these parameters are multi-selectable.
If a user selects ALL for GM / Manager / Agent, the number of agents passed can easily be in the upper hundreds or low thousands, so the parameter value sent to the Stored Procedure can run to 20,000 characters, since each agent ID is 10 characters of text. We use the =Join(Parameters!paramGM.Value, ":") expression inside the parameters to pass the values.
Then inside the Stored Procedure, we have to split that delimited string back into its individual values.
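For context, here's roughly what our split step looks like. This is a minimal sketch, assuming SQL Server 2016+ for STRING_SPLIT; the @AgentList parameter, dbo.Agents table, and column names are hypothetical stand-ins for our actual objects:

    CREATE OR ALTER PROCEDURE dbo.AgentReport
        @AgentList VARCHAR(MAX)  -- colon-delimited list built by =Join(...) in SSRS
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Materialize the list once; joining a few thousand values through an
        -- indexed temp table usually gives the optimizer better estimates than
        -- joining the splitter function directly in the main query.
        CREATE TABLE #SelectedAgents (AgentID VARCHAR(10) NOT NULL PRIMARY KEY);

        INSERT INTO #SelectedAgents (AgentID)
        SELECT DISTINCT ss.value
        FROM STRING_SPLIT(@AgentList, ':') AS ss;

        SELECT a.AgentID, a.AgentName
        FROM dbo.Agents AS a
        JOIN #SelectedAgents AS sa
            ON sa.AgentID = a.AgentID;
    END;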
I understand this is nothing out of the ordinary and most of you already know all of this, but when a user selects such a large number of values, it greatly impacts performance. How do you handle a situation like this? And please, I know one solution is "don't let them select such a large amount", but unfortunately this is their request. They also don't want an "ALL" value added to the parameter list that we could pass as a single value instead of everything.
I should also note that the users do not have direct access to the SSRS reports on the server; they go through a .Net "wrapper" where they select the parameters from .Net dropdowns, and when they hit View Report, the report output displays below. So we also have to consider that the user doesn't really have direct access to the SSRS reports.
Usually, I'll start with the lecture and then provide an alternative. Today, let's mix things up and start with the alternative. Check out Gail's blog posts on catch-all queries. This is what you're looking for, and she has some of the best guidance on the topic.
Now, to the lecture.
People are going to ask for the world. Here's a story. We were designing a replacement system for an ancient green-screen app. The client said, "We need 100 million rows returned to the screen in 3 seconds." That's simply impossible. We pushed back. They insisted. We tried to come up with a way to do it. Explored technology. Experimented. Nothing. It's physically impossible (not to mention stupid; what were they going to do with 100 million rows?). Finally went back to the client, a failure. They said, "But we can do it now." What? Show me. Sure enough, the green-screen app rumbles to life. The client inputs some search criteria. Five rows come back on the screen after about three seconds. I'm sitting there wondering when the miracle occurs and the remaining 100 million rows pop up. The client points to the side of the screen where there's a number: 100,000,000. "See, 100 million rows." Me: "Oh yeah, we can return 100 million rows, and we can do it in less than a second."
I get it. Reporting is different. They want to filter on all the things all the time. However, if you push, pull, and discuss, what you're really going to find is that there are three or four (maybe even seven or eight) common patterns, and you could supply different screens, different reports, or something similar to fit those patterns. Then there are secondary filtering requirements. Getting people to let go of the idea of filtering all the things on all the things is hard, but it's right. You'll be happier. They'll be happier.
Or, just implement the catch-all query using Gail's advice. Be sure to read both posts. A combination of them is likely to be the ultimate answer if you can't work with the client to establish sensible patterns.
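For anyone skimming, the basic shape of a catch-all query looks something like this; the dbo.Orders table and the @StartDate/@Status parameters are hypothetical, and OPTION (RECOMPILE) is just the simplest of the variants Gail covers:

    SELECT o.OrderID, o.OrderDate, o.Status
    FROM dbo.Orders AS o
    WHERE (o.OrderDate >= @StartDate OR @StartDate IS NULL)  -- NULL means "don't filter"
      AND (o.Status    =  @Status    OR @Status    IS NULL)
    OPTION (RECOMPILE);  -- recompile per execution so the optimizer can strip out unused predicates

Read her posts for the trade-offs; the recompile cost on every execution is the price you pay for a plan tailored to the parameters actually supplied.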
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 23, 2020 at 2:53 pm
I'll second Grant's recommendation of Gail's articles. Personally, I consider them to be THE "go to" articles on the subject, especially since she's keen not only on performance but also on avoiding possible SQL injection, which is still one of the leading causes of compromised data.
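To illustrate the injection point: the usual safe way to do the dynamic-SQL variant is to build only the predicates the user actually supplied, but pass the values through sp_executesql parameters so user input is never concatenated into the SQL string. A rough sketch with hypothetical names and sample inputs:

    DECLARE @ManagerID VARCHAR(10) = 'MGR0000001',  -- hypothetical inputs
            @GMID      VARCHAR(10) = NULL;

    DECLARE @sql NVARCHAR(MAX) =
        N'SELECT a.AgentID, a.AgentName FROM dbo.Agents AS a WHERE 1 = 1';

    -- Append only the filters the user supplied...
    IF @ManagerID IS NOT NULL
        SET @sql += N' AND a.ManagerID = @ManagerID';
    IF @GMID IS NOT NULL
        SET @sql += N' AND a.GMID = @GMID';

    -- ...but keep the values themselves out of the string.
    EXEC sys.sp_executesql
        @sql,
        N'@ManagerID VARCHAR(10), @GMID VARCHAR(10)',
        @ManagerID = @ManagerID,
        @GMID = @GMID;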
--Jeff Moden
Change is inevitable... Change for the better is not.
April 23, 2020 at 5:16 pm
One of my favorite horror stories goes back to the days of green bar paper and mainframes. The Army base printed out a phone directory on green bar paper every month and made several copies to distribute around the base. The trouble was that enough people were being transferred in and out that such a directory was almost useless. The solution was to get one of those old IBM CRT green screen thingies which the kids have never seen.
The problem was that the programmers had never seen CRTs before either. They took the COBOL program that had been doing the green bar printout and simply changed the output to the screen. So the entire phone directory scrolled past in one blast, without any pausing! The next attempt was to pause the directory every 10 or 15 lines and ask the user to hit a key to get the next page. That made looking up somebody like Sgt. Zwabo a bit more difficult than you would like. The final version of the program required that we get an ISAM file to replace the old non-indexed file, and a program that would jump to the start of each initial letter of the last name. You still had to scroll within each letter of the alphabet, so the 'S' took a little longer than the 'Z', but at least it worked.
In our defense, I would like to say this was many, many decades ago, when we were a simple tribe that used punch cards and magnetic tapes and barely had an understanding of alphabetizing the data.
Please post DDL and follow ANSI/ISO standards when asking for help.