May 2, 2014 at 1:28 pm
Some background to my question. I am working with healthcare data, and my data source for charges is one big transaction table with all the amounts.
To classify this data, I need to join the transaction table to many other tables with specific WHERE clauses. For example, I need all the charges that were written off as charity, so from my transaction table I need to join another table where a code is equal to a few values and another column has to be a specific value.
For reusability by other report writers, should I create a specific view, vCharityDefinition, with the codes in the view's WHERE clause, that returns the IDs to join back to the transaction table and give me the rows I need? Or maybe create that view and then another view, vCharityCharges, that merges the data I need from the transaction table with vCharityDefinition for all the rows? How deep does a hierarchy like this typically get before it becomes too confusing? Do you make these hierarchies easier to manage via naming conventions and/or schemas?
And what if my charity definition view requires a join in it, with a WHERE clause that filters both tables? Would I split it again so that no view ever filters more than one table? I have read some Stairways and an article or two on functions, and it looks like I should avoid those if possible, along with CROSS APPLY.
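Roughly what I'm picturing, just as a sketch (the table and column names here are made up for the example):

-- A view that defines "charity" once, in one place
CREATE VIEW dbo.vCharityDefinition
AS
SELECT tc.TransactionID
FROM dbo.TransactionCodes AS tc
WHERE tc.AdjustmentCode IN ('CHAR01', 'CHAR02')
  AND tc.AdjustmentType = 'WriteOff';
GO

-- A second view that joins that definition back to the transaction table
CREATE VIEW dbo.vCharityCharges
AS
SELECT c.TransactionID, c.ChargeAmount, c.PostDate
FROM dbo.Charges AS c
JOIN dbo.vCharityDefinition AS cd
  ON cd.TransactionID = c.TransactionID;
GO

Reports would then select from dbo.vCharityCharges instead of each one repeating the code logic.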
Thanks,
Phil
May 2, 2014 at 2:34 pm
Hi. You are asking a huge number of questions without giving us enough details to be helpful.
Specifically, we would need to know the layouts of the tables.
Speaking personally, I'm not the biggest fan of using views in the manner you propose. However, I admit this to be a personal preference and not a best practice.
Please send more details so we can get a feel for what you are trying to do.
Thanks
John
May 2, 2014 at 3:29 pm
Maybe I should simplify it. Let's say you need to report on Midwest states in many reports, ETLs, or stored procedures. Wouldn't it be easier to have a view for Midwest states than to type out the list of states in an IN list in every object that needs to filter to Midwest states?
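Just as a sketch of what I mean (assuming a States table with a StateCode column):

-- One place that defines the Midwest group
CREATE VIEW dbo.vMidwestStates
AS
SELECT s.StateCode, s.StateName
FROM dbo.States AS s
WHERE s.StateCode IN ('IN', 'OH', 'KY');

Every report, ETL, or procedure would then join to dbo.vMidwestStates instead of repeating the IN list.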
May 2, 2014 at 3:41 pm
Phillip.Putzback (5/2/2014)
Maybe I should simplify it. Let's say you need to report on Midwest states in many reports, ETLs, or stored procedures. Wouldn't it be easier to have a view for Midwest states than to type out the list of states in an IN list in every object that needs to filter to Midwest states?
How would you define the "Midwest states"? What happens if that definition changes? What happens when you want something more flexible?
Views might do the work now, but flexibility is something you should appreciate.
May 2, 2014 at 7:20 pm
I'd define the Midwest states however the company defines them. Let's say I was a franchise and I was limited to a certain area, say Indiana, Ohio, and Kentucky. I would think it would be better to set that standard in one view so that reports don't all have to recreate it. Then, if someday I needed to add Illinois, I would only have to change my vMidwestRegion and all the reports, procedures, etc. would pick it up, without requiring a search for dependencies.
I'm just wondering if that is the typical practice, because currently I have to republish stored procedures every time a customer decides they are missing some data, or have too much, and they throw in another code that needs to be excluded/included. Any referenced procedure gets changed in the test environment and then pushed into prod. Now, we do have some report writers who use functions like the common split-list function that takes a list of values and parses it to create a table, but from what I am reading here, it is probably better to join to a view than to a function that returns a table.
May 3, 2014 at 4:29 am
Regardless of the exact definition of Midwest states, I'm extremely averse to nesting views within views, functions within views, or functions within functions.
I recognize that creating a view means never having to figure out some difficult set of joins a second time, so that makes it extremely attractive to report writers. They have what they want in a single location and don't have to figure out the T-SQL to regenerate it.
But...
When you reference a view, the optimizer is forced to deal with every single object in that view. It determines which of them you are referencing, which of them it has to use to get your result, which of them are not necessary. That's all additional load on the optimizer. Throw in joining a view to a view, or calling a view from a view, and that load increases radically. What happens is that the optimizer can only try so many times to get you a good execution plan and then it gives up and goes with what it has. This is referred to as a timeout. If you're getting lots of timeouts on execution plans, you're probably also seeing really poor performance, or at least, very inconsistent performance as it finds different plans on different days.
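If you want to see whether that's happening to you, a rough check against the plan cache looks something like this (it just pulls cached plans whose optimization ended in a timeout; adjust for your own environment and permissions):

WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT TOP (20)
       qs.execution_count,
       qs.total_worker_time,
       qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE qp.query_plan.exist('//StmtSimple[@StatementOptmEarlyAbortReason="TimeOut"]') = 1
ORDER BY qs.total_worker_time DESC;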
While it is more difficult, I'd suggest writing T-SQL that's unique to each report. That will work better with the optimizer.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
May 4, 2014 at 10:37 am
Thanks for the info. So the only way to manage frequently used filters is to maintain them outside of SQL manually and then cut and paste them into existing or new queries?
May 4, 2014 at 12:06 pm
...customer decides they are missing some data, or have too much, and they throw in another code that needs to be excluded/included...
Excluding/including values in a report is usually a scenario where lookup tables can be helpful. A stored procedure would allow the customer to add/remove/modify values as needed.
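For example, something along these lines (the names are purely illustrative):

-- One row per code the customer wants excluded from a given report
CREATE TABLE dbo.ReportExcludedCodes
(
    ReportName varchar(50) NOT NULL,
    Code       varchar(20) NOT NULL,
    CONSTRAINT PK_ReportExcludedCodes PRIMARY KEY (ReportName, Code)
);
GO

-- A small procedure the customer (or support) can call to maintain the list
CREATE PROCEDURE dbo.AddReportExcludedCode
    @ReportName varchar(50),
    @Code       varchar(20)
AS
BEGIN
    SET NOCOUNT ON;

    IF NOT EXISTS (SELECT 1 FROM dbo.ReportExcludedCodes
                   WHERE ReportName = @ReportName AND Code = @Code)
        INSERT dbo.ReportExcludedCodes (ReportName, Code)
        VALUES (@ReportName, @Code);
END;
GO

The report query then joins (or anti-joins) to that table, so adding or removing a code becomes a data change rather than a code change and redeployment.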
May 5, 2014 at 4:35 am
Phillip.Putzback (5/4/2014)
Thanks for the info. So the only way to manage frequently used filters is to maintain them outside of SQL manually and then cut and paste them into existing or new queries?
When talking about parameters, then I'd say we're going to stored procedures, not functions or views.
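A rough sketch of the idea, with invented names:

CREATE PROCEDURE dbo.ChargesByRegion
    @Region varchar(20)
AS
BEGIN
    SET NOCOUNT ON;

    -- The region filter is a parameter, so the definition can vary per call
    SELECT c.TransactionID, c.ChargeAmount, c.PostDate
    FROM dbo.Charges AS c
    JOIN dbo.States AS s
      ON s.StateCode = c.StateCode
    WHERE s.Region = @Region;
END;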
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
May 5, 2014 at 8:15 am
Joining to a lookup table is how we've gotten around a similar issue: ever-changing values from users. For the Midwest states example, we have a table of all the states, with a column in the table to hold the value 'midwest', and we reference that column in the join.
Then, as you add or remove states from the 'midwest' group, you don't need to change your code.
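Something along these lines (the column names are just examples):

-- The grouping lives in the data, not in the code
SELECT c.TransactionID, c.ChargeAmount
FROM dbo.Charges AS c
JOIN dbo.States AS s
  ON s.StateCode = c.StateCode
WHERE s.RegionGroup = 'midwest';

-- Adding Illinois to the group later is just a data change
UPDATE dbo.States
SET RegionGroup = 'midwest'
WHERE StateCode = 'IL';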
-------------------------------------------------------------
we travel not to escape life but for life not to escape us
Don't fear failure, fear regret.
May 5, 2014 at 11:55 am
Do you use a naming convention for lookup tables so report writers have a way of noticing them before they go create their own? Their own schema or prefix perhaps?
May 5, 2014 at 12:05 pm
We have a database where we store all of our 'reference' type tables like this. We just try to give each one a descriptive name and let everyone know about it (the name of the table/database, what values it contains, and how to join to it). I think if you can get everyone on board with either one database to store this type of data, or one common table prefix for these types of tables, you should be able to limit any duplication.
-------------------------------------------------------------
we travel not to escape life but for life not to escape us
Don't fear failure, fear regret.
May 5, 2014 at 12:15 pm
Phillip.Putzback (5/5/2014)
Do you use a naming convention for lookup tables so report writers have a way of noticing them before they go create their own? Their own schema or prefix perhaps?
I agree with the other poster who suggested using a separate database for this. That is a good idea. So is using a specific schema. Either of those two approaches is clean and manageable. Prefixes on object names are just ugly for a number of reasons and should be avoided.
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
May 5, 2014 at 12:36 pm
Sean Lange (5/5/2014)
Phillip.Putzback (5/5/2014)
Do you use a naming convention for lookup tables so report writers have a way of noticing them before they go create their own? Their own schema or prefix perhaps?
I agree with the other poster who suggested using a separate database for this. That is a good idea. So is using a specific schema. Either of those two approaches is clean and manageable. Prefixes on object names are just ugly for a number of reasons and should be avoided.
It does make for some ugly and long table names. That's why we use the separate database; do that if you can. A schema would be my second option. With the schema approach, you have to be careful that someone doesn't create a table with the same name in a different schema on the same database and then accidentally delete both tables. That happened, once.
-------------------------------------------------------------
we travel not to escape life but for life not to escape us
Don't fear failure, fear regret.