March 14, 2010 at 2:50 pm
Comments posted to this topic are about the item Denormalization Strategies
March 14, 2010 at 10:16 pm
Great article. Makes me feel justified in all the time I spend sweating the details. Thanks.
March 14, 2010 at 11:07 pm
The article would have benefited from code examples, either in-line or in an attached file.
The only type of denormalization considered is full redundancy: storing the results of a complex join in a single table. There is no discussion of the disadvantages of doing this, and no analysis of where any performance benefit might come from beyond avoided join overhead. The design presented seems to be nothing more than a naive indexed view over the full result of the eliminated join.
There are many more useful types of denormalization which were not mentioned: maintaining aggregates with indexed views or even triggers, for example.
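For example, an indexed view can maintain an aggregate for you automatically. A minimal sketch, with invented table and column names (Amount is assumed NOT NULL, since indexed views cannot SUM a nullable expression):

-- A minimal sketch of an aggregate-maintaining indexed view.
-- dbo.Orders, CustomerID, and Amount are invented for illustration.
CREATE VIEW dbo.vCustomerTotals
WITH SCHEMABINDING
AS
SELECT  CustomerID,
        SUM(Amount)  AS TotalAmount,
        COUNT_BIG(*) AS RowCnt      -- COUNT_BIG(*) is required when grouping
FROM    dbo.Orders
GROUP BY CustomerID;
GO

-- The unique clustered index materializes the view; SQL Server then
-- maintains the aggregate as rows in dbo.Orders change.
CREATE UNIQUE CLUSTERED INDEX IX_vCustomerTotals
    ON dbo.vCustomerTotals (CustomerID);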
Overall, I rated it average - 3 stars.
Paul
March 15, 2010 at 2:54 am
If you denormalise the database where data is maintained, you add to the cost of data maintenance tasks and you run an increased risk of being hit by bugs from trying to maintain the same data in more than one place.
If you have separation between where you maintain data and where you present data then denormalisation of the latter makes a lot of sense from a performance perspective.
Obviously a denormalised database is going to require more work when a change is required, because data changes end up requiring schema changes.
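To illustrate the separation point: the presentation side can be a flattened copy rebuilt from the normalised source on a schedule. Purely a sketch, with made-up table names:

-- Sketch only: report.CustomerOrders and the source tables are invented.
-- The normalised schema stays the single place where data is maintained;
-- this flattened copy exists purely for reading.
TRUNCATE TABLE report.CustomerOrders;

INSERT INTO report.CustomerOrders
        (CustomerID, CustomerName, OrderID, OrderDate, Amount)
SELECT  c.CustomerID, c.Name, o.OrderID, o.OrderDate, o.Amount
FROM    dbo.Customers AS c
JOIN    dbo.Orders    AS o
        ON o.CustomerID = c.CustomerID;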
March 15, 2010 at 5:01 am
Remember that there is no such thing as a free lunch! There is always a cost for everything. Denormalization will cost you disk and maintenance, and sometimes the overhead of updating indexes (or the lack of it) may cost you performance. A well-designed database with correct indexes and correct SQL statements on well-dimensioned hardware is probably the right approach for 80 percent of all cases. And yes, for some cases denormalization is the right approach, but do not implement it everywhere just because you can.
/Håkan Winther
MCITP:Database Developer 2008
MCTS: SQL Server 2008, Implementation and Maintenance
MCSE: Data Platform
March 15, 2010 at 5:14 am
Normalize 'til it hurts...de-normalize* 'til it works!
* appropriately
March 15, 2010 at 5:27 am
I maintain, in any event, that if things are tied together by their IDs, they are tied together by pure luck, and the only way to get there is to bore down to the most basic unit. * insert obligatory conundrum here *
March 15, 2010 at 6:28 am
Paul White (3/15/2010)
Normalize 'til it hurts...de-normalize* 'til it works!
* appropriately
good point! 🙂
/Håkan Winther
MCITP:Database Developer 2008
MCTS: SQL Server 2008, Implementation and Maintenance
MCSE: Data Platform
March 15, 2010 at 6:59 am
Maybe this is due to my lack of experience, or the fact that it's been several years since I last studied database normalization, but:
It seems to me that what Timothy is proposing is simply adding a few FKs to his database design to cut down on the JOINs required to get at some data.
How is that denormalizing?
I went back to my database design textbook (as well as the paper I wrote back then on denormalization...) and could not figure out which normal form (1NF, 2NF, or 3NF) was being violated.
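The closest fit I could find in the textbook is 3NF: copying a non-key attribute up into the parent re-introduces a transitive dependency. A made-up illustration:

-- Hypothetical: if Orders carries CustomerName as well as CustomerID,
-- then OrderID -> CustomerID -> CustomerName is a transitive dependency,
-- so the table is no longer in 3NF.
CREATE TABLE dbo.Orders
(
    OrderID      int           NOT NULL PRIMARY KEY,
    CustomerID   int           NOT NULL,
    CustomerName nvarchar(100) NOT NULL   -- redundant copy: the 3NF violation
);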
Also, and please forgive my fussiness with this, but I have teachers in my family going back three generations: Timothy's article, while interesting, could have used a quick run through a spell-checker. This is not just a "purity" thing: it is a little jarring to hit typos or spelling mistakes in a technical article while you're concentrating on understanding the author's point, something like stumbling over a rock while on a good walk.
March 15, 2010 at 7:18 am
Gilles Willard (3/15/2010)
It seems to me that what Timothy is proposing is simply adding a few FKs to his database design to cut down on the JOINs required to get at some data. How is that denormalizing?
This is why I think code examples would have helped. My assumption is that he is de-normalizing the entire lower structure: putting the result of the JOINs into the parent table. That is how I read it, anyway. I agree about the rocks, by the way.
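Something like this, if I had to guess (every name here is invented, since the article gave no schema):

-- Guesswork: the article showed no code, so all names are invented.
-- Before: Orders -> Customers -> Regions needs two joins at read time.
-- After: the frequently read columns are copied up into the parent table.
ALTER TABLE dbo.Orders
    ADD CustomerName nvarchar(100) NULL,
        RegionName   nvarchar(50)  NULL;

UPDATE  o
SET     o.CustomerName = c.Name,
        o.RegionName   = r.Name
FROM    dbo.Orders    AS o
JOIN    dbo.Customers AS c ON c.CustomerID = o.CustomerID
JOIN    dbo.Regions   AS r ON r.RegionID   = c.RegionID;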
March 15, 2010 at 8:00 am
We use a de-normalized data mart - I may be using the term loosely, but it's essentially "a separate place for up-to-yesterday production data" where semi-trained users can put together queries on our core data without getting into trouble with join syntax and without us investing in a huge Crystal Reports development effort.
It makes a lot of sense here because it's not the main production transactional db: we can get away with a read-optimized box holding a few huge, wide, flat tables rather than many tables with joins.
Disk space is cheap. Why make the same Y2K mistake and add unnecessary complexity to a data mart to save what amounts to, in our case, about $100 worth of disk space?
90% of the most commonly requested reports are no longer dumped on the desk of our IT staff - a small investment in simple query-syntax training for end users solved most of the issues.
Granted, we're small and simple, but a similar approach could be scaled up.
In addition, we have some data-specific issues where this helps. For example, Customer 123 might be "Bob's Widgets" today, but later he changes it to "Bob & Sons Widgets". I understand about temporal tables, but if you've ever seen what joining a temporal table does to an execution plan (not to mention 8 or 10 temporal tables), then that's something for which I'm more than willing to add a redundant field.
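In practice the load just stamps the current name onto each row as it goes into the mart - a simplified sketch, with invented table and column names:

-- Simplified sketch; mart.FactSales and the source tables are invented.
-- Copying the customer's current name onto each fact row at load time
-- means reports never join back to the (slow) temporal history tables.
DECLARE @LastLoadDate date = DATEADD(day, -1, CAST(GETDATE() AS date));  -- yesterday

INSERT INTO mart.FactSales
        (OrderID, CustomerID, CustomerName, SaleDate, Amount)
SELECT  o.OrderID, o.CustomerID, c.Name, o.OrderDate, o.Amount
FROM    dbo.Orders    AS o
JOIN    dbo.Customers AS c ON c.CustomerID = o.CustomerID
WHERE   o.OrderDate >= @LastLoadDate;   -- incremental "up-to-yesterday" load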
--Jim
March 15, 2010 at 8:26 am
Gilles Willard (3/15/2010)
It seems to me that what Timothy is proposing is simply adding a few FKs to his database design to cut down on the JOINs required to get at some data. How is that denormalizing?
I had the same assumption... that is, until he mentioned that he decreased the number of tables. It would have been nice to have a diagram of the table design showing the suggested changes.
March 15, 2010 at 8:27 am
Jim,
Yes. Data warehouses are a totally different kettle.
March 15, 2010 at 8:41 am
Paul White (3/15/2010)
Jim,
Yes. Data warehouses are a totally different kettle.
It's normal for denormalization to be present in a data warehouse.
(Seriously, there was no pun intended.)
For best practices on asking questions, please read the following article: Forum Etiquette: How to post data/code on a forum to get the best help
March 15, 2010 at 8:59 am
Alvin Ramard (3/15/2010)
(Seriously, there was no pun intended.)
Given your track record for bad puns, Alvin, I have my doubts 😄
Benefit of the doubt. 😛