January 17, 2018 at 12:49 pm
Comments posted to this topic are about the item Compression estimates
January 17, 2018 at 5:20 pm
It'll be interesting to see someone's face when they run this code even on a giant database right after a service bounce. 😉 A little embedded documentation would go a long way.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 17, 2018 at 8:23 pm
Jeff Moden - Wednesday, January 17, 2018 5:20 PM
It'll be interesting to see someone's face when they run this code even on a giant database right after a service bounce. 😉 A little embedded documentation would go a long way.
I have run it on a 2 TB database. 😛 But you are right, it makes sense to execute it on a UAT/test database (if it is an exact copy of the production database) instead of the production database.
January 17, 2018 at 10:44 pm
Evgeny Garaev - Wednesday, January 17, 2018 8:23 PM
Jeff Moden - Wednesday, January 17, 2018 5:20 PM
It'll be interesting to see someone's face when they run this code even on a giant database right after a service bounce. 😉 A little embedded documentation would go a long way.
I have run it on a 2 TB database. 😛 But you are right, it makes sense to execute it on a UAT/test database (if it is an exact copy of the production database) instead of the production database.
I was talking about the fact that it appears that the first temporary table will have little to nothing available because the usage stats (sys.dm_db_index_usage_stats) are cleared on restart of the service. I also wonder why a lot of people don't comment their code.
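Jeff's caveat is easy to verify for yourself. A minimal sketch (assuming nothing beyond the standard DMVs): compare the instance start time against how many usage-stats rows exist for the current database — right after a service bounce, the second query returns little or nothing.

```sql
-- Usage stats in sys.dm_db_index_usage_stats are flushed when the
-- SQL Server service restarts, so check the instance uptime first.
SELECT sqlserver_start_time
FROM   sys.dm_os_sys_info;

-- Shortly after a restart this count will be at or near zero.
SELECT COUNT(*) AS rows_in_usage_stats
FROM   sys.dm_db_index_usage_stats
WHERE  database_id = DB_ID();
```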
--Jeff Moden
Change is inevitable... Change for the better is not.
January 18, 2018 at 3:02 pm
Jeff Moden - Wednesday, January 17, 2018 10:44 PM
I was talking about the fact that it appears that the first temporary table will have little to nothing available because the usage stats (sys.dm_db_index_usage_stats) are cleared on restart of the service. I also wonder why a lot of people don't comment their code.
Thanks. I see what you mean, I have another version with left outer join...
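For readers wondering what the LEFT OUTER JOIN version would look like: a sketch (column choices are illustrative) that starts from the system catalog rather than the usage-stats DMV, so every index appears even when it has no usage row — e.g. right after a service restart.

```sql
-- Drive the query from sys.indexes and LEFT JOIN to the usage stats,
-- so indexes with no recorded usage still show up (with NULL counters).
SELECT  s.name AS schema_name,
        t.name AS table_name,
        i.name AS index_name,
        us.user_seeks,
        us.user_scans,
        us.user_lookups,
        us.user_updates
FROM    sys.tables t
JOIN    sys.schemas s
        ON s.schema_id = t.schema_id
JOIN    sys.indexes i
        ON i.object_id = t.object_id
LEFT OUTER JOIN sys.dm_db_index_usage_stats us
        ON  us.object_id  = i.object_id
        AND us.index_id   = i.index_id
        AND us.database_id = DB_ID();
```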
February 15, 2018 at 3:58 pm
Jeff Moden - funny, I was thinking about how you'd react if you saw this code, and BAM, there you are!
Evgeny - great code. I'm trying to think if there's a way to run this without the RBAR. I guess you could dynamically create the INSERT... EXEC statements, but that's essentially the same thing.
I do have a couple of questions:
Does anyone else have any comments on my ideas here?
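The "dynamically create the INSERT... EXEC statements" idea Mike mentions might be sketched like this (the `#estimates` temp table is hypothetical and would need to match the result shape of the stored procedure). As he notes, it is still one procedure call per table, just without an explicit cursor.

```sql
-- Concatenate one INSERT ... EXEC per user table, then run the batch.
-- Assumes #estimates was created earlier with columns matching the
-- result set of sys.sp_estimate_data_compression_savings.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql + N'
INSERT INTO #estimates
EXEC sys.sp_estimate_data_compression_savings
     @schema_name      = ''' + s.name + N''',
     @object_name      = ''' + t.name + N''',
     @index_id         = NULL,
     @partition_number = NULL,
     @data_compression = ''PAGE'';'
FROM sys.tables t
JOIN sys.schemas s
     ON s.schema_id = t.schema_id;

EXEC (@sql);
```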
February 15, 2018 at 4:22 pm
Mike Stuart-386150 - Thursday, February 15, 2018 3:58 PM
Jeff Moden - funny, I was thinking about how you'd react if you saw this code, and BAM, there you are!
Is that a god thing or a bad thing? :pinch:
--Jeff Moden
Change is inevitable... Change for the better is not.
February 15, 2018 at 4:48 pm
Probably not a 'god' thing - lol.
Any thoughts on how to make this code behave better, i.e., no RBAR processing?
Mike
February 16, 2018 at 5:24 am
mike_the_dba - Thursday, February 15, 2018 4:48 PM
Probably not a 'god' thing - lol.
Any thoughts on how to make this code behave better, i.e., no RBAR processing?
Mike
Heh... no pun intended but I hate today's damned keyboards. They provide no tactile feedback as to whether the key press was deep enough to register.
The only way to make this non-RBAR would be to rewrite the functionality provided by sp_estimate_data_compression_savings, which is loaded with RBAR to begin with. I don't know why Microsoft continues to write single object stored procedures other than it's an easy thing to do and control loops (as opposed to row loops) aren't that expensive and so they leave such things up to the users.
I rewrote sp_spaceused for my own purposes when I needed single result set returns for many objects (it worked really well for seeing the meta-data "guts" of partitioned tables all at once, as well). You can also see the code for sp_estimate_data_compression_savings using sp_helptext, so someone could rewrite it to be less RBAR in nature, but that probably won't be me because I don't actually have a need to do so and the number of times that someone might actually use it is pretty small.
There are other scripts similar to the one posted out there that will return compression estimates even if compression is already in play. That's kind of important if you're into this kind of thing because it just may be that the original type of compression isn't the best and it would be nice to see if compressing or even decompressing might be a better thing to do. They also don't depend on data that might not be there after a service bounce.
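For context, the built-in procedure the thread keeps coming back to is called once per object, which is where the RBAR comes from. A minimal sketch of both points Jeff raises (the table name is illustrative):

```sql
-- Estimate PAGE-compression savings for a single table; one call
-- per object is the only granularity the procedure offers.
EXEC sys.sp_estimate_data_compression_savings
     @schema_name      = 'dbo',
     @object_name      = 'MyTable',          -- illustrative name
     @index_id         = NULL,               -- NULL = all indexes
     @partition_number = NULL,               -- NULL = all partitions
     @data_compression = 'PAGE';

-- View the procedure's own source to see the internal looping.
EXEC sp_helptext 'sys.sp_estimate_data_compression_savings';
```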
--Jeff Moden
Change is inevitable... Change for the better is not.
February 21, 2018 at 7:13 am
Good work but looks like you have reinvented the wheel, which does happen very often in the scripts area.
Estimate compression for all tables and indexes with both Row and Page
February 21, 2018 at 7:31 am
OnePunchDBA - Wednesday, February 21, 2018 7:13 AM
Good work but looks like you have reinvented the wheel, which does happen very often in the scripts area.
Estimate compression for all tables and indexes with both Row and Page
Heh... re-inventing the wheel is sometimes what's needed. I'm really glad that today's vehicles have something other than thick rubber bands on a wooden spoked wheel. 😉 Conversely, I'm really glad that we're not using Reuleaux triangles for wheels even though they are of constant width. 😉
--Jeff Moden
Change is inevitable... Change for the better is not.