January 25, 2019 at 11:46 pm
I think it has more to do with the fact that our methodologies around the traditional RDBMS are slow, not the tech itself, at least for most companies that aren't operating at Google scale. Meaning, the turnaround time for change with an RDBMS is slower than with other techs that adopt different methodologies for the data.
For example, I adopted a document store for my data warehouse. The methodology for the warehouse didn't change, but the way we load data before it's processed by the warehouse did. This allows the business to see the data as it lands, where it can be accessed and changed before it even becomes a model. In return, the tech becomes leaner and can change at the speed of the business, unlike the warehouse, which can't because of its methodologies.
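The "land the data first, model it later" flow described above can be sketched roughly as follows. This is a minimal illustrative stand-in, not the poster's actual implementation; the names (`StagingStore`, `build_warehouse_rows`, the `order_id`/`amount` fields) are all hypothetical:

```python
import json

class StagingStore:
    """A minimal in-memory stand-in for a document store's landing zone."""

    def __init__(self):
        self.documents = []

    def land(self, raw_json: str) -> dict:
        # Store the document exactly as received; no schema is enforced yet,
        # so the business can see data the moment it arrives.
        doc = json.loads(raw_json)
        self.documents.append(doc)
        return doc

    def query(self, predicate):
        # Raw documents can be inspected (and corrected) before modeling.
        return [d for d in self.documents if predicate(d)]


def build_warehouse_rows(store: StagingStore) -> list:
    # Only at this later, separate step is a fixed relational-style shape
    # imposed; the landing zone above stays schema-free.
    return [(d["order_id"], d.get("amount", 0.0)) for d in store.documents]


store = StagingStore()
store.land('{"order_id": 1, "amount": 25.0}')
store.land('{"order_id": 2}')  # incomplete, but visible and fixable pre-model

pending = store.query(lambda d: "amount" not in d)
rows = build_warehouse_rows(store)
```

The point of the split is that changes to what lands require no warehouse change at all; only `build_warehouse_rows` (the modeling step) has to keep up with the warehouse's slower methodology.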
January 26, 2019 at 7:48 am
Shifting gears a bit, but on the same subject: the code at the following link is a prime example of why it takes so long to make changes to database code and to troubleshoot performance issues. This isn't an exception... on most forums, including this one, it's the norm, and people just continue to generate this kind of code. Want to know what the code actually does? All you have to do is read the code, right?
https://www.sqlservercentral.com/Forums/FindPost2017981.aspx
--Jeff Moden
Change is inevitable... Change for the better is not.
January 26, 2019 at 2:13 pm
xsevensinzx - Friday, January 25, 2019 11:46 PM
I think this nails it. For as long as I can remember, data responsibilities have been abdicated to a small technical group who try to guess at, and cater for, the organisation's requirements. This has caused an all-things-to-all-men approach which is heavyweight and slow.
Work by Ronald Damhof may offer a way out. http://www.b-eye-network.com/blogs/damhof/archives/2013/08/4_quadrant_mode.php
My current role involves supplying data scientists with data as fast as possible. Part of their role is to determine whether there is enough value in the data to throw a more heavyweight process at it. The thought process is that you don't need a cabinet maker to put up a garden shed.