November 22, 2021 at 12:00 am
Comments posted to this topic are about the item "The Complexity of Modern Systems"
November 22, 2021 at 10:19 am
I feel the point on accidental complexity is especially important. Dr. Venkat Subramaniam gives some fantastic talks on the subject: https://agiledeveloper.com/aboutus.html
I also think that premature optimization, and failing to understand the business you are in, lead to unnecessary complexity. Too many systems are architected and built to a scale requirement that has more to do with corporate ego and CV-driven development than with market realities. Truly building for scale is necessary complexity; building for unnecessary scale puts the complexity squarely in the avoidable, accidental camp.
It's an old saw that Data Warehouse projects fail without an engaged business stakeholder. Personally, I think a weak or insufficiently engaged business stakeholder leads to tech-led rather than business-led projects. These are breeding grounds for accidental complexity and failure, for precisely the same reason.
November 22, 2021 at 1:25 pm
I agree with this 1000%
Rod
November 22, 2021 at 4:47 pm
Well put, David.
November 22, 2021 at 8:26 pm
We definitely need to always consider the complexity of the software we create. Even though it's just for my own use, I have been developing SQL code (it's all I know how to code anymore) to clean, validate, and reformat a batch of about 1.2 million .CSV rows exported regularly from a pretty poorly designed application with a number of glaring problems, such as program logic based on special characters embedded in the data elements.
It is a pretty complex task that requires many of the SQL string functions operating on a large temporary table. I could have done the process in fewer SQL commands and fewer passes of the table, but I chose to go step by step, with fewer nested functions and each step well documented in comments, so I can remember what it does. I also constructed the temporary table with both the original AND the modified .CSV rows side by side, so I could visually debug the process step by step as I worked on it.
This way I keep the code fairly understandable, and it still processes the multiple passes of 1.2 million rows in less than two minutes.
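If it helps to picture the approach, here's a simplified sketch of the side-by-side staging table. The table, column, and source names are made up for illustration, and the two cleanup passes are just examples, not my actual code:

-- Hypothetical sketch of the side-by-side staging approach.
CREATE TABLE #Stage
(
    RowId    INT IDENTITY(1,1) PRIMARY KEY,
    RawLine  NVARCHAR(4000) NOT NULL,  -- original .CSV row, never touched
    WorkLine NVARCHAR(4000) NOT NULL   -- working copy that each pass transforms
);

-- Load both columns with the same raw data (dbo.ImportedCsv is a stand-in
-- for wherever the exported rows land).
INSERT INTO #Stage (RawLine, WorkLine)
SELECT CsvLine, CsvLine
FROM dbo.ImportedCsv;

-- Pass 1: strip the special characters the source application embeds
-- for its program logic (the characters here are only examples).
UPDATE #Stage
SET WorkLine = REPLACE(REPLACE(WorkLine, '~', ''), CHAR(7), '');

-- Pass 2: trim stray whitespace left behind by the export.
UPDATE #Stage
SET WorkLine = LTRIM(RTRIM(WorkLine));

-- Visual debugging: compare original and cleaned rows side by side.
SELECT TOP (50) RowId, RawLine, WorkLine
FROM #Stage
WHERE RawLine <> WorkLine
ORDER BY RowId;

Each pass is one simple, commented UPDATE, so any single step can be checked against the untouched RawLine column before moving on.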
This reminds me of the old adage from early IBM days in the 1960s about using the KISS method (Keep It Simple, Stupid).
Rick
Disaster Recovery = Backup ( Backup ( Your Backup ) )