January 17, 2017 at 12:08 am
Comments posted to this topic are about the item Delaying Patches is Problematic
January 17, 2017 at 12:38 am
I think that when the previous decade or two are looked back upon from a historical perspective, manual patching will seem very backward. Steve is absolutely right in suggesting that automatic patching requires a better quality of patch, and that forms an important part of the journey.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
January 17, 2017 at 6:48 am
The problem with automatic patching over the long term is cruft. It also means bloat and all that jazz.
MS is perhaps king of backward compatibility--I have successfully run GWBasic from MS-DOS 3.1 on Vista (haven't tried it since then...) but that's some SERIOUS backward compatibility.
Unfortunately, many developers don't follow the published guidelines and things like SQL Server are constantly deprecating features.
Patching implies correcting errors in one version of the software, and that's all well and good. MS's automatic approach is the correct one for Windows and .NET--but the sheer size of both makes guaranteeing a patch works on ALL hardware combinations and ALL drivers impossible. Most of the patching stories we hear about affect a small percentage of users. But a small percentage of a billion+ users adds up quick!
Personally I think the answer is vanilla development, but vanilla does not the vast crowd enthuse.
KISS, in other words. Avoid third party libraries and controls. Avoid the latest whiz-bang features that may not be around in a month's time, much less 5 years. Avoid long chains of services, especially third-party ones. Demand API contracts that WILL NOT CHANGE.
Of course that makes for some pretty tame software :unsure: but it *works*. Call it the "AK-47 design philosophy". Crude, plain, but capable of absorbing punishment that would leave other software a smoking ruin.
Then at least you have a chance of keeping it patched and operating. :laugh:
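To make the "contracts that WILL NOT CHANGE" point concrete, here's a rough sketch (all names here, like ReportRenderer and the PdfLib adapter, are made up for illustration): the application codes against a small interface it owns, and the third-party library is confined to a single adapter class, so patching or swapping the library touches one file instead of the whole codebase.

```java
// Sketch of an app-owned, stable contract isolating a third-party dependency.
// "PdfLib" is a hypothetical library; its calls would live only in the adapter.
interface ReportRenderer {                 // the contract that will not change
    byte[] render(String title, String body);
}

class PdfLibRenderer implements ReportRenderer {
    @Override
    public byte[] render(String title, String body) {
        // All library-specific code is confined here; this stub just
        // concatenates, standing in for real third-party API calls.
        return (title + "\n" + body).getBytes();
    }
}

public class ReportService {
    private final ReportRenderer renderer;

    ReportService(ReportRenderer renderer) { this.renderer = renderer; }

    byte[] monthlyReport() {
        // The rest of the application never sees PdfLib, only the interface.
        return renderer.render("Monthly Report", "All systems nominal.");
    }

    public static void main(String[] args) {
        ReportService svc = new ReportService(new PdfLibRenderer());
        System.out.println(svc.monthlyReport().length + " bytes rendered");
    }
}
```

When the library is patched or replaced, only the adapter is rebuilt and retested; the contract stays frozen.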
January 17, 2017 at 7:57 am
roger.plowman - Tuesday, January 17, 2017 6:48 AM
I agree to some extent. MS encourages and needs third parties to work on their products, which means many of their customers will need them too. Yes, third parties will fall behind and have issues with updates and upgrades, but depending on what we are talking about, the benefit can be far too great not to use third-party tools--especially since MS does not cover everything, and if they did, some of their own products would probably conflict with each other. So it's nice to pass that responsibility to third parties 🙂
January 17, 2017 at 8:47 am
roger.plowman - Tuesday, January 17, 2017 6:48 AM
Certainly there can be bloat. There's not much to be done about that, but patching helps.
I'm not sure I agree with avoiding third parties. While not every small library might be worth updating, plenty of popular ones are updated, and more importantly, they get additional scrutiny and security patches over time. Managing that, and ensuring you have good security, is tough from an individual standpoint. There is no shortage of stories from people who assumed their code was secure, or well written, and it wasn't. Also, developing patches for your own libraries can be harder than applying patches from a well-known source: there's the time to develop them, as well as the expertise needed to build the patch.
That being said, I don't want to say buy everything instead of build it. You have to make choices, not be afraid to change if necessary, and do what's best for your organization.
January 17, 2017 at 9:46 am
I am with Steve on this one. Judicious use of third parties is essential. Sometimes no third-party software (be it tools, services, or controls) is needed, but sometimes it is.
Just as we should be careful not to make a mess that makes patching troublesome, so should third-party developers.
Automatic patching raises the bar. Those who cannot get over it will fall by the wayside. Good riddance, I say.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
January 17, 2017 at 11:30 am
Gary Varga - Tuesday, January 17, 2017 9:46 AM
It's amazing the number of big-name Java-based applications that break if you touch the underlying Java revision level.
...
-- FORTRAN manual for Xerox Computers --
January 17, 2017 at 11:36 am
jay-h - Tuesday, January 17, 2017 11:30 AM
Ugh... way too many companies jumped on the Java bandwagon because it was supposed to be much less platform-dependent, then realized it's less functional, still has to have separate versions for different OSes, and can be horribly finicky depending on which version of Java is installed.
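One defensive habit against that finickiness is to check `java.version` at startup and refuse to run on an untested runtime, rather than failing mysteriously later. A minimal sketch (the "tested" range of 8 through 11 is an arbitrary assumption for illustration; the parsing handles both the old "1.8.0_xxx" scheme and the single-number scheme used from Java 9 onward):

```java
// Sketch: fail fast when the JRE differs from the versions actually tested.
public class JavaVersionGuard {
    /** Returns the major Java version: 8 for "1.8.0_181", 11 for "11.0.2". */
    static int majorVersion(String versionString) {
        String[] parts = versionString.split("\\.");
        int first = Integer.parseInt(parts[0]);
        // Before Java 9, versions were reported as "1.x"; the major number is x.
        return first == 1 ? Integer.parseInt(parts[1]) : first;
    }

    public static void main(String[] args) {
        int major = majorVersion(System.getProperty("java.version"));
        if (major < 8 || major > 11) {  // hypothetical tested range
            throw new IllegalStateException(
                "Untested Java runtime (major version " + major + "); refusing to start.");
        }
        System.out.println("Running on tested Java " + major);
    }
}
```

It doesn't fix the underlying fragility, but it turns "horribly finicky" into an immediate, diagnosable error.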
January 17, 2017 at 11:42 am
ZZartin - Tuesday, January 17, 2017 11:36 AM
'Write once run anywhere' ranks right up there with 'your check is in the mail.'
...
-- FORTRAN manual for Xerox Computers --
January 17, 2017 at 12:09 pm
Interesting article. Substitute the word "performance" or "accuracy" for all instances of the word "security" and it makes the article three times as important.
--Jeff Moden
Change is inevitable... Change for the better is not.
February 9, 2017 at 3:35 pm
I've found generally that applying fixes quickly tends to give good results. However, I think it essential to read and understand the fix notice and test the fix before letting it get onto production machines, onto customer machines (if we controlled them), or into customers' hands (if we didn't control the machines).
Some companies should not be fully trusted, and their fixes should only be applied after extensive testing; some are bad enough that letting any of their software onto one's machines in the first place should be avoided, so that their fixes are irrelevant. But most companies produce reasonable software and generally good, reliable fixes.
At Neos the above went for servers, but we had a very different approach for clients. The client software was heavily locked down (with a lot of Windows features not included--in particular, no code installer--but I don't think MS were calling it Windows Embedded at first), and updates could be applied only by formatting the disc and installing a complete new system. So fixes were only applied by us, in-house, in the form of a complete replacement for the client software, and went through masses of testing before they got out of the door. But IE was a nightmare back then--we eventually added software to scan certain areas of the registry and destroy unauthorised entries (and the stuff they pointed to) in order to get any sort of secure client (even with everything in theory absolutely locked down) with IE in it.
Tom