April 30, 2016 at 11:05 am
Comments posted to this topic are about the item Install Cumulative Updates
April 30, 2016 at 8:58 pm
Serious question from a rather naive non-DBA:
Say your organization is running very complex software and a database licensed from a third-party vendor, and it contains thousands of tables in regular use. Is there really a way to test every permutation of things that might go wrong? I've looked on with a mixture of worry and pity at DBAs given a job like this to do. And whose responsibility is it to ensure the CUs stay current? The vendor or our DBAs? It's my understanding that some of our vendors have historically adopted the attitude that "it's our system, we've tested it to work on release X build Y (which might be ten years old) and we advise you NOT to tinker with that setup. If you apply a CU and it breaks our system, you're on your own."
Is this a common issue?
...One of the symptoms of an approaching nervous breakdown is the belief that one's work is terribly important.... Bertrand Russell
May 2, 2016 at 4:17 am
Yes, it's a common issue, and no, there's no way to test everything. Really though, most functions/calls/methods/application items are called the same way.
It's really about testing the items that are important to you: the things that are complex, have failed before, or would upset users if they broke. Test those things.
And don't try to test everything at once. A partial test plan is better than no test plan. Start building a test plan, even if it's one thing to look at. Then add to it as you learn more.
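As a concrete example of that one-item starting point, a smoke test can be as small as the script below. The table and procedure names (dbo.Orders, dbo.usp_GetOpenOrders) are placeholders, not from any real system; swap in whatever matters most to you.

SET NOCOUNT ON;
BEGIN TRY
    -- Critical table is reachable and contains data
    IF NOT EXISTS (SELECT 1 FROM dbo.Orders)
        RAISERROR('Smoke test failed: dbo.Orders is empty or missing.', 16, 1);

    -- Critical procedure still executes without error after the CU
    EXEC dbo.usp_GetOpenOrders @CustomerId = 1;

    PRINT 'Smoke test passed.';
END TRY
BEGIN CATCH
    PRINT 'Smoke test FAILED: ' + ERROR_MESSAGE();
END CATCH;

Run it before and after the CU, and add one more check each time you learn something new about what matters.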
Talk to vendors. Many of them don't want to spend time testing new CUs, but if you pressure them, and lots of other customers do too, they will test things. It should be fairly easy for them to regression test on a new CU. Fixing things that break might be hard, but few things should break. If they can't run a regression test against a new CU, perhaps they're not really testing their software at all.
This is certainly a hard problem, and a moving target. I think ultimately you, as the DBA, bear some responsibility for ensuring your system works. If you can't test, then don't upgrade. However, be aware that if you never make the effort to keep up, getting support from MS is hard. They'll want you to upgrade.
May 2, 2016 at 5:28 am
From a security perspective, a vendor that is behind the curve on SP support when the SP plugs security weaknesses is a problem vendor.
Smaller vendors have a cost and logistics problem in keeping up to date.
I think vendors should open-source their test suite and accept contributions to that test suite. That way those organisations with the resources to test and the need to upgrade have options.
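To make that concrete: if a vendor published tests built on an open-source framework such as tSQLt, customers could run them against a new CU and contribute new cases back. A sketch of what one shared test might look like; the test class VendorRegression and the function dbo.fn_OrderTotal are hypothetical names, not from any real product.

EXEC tSQLt.NewTestClass 'VendorRegression';
GO
CREATE PROCEDURE VendorRegression.[test order total is unchanged after patching]
AS
BEGIN
    -- A known order should still produce the same total after an SP/CU is applied
    DECLARE @Expected MONEY = 100.00;
    DECLARE @Actual   MONEY = dbo.fn_OrderTotal(42);
    EXEC tSQLt.AssertEquals @Expected = @Expected, @Actual = @Actual;
END;
GO
EXEC tSQLt.Run 'VendorRegression';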
May 2, 2016 at 6:59 am
When I install a new SQL Server I usually install the latest SP and the latest CU. The exception is a vendor application; then I will find out from the vendor which CU they support... and usually they don't support any. They only certify SPs. So vendors are going to have to jump on board with this. Most of our SQL Servers here run vendor apps, so applying the latest CU soon after it comes out is never gonna happen, since the vendors don't certify them. Take for instance some of ours that host 40 different apps: all 40 vendors would have to certify that CU. That is just not realistic.
My thought process would be to stay at -1 CU, but even at that, with as many servers as we have, all we would be doing is applying CUs in DEV, TEST, PQA and Prod, with testing built into the process for all of our systems at each stage.
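For keeping track of where each environment sits in a staged rollout like that, the server itself reports its patch level. A quick inventory query (note that ProductUpdateLevel returns NULL on builds that predate the property or that have no CU applied):

SELECT
    SERVERPROPERTY('ProductVersion')     AS ProductVersion,     -- full build number
    SERVERPROPERTY('ProductLevel')       AS ProductLevel,       -- RTM or SPn
    SERVERPROPERTY('ProductUpdateLevel') AS ProductUpdateLevel, -- CUn, if any
    SERVERPROPERTY('Edition')            AS Edition;

Run it against DEV, TEST, PQA and Prod and you can see at a glance which environments are still at -1 CU.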
May 2, 2016 at 10:22 am
I think Microsoft's current cycle of two-year major releases and two-month cumulative updates shows that they still don't understand their customers (the corporate world), their needs, how they use the software, and how testing occurs in that environment. I have a feeling that for something as high profile and as sensitive as database servers, people will still maintain the "if it ain't broke, don't fix it" mentality of only applying CUs that directly correspond to issues they have in their systems.
May 2, 2016 at 11:48 pm
Chris Harshman (5/2/2016)
I think Microsoft's current cycle of two-year major releases and two-month cumulative updates shows that they still don't understand their customers (the corporate world), their needs, how they use the software, and how testing occurs in that environment. I have a feeling that for something as high profile and as sensitive as database servers, people will still maintain the "if it ain't broke, don't fix it" mentality of only applying CUs that directly correspond to issues they have in their systems.
I'd disagree, Chris. They understand them well. As a whole.
Individual companies don't keep up, but they do apply patches. There was no shortage of complaints and outcries about patches that were released too slowly and too rarely. This is still an issue in the Oracle/DB2 world, where problems linger for long periods of time.
The Windows world had lots of complaints about monthly patches, but many companies learned to apply them. There are companies that do apply SQL patches and appreciate the releases every other month. There are also plenty of companies that want a release every couple of years so they can upgrade to get those features if they can use them. Not upgrade every instance, but pick and choose.
May 3, 2016 at 12:51 am
I think test automation and regression test packs are part of the solution.
The following is a great suggestion, as collaboration with vendors is critical to achieving this (a small sketch of one building block follows the quote):
David.Poole (5/2/2016)
...I think vendors should open-source their test suite and accept contributions to that test suite. That way those organisations with the resources to test and the need to upgrade have options.
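A regression test pack doesn't have to start big, either. One simple building block is a baseline of checksums over critical tables, captured before the CU and compared after. The table names here are illustrative, and CHECKSUM is a coarse comparison that can miss changes, so treat a match as a hint rather than proof.

-- Capture a baseline before applying the CU
IF OBJECT_ID('dbo.RegressionBaseline') IS NULL
    CREATE TABLE dbo.RegressionBaseline (
        CheckName  SYSNAME   NOT NULL PRIMARY KEY,
        CheckValue INT       NOT NULL,
        CapturedAt DATETIME2 NOT NULL DEFAULT SYSDATETIME()
    );

INSERT INTO dbo.RegressionBaseline (CheckName, CheckValue)
SELECT 'dbo.Orders checksum', CHECKSUM_AGG(CHECKSUM(*))
FROM dbo.Orders;

-- Compare after the CU
DECLARE @After INT = (SELECT CHECKSUM_AGG(CHECKSUM(*)) FROM dbo.Orders);
SELECT CheckName,
       CheckValue AS BeforeCU,
       @After     AS AfterCU,
       CASE WHEN CheckValue = @After THEN 'OK' ELSE 'CHANGED' END AS Result
FROM dbo.RegressionBaseline
WHERE CheckName = 'dbo.Orders checksum';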
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!