Microsoft is releasing a new version of the SQL Server engine every 12-24 months at this point. That puts pressure on all of us to learn constantly about the changes. Even if we don't plan to upgrade all of our existing instances, we often can't buy older versions once the new one is released. Many of us would like to install the latest version for new systems, but that isn't always possible, and if we can, we then have a new version to support.
Recently I was reading ArsTechnica and they posed a question about the impediments to adopting new technology at work. Their query was in the form of a survey, and they'd like you to answer it if you can. I took the survey and thought it was a bit vague, focused mostly on networking and communications rather than other areas, but perhaps that's their intent.
In any case, it can often be difficult for some companies to decide to upgrade their database servers. Licensing cost is a real concern, and while it pales in comparison to the cost of labor, I'm not sure I've seen many features that would actually reduce labor costs significantly. Most of the features might make some aspect of development easier (or quicker), but often we have solutions or code in place that we don't want to spend time changing. Upgrading older platforms is also a nebulous proposition. The improvements in HA/DR, quicker data loads, and more might be worth changing versions, but the capital cost can be hard to justify. Deciding to install the latest version for a new server isn't necessarily going to reduce any labor costs, and it could increase them, since staff now need to understand the new platform.
Perhaps hardware inhibits you from making changes. Is there a point in upgrading from SQL 2008 to SQL 2017 for working applications if you don't change hardware? Is it worth buying modern hardware? Glenn Berry would argue yes, but not everyone agrees, especially in upper management. This might even be true when upgrading minor parts of your existing system. Can you add RAM? Cores? Does your storage subsystem ever receive improvements? While hardware is certainly a capital expenditure, small changes could make a big impact for users, despite the reluctance to invest in older systems.
Are there other reasons you don't upgrade? Is staff knowledge an issue? Perhaps you have concerns about support personnel having to work with new systems. There might even be other reasons to avoid change; time might be the biggest one, with people busy on other projects. If you have some thoughts, drop a note in the discussion today. I'd be especially interested to hear from those of you who have Software Assurance and don't regularly take advantage of it to upgrade. We'd also love some short articles on your upgrade experiences if you're willing to write about them.