There's a lot of talk about AI, machine learning, and our technology systems becoming more and more capable, and in some sense "smart," in particular problem domains. I've been reading about intelligent systems for decades, but it seems that in the last 2-3 years the amount of effort being made, and the advances coming out of it, dwarf anything in the past. It's almost as if many vendors and academics have become more excited about, and focused on, building tools and systems that can learn and grow. IBM's Watson system has been under development for years, building on the company's game research into chess and Jeopardy!, and has made strides in a number of areas, proving that computers can tackle some tasks quite well, better than humans in some cases.
I read the story about Google's AlphaGo and its defeat of human Go players. That got me thinking about the human impact of the more intelligent, capable systems we might see in technology. I remember when SQL Server 2000 was released, and there was an ad campaign claiming that DBAs weren't needed for most instances. That didn't prove to be true, though in many cases a SQL Server system set up with the default configuration, limited load, and plenty of disk space might run for years before experiencing issues. When it does, whether because of a log file that has grown to fill the disk, extremely poor query code, or some other symptom of neglect, it will likely take a lot of consulting expertise to fix. However, plenty of companies have weighed that cost against the years during which no real effort was required from administrators.
I'm not recommending you run SQL Server instances this way. It's a poor practice, and I've had a few consulting clients that lost data (or their business) because of poor DBA practices. I'm just pointing out that SQL Server can be a very solid platform for many smaller applications.
Microsoft has noted in quite a few presentations that they run over a million databases on the Azure SQL Database platform without any DBAs. Their developers need to manage things, and really the focus is to force developers to automate everything that might be required. They are using machine learning to better tune their systems with minimal human intervention, and they gather enormous amounts of telemetry (hundreds of TB a day) for their systems to analyze. The various security platforms in Azure (and likely other products) also use machine learning to deliver their particular service, in a way that wouldn't even be possible in a cost-effective manner with human labor.
While developers might not care if the systems run themselves, how do sysadmins and DBAs feel? We're not close to eliminating the need for DBAs, in my opinion, but I can see that many of the issues we face, especially around tuning, might be candidates for some sort of AI in the future. After all, given a set of data, a result set (essentially a test), and a basic query, how hard can it be to "tune" that query to run more efficiently? It's not simple, but trying combinations and applying logical rules about how to query data seems like exactly the type of problem an AI system would handle well. Need to index a table? Enough workload analysis, weighing the volume and frequency of queries against some limit on the number of indexes, might mean that humans never index tables again, at least not beyond an initial choice at installation.
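In fact, SQL Server already collects much of the raw material such a system would need. As a minimal sketch (not a recommendation engine), the query below ranks the suggestions recorded in the standard missing-index DMVs by a rough benefit estimate; the weighting formula is just a common heuristic, and these views ignore the maintenance cost of extra indexes, so judgment is still required.

    -- A rough sketch: rank missing-index suggestions by an estimated benefit.
    -- These DMV counters reset on instance restart and say nothing about the
    -- write overhead a new index adds, so a human (or a smarter system) still
    -- has to weigh the trade-offs before creating anything.
    SELECT TOP (10)
           mid.statement            AS table_name,
           mid.equality_columns,
           mid.inequality_columns,
           mid.included_columns,
           migs.user_seeks,
           migs.avg_total_user_cost * migs.avg_user_impact * migs.user_seeks
                                    AS estimated_benefit
    FROM sys.dm_db_missing_index_details AS mid
    JOIN sys.dm_db_missing_index_groups AS mig
         ON mid.index_handle = mig.index_handle
    JOIN sys.dm_db_missing_index_group_stats AS migs
         ON mig.index_group_handle = migs.group_handle
    ORDER BY estimated_benefit DESC;

An AI-style tuning system would presumably layer smarter analysis on top of signals like these, deciding when a suggestion is worth the write overhead it adds rather than just surfacing a ranked list.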
How will we feel when some of the work we do now goes to machines? I do think that day is coming, though it could easily be a decade or more away. I'm not sure how quickly this will change the world, but I could see these capabilities coming to cloud platforms in less than 10 years, and that could change our industry. Perhaps 10-15 years to make it to most on-premises products, which likely means 20+ years before most companies have switched to new versions.
After I've retired, but maybe not after you have. How will you feel if the computer becomes smarter than you are at managing a database?