I remember when
SQL Server 7.0 was released. One of the big marketing pushes was that the database would manage itself. I worked in a small startup at that time, and my co-workers kept joking that I would be out of a job soon. In fact, a family member who worked in IT at the time, the late 90s, planned to leave the field because he thought it would be a low-paying, blue-collar job within a decade.
He was wrong, and so were my co-workers. Since that time, I've had a number of jobs, and never much of a problem finding one that involved production work. I've also been well compensated, and I had a nice career up through the time I started working for Redgate. It's been great since then, but, like Brent, I don't really do production work anymore.
Still, it does seem that databases ought to do a better job of managing themselves. Brent Ozar
talked about that recently, including why he doesn't like production work. He makes good points, but there's another view of this.
The hassles he mentions, the lack of self-tuning and the need to tweak new features, mean work. They mean someone needs to be paid to do this. And since many companies have lots of databases, if you can show you can do this work at scale, with tools like PoSh and
dbatools, you are worth being well paid.
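As a sketch of what "at scale" looks like, here's the kind of short PowerShell loop dbatools makes possible. The instance names and server list are hypothetical; it assumes the dbatools module is installed and you have rights on each instance:

```powershell
# Hypothetical list of instances; in practice you might pull this from a CMS or inventory table
$instances = 'SQL01', 'SQL02', 'SQL03'

foreach ($instance in $instances) {
    # Take a compressed full backup of every user database on the instance
    Get-DbaDatabase -SqlInstance $instance -ExcludeSystem |
        Backup-DbaDatabase -Type Full -CompressBackup

    # Verify the most recent backups actually restore
    Test-DbaLastBackup -SqlInstance $instance
}
```

The same pattern, one pipeline run across a list of instances, applies to most dbatools commands, which is why a single person can credibly manage hundreds of databases.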
I do think that databases ought to be more self-tuning and more self-contained. Settings like query options, client connections, jobs, backups, etc. ought to be stored inside the db. Query Store ought to be enhanced slightly to track whether a db would benefit from more memory or CPUs. Not detailed history, but based on a short period of observation, what would make this db run better?
Of course, as Brent mentioned, lots of this might not benefit vendors. After all, if we wrote better code, and knew where to focus, we might buy fewer copies of the database engine.