There is a skill that I think DBAs and sysadmins will need to develop: cloud cost analysis. I've thought this was important for quite a few years, and I've been (unsuccessfully) lobbying for cost information to be gathered and analyzed in Redgate Monitor. Hopefully, this work will get done soon, as I see more companies asking their technical people to provide analysis and justification of the resources being billed for in the cloud.
Basecamp analyzed its costs in 2023 and decided it could save money by leaving the cloud. I've seen other companies decide they were saving money in the cloud. Many, however, are likely unsure of the total return they get compared to the costs of cloud computing. I have seen some posts (like this one) that try to help you get a handle on your costs, but there is often a lot of complexity in cloud costs when multiple departments have different accounts (AWS) or subscriptions (Azure) with a provider.
In many ways, I think the large cloud vendors haven't really considered how to support large enterprises that need a lot of resources, many different administrators, and a wide variety of ways to both secure those resources and aggregate billing. I know we have an Azure account I didn't create, where I have very limited rights to do things, but I can access some resources. Meanwhile, my group has an AWS account that appears to send its billing statements to us rather than to our corporate finance group.
What can be even more difficult to untangle is that the business may want a complete picture of billing for an application, while several different technical groups are responsible for different parts of the system. Data services could be managed separately from web servers or application servers, with different groups creating, configuring, and destroying them as needs change. Billing, however, might be something accountants want to separate by function, area, or application, not by group. Knowing that all data services for all databases cost $xx/month is different from knowing that the retail website (including web servers, databases, and networking charges) costs $yy/month.
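Tag-based cost allocation is one way to bridge that gap. As a rough sketch, not a recommendation, the snippet below assumes an AWS account where a cost-allocation tag named Application has already been activated in the billing console (my naming assumption, not anything standard), and uses the Cost Explorer API to group a month's spend by that tag instead of by service:

```python
# Sketch: group a month of AWS spend by an "Application" cost-allocation tag
# rather than by service, so finance can see cost per application/area.
# Assumes boto3 credentials are configured and the tag has been activated
# as a cost-allocation tag in the AWS billing console.
import boto3

ce = boto3.client("ce")  # Cost Explorer

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-06-01", "End": "2024-07-01"},  # End is exclusive
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "TAG", "Key": "Application"}],
)

for result in response["ResultsByTime"]:
    for group in result["Groups"]:
        tag_value = group["Keys"][0]  # e.g. "Application$RetailWebsite"
        amount = group["Metrics"]["UnblendedCost"]["Amount"]
        print(f"{tag_value}: ${float(amount):,.2f}")
```

Swapping the GroupBy entry for {"Type": "DIMENSION", "Key": "SERVICE"} gives the other view described above: what all database services cost in total, rather than what one application costs across services.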
This is far different from the days when one group was responsible for most of our computing resources. Often we had a group of IT staffers who ordered and managed hardware and could link the costs of different machines to a specific application or department. Now we often don't have a central group that is even aware of all the resources the company has provisioned.
I suspect this will mean that many more technical people will be asked not only to account for the cloud resources being used, but also to split out those costs in different ways so that finance departments can aggregate the costs coming from the various technical groups. I don't know who will actually track network resources, but I suspect that data will often be included with the servers or services that access public networks.
Controlling costs and carefully removing unnecessary or unused resources is going to be an ongoing challenge in the cloud. I suspect quite a few data professionals will be integral to managing this, especially for dev/test systems that are easily forgotten over time.
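As one small example of that kind of housekeeping, the sketch below assumes instances carry an Environment tag with values like dev or test (again, my naming assumption) and lists the ones that are sitting stopped, which still incur storage charges and are easy to forget:

```python
# Sketch: find stopped dev/test EC2 instances that may have been forgotten.
# Assumes instances carry an "Environment" tag with values "dev" or "test";
# the tag name and values are illustrative, not a standard.
import boto3

ec2 = boto3.client("ec2")

pages = ec2.get_paginator("describe_instances").paginate(
    Filters=[
        {"Name": "instance-state-name", "Values": ["stopped"]},
        {"Name": "tag:Environment", "Values": ["dev", "test"]},
    ]
)

for page in pages:
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            print(
                instance["InstanceId"],
                tags.get("Name", "<unnamed>"),
                instance.get("StateTransitionReason", ""),  # includes when it stopped
            )
```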