SQLServerCentral Editorial

The AI/Human Spectrum


I was asked this question recently: is it more likely that AI will replace humans or assist them in their work?

It's a good question. Given the way AI is being hyped in 2024, many people think AI is, or soon will be, replacing people and that we'll need fewer of them at work. The simplified view is that AI can do the jobs of many people, but I'm not sure the world is that simple. What I think is more likely is that AI becomes a lever that helps a few people get more work done while potentially replacing other, less knowledgeable humans.

Maybe it's the ultimate do-more-with-less pressure that management in many organizations places on workers that has people looking to AI for help. Get more done this year, but we're not adding staff, and we might even be removing a few staffers. Maybe AI can do your job?

AI can be a lever, and I do think there are tedious tasks we might have AI take on for us. We'll need more of an agent/proxy approach to AI systems, though; today I can ask an AI to do things with text and images, but not to actually do work for me. I want an AI to actually set my Out-of-Office for me, not just give me instructions on how to do it or write the message for respondents.

The current Generative AI/LLM systems aren't really smart or intelligent, but they do process vast amounts of data and mimic the responses that other humans might give. If you work in an area that can benefit from that type of interaction, maybe an AI works well as a lever and lets you get more done on any given day. For some jobs.

However, getting more done might not be enough. If I pair program with someone who writes a bunch of poorly performing code, it takes me time to judge the quality of what they've written and then more time to fix it. In that situation, producing more code can be a burden, because the extra code requires extra rework. I might actually get less done if the code is low quality and I spend a lot of time rewriting or improving it.

On the other hand, if I'm tackling simple, tedious tasks, perhaps basic CRUD work in an application, maybe an AI can generate enough SQL, web, C#, or other code to get the job done quicker. Maybe not the most efficient code, but how many of you think the code for your internal applications is amazing? Is it good enough? Can an AI do "good-enough" work?
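To make that concrete, here's a rough sketch of the kind of code an AI assistant might hand back for a hypothetical Customers table. The table and procedure names are mine, invented purely for illustration, not taken from any real system:

-- Hypothetical table, invented purely for illustration
CREATE TABLE dbo.Customers
(
    CustomerID   INT IDENTITY(1,1) PRIMARY KEY,
    CustomerName NVARCHAR(100) NOT NULL,
    Email        NVARCHAR(256) NULL
);
GO

-- The sort of plain, "good-enough" insert procedure an AI tends to produce
CREATE PROCEDURE dbo.Customer_Insert
    @CustomerName NVARCHAR(100),
    @Email        NVARCHAR(256) = NULL
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.Customers (CustomerName, Email)
    VALUES (@CustomerName, @Email);

    -- Hand the new key back to the caller
    SELECT SCOPE_IDENTITY() AS CustomerID;
END;
GO

Nothing clever, no error handling, but for a lot of internal applications, that's about the bar.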

As with a lot of dramatic changes in technology, I find myself going back and forth on the value produced. Some days I think the tech is amazing, and some days I think it's akin to the stuff I shovel out of horse stalls. AI is in the same boat with me, and while I see potential, I know there are also downsides and potential drawbacks to its widespread use.

Certainly, the need to evaluate and judge quality is a challenge with AI, which leans me towards the lever that assists talented humans and replaces less talented ones. The other issue is cost. LLMs are expensive and use a lot of compute power. I've seen some smaller models; perhaps those, tailored with RAG or other methods that refine (and limit) their use, will overcome that, but who knows. The current models cost something to run, and someone is going to have to pay for that. Is there enough ROI to justify it?

Lastly, trust. Can we really trust an AI to give us accurate responses, or even perform work on our behalf? We have that problem with other humans now, but they work slowly compared to a computer. Can you imagine the problems that a rogue computer system could create with access to change things in the real world?

Answer my question today. Are AIs more likely to assist or replace people?
