Fearing the Wrong Thing

The only thing to fear is failing to make the transition to AI-assisted programming

By Mike Loukides
July 11, 2023
Built by Jess Dixon of Andalusia, Alabama. Can fly forward, backward, straight up, or hover in the air. Circa 1940. (source: Kobel Feature Photos on Wikimedia Commons)

There’s a lot of angst about software developers “losing their jobs” to AI, being replaced by a more intelligent version of ChatGPT, GitHub’s Copilot, Google’s Codey, or something similar. Matt Welsh has been talking and writing about the end of programming as such. He’s asking whether large language models eliminate programming as we know it, and he’s excited that the answer is “yes”: eventually, if not in the immediate future. But what does this mean in practice? What does this mean for people who earn their living from writing software?

Some companies will certainly value AI as a tool for replacing human effort, rather than for augmenting human capabilities. Programmers who work for those companies risk losing their jobs to AI. If you work for one of those organizations, I’m sorry for you, but it’s really an opportunity. Despite the well-publicized layoffs, the job market for programmers is great, it’s likely to remain great, and you’re probably better off finding an employer who doesn’t see you as an expense to be minimized. It’s time to learn some new skills and find an employer who really values you.


But the number of programmers who are “replaced by AI” will be small. Here’s why, and here’s how the use of AI will change the discipline as a whole. I did a very non-scientific study of the amount of time programmers actually spend writing code. OK, I just typed “How much of a software developer’s time is spent coding” into the search bar and looked at the top few articles, which gave percentages ranging from 10% to 40%. My own sense, from talking to and observing many people over the years, falls into the lower end of that range: 15% to 20%.

ChatGPT won’t make the 20% of their time that programmers spend writing code disappear completely. You still have to write prompts, and we’re all in the process of learning that if you want ChatGPT to do a good job, the prompts have to be very detailed. How much time and effort does that save? I’ve seen estimates as high as 80%, but I don’t believe them; I think 25% to 50% is more reasonable. If 20% of your time is spent coding, and AI-based code generation makes you 50% more efficient, then you’re really only getting about 10% of your time back. You can use it to produce more code—I’ve yet to see a programmer who was underworked, or who wasn’t up against an impossible delivery date. Or you can spend more time on the “rest of the job,” the 80% of your time that wasn’t spent writing code. Some of that time is spent in pointless meetings, but much of “the rest of the job” is understanding the user’s needs, designing, testing, debugging, reviewing code, finding out what the user really needs (that they didn’t tell you the first time), refining the design, building an effective user interface, auditing for security, and so on. It’s a lengthy list.
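That back-of-envelope arithmetic can be sketched in a few lines. The percentages here are the rough estimates from the paragraph above, not measured data:

```python
# Back-of-envelope estimate of how much total working time
# AI code generation frees up, using the article's rough numbers.

coding_share = 0.20     # fraction of a programmer's time spent writing code
efficiency_gain = 0.50  # optimistic fraction of coding time saved by AI

time_freed = coding_share * efficiency_gain
print(f"Time freed: {time_freed:.0%} of total working time")
```

Even under the optimistic 50% assumption, the answer is only 10% of total working time; with a 25% efficiency gain it drops to 5%.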

That “rest of the job” (particularly the “user’s needs” part) is something our industry has never been particularly good at. Design—of the software itself, the user interfaces, and the data representation—is certainly not going away, and isn’t something the current generation of AI is very good at. We’ve come a long way, but I don’t know anyone who hasn’t had to rescue code that was best described as a “seething mass of bits.” Testing and debugging—well, if you’ve played with ChatGPT much, you know that testing and debugging won’t disappear. AIs generate incorrect code, and that’s not going to end soon. Security auditing will only become more important, not less; it’s very hard for a programmer to understand the security implications of code they didn’t write. Spending more time on these things—and leaving the details of pushing out lines of code to an AI—will surely improve the quality of the products we deliver.

Now, let’s take a really long-term view. Let’s assume that Matt Welsh is right, and that programming as we know it will disappear—not tomorrow, but sometime in the next 20 years. Does it really disappear? A couple of weeks ago, I showed Tim O’Reilly some of my experiments with Ethan and Lilach Mollick’s prompts for using AI in the classroom. His reaction was “This prompt is really programming.” He’s right. Writing a detailed prompt really is just a different form of programming. You’re still telling a computer what you want it to do, step by step. And I realized that, after I’ve spent 20 years complaining that programming hasn’t changed significantly since the 1970s, ChatGPT has suddenly taken that next step. It isn’t a step towards some new paradigm, whether functional, object oriented, or hyperdimensional. I expected the next step in programming languages to be visual, but it isn’t that either. It’s a step towards a new kind of programming that doesn’t require a formally defined syntax or semantics. Programming without virtual punch cards. Programming that doesn’t require you to spend half your time looking up the names and parameters of library functions that you’ve forgotten about.

In the best of all possible worlds, that might bring the time spent actually writing code down to zero, or close to it. But that best case only saves 20% of a programmer’s time. Furthermore, it doesn’t really eliminate programming. It changes it—possibly making programmers more efficient, and definitely giving programmers more time to talk to users, understand the problems they face, and design good, secure systems for solving those problems. Counting lines of code is less important than understanding problems in depth and figuring out how to solve them—but that’s nothing new. Twenty years ago, the Agile Manifesto pointed in this direction, valuing:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

Despite 23 years of “agile practices,” customer collaboration has always been shortchanged. Without engaging with customers and users, Agile quickly collapses to a set of rituals. Will freeing programmers from syntax actually yield more time to collaborate with customers and respond to change? To prepare for this future, programmers will need to learn more about working directly with customers and designing software that meets their needs. That’s an opportunity, not a disaster. Programmers have labored too long under the stigma of being neckbeards who can’t and shouldn’t be allowed to talk to humans. It’s time to reject that stereotype, and to build software as if people mattered.

AI isn’t something to be feared. Writing about OpenAI’s new Code Interpreter plug-in (gradually rolling out now), Ethan Mollick says “My time becomes more valuable, not less, as I can concentrate on what is important, rather than the rote.” AI is something to be learned, tested, and incorporated into programming practices so that programmers can spend more time on what’s really important: understanding and solving problems. The endpoint of this revolution won’t be an unemployment line; it will be better software. The only thing to be feared is failing to make that transition.

Programming isn’t going to go away. It’s going to change, and those changes will be for the better.
