Artificial Intelligence

In the AI Economy, There Will Be Zero Percent Unemployment

AI developer Andrew Mayne explains why technology could create more jobs and lead to unprecedented economic growth.


(Illustration: Joanna Andreasson/DALL-E4)

I'm an AI developer and consultant. When OpenAI released a preview of Sora—its text-to-video model, capable of generating cinema-quality videos—in February, I started getting urgent requests from the entertainment industry and from investment firms. You could divide the calls into two groups. Group A was concerned about how quickly AI was going to disrupt a current business model. Group B wanted to know if there was an opportunity to get a piece of the disruptive action.

Counterintuitively, the venture capitalists and showbiz people were equally split across the groups. Hollywood producers who were publicly decrying the threat of AI were quietly looking for ways to capitalize on it. Tech startups that thought they had an inside track to disrupting Hollywood were suddenly concerned that they were about to be disrupted by a technical advance they didn't see coming.

This is the new normal: Even the disruptors are afraid they're about to be disrupted. We're headed for continuous disruption, both for old industries and new ones. But we're also headed for the longest period of economic growth and lowest unemployment in history—provided we don't screw it up.

As AI and robotics accelerate in capability and find their way into virtually every corner of our economy, the prospects for human labor have never been better. Because of AI-driven economic growth, demand for human workers will increase; virtually anyone wanting to enter the work force will have opportunities to find meaningful, well-compensated careers. How we look at work will change, and the continuous disruption will cause a lot of anxiety. But the upside will be social improvements on a scale we cannot currently comprehend. Roles and jobs may shift more frequently, but switching will be easier and more lucrative than ever.

Some of my peers in artificial intelligence have suggested that AI could eliminate the need for work altogether and that we should explore alternative economic models such as a universal basic income. I think proposals like that fail to account for the historical effect of automation on the economy and for the way economic growth increases the demand for labor.

History and basic economics both suggest that AI will not make human beings economically irrelevant. AI and robotics will keep growing the economy, because they continuously increase productivity and efficiency. As the economy grows, there's always going to be a widening gap between demand and capacity. Demand for human labor will increase even when AI and robotics are superior and more efficient, precisely because there won't be enough AI and robots to meet the growing needs.

Economic Growth Is Accelerated by Technology

The goal of commercial AI and robotics is to create efficiencies—that is, to do something more inexpensively than prior methods, whether by people or machines. You use an industrial robot to weld a car because a human welder would take too long and couldn't match the precision. You use ChatGPT to help write a grant proposal because it saves you time and means you don't have to pay someone else to help write it.

With an increase in efficiency, you can either lower prices or not lower prices and buy a private island. If you don't lower prices, you run the risk of competition from someone who sees their own path to a private island through your profits. As Amazon's Jeff Bezos once said, "Your margin is my opportunity." In a free market, you usually don't get to reap high margins forever. Eventually, someone else uses price to compete.

Along with this competition comes growth, which also drives innovation. The graphics cards built to run games like Halo and Call of Duty turned out to be really useful for the kind of computations it takes to produce an AI like ChatGPT. Thanks to that quirk of mathematics, Nvidia was able to add $2 trillion to its market cap over the last five years, and we were saved from the drudgery of writing lengthy emails and other repetitive text tasks. Along with that market cap came huge profits. Nvidia is now using those profits to fund research into everything from faster microchips to robotics. Other large companies, such as Microsoft and Google, are also pouring profits into new startups focused on AI, health, and robotics. All of this causes economic growth and cheaper and/or better goods.

Even with continuous technological disruption displacing and destroying other industries, the U.S. gross domestic product has more than doubled over the last 20 years, from $11 trillion to $27 trillion. If you compare the U.S. to the slightly more technophobic European Union, you can make the case that Europe's limits on technological growth—through legislation and through risk-averse investment strategies—are among the factors causing its slower economic growth (Europe's growth rate was 45.61 percent, compared with 108.2 percent in the U.S.).

This was the problem India created for itself after achieving independence in 1947. The government enacted so many laws to protect jobs (the "License Raj") that it stalled the country's economic development for decades; India came close to losing millions to famine and was economically eclipsed by China.

If technology is a driving force for economic growth, mixing in superintelligent AI means accelerated growth. Even if there are periods of technological stagnation—which is doubtful—applying current AI automation methods will improve efficiencies across industries. If H&R Block could replace 90 percent of its seasonal employees with AI, it would see its profits skyrocket, given that labor is its biggest expense. Those profits would be reallocated elsewhere, which would increase the potential for even more economic growth and, in turn, create better opportunities for the accountants.
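To see why the leverage is so large, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is a hypothetical stand-in, not H&R Block's actual financials, and the assumption that AI does the work at a tenth of the labor cost is purely illustrative:

```python
# Hypothetical figures for illustration only -- not actual financials.
revenue     = 1_000_000_000  # $1B in seasonal tax-prep revenue
labor_cost  = 600_000_000    # labor assumed to be the biggest expense
other_costs = 300_000_000

profit_before = revenue - labor_cost - other_costs  # $100M

# Replace 90% of seasonal labor with AI that (we assume) does the same
# work at a tenth of the cost; keep the remaining 10% of the staff.
kept_staff_cost = labor_cost // 10       # $60M for the staff retained
ai_cost         = labor_cost * 9 // 100  # $54M: 90% of the work at 1/10 cost

profit_after = revenue - kept_staff_cost - ai_cost - other_costs

print(f"before: ${profit_before:,}")  # before: $100,000,000
print(f"after:  ${profit_after:,}")   # after:  $586,000,000
```

Under these made-up numbers, profit grows almost sixfold. That surplus is the capital that gets reinvested elsewhere in the economy.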

What about physical labor? Outsourcing jobs overseas is just the final step before they're outsourced out of existence by robotics. If you don't have to build your product on the other side of the planet, you have efficiency in both cost and time to market. The less time goods spend in shipping containers crossing the Pacific, the more available capital you have. More capital means more growth.

If the last several hundred years of economic history are any indication, AI and robotics are going to increase the total surface area of the economy faster than we can comprehend. The more intense the disruption—like the assembly line, electrical power, or the internet—the greater the gains. There's not much evidence to expect anything other than huge economic growth if we continue to improve efficiency and see an acceleration as AI systems and robotics keep improving.

But what about the workers? A fast-growing economy alone doesn't guarantee that every labor sector will benefit—but other factors come into play that might.

An image generated using the prompt, "Illustration of AI as a doctor, teacher, poet, scientist, warlord, actor, journalist, artist, and coder." (Illustration: Joanna Andreasson/DALL-E4)

New Jobs at a Scale We Can't Predict

While innovation may eliminate the need for certain kinds of labor in one sector of the economy (farm equipment reduced the demand for farmworkers), it usually comes with an increase in competition for labor in other areas (increased agricultural productivity helped drive the growth of industrialization and the demand for factory workers). This allows us to switch from lower-paying jobs to higher-paying ones. Higher-paying jobs are generally ones where innovation either leverages your physical capability (moving from the shovel to a bulldozer) or amplifies your cognitive output (going from paper ledgers to electronic spreadsheets).

Predicting how this will happen is hard, because we are really bad at imagining the future. To understand where we are headed, we have to get out of the mindset that the future is just the present with robots and weird clothes.

The first photograph of a person is believed to have been taken in 1838. Imagine trying to explain to a portrait artist of that era that photography not only did not mean the death of his occupation, but that the invention would lead to an entirely new medium, motion pictures, in which an artist like James Cameron would work with a crew of thousands to shoot Avatar (2009). That single film cost more to produce (in unadjusted dollars) than the entire 1838 U.S. military budget and grossed more than the entire gross national product of that period. The number of people who worked directly on Avengers: Endgame (4,308) was more than half the size of the 1838 United States Army (7,958).

The future is bigger than we can imagine.

Change is equally hard to comprehend. Two centuries ago, 80 percent of the U.S. population worked on farms. If you told one of those farmers that in 2024 barely 1 percent of the population would work on farms, he'd have a difficult time imagining what the other 79 percent of the population would do with their time. If you then tried to explain what an average income could purchase in the way of a Netflix subscription, airplane transportation, and a car, he'd think you were insane. The same principle applies to imagining life 50 years from now.

Amazon was already a public company in 1998, when the economist and future Nobel Prize winner Paul Krugman predicted: "By 2005 or so, it will become clear that the Internet's impact on the economy has been no greater than the fax machine." Amazon is now the second-largest employer in the United States, and its cloud service powers just about everything we now do online. Although we might be able to predict the possibility of disruption, accurately gauging the transformation it brings is still impossible.

While technology causes disruption across industries and shrinks many of them, it also expands the labor force in unexpected ways. A quarter-century ago, it may have seemed inconceivable that more people would work for a startup like Google than for General Motors. Alphabet, Google's parent company, now has 182,502 employees; GM has 163,000. More people work for Apple (161,000) than for McDonald's (150,000). Meta—Facebook's parent company—has more employees than ABC, CBS, NBC, and Fox combined (67,000 vs. 66,000).

And they aren't all programmers. At Microsoft, fewer than half the employees are software engineers. At a conglomerate like Amazon, the percentage is even lower. Amazon has tens of thousands of people delivering packages, and Apple has human staff working in physical stores, despite the fact that the company also sells online. While Amazon might try to shrink its human labor force via robotics, Apple is increasing it. When Apple launched retail stores, experts called the move ill-advised, insisting that shopping was all moving online. But Apple understood that some decisions required a physical presence and a human touch. If you want to talk to a Google or Meta employee, good luck. If you want to talk to someone from Apple, just go to your nearest shopping mall. Apple bet on technological innovations and human beings, and its net profit last year ($100 billion) was greater than that of Meta and Google combined.

Demand for technically skilled people is so large that companies constantly push for an increase in the number of H-1B visas awarded each year. At any given time, the tech industry has approximately 100,000 unfilled jobs. Outside of bubbles and recessions, people laid off from tech companies generally find new jobs very quickly.

Economic growth also spurs new demand for traditional industries, such as construction. A million robots would barely make a dent in upgrading the United States' infrastructure, let alone the world's. We're going to need more human foremen and site supervisors than we're capable of producing.

If we accept that the future economy is going to be much bigger than today's and that entirely new categories of jobs will be created—even in companies working hard to replace us with robots—we still have to grapple with the argument that many current occupations will go away. The skills you and I currently possess may become obsolete. Yet there are reasons to believe people at all stages of their career paths will have an easier and more rewarding experience switching jobs than ever before.

The Retraining Myth

When President Joe Biden said "Anybody who can throw coal into a furnace can learn how to program," he might have been making a big assumption about what kind of labor the future will need and the types of jobs we will want. When we talk about job retraining, we should think about it in the context of an assembly-line worker learning how to do HVAC repair or a cashier learning how to do customer service for a car company.

Research on job retraining looks pretty bleak at first glance. The U.S. government spends about $20 billion a year on job programs and has very little data demonstrating how effective that spending is. When you dig deeper into the data, you find that there's very little correlation between dollars spent on these programs and wage increases among the people who use them. Because of this, most labor economists argue that job retraining doesn't work.

Yet people learn new skills and switch careers all the time. Switching roles within a company requires retraining, and nominally similar roles at different companies can differ substantially. In practice, retraining works extremely well. What people really mean when they say "job retraining doesn't work" is that it's not very effective when the town factory closes and a government program materializes to help the unemployed workers find new jobs.

When you look into the job retraining data, it becomes apparent that there's no single catchall solution that works in every situation for every person. The most effective efforts are the ones that find close matches for skills by providing consultation and resources, offer hands-on apprenticeship training so people can adapt on the job, and ease people into new skills while they're still employed. Artificial intelligence might end up playing a role here too: A study I commissioned while at OpenAI suggested that AI-assisted education can reduce the fear of embarrassment in learning new skills. ChatGPT will never judge you, no matter how dumb the question.

Those people who want well-paying careers and are willing to learn the skills will find jobs. By and large, even a 59-year-old won't have trouble finding meaningful work.

If that still sounds like a stretch, consider this: We have solid data that in a high-growth economy, job retraining can pull differently skilled and previously unemployable people into the work force in record numbers. The lowest unemployment rate in U.S. history was 0.8 percent in October 1944. That basically meant everyone who wanted a job and wasn't living in a shack surrounded by 100 miles of desert had a job. This included millions of women who didn't previously have opportunities to work outside the home. They were put into factories and assembly lines to fill the gap left by soldiers sent overseas and helped expand our production to new levels that didn't exist before.

Was World War II an outlier? Yes: It was a situation where there was so much demand for labor that we were pulling every adult we could into the work force. The demand in an AI-driven economy will be just as great, if not greater.

But won't we just use AI and robots to fill all those gaps? The short answer: no. The demand for labor and knowledge work will always be greater than the supply.

Never Enough Computers and Robots

David Ricardo, the classical economist, explained more than 200 years ago why we shouldn't fear robots taking over.

No, those weren't his precise words. But his theory of comparative advantage explained that even when you're able to produce something at extreme efficiency, it can make mathematical sense to trade with less-efficient producers. He used the example of England buying port wine from Portugal, even though Portugal could produce both wine and cloth with less labor than England. If England made more profit on producing textiles, it made the most economic sense to dedicate its resources to textiles and use the surpluses to slightly overpay for wine from another country. It's basic math, yet government economists will huddle around a conference room table arguing that you need to keep all production domestic while ordering out for a pizza instead of making it themselves—even if one of them happens to be a fantastic cook.
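Ricardo's arithmetic is simple enough to fit in a few lines of code. This sketch uses the illustrative labor costs from his 1817 textbook example (they are teaching numbers, not real trade data):

```python
# Labor (person-years) Ricardo assumed each country needs to produce
# one unit of each good. Note that Portugal is more efficient at BOTH.
labor = {
    "England":  {"cloth": 100, "wine": 120},
    "Portugal": {"cloth": 90,  "wine": 80},
}

# Comparative advantage is about opportunity cost, not raw efficiency:
# how much cloth does each country give up to make one unit of wine?
for country, cost in labor.items():
    print(f"{country}: 1 wine = {cost['wine'] / cost['cloth']:.2f} cloth")

# England: 1 wine = 1.20 cloth
# Portugal: 1 wine = 0.89 cloth
# Wine is relatively cheaper for Portugal and cloth for England, so both
# gain by specializing and trading, even though Portugal beats England
# at producing both goods outright.
```

Substitute AI for Portugal and human workers for England, and the same math explains why a superior AI still leaves plenty of work for people.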

When OpenAI launched ChatGPT in November 2022, we had no idea what to expect. I remember sitting in on a meeting debating the impact this "low-key research preview" would have. We came to the conclusion that it would be minor. We were wrong: ChatGPT became an instant hit, and it soon had more than 100 million users. It was the fastest adoption of an application by a startup in history. This was great, except for one problem: We couldn't meet the demand.

There weren't enough computers on the planet to handle all of the users wanting access to ChatGPT. OpenAI had to divert the supercomputer clusters intended to train newer AI systems just to meet the demand for compute. As Google and other companies realized the market potential of AI assistants like ChatGPT, they ramped up their own efforts and increased the demand for compute even more. This is why Nvidia added $2 trillion to its market cap. People quickly realized this demand wasn't going to slow down. It was going to accelerate.

The goal of commercial AI is to efficiently replace cognitive tasks done in the workplace, from handling a customer service complaint to designing your fall product line. This means replacing neurons with transistors. The paradox is that once you maximize the efficiency of something like producing farming equipment, you end up creating new economic opportunities because of the surpluses. Overall demand increases, not decreases. Even with robots building robots and AI creating new business opportunities, we'll always be short of hands and minds. Even lesser-skilled human talent will be in demand. Just as the wartime economy pulled in everyone who could work, we didn't reach near-zero unemployment because it was a nice thing to do; because of comparative advantage, it made the most economic sense.

When the Manhattan Project ran out of mathematicians, the government recruited from the clerical staff to do computations. The same happened at Bletchley Park with code breaking, and again two decades later at NASA. Today's computers handle advanced computations so fast that they can solve a problem before you could explain it to a person, yet we now cram mathematicians into rooms with whiteboards and have them think up new things for the computers to do, guided by our needs. AI won't change that. Companies are actively building systems to function as AI researchers. Those systems will eventually be smarter than the people who made them—yet that will lead to demand for even more human AI researchers.

Even people in AI have trouble understanding this argument. They can make persuasive cases for why AI and robotics will supersede human capabilities in just about every way, but they give blank looks to arguments about why the demand for intelligence and labor will always be greater than the supply. They can imagine AI replacing our way of doing things, but they have trouble understanding how it will grow demand at such a rate that we'll still need dumb, clumsy people. The publicity around shortages of high-end chips, and the realization that we can't meet present demand, let alone future demand, should make the practical economics of this hard to ignore.

Conversations about how to shape a future economy with concepts like the universal basic income are worth having—but they're trying to solve a problem that probably won't exist in the way that some people foresee. Human beings will be a vital part of economic development well into the future.

Nobody in 1838 saw motion pictures or the likes of James Cameron coming, let alone the concept of a "video game." Our near future is just as difficult to predict. But one thing seems certain: You might not need a job in 2074, but there will be one if you want it.