Robots

Are Robots Going to Steal Our Jobs?

Many technologists think so, but economists aren't so easily convinced.

"The reality is that we are facing a jobless future: one in which most of the work done by humans will be done by machines. Robots will drive our cars, manufacture our goods, and do our chores, but there won't be much work for human beings." That's the dire warning of software entrepreneur and Carnegie Mellon engineer Vivek Wadhwa.

Former Microsoft CEO Bill Gates agrees: Technology "will reduce demand for jobs, particularly at the lower end of skill set," he has predicted. Gates has also proposed taxing robots to support the victims of technological unemployment. "In the past," software entrepreneur Martin Ford declared last year, "machines have always been tools that have been used by people." But now, he fears, they're "becoming a replacement or a substitute for more and more workers." A much-cited 2013 study from the Oxford Martin Programme on Technology and Employment struck an even grimmer note, estimating that 47 percent of today's American jobs are at risk of being automated within the next two decades.

The conventional wisdom among technologists is clear: Robots are going to eat our jobs. Economists, though, tend to take a different view.

Over the past two centuries, they point out, automation has brought us lots more jobs—and higher living standards too. "Is this time different?" the Massachusetts Institute of Technology economist David Autor said in a lecture last year. "Of course this time is different; every time is different. On numerous occasions in the last 200 years scholars and activists have raised the alarm that we are running out of work and making ourselves obsolete.…These predictions strike me as arrogant."

"We are neither headed toward a rise of the machine world nor a utopia where no one works anymore," said Michael Jones, an economist at the University of Cincinnati, last year. "Humans will still be necessary in the economy of the future, even if we can't predict what we will be doing." When the Boston University economist James Bessen analyzed computerization and employment trends in the U.S. since 1980, his study concluded that "computer use is associated with a small increase in employment on average, not major job losses."

Who is right, the terrified technologists or the totally chill economists?

This Time Is Always Different

In 1589, Queen Elizabeth I refused to grant a patent to William Lee for his invention of the stocking frame knitting machine, which sped up the production of wool hosiery. "Thou aimest high, Master Lee," she declared. "Consider thou what the invention could do to my poor subjects. It would assuredly bring to them ruin by depriving them of employment, thus making them beggars." In the early 19th century, English textile workers calling themselves Luddites famously sought to protect their livelihoods by smashing industrial weaving machines.

The economist John Maynard Keynes warned in 1930 that the "means of economising the use of labour [is] outrunning the pace at which we can find new uses for labour," resulting in the "new disease" of "technological unemployment." In 1961, Time warned: "Today's new industries have comparatively few jobs for the unskilled or semiskilled, just the class of workers whose jobs are being eliminated by automation." A 1989 study by the International Metalworkers Federation forecast that within 30 years, as little as 2 percent of the world's current labor force "will be needed to produce all the goods necessary for total demand." That prediction has just two years left to come true.

This year the McKinsey Global Institute, the research arm of the consultancy McKinsey & Company, issued a report that analyzed the potential impact of automation on individual work activities rather than entire occupations. The McKinsey researchers concluded that only 5 percent of occupations are fully automatable using currently available technologies. On the other hand, the report also estimated that "about half of all the activities people are paid to do in the world's workforce could potentially be automated by adapting currently demonstrated technologies"—principally physical work in highly structured and predictable environments, along with routine data collection and processing.

In March, the consultancy PricewaterhouseCoopers concluded that 38 percent of jobs in the U.S. are at high risk of automation by the early 2030s. Jobs in transportation and storage, retail and wholesale trade, food service and accommodation, administrative and support services, insurance and finance, and manufacturing are particularly vulnerable.

And that 2013 study from Oxford's Martin Programme on Technology and Employment? Bessen points out that of the 37 occupations it identified as fully automatable—including accountants, auditors, bank loan officers, messengers, and couriers—none has been completely automated since the study was published. Bessen further notes that of the 271 jobs listed in the 1950 Census, only one has truly disappeared for reasons that can largely be ascribed to automation: the elevator operator. In 1900, 50 percent of the population over age 10 was gainfully employed. (Child labor was not illegal in most states back then, and many families needed the extra income.) In 1950, it was 59 percent of those over age 16. Now the civilian labor force participation rate stands at 63 percent.

Of course, the jobs that people do today—thanks largely to the high productivity made possible by technological progress—are vastly different from those done at the turn of the 20th century.

Are We Working Less?

In a 2015 essay titled "Why Are There Still So Many Jobs?," MIT economist Autor points out that most new workplace technologies are designed to save labor. "Whether the technology is tractors, assembly lines, or spreadsheets, the first-order goal is to substitute mechanical power for human musculature, machine-consistency for human handiwork, and digital calculation for slow and error-prone 'wetware,'" he writes. Routinized physical and cognitive activities—spot welding car chassis on an assembly line or processing insurance claim paperwork at a desk—are the easiest and first to be automated.

If the technologists' fears are coming true, you'd expect to see a drop in hours worked at middle-skill, middle-wage jobs—the ones politicians often refer to as "good jobs." And indeed, in 2013, Autor and David Dorn of the Center for Monetary and Financial Studies in Madrid found a significant decrease in hours worked in construction, mining, and farm work between 1980 and 2005; the researchers concluded that this was because the routine manual and cognitive activities required by many of those middle-class occupations were increasingly being performed by ever cheaper and more capable machines and computers. They also found a 30 percent increase in hours spent working at low-skill jobs that require assisting or caring for others, from home health aides to beauticians to janitors.

But this year a better-designed study by two other economists—Jennifer Hunt of Rutgers and Ryan Nunn of the Brookings Institution—challenged that conclusion. Instead of focusing on the average wages of each occupation, Hunt and Nunn sorted hourly workers into categories by their real wages, reasoning that occupational averages could mask important trends.

Hunt and Nunn found that men experienced downward wage mobility in the 1980s, due largely to deunionization and the decline of manufacturing. Beginning around 1990, the percentage of both men and women in their lower-wage category declined, while the percentage in the higher-wage group rose.

After adjusting for business cycle fluctuations, they found that there was a small increase in the percentage of workers in their best-compensated category (people earning more than $25.18 an hour) between 1979 and 2015, with very little change in the other groups—certainly nothing that looked like the radical polarization Autor and others fear.

So far, robots don't seem to be grabbing human jobs at an especially high rate. Take the much-touted finding by MIT economist Daron Acemoglu and Boston University economist Pascual Restrepo in a working paper released in March. Since 1990, they say, each additional industrial robot in the U.S. results in 5.6 American workers losing their jobs. Furthermore, the addition of one more robot per thousand employees cuts average wages by 0.5 percent. The pair defined a robot as a programmable industrial machine that operates in three dimensions—think of spot welding and door handling robots on an automobile assembly line.

In total, Acemoglu and Restrepo report that the number of jobs lost to robots since 1990 is somewhere between 360,000 and 670,000. By contrast, last year some 62.5 million Americans were hired into new jobs, while 60.1 million either quit or were laid off from old ones, according to the Bureau of Labor Statistics. The impact of robots, in other words, is relatively small. Moreover, when the researchers included a measure of the change in computer usage at work, they found a positive effect, suggesting that computers tend to increase the demand for labor.
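
To get a feel for those magnitudes, here is a minimal back-of-the-envelope sketch using the figures cited above (spreading the cumulative losses evenly over 1990–2016 is my simplifying assumption, not the authors'):

```python
# Back-of-the-envelope scale comparison using the figures cited above.
robot_job_losses = (360_000, 670_000)  # Acemoglu & Restrepo, cumulative since 1990
annual_hires = 62_500_000              # BLS: Americans hired into new jobs last year

years = 2016 - 1990                    # assume losses spread evenly over ~26 years
for losses in robot_job_losses:
    per_year = losses / years
    share_of_hires = per_year / annual_hires
    print(f"{losses:,} total -> ~{per_year:,.0f} per year, "
          f"or {share_of_hires:.3%} of a single year's hires")
```

Even at the high estimate, robot-attributed losses come to roughly 26,000 jobs a year—a rounding error next to the tens of millions of hires and separations that churn through the U.S. labor market annually.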

In 2015, economists Georg Graetz of Uppsala University and Guy Michaels of the London School of Economics analyzed the effects of industrial robots on employment in 17 countries between 1993 and 2007. In contrast to the Acemoglu and Restrepo study, "We find a negative effect of robots on low-skilled workers' employment," says Michaels in an interview, "but no significant effect on overall employment." Their study also found that increased use of robots added an estimated 0.37 percentage points to annual economic growth.

Where Did the Jobs Go? Look Around!

In a 2011 television interview, President Barack Obama worried that "a lot of businesses have learned to become much more efficient with a lot fewer workers." To illustrate his point, Obama noted, "You see it when you go to a bank and you use an ATM, you don't go to a bank teller." But the number of bank tellers working in the U.S. has not gone down. Since 1990, their ranks have increased from around 400,000 to 500,000, even as the number of ATMs rose from 100,000 to 425,000. In his 2016 study, Bessen explains that the ATMs "allowed banks to operate branch offices at lower cost; this prompted them to open many more branches, offsetting the erstwhile loss in teller jobs." Similarly, the deployment of computerized document search and analysis technologies hasn't prevented the number of paralegals from rising from around 85,000 in 1990 to 280,000 today. Bar code scanning is now ubiquitous in retail stores and groceries, yet the number of cashiers has increased from just over 2 million in 1990 to 3.2 million today, outpacing U.S. population growth over the same period.

This illustrates why most economists are not particularly worried about widespread technological unemployment. When businesses automate to boost productivity, they can cut their prices, increasing the demand for their products—which in turn requires more workers. The lower prices also let consumers take the money they save and spend it on other goods and services, and that increased demand creates more jobs in those other industries. New products and services create new markets and new demands, and the result is more new jobs.

You can think of this another way: The average American worker today would have to work only 17 weeks per year to earn the income his counterpart brought in 100 years ago, according to Autor's calculations—the equivalent of roughly 13 hours of work per week at a standard 40-hour week. Most people prefer to work more, of course, so they can afford to enjoy the profusion of new products and services that modern technology makes available, including refrigerators, air conditioners, next-day delivery, smartphones, air travel, video games, restaurant meals, antibiotics, year-round access to fresh fruits and vegetables, the internet, and so forth.
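
The arithmetic behind that equivalence, assuming a standard 40-hour workweek (the workweek length is my assumption; Autor's underlying calculation may use different inputs):

```python
# Worked arithmetic for Autor's "17 weeks per year" figure,
# assuming a standard 40-hour workweek (an assumption, not from the source).
weeks_needed = 17
hours_per_week = 40
weeks_per_year = 52

hours_per_year = weeks_needed * hours_per_week       # 680 hours
spread_over_year = hours_per_year / weeks_per_year   # ~13.1 hours per week
share_of_full_time = weeks_needed / weeks_per_year   # ~33% of a full-time year
print(f"{hours_per_year} hours/year = ~{spread_over_year:.1f} hours/week, "
      f"about {share_of_full_time:.0%} of a full-time year")
```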

But if technologically fueled productivity improvements boost job growth, why are U.S. manufacturing jobs in decline? In a new study published in April, Bessen finds that as markets mature, comparatively small changes in the price of a product do not call forth a compensating increase in consumer demand. Thus, further productivity gains bring reduced employment in relatively mature industries such as textiles, steel, and automobile manufacturing. Over the past 20 years, U.S. manufacturing output increased by 40 percent while the number of Americans working in manufacturing dropped from 17.3 million in 1997 to 12.3 million now. On the other hand, Bessen projects that the ongoing automation and computerization of the nonmanufacturing sector will increase demand for all sorts of new services. In fact, he forecasts that in service industries, "faster technical change will…create faster employment growth."
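
One way to see the mechanism at work in both cases: whether a productivity gain adds or cuts jobs turns on how strongly demand responds to falling prices. Below is a stylized sketch under textbook assumptions—constant-elasticity demand and full pass-through of cost savings into prices; the specific numbers are illustrative, not Bessen's:

```python
# Stylized illustration: a productivity gain halves the labor needed per unit
# and (by assumption) halves the price. Whether total employment rises or
# falls then depends on the price elasticity of demand.
def employment_ratio(elasticity: float, productivity_gain: float = 2.0) -> float:
    """New employment / old employment, with constant-elasticity demand
    and full pass-through of unit cost savings into the price."""
    price_ratio = 1 / productivity_gain            # price falls with unit labor cost
    quantity_ratio = price_ratio ** (-elasticity)  # demand response to cheaper goods
    labor_per_unit_ratio = 1 / productivity_gain   # fewer workers needed per unit
    return quantity_ratio * labor_per_unit_ratio

for eps, market in [(1.5, "young market, elastic demand"),
                    (0.5, "mature market, inelastic demand")]:
    print(f"elasticity {eps} ({market}): employment x{employment_ratio(eps):.2f}")
```

With elastic demand, cheaper goods call forth so many more buyers that employment rises despite fewer workers per unit; with inelastic demand—Bessen's mature textile, steel, and auto markets—output barely grows and employment falls.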

Since the advent of the smartphone just 10 years ago, for example, an "app economy" has emerged that "now supports an astounding 1.66 million jobs in the United States," Progressive Policy Institute economist Michael Mandel reports. According to the Entertainment Software Association, more than 220,000 jobs now depend on the game software industry. The IBISWorld consultancy estimates that 227,000 people work in web design, while the Biotechnology Innovation Organization says that U.S. bioscience companies employ 1.66 million people. Robert Cohen, a senior fellow at the Economic Strategy Institute, projects that business spending on cloud services will generate nearly $3 trillion more in gross domestic product and 8 million new jobs from 2015 to 2025.

In 2014, Siemens USA CEO Eric Spiegel claimed in a Washington Post op-ed that 50 percent of the jobs in America today didn't exist 25 years ago—and that 80 percent of the jobs students will fill in the future don't exist today. Imagine, for instance, the novel occupations that might come into being if the so-called internet of things and virtual/augmented reality technologies develop as expected.

In a report this year for the Technology CEO Council, Mandel and analyst Bret Swanson strike a similar note, arguing that the "productivity drought is almost over." Over the past 15 years, they point out, productivity growth in digital industries has averaged 2.7 percent per year, whereas productivity in physical industries grew at just 0.7 percent annually.

According to the authors, the digital industries currently account for 25 percent of private-sector employment. "Never mind the evidence of the past 200 years; the evidence that we have of the past 15 years shows that more technology yields more jobs and better jobs," says Swanson.

Mandel and Swanson argue that the information age has barely begun, and that the "increased use of mobile technologies, cloud services, artificial intelligence, big data, inexpensive and ubiquitous sensors, computer vision, virtual reality, robotics, 3D additive manufacturing, and a new generation of 5G wireless are on the verge of transforming the traditional physical industries." They project that applying these information technologies to I.T.-laggard physical industries will boost U.S. economic growth from its current annual 2 percent rate to 2.7 percent over the next 15 years, adding $2.7 trillion in annual U.S. economic output by 2031, and cumulatively raising American wages by $8.6 trillion. This would increase U.S. GDP per capita from $52,000 to $77,000 by 2031.
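
As a rough sanity check on that endpoint, compounding the cited per-capita figure at the cited growth rates over 2017–2031 lands in the report's neighborhood (a simplification: aggregate growth and per-capita growth differ by population change, and the report's own model surely has more moving parts):

```python
# Rough compounding check on the Mandel-Swanson projection cited above.
gdp_per_capita_2017 = 52_000
years = 2031 - 2017  # 14 years
for rate, label in [(0.020, "baseline ~2.0% growth"),
                    (0.027, "projected 2.7% growth")]:
    final = gdp_per_capita_2017 * (1 + rate) ** years
    print(f"{label}: ~${final:,.0f} per capita in 2031")
```

At 2.7 percent the figure compounds to roughly $75,000—close to the report's $77,000, with the gap plausibly reflecting the per-capita-versus-aggregate distinction.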

The Unknown Future

"Electrification transformed businesses, the overall economy, social institutions, and individual lives to an astonishing degree—and it did so in ways that were overwhelmingly positive," Martin Ford writes in his book Rise of the Robots. But why doesn't Ford mourn all the jobs that electrification destroyed? What about the ice men? The launderers? The household help replaced by vacuum cleaners and dishwashers? The firewood providers? The candle makers?

To ask is to answer. Electricity may have killed a lot of jobs, but on balance it created many more. Developments in information technology will do the same.

Imagine a time-traveling economist from our day meeting with Thomas Edison, Henry Ford, and John D. Rockefeller at the turn of the 20th century. She informs these titans that in 2017, only 14 percent of American workers will be employed in agriculture, mining, construction, and manufacturing, down from around 70 percent in 1900. Then the economist asks the trio, "What do you think the other 86 percent of workers are going to do?"

They wouldn't know the answer. And as we look ahead now to the end of the 21st century, we can't predict what jobs workers will be doing then either. But that's no reason to assume those jobs won't exist.

"I can't tell you what people are going to do for work 100 years from now," Autor said last year, "but the future doesn't hinge on my imagination." Martin and other technologists can see the jobs that might be destroyed by information technology; their lack of imagination blinds them to how people will use that technology to conjure millions of occupations now undreamt of.