Young Men Are Playing Video Games Instead of Getting Jobs. That's OK. (For Now.)
Video games, like work, are basically a series of quests composed of mundane and repetitive tasks: Receive an assignment, travel to a location, overcome some obstacles, perform some sort of search, pick up an item, and then deliver it in exchange for a reward—and, usually, another quest, which starts the cycle all over again. You are not playing the game so much as following its orders. The game is your boss; to succeed, you have to do what it says.
This is especially true in the genre that has come to dominate much of big-budget game development, the open-world action role-playing game, which blends the hair-trigger violence of traditional shooters with the massive explorable landscapes of games like Grand Theft Auto and the intricate craft and character leveling systems of pen-and-paper tabletop fantasy games like Dungeons & Dragons.
The games consist of a series of assignments combined with a progression of skills, awards, and accomplishments, in which you, the player, become more powerful and proficient as a result of your dedication. And dedication is what these games require. It is not uncommon for single-player games to take upward of 60 hours to complete. Online multiplayer variants can easily chew up hundreds or even thousands of hours, with the most accomplished players putting in dozens of hours a week for months on end. Although these games are usually packaged in a veneer of fantasy, they work less like traditional entertainment and more like employment simulators.
So it is perhaps not surprising that for many young men, especially those with lower levels of educational attainment, video games are increasingly replacing work. Since 2000, men in their 20s without a bachelor's degree have been working considerably less and spending far more time engaged in leisure activities, which overwhelmingly means playing video games. Over the same time frame, this group of men has also grown more likely to be single, to have no children, and to live with parents or other family members.
The surprising thing about the stereotypical aimless young man, detached from work and society, playing video games in his parents' basement: He's actually happier than ever.
I can relate.
In March, I spent a chilly weekend playing Mass Effect: Andromeda. This is the fourth installment in one of the gaming world's most popular franchises, and it represents something of a reboot for the science fiction series, with a new protagonist and a new galaxy to explore.
In the game, I played Ryder, a young woman—my choice—who, after the prologue, becomes a "pathfinder," the lead explorer on a joint human-alien mission to settle the Andromeda galaxy. That role came with considerable responsibilities. When I arrived in the new galaxy, I discovered that the mission had already gone awry: The life-sustaining planets that I expected to find had been made barren by a mysterious alien menace, colonist ships had gone missing, and the space station that was supposed to serve as the mission hub had been divided by cultural disputes, bureaucratic infighting, and resource shortages. In addition to shooting a predictably large quantity of menacing aliens and robots, my job—and it was most definitely a job—was to solve all of these problems.
That meant collecting basic resources from nearby planets and developing them into useful technologies. It meant setting up outposts for colonists and assigning them to various tasks. It meant resolving thorny personality disputes between various administrators and staffers on the space station, and assisting other staffers with their work.
In some ways I was playing sci-fi CEO, but in other ways it felt more like being an all-purpose flunky. Much of the game involves menial chores. My character was equipped with a handheld scanner that could catalog items and provide technical information about many of the objects she encountered. The game offers rewards for scanning just about everything, which means that play frequently descends into a kind of cataloging and list making: hunting down items to scan, using the scanner to trace wires and connections, and solving ancient alien relic puzzles that rather suspiciously resemble games of Sudoku. Land on any of the game's planets and pull up a map, and you'll see that it's covered with icons and markers, each of which represents a task to complete, a person to talk with, an item to acquire.
Completing all of these objectives would take dozens of hours. Most are not even directly related to the game's core storyline. Instead they are what gamers refer to as side-quests—secondary activities and sideline time wasters embedded inside a massive virtual time waster.
The multitude of priorities and objectives means that there is always someplace to go, something to do, someone to talk to, or some object to fetch. Like many games, Mass Effect: Andromeda is designed to entice the player with a cycle of discovery, frustration, achievement, and advancement that never fully resolves, because each in-game accomplishment leads to the discovery of more tasks and objectives, more upgrades and abilities, more work to be done. At its best, it was satisfying, distracting, and frustrating, all at once. I found myself feeling not only that I could play forever, but that I must.
Andromeda is not the most entertaining game I have ever played, but with its endless array of tasks to complete and objectives to achieve, it is among the most job-like in its approach to game design. At times it hit rather close to home.
The game boasts an intricate conversation system, and a substantial portion of the playtime is spent talking to in-game characters, quizzing them for information (much of which adds color but is ultimately irrelevant), asking them for assignments, relaying details of your progress, and then finding out what they would like you to do next.
At a certain point, it started to feel more than a little familiar. It wasn't just that it was a lot like work. It was that it was a lot like my own work as a journalist: interviewing subjects, attempting to figure out which one of the half-dozen questions they had just answered provided useful information, and then moving on to ask someone else about what I had just been told.
Eventually I quit playing. I already have a job, and though I enjoy it quite a bit, I didn't feel as if I needed another one.
But what about those who aren't employed? It's easy to imagine a game like Andromeda taking the place of work.
The economy has rebounded since the Great Recession, and national unemployment now sits below 5 percent. But that figure counts only people who are actively seeking work. Even as the unemployment rate has dropped, labor force participation—the share of people who either work or want to work—has dwindled. In particular, young men without college degrees have become increasingly detached from the labor market. And what they appear to be doing instead is playing video games.
In a September 2016 piece for Chicago Booth Review, adapted from a speech delivered the previous June, University of Chicago economist Erik Hurst described his 12-year-old son's fanatical devotion to gaming. "If it were up to him," Hurst wrote, "I have no doubt he would play video games 23 and a half hours per day. He told me so. If we didn't ration video games, I am not sure he would ever eat. I am positive he wouldn't shower."
For Hurst, the pull that games exerted on his son helped illustrate what's happening to young men in the broader economy. Between 2000 and 2015, he said, the percentage of lower-skilled men aged 21 to 55 who had a job dropped from 84 percent to 77 percent, "a massive change relative to historical levels." The decline was particularly acute among men in their 20s, whose employment rate fell 10 points over the same period, from 82 percent to 72 percent. In 2015, he noted, 22 percent of men in their 20s who lacked a college degree had not worked a single day during the previous year—up from 10 percent in 2000.
In 2000, just 35 percent of lower-skilled young men lived with family. Now a slight majority of lower-skilled young men reside with their parents, whether they're employed or not. For those who lack employment, the figure is 70 percent. The vast majority of low-skilled young men—roughly 90 percent—have not built families. "If you're not working, as a man in your 20s with less than a bachelor's degree, you're pretty much single and childless," Hurst said last year on the podcast EconTalk.
In follow-up interviews, the economist has stressed that his research is preliminary and ongoing, and the particulars are subject to revision. Working papers released in the fall of 2016 and the spring of 2017 told essentially the same story: Low-skilled men are working less and living at home more.
Instead of working, they are playing video games. About three quarters of the increase in leisure time among men since 2000 has gone to gaming. Total time spent on computers, including game consoles, has nearly doubled.
You might think that this would be demoralizing. A life spent unemployed, living at home, without romantic prospects, playing digital time wasters does not sound particularly appealing on its face.
Yet this group reports far higher levels of overall happiness than low-skilled young men from the turn of the 21st century. In contrast, self-reported happiness for older workers without college degrees fell during the same period. For low-skilled young women and men with college degrees, it stayed basically the same. A significant part of the difference comes down to what Hurst has called "innovations in leisure computer activities for young men."
The problems come later.
A young life spent playing video games can lead to a middle age without marketable skills or connections. "There is some evidence," Hurst pointed out, "that these young, lower-skilled men who are happy in their 20s become much less happy in their 30s or 40s." So are these guys just wasting their lives, frittering away their time on anti-social activities?
Hurst describes his figures as "staggering" and "shocking"—a seismic shift in the relationship of young men to work. "Men in their 20s historically are a group with a strong attachment to the labor force," he writes. "The decline in employment rates for low-skilled men in their 20s was larger than it was for all other sex, age, and skill groups during this same time period."
But there's another way to think about the change: as a shift in their relationship to unemployment. Research has consistently found that long-term unemployment is one of the most dispiriting things that can happen to a person. Happiness levels tank and never recover. One 2010 study by a group of German researchers suggests that it's worse, over time, for life satisfaction than even the death of a spouse. What video games appear to do is ease the psychic pain of joblessness—and to do it in a way that is, if not permanent, at least long-lasting.
For low-skilled young men, what is the alternative to playing games? We might like to imagine that they would all become sociable and highly productive members of society, but that is not necessarily the case.
"Every society has a 'bad men' problem," says Tyler Cowen, an economist at George Mason University. Cowen's 2013 book Average Is Over envisions a future in which high-productivity individuals create the vast majority of society's economic value, while lower-skilled individuals spend their days on increasingly inexpensive entertainment that helps order their lives and allow for a baseline level of daily happiness. Hurt's research suggests that we may be witnessing the beginnings of that world already.
In terms of behavior and performance, Cowen says, "Variance for men is always higher." That is, men are more likely to exhibit extremes of character and behavior, both positive and negative. A whole generation of men obsessively playing video games during their prime decades of life may not be ideal, but most would agree that it is preferable to riots.
Are riots a possibility? In his 2016 book The Wealth of Humans, journalist Ryan Avent writes about the evolution of the labor market from the Industrial Revolution through the coming era of automation and artificial intelligence, which he argues will make millions of jobs obsolete. Avent envisions a possible future in which wages for low-skilled individuals grow ever smaller, and in which many have a difficult time finding work at all. Even if we eventually adapt to the new labor market—finding, for example, work servicing the computers and machines that perform much of the world's labor—the economic disruption will be severe, and it's likely to produce political and social upheaval.
"In a very low-wage world," Avent writes, "more people will opt out of work. That will inevitably strain the social-safety net; societies will be ever more clearly divided into those who work and pay for social programmes and those who live off of them." If this dynamic is not handled with care, he worries that it could lead to "intense political conflict" between those two groups. Indeed, Avent raises the possibility that the global rise of populist movements on both the left and right may be an early signal that such tensions are already building.
Discussions about low-skilled unemployment and the future of work—or the lack of it—invariably turn to the prospect of reforming the welfare state by instituting a universal basic income. In many ways, the heart of the issue is not gaming but worklessness and its consequences.
Basic income proposals vary (see "The Indestructible Idea of the Universal Basic Income"), but the core idea is to provide everyone, rich or poor, with a guaranteed payment that would allow him or her to live without working, or at least significantly reduce the need to generate income. The goal would be to create a system in which everyone is entitled to a minimum standard of living, one that is sufficient but far from luxurious.
Left unanswered is the question of what happens after one's basic needs are provided for. Individuals vary, but virtually everyone seeks more out of life than low-level material subsistence. People whose survival needs are met seek power and growth, status and social connection—benefits even the most generous imaginable basic income cannot provide.
What, in other words, would people do with their time?
Hurst's research suggests that many people, or at least many low-skilled young men, would use it to play video games. Those living with and off of their parents are, in effect, already receiving a kind of basic income, administered privately at the family level. That is enough to survive, but for most people it is not enough to feel content.
That's where games come in. They don't put food on the table. But they do provide, at least in the short to medium term, a sense of focus and success, structure and direction, skill development and accomplishment. Spend any time reading video game reviews, and you'll find that two of the most common terms of praise are that a game made the reviewer "feel powerful" and that it provided a "sense of achievement." Games, with their endless task lists and character-leveling systems, their choice architectures and mission checklists, are purpose generators. They bring order to gamers' lives.
Even the most open-ended games tend to offer a sense of progress and direction, completion and commitment. In other words, they make people happy—or at least happier, serving as a buffer between the player and despair. Video games, you might say, offer a sort of universal basic income for the soul.
As with welfare and transfer programs, there are trade-offs: Games can cushion individuals from life's difficulties and hardships, but they can also create disincentives to work by making unemployment more pleasurable. In economists' terms, video games raise an individual's reservation wage—the amount of compensation it takes to make someone choose to work.
A crucial difference, of course, is that playing video games does not impose a direct burden on taxpayers. On the other hand, as Avent notes, a world in which prime-age workers are playing games rather than building employment skills and careers is one that is likely to see slower economic growth, and to strain the social safety net.
Yet games may also reduce the sort of political frictions that worry both Cowen and Avent: If such entertainments can make life more comfortable for the unemployed, they reduce the likelihood of anger and upheaval. Appealing, engaging games may raise the opportunity cost of both work and revolution.
What exactly does it mean for a game to be appealing and engaging? What does it mean for games to be fun—so much fun, in some cases, that players will devote hundreds or even thousands of hours a year to playing them?
The sheer amount of time that many players put into games is stunning to consider. A relatively modest single-player game like The Last of Us might take 10 to 20 hours to complete. A game like Mass Effect: Andromeda might take 60 hours to play through once, and 100 hours for a careful player to encounter all the content. The branching nature of the gameplay encourages multiple playthroughs. Online multiplayer games can take even more time. In 2015, Activision CEO Eric Hirshberg reported that Destiny, a complex massively multiplayer shooter that mixes role-playing elements with squad-based action, counted 16 million players, with daily players putting in an average of three hours a day.
One way of describing a game that has such pull on its players might be that it is fun. Another might be that it is addictive.
Questions about the effect of games on players and on society tend to omit a critical part of the equation: the people who make them. Erik Wolpaw is one such person. Until earlier this year, he worked as a writer for Valve, the company behind the online game distribution platform Steam as well as games such as Half-Life and Portal 2, which Wolpaw co-wrote.
Game designers, Wolpaw explains, tend to think in terms of making their products entertaining and, in many cases, replayable. Their goal, as straightforward as it may sound, is to make games that people want to play. Discussions about keeping players engaged happen, he says, but "not in a mustache-twirling, how-are-we-going-to-get-people-addicted way. There's a fine line between that psychology and good game design." This was true long before the rise of computer gaming. "People will never stop playing chess, because it's a great game. The discussions I hear are more about how can we keep these games interesting to keep playing."
One way to do that, it turns out, is to give people a sense of earned achievement. "What games are good at—what they are designed to do—is simulate being good at something," Wolpaw says.
A military shooter might offer a simulation of being a crack special forces soldier. A racing game might simulate learning to handle a performance sports car. A sci-fi role-playing game might simulate becoming an effective leader of a massive space colonization effort. But what you're really doing is training yourself to effectively identify on-screen visual cues and twitch your thumb at the right moment. You're learning to handle a controller, not a gun or a race car. You're learning to manage a game's hidden stats system, not a space station. A game provides the sensation of mastery without the actual ability.
"It's a simulation of being an expert," Wolpaw says. "It's a way to fulfill a fantasy." That fantasy, ultimately, is one of work, purpose, and social and professional success.
Discussions about the allure of video games are in some sense discussions of the idea that video games are too much fun, which is another way of saying that they are too well-made. The implication is that designers should place limits on themselves and their products.
"It seems like a dangerous path to go down," Wolpaw says, "that if you're just trying to make the best game that you possibly can, to say, well, we need to make it worse, because this might be too good."
Are video games too good? Do they make people happy? Should young men work more and play games less? What obligation do people have to work, raise families, or be conventionally productive in their lives? Questions about video games and work are in some ways just oblique ways of asking bigger questions about the relationship between the self and society. I won't try to answer them. I'm not sure anyone can.
But I can say something about the role that video games have played in my own life, and the imperfect comforts they've provided me.
As a child, I didn't spend much time playing video games, but I was fascinated by them all the same. I didn't have a console or a computer that could play games until high school, and I lacked the funds to play arcade cabinets at length.
Because I did not have the opportunity to play regularly, I was never very good at video games—not like some of my more devoted friends, anyway. At arcades, I would die quickly, losing my handful of quarters in just a few minutes. On friends' consoles, I would never get very far, and most games from that era offered no way to save your progress and return to it after a loss. On those occasions when I had opportunities to play, I found the experience both thrilling and vaguely disappointing. In part that's because the blocky, two-dimensional games of that primitive era never quite lived up to my imagination of what games could be. And in part it's because the appeal of the pastime, it seemed to me, lay in the devotion of time and effort to a deep understanding of a game's challenges and systems. Video games created a space for intense concentration. Indeed, true mastery required it.
So mostly I looked on from afar, cheering friends as they made heroic runs through Super Mario Brothers and Sonic the Hedgehog, standing off to the side and marveling at the twitchy focus older teenagers brought to competitive arcade games like Street Fighter II. They sweated over their work with the anxiety and brow-furrowed determination of craftsmen and artists. I came to admire the way they shaped the games according to their whims and, equally, the way they had shaped themselves to the demands of the games. In the '80s and '90s, gaming was still the domain of losers and outcasts, nerds with no social skills. But at the arcade, or at home with a console, surrounded by friends, these gamers had found something that transcended the derision, something that made them feel accomplished and worthy. It was clear to me, even as an observer, that for the best players, games were a totalizing experience and also a significant personal accomplishment. It was something that they worked at—and, in the right setting, something they could be proud of.
It wasn't until I arrived at a small liberal arts college in 1999 that I started playing games seriously. By then, most colleges had installed networks in dorms, which allowed students, nearly all of whom showed up with new computers, to play multiplayer games against each other. A classmate on my hall discovered an early version of Counter-Strike—at the time it was a fan-made modification of Valve's Half-Life—and soon we were devoting evenings and weekends to gunning each other down on the campus network.
A team-based shooter that rewards precision, fast reflexes, clever strategy, and highly coordinated teamwork, the game became an obsession. Over the next few years, we organized off-campus events in which dozens of players competed for entire weekends, drinking obscure caffeinated beverages, sleeping under tables strewn with cords and computers, and launching into volleys of profanity so artfully constructed that they ought to have counted as a form of literature.
As before, I was never very good at the game—fast-twitch hand-eye coordination has never been a strength of mine—but I found great satisfaction in the social aspect, the way it bound us together. Other tight-knit groups on campus had intramural sports; we had Counter-Strike. It was our bond. We were unified by a communal purpose.
Today, all of us are gainfully employed, and most are married with kids. We don't play Counter-Strike or any other games together anymore, but despite having dispersed across the country, we remain a close-knit group, gathering together most years and emailing dozens of times a day. The email listserv we use to communicate is named after our Counter-Strike team. A video game helped me make some of the closest and longest-lasting friendships of my life.
After college, I quit playing video games for several years. I moved to Washington, D.C., took a series of jobs, and started dating the woman I would eventually marry. In January 2009, we moved in together. One month later, the startup I was working at crashed, a victim of the financial crisis. My girlfriend's advice—I promise I am not making this up—was to play video games. (Like I said, we are now married.) Not to the exclusion of all else, but to reduce anxiety and fill the gaps between the hours I spent job hunting each day.
I discovered that video games had improved to the point where they could finally come close to competing with my imagination. Games had become bigger, more beautiful, more open, and less reliant on thumb-twitch reflexes. I spent a feverish week immersed in Fallout 3, an open-world role-playing game set in a vast and fully explorable post-apocalyptic Washington, D.C., and another week playing the first Mass Effect. I played conventional sci-fi action games like Halo 3 and stealth adventure games like Assassin's Creed. Later, I dabbled in competitive multiplayer shooters like Call of Duty.
The best of these games overwhelmed my capacity to think about anything else at a time when more thinking wasn't terribly beneficial. They provided distraction, but also direction, and a sense of focused calm. I wanted a job, but I couldn't find one. Instead, I played video games. They weren't as good as a job. But they were better than nothing.
After about four months of unemployment, I was hired by a magazine called Reason. But I didn't stop playing. Since then, I have probably played somewhere in the range of 2,000 hours of video games—roughly the equivalent of a year's worth of work in a full-time position.
Did all those hours playing games make me feel fulfilled? Did they make me feel as if I had made good decisions in my life? Yes—and no. At times, I found video games an entertainment experience as smart and satisfying as any novel or movie or television show I have ever absorbed. At other times, I have let go of my controller late at night, overcome by existential emptiness and the realization that I have, yet again, just spent the better part of a day engaged in an activity of no practical value to me or anyone else. I enjoy games, but not without some reservation. Sometimes I go weeks without playing. And if I had to choose between gaming and work, I know I'd pick the latter.
But after long breaks, I have always made the choice to return to video games. And I suspect I always will. Because on the whole, they have made my life richer and better, more interesting and more tolerable.
I pose the question of whether video games offer actual happiness or just a pale simulacrum to Wolpaw, who describes himself as having grown up obsessed with games, and who turned that obsession into a job. "This is a philosophical question," he says. "They're certainly pleasurable." And then he pauses for a moment, as if to consider the question further. "I have spent a lot of time making games," he says, "but I have also spent more time playing games. And I don't regret it."
Neither, I think, do I.