"Video games can never be art," thundered movie critic Roger Ebert in 2010. Responding to a provocative TED Talk that argued the opposite, Ebert dismissed the examples mentioned in the presentation, sniffing that they "do not raise my hopes for a video game that will deserve my attention long enough to play it. They are, I regret to say, pathetic."
Following Plato, Ebert argued that "art should be defined as the imitation of nature" (emphasis in original), that it is "usually the creation of one artist," and that games, unlike painting or literature, have rules and winners, thus disqualifying them from consideration as artistic expression.
Famous for having introduced stark, gladiatorial judgments to film reviews, Ebert (who died in 2013) didn't just give video games a big thumbs down, but two thumbs smack in the eyes. "Why are gamers so intensely concerned, anyway, that games be defined as art?" he asked, exasperatedly name-checking great competitors in chess, basketball, and football. "Bobby Fischer, Michael Jordan and Dick Butkus never said they thought their games were an art form…Why aren't gamers content to play their games and simply enjoy themselves? They have my blessing, not that they care. Do they require validation?"
Ebert was likely channeling the scorn he must have endured for decades as a defender of an expressive form that has only recently (if controversially) joined the hallowed hall of "serious" art. If you've gone to college in the past 40 years and told anyone you were taking a course in "film studies" (or worse yet, confessed you were majoring in the same), you doubtlessly inspired an eyeroll or two. In his essay collection Boxed In (1988), New York University's Mark Crispin Miller recalled the skepticism he faced upon entering the academic job market in the late 1970s. "You're solid in the Renaissance, and you do this film stuff," his advisor told him. "It's always good to have a traditional field…and to do some bullshit on the side…'Film studies.' Feminism. 'Children's lit.' It looks good. On the side."
So do "gamers"—those who make games, those who play them, and those who consider them as something more than a "pathetic" form of passing the time—require validation? Of course they do: just like devotees of all other forms of creative expression. This validation takes two basic forms.
The first is political and legal. Pretty much since the introduction of the arcade version of Pong in 1972, video games have been attacked for either causing or aggravating all the sins of the modern world. In this, they follow in the footsteps of virtually all new forms of popular entertainment, from novels to film to television to comic books and rock music. "There are no Hardys nor Chekovs in the movies," wrote a critic in a 1926 issue of The Dial, the journal that was in some ways the house organ of literary modernism. While the Dial piece held out hope that film might one day transcend its "primitive stage," its thrust was summed up by this all-caps headline: "NOT THEATRE, NOT LITERATURE, NOT PAINTING."
Video games, we've learned over the years, induce hyperactivity, epilepsy, and obesity in kids ("step away from the video games," counseled President Barack Obama in 2009). They desensitize us to violence, coarsen our language, and ruin our sleep habits. They've killed reading as a pastime and promoted unhealthy body images. They have singlehandedly destroyed, or maybe just further dumbed down, the movie business.
Given their vast popularity among younger Americans, it's not surprising that game makers have labored under constant threats of censorship, stifling regulation, and legal action. In the early 1990s, a "voluntary" ratings system restricting who could buy what games was foisted on the industry after Congress held dramatic hearings in 1993 about the ultraviolence in titles such as Mortal Kombat, in which combatants kicked, punched, and hacked away at each other until blocky, heavily pixelated fountains of blood spewed from amputated limbs and decapitated torsos.
Liberal and conservative lawmakers reacted to such scenes with cartoon violence of their own, waving plastic guns around the Senate floor like Baby Face Nelson in a bank lobby. "Instead of enriching a child's mind," then-Sen. Joe Lieberman (D-Conn.) fretted, "these games teach a child to enjoy inflicting torture."
A new round of censoriousness came in 2005, when then-Sen. Hillary Clinton (D-N.Y.) declared that the raunchy video game Grand Theft Auto: San Andreas "encourages [children] to have sex with prostitutes and then murder them," and called for a Federal Trade Commission (FTC) investigation of video game ratings.
That same year, then-Gov. Arnold Schwarzenegger (no stranger to fictional violence in both film and video game form) signed a law banning the sale of violent video games to any Californian under age 18. The government, Schwarzenegger insisted while fighting legal challenges to the law, has "a responsibility to our kids and our communities to protect against the effects of games that depict ultra-violent actions."
If such for-the-children grandstanding and associated legal threats have mostly receded of late, it's largely because gaming is no longer the province of easily marginalized outgroups such as kids and computer nerds (it helps, too, that the game industry has created lobbying associations). According to the Entertainment Software Association (ESA), a trade group for game makers, fully 58 percent of Americans now play video games, and they're not all acne-riddled teenage boys: The average player is 30 years old, and 45 percent of players are female.
By 2011, when the U.S. Supreme Court struck down the Schwarzenegger ban 7-2, full-throated calls for video game censorship seemed like a relic from another era. Part of that stems from a recognition of games' ability to do more than swallow quarters at a grungy arcade. In his majority opinion, Justice Antonin Scalia noted that "[l]ike books, plays and movies, video games communicate ideas."
Yet even as video games have become too ubiquitous to prohibit, their predicted damage to the fabric of society has stubbornly refused to materialize. Violent crime arrest rates for males between the ages of 10 and 24 are half of what they were in 1995; among females, there has been a 40 percent drop. Property crime is down across the board, too. Early concerns about creating legions of prostitute killers or pre-adolescent torturers now seem on par with the warnings Tipper Gore made about "the occult" in her 1987 book, Raising PG Kids in an X-Rated World. "Many kids experiment with the deadly satanic game," Gore wrote about the role-playing game Dungeons and Dragons, "and get hooked." Contra Gore, Satanism failed to take root.
The difference in responses to the 1999 mass shooting at Colorado's Columbine High School and the 2012 shooting at Connecticut's Sandy Hook Elementary School is instructive. The Columbine rampage was routinely attributed to the killers' playing of first-person-shooter (FPS) games such as Doom (where players were "space marines" fighting demons) and Wolfenstein 3D (World War II POWs vs. zombie Nazis), even though later analysis concluded, in the words of psychologist Peter Langman, "These are not ordinary kids who played too many video games." Families of dead students sued video game companies under product liability law (the case was ultimately dismissed), and books such as 1999's Stop Teaching Our Kids to Kill: A Call to Action Against TV, Movie and Video Game Violence, by former West Point professor Dave Grossman, argued for outright bans.
After Sandy Hook, while it was widely reported that gunman Adam Lanza spent long hours playing FPSs (as well as the patently innocuous Dance Dance Revolution), there were no serious calls for the censorship of video games or even particularly heated discussions about their content. Gun laws and Lanza's mental problems were front and center in the policy discussion about his actions, not the media he consumed. That's no small shift.
The second key form of video games' validation is cultural and aesthetic. Contrary to Roger Ebert's assertion that "no one in or out of the field has ever been able to cite a game worthy of comparison with the great dramatists, poets, filmmakers, novelists and composers," video games are increasingly being taken seriously by degree-bearing academics and less-credentialed fans alike. At least one major cultural institution, the Smithsonian in Washington, D.C., staged a major show, The Art of Video Games, just a few months before the Sandy Hook shooting.
The Smithsonian installation not only showcased the increasing complexity of gaming graphics, but also the non-narrative attributes of titles such as Flower (2009), in which the player keeps a flower petal aloft over a range of dreamy and changing landscapes and music. Flower is intended not as a contest per se—you don't have to win or lose—but as an "emotional shelter" for the player (think video game as tone poem). "For the first time we have gamers raising gamers," the exhibition's curator, Chris Melissinos, told reason at the time. "From this point forward, you are going to see a greater, more rapid appropriation and acceptance of video games as anything from art to a worthwhile pursuit."
While there's a counter-argument that being featured in quasi-official outposts of establishment credibility such as the Smithsonian is a form of entombment rather than empowerment, there is little worry that gaming culture will soon lose any of its vibrancy. Indeed, one of the form's most striking developments is the increasing sophistication not simply of games, but of the responses to games.
Reviewers of last fall's blockbuster release, Grand Theft Auto V (GTAV), praised the latest installment of the controversial series in terms reminiscent of serious film and literary criticism. GTAV allows players to roam around a fictionalized version of California and assume various identities while undertaking a long and intricate story arc heavy on sex, drugs, and criminality. As the reviewer for Italy's SpazioGames wrote, GTAV is "a game that is able to make a sublime parody of today's society, taking advantage of all the excesses and insanities to which the world is slowly getting used."
Such widely held sentiments point to a creative intelligence that Pong and Pac-Man did little to anticipate or inspire. Writing in The Guardian, Keith Stuart called GTAV a "dazzling but monstrous parody of modern life" whose fictional "world drags you in. It begs you to explore."
That doesn't mean the praise was universal. To the contrary: GameSpot's Carolyn Petit, even while hailing the game's "vast, varied, beautiful open world" and perspective shifts, zeroed in on what she saw as an "unnecessary strain of misogynistic nastiness." GTAV, Petit wrote, "has little room for women except to portray them as strippers, prostitutes, long-suffering wives, humorless girlfriends and goofy, new-age feminists we're meant to laugh at." Perhaps the surest sign that a form of expression is approaching the status of "art" is the creation of a robust community of critics and explicators who aid the audience in thinking through the significance of their shared object of contemplation.
As the University of Tulsa's Joli Jensen stressed in her 2003 book, Is Art Good for Us?: Beliefs about High Culture in American Life, what we call "art" is not a set of fixed forms, media, and genres. It's an ongoing conversation in which all of us question and explore our place in the world through the production and consumption of an ever-increasing array of creative outlets. "It is important," Jensen says, "that we fundamentally respect the tastes and choices of people who are choosing forms different than our own. We should stop thinking that we're talking about something essential in each cultural form rather than constantly renegotiating what is good or bad, authentic or commercial. Thirty years from now today's commercial culture will be 'authentic' culture."
Which brings us back to Roger Ebert's pissy impatience with gamers and their grubby desire for validation. It wasn't so long ago that film studies was dismissed as "some bullshit on the side," a trivial object not worthy of serious scrutiny. The same goes, of course, for most forms of popular music, comic books (now sanctified for grownups as graphic novels), and even novels themselves.
In late 18th-century England, for instance, novels were often seen as sub-literary works that inflamed mostly female readers to flights of troublesome fancy. These "foolish, yet dangerous books," in the words of Samuel Johnson, routinely degraded the already-impaired faculties of the "young, the ignorant, and the idle." Calls for suppression were routine and often successful. Just a century later, novels were popular with men and women, rich and poor, and on their way not simply to being respectable, but to being the dominant expressive form through which individuals worked through the world they lived in.
Surely something analogous is happening with video games in the 21st century. It's not that millions of gamers can't be wrong (to paraphrase the old line about Elvis fans), or that they can't have bad taste. It's simply that as larger and larger numbers of people around the world play Minecraft and create YouTube videos about playing Minecraft, or build online forums about GTAV hacks and themes, or talk through their grief at the disappearance of Flappy Bird, what Jensen would call "expressive culture" is self-evidently flourishing. More people are using video games to think about who they are and how they fit into the world around them.
Video games are entering a remarkably fertile period in which creators and audiences alike reflect more consciously on what they're doing and what it all means. Now that the initial official backlash is mostly in the past, the medium is ripe to be explored fully and without apology or even explanation. We should all be on the lookout for the gamer version of Madame Bovary or Moby-Dick. It won't look like a novel and it certainly won't read like a novel, but like all great art, it will perform a similar function of engaging us in an open-ended conversation about our past, our present, and our future.