When the first issue of reason was published in May 1968, hardly anyone knew what a video game was. But that was about to change. That same year, inventor Ralph Baer patented the interactive television device that would go on to become the world’s first home video game console. The very first computer game, Spacewar!, was conceived by a Massachusetts Institute of Technology student just seven years before that.
As reason evolved into something bigger, so did interactive entertainment. Today, video games have leveled up to the top of the home entertainment heap. They’re a $67 billion global business, and roughly half of Americans say they play them every week.
There’s no deep mystery to the popularity of the medium. Video games offer something that reason has championed throughout its existence: individual choice and personalized experience. Gamers aren’t stuck with the pre-scripted narratives of Hollywood, nor are they mere onlookers watching the elites-only competition of big-time sports. Modern games put you, the player, and your individual decisions at the center of the story or the action.
This fall, gamers’ choices are about to expand even further. Sony and Microsoft, two of the biggest companies in the market, are preparing a new generation of video game consoles for release. Early announcements suggest that these devices will expand the roles of choice and personalization still further.
Sony’s new system, the PlayStation 4, is built around social networking options that encourage players to capture their in-game experiences and share them with others. Microsoft’s new platform, the Xbox One, looks even more ambitious: The system is designed as a full-fledged home media hub, built to switch between TV, games, and a slew of other living-room media options. It’s voice-activated and motion-controlled, with biometric sensors powerful and finely tuned enough to make the Department of Homeland Security jealous.
Amusement junkies didn’t always have such power at their disposal. And they had to fight government censors and nannying legislators to get where they are. As Steven Kent recounts in his 2001 book The Ultimate History of Video Games, early games were in many ways an extension of the pinball business. Because mid-century politicians associated pinball with gambling, they spent more than a little time trying to shut the nascent industry down. In January 1942, New York Mayor Fiorello LaGuardia signed an order banning pinball machines from the city. Police were allowed to smash them on sight—and they did. More than 3,000 machines were either destroyed or hauled away as a result of LaGuardia’s order, which stayed on the books until 1976.
It was around the same time that the video game business we know today was getting ready to boom. The first home video game console—the Magnavox Odyssey—hit the market in 1972. Meanwhile, full-size arcade games like Pong and Computer Space were beginning to pop up in pinball arcades and pizza shops around the country. The teen-friendly pastimes were bound together even further in 1977, when video game entrepreneur Nolan Bushnell opened the first Pizza Time Theater—later to become Chuck E. Cheese—in San Jose, California. The kid-friendly concept restaurant fused a pizza shop with a sizable video arcade, and it helped pave the way for the windowless, screen-lit gaming rooms that would litter indoor malls and other suburban shopping centers throughout the 1980s.
At the same time that video games were expanding their retail presence, they were also infiltrating American homes. The year that Bushnell opened the first Pizza Time, Atari, a company Bushnell had founded just a few years earlier, released its first home gaming console, the Atari 2600. After a rocky start, the system eventually became a huge success, with more than 10 million units sold by 1982. Games for the system were laughably simple by today’s standards, consisting mostly of slowly moving jagged lines and solid blocks of color. But they suggested the incredible potential of computerized home entertainment, setting players and developers on a path that would eventually lead to the Xboxes and PlayStations of today.
That path was not without its political hurdles. By the early 1990s, video game graphics technology had become far more advanced, and game designers used their new freedom to create interactive worlds that were far more detailed—and in some cases more violent—than anything that had come before. That grabbed the attention of national politicians. Sen. Joe Lieberman, then a Democrat from Connecticut, arranged for a congressional hearing on the marketing of violent video games after seeing splashes of pixelated cartoon blood in such pulpy fantasy action games as Doom and Mortal Kombat.
The hearings made big headlines at the time, as politicians scolded game makers for marketing what they said was graphic content to children. Several states followed up with bans on the sale of violent games to minors. But courts consistently overturned those prohibitions. Now, with the passage of time and the advancement of technology, it’s hard to see the games that inspired such outrage as anything other than nostalgic early amusements. While many of the highest-profile modern games still revolve to some degree around violence, some of those same games offer a visual and narrative sophistication that rivals any pop-culture competition.
And while those competitors are still telling people what the story is and who the heroes are, video game designers have a more laissez-faire approach: Let the customer decide. No longer are players stuck with bland protagonists made to appeal to everyone: Role-playing games give players the ability to customize their avatars and their narratives, making them as weird and wonderful as they want. Play as an orc. Play as a robot. Play as a man or as a woman. Play with a silly-looking helmet or a scary demon mask. Play as a robotic orc wearing a silly-looking helmet and a demon mask. Over the 45 years that reason has existed, video games have been expanding the freedom that pop-culture consumers have to make those decisions for themselves. It’s an attitude that reason has celebrated from the beginning: It’s your game, your life, your story, your choice.