I'm a Gamer. The NO FAKES Act Could Get Me in Trouble.
A bill meant to fight AI deepfakes could devastate creativity in games like Fallout: New Vegas, Skyrim, and Minecraft, where mods keep old titles alive.

Congress is considering legislation that could kill the joy of hundreds of thousands of gamers.
I am a recent devotee of the video game Fallout: New Vegas (FNV). For those unfamiliar, FNV is a single-player, open-world, action role-playing game in which players wander the post-nuclear-war wasteland of the Mojave Desert with retro-futuristic weapons and side with factions vying for control. Here's the surprising thing: The game is 15 years old and is still being played by thousands of people every day. Even major titles usually fade into nostalgia within a few years of release. Few are routinely playing other 2010 releases, such as Call of Duty: Black Ops, Mass Effect 2, and BioShock 2.
One of the reasons for FNV's longevity is that its publisher, Bethesda, made it highly modifiable. Players have the tools to make changes to the game—called mods—allowing them to fix bugs, redesign weapons or outfits, and even add new storylines. Each player can make the world their own.
FNV isn't the only game that's built for mods. The Elder Scrolls V: Skyrim has nearly 100,000 mods, while Minecraft has over a quarter of a million.
These types of games are an absolute blast. But Congress could put a damper on all this fun by passing the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act of 2025.
Introduced in an effort to combat the rise of increasingly convincing AI deepfakes, the NO FAKES Act aims to protect individual likenesses and voices by creating a property right to one's "digital replica." The bill would make it illegal to create any realistic, computer-generated version of someone's voice or likeness, or to alter any of their real performances without their permission.
The urge to crack down on deepfakes spreading lies and misinformation is understandable, but the law's broad definition of "digital replicas" isn't limited to celebrity deepfakes; it could sweep up ordinary gamers who are simply enhancing the games they love.
Sports-based games like F1 25, NHL 25, Madden NFL 25, and NBA 2K25 rely on the depictions of professional and college athletes. In Cyberpunk 2077—another favorite in my collection—the player is accompanied extensively by a digital Keanu Reeves as Johnny Silverhand. Robust fan communities have produced mods featuring real-world celebrities and other real people, sometimes with voice lines. Part of the fun of games like The Sims and Elden Ring is seeing how closely one can recreate one's own likeness, or someone else's, in the game's character builder.
The NO FAKES Act also creates liability for products or services designed to produce digital replicas without authorization. If a custom character resembles a real person without their authorization, the game could technically trigger liability. Even if this didn't pose much risk of litigation, developers could decide to restrict the range of faces, voices, and customizable features in order to ensure they don't run afoul of the bill.
This will likely lead to censorship. When someone reports that they believe their likeness has been used without consent, the onus will be on providers to remove material "as soon as technically feasible." This creates a heckler's veto: companies must take down content whenever they receive a notice, regardless of whether the claim is frivolous or mistaken, or risk additional fines if they delay or are seen as not acting "in good faith."
Companies would also be responsible for preventing future uploads and, similar to the Digital Millennium Copyright Act (DMCA), for adopting a policy to terminate accounts of repeat violators. Major companies may be able to risk the hefty penalties of up to $25,000 per instance for providers, or $5,000 per instance for individuals. But smaller developers can't absorb these risks. Platforms might decide to preemptively remove borderline content, chilling creativity and speech, even in cases where First Amendment exceptions, like parody, apply.
Existing legal frameworks already offer protections that could logically extend to a "digital replica." Most U.S. states already recognize, through right of publicity laws, a person's right to control commercial use of their name, image, and likeness. Privacy, intellectual property, copyright, trademark, and defamation laws may also apply. For commercial uses, such as Keanu Reeves' appearance in Cyberpunk 2077, companies already negotiate contracts and compensation. (Rumor has it Reeves received $30 million for the role.) The consequences would fall hardest on small developers, hobbyists, and fan communities making non-commercial games or mods—a distinction recognized by most IP law but glossed over in the NO FAKES Act.
Innovation is at risk in the industry. Developers are experimenting with voice synthesis, procedural generation, and AI-driven non-player character behavior. Not all gamers are excited about these developments—I probably won't be buying games that rely heavily on AI—but that's a matter of personal choice and preference. The federal government shouldn't scare publishers and developers out of trying new things with AI and letting them succeed or fail in the free market.
Game developer Valve, which created the digital distribution platform Steam, is managing the risks of AI in video games the right way. In January 2024, it announced it would require developers to disclose their use of AI in games, covering both pre-generated and live-generated content, and to guarantee that a game doesn't include "illegal or infringing content" and will "be consistent with" its marketing materials. Because these disclosures are published transparently on each game's Steam page, gamers can decide whether to support a game's use of AI.
The NO FAKES Act will chill creativity, stifle fan innovation, and expose both players and developers to costly legal risk. Better, voluntary solutions are out there.