Heavy-Handed Legislation Targets AI-Generated Replicas of Voices and Images
The NO FAKES Act imposes censorship, threatens anonymity, and stifles innovation.
Artificial intelligence has a lot of people either excited or worried. In the latter camp are creative professionals who fear that technology might replace their images, voices, and talents with exact replicas that show up sober for work and don't throw tantrums. Always prepared to capitalize on concerns, members of Congress propose legislation to create a new intellectual property claim in digital representations of real people. Of course, the bill goes further, mandating censorship of digital replicas and regulating technology capable of producing them.
A Right to Your Voice and Image
"Nobody—whether they're Tom Hanks or an 8th grader just trying to be a kid—should worry about someone stealing their voice and likeness," huffed Sen. Chris Coons (D–Del.) upon the introduction of this year's version of the Nurture Originals, Foster Art and Keep Entertainment Safe (NO FAKES) Act. "Incredible technology like AI can help us push the limits of human creativity, but only if we protect Americans from those who would use it to harm our communities."
Coons went on to boast that the bill, which has bipartisan backing, also enjoys the support of "leaders in the entertainment industry, the labor community, and firms at the cutting edge of AI technology." Mention of the entertainment industry is important here, since AI and replication of human likenesses were major issues during the 2023 Hollywood strike.
"If we don't stand tall right now, we are all going to be in trouble, we are all going to be in jeopardy of being replaced by machines," Fran Drescher, president of actors' union SAG-AFTRA, said at the time.
This isn't the first attempt to regulate AI-generated likenesses. Similar bipartisan 2023 bills, including an earlier version of NO FAKES, drew complaints from the Association of Research Libraries (ARL) that they allowed little room for fair use.
Fair Use Fears
"NO FAKES explicitly carves out digital replicas that are used in documentaries or docudramas, or for purposes of comment, criticism, scholarship, satire, or parody, from violating the law. This prescriptive approach offers certainty about the uses listed, but without the flexibility that a fair use analysis requires," ARL's Katherine Klosek objected.
Klosek suggested the existing Copyright Act offered a better model, since it "allows uses in accordance with a dynamic evaluation of factors, including whether a new use adds something to the purpose or character of a work."
Despite such objections, the current version of NO FAKES retains the prescriptive approach, specifying the circumstances under which digital replicas can be used without running afoul of the law. But the revised legislation goes further, dictating a regime of online monitoring and censorship of content that replicates people's voices and images, along with regulation of the creative technology itself.
Censorship and an End to Online Anonymity
"The new version of NO FAKES requires almost every internet gatekeeper to create a system that will a) take down speech upon receipt of a notice; b) keep down any recurring instance—meaning, adopt inevitably overbroad replica filters on top of the already deeply flawed copyright filters; c) take down and filter tools that might have been used to make the image; and d) unmask the user who uploaded the material based on nothing more than the say so of person who was allegedly 'replicated,'" object Katharine Trendacosta and Corynne McSherry of the Electronic Frontier Foundation (EFF).
In fact, the bill creates civil liability for "the public display, distribution, transmission, or communication of, or the act of otherwise making available to the public, a digital replica without authorization by the applicable right holder." The term "right holder" is used since individuals can transfer the rights to their likenesses during their lifetimes. Upon death, "the right is transferable and licensable, in whole or in part, by the executors, heirs, assigns, licensees, or devisees of the individual" with a potential duration of 70 years after an individual's death so long as the right is actively exercised.
Worse, though, as EFF points out, liability also applies to "distributing, importing, transmitting, or otherwise making available to the public a product or service that is primarily designed to produce 1 or more digital replicas of a specifically identified individual or individuals without the authorization" of the individual, the rights holder, or the law. Even multipurpose technology could incur liability if it arguably "has only limited commercially significant purpose or use other than to produce a digital replica of a specifically identified individual or individuals" without authorization.
"These provisions effectively give rights-holders the veto power on innovation they've long sought in the copyright wars, based on the same tech panics," warn Trendacosta and McSherry.
To enforce these restrictions, the law requires online services to remove unauthorized material or disable access to it, and to remain constantly on the lookout for anything that matches "the digital fingerprint of an unauthorized digital replica." Online services that don't undertake a "good faith effort" to comply are liable for specified penalties on top of damages sought by rights holders.
Those bringing action under the law can also ask court clerks to issue subpoenas compelling online services to identify people alleged to have uploaded unauthorized replicas, so that those users can then be sued under the law's provisions.
NO FAKES Inflicts a Lot of Collateral Damage
Which is to say, the law imposes on online companies a significant burden and a strong incentive to over-restrict. That will be a pain in the ass for large firms, but just one more bureaucratic hassle that the likes of Google and Facebook can probably handle, though it will make them more restrictive and intrusive to protect themselves. For small businesses, the compliance costs might be impossible hurdles.
For users of online services, the bill further endangers online anonymity and free communication. Everything people upload will be subject to automatic filters looking for digital replicas, and companies will be poised to reveal uploaders' identities whenever somebody merely alleges a violation.
"NO FAKES is designed to consolidate control over the commercial exploitation of digital images, not prevent it. Along the way, it will cause collateral damage to all of us," add Trendacosta and McSherry.
Reason's Jack Nicastro recently reported that the Senate is considering including a moratorium on state-level AI regulation in the budget reconciliation process. The wide sweep of NO FAKES suggests that a similar pause on excessive regulation is overdue when it comes to federal lawmaking.