The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
Should Congress Pass A "Deep Fakes" Law?
A few tentative thoughts.
Axios reports that several important legislators have proposed new criminal laws banning the creation or distribution of so-called "deepfakes," computer-generated videos that make it seem like someone did something they didn't actually do. The technological ability to create deepfakes has caused a lot of justifiable concern. But I wanted to express some skepticism about the current round of proposed new criminal laws.
Let me focus on one prominent example discussed in the Axios story, Senator Sasse's proposed Malicious Deep Fake Prohibition Act. Under the bill, it would be a federal felony for individuals to:
(1) create, with the intent to distribute, a deep fake with the intent that the distribution of the deep fake would facilitate criminal or tortious conduct under Federal, State, local, or Tribal law; or
(2) distribute an audiovisual record with— (A) actual knowledge that the audiovisual record is a deep fake; and (B) the intent that the distribution of the audiovisual record would facilitate criminal or tortious conduct under Federal, State, local, or Tribal law.
This strikes me as a bit odd. It's already a crime to commit a crime under federal, state, local, or tribal law. It's also already a crime to "facilitate" a crime -- see 18 U.S.C. § 2 at the federal level, and state laws have their equivalents. Plus, it's already a tort to commit a tort under federal, state, local, or tribal law. This new proposed law then makes it a federal crime to either make or distribute a deepfake when the person has the intent to do the thing that is already prohibited. In effect, it mostly adds a federal criminal law hammer to conduct that is already prohibited and that could already lead to either criminal punishment or a civil suit.
Even if that seems a reasonable goal in theory, I think it's a bad idea to create federal crimes that prohibit acts undertaken in furtherance of any crime or tort. We see these provisions occasionally in federal criminal law, but experience with them suggests that they're misguided. I understand the goal. You want to punish an act that worsens an already-bad thing. That seems reasonable at first blush. But the problem is that it underestimates how much conduct is conceivably a low-level crime or tort, and how unrelated the crime or tort may be, when the statute might be applied.
We've seen this problem with the Computer Fraud and Abuse Act. The CFAA makes it a felony to commit an unauthorized access "in furtherance of any criminal or tortious act in violation of the Constitution or laws of the United States or of any State." 18 U.S.C. § 1030(c)(2)(B)(ii). Federal prosecutors have been very creative in coming up with such crimes and torts. For example, in the Lori Drew case, the government charged Drew with violating MySpace's terms of service with intent to further the tort of intentional infliction of emotional distress under state law. In United States v. Auernheimer, the government charged the defendant with unauthorized access under federal law in furtherance of unauthorized access under state law. And in United States v. Cioni, the government charged the defendant with unauthorized access in furtherance of another federal unauthorized access statute. The law is so filled with overlapping crimes and low-level torts that it's often easy to find something that might apply.
Consider an example of how the broad language might work with Senator Sasse's bill. Imagine Sally creates a deepfake in which she superimposes her own face on a video clip of President Trump at a political rally. Sally thinks her clip is hilarious, as it really looks like Sally is President. The video is so funny that Sally wants to show it to her friends. She decides to throw a party with a live band at which she will hand out copies of her hilarious deepfake video. The live band is very loud, however. It's so loud that the party is tortious under state law, as it's a private nuisance to her neighbors.
Would Sally's loud party be a federal felony under the Sasse bill? It might be, I think. Sally would be distributing copies of a deepfake video. It's a deepfake of herself, but that doesn't appear to matter under the Sasse bill. And she would be distributing the deepfake video with the intent to facilitate conduct that is a tortious nuisance under state law. Distributing the videos would plausibly be "facilitating" the loud party because the distribution of the videos would be the reason for the loud party and the reason people would attend it.
This is obviously far from what the drafters of the bill intended. But that's the problem with making it a federal crime to do something with intent to further any crime or tort. There are a ton of low-level crimes and torts out there. Intentionally facilitating any of them violates the new federal crime even if the harm has nothing to do with the harms that Congress had in mind.
The Sasse bill also has a potential problem of not distinguishing between devices and files. As written, the bill prohibits the distribution of an audiovisual record with the intent that the distribution would facilitate tortious conduct. But does that include a distribution of the physical device that includes the record? Let's say Bob has an old cell phone that he knows contains a deepfake. Bob gets mad at his neighbor Carl, and in anger he throws his old cell phone at Carl. Is that a prohibited act of distributing a deepfake in furtherance of the tort of battery, and perhaps even a 10-year felony (instead of a 2-year felony) because it "facilitates violence"?
Maybe, maybe not. But the fact that this is even a question suggests that the bill's language is overbroad. I'm a big fan of Senator Sasse, and I commend him for trying to take on this important question. But I think any deepfakes law should be a lot narrower than this initial draft.
Should Congress Pass A FILL_IN_THE_BLANK Law?
The answer is almost always "no".
So where do you draw the line between when Congress should/should not pass a law?
"...Bob gets mad at his neighbor Carl, and in anger he throws his old cell phone at Carl. Is that a prohibited act of distributing a deepfake in furtherance of the tort of battery, and perhaps even a 10-year felony (instead of a 2-year felony) because it "facilitates violence"?..."
Is it a tort of battery? Not if Bob first removes the battery from the cell phone. [rimshot] 🙂
"Distributing the videos would plausibly be "facilitating" the loud party because the distribution of the videos would be the reason for the loud party and the reason people would attend it."
While I'm pretty sure the proposed law still needs some work, I don't think I buy the plausibility of this claim. Regardless of the reason for the party, the reason it's a private nuisance is because the band turned their amps up to 11. The deep fake didn't make it any easier for the band to do so. Perhaps if the party itself was so large that the partygoers were loud enough to constitute a private nuisance, but not if it's just the band.
"It's a deepfake of herself, but that doesn't appear to matter under the Sasse bill."
Should it? Leaving aside this particular example, a deep fake of yourself could be just as malicious and damaging as a deep fake of someone else.
But the party existed to facilitate distribution of the deep fake, and Sally wanted it loud because that was how she wanted it, in furtherance of her distribution. She didn't want a quiet party. She chose the loud band because she wanted a loud party.
People have been prosecuted for far less implausible rationales. Remember the woman who was charged with chemical warfare because she put some toxin on a doorknob?
You've reversed the condition. Now the tortious conduct is to facilitate the distribution of the deep fake, which would not be illegal under this law. It's only illegal when the distribution of the deep fake was to facilitate the tortious conduct.
I did not think I had reversed it. Further contemplation confirms I do not think I have reversed it. But IANAL.
Just shows to go ya that laws can be twisted as much as wanted.
You explicitly reverse the condition. "But the party existed to facilitate distribution of the deep fake," For it to be illegal under this proposed law, it would have to be the other way, with the distribution of the deep fake facilitating the party.
Wouldn't most cases where you want to prosecute someone for a "deep fake" already be covered under other laws?
Make a video showing someone saying or doing something they didn't say or do, and try to pass it off as true? Well, that's libel. The medium is different, but it's not terribly novel.
Maybe a law clarifying that fabricated footage of other people can be libel might be appropriate, but I just don't see the need for it to be its own crime.
Regarding the thought of "using deep fakes to make it look like you're more awesome than you are," I think it falls under the same heading as "lying about your service record." It might be terribly improper, but making it illegal, in and of itself, is probably too far.
Not only that, it really wouldn't actually prevent anyone from making or distributing a deepfake; it just creates another criminal charge that can be tacked on after they do.
OH NO! We're going back to that horrific dark age...er 20 or 40 years ago where we didn't have effective omnipresent surveillance! We must pass more laws curbing freedom of expression to protect our politicians and oppressed womenfolk because they're going to be victimized by this...somehow.
The more things that are illegal, the more people that are convicted of felonies. And that means fewer voters and gun owners, and more people on government assistance because no one will hire them. It's a transfer of power from the people to the government.
The more things that are crimes, the more people might commit crimes without even realizing it. This means that people are less likely to speak out against the government, for fear that rocking the boat might make someone notice the i they forgot to dot, or the t they forgot to cross.
I think Prof. Volokh's example is a stretch, but we all know these things happen. Laws aren't never-changing things. Prosecutors (and others) take them and apply them in ever more creative ways.
Such as?
Such as the RICO statutes, which were developed to fight organized crime but were later used by the National Organization for Women against anti-abortion organizations.
(NOW isn't unique -- and they had a legitimate beef -- nor are all creative types on the Left. I use this example because it is tied to some writing I was doing on Nat Hentoff, and the episode was part of what drove him from the ACLU. The cases meandered through the courts for a decade, getting to SCOTUS thrice, the last of which was the 2006 iteration of Scheidler v. National Organization for Women, Inc.).
Of course that should be Prof. Kerr, not Prof. Volokh.
Standard thinking for legislators. Hey! We've got a problem. How do we fix it? Well, we don't really know, but let's make it a crime and see if that works.
I don't think we need a deep fake law. It seems to me that all the "misuses" of it that we can claim as tortious or criminal are already covered by those laws. Whether defamation, fraud, etc., the laws are already in place. This is just a different means of violating them.
I agree that this proposed law is a bad idea.
In my opinion, we ought to declare such laws as unconstitutionally void for vagueness. It takes way too much mental energy to determine which actions are legal and which are federal felonies under such a statutory scheme. The reason that we have written laws is to give people proper notice about which activities are illegal and which are legal. A statute like this fails to do so.
Along these lines, I am not even sure the phrase "deep fake" is clearly defined.
"the term 'deep fake' means an audiovisual record created or altered in a manner that the record would falsely appear to a reasonable observer to be an authentic record of the actual speech or conduct of an individual"
Does a person even know what a "reasonable observer" believes to be authentic? Does the speech by Sally in Orin's example even qualify? I kind of think it doesn't, because I don't think a reasonable person would think that this speech would be a record of Sally's actual speech or conduct. But the fact that Orin thinks it might be a deep fake while I am not sure is just another example of how this proposed law fails to give people adequate notice of what conduct is legal, and what is a felony.
Finally, is this law actually trying to solve a real problem? Why aren't state and federal prohibitions on fraud enough? Fraud is already illegal. What bad conduct are we trying to make a felony under this law that isn't already a felony?
"Finally, is this law actually trying to solve a real problem? Why aren't state and federal prohibitions on fraud enough? Fraud is already illegal. What bad conduct are we trying to make a felony under this law that isn't already a felony?"
I'm not sold on this law (especially not at the federal level), and I'm not sure how advanced the deep fake technology is, but eventually I think it will take new laws to address. Suppose you create a deep fake that puts someone's head on the similarly-sized body of a person in a porn scene (porn has led the rest of the internet revolution, so it'll be on the leading edge of this too) and release it on the internet for free. My very limited understanding is that it's already possible, though difficult, to make something like that fairly convincing, and it will only get more convincing over time. Something like that doesn't fit neatly into traditional legal categories (you aren't impersonating anyone, you aren't seeking any material gain), and it might very well need a separate law to address it.
Also, existing fraud laws often require a similar showing about a reasonable observer, so I'm not sure how much success you would have with your void for vagueness challenge. Besides, would the response be to just delete that requirement and make even shitty attempts felonies?
Defamation.
All this law does is criminalize what would otherwise only be a tort.
Kerr's Sally hypo is not a deepfake. It describes something more akin to a meme. A deepfake is CGI so good that you can't even tell that what you're seeing and hearing someone say on your chosen AV source wasn't actually said or done by that person. So imagine instead that Sally distributed a deepfake of the local police chief saying that on the night in question, it won't be against the law to have loud parties. Or even better, that Sally's frenemy knew about Sally's party and distributed the deepfake police chief video hoping to trick attendees over at Sally's into feeling free to be loud.
It seems to me that what you COULD do, rather than making deep fakes an independent (And redundant!) cause of action, is treat their creation as satisfying any requirement to prove malice or intent.
So, you defame somebody by making a false statement, the defamed has to prove malice, prove that you knew the statement was false.
You defame somebody by going to the trouble of making a false video of them? Malice proven by virtue of that, likewise that you knew it was false.
^this.
"...prove that you knew the statement was false."
How do I know (and prove) a statement is false?
If I said, "(insert Jewish law professor name) secretly likes to wear Nazi soldier uniforms," 1. how do I know that's false and 2. how do I prove that's false.
If I create a deepfake video which shows (insert Jewish law professor name) wearing a Nazi soldier uniform, is that actionable for defamation?
BTW I agree this law isn't going anywhere and am just thinking about different points.
I'm saying it should be treated as fairly conclusive evidence of actionable defamation, because you knew quite well that you were creating fake evidence of your claim.
Now, a defense could be that the fake wasn't a fake of something defamatory. But, again, if the fake isn't somehow watermarked or stated to be fake, you've clearly generated a lie about somebody that was meant to be believed, and that it wasn't defamatory should now be on you to prove.
Nope! I have no clue if my statement is true or not (i.e. I made it up), and therefore can neither prove nor disprove the truthfulness of the statement.
You're talking about whether the person's internal mental state can be proven. I'm talking about who the burden of proof should be on.
Once you exert yourself to create a fake record of events, and represent it as a real record, the burden of proof should be on you, not the defamed, because you can't be unaware you were creating a fake record, no matter how irrationally certain you might be that the events portrayed really happened.
You're peddling a lie, and know it, even if you think it's "fake but accurate". This should free the defamed of the need to prove your mental state.
If you try to pass it off as real? Absolutely.
If it's distributed as "obvious parody" or "artist's rendition"? That's a lot squishier.
Now, there's also the possibility that someone will strip away the context of the obvious parody and try to pass it off as "real", but then the person guilty of libel is the person who deliberately altered the context to fool people, not the person who made the original parody.
Simply put, the point where it goes from "free speech" to "defamation" is when you try to pass your art off as "real".
What you're saying is reasonable but unnecessary. Of course if one is found to have forged a defamatory document (or video) one is going to be found to have acted with malice (in the defamation sense); that's practically a tautology. You don't need a new law to deem that deliberately lying demonstrates intent to lie.
Don't tell us; tell him.
Deep fakes are high quality forgeries that resist attempts, technological and otherwise, to determine if they are fakes.
As this can be subjective, it is ripe for abuse by itself. In the Sally example, the body of the president would trivially be determined to be real footage of the real president. So it would not be a deep fake, no matter how good.
More likely, if someone were being nasty, it would mean putting a famous person's head on the body of someone in a sex scene or committing a crime (likely a staged crime, since real crime footage would be largely recognized), especially when rolled out to embarrass some politician nationally.
Sasse's legal piling on is a form of legislative virtue signaling. Very similar, for example, to other virtue signaling done by the senator with respect to Trump.
If the fake she makes is a video of herself doing things, what tortious conduct is there?
Not only is this proposed law a bad idea, it's completely unnecessary - and completely predictable.
Reason had an article (a few months back?) documenting the hysteria that pops up every time there's a new innovation. We heard the same parade of horribles about static photoshop, home VCRs and standard photography development, to name just a few. Every generation has its own "you can't trust your own eyes anymore" moment with predictions of the utter collapse of civil discourse and legal proceedings. The catastrophe consistently fails to come. This one will also fail. "Deepfakes" are still fakes and in the long run will be no harder to debunk than all the previous generations of "fake news" sources.
Well, assuming the technology allows for truly undetectable fakes (or at least undetectable to the level that very few defendants would have resources to show they're fake), how exactly do you know the catastrophe hasn't already happened? Because you've come up with some way to determine that the oft-repeated defense "that wasn't me" consistently remains a lie?
A true deepfake would encompass not only the visual image itself, but its context. For example, it's a wee bit tougher to "debunk" he-said-she-said of what happened in her room when everyone knows you were in her room and she has a picture.
My point is that the same claims of undetectability have been made at every innovation in technology. It wasn't true then and civilization consistently fails to collapse.
If your goal was just to refute the absurd, fine, but the fact that civilization hasn't collapsed doesn't mean there isn't a problem -- perhaps even a significant one.
I agree that people will probably be able to figure out that the deep fakes are fakes, eventually. But there could be some significant damages in the meantime that don't seem to fit into traditional tort or criminal categories, so I do think the law will have to expand to address it. It probably shouldn't start out at the federal level, and perhaps it shouldn't be this broad.
Ben Sasse is himself a deep fake.