The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
Liking Post That Contains Porn Deepfake Can Lead to Liability, Court Says in Megan Thee Stallion Lawsuit
At least this is so when defendant "also ... allegedly directed viewers of her post to click on her 'Likes' page where the video had been archived" (not clear what the judge would have thought if the case involved solely the "like").
More from the Megan Thee Stallion lawsuit I blogged about below:
Under Section 836.13, Florida Statutes, a plaintiff may bring a civil action against anyone who "willfully and maliciously promotes" an "altered sexual depiction" of her without her consent. The statute defines "promote" broadly, covering actions like publishing, distributing, exhibiting, or presenting the altered content.
Count II of the Amended Complaint alleges Defendant did just that—first by "liking" a post on X.com containing a deepfake pornographic video of Plaintiff, then by directing her followers to her "Likes" page where the video remained accessible. The parties do not dispute that the video meets the statute's definition of an "altered sexual depiction"; they only disagree on whether Defendant's conduct amounts to "promotion." …
By "liking" an X.com post that featured the deepfake video, the video was exhibited on Defendant's X.com account's "Likes" page, "which tracked and displayed 'liked' posts for other users to see." Defendant also brought the video "before the public" when she allegedly directed viewers of her post to click on her "Likes" page where the video had been archived. (See Am. Compl. ¶ 45 ("Defendant intended for this statement [('Go to my likes.')] to encourage her followers and other members of the public to watch the [d]eepfake [v]ideo, which had been added to her 'Likes' page around the same time." (alterations added and adopted))).
Because Plaintiff plausibly alleges Defendant promoted the deepfake video under section 836.13, dismissal is unwarranted.
I hope it was the post reading "Go to my likes" that put the claim over the edge, and not clicking like in the first place.
Good point -- I'm not sure from the opinion what would happen if only the "like" had been present, but I've added a subtitle elaborating on the matter.
But from an abundance of caution you will not be adding a like button to the VC threads 🙂
That is an important issue. Here in New Jersey, the state is threatening police officers with license revocation if they “like” social media pages that the state considers contrary to DEI or “racist.” I have been warning clients that there are serious and complex First Amendment issues at play here.
Given how easy it is to accidentally hit like — especially when using a phone — it would be pretty bad if merely hitting it could lead to liability.
There have been reports of people's likes being artificially created, i.e. forged; it's possible to program a page so that simply clicking on it registers a "like."
Your comment brought back memories of the MySpace worm from 2005. If you looked at somebody's profile a bit of JavaScript code would send a friend request to the author. He got a million friends. The author figured out how to embed executable JavaScript in a place that was supposed to contain only passive HTML.
How does one prove the "deepfake" is a false depiction of a person in the flesh?
The easiest way is to find the original video, document when & where it was uploaded to the naughty website of your choice, then present it to the judge. Maybe hire an expert (or just a clever teenager) to show how to run the face swapping software.
I guess I want to know at what point fake depictions of celebrities (or anybody) rise to the level of a civil action. Is she saying she never has sex, so depictions of her having sex are defamatory? Or does she admit to having sex, but the deepfake sex is defamatory because it isn't real? "Yeah, I have sex, but fake videos showing I have sex are defamatory."
If Mother Jones has verifiable proof of Trump saying "I hate anybody who isn't a white male," and Mother Jones then produces an AI fake where Trump says just that, is the Mother Jones video defamatory?
Is an AI video depicting known facts or highly probable situations defamation--even if the video is altered or completely fictitious?
If FoxNews shows an AI video of Biden stumbling up stairs that is entirely fictitious, are they liable because Biden has tripped up stairs, just not in the fake video?
Are you trying to determine if "fake but accurate" is legally liable?
The law bars admission of secret recordings into evidence but not testimony by eavesdroppers. We think that video evidence is more powerful than audio, and audio more powerful than testimony.