The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
Inserting People into Porn Movies: The First Amendment Textbook Problem (2005)
A prediction comes true, for better or worse ....
I added this problem to the second edition of my First Amendment textbook back in 2005, and news accounts suggest that it's now quite timely:
Within ten or twenty years [of 2005], there will probably be consumer-usable software that can easily overlay people's photographs and voices onto movies that depict someone else. The program would automatically and seamlessly alter multiple scenes in which the character is shown from different angles, with different facial expressions, doing different things. (Of course, one can already do this in some measure with photos, but this hypothetical program would be much more sophisticated.)
The user running the program would give it (1) a file containing a movie, (2) files containing photographs of the person who is to be included in the movie, and, optionally, (3) recordings of the person's voice. The program would use image recognition algorithms to replace all of a given character's appearances with the new person, and would adjust the character's spoken words to simulate the new person's voice. The program would guess how the photographed person's features would likely move, or what the unphotographed parts of the person's body look like. It would also let the user provide input about such unknown items, and may let the user modify the person's appearance in other ways.
A filmmaker could thus easily replace a stuntman's image with the star's. Parodists and artists could merge politicians', celebrities', or actors' faces and voices into existing footage (as in Zelig or Forrest Gump). Parents could place their children and children's friends into kids' movies, which may make the movies more fun for the children. Similar technology could embed human actors into computer-generated movies.
But the most common use of the program would probably be pornography. Consumers would pay to download the program [EV adds today: maybe they won't even have to pay!]; separately buy a pornographic movie [EV adds: buying porn? who does that?]; get nonpornographic photographs and possibly voice recordings of celebrities or of acquaintances; run the program to merge the photographs and voices with the movie; and then watch pornography that "stars" whomever they lust after. Some such merged movies might be sold, but many will be made at home, to fit the user's own preferences.
Naturally, many people, famous or not, will be unhappy knowing that they are depicted without their permission in others' home sex movies. Imagine that Congress therefore decides to prohibit the distribution and use of the computer program that allows such movies to be made.
How would such a law be different for First Amendment purposes from normal obscenity legislation? Do you think the law should be upheld (even if that means changing First Amendment law), and on what grounds?
If you think the law should be struck down, what about laws that
(1) prohibit the use of the software to make such pornographic movies without the photographed person's consent,
(2) prohibit the noncommercial distribution of the movies, whether to a small group of friends or on the Internet, or
(3) prohibit the commercial distribution of the movies?
Don't limit yourself to considering whether such laws are constitutional under existing obscenity doctrine. Consider also whether you think there should be an obscenity exception at all, and whether you think it should be broader or narrower than it now is.
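For concreteness, here is a minimal sketch of the pipeline the problem imagines, in Python. Everything in it is hypothetical: the function names are invented, and each call is a stub standing in for the face-swap and voice-conversion machinery the hypothetical assumes.

```python
# A sketch only: every call below is a stub for the models the hypothetical
# posits. No real face-swap or voice-conversion library is assumed.

from pathlib import Path


def find_character_frames(movie: Path) -> list[int]:
    """Stub: return the indices of frames showing the character to replace."""
    return []


def swap_face(movie: Path, frame: int, photos: list[Path]) -> None:
    """Stub: re-render one frame with the new person's face, inferring pose,
    expression, and unphotographed features from the reference photos."""


def convert_voice(movie: Path, samples: list[Path]) -> None:
    """Stub: re-synthesize the character's dialogue in the new voice."""


def insert_person(movie: Path, photos: list[Path],
                  samples: list[Path] | None = None) -> None:
    """The arguments mirror the hypothetical's three inputs: (1) a movie
    file, (2) photos of the person to insert, (3) optional voice samples."""
    for frame in find_character_frames(movie):
        swap_face(movie, frame, photos)
    if samples:
        convert_voice(movie, samples)


if __name__ == "__main__":
    insert_person(Path("movie.mp4"), [Path("face.jpg")], [Path("voice.wav")])
```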
I am more concerned about technology which would allow police to alter the video from their dashcams and bodycams.
BINGO! That's the greatest threat and concern.
With cameras everywhere now (home and business surveillance, dashcams even in private vehicles, cell phones, etc.), any image can be altered.
Good point. Seems far-fetched for your local police at this point but very much in the wheelhouse for our federal agencies and their psy ops.
At the moment, yes.
The history of software development has been that the first instances of anything are complicated and "user hostile". Later versions become more "user friendly" until we reach the point where a vast array of functions can be done automatically at the push of a button.
For under $1K, a person will soon be able to buy software to produce feature-length movies with the video quality you see in computer games. The technical aspects will be done automatically for the user. The ability to tell a good story will be as rare as it's ever been.
Wow, well said. When I sample many of the offerings on my Netflix subscription, it feels like we are already there.
This strikes me as little different at its core from outlawing artists from making parodies of political figures if the drawings are too lifelike. Just with a very sophisticated pen. Punish them for, er, "image libel" if they assert it is real. Certain targeted and extremely drawn-out cases involving nonpublic figures that pass a high bar may qualify as harassment under certain circumstances. But otherwise this is a fairly clear-cut case with no reason for interference other than our recent fetishization of women as STRONG INDEPENDENT poor helpless babes.
If you read it again, you might note that EV was writing about people doing this at home for their own, er, consumption, so libel etc. wouldn't work.
Then it becomes a thoughtcrime and even more absurd to want to punish it.
I am allowed, with impunity, to think of having sex with anyone, anything, or any concept. These thoughts can be highly graphic if I so choose. I have not a clue as to how making them actually graphic, instantiated as 1s and 0s, is any different from instantiating them in neurons. So long as I don't imply that I actually had sex with them, I could tell anyone I like about my sex fantasies regarding this object, and even charge admission to describe this fantasy. Again, so long as the film acknowledges up front that it is not real, I see no conceivable problem. If it were to imply, or be taken to imply, that it represents actual activity of the sex object thus pictured, then the creation of the video would (probably) be libellous.
That's an interesting point, and a way of looking at the issue that I had not thought of before. One concern I have, in response, is that it's super-easy, with today's technology, to get rid of any warnings you might put in. For example, in your created, fictional, film of you having sex with [fill in name of celebrity], you put "This is fiction; Mr/Mrs X did not participate in any way in the creation of this film" at the very front of your flick. And maybe you even run a crawl every 5 minutes at the bottom of the screen, to be extra extra careful. Within 30 minutes of you publishing your movie, someone who's downloaded it will have digitally removed that initial warning, and then republished it. Perhaps, even with a new, false, warning: "This is the movie that X never wanted to see the light of day!!!"
I'm not sure that this fear should change the constitutional analysis. But it is something that will be happening. Alas.
Isn't this the same as hiring look-alike actors?
I would not ban or restrict the use of the software or the sale of its output at all -- except that I would have the law insist that any such faked movie be clearly identified as fake when sold or distributed, and preferably watermarked, so that someone with bad motivations can't easily "sanitize" it and pass it off as authentic -- for example, to "prove" to voters (or to a court considering a defamation case) that an opponent did some nasty deed he actually didn't. (A toy sketch of the tamper-evidence half of this idea follows this comment.)
I agree with the other poster that sharing the video without that label should constitute defamation, even if the identity of the person depicted is never spoken. I would also impose the same restrictions even if such a video were made by other means, such as an impersonator, rather than by software as described.
Of course such things are already commonly produced (as by stunt doubles in movies) and if the video is labeled as fiction that should suffice.
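On the watermarking proposal above: a robust, hard-to-strip watermark is a serious signal-processing problem, but the simpler tamper-evidence half can be sketched with nothing but the Python standard library. This is a toy under stated assumptions; the "labeling authority" and its key are invented, not any existing scheme.

```python
# Toy tamper-evidence sketch (not a real watermark): a keyed digest binds a
# "this is fake" label to the exact file bytes, so any re-edit that strips
# the on-screen disclaimer also invalidates the tag. The key is hypothetical.

import hashlib
import hmac
from pathlib import Path

REGISTRY_KEY = b"hypothetical-labeling-authority-key"


def label_tag(video: Path, label: str = "SYNTHETIC") -> str:
    """Compute a keyed tag over the file's contents plus its label."""
    digest = hashlib.sha256(video.read_bytes()).digest()
    return hmac.new(REGISTRY_KEY, digest + label.encode(), "sha256").hexdigest()


def still_intact(video: Path, tag: str, label: str = "SYNTHETIC") -> bool:
    # Cropping out the disclaimer changes the file bytes, so the stored tag
    # no longer verifies against the re-edited file.
    return hmac.compare_digest(label_tag(video, label), tag)
```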
I think this has to fall somewhere between the analysis of the tort of invasion of privacy and likeness rights. Likeness rights probably weigh heavily in the third question. People are probably buying the commercial version, which they wouldn't do absent the use of the likeness. On the con side, you probably aren't creating a likelihood of confusion about actual endorsement by the figure, and you probably aren't diluting the demand for the figure's normal work. You also have the parody/commentary issue. Maybe Campbell v. Acuff-Rose, the Uncle Luke/2 Live Crew parody case, is the most relevant precedent.
On the non-commercial production side, I think you have a hard time with the invasion-of-privacy claim clashing with the First Amendment. Obviously parody still comes into play, as with commercial production. As our friends at Gawker can testify, SCOTUS has not struck down the invasion-of-privacy tort despite the content-based inquiry into "newsworthiness."
On the software side, I don't immediately see how it gets struck down. I guess you could say software code is a written art and deserves the same protection under the First Amendment as a book. I don't know of any court that has gone that way, but one could. Absent that, I think you're looking at it like other software. Can you prohibit someone from distributing a computer virus for other people to use to "victimize" people? As far as I know, you can.
The hypothetical's first variation posits that Congress bans the use of the software to make porn without the consent of the persons depicted. This blanket ban is content-based and overly broad, a poor fit between the law's objectives and its means, and not limited to obscene or CP content; the banned software has lawful, non-invasive, non-infringing applications. The narrower bans, on distribution or on commercial distribution, would be harder to defeat. The Govt would argue that unpermitted distribution of falsified content that a reasonable person would find highly offensive is already prohibited (being potentially actionable as false-light invasion of privacy, violation of the right of publicity, libel, or a crime under state revenge-porn statutes), not newsworthy, and thus categorically outside the scope of First Amendment protection. But even if one of these narrower bans stood, the maker of the software should be free from liability, the same as Craigslist, Backpage, and the like ought to be free of responsibility for the ads posted there.
At one point, PGP was printed in book form because "It would be politically difficult for the Government to prohibit the export of a book that anyone may find in a public library or a bookstore." Not sure whether this actually established any sort of precedent.
[EV adds: buying porn? who does that?]
I don't have any experience on this, but I assume that a significant number of customers buy a significant amount of the stuff; presumably the producers have significant costs to recover and profit on top of that.
How much do you pay CBS to watch football on Sunday? They have production costs, too.
I pay them with the time I spend exposed to advertisements during the games. They sell that time to advertisers.
My guess is that the situation is similar to so-called "Pay to Win" games, see, e.g., the first comment under this Stack Exchange answer. Relatively few people pay anything at all, but there are enough big-spending "whales" to make the business as a whole financially viable.
Few if any mainstream businesses want to advertise on porn sites, IIRC there's even technology specifically designed to make sure that your ad doesn't wind up on one without your consent. Although I suppose Ashley Madison and its ilk are fairly big business, perhaps even big enough to support the free porn industry?
Jerry Pournelle has discussed the phenomenon of "digital zombies": for a number of public figures, enough video footage exists that you can portray them doing pretty much anything you want, though body parts that have never been uncovered on video would be subject to guesswork.
Apparently a lot of celebrities are licensing their appearances, so that anyone working their likenesses into a video without their permission is committing a trademark infringement.
And yes, it seems we're entering an era in which pictures and video are no longer proof of anything.
So, if you're gonna fake (picking a name out of a hat, at random) Tom Cruise for your fictitious porn, how do you decide on his, um, length? Can you get into legal trouble for shortchanging him, in the (again, um) length department? For overstating his (yet again, um) "talent?" Will there be a legal requirement to contact Mr. Cruise beforehand, to say, "Tom, we're gonna make a created porn where your likeness will be used. We've initially decided on 6.25 inches . . . any comment? Want us to make that longer or shorter? Any feedback on girth? We'd like to be accurate, of course, to the extent you'd want to cooperate?"
Or will we (parody protection??) be protected if we use celebrities that we don't like (Trump: 1 inch!!! Bill Clinton: also 1 inch!!!) and depict them unfavorably?
Frankly, I am already surprised that no one in the industry has come out with a depiction of our president's pee pee tape . . . I'm a bit disappointed in the lack of initiative by our nation's filth purveyors.
It seems an unlabeled video would easily qualify as defamation -- it is an assertion that the person voluntarily created a sex tape or agreed to participate in a porn movie. Probably libel per se.
Whoever holds the copyright to the "real" images could have a claim.
The more interesting issue would be a false light claim or defamation by inference if there was a disclaimer. Maybe you could argue that the disclaimer is like a question mark, which will normally, but not always, defeat a defamation claim.
That being said, this may end up giving celebs (and all of us) more privacy in a bizarre way because even if the video is real, we can now say "it was fake" using the fancy software.
Finally, on the dashcam/bodycam issue: it highlights the importance of preserving and disclosing the metadata along with these videos, so that any alteration can be detected.
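A hedged sketch of that preservation point: fingerprint each recording at capture time into a manifest, and re-hash later to detect alteration. The file names and manifest layout here are invented for illustration.

```python
# Chain-of-custody sketch: hash each video when it is captured; any later
# alteration shows up when the stored digest no longer matches. Paths and
# the manifest format are hypothetical.

import hashlib
import json
from pathlib import Path


def fingerprint(path: Path) -> str:
    """SHA-256 over the file contents, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def write_manifest(videos: list[Path], manifest: Path) -> None:
    manifest.write_text(json.dumps(
        {v.name: fingerprint(v) for v in videos}, indent=2))


def verify(manifest: Path, directory: Path) -> dict[str, bool]:
    # True means the file on disk still matches its capture-time digest.
    recorded = json.loads(manifest.read_text())
    return {name: fingerprint(directory / name) == digest
            for name, digest in recorded.items()}
```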
This will almost certainly come up as an issue in child pornography.
In 2002, the Supreme Court struck down a ban on "virtual child pornography" (Ashcroft v. Free Speech Coalition). However, it did not strike down a section of the law that banned such things as grafting a child's school picture onto a naked body. Chief Justice Rehnquist was adamant that there was a need to control virtual images "that are virtually indistinguishable from real children engaged in sexually explicit conduct."
Well. How indistinguishable is indistinguishable enough?
NYT article on Ashcroft
In the U.K. and Australia, the realism of the likeness is not at issue. In fact, sexually explicit images of the Simpsons engaged in an orgy can get you put away.
U.K. law against images
I'm only allowed two links per post. Here is a link to an article by Slate's William Saletan on what will get you convicted in Australia.
Slate article on Australia law
I find it hilarious that people are actually against the one thing that could destroy almost all real CP once and for all. With virtual images, you destroy almost all of the motivation to circulate the real thing and much of the ability to find it. This is under the naive assumption that we would put more weight on protecting children than on punishing people for their creepy kinks. Silly, I know.
It's a really interesting topic for me (I've done almost all my legal work in child abuse cases.) I totally agree with you--it's that politicians are (probably correctly, unfortunately) calculating that, "If I support even the concept of creation of ANY type of kiddie porn--EVEN totally fake depictions, where no actual human child came within 10 miles of its creation--I am dead dead dead, politically."
Politicians want to remain politicians. And no one wants to do anything that might (even unfairly) create an impression, "Hey, that woman is soft on child sex abuse."
Politically, you can't even get research funded on these issues. For instance: Does viewing fake child pornography reduce or increase the desire to view real (i.e., with child victims) child porn? Does it increase or decrease a desire to engage in actual child abuse, or increase/decrease incidents of actual child abuse? These are critically important questions (assuming we all share a goal of wanting to reduce actual child sex abuse and wanting to reduce real child porn), but you just can't get funding for this. Way back when, before I went to law school and was working as a therapist, I asked my mentor (at the crisis counseling facility where we were working) about applying for a grant to study these issues. I got an appalled look, a private meeting, and an explanation that even creating a grant proposal would professionally sink me, and would sink the career of any psychologist, sociologist, doctor, social worker (etc.) who even hinted at the idea of studying anything that would require creating (even fake!!!) child pornography.
I took that advice to heart, and never brought it up again. And so, 30 years later, it still (as far as I know) has not been rigorously studied.
This is the kind of thing that has certain ethical implications for study as well. Many people believe (and I am one of them) that viewing pornography is harmful, and that viewing child pornography is particularly harmful. Additionally, creating *real* child pornography is accepted as harmful to the children involved.
It's difficult to study this because many people would consider it the moral equivalent of studying how much of different poisons is required to kill someone by selecting 1,000 people and administering the poisons until they died.
Of course, the end result is that when trying to make decisions about these things, we're forced to "drive blind," so to speak. But then, I'm inclined to think that (exempting lawyers and politicians) there are certain things so awful we shouldn't expose people to them, or even request volunteers for them.
If one side wants to argue that a fictional cartoon is destructive to the point where it needs to be banned, and that this outweighs the destruction it could prevent, it's up to them to make that case. Dodging the burden of proof by arguing that the testing itself is dangerous is pretty unconvincing when you can't even offer clear reasons why your position should be the default one. Why is a controlled test where guys look at cartoons as dangerous as swallowing poison? Does looking at a picture of Lisa Simpson undressed jab at a real child like a voodoo doll? Will it turn normal people into pedos, or pedos into superpedos? Even if you go partway and show that it's harmful to the consumer, we're supposed to be in this utopian attitude where people are broadly allowed to engage in self-harmful behavior, and we can't ban things just because some think they're disgusting (see gay marriage and the current kick of celebrating everything transgender).
Just think of how much fun you could have inserting Hillary's face in that movie of Hitler in the Bunker.
If students are allowed to do this to their teachers (forgive my mentioning it), wouldn't the school have a compelling interest in intervening, on the basis of student performance?
Without getting into privacy or harassment, it would seem like there's a lot here that could interfere with the functioning of a school or workplace.
No longer hypothetical. Seen today:
FakeApp
I'm curious about where you developed your expertise on porn. In law school? During your clerkship? Or did you just pick it up on the street like the rest of us?