Fake News

Should the Government Regulate Deepfakes? We Asked People on Venice Beach

Deepfakes don't pose a novel threat, and they have many exciting applications that would be stymied by legal restrictions.



The Chinese app Zao can replace a celebrity in a movie scene with anyone holding a smartphone. Stanford researchers have developed an algorithm that can edit talking-head interviews as easily as text, subtracting or adding words that the subject never actually uttered. And the short-lived app DeepNude allowed users to generate a fake (though believable) naked picture of just about anyone.

A recent Pew Poll found that 77 percent of Americans think the government needs to step in and restrict altered or made-up videos or images. The House is considering legislation called the DEEP FAKES Accountability Act, or the Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act, which would make it a crime to post "synthetic media" unless it's labeled by "irremovable digital watermarks."  

Some fear that deepfakes could bring an end to generally accepted truth. In this dystopian future, a 4Chan-using basement-dweller could alter the course of international politics.

What these breathless predictions overlook is that deepfakes are just the latest extension of a tradition of manipulation that goes back to the earliest days of captured and recorded media. They not only pose no threat to society—they have many exciting applications that would be stymied by legal restrictions.

To test the effectiveness of today's cutting-edge software, we challenged people on the Venice Beach boardwalk to spot the deepfake among a series of real video clips featuring Barack Obama.

Hosted and edited by Justin Monticello. Produced by Monticello and Zach Weissmueller. Shot by Monticello, Weissmueller, and John Osterhoudt. Additional graphics by Joshua Swain. Music by Silent Partner, Jingle Punks, RW Smith, and Riot.

"Technology" by Gregoire Lourme is licensed under a Creative Commons Attribution-Share Alike 3.0 license.



Editor's Note: We invite comments and request that they be civil and on-topic. We do not moderate or assume any responsibility for comments, which are owned by the readers who post them. Comments do not represent the views of Reason.com or Reason Foundation. We reserve the right to delete any comment for any reason at any time. Report abuses.

  1. “We Asked People on Venice Beach”

    Is Leno doing Jaywalking again?

    1. It looks just like him.

  2. These acronyms are getting out of control. Can we please get some sensible acronym control regulation?

    1. Sensible Intelligent Legislation Leading You to Not Automatically Make Everything Stupid

  3. And how do we know these interviews are real, and not some deepfakes concocted in Justin’s sinister, cat-themed basement?

    1. I’m not even convinced that, had he shown the videos to people on the Venice Beach boardwalk, it would constitute having shown them to real people.

  4. “A recent Pew Poll found that 77 percent of Americans think the government needs to step in and restrict altered or made-up videos or images.”

    1. That would put Hollywood out of business.
    2. This goes to show you that 77% of Americans think censorship is a great idea.
    3. Interviewing people from Venice Beach on any subject is like polling political opinions of residents of the local insane asylum…only the inmates would make more sense.

    1. “2. This goes to show you that 77% of Americans think censorship is a great idea.”

      Pretty sure it shows a large number of Americans shoot from the hip when some idiot walks up with a list of questions.

  5. In this dystopian future, a 4Chan-using basement-dweller could alter the course of international politics.
    That’s reserved for the editors at The NY Times.

    1. In this dystopian future, a 4Chan-using basement-dweller could alter the course of international politics.

      Basement-dwelling 4Chan users would be able to alter the course of international politics without a manifesto and subsequent shooting spree? The horror.

      1. You know who else dwelled in a (well fortified) basement?

        1. The Institute?

        2. Kaspar Hauser.

  6. So they think the people who make deep fakes can’t remove “irremovable” watermarks?

    1. I bet the discussion of putting an irremovable watermark on a deep fake would cause tears.

      On two occasions, I have been asked [by members of Parliament], ‘Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?’ I am not able to rightly apprehend the kind of confusion of ideas that could provoke such a question. — Charles Babbage

      1. We need Eddy to find “GIGO” in that quote.

        1. All I know is, the most frequent question Babbage got from Parliament was “will this machine allow to communicate an image of my privy member to any lady who might be interested in the same? I ask on behalf of a friend of mine.”

  7. OMFG! What if Donald Trump didn’t actually pee on any Russian hookers?!

    1. How do you think Stormy got her name?

  8. “In this dystopian future, a 4Chan-using basement-dweller could alter the course of international politics.”

    Who came up with the idea that someone who manipulates things (for laughs, because the ability is there, for personal benefit, as an experiment) is a basement-dwelling shut-in covered in Dorito crumbs?

    Deepfakes and trained AI audio are a threat. They shouldn’t be, but since a lie travels around social media so fast and is believed for years after it is proven false I can see the threat. The average social media user is borderline mongo-tier and will believe anything. This is easy to take advantage of.

    Hats off to social media companies for giving someone like your average polfag free, anonymous, and easy access to reactionary idiots. Your concept of giving the unheard minority a voice was a grand idea. It’s like open borders. What? You don’t like a polfag coming in? What did you expect? Oh, that unheard minority are spouting nonsense forcing politicians to cater to them or be smeared? Maybe you shouldn’t have given free megaphones to idiots.

    1. Maybe you shouldn’t have given free megaphones to idiots.

      I’m just gonna leave this here.

  9. What if, in a simulation, we *are* the deep fakes?

    1. The real Dillinger wouldn’t have said that.

  10. I for one welcome our new deep fake overlords, because people are best served by not trusting the elites and power mongers.

    I especially fear hiding deep fakery behind stupid laws, because then only the rich and powerful will be able to use it, as with most illegal activities.

    And besides, contraband laws never stop what they purport to.

    I welcome the new deep fakery destroying people’s belief in trusted elites.

    1. >>contraband laws never stop what they purport to

      i love this axiom.

  11. “You can’t just take anyone’s voice and turn it into an Obama video,” Seitz said. “We very consciously decided against going down the path of putting other people’s words into someone’s mouth. We’re simply taking real words that someone spoke and turning them into realistic video of that individual.”

    Alrighty then.

  12. Because streaming audio over the internet takes up far less bandwidth than video, the new system has the potential to end video chats that are constantly timing out from poor connections.

    Nice! But what about Twitter? Can it end Twitter as well?

    1. Does the simulated video still offer cat face mode?


  14. Not like the media and politicians aren’t already making deep fake claims about each other: Schiffty making wild claims about phone calls, CBS taking words out of other phone calls to make a person sound racist, Dan Rather using the wrong typewriter to forge documents about a president and then claiming they may be forged but the story is real.

    Deep fakes exist already.

    1. by the way can we still get deepnude somewhere?

    2. NBC fitting explosives to gas tanks.
      60 Minutes making up the Alar scare.
      Pick a source: “the world’s gonna end if we don’t wear hair shirts, ’cause new ice age, peak oil, overpopulation, global warming!!!!!!!!!”

    3. The antigun video where they ask the gun rights people what they would do about gun control and then cut to them sitting quietly before the interview rather than the ideas they put forth, making it look like they had no ideas.

      Are these just shallow fakes?

      Acceptable because they furthered the narrative?

      PS get woke bitches

  15. The “deepfake” problem is pretty easy to solve.

    Make folks sole proprietors of their image and likeness (and sound), and let them sue for unauthorized reproductions. The “parody” fair-use exception would need to get pared down, such that what’s-his-name imitating Trump on SNL is allowed, but video altered to look like Trump saying the same things (in his actual voice) is not.

    The hard part will be structuring things such that it’s relatively easy to win such “deepfake” lawsuits when someone does use your image without your consent, while also making it meaningful regardless of the person’s wealth (that is, Richy McRichPants shouldn’t feel like they can make endless deepfakes just ’cause the judgments are always, to them, small change). Making damages percentage-based, rather than an absolute amount, might work.

    But I think the important part is empowering folks to go after “deep fakes” of themselves that they don’t like, with no “can’t you take a joke” exception.

    1. Don’t need new laws for that.

  16. Definitely a good article on the subject. I am ready to wait on any regulation of deep fakes. My opinion may change if real problems or threats develop. I think the best part of this article is the idea that software is being developed that can spot the deep fakes. It may well be that government should put its efforts into supporting the detection software and not try to regulate.

  17. 1. The answer to any question that starts “should the government regulate __________?” is always no.
    2. We need a constitutional amendment against laws with titles selected solely to make a cute acronym.
    3. Just to be sure, let’s ban all social media. (ignore point 1 just for me)

  18. “and let them sue for unauthorized reproductions.”

    How far should this extend?
    Should the lady that got filmed having a bad day on the bus be able to sue who recorded it and posted it to reddit or should she only be able to sue the person that used her face and voice in a porn version? Also, how does one track down who created it? Any video shared on the internet could be a deepfake. You don’t know.

    1. And what about filming cops?

      1. That wouldn’t be an issue. You can still do it. If the cop does something illegal then you use the video as evidence. Part of that process does not require uploading it to the internet to enact mob justice.

  19. Seems to me it can be covered by an extension of existing laws.

    For example, I assume it’s possible to do the equivalent of libel by using a deepfake to suggest someone did something they didn’t.

    Demands from actors on the other hand that there be some legal “right to one’s image” seem excessive and ridiculous, because it would also mean you couldn’t do a drawing of an actor. It’s also funny, because the demand undercuts the idea that there’s any skill to acting, and they just get paid for their pretty faces.

    Still, I could see “truth in advertising” laws affecting deepfakes incorporated into movies or other forms of entertainment, where the producer would have to advertise that he used deepfake techniques to use someone’s image.

    1. “For example, I assume it’s possible to do the equivalent of libel by using a deepfake to suggest someone did something they didn’t.”

      It’s hard to prove damages. If you’re retired and someone uploads a video showing you doing something, it’s not like you’re going to get fired.
      Then there is the other problem of finding out who did it. There are a lot of accounts online by people with names like Mike Hunt and Moe Lestid.

  20. The easy answer to deepfakes is simply to be skeptical of everything and only trust video from reliable sources. Like CNN.

    Oh, and implant a webcam in your skull so every moment of your life is an available record for disproving deepfake alternative facts. Or for creating deepfake alternative facts, as the need arises.

    1. “He said, she said” disputes are going to get interesting when they both have video to prove their point.

      1. Or that won’t be necessary because they’ll both have videos of each other having sex. Would you have a dispute with someone that has video of you sucking on toes while wearing swim floaties and jerking off? Remember, she’s never judgemental until you piss her off.

    1. I did not watch the entire vid.
      The guy has a German (Dutch/Austrian) accent; I wonder if YT cut his legs off ’cause stupid German laws regarding anything to do with Nazis?

      1. No, it seems that YT is introducing some new policies regarding ‘violent content’. It’s WWI history so there’s not a lot of talk of Nazis– really none at all in fact. It’s all about Prussia and the Italians and the French.

  21. If deep fakes aren’t a serious problem in the upcoming election, they soon will be in the years to come. Why bother compiling a Pissgate Memo when you can create a video of a presidential candidate being pissed on out of thin air?

    People will soon find ways to manufacture bogus videos of their ex-wives abusing their children for the benefit of divorce courts, etc., too. The only solution I see is blockchain technology. In the future, if your video isn’t authenticated as original and legitimate through some blockchain service, it’ll be meaningless.

    1. “People will soon find ways to manufacture bogus videos of their ex-wives abusing their children for the benefit of divorce courts, etc.”

      On the plus side, I may soon find it easy to superimpose myself on Jimmy Page throughout the entirety of The Song Remains the Same.

      1. Her: “So, what do you do for a living?”
        You : “I sing songs. Here, let me show you this video.”

        Nice.

    2. The thing is

      1. There’s only a very thin slice of time before the people who would care enough to pay attention to the sourcing of the ‘evidence’ would stop paying attention to video.

      2. Those people are a tiny minority anyway – everyone else is just screaming because everyone else is screaming and ‘video evidence’ is just an excuse for them to pillory a person they fully intended to pillory anyway. So it will be exactly like it has been for, oh, the last several tens of thousands of years.

      1. When it becomes easy for people to take videos of those they hate and make them do and say anything they want, they’ll leverage that technology in all sorts of ways.

        There are already phone apps that let people take pictures of themselves and superimpose themselves into major movies and things like that.

        This isn’t just going to be about presidential campaigns. It’s going to be about people making a video of you stealing out of the petty cash drawer and your boss not knowing whether it’s real or not.

        It’s going to be videos of crimes in progress–and the jury not knowing whether the video is real. What if the cops can make a shooting victim do and say anything they want on video?

        The solution is blockchain so that we can authenticate that the video was shot within a certain time frame and that the video has been unaltered since then.

    3. “The only solution I see is blockchain technology.”

      Record the videos. Transfer it to your computer. Do some editing magic. Delete the original source. Upload.
      That way you can get in a fight with your wife, have her scream all kinds of crazy shit at you, record the audio, hire a hooker that looks like your wife, have her get half naked and smear her feces all over the baby’s bedroom, put in the audio, and then use that in court to say she’s unfit. The judge is going to take one look at the poop mommy and give you the kids.

      1. Use the camera’s private key, not yours.

    4. In the future, if your video isn’t authenticated as original and legitimate through some blockchain service, it’ll be meaningless.

      Uh, Ken… kinda my point above is that this shitty band aid works both ways. Somebody authoritative has to control the blockchain and deem what is and is not real/reality. Odds are pretty good that even a massive multi-national corporation stocked with some of the best minds on the planet is going to stamp an official blockchain seal of approval on lots of obviously bogus shit.

      Means, motive, opportunity, circumstantial evidence, multi-factor authentication… we’ve been seeing through much more prolific and elaborate ruses than deep fakes for a long time.

      As we see now, the bigger and more involved or protracted problem is when multiple sources across multiple platforms all shake hands, nod at one another, and say “Donald Trump peed on Russian hookers.” at the same time. Deep fakes don’t mean dick without massive amounts of collectivism and widespread collusion.

      1. “Somebody authoritative has to control the blockchain and deem what is and is not real/reality.”

        Actually, blockchain can be all about not needing to trust the middle man anymore.

        If you can authenticate that a file was uploaded by someone at a specific time and hasn’t been altered since it was uploaded, that would be an enormous improvement. It won’t eliminate all doubt about every video, but superimposing a candidate’s face on something that happened after the time that file was uploaded will be caught quickly.

        We’re not letting the perfect be the enemy of the good, are we?

        Because something doesn’t completely solve a problem in every case isn’t a good reason not to solve 99% of the problem. You will always need to be a little skeptical of the things you see. That’s true now, and it will still be true in the case of blockchain. All blockchain will do is make video easier to authenticate and deep fakes easier to detect. That is a significant improvement, but, no, it is not a complete substitute for reasonable caution.
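        The authentication idea described above (register a hash at publish time, verify later) can be sketched in a few lines. This is a hypothetical illustration using only Python's standard library; the `ledger` dict stands in for whatever public blockchain or timestamping service would actually hold the digests:

```python
import hashlib

def register(video_bytes, ledger, timestamp):
    """Record the file's SHA-256 digest, as if posting it to a public ledger."""
    digest = hashlib.sha256(video_bytes).hexdigest()
    ledger[digest] = timestamp
    return digest

def verify(video_bytes, ledger):
    """Return the registration timestamp, or None if the file was never
    registered (or has been altered since registration)."""
    return ledger.get(hashlib.sha256(video_bytes).hexdigest())

ledger = {}
original = b"raw frames straight from the camera"
register(original, ledger, "2019-10-01T12:00:00Z")

# The untouched file authenticates; any altered copy does not.
assert verify(original, ledger) == "2019-10-01T12:00:00Z"
assert verify(b"raw frames, deepfaked", ledger) is None
```

        Any single-bit alteration changes the SHA-256 digest, so an edited copy simply fails to match any registered entry. What the scheme cannot prove, as noted above, is that the registered file was truthful in the first place.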

    5. At least someone here is getting the wider implication for society that this is going to pose. Imagine Reason’s shock when police start manufacturing video evidence of people’s guilt. And the hilarious backpedaling they are going to do on “there shouldn’t be laws about deep fakes”

      This is going to be used far and wide to punish people that someone doesn’t like. Just wait until you get an upstart politician looking to make real change and the establishment has “video evidence” that they were abusing kids. This is going to be phenomenally destructive to society.

      1. “Imagine Reason’s shock when police start manufacturing video evidence of people’s guilt. And the hilarious backpedaling they are going to do on “there shouldn’t be laws about deep fakes”

        To my eye, it’s more like my feelings about educating children–this is a job that’s too important to leave for the government to do.

        What usually happens is that the government sets some standard—and they shouldn’t be the ones setting the standards. If lie detectors are insufficient for the purposes of criminal court, the last thing we need is politicians setting standards for when evidence is acceptable. That just shouldn’t be subject to a popularity contest, and it may require more flexibility than a federal law will allow.

        Ever heard the phrase, “Good enough for government work”? Private industry may be able to reach standards far above the level that some rent seekers manage to persuade politicians to push through.

        NASA hasn’t even test fired the heavy load capable rockets they started designing–years before SpaceX existed. Meanwhile SpaceX is already launching and relaunching heavier loads. This is the kind of outcome we can expect when the government gets involved in setting standards.

        And the standards of evidence in our court system are just far too important to subject to that influence.

  22. . . . unless it’s labeled by “irremovable digital watermarks.”

    So . . . by something that doesn’t exist and could never exist?

    Because if your deepfake AI is good enough you could just run the video through it to remove the watermark and rebuild a new video from scratch.

    Like what the fuck?

    The people are like the physicist from a story I heard long ago. This one physicist tells a colleague that he has a cunning plan to peer inside the event horizon of a black hole. See, he’s going to tie a rope around a kitten with a camera strapped to it. Then he’ll lower the kitten into the hole. The colleague replies that that could never possibly work, but the physicist replies, ‘Ah, but I have this whip.’

    Nerd harder motherfuckers.

    1. “So . . . by something that doesn’t exist and could never exist?”

      People are already doing things like this with blockchain. My first guess would be that we could do this with the Ethereum platform already. Imagine a service like the ones that track music royalties for songs—and distribute that music digitally, as well. Think of Mastodon. Think of Steemit. Think of D-Tube, which is a blockchain-based video distribution platform that competes with YouTube.

      This stuff already exists. The need and desire for news video to be authenticated against deep fake tech doesn’t exist yet–but as deep fake technology goes mainstream, that need and desire will probably arise.

      Why wouldn’t it?

      1. People are already doing things like this with blockchain.

        That’s not an “irremovable watermark”, that’s simple public registration of the image. It likely won’t satisfy the provisions of this law. Requiring people to sign or identify fake, libelous videos is b.s.; it’s like trying to get drug dealers to comply with gun registration. It’s an absurdity.

        What would work is for reputable publishers to document the provenance of everything they publish in a blockchain: who took it, where, how it was altered, who it was transmitted to, etc. That’s something they can do any time they like, it doesn’t require any government intervention. I won’t hold my breath, however, because it’s the media that like to fake things.

        1. “It likely won’t satisfy the provisions of this law.”

          Just for the record, I oppose any law regarding this. I’m talking about news sources authenticating their video in order to maintain and compete on the basis of their legitimacy–of their own free will.

          There is no need for government involvement, and there are plenty of good reasons to oppose government involvement.

          “What would work is for reputable publishers to document the provenance of everything they publish in a blockchain: who took it, where, how it was altered, who it was transmitted to, etc. That’s something they can do any time they like, it doesn’t require any government intervention.”

          That’s my point!

          This technology already exists.

          1. That technology is almost as old as digital cameras. It doesn’t even require blockchain. News organizations have chosen consistently not to use it.

    2. Because if your deepfake AI is good enough you could just run the video through it to remove the watermark and rebuild a new video from scratch.

      This is only half, or less, of the problem and, IMO, puts the cart in front of the horse. The other half of the problem is “How much do you trust the watermark?” or even “Does the watermark, in any given context, even make sense?”

      Would you feel better if a video had a Ministry of Truth watermark on it?

  23. “And the short-lived app DeepNude allowed users to generate a fake (though believable) naked picture of just about anyone.”

    It was short lived because someone ruined everything with their Pelosi/Ginsburg makeout video.

    1. There was a great disturbance in the force, as if a million voices cried out….

      1. ..and then vomited.

    2. “It was short lived because someone ruined everything with their Pelosi/Ginsburg makeout video.”

      There are way more people who should NOT be seen nekid than the alternative.

  24. Hmm, maybe if people stopped looking to others to tell them what to believe and did something radical like think for themselves.

    1. Look at the hopeless romantic over here.

    2. Or more accurately – if people stopped looking to unknown others and instead talked with people face to face in person.

      For example – if you actually want to know what people think about deep fake – you gotta actually interview them on the street in person. Everyone watching the video of man-in-street interview is subject to yet more manipulation (eg the possibility that the man-in-street interview is itself a deep fake). Only the people who took part know what’s real and what isn’t.

      This is all just mass communications going to v 3.x . A continuation of the manipulation of the human animal – using scientific understanding of ourselves – that has been going on for 140 years or so. What is disturbing is that I’m sure there are a lot of techs working in this field for this purpose who believe themselves to be ‘libertarian’ who don’t realize (or don’t care) that everything they are doing is destroying actual liberty.


  26. “They not only pose no threat to society—they have many exciting applications that would be stymied by legal restrictions.”

    Gotta love this take. “Who cares about the lives this is going to ruin? Think of the shekels that will be made in the economy!”

    There’s no threat. I mean, it’s not like wives will produce video evidence of their husbands beating them and their kids. Or of rape. And abuse the law with this technology. It’s not like police will have the perfect tools to manufacture proof of guilt to prosecute people. This is fine. Don’t worry. There’s no threat to society or the individuals that comprise it here.

    1. There’s no threat. I mean, it’s not like wives will produce video evidence of their husbands beating them and their kids. Or of rape. And abuse the law with this technology. It’s not like police will have the perfect tools to manufacture proof of guilt to prosecute people. This is fine. Don’t worry. There’s no threat to society or the individuals that comprise it here.

      Oh, there is a threat. But this isn’t the solution. It’s already against the law to provide fake evidence or to libel/slander people.

      The only thing this law is doing is give the government more tools to blackmail people into legal deals and cooperation, and to oppress political critics.

      But, then, those kinds of totalitarian and police state powers are exactly what you want in a state, aren’t they, Kivior?

    2. I mean, it’s not like wives will produce video evidence of their husbands beating them and their kids. Or of rape. And abuse the law with this technology.

      Yeah, because until the deep fake technology was invented women were completely stymied when it came to filing false allegations of rape. DCFS agents were always sitting around “waiting for the video” before kicking in the door to peoples’ houses and stealing their kids.


  28. If Deep Fakes are not banned, the only reasonable conclusion I can come to is that I will have to quit believing everything I see on the Internet

  29. The House is considering legislation called the DEEP FAKES Accountability Act, or the Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act, which would make it a crime to post “synthetic media” unless it’s labeled by “irremovable digital watermarks.”

    There is no such thing as an “irremovable digital watermark”. This would basically outlaw all “synthetic media”.

    It’s also completely unnecessary, since people can already get justice in civil courts when their image is being misused or they are being slandered/libeled.

    This is just another power grab by the federal government trying to gain new tools for blackmailing citizens and imposing censorship.

  30. Here is the solution to fake news by reputable media: document your provenance with digital signatures. Use forensic cameras (that digitally sign data) for recording, link publish images back to the original, unedited images, have interviewees digitally sign their interviews. Better yet, document it all in a blockchain for everybody to see.

    But that’s not what corporate media want. What they want is to be able to publish with impunity (they’d never be targeted under these kinds of laws, they’d just plead incompetence), while creating a chilling effect for anybody who ever published a meme with a speech bubble coming out of a politician’s mouth (“synthetic image”).

    Recognize this law for what it is: a power grab by censors and corporate media. The solution to deep fakes is much simpler, requires no laws, and can be adopted today by any media organization that wants to.
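    The provenance scheme described above (each record documenting who captured or altered the footage, linked so later tampering is detectable) might look something like this minimal hash-chain sketch. It is a hypothetical illustration using only Python's standard library; a real system would use public-key signatures from forensic cameras rather than bare hashes:

```python
import hashlib
import json

def add_record(chain, action, payload_hash):
    """Append a provenance record; each record commits to the previous one."""
    prev = chain[-1]["record_hash"] if chain else "0" * 64
    record = {"action": action, "payload_hash": payload_hash, "prev": prev}
    body = json.dumps(record, sort_keys=True).encode()
    record["record_hash"] = hashlib.sha256(body).hexdigest()
    chain.append(record)
    return record

def chain_is_intact(chain):
    """Recompute every hash; any retroactive edit breaks the chain."""
    prev = "0" * 64
    for r in chain:
        body = json.dumps({"action": r["action"],
                           "payload_hash": r["payload_hash"],
                           "prev": r["prev"]}, sort_keys=True).encode()
        if r["prev"] != prev or r["record_hash"] != hashlib.sha256(body).hexdigest():
            return False
        prev = r["record_hash"]
    return True

chain = []
raw = hashlib.sha256(b"raw camera footage").hexdigest()
add_record(chain, "captured by camera #1234", raw)
edited = hashlib.sha256(b"color-corrected footage").hexdigest()
add_record(chain, "color correction", edited)
assert chain_is_intact(chain)

# Quietly rewriting history is immediately detectable.
chain[0]["action"] = "captured by camera #9999"
assert not chain_is_intact(chain)
```

    Because each record's hash commits to the previous record, rewriting any earlier step invalidates everything after it; publishing the chain lets anyone audit the editing history without having to trust the publisher.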

  31. All the news I recall seeing on teevee is at best mendaciously slanted. Only Reason and the WSJ get any slack from me, and I haven’t since 1980 bought any other periodicals for information value. They used to sell Pravda hurriedly translated into English and I’d buy that, but no more expecting anything to be true there than in “the” local paper, Slime or Newspeak.

  32. I’m not exactly sure what a Deepfake is. Thought it was a football term.
