Watch Barack Obama call President Trump a "total and complete dipshit":
Of course, that isn't real. It's a video created by comedian and Academy Award winner Jordan Peele, employing increasingly easy-to-use programs to demonstrate the coming age of nearly seamless "fake news" images.
A couple of months ago, when the Reddit user deepfakes first publicized his ability to swap anyone's face into porn, the reaction was swift and mostly univocal: This was a threat to the very universe! As the reliably alarmist Motherboard hyperventilated:
An incredibly easy-to-use application for DIY fake videos—of sex and revenge porn, but also political speeches and whatever else you want—that moves and improves at this pace could have society-changing impacts in the ways we consume media. The combination of powerful, open-source neural network research, our rapidly eroding ability to discern truth from fake news, and the way we spread news through social media has set us up for serious consequences.
Well, no. For starters, the control and manipulation of images and events has been with us forever. The powerful have always been able to do this, going back to the days when leaders would kill people for publishing unauthorized versions of speeches. Contra Walter Benjamin, whose "The Work of Art in the Age of Mechanical Reproduction" (1936) is one of the most influential essays written in the past century, the ability for more and more of us to detach words and images from the specific time and place of their creation and instantiation is incredibly liberating. The same types of technology that allow us to put a mustache on the Mona Lisa and circulate that image globally also allow people to speak truth as they see it to power.
Reappropriating, misappropriating, decontextualizing, recontextualizing—as all that has become easier and easier over the years, the result has been a wellspring of expression for the relatively powerless. That was the essential insight of the early scholars of fan fiction, who studied works such as the semi-notorious "slash" fiction written by Star Trek fans shortly after the original series was canceled in 1969. Fans wrote stories in which Capt. Kirk and Mr. Spock engaged in sadomasochistic sexual adventures and sold them via newsletters under the code K/S (hence the term slash). As Constance Penley of the University of California at Santa Barbara wrote,
Slash fans do more than "make do" [with mass-produced materials]; they make. Not only have they remade the Star Trek fictional universe to their own desiring ends, they have achieved it by enthusiastically mimicking the technologies of mass-market cultural production, and by constantly debating their own relation...to those technologies.
That same sort of turn is at work in all sorts of political messaging, too, from lefties such as Shepard Fairey and Robbie Conal to right-wing guerrilla artists such as Sabo. The technology that allows us to create and distribute deepfake videos is simply the latest and greatest method of letting all sorts of people speak in all sorts of ways.
That isn't to say that the rise of videos like Peele's shouldn't give us pause. In an age of deepfakes, fake news, polarization, and paranoia, we need to become better and better at critically reading media and all sources of information. Two-thirds of us already believe the mainstream media publish a lot of horseshit, which is a good start. Back in the 1990s, as cable news started to proliferate and dictate what we considered news and reality, shows such as The Daily Show arose to help teach us how to read tropes and motifs more critically. Even before that, postmodern TV programs such as Mystery Science Theater 3000, The Simpsons, Beavis and Butt-head, and Space Ghost: Coast To Coast offered weekly lessons in how to consume media critically. They were funny as hell, but by foregrounding the act of interpretation (even or especially by asshats like Beavis and Butt-head) they also gave us a set of very useful tools.
What the Obama video above and others like it drive home is the need to step up our game. That is one of the main lessons to emerge from those Russian trolls posting ads and dubious info on Facebook, too.
We will never be able to rein in fake news; indeed, we will never even be able to agree on its precise definition. But that doesn't mean we're powerless. Writing recently in The New York Times, University of Maine journalism professor Michael Socolow laid out a tentative program that would help "prevent smart people from spreading dumb ideas." Among his suggestions: Don't share surprising news that doesn't link to evidence or supporting facts, be skeptical when a story perfectly confirms your most intensely felt beliefs, and always ask, "Why am I talking?" The way forward is always through empowering individuals to be better filters for themselves, not through delegating authority or interpretation to the government or other gatekeeping institutions.