Social Media

Pornhub Isn't the Problem. That Won't Stop the Politicized Crusade Against It.

The case against the popular pornography site rests on misleading data and hidden agendas.


It's not hard to understand why Pornhub—a giant clearinghouse of user-posted porn clips that gets massive amounts of web traffic—makes a politically popular scapegoat for problems plaguing the internet more broadly. Those denouncing the clip site most loudly say the new push to shut it down is a matter of stopping child abuse.

Pornhub isn't without significant issues. But if advocates' chief concern is illicit content involving minors—rather than policing what happens between consenting adults—then the evidence strongly suggests that Pornhub's problems are much smaller in scope than those of popular social media sites such as Facebook, Snapchat, and Instagram. That these sites generally get a pass from Pornhub foes and the press suggests there's something more going on here than concern for protecting children. For politicians, activists, and media personalities looking to score an easy win, the campaign against Pornhub appears to be more about moral grandstanding and leveraging generalized shame around pornography than about addressing the real problem of child abuse and exploitation.

The New Anti-Porn Coalition

Pornhub's prime position in this discourse seems more related to high-profile public relations campaigns against it than the documented prevalence of harmful content.

Major anti-Pornhub campaigns have been led by the group formerly known as Morality in Media—now the National Center on Sexual Exploitation (NCOSE)—and by Exodus Cry, a nonprofit "which presents itself as an anti-sex-trafficking organization, but, per the mission statement laid out in their 2018 tax returns, ultimately aims to abolish sex work entirely," as Tarpley Hitt at The Daily Beast puts it. Exodus Cry stems from a controversial evangelical Christian church called the International House of Prayer.

Calling their efforts TraffickingHub, NCOSE and Exodus Cry have been teaming up to portray Pornhub as a uniquely prolific and unrepentant purveyor of smut featuring minors and abuse. But reports from the National Center for Missing and Exploited Children (NCMEC), tech companies, and law enforcement do not support this contention.

To make their case against Pornhub, then, crusaders often resort to presenting statistics in weaselly ways designed to lead casual readers to false impressions. For instance, in a letter calling for Pornhub to be investigated, Sen. Ben Sasse (R–Neb.) weaves general numbers about Pornhub traffic and searches together with warnings about "the exploitation of human trafficking victims in pornography streamed over the internet."

Shared Hope International peppers its calls to shut down Pornhub with general stats about child sex abuse material. "There were 18.4 million reports of child sexual abuse imagery online in the last year—making up more than one-third of all such reports in the entire history of the internet," the group said in March before accusing Pornhub of "conditioning viewers to tolerate the sexualization of and violence against youth, as well as the objectification of women and girls."

Promotional efforts by Exodus Cry and NCOSE (and the press and politicking that results) also rely on a few other rhetorical tricks. Without evidence, they hold up isolated tales of abuse as representative of Pornhub content more broadly. And despite evidence to the contrary, they suggest Pornhub is particularly bad for attracting and permitting illegal content, with executives at Pornhub indifferent to (or even encouraging of) pictures and videos featuring underage teen girls. Lastly, they suggest that the only feasible solution is to take drastic aim at porn or digital privacy more broadly—sometimes both.

A recent piece by New York Times columnist Nicholas Kristof neatly hits these notes. On Pornhub, "a search for 'girls under18' (no space) or '14yo' leads in each case to more than 100,000 videos," wrote Kristof, before adding that "most aren't children being assaulted." (Conflating role-playing with actual abuse is also a common feature of anti-Pornhub advocacy.)

Kristof tells us that "after a 15-year-old girl went missing in Florida, her mother found her on Pornhub—in 58 sex videos." But police say 58 is the total number of at least semi-nude photos and videos of the runaway teen (most of which did not feature sex) found across a range of websites, including Periscope, Modelhub, and Snapchat along with Pornhub.

That doesn't absolve Pornhub, of course. But it is yet another reminder that the problem goes beyond this particular site—something anti-Pornhub crusaders tend to studiously ignore.

A Google News search reveals ample activist campaigning and political hubbub about Pornhub, but few stories of actual prosecutions involving predators who used the site. The Florida case is one of the same handful that keeps making the rounds in anti-Pornhub articles. Another oft-repeated story occurred over a decade ago. Meanwhile, countless news stories about mainstream apps and gaming sites being used for exploitation receive little attention outside local crime news.

Here's a recent case out of the U.K. involving Facebook and at least 20 boys between the ages of 12 and 15. Here's a prosecution in Arkansas involving Instagram and Snapchat. Here's another recent prosecution involving Snapchat. And another. And another. And another. This September case out of California involved Snapchat along with Twitter, Telegram, and ICQ. Here's one involving Snapchat and Whisper.

An April investigation in New York called "Operation Home Alone" involved Fortnite, Hot or Not, Kik, Minecraft, SKOUT, Tinder, and other apps popular with kids and teenagers. In August, a spokesperson for the New Jersey Internet Crimes Against Children Task Force said platforms used by child predators included Kik, SKOUT, Grindr, Whisper, Omegle, Tinder, Chat Avenue, Chatroulette, Wishbone, Live.ly, Musical.ly [now TikTok], Paltalk, Yubo, Hot or Not, Down, Tumblr, Fortnite, Minecraft, and Discord.

Facebook reported nearly 60 million potential child sexual abuse photos and videos on its platforms in 2019. (The company told The Verge that "not all of that content is considered 'violating' and that only about 29.2 million met that criteria.") That makes Facebook the source of 94 percent of U.S. tech companies' reports to NCMEC.

Despite aggressive (and admirable) efforts to detect and report this content, research suggests that plenty of child sexual abuse material (CSAM) is slipping by Facebook filters and staff anyway. A Tech Transparency Project study of Department of Justice press releases from 2013 through 2019 found that of at least 366 cases involving Facebook, only 9 percent originated from Facebook reporting content to authorities.

The National Society for the Prevention of Cruelty to Children (NSPCC), a U.K.-based organization, looked at "1,220 offences of sexual communication with a child" in England and Wales during the first three months of pandemic lockdowns. "Facebook-owned apps (Instagram, Facebook, WhatsApp) were used in 51% of cases where the type of communication was recorded" and Snapchat in 20 percent, NSPCC said in a November press release.

Putting Pornhub in Perspective

The Internet Watch Foundation, an independent U.K.-based watchdog group, says that it "found 118 instances of child sexual abuse imagery on Pornhub" from January 1, 2017, through October 29, 2019.

Terms like child sexual abuse imagery (and CSAM, used more commonly in the United States) are officialese for a broad range of imagery, from the truly sickening to semi-nude selfies shared by older teens.

MindGeek, the Montreal-based parent company behind Pornhub, has said that "any assertion that we allow CSAM is irresponsible and flagrantly untrue." For a few years now, Pornhub has at least been trying to clean up its act. The company has been doing more—though still not enough, according to many in the porn industry—to crack down on pirated videos, underage videos, and content made or shared non-consensually, while also helping adult sex workers share and profit from their work.

In the past few days, Pornhub announced new policies surrounding content and removed all videos from non-verified accounts.

Many of these changes have long been on the list of porn performer and producer demands. Folks heaping praise on Kristof, Fox News host Laura Ingraham, and other Pornhub-critical members of the media overlook the years of work that those in the industry have been doing to pressure Pornhub into working harder to prevent illegal content of all kinds while also ensuring that adults who want to monetize their work can do so.

Little coverage has acknowledged this activism and lobbying from sex workers, or the broader labor and intellectual property issues behind it. The focus in accounts like Kristof's—or legislation like Missouri Republican Sen. Josh Hawley's latest bill—is on eliciting disgust and condemnation about legitimate problems and then homing in on Pornhub, not on taking a balanced look at the larger problem or soberly analyzing what steps might actually mitigate it.

In several cases that Kristof and Exodus Cry muster against Pornhub, the perpetrators of abuse were relatives of the victims. This is a finding that crops up again and again, no matter what type of child abuse we're talking about: Kids are being exploited by family members and others close to them, or while under the watch of state services. Yet instead of confronting these difficult but commonplace problems in their own communities (or on favorite family-friendly social platforms), people would rather create boogeymen out of a company willing to do business with sex workers.

Abusive content is portrayed as something ignored by online platforms and untraceable by police when, in reality, perpetrators of abuse are often caught and prosecuted because of help from tech companies. Instead of seeing tech companies as allies in thwarting predators and getting justice for victims, however, legislators and activists are often scheming more ways to make a broader group of people and businesses responsible for these harms.

First, they go after the user-generated content platforms like Pornhub (or OnlyFans, or Craigslist and Backpage before them). Then they go after anyone who does business with the platform, from credit card companies and other payment processors to the web hosting services, software, and advertisers that power them. Ultimately, the target becomes anyone enabling adult sex workers to actually work more safely and independently and to profit from their work.

Last week, Visa and Mastercard announced that they would cease their business relationships with Pornhub, after facing mounting pressure to do so. "This news is crushing for the hundreds of thousands of models who rely on our platform for their livelihoods," Pornhub said in response. 

We've been here before, notes sex worker, activist, and author Maggie McNeill. A few years back, activists and officials pressured credit card companies to stop doing business with Backpage—until a federal court said it was unconstitutional.

Now people like Hawley and Kristof—who both have a history of promoting sex trafficking fables—are trying the same thing with sites like Pornhub.

Of course, there will always be more underground platforms, foreign servers, and ways for criminals to evade U.S. authorities and spread their filth. And these more removed platforms, unlike mainstream porn sites, won't be willing to cooperate with the government, making it harder to catch people perpetrating violence, fraud, theft, or abuse. But with payment processor options narrowed, consenting adult porn performers and producers are less able to profit from their legal work and more likely to need to rely on exploitative middlemen.

Once again, their efforts will only end up making things harder for the very people they claim to be concerned with, while also punishing and keeping down sex workers broadly.

Behind the Numbers, Room for Optimism

Behind the big, bad numbers, there's evidence that detection of and action around exploitative material is growing, in a way that signals positive change from all sorts of tech companies.

The National Center for Missing and Exploited Children received 16.9 million total tips about potentially abusive online content in 2019, with about 3 percent of this content originating in the U.S.

Tips to NCMEC came from at least 148 different tech platforms last year, with 15,884,511 tips coming from Facebook; 449,283 from Google; 73,929 from Imgur; 19,480 from Discord; 123,839 from Microsoft; 7,360 from Pinterest; 82,030 from Snapchat; 596 from TikTok; 45,726 from Twitter; 13,418 from Verizon Media; 306 from Vimeo; and 57 from Zoom.

Those are staggering numbers. But it's important to keep in mind that they don't represent verified or discrete examples of child porn. The data reflect the number of tips about content potentially featuring minors that were reported to the NCMEC CyberTipline, often by groups who have a huge legal and financial (not just moral) incentive to overreport rather than underreport. Tips largely come from tech companies, though also from international hotlines, government officials, activist groups, and the general public. A tip simply means that someone raised suspicions about a post or picture, not that the person behind it was necessarily a minor (or even a real person; NCMEC notes that tips may be about computer-generated imagery). One image, ad, or video could be the subject of multiple reports, as well.

According to Palantir's Angela Muller, "viral" content that spurs a lot of reports about the same image or video has contributed to the rise in tips, with much of the sharing being done by people who think they're helping by drawing attention to questionable posts.

Limiting NCMEC data to "cases with an identified victim and one or more adult offenders" yields much smaller numbers, thankfully. From July 2002 through June 2014, NCMEC identified 518 "actively traded cases" (which it defines as those "having been reported on five or more times" to the group's tipline) around the world, involving a total of 933 identifiable victims.

From July 2011 through June 2014, it found a total of 2,598 cases globally with identifiable underage individuals.

The number of tips to NCMEC has risen greatly over the past decade. Yet while many have reported on the rise, few offer any potential reasons for it other than rising rates of abuse and/or indifferent tech companies. This presents a misleading picture, since much of the increase can be attributed to better tech tools and an increasingly proactive approach from tech companies, as well as a growing number of them partnering with NCMEC.

Pornhub started working with NCMEC in 2019. "In early 2021, NCMEC will release our total number of reported CSAM incidents alongside numbers from other major social and content platforms," says the company.

Tech companies are required by law to report potential CSAM when alerted to it.

For social media and other tech entities, growing legal and political pressure around online content moderation generally has coincided with better tools for identifying and filtering out obscene content and underage imagery. This means more red flags raised, and more required reporting, even absent any actual increase in illegal content.

Notably, CSAM numbers do include pictures and videos taken and shared by minors without any adult knowing. So greater numbers of young people owning their own photo- and video-capable phones and having access to social media sites might also explain part of the rise in reported numbers.

According to the Internet Watch Foundation (IWF), "self-generated sexual content featuring under-18s now accounts for nearly a third of all actioned child sexual abuse material online by the IWF."

Instead of chasing selfie-taking teens, tech execs, payment processors, and sex workers, authorities should focus on enforcing existing laws against those actually committing crimes or posting images of abuse. No one except abusers thinks that abusers should get away with this. But as long as tech companies are making good faith efforts to stop abusers from posting content, and cooperating with authorities to identify such content, they can be partners of law enforcement, not enemies. They are the ones in the best position to help recognize illicit content and report it, as well as to provide records that can help in prosecutions.

If you want to catch child abusers and other porn predators—while also protecting victims and workers—you should work with companies like Pornhub, not against them.

Yet for groups like NCOSE and Exodus Cry, and politicians seeking excuses to get government backdoors into encrypted communications, that will not do. Instead, we get a lot of hype about how tech companies—particularly the ones without many friends in Washington—and general data privacy protections are the real issue.