"Grindr knew," states Matthew Herrick's lawsuit against Grindr, the hookup app for gay men. Herrick and his lawyers say they filed many reports with the app about users impersonating him. Even so, there wasn't much that Grindr could do about it. Each time it blocked an offending profile, Herrick's vengeful ex-partner created a new one, using different anonymous details for the profile but posing as Herrick to prospective sex partners over direct message. Herrick's suit claims that his ex did this to direct sex-seeking men to his home and workplace daily for several months.
For the past few years, Herrick has been trying to hold Grindr financially responsible for the harassment, which he endured between the fall of 2016 and early 2017. Today, his case was heard before the U.S. Court of Appeals for the Second Circuit.
Some are framing the case as a way to address both Grindr's negligence and toothless federal laws. "Herrick's story echoes the online harassment that many people have experienced, often with little to no legal consequences for the companies that created the technology in question," NBC reports. "A 1996 law designed to foster free speech online generally protects companies from liability."
The relevant part of that law, Section 230 of the Communications Decency Act (CDA), has been under fire from many angles recently, often from those with little understanding of what the law actually does and no real consideration of what a world without it would look like. Meanwhile, media outlets and members of Congress suggest that repealing Section 230 would cure much of what ails us online.
Grindr blocked the impersonating profiles and users. It just couldn't keep a highly motivated ex from finding ways around these bans. Herrick's suit suggests the site should have done more, such as banning the use of virtual private networks (VPNs) to access Grindr, or scanning phrases and photos sent through direct messages in order to screen them for matches to his image, address, etc. But taking these steps would seriously compromise the privacy of all Grindr users. They are far from the simple software tweaks Herrick's suit makes them out to be.
NBC claims that Section 230 has allowed space for tech companies "to introduce products without much forethought about the hazards they could create." That's one hell of a misrepresentation. Effectively filtering out prohibited content and dealing with suspected abuse has become a lodestar for (and bane of) all sorts of social platforms, from dating apps to Facebook, digital marketplaces like Backpage and Craigslist, video platforms like YouTube, review sites like Yelp, and so many more.
It's a huge and complicated problem, and it's laughable to suggest tech companies aren't trying to address it. Most of the major platforms have faced lawsuits related to these issues. An inability to verify user identities has fueled two years of political ballyhooing. The problem is that motivated actors can game the best of online systems, just as they did old-fashioned ones.
The "fixes" desired by activists, politicians, and plaintiffs, meanwhile, can introduce more problems than they solve.
"Online harassment is a serious problem, and one that defies easy solutions," write Rebecca Jeschke and Jamie Lee Williams of the Electronic Frontier Foundation (EFF), which filed an amicus brief in support of Grindr. "As the digital world grapples with potential strategies to make online life safer, we have to also fight back against misguided approaches that would undercut what makes the Internet an essential tool for modern life."
They add that
Section 230 does not mean that victims of online harassment have nowhere to turn. Most jurisdictions have laws against abusive speech. Law enforcement needs to get smarter about online harassment so it can protect people in danger, while courts should become comfortable with legal remedies against online perpetrators. We hope that the appeals court recognizes that holding platforms responsible is not the answer, and dismisses this case.
Herrick's appeal says this is "a case about a company abdicating responsibility for a dangerous product it released into the stream of commerce." It accuses Grindr of releasing a defective product that is "fundamentally unsafe."
NBC suggests that this argument "could reshape consumers' relationship with software, alter speech protections online and put pressure on Silicon Valley to find flaws in products before introducing them to the world." That's overselling the stakes quite a bit.
No matter the new language Herrick's lawyers are using, this is still a case that conflicts with Section 230 and the First Amendment. Herrick's case was previously rejected on such grounds last year, when a federal court agreed with defendants that Section 230 barred Herrick's claims against Grindr. And Herrick's appeal offers little that was not contained in his initial argument.
In a January 2018 opinion, U.S. District Judge Valerie Caproni noted that the CDA "bars Herrick's products liability claims and his claims that Grindr must do more to remove impersonating profiles. Each of these claims depends on holding Grindr responsible for the content created by one of its users." Meanwhile, "Herrick's misrepresentation-related claims fail on the merits because Herrick has not alleged a misleading or false statement by Grindr or that Grindr's alleged misstatements are the cause of his injury."
Caproni added that "although Herrick contends that Grindr is not an 'interactive computer service' … the Court finds that there is no plausible basis to argue that it is not." And Herrick's attempts to identify "any legally significant distinction between a social networking platform accessed through a website, such as Facebook, and a social-networking platform accessed through a smart phone app, such as Grindr," failed as well. "In either case, the platform connects users to a central server and to each other."