Reason Roundup

New Bill Would Empower Trolls, Make It Easier for States To Sue Tech Companies

Plus: Atlanta shooter blames "sex addiction," Maryland wants new occupational licensing requirements, and more...


The PACT Act is back, and still problematic. The latest bill addressing online content moderation contains a big old mess of changes that could enable trolls and seriously burden solo actors, small businesses, and big tech companies alike, while also giving state prosecutors new power to sue entities they don't like. Called the Platform Accountability and Consumer Transparency (PACT) Act, the measure has bipartisan support.

Sponsored by Sens. Brian Schatz (D–Hawaii) and John Thune (R–S.D.), the PACT Act was first introduced last summer—and widely panned by constitutional scholars and internet law experts. And while the new legislation contains some changes, lawmakers still failed to fix many earlier problems.

The biggest problem with the PACT Act involves Section 230, the federal law that shields internet platforms from some liability for user-created content. An existing exception applies where federal criminal law is concerned. Under the PACT Act's proposed changes, federal civil laws would be exempted as well. This means federal regulatory agencies could sue online entities when things their users post allegedly violate civil laws, including anti-discrimination and accessibility statutes.

Even more significantly, the PACT Act would let state attorneys general get in on the action—"allowing state attorneys general to enforce federal civil laws against online platforms," as Schatz's press release puts it. That means that for the same alleged violation, a company could face the wrath of the federal government and dozens of state prosecutors at the same time.

"That is a potentially massive change"—and not a good one, points out Techdirt's Mike Masnick:

State AGs have long whined about how Section 230 blocks them from suing sites—but there are really good reasons for this. First of all, state AGs have an unfortunate history of abusing their position to basically shake down companies that haven't broken any actual law, but where they can frame them as doing something nefarious… just to get headlines that help them seek higher office. Giving them more power to do this is immensely problematic—especially when you have industry lobbyists who have capitalized on the willingness of state AGs to act this way, and used it as a method for hobbling competitors. It's not at all clear why we should give state AGs more power over random internet companies, when their existing track record on these issues is so bad.

The PACT Act would also remove Section 230 protections for companies that have "actual knowledge" of illegal content or illegal activity posted by users and do not remove it within four days.

Unlike many similar proposals, the PACT Act would at least require some sort of official documentation saying content is illegal before a company is forced to remove it. Illegal activity is defined as "activity conducted by an information content provider that has been determined by a trial or appellate Federal or State court to violate Federal criminal or civil law" and illegal content is defined as content "provided by an information content provider that has been determined by a trial or appellate Federal or State court to violate" federal criminal or civil law or state defamation law.

The takedown requirement is perhaps more pointless than anything else. Companies "already take down content with a court order. And often don't or drag their feet without one," writes Masnick. "This is another fix for a problem that doesn't exist."

The PACT Act's provisions go way beyond Section 230. Among other things, the bill would require internet companies to do the following:

  • Publish an "acceptable use policy."
  • Provide a 5-days-per-week, 8-hours-per-day hotline for people to ask a "live company representative" questions about the acceptable use policy and any content moderation decisions the company makes, as well as to report content that a user thinks may be illegal or may violate the company's acceptable use policy.
  • Provide an email complaint system for the same purposes.
  • Provide a formal appeals process for people who don't like a company's content moderation decisions.

It would also require internet companies to submit to the federal government, twice a year, a report outlining how they enforced their acceptable use policies.

The biannual report would have to say how many reports the company received, about what kinds of content, and who made those reports; the number of times the company took action on content, what kind of content it was, and what type of action was taken; the number of content removals broken down by what rules were violated, who flagged the content, what country the content provider was based in, and "whether the action was in response to a coordinated campaign" of some sort; the number of times the company did not remove or take action on flagged content; and the number of appeals it received and what sorts of action were taken on those appeals.

In other words, online entities that allow user-generated content would have to explain, and answer to the government for, just about every content moderation decision they make. And failing to properly comply with this transparency requirement would be considered "an unfair or deceptive act" under federal law.

"Individual providers" would be exempt from some of these requirements, including running their own hotlines and submitting biannual transparency reports—but only if they receive fewer than 100,000 unique monthly visitors (which is not really that much).

"Small businesses" would be exempt from the hotline and reporting requirements only if they saw less than 1 million unique monthly visitors.

"Internet infrastructure" services such as web hosting, domain registration, data storage, and cloud management companies and providers of broadband internet access would be exempt.

Overall, the PACT Act is sweeping in terms of the burdens it would place on internet companies without a clear indication of how these changes would solve any existing issues. The bill "solves for things that are not problems, and calls other things problems that are not clearly problems, while creating new problems where none previously existed," Masnick suggests.

As mentioned above, some of the Section 230 changes could be especially damaging. Allowing state prosecutors to sue websites could open up the litigation floodgates and lead to a lot of biased, agenda-driven lawsuits that benefit attention-seeking attorneys general at the expense of internet users and companies. The PACT Act would also burden online platforms and publishers with a crazy amount of new content moderation rules, paperwork, and "transparency" requirements without any obvious upside.

"Forcing every website that accepts content from users to post an 'acceptable use policy' leads us down the same stupid road as requiring every website have a privacy policy," writes Masnick. "It's a nonsensical approach—because the only reasonable way to write up such a policy is to keep it incredibly broad and vague, to avoid violating it. And that's why no one reads them or finds them useful—they only serve as a potential way to avoid liability."

The required process for allowing people to report, question, contest, and appeal all content moderation decisions would be an even bigger burden, and one that would allow for targeted harassment and censorship campaigns by groups intent on punishing certain platforms or silencing certain voices.

"This bill basically empowers trolls to harass companies," Masnick writes. "All it will do is harm smaller companies, like ours, by putting a massive compliance burden on us, accomplishing nothing but…helping trolls annoy us."


FOLLOW-UP

Georgia shooter claims sex addiction. Police say Robert Aaron Long, the man suspected of shooting up three massage parlors and killing eight people on Tuesday, blamed his actions on "sex addiction." Friends of Long's said he was deeply Christian and felt guilty about patronizing sex workers at massage parlors. Already, some media have been minimizing the shooter's actions and instead blaming the existence of massage parlors that permit sexual activity for "leaving the women working there particularly vulnerable to violence and abuse."

Police have also expressed what seems like sympathy for the shooter's sexy-women-made-me-do-it defense, with Cherokee County Sheriff's Office Captain Jay Baker saying at a press conference yesterday that the businesses were "a temptation for him that he wanted to eliminate" and "he was fed up, at the end of his rope. He had a bad day, and this is what he did."


FREE MARKETS

https://twitter.com/walterolson/status/1372465199450615808?s=12


QUICK HITS

• A bill to remove the long-expired deadline for ratifying the Equal Rights Amendment passed the House of Representatives yesterday.

• A baby born to a woman who received the COVID-19 vaccine while pregnant "marks the first known case of a baby born with coronavirus antibodies in the U.S., which may offer her some protection against the virus."

• A Japanese court has ruled the country's ban on same-sex marriage unconstitutional.

• New Hampshire is coming for tiny homes.

• Meet the new war on terror, just as bad as the old war on terror.