
The EARN IT Act Is the New FOSTA

The new bill takes aim at internet freedom and privacy under the pretense of saving kids.


A new bill with high-powered bipartisan backing takes aim at free speech and privacy online under the pretense of saving sexually abused children. Sound familiar?

Like the 2018 "sex trafficking" law FOSTA, the new Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act could strip crucial legal protections from a huge array of apps, blogs, social media, messaging services, crowdsourced content platforms, and much more, upending the internet as we know it in the process.

Theoretically, EARN IT would simply set up a federal commission to determine best practices for preventing the online spread of "child sex abuse material." (That's Congress' catch-all term used for sexual imagery featuring minors, from the truly sick and reprehensible to semi-clothed selfies 17-year-olds take of their own volition.) Effectively, it would give the Trump administration carte blanche over the rules of engagement on the internet.

"Best practices" as determined by the commission could mean all encrypted communication services being built with a "backdoor" for government access. It could mean making users of social media, online marketplaces, and apps from Yelp to Grindr submit proof of real identities before they can join or post. It could mean turning over more user data to authorities or disallowing any accounts that post sexual material at all.

We just don't know. And neither do legislators—the bill would give a commission selected by the attorney general the regulatory equivalent of a blank check.

Anti-Section 230 politics drives both EARN IT and FOSTA

Failure to follow the commission's practices—to "earn it"—could mean internet companies lose the protections of a federal communications statute known as Section 230.

Passed in 1996, Section 230 says providers and users of interactive computer services aren't automatically liable for each other's content or conduct. At the time, web entrepreneurs worried that any system for reporting, flagging, and making corporate judgment calls on user-posted content would open them up to legal liability for things like defamation and obscenity (an interpretation based on how First Amendment law is applied to physical bookstores, newsstands, etc.). Section 230 was passed, in part, to explicitly permit then-dominant tech companies like AOL and MSN to monitor and moderate users' posts without risking endless time in court.

It's now the legal framework undergirding the internet as we know it, protecting both open speech and community-specific censorship on platforms no one dreamed of back in '96.

But Section 230 has long been hated by state attorneys general and civil suit lawyers, because it means they can hold only the people who actually say or do illegal things criminally (and financially) accountable; they can't win against web companies that merely hosted actionable content without knowing it existed, was criminal, or would contribute to a criminal act.

Lately, industries that failed to adapt well to tech-forward competitors (such as newspapers) or industries that simply resent new competitors for cutting into profits (like hotel chains losing money to Airbnb or beset-from-all-sides Match.com) have also joined the fight against Section 230. Their opposition lends credence to claims that the law's destruction would also doom social media, amateur journalism, "peer-to-peer" business models, and more.

Meanwhile, activists and politicians have cited every major social ill and safety issue as a reason to amend or abolish Section 230.

With FOSTA, lawmakers succeeded in carving out an exception by conjuring the specter of an epidemic of U.S. "child sex trafficking." Not only were their numbers nonsense, but the law goes far beyond sex trafficking, making it a federal crime to host any content that could facilitate any prostitution.

The result has been the suppression of all sorts of speech and content related to sex work, as well as major sites like Craigslist shutting down all personals ads. With this (and the government's unrelated seizure of Backpage) has come a huge amount of hardship for sex workers of all sorts, particularly those already most vulnerable to violence and abuse.

FOSTA has yet to be used by any government entity in the nearly two years since it was signed into law, though it is being invoked in private civil lawsuits against Craigslist and the email marketing company Mailchimp. Though supporters held it up as an absolutely urgent and crucial tool for stopping horrible abuse, the only attempts to use FOSTA's power in court so far seek payouts from companies far removed from that abuse.

But perhaps FOSTA—a law that may be hard-pressed to pass constitutional muster in court—was never meant to be used directly. Its chilling effect on internet speech, and on economic liberty, has already taken hold without a single charge being filed.

Now, lawmakers are looking to recreate this magic with the EARN IT Act.

EARN IT isn't about saving kids

Anything related to child pornography—producing it, receiving it, accessing it, viewing it, possessing it, distributing it, selling it, and so on—is a federal crime, and these statutes are vigorously enforced. (Unless you're with the FBI, that is.) For tech companies, it's also a federal crime to fail to report such images if discovered on their servers.

And nothing in Section 230 stops the Justice Department from using federal criminal laws against tech companies.

This means that nothing prevents authorities from holding child porn perps liable in criminal court. "Put simply, Section 230 does not keep federal prosecutors from holding providers accountable for [child sex abuse material] on their services," writes Riana Pfefferkorn, associate director of surveillance and cybersecurity at Stanford Law School's Center for Internet and Society, in a detailed post about the EARN IT Act.

This should cast suspicion on claims that we need EARN IT to protect children or to punish those who exploit them. The real power of the EARN IT Act is conditioning legal liability protection on to-be-determined rules that can be changed based on the whims of an unelected team of bureaucrats, corporate cronies, and well-connected activists.

The commission's rules would not be legally binding, but failure to follow them would mean losing Section 230 protection for civil lawsuits and state charges related to sexual content involving the underage. In that case, tech companies can expect an onslaught of civil suits from lawyers willing to further exploit child sex abuse victims and charges from state attorneys general looking to make names for themselves. (Remember, all it takes is for high schoolers to have used a platform's private messaging tools to exchange explicit photos with one another and a site is liable.) Some would go out of business, while independent startups may never be able to get off the ground in the first place.

The great paradox is that whichever way companies react to mitigate this risk is bad news for users and bad for stopping the spread of exploitative and illegal material. Their options are to clamp down on users and content across the board, hoping to somehow filter out anything bad, or to relax reporting and moderation policies—opening up platforms for even wider distribution of criminal and abusive content—in order to avoid liability that turns on staff having received notice of that content.

The 15-member commission would be chosen by the U.S. attorney general, who also has the option of entirely rewriting the commission's rules.

That means Bill Barr "could single-handedly rewrite the 'best practices' to state that any provider that offers end-to-end encryption is categorically excluded from taking advantage of this safe-harbor option," points out Pfefferkorn. "Or he could simply refuse to certify a set of best practices that aren't sufficiently condemnatory of encryption. If the AG doesn't finalize a set of best practices, then this entire safe harbor option just vanishes."

The bill does say that companies can "earn" Section 230 protection by taking "reasonable measures" other than the feds' best practices. But that language makes compliance more inscrutable overall, leaving it to federal judges to decide on a case-by-case basis which measures are reasonable.

Will lawmakers and media learn?

If tech companies comply with the best practices Barr's commission could dream up, we might all lose more online privacy and safety. In the case of encryption, for instance, a backdoor accessible to U.S. authorities in all our communications would also be accessible by foreign and domestic hackers and scammers. Meanwhile, child porn distributors and other criminals could simply switch to encrypted communication services located outside the U.S.

No matter what rules are ultimately drafted, the EARN IT Act would give Washington enormous leverage over tech companies and platforms of all sorts.

Sex workers—criminalized or not—would stand to lose even more access to advertising spaces, social media, payment processors, and other digital realms. If FOSTA is any guide, LGBTQ organizers and others fighting for sexual dignity, privacy, and rights will face increased scrutiny online, too.

In the lead-up to FOSTA's passage, few media outlets reported on its downsides; now, the internet is awash with articles about how FOSTA harms vulnerable classes of people, makes sex work more dangerous, and spurs senseless suppression of speech online. All but two senators and a handful of U.S. representatives voted for FOSTA; now, a bill aimed at repealing FOSTA is being backed by the likes of Sens. Elizabeth Warren (D–Mass.) and Bernie Sanders (I–Vt.).

Perhaps with the EARN IT Act, more media and legislators could skip the period of credulously buying the for-the-children rhetoric and start seeing this legislation for the dangerous, cynical, civil liberties-grabbing ploy that it is.




  2. EARN IT should have been the bill that legalizes prostitution.


  3. 1. Never vote for a bill with a cutsie acronym for a title.
    2. When it comes to sex, mother nature knows better than the legislature. “Children” does not include biological adults capable of breeding. Maybe a serious discussion of some age in the 11 to 14 range to be the actual cut off for “children”? Of course, any artificially legislated age will be too old for some, and too young for others. But then the issue is control, not saving children however defined.

    1. “Children” does not include biological adults capable of breeding.

      It’s not your fault, but Reason’s retardation on the issue is wearing off. There are no children or adults reproducing via the internet. The internet is ancillary at best.

      ENB is shooting herself in the dick on this one. It’s not, and shouldn’t be, a crime for 17-yr.-olds to wear skimpy clothing in real life, the internet doesn’t change that. It shouldn’t necessarily be a crime for them to post videos of them torturing someone either. It absolutely should be permissible as evidence in a trial if they are found to have actually tortured someone (where a federal judge may eventually decide if it truly were protected free speech or not).

      But the ‘Internet first, last, and only’ mindset that is required to preserve socialist media has always included an element of willful retardation even before the internet existed.

      1. “adults reproducing via the internet.”

        Is that a challenge? I think it could be done and the smart way to go about it is to cover both sides. Internet conception and internet abortion.

        1. I think it could be done and the smart way to go about it is to cover both sides. Internet conception and internet abortion.

          The internet prevents conception and aborts adults at a rate greater than billions-per-minute.

  4. It could mean making users of social media, online marketplaces, and apps from Yelp to Grindr submit proof of real identities before they can join or post.

    Doesn’t this discriminate against the poor and minorities, who cannot obtain proof of real identity?

    1. It would really cut down the hispanics on Grindr, that’s for sure. Most of the user-base would consider that a plus.

  5. The internet (and child porn on it) predate section 230. Without Section 230, there could be no ‘EARN IT’ law.

    Reason pretty flatly states it here:

    Section 230 was passed, in part, to explicitly permit then-dominant tech companies like AOL and MSN to monitor and moderate users’ posts without risking endless time in court.

    The law, passed by congress, explicitly permitted tech companies to censor their users’ posts without the risk of potential recourse from the users.

    Congress chose administrators over users and Reason cheered, calling it the 1A of the internet (despite the fact that the 1A explicitly forbids Congress from doing what they did). Now that Congress is altering the deal, Reason suddenly discovers that it (sometimes) dislikes big government.

    1. Don’t be stupid, Mad. You know that the alternative was two absurd extremes, where the ISPs would either be liable and have to moderate everything or could moderate nothing at all for fear of becoming liable.

      1. Now we know better so should it get removed it will be replaced with something saying the service or site is liable. You let the person post? It’s on you and as well as them if it’s illegal. You didn’t identify them and they defamed someone? It’s on you. EZPZ.

        Otherwise sites just allow it and then claim the Shaggy defense of “It wussin me”.

    2. “The law, passed by congress, explicitly permitted tech companies to censor their users’ posts without the risk of potential recourse from the users.”

      So long as they did so in good faith based on the content being illegal or offensive, where examples were given of what was meant by “offensive”.

      1. So long as they did so in good faith based on the content being illegal or offensive, where examples were given of what was meant by “offensive”.

        Oh! Well as long as Congress is determining what constitutes good faith and gives examples then, by all means, carry on!

        Shit, Congress gets good enough at this and we won’t even need the 1A any longer. We can just rely on them to define what is and is not offensive.

    3. Section 230 is not a law. The Communications Decency Act is a law and Section 230 is a part of that law describing who is held responsible for breaking the law. Despite Reason’s torturous relationship with the English language the section simply assigns blame for various lawbreaking actions to the people who actually committed it rather than the nearest person with deep pockets.

      The idea is that if you publish an illegal Op Ed in a newspaper (set aside arguments about words being illegal and just accept for the moment that they can be) that newspaper is on some level responsible for the appearance of your inflammatory text. Someone must have affirmatively and actively chosen to include it and that choice of what to include means that your speech in some part becomes their speech and now that the illegal speech belongs to them on some level they become liable for it.

      But that’s not how Facebook or Twitter works. When you post a picture of that burrito you ate last night no one at Twitter looks at your post and says “Yes, this is what needs to be on my website. Everyone should see mad.casual’s burrito!” The acceptance of your post is passive and if you post an image of the corpse you just raped and murdered or maybe a scan of a Disney movie from 30 years ago (guess which one will involve more jail time) no one at Facebook or Twitter is going to look over your post and affirmatively choose that it should appear on their website. They haven’t made a choice to include your post, so how can they be treated as if the speech belongs to them?

      “But it’s their responsibility to remove objectionable things as soon as they are discovered!”
      Given that 300 hours of video are uploaded to Youtube every minute there is no conceivable way that internet platforms can review everything before it’s posted and even responding just to things that are flagged is impossible to do fairly. Not for lack of trying mind you, but just take a look at the rampant issues and abuses with Youtube’s content ID for example or the infinite problems with trying to tell the difference between terrorist propaganda and anti-terrorist programming or just plain news.

      But don’t let all that convince you. Just consider the future case. If any moderation whatsoever opens internet platforms up to infinite legal liability one of two things is going to happen: There won’t be ANY moderation at all, zip, zilch, zero, none, or there will be no user content, no ability to post anything which could possibly be used to generate a liability for the provider. The internet will have to split into two parts: a lawless hellscape where absolutely nothing is off limits (probably not a website anyone actually wants to either visit or run) or Television 2: Electric Boogaloo.

      1. “But that’s not how Facebook or Twitter works.”

        Yes. Congress is complaining about everything that shitty business model has resulted in. To make it worse the companies/sites have done little to correct any problems that arise from that shitty business model.

        CNN has a good business model. They sell advertisements and viewers can’t broadcast illegal shit. Pretty solid.

        Twitch? They sell advertisements and people livestream muslims getting shot to shit. That’s problematic. Not a good business model when that can happen.

        1. So phone companies are a “bad business model” because you can use them to say whatever you want including planning crimes and what not?

          1. So phone companies are a “bad business model” because you can use them to say whatever you want including planning crimes and what not?

            Phone companies don’t charge for advertisements and the people advertising via phone, telemarketers and politicians, are generally recognized as subhuman scum.

          2. Phone companies facilitate communications between two parties. Not broadcast to the world. Also, phone companies obtain your name, DOB, address, and SSN to sign up. Same with ISP’s. That’s what FB, YT, and everyone else should have been doing. That’s how phone companies avoided a lot of regulation. When cops show up with a warrant all of the info is there from the user.

        2. I’ve literally never seen anyone shot on Twitch, at least not outside of a video game. I’m sure people occasionally use it for some shit, but that’s no reason to start some sort of moral panic and try to shut the internet down because *gasp* it allows individuals to easily state and distribute their views instead of leaving them under the thumb of large corporations who are heavily incentivized to aggressively moderate and censor any even potentially objectionable content.

          1. German guy tried to kill a bunch of Jews, door was locked, so he drove around killing people live. Also, the NZ shooting was rebroadcast on there. Same with YT.

            It’s not shutting the internet down. It’s holding companies accountable. Internet existed before these companies. It will exist after them too.

            Internet is not YT, FB, IG, Twitter, and Twitch. Those are just companies that use it. If they would have been identifying users this wouldn’t be happening.

            1. It’s not shutting the internet down.

              It’s scare mongering from people too stupid to imagine media companies moderating, distorting, or controlling a message without approval from Congress.

            2. There’s no separating the internet from the platforms that make it worth using. One of the biggest values of the internet is that it allows people to easily voice and spread their views, often with little to no oversight, which helps facilitate the expression of dissenting opinions. The more you erode Section 230, the more you give companies a financial incentive to crack down on any even remotely objectionable opinion to prevent being held liable for it. This is of course ideal for social conservative bootlickers and police fetishists, but a dire loss to everyone else.

              1. There’s no separating the internet from the platforms that make it worth using.

                I’m not sure what this word salad is supposed to mean. I think it only makes sense if you subscribe to some preconceived notions about the internet not existing to which plenty of people don’t subscribe.

                The more you erode Section 230, the more you give companies a financial incentive to crack down on any even remotely objectionable opinion to prevent being held liable for it.

                This is untrue both in theory and in practice. The only way this works is if you think Congress has a mandate to protect free speech, which is the opposite of the 1A, and if you also assume Congress can’t be bought.

                This is of course ideal for social conservative bootlickers and police fetishists, but a dire loss to everyone else.

                OIC, you’re a sock.

      2. Bullshit. There was moderation prior to section 230.

        Your false cries of “How ever will we do moderation without a priori legal protection?” demonstrates either an honest ignorance of the technology or a willful retardation in service of defense of a law, enacted by Congress that establishes/prohibits free speech and the right to petition the government.

        There won’t be ANY moderation at all, zip, zilch, zero, none, or there will be no user content, no ability to post anything which could possibly be used to generate a liability for the provider

        Because, prior to section 230, everybody was simultaneously afraid to speak in public for fear of being liable for their speech and venues, media, and organizations cowered in fear of speech lest they discover two great big black things standing by their side.

        Section 230 was part of the larger CDA which was an explicit attempt from Congress to control speech on the internet and was struck down. Any attempt to accept section 230 or co-opt it as some freedom-preserving good effectively converts it into a poison pill. FOSTA/SESTA, EARN IT… all side effects of the pill that says Congress can decide what constitutes good faith moderation.

  6. who cares if congress passes an overly broad law that allows a blank check to regulators to violate rights with impunity?

    We have a strong judiciary that will strike down any overly broad and vague law. So we’re good.

    1. As clearly stated in the 1A, we couldn’t trust the judiciary, so Congress had to pass a law. Now that Congress has passed the law we don’t have to worry about the judiciary. Unless we can’t trust Congress not to pass more laws. In which case, we can’t trust judiciary.

      1. Good point.

        They take an oath to uphold the constitution when they take office. So by definition congress would never pass an unconstitutional law.

        Done and dusted. Nothing to see here. Move along….

  7. The only real basis for comparison we have is the set of changes the FTC foisted on YouTube this year. A huge host of changes to the way any channel labeled “child friendly” can run. These include: No targeted ads, no background playing of videos, no comments, and no playlists.

    I have seen nothing in any of the demanded changes that would actually protect children. A lot of sound and fury at great expense for precisely zero benefit.

    I expect this to be a repeat performance.

    1. “I have seen nothing in any of the demanded changes that would actually protect children.”

      That’s because you can only see tracking cookies if you go to the folder where they are located. That was the big change. Their business model was tracking children and they paid a settlement to keep their business model and act like they were doing something to not track children. They still are because there isn’t much of a way around it unless they cut all tracking cookies.

      The Earn It act may fix all of that which means Youtube shelled out a settlement for nothing. Better than paying a full fine, I guess.

  8. It’s not FOSTA. It’s quite different. Just because you say it is doesn’t make it so. Like how you claim CDA 230 is the internet’s 1A. Stop saying that. A chicken isn’t a duck no matter how often you say it.

    Unlike FOSTA it provides a safe harbor if the needs are met. If it’s fought then Graham said he would remove the safe harbor on the next bill and he’d get just as much backing. They’re giving companies a chance on this one instead of zero tolerance like FOSTA. He’s got you by your big hairy balls. If you fight it then it only gets worse.

    We need to ask ourselves how this even happened.

    Companies asked themselves the question of what happens when someone uploads something illegal on their service years ago. They said “Oh well, not our fault” instead of working their business around it not happening.

    They’re lucky they got 25 years of that.

    “because it means they can only hold the people who actually say or do illegal things criminally (and financially) accountable for those things;”

    It’s pretty convenient for them that they’re not identifying anyone, isn’t it? It’s convenient for users too. You say “Hold that person accountable”, but not only is that person wearing a disguise, but they’re fucking invisible and they were able to commit the crime because those facilitating the arena didn’t bother checking ID’s.

    1. A chicken isn’t a duck no matter how often you say it.

      You’re wrong because internet. – Reason

    2. You are asking something that was until extremely recently, completely impossible. Even now, it’s only possible for a huge company with Google-level resources, and even then, not accurately. The sheer number of false positives that Google gets is absurd.

      Use your speech to text app for a few minutes without editing. Now, think if you would be willing to have that jumble of words be legally enforceable.

      1. It’s not impossible. Pre CDA 230 I had a website that I owned and posted shit on it. I was liable. Worked really well.

        If Google or whoever should choose to host other people’s shit then they should be responsible. People can host their own shit. They’re just choosing not to. Option has always been there. Companies just decided to profit off of it, rake in data and cash, and then wash their hands of the content.

        This bill gives them the option to take some steps so they can continue to host and not be liable. Seems as friendly as it’s going to get.

      2. Use your speech to text app for a few minutes without editing. Now, think if you would be willing to have that jumble of words be legally enforceable.

        So, if we can prove that even one person involved in CDA section 230 wasn’t sober, it will be unenforceable?

        This is so fucking backwards.

        Use your speech to text app for a few minutes without editing. Now imagine that everyone does it. Now imagine that none of it is enforceable, legal or otherwise.

        People say things they don’t mean all the time. People get sued for it all the time. By and large, they only pay out when their speech causes someone damages and a good portion of the time not even then. Restricting people’s ability to sue for mistakenly damaging speech doesn’t, in any way, prevent or lessen the amount damaging speech or the damage done. Nobody’s calling for a ban on mistaken speech, just a fair exchange as (occasionally) judged by 12 people, the way they used to do way back in the olden days of 1996.

  9. besot-from-all-sides Match.com

    “Besot[ted]” means “drunk.” The word you’re groping for is “beset,” which means “attacked on all sides,” “assailed,” “harassed.” Indeed, “beset on all sides” is a tautology.

  10. It’s basically the South Korean model. I don’t see it surviving any lawsuits.

    Also, in case this bipartisanship has confused anyone, the general direction they’re moving towards is banning pornography altogether, just like Korea did.

    1. “It’s basically the South Korean model. I don’t see it surviving any lawsuits.”

      Here’s the catch. The part that won’t survive the lawsuits is the part that gives the safe harbor. It gives a privilege. That one is tricky.

      If it doesn’t survive any lawsuits then it will be exactly like FOSTA and it will all vanish. Best to let it survive.




