A new twist in the endless debate over end-to-end encryption

The EARN IT Act deserves a close look

The Volokh Conspiracy

The battle between Congress and Silicon Valley has a new focus: the EARN IT Act of 2019. (The title is an embarrassing backronym that I refuse to dignify by repeating; you can safely ignore it.) A "discussion draft" of the bill is attributed to Republican Sen. Lindsey Graham and Democratic Sen. Richard Blumenthal. To hear the Electronic Frontier Foundation (EFF), the Center for Internet and Society, and Gizmodo tell it, the draft EARN IT Act is an all-out assault on end-to-end encryption.

The critics aren't entirely wrong about the implications of EARN IT for encryption—but they've skipped a few steps. To understand the controversy, it's useful to start with the structure of the bill. That will show how EARN IT could affect the deployment of end-to-end encryption—and why the draft bill makes sense as social policy.

The central change made by the bill is this: It would allow civil suits against companies that recklessly distribute child pornography. It would do this by taking away a piece of their immunity from liability when transmitting their users' communications under Section 230 of the Communications Decency Act (CDA).

Originally added to the CDA as part of a legislative bargain, Section 230 was one of the best deals tech lobbyists ever struck. The CDA was a widely popular effort to restrict internet distribution of pornography to minors. Tech companies couldn't stop the legislation but feared being held liable for what their users did online, so they agreed not to fight the CDA in exchange for Section 230. The next year, the law's measures to protect children online were ruled unconstitutional, leaving the industry protections in Section 230 as more or less the only operative provision in the entire CDA.

Both Section 230 and the content monitoring it protects have become increasingly controversial in recent years. Conservatives believe they have been unfairly targeted for arbitrary deplatforming and demonetization; liberals complain that there are still too many right-wing proselytizers and rabble-rousers on the internet. Fewer and fewer people are happy that Section 230 gives tech companies broad discretion to do what they please about user content.

But the most recent attacks on Section 230 have been more focused. Congress first took aim at sites, like Backpage, that were accused of knowingly accepting ads that fostered prostitution and sex trafficking. In 2018, Congress enacted the Fight Online Sex Trafficking Act (FOSTA), which provided that internet companies could be held liable for assisting in sex trafficking if they "knew or should have known" what their customers were doing. 18 U.S.C. § 1595; 47 U.S.C. § 230(e)(5).

Silicon Valley's advocates hated FOSTA, but their resistance didn't matter. Times had changed. It was hard to argue that the big platforms couldn't afford to monitor their users' content. Much of the industry caved rather than look soft on sex trafficking. The bill passed with overwhelming bipartisan support.

This brings us to the EARN IT Act, which would treat child pornography more or less the way FOSTA treated sex trafficking content—by lifting the immunity of companies that don't take reasonable measures to prevent its distribution. It's a surprise that EARN IT came second to FOSTA. Sex work has defenders on the left and the libertarian right, after all, but no one defends child pornography. What's more, ads touching on sex and companionship have a good deal more First Amendment appeal than child sexual abuse images, which are after all direct evidence of crime. But now that FOSTA has shown the way, it's no surprise that a similar attack on child pornography has gained traction.

In seeking to counter this proposal, Silicon Valley and internet freedom advocates have moved away from the First Amendment paeans and "don't break the internet" memes they used in their unsuccessful fight against FOSTA. Rather, they're arguing that EARN IT will hurt end-to-end encryption.

How so? The short answer is that EARN IT imposes civil liability on companies that distribute child pornography "recklessly." Victims—presumably the individuals whose images are being circulated—could sue online platforms that recklessly ignored the exchange of child pornography on their services. By itself, that rule is hard to quarrel with. Recklessness requires more than simple negligence. A party is reckless if he deliberately ignores a harm that he can and should prevent.

For anyone who has defended a tort case, being reassured that the jury has to find your client acted recklessly is cold comfort. You don't know what facts you'll be accused of ignoring. You don't know what emails may have been sent to even low-level employees in your company. You don't know what measures you'll be accused of ignoring. And you don't know when a trawl through the company's email servers will show that someone somewhere treated the problem with insensitivity.

To address that fear, EARN IT offers a safe harbor to companies that follow best practices in addressing child pornography. Notably, this is more protective of social media defendants than FOSTA and not strictly necessary for those willing to trust the good sense of American juries. Nonetheless, the bill creates a commission to spell out the safe harbor best practices. To further protect industry, 10 of the 15 members of the commission must agree on the best practices, and six of the 15 must have worked at a computer service or in computer science, giving the commission members with a technical background a blocking minority. To protect the government's interests, the attorney general is given authority to review and modify, with reasons, the best practices endorsed by the group. Companies that certify compliance with the best practices cannot be sued or prosecuted under federal child pornography laws. Companies that disagree with the best practices that emerge from this process can still claim the safe harbor if they implement "reasonable measures … to prevent the use of the interactive computer service for the exploitation of minors."

To see what this has to do with encryption, just imagine that you are the CEO of a large internet service thinking of rolling out end-to-end encryption to your users. This feature provides additional security for users, and it makes your product more competitive in the market. But you know it can also be used to hide child pornography distribution networks. After the change, your company will no longer be able to thwart the use of your service to trade in child pornography, because it will no longer have visibility into the material users share with one another. So if you implement end-to-end encryption, there's a risk that, in future litigation, a jury will find that you deliberately ignored the risk to exploited children—that you acted recklessly about the harm, to use the language of the law.

In other words, EARN IT will require companies that offer end-to-end encryption to weigh the consequences of that decision for the victims of child sexual abuse. And it may require them to pay for the suffering their new feature enables.

I don't doubt that this will make the decision to offer end-to-end encryption harder. But how is that different from imposing liability on automakers whose gas tanks explode in rear-end collisions? If the gas tank makes its cars cheaper or more popular, the company will get the benefit; but now it will also have to pay damages to the victims of the explosions. That makes the decision to offer risky gas tanks harder, but no one weeps for the automaker. Imposing liability puts the costs and benefits of the product in the same place, making it more likely that companies will act responsibly when they introduce new features.

There is nothing radical about EARN IT's proposal, except perhaps for the protections that the law still offers to internet companies. Most tort defendants don't get judged on a recklessness standard. Juries can award damages if the defendant was negligent, a much lower bar. Indeed, in many contexts, the standard is strict liability: If your company is best able to minimize the harm caused by your product, or to bear its cost, the courts will make you the insurer of last resort for all the harm your product causes, whether you are negligent or not. Compared to these commonplace rules, Section 230 remains a remarkably good deal, with or without EARN IT.

But critics of EARN IT have focused on the role the bill provides for the attorney general. Because Attorney General William Barr could modify the best practices recommended by the commission, critics say, Barr might unilaterally declare that end-to-end encryption is never a best practice when dealing with the scourge of child pornography. That would effectively prohibit end-to-end encryption, or so the argument goes.

There's one problem with this: EARN IT doesn't impose liability on companies that fail to follow the commission's best practices. They are free to ignore the recommendations of the commission and the attorney general, which are after all just a safe harbor, and they will still have two ways to avoid liability. First, they can still claim a safe harbor as long as they adopt "reasonable measures" to prevent child sex exploitation. Second, they can persuade a jury that their product design didn't recklessly allow the spread of child pornography.

The risk of liability isn't likely to kill encryption or end internet security. More likely, it will encourage companies to choose designs that minimize the harm that encryption can cause to exploited kids. Instead of making loud public arguments about the impossibility of squaring strong encryption with public safety, their executives will have to quietly ask their engineers to minimize the harm to children while still providing good security to customers—because harm to children will in the long run be a cost the company bears. That, of course, is exactly how a modern compensatory tort system is supposed to work. Such systems have produced safer designs for cars and lawn mowers and airplanes without bankrupting their makers.

Maybe it's time to apply the same rules to internet products and services.

The views expressed here do not reflect those of my firm or clients.

  1. I surely hope that UPS, Fedex, USPS, etc are held equally liable if someone slips a kiddie porn pic into an envelope and ships it. Opening and inspecting every package and letter is a small price to pay for saving the children, after all.

    1. What about Canon and Nikon as digital camera producers, or Apple and all the Android phone makers, since they install high-quality cameras on their devices?

      Or Brother, HP, and Epson as they make consumer photo printers on which to print those dirty digital photos.

      Have they done enough to ensure that their products aren’t being used to make kiddie porn and thus escape liability for it?

    2. Yep. And what if an end-user implements their own strong encryption and sends the encrypted packet through the network? Is the provider required to detect the fact that it can’t inspect the contents of the digital package and reject it? What if user A sets up an FTP (file transfer protocol) server on their own machine and receives an encrypted package from user B? Who’s liable — the manufacturer of the FTP server software? The authors of the strong encryption software? User A’s and B’s internet service providers? Should we demand that computer manufacturers or operating system vendors make it impossible to run strong encryption software on their machines as a ‘best practice’? Where does the potential liability end?

      1. Don’t forget the packet(s) may traverse other providers’ networks between A’s and B’s individual ISPs, particularly if A and B aren’t contracted with a Tier 1 ISP.

        1. I’m sympathetic to these arguments- I think this proposal is a very bad idea- but I would note that one thing Baker might say is that “recklessness” is a pretty tough standard to meet. Basically, it means deliberately disregarding a known risk.

          So if child porn is carried on your network without your knowledge, there’s no liability. If child porn is carried on your network which was carelessly designed in a manner that would facilitate the transmission of child porn, no liability either. If there is material carried on your network that you should have figured out to be child porn, still no liability.

          Basically, you would either have to know it was child porn or have intentionally disregarded evidence that child porn was being transmitted. In practice that’s going to be very hard to prove, and this legislation is likely to be symbolic.

          But the symbolism is bad nonetheless.

          1. “Recklessness” is only difficult for incompetent prosecutors afraid to wave that kiddie porn red flag in front of the press and the jury.

          2. Baker makes it pretty damned clear that he considers end-to-end encryption a red flag for being reckless.

          3. First, Baker is making it pretty clear that there is some risk associated with merely using end-to-end encryption. Otherwise what “consequences” and “require[ments for] them to pay for the suffering their new feature enables” is he referencing?

            Second, as a practical matter this doesn’t end well. The providers can’t just submit an affidavit saying “I did not know about this” and the case will just be over. There will be no easy summary judgment, especially for those providers who are not following centrally planned “best practices” however defined.

            1. There are lots of summary judgments in public figure defamation cases, which use a recklessness standard. It’s harder than you think.

              1. I’m not sure that analogy is much comfort at all. Public-figure defamation cases involve relatively unsympathetic claimants, seeking to impose liability that implicates an important constitutional right, which intertwines with the court’s interest in allowing free discussion of a public figure. Future child pornography recklessness cases will involve the most sympathetic possible private claimants, seeking to impose liability on someone with no constitutional interest in avoiding liability, and for which denying summary judgment has no effect on the overall discussion of public figures or matters of public concern.

    3. Remember that one time people sent bombs through the mail, and they prosecuted the post office personnel who delivered them? Oh wait, that didn’t happen.

  2. The one problem I see here is the use of a Government Commission to enact safe harbor rules. For many years US Industry has operated quite well with private groups getting together to establish standards, which are then promulgated by ANSI and the like. Building Codes are developed through a similar process. The Internet operates through similar Engineering Task Forces.

  3. I don’t doubt that this will make the decision to offer end-to-end encryption harder. But how is that different from imposing liability on automakers whose gas tanks explode in rear-end collisions?

    Because an exploding gas tank is not a desirable feature in itself; E2EE is. Additionally, E2EE is not a liability in and of itself, unlike an exploding gas tank.

    A more apt analogy might be a car company that chooses to make cars with an opaque trunk, rendering any contraband that might be transported invisible to the casual observer.

    1. While there is nothing in the draft about E2EE (from what I have been able to gather, a tech blogger just claims that this is what the bill is secretly about), if it winds up being an issue one can see why. With things like blockchain coming up it will be impossible to stop internet criminals unless some form of responsibility is placed on the companies facilitating it. (ISPs and phone companies require name, SSN, DOB, and address, and do a credit check, only to verify the person is real, which is why they have avoided regulation in that regard. People with horrible credit still get phones and internet service.)

      If we are going to go by what the tech blogger is saying, that this bill is just a way to make demands, then why stop at E2EE? Take a look at what the draft says, and take note of the age limits part.

      How does an interactive computer service have age limits? How does one adhere to best practices to retain CDA 230 immunity?

      Probably name, SSN, DOB, address, etc. If you’re using a service that has your name and you can’t hide that, you’re less likely to use it for a nefarious purpose. Only idiots make threatening phone calls from their own phone. If it is required that you verify your identity to use an interactive computer service, then what good is a VPN other than surfing the web? You can’t interact without someone having the knowledge it is you.

      People seem to gloss over that part of the draft.

      1. People seem to gloss over not having transparent car trunks, or mandatory glass walls on houses, or giving copies of all keys to cops. Shall we all walk around naked so cops can see we have no pistols in our pockets?

        Let’s mail unsealed letters. Better yet, no envelopes — require everyone to send postcards.

        Why don’t we just get rid of that awful 4th amendment and get it over with? Why are we pussy-footing around this abuse of privacy?

        Are ye daft, man? Is government the boss of us, or is it the other way round?

      2. “With things like blockchain coming up it will be impossible to stop internet criminals unless some form of responsibility is placed on the companies facilitating it.”

        So you stop gmail/facebook/etc from doing end-to-end encryption. I encode the instructions to my terror cell using (3d party encryption|my own encryption|a one time pad). All any of the agents know about that is it is something in a format they don’t know – might be encrypted, might be random, might be my new video format.

        You can outlaw sending encrypted traffic – we already do that for ham radio – but it’s unenforceable. You can try and have ISP’s etc block traffic they can’t read, but that’s kinda unenforceable as well; it’s pretty hard to tell whether I have encoded a little bit of encrypted data in the low order bits of an image file or whatever.

        I just don’t see how an ISP can in practice stop encryption.
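The point about low-order bits is easy to make concrete. The following is a toy sketch, purely illustrative: a bytearray stands in for pixel data (real tooling would use an image library), and the "secret" could just as easily be ciphertext. No scanner on the wire can reliably tell these flipped bits from ordinary sensor noise.

```python
# Minimal LSB steganography sketch: hide arbitrary bytes in the
# least significant bit of each "pixel" byte.

def embed(pixels: bytearray, payload: bytes) -> bytearray:
    """Write each payload bit (LSB-first per byte) into one pixel byte."""
    out = bytearray(pixels)
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(out):
        raise ValueError("cover too small for payload")
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return out

def extract(pixels: bytes, n_bytes: int) -> bytes:
    """Read n_bytes back out of the low-order bits."""
    result = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        result.append(byte)
    return bytes(result)

cover = bytearray(range(256)) * 4      # stand-in for image pixel data
secret = b"meet at dawn"               # could equally be encrypted bytes
stego = embed(cover, secret)
assert extract(stego, len(secret)) == secret
```

Each altered byte differs from the original by at most one in its lowest bit, which is exactly why blocking "traffic the provider can't read" is unenforceable in practice.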

        1. Nobody except the tech blogger that originally wrote about E2EE is talking about ending E2EE. The media then just parrots that. Then everyone starts freaking out about it. It was just stated that the bill “could” go after it in the original article I read. It doesn’t seem likely and if we’re going to talk about what they “could” do with the bill it seems more likely that identity verification is going to be a thing.

          Let’s say a guy is busted for sending illegal content through social media. Let’s say the transmissions are encrypted. Let’s say he was busted by sending the images to an undercover cop. Encrypted or not the police can obtain a warrant, get his login credentials through the service, see where his messages came from, who he sent to, etc. Since all of those accounts would have the identity verified they’d be able to catch more people than by banning encryption.

          Banning encryption doesn’t matter if people are setting up anonymous accounts and using a TOR/VPN combo. The resources/time needed to obtain the identities on that wouldn’t be viable. If it is stated that identities must be verified to retain CDA 230 immunity then one can use TOR/VPN and still get caught because the means they used to send it would have to be verified accounts.

          This doesn’t stop P2P and other forms, but it will put a dent in the mainstream communication tools, it will force the mainstream tools to verify identity, and in the end will accomplish more than banning E2EE at this rate.

          Everyone is freaking out about encryption going, but no one seems to care that this bill could mean that your email and so on could require as much as your phone or ISP does when it comes to personal information.

    2. That’s not a reasonable analogy: gas tanks are never designed to explode, but even 1 in 100,000 exploding in a rear-ender is too much. If a tech company is held liable because 1 in 100,000 users sends kiddie porn, that’s too much as well.

  4. The comparison to exploding gas tanks seems to rather casually ignore the fact that child porn requires a deliberate criminal act by a third party. The better analogy would be trying to hold car manufacturers responsible for drunk drivers. We could, after all, install breathalyzers in the starting mechanism of every car. Or just include complex puzzles which drunks would have more difficulty solving. It’s only money. Is the failure to put those kinds of controls in a car reasonable or money-grubbing cheapness?

    Baker just waves away all those concerns with false analogies and blithe assumptions. FOSTA is a bad law that’s already hurting people. EARN IT has the potential to be worse.

  5. “Conservatives believe they have been unfairly targeted for arbitrary deplatforming and demonetization; liberals complain that there are still too many right-wing proselytizers and rabble-rousers on the internet.”

    Their complaint is arbitrary. I’m not aware of any conservative that thinks that Redstate’s rule forbidding promotion or support of “parties other than the Republican Party, or candidates running against Republican primary, caucus, and/or convention nominees” needs to be subject to a “deplatforming” exception, because it would eat the rule. What conservatives want is for Twitter to be treated differently from Redstate. (Someone will comment that Twitter purports to be neutral but isn’t, whereas Redstate is explicitly partisan. The rejoinder is that Twitter is not explicitly neutral but, even if it were, the consumer response would be to simply stop using the service, or go ahead and sue Twitter for the alleged misrepresentation.)

    “It’s a surprise that EARN IT came second to FOSTA.”

    No it isn’t. As a general matter there is nothing surprising about fucked up or random prioritization by Congress. But specifically, FOSTA was a direct response to Backpage. I’m not aware of a similar service for child pornographers, for the pretty obvious reason that the penalties (and frequency of enforcement) of child pornography are much greater than the penalties for sex work, precisely because the latter “has defenders on the left and the libertarian right”. You’re confusing cause and effect. The fact that people care less about sex work increased the likelihood that there would be a service like Backpage, and it is the existence of Backpage that causes demagogue moralists to legislate a solution to a problem that shouldn’t be illegal in the first place.

    1. ” But specifically, FOSTA was a direct response to Backpage.”

      Whose owners had been explicitly cooperating with authorities in general and the FBI in particular on Human Trafficking issues and particularly the trafficking of underage prostitutes for years before the government took down Backpage.

      At least half a dozen involuntarily trafficked teens were rescued explicitly because of that cooperation by Backpage.

      The case against the owners of Backpage is a pile of lies.

  6. “But how is that different from imposing liability on automakers whose gas tanks explode in rear-end collisions?”

    The analogy is broken because the gas tank that explodes is the injury the plaintiff is suing for. The automaker’s liability is direct, not vicarious for their reckless failure to stop someone else from harming the plaintiff. The better analogy is that EARN IT is similar to holding gun makers liable if their guns are used by other people to commit crimes. Note how even you frame it. It is about not “recklessly allow[ing] the spread of child pornography.” Nobody would suggest that an automaker who kills customers with exploding tanks is merely failing to avoid “allowing the spread of death”. No, the automaker is the fucking killer.

    1. I shouldn’t have used vicarious here. I don’t mean vicarious liability in the strict legal sense. I mean a person who uses E2E can never be held liable for child pornography without the involvement of some other person, namely the child pornographer. That isn’t the case with the automaker.

  7. Typo in the main headline. “…To end” not ‘o end,’ of course.

    (I admit that the typo is not relevant to the actual issues. But it is easy to correct.)

  8. I can’t see any reason in this argument, since any service provider who allows E2E is going to be held reckless, as E2E will allow transmission of child porn. Therefore, E2E will not be allowed, and the primary goal of totalitarians like Graham, Blumenthal, and Barr will be accomplished.

    However the primary use of E2E is not, and was never, child porn, which is, thankfully, rare, but rather, protecting personal communications from eavesdroppers, including, and especially, the government. Fascists like these three hate the idea that you can keep anything secret from the government because they see citizens as subjects, and government as sovereign.

    One of your contributors, Randy Barnett, was on Fox this weekend, eloquently noting that the Constitution is not the law that governs us, but rather, the law that governs those who govern us, and where in that document it indicates that the government can prohibit me from encrypting my communications is beyond me. Clothing your schemes to remove my freedom in some sort of framework built around dismantling third party liability protections doesn’t change the real purpose here.

      1. The part where they can prohibit you from using encryption is in the same section where they can infringe on your right to keep and bear arms, the one where they can strip search every traveler without any kind of probable cause whatsoever, and the one where they can take your property without so much as charging a crime.

      1. Don’t see those sections either.

        1. It’s under “qualified immunity”.

  9. I wonder if the provision re the Atty General is a poison pill. Giving an unethical sack of crap like Barr any regulatory authority (let alone the authority to change regulations enacted by the commission) is probably a deal-killer for anyone who has zero percent remaining trust/respect for the man. (And even people who love that he’s protecting the president will see the danger when a different, equally-unethical, Atty General is in that position, in the future.)

  10. How does a string of ones and zeros become criminal?
    A picture of a naked human is legal or illegal based on the age of the model. How does a company involved in moving ones and zeros determine the age of the model? Does removing encryption (the arrangement of the ones and zeros) make that determination more or less difficult?
    (and the gas tank thing is a red herring; is a car manufacturer liable for the robbery of a bank if a car is used for the getaway?)

    1. In summary: you can get a list of hashes from the NCMEC of all the CP they’ve found, then when handling any image you calculate the hash and check their list – if it’s a match then the image is CP.

      Works basically the same for video, and there are variations that work even for modified images. It’s not perfect – the list of hashes isn’t complete, and there are pretty easy ways to defeat it anyway that a smart person could think of (and that I’ll just jot down in the margin here…), but few criminals are smart, and most are lazy.
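The matching step described above can be sketched in a few lines. This is a simplified illustration with made-up digest values: it models the list as a plain set of SHA-256 hex digests, whereas the real NCMEC-based systems use perceptual hashes such as PhotoDNA precisely because exact cryptographic hashes fail on any modified copy.

```python
# Hash-list matching sketch. Assumption for illustration: the list of
# known-bad material is a set of SHA-256 hex digests of whole files.
import hashlib

known_hashes = {
    # digests of previously identified files (illustrative, not real data)
    hashlib.sha256(b"known-bad-file").hexdigest(),
}

def is_flagged(file_bytes: bytes) -> bool:
    """Exact-match check: any single-bit change defeats it."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes

print(is_flagged(b"known-bad-file"))   # exact copy matches
print(is_flagged(b"known-bad-filex"))  # one changed byte, no match
```

The second call shows the weakness noted above: a trivially modified copy produces a completely different digest, which is why production systems rely on similarity-preserving perceptual hashes instead.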

      1. Few criminals are smart, but the dumb pedophiles get caught without anyone needing to read their email. And the ones that still exist outside jail cells are smart enough to navigate the dark web and use anonymizers and the like, because all the ones who aren’t that smart (and engage in that activity online) are probably already caught. If they were easy to catch, we’d already have caught them.

      2. “In summary: you can get a list of hashes from the NCMEC of all the CP they’ve found, then when handling any image you calculate the hash and check their list – if it’s a match then the image is CP. ”

        Ah, no. If you calculate the hash, and check against the list, and discover that the hash is on the list, then the user has a file whose hash is on the list. It’s not CP until the image is examined, the person(s) depicted are identified, and their ages are verified to be under the age of majority.

        A hash hit on a system you do not own and control might get a law enforcement agency sufficient probable cause to obtain a search warrant to actually examine the file.
        Then again, some smart asses might intentionally create hash collisions for CP images, on images that are not CP, destroying even that.
        (Recall that, during the peak RIAA activity, some individuals put files on file-sharing systems that had names that suggested they were unauthorized copies of copyrighted works, just for the sake of burning RIAA bots that scanned file-sharing networks, looking for unauthorized copies of copyrighted works.)
        Seems to me that an ISP that falsely accuses somebody of possessing CP has offered to pay a settlement for defamation per se. If they’re stupid enough to accuse the same person twice, the settlement would be expected to be quite significant.

  11. “In other words, EARN IT will require companies that offer end-to-end encryption to weigh the consequences of that decision for the victims of child sexual abuse.”

    Horseshit. In reality, it will require companies to assume that vague and sloppy recommendations will be misinterpreted by government prosecutors eager to earn fame by bashing kiddie porn and heartless tech companies. The lure is irresistible, the accountability nil, and the march towards big brother inevitable.

    1. But it’s for the children! Why do you hate the children? Millions of innocent little children every single day are kidnapped and sold into sex slavery and all you can think of is the danger of giving government a brand new baseball bat with which they can bash any random internet company that displeases them, as if they don’t have enough baseball bats as it is.

  12. The argument that encryption allows kp is as idiotic as saying roads allow drive by shootings.

  13. That is a lot of words to say “end to end encryption is bad”.

    Authoritarians like Baker won’t be satisfied until every dead beat cop can have access to and read every message and image sent between anybody and for any reason.

    Police should try something novel to bust child porn: real police work. And stop trying to make EVERYONE less safe by insisting on weakening or eliminating end to end encryption because police aren’t doing their jobs anymore.

  14. Well, that gas tank argument is certainly stupid. This is more akin to holding the manufacturer liable because they made the gas tank easy to replace, and then someone used that to put in a negligently designed one made by a third party.

    You’re also either naive or disingenuous in some other respects. AG Barr has argued a ban on unbreakable encryption is necessary for, among other things, this very purpose. To suggest that his idea of what the best practice should be, and his justification for it, isn’t practically a foregone conclusion is either naive or knowingly misleading. Certainly more foregone than the DOJ’s current interpretation of the foregone conclusion doctrine on compelled decryption, just to start.

    Second, the idea that this wouldn’t make unbreakable end-to-end encryption commercially unviable in the US market is also naive or deliberately misleading. If it’s a best practice and you don’t follow it, you’re just not being honest about the likelihood that the government will persuade a jury on such an emotionally charged allegation. That’s not a gamble any profit-seeking entity would be willing to take.

    Third, you’re a legal expert who’s directly contradicting encryption and security experts about how bad backdoors are for security, in your suggestion that designing without backdoors would be an acceptable design choice for balancing competing interests.

    Finally, you’ve given no consideration to security impacts for the 99.9999% of people who are not engaged in distributing CP.

    1. Here’s another reason. The government is upset that Huawei back doors have compromised the government back doors.

  15. I’m going to join the chorus here, and say I agree with Baker’s critics here.

    I’m also wondering why anyone would believe that a jury would be able to make a reasonable distinction between “reckless” (“Pay up”) and “negligent” (“We’ll let you off this time, but don’t do it again”) conduct on the part of the tech companies, since the jurors are unlikely to have a good grasp of the technical points involved.

    I sure wouldn’t bet my company on that.

  16. “The risk of liability isn’t likely to kill encryption or end internet security. More likely, it will encourage companies to choose designs that minimize the harm that encryption can cause to exploited kids. Instead of making loud public arguments about the impossibility of squaring strong encryption with public safety, their executives will have to quietly ask their engineers to minimize the harm to children while still providing good security to customers—because harm to children will in the long run be a cost the company bears.”

    This is nonsense and the author should be ashamed of himself. There is no alternative to strong encryption – letting one unauthorized person peek at encrypted communications means creating a backdoor that can be exploited by others. You clearly have no competence on encryption as a technical matter. Produce some encryption experts who agree with you before spouting nonsense. (Hint: they don’t exist).

    And the analogy with exploding gas tanks is also ridiculous. This isn’t a case of weighing profit against a danger, this is a case of weighing risks to innocent consumers against other risks to innocent consumers. Degrading encryption isn’t just being negligent with consumer data security, but actively facilitating the insecurity of consumer data. That’s entirely the point.

    No company should be liable for following best encryption practices, because as a practical matter data security is just as important as stopping child pornography, and far more likely to be abused for bad ends. (Try talking to someone whose identity has been stolen, or whose accounts are otherwise compromised in a way that does something like wipe out their retirement, and then try to explain how their data should be vulnerable on the off chance that you catch someone with child porn).

    Meanwhile, end users can implement encryption on their own, and given child porn is highly illegal, they likely do or will. So you’ve made everyone else more vulnerable, and probably caught zero pedophiles you wouldn’t have otherwise caught. The only people this is a win for are government snoops who want to read everyone’s mail.

  17. As someone who has been involved in one or another part of the software industry from the ’60s to the aughts, I would have to add this to the remarks of Asbaroka.

    The most we can ask of an Internet communications provider is that it includes on a customer’s machine a program that examines a requested transmission for illegal content. If none is found, the content will be accepted for end-to-end encryption and transmission.

    However, this is hardly foolproof for attacking child pornography. The customer could use third-party encryption on a message *before* requesting transmission, and that would make the Internet communication site immune from being charged with facilitation, because its resident examination would not find anything illegal. In the worst case, the transmitter could include an unencrypted image of Mary and Her Little Lamb. Any accusation of submitting illegal content could be defended with the statement “Prove it!”

    Unfortunately, I think that this is an unsolvable problem.
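    The evasion this commenter describes can be sketched in a few lines of Python. This is a toy model only: real client-side scanners use perceptual hashes rather than exact SHA-256 matching, and the “third-party encryption” here is a trivial XOR stand-in; all names are hypothetical.

    ```python
    import hashlib

    # Toy blocklist of known-bad content hashes that the provider's
    # resident program checks before accepting content for E2E transmission.
    BLOCKLIST = {hashlib.sha256(b"known illegal image").hexdigest()}

    def provider_scan(payload: bytes) -> bool:
        """Return True if the payload matches a known-bad hash."""
        return hashlib.sha256(payload).hexdigest() in BLOCKLIST

    def submit_for_transmission(payload: bytes) -> str:
        """The provider scans, then (if clean) applies its own E2E encryption."""
        return "rejected" if provider_scan(payload) else "accepted"

    def third_party_encrypt(payload: bytes, key: int = 0x5A) -> bytes:
        """Stand-in for encryption the user applies *before* submission."""
        return bytes(b ^ key for b in payload)

    # A plain submission of blocklisted content is caught:
    print(submit_for_transmission(b"known illegal image"))            # rejected

    # Pre-encrypted bytes match no hash, so the scan finds nothing:
    print(submit_for_transmission(
        third_party_encrypt(b"known illegal image")))                 # accepted
    ```

    The point of the sketch is the commenter’s: once content is encrypted before it reaches the provider’s scanner, the scanner has nothing recognizable to match against, so any scan-then-encrypt design can be bypassed by a user willing to encrypt first.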
