Encryption

Encryption Foes in Washington Won't Give Up

Government officials keep trying to make us expose our data to them—and the criminals who ride on their coattails.


It's impossible to overstate just how much governments hate not being able to read your mail, listen to your phone calls, and peruse your text messages. When all that snoopy officials can pull up is scrambled gobbledygook, they just know they're missing out on the good stuff, like little kids bristling at whispered adult conversations.

That explains the U.S. government's decades-long war against private cryptography and its most recent manifestation in the crusade against "warrant-proof encryption."

No matter how much government officials stamp their feet and hold their breath, it'd be a bad idea to give them the access they want to our data. And they do keep stamping their feet.

"By enabling dangerous criminals to cloak their communications and activities behind an essentially impenetrable digital shield, the deployment of warrant-proof encryption is already imposing huge costs on society," Attorney General Bill Barr huffed last summer when he delivered the keynote address at the International Conference on Cyber Security in New York City. "It seriously degrades the ability of law enforcement to detect and prevent crime before it occurs. And, after crimes are committed, it thwarts law enforcement's ability to identify those responsible or to successfully prosecute the guilty parties."

If that sounds familiar, it's because it's essentially a rephrasing of former FBI Director James Comey's 2014 argument that "those charged with protecting our people aren't always able to access the evidence we need to prosecute crime and prevent terrorism even with lawful authority. We have the legal authority to intercept and access communications and information pursuant to court order, but we often lack the technical ability to do so."

In turn, Comey barely rewarmed the Clinton White House's overwrought 1994 warnings that "the same encryption technology that can help Americans protect business secrets and personal privacy can also be used by terrorists, drug dealers, and other criminals."

The encryption technology that gets officials so hot and bothered year after year grows increasingly widespread for the simple reason that it satisfies a very real demand. Barr may worry about privacy-minded terrorists and drug sellers, but most people are more concerned about hackers, identity thieves, and nosy busybodies. In response, tech companies build end-to-end encryption into a host of products so that regular people can benefit without memorizing a user's manual.

Barr and company counter that all they want is a "back door" built into communications services so that they can gain access when necessary—and only after they jump through all the legal niceties, we're assured.

But weakened, government-accessible encryption isn't a magic solution that will be used only to catch bad guys. It will be weakened encryption, period.

"The problem with backdoors is known—any alternate channel devoted to access by one party will undoubtedly be discovered, accessed, and abused by another," notes David Ruiz, a writer with the internet security firm Malwarebytes Labs. "Cybersecurity researchers have repeatedly argued for years that, when it comes to encryption technology, the risk of weakening the security of countless individuals is too high."

"Encryption is one of the few security techniques that mostly works. We can't afford to mess it up," cautions Matt Blaze, a cybersecurity expert at the University of Pennsylvania. "As someone who's been working on securing the 'net for going on three decades now, having to repeatedly engage with this 'why can't you just weaken the one tool you have that actually works' nonsense is utterly exhausting."

How can we know that the critics are right? Because the U.S. government itself claims that a Chinese company has, for years, been misusing exactly such back doors.

"U.S. officials say Huawei Technologies Co. can covertly access mobile-phone networks around the world through 'back doors' designed for use by law enforcement, as Washington tries to persuade allies to exclude the Chinese company from their networks," the Wall Street Journal reported on February 12.

Well, it's only fair. For half a century, the CIA and German intelligence spied on international communications courtesy of back doors they built into the products of Crypto AG, a company the agencies co-owned.

The CIA and its German partner kept that arrangement secret for a long time, but mandated access to everybody's messaging apps would be public knowledge and serious hacker-bait. It might even be a target for bad actors wielding the hacking tools that were stolen in 2017 from the National Security Agency—a breach generally considered among the most significant events in cybersecurity.

Whoopsies. It's almost like you really shouldn't trust government types with the ability to peruse your communications and paw through your data.

Despite that history, Senators Lindsey Graham (R-S.C.) and Richard Blumenthal (D-Conn.) are floating a bill that would make tech companies "earn" Section 230 protection against liability for other people's communications that pass through their platforms by adopting "best practices" that satisfy amorphous government standards.

"The AG could single-handedly rewrite the 'best practices' to state that any provider that offers end-to-end encryption is categorically excluded from taking advantage of this safe-harbor option," writes Riana Pfefferkorn, associate director of surveillance and cybersecurity at Stanford Law School. "Or he could simply refuse to certify a set of best practices that aren't sufficiently condemnatory of encryption. If the AG doesn't finalize a set of best practices, then this entire safe-harbor option just vanishes."

The whole thing is cloaked in the language of "child sex-abuse material" so that privacy advocates have to argue against a measure nominally aimed at kiddy porn in order to protect strong encryption for everybody's communications. Yes, once again, government officials pretend that the terrible things they want to do are all about protecting the children.

Meanwhile, any back doors forced into our encrypted communications are likely to affect harmless people more than they inconvenience criminals and terrorists.

"Short of a form of government intervention in technology that appears contemplated by no one outside of the most despotic regimes, communication channels resistant to surveillance will always exist," states a 2016 report from the Berkman Center for Internet and Society at Harvard University.

Unwilling to rely on commercial products that may or may not keep their secrets, criminals and terrorists develop their own encryption products—including secure phones. They're very unlikely to comply with law enforcement demands for back doors.

"I think there's no way we solve this entire problem," the FBI's Comey admitted to the U.S. Senate Judiciary Committee in 2015. "Encryption is always going to be available to the sophisticated user."

But what about the rest of us? Despite all the evidence of the foolishness of their efforts, government officials keep trying to make us expose our data to them and the criminals who ride on their coattails.


Editor's Note: We invite comments and request that they be civil and on-topic. We do not moderate or assume any responsibility for comments, which are owned by the readers who post them. Comments do not represent the views of Reason.com or Reason Foundation. We reserve the right to delete any comment for any reason at any time. Report abuses.

  1. Is it time to break out my old PGP T-Shirt? Is it still classified as munitions?

    1. I remember when it came out that there was a list of trigger words the snoops were looking for and it became somewhat common in certain communities to regularly include a list of those words in communications. Maybe it’s time to use a program that randomly sends out texts and e-mails consisting of a random string of characters. Decipher that!

      00001 99731 53916 05454 26816 17914 82032 91637 59650 41734 22719 50489 86817 64764 45383 24603 51602 88971 34589 21726 61220 21534 00009 The sparrow flies at midnight – time to flip the pancakes. G has been notified.

  2. The greatest threat to the U.S. is government. They are not our protectors, they are our enemy. They act like occupiers, not representatives. FTG. They should be encouraging, not discouraging the strongest encryption possible.

    1. Indeed. They view private citizens as a pesky annoyance at best.

    2. “The greatest threat to the U.S. is government. They are not our protectors, they are our enemy.” — Well Said…

      I think it was famously worded in history a little differently as, “The necessary !!-EVIL-!! that must exist in any civilized nation”.

      As such they should be thought of as a necessary evil instead of as the, “disease masqueraded as its own cure”, it is often thought to be.

  3. Wouldn’t just outlawing all social media companies, and messaging apps solve this ‘problem’?
    If everybody has to use snail mail, not only would crooks be unable to use electronic encryption, the general public would have to learn to read and write again. That would give us the 99% literacy Bernie wants without having to actually elect him.

    1. And a technology level more like Cuba!

    2. “Wouldn’t just outlawing all social media companies, and messaging apps solve this ‘problem’?”

      It would be more effective were the internet to require some kind of unique authentication key for each user to be entered on login. Companies like Twitter and Facebook require users to use their real names, so outlawing them would achieve little or nothing.

      1. “Companies like Twitter and Facebook require users to use their real names”

        When I had a FB I didn’t. Xing-Fe Rodriguez isn’t a real person as far as I know. Nor did a lot of my friends. My partner had a twitter and didn’t use their real name. I can find someone that isn’t on FB or twitter, make accounts under their real names, and as a result make the “real name” aspect entirely useless. Services like Paypal and Equifax require real names to sign up. That’s why they need your SSN.

        1. I know that people sign up on Facebook using false names, but isn’t that a violation of their terms, and couldn’t it lead to negative consequences?

          If we really wanted to get serious about this attribution problem, we could implement something like what Paypal has for all users every time they login to the internet.

          1. Seeing as they subjectively dance around their own terms we can both assume the TOS is pretty meaningless unless they want to use it to defend themselves. False names are only removed if reported and the person doesn’t show an ID. By that time the damage would already be done.

            Unless it’s unconstitutional I see them having to go the Paypal route to retain the immunity. If someone has a complaint and you hand over who it was then you’re not liable. If you can’t show who it was then the burden falls on your company. That provides incentive to verify everyone.

            If it is unconstitutional to provide privileges in such a way then I see 230 going and being replaced with something else that will say they’re liable.

  4. “But weakened, government-accessible encryption isn’t a magic solution that will be used only to catch bad guys. It will be weakened encryption, period.”

    Underlying all of this is a mistaken assumption that keeping our communications from the prying eyes of government can’t be a good thing all by itself–regardless of other considerations.

    We live in a time when plenty of people think that racists, homophobes, xenophobes, and misogynists shouldn’t be allowed to hold management positions in government or the private sector. Meanwhile, there are plenty of progressives who will enthusiastically tell us that opposing affirmative action is racist, opposing gay marriage is homophobic, support for President Trump’s wall is xenophobic, and opposing abortion is misogynist. There is no good reason to assume that the social justice warriors of the future won’t use the ability to monitor our communications to vet such people from positions of authority. If the progressives could use what people say to each other to keep racists, homophobes, xenophobes, and misogynists out of management positions, why wouldn’t they–because they care so much about the First Amendment?!

    How many times have you heard social justice warriors say that the First Amendment doesn’t protect hate speech?

    People on the left have legitimate things to fear from government monitoring their speech.

    There are plenty of Second Amendment advocates who will openly argue that one of the legitimate reasons to own firearms is so that we can band together and overthrow the government in a violent insurrection if that ever becomes necessary to reestablish and secure our rights and freedoms. I wish more privacy advocates were willing to make the same kind of case for encryption. People who are concerned about burglars should be free to own a firearm, and if people want to use encryption to protect themselves from cyber-criminals, that’s fine. However, if people want to use encryption to hide their communications from the government, that’s fine, too!

    1. No one has anything to hide until their lifestyle becomes illegal.

      1. Any gay (etc.) person over the age of about 30 should have a visceral, personal dread of loss of privacy and anonymity. It wasn’t that long ago that homosexual activities in many places in the US could land one in prison. Why so many gay folk are in league with big government on this is beyond all comprehension. The wind can shift against you just as fast as it came around to begin with.

        1. What does Islam or (cliche warning) Sharia law say?

      2. No one has anything to hide until their lifestyle becomes illegal.

        There is no way to prevent encrypted speech if plaintext is completely free and unfettered.

    2. There are plenty of Second Amendment advocates who will openly argue that one of the legitimate reasons to own firearms is so that we can band together and overthrow the government in a violent insurrection if that ever becomes necessary to reestablish and secure our rights and freedoms. I wish more privacy advocates were willing to make the same kind of case for encryption.

      2A Advocates: “Shall not be infringed…”
      Encryption Advocates: “Section 230 is the 1A of the internet…”

      You motherfuckers sowed the wind.

      1. I would have thought that a combination of 1A and 4A would be the encryption advocate go-to.

      2. For the hundredth time, Section 230 is about stopping necessarily frivolous lawsuits. People who aren’t even alleged by the plaintiff to have authored the defamation in question shouldn’t be required to answer in court for statements that they didn’t author, just as gun manufacturers shouldn’t be made to answer for robberies that they didn’t perpetrate. That’s more about due process than anything else. The connection to the First Amendment is only that 230 critics are trying to use injustice as a means to shut down social media platforms the way gun control advocates are trying to do an end around the Second Amendment by bankrupting gun manufacturers for mass shootings they didn’t perpetrate.

        I don’t get the connection between Section 230 and encryption anyway. We should be free to encrypt our communications regardless of whether social media platforms should be subjected to a blizzard of necessarily frivolous lawsuits.

        1. The issue is more to do with negligence as far as a lot of lawyers and politicians seem concerned. It is known that the products are being used to cause harm to people. Well-documented. Nothing is being done because CDA 230 prevents a lawsuit. A reasonable person would be able to see the small steps needed to correct the issues. If the party responsible is aware of an issue and doesn’t correct it and the result is severe emotional distress you could have a negligence claim that’s workable. CDA 230 instantly means you don’t.

          The gun manufacturer analogy is poor because the manufacturer has no control over what happens when they sell the gun. Same with a store. Tech companies have full control over the product while you use it. In fact, the end-user never owns the product. That one part of the constitution also hinders people from doing things about guns. Private companies don’t provide free speech.

          1. If it’s anything, it’s about breach of contract and fraud. The platforms shouldn’t be able to demonetize or deplatform people for putting up content that didn’t violate the terms of service at the time the content was published–changing terms of service notwithstanding. If they do that, then they owe the content creators some consideration for the time, effort, and lost revenue at the very least.

            Allowing necessarily frivolous lawsuits isn’t the solution to anything.

            P.S. Gun control advocates claim that the makers of AR-15s and 30-round magazines know exactly what their arms are manufactured to do–and they’re going after them for the speech in their advertising, among other things. It’s just like going after a brewer for drunk drivers because they sell their products in bars, where they know people will drive home, or because they market themselves as something to enjoy on festive occasions away from home. The reason gun manufacturers and brewers aren’t responsible for mass shootings and DWI manslaughter isn’t because of some arbitrary measure of “control”. It’s because the manufacturer and the brewer weren’t the ones who perpetrated the mass shooting or the manslaughter. It’s the same thing.

            You should not need to defend yourself in court from any charge when everyone in the court, including the plaintiff and or the prosecutor, agrees that you weren’t the one who actually perpetrated the crime.

            1. “The platforms shouldn’t be able to demonetize or deplatform people for putting up content that didn’t violate the terms of service”

              I disagree. They’re private. They’re a business. It’s working for them. They’re making more money. Ethically it may not be kosher, but it’s their business.

              Allowing lawsuits means products improve and harm is lessened. If my steering wheel comes off and I can’t sue they probably won’t correct the issue that made my steering wheel come off. My old car had an issue with gas pedal sticking. Since it was happening to multiple people it was better to fix it than to deal with lawsuits.

              With the tech companies a lawsuit isn’t possible so why bother? Let that person get impersonated, have their reputation, job, marriage ruined, and then sell the data of the fake account used to do it even though it was created and used from an IP that leads to a dead end. Can they fix that issue? You bet. Why bother though? They going to get sued if they don’t? No.

              A magazine isn’t a firearm. It’s a magazine. The magazine didn’t fire the shots. It’s pretty far removed. While I disagree with the bumpstock ban, it does seem to circumvent the ban on automatic weapons. It wasn’t a shock that they went after it.

              1. “I disagree. They’re private. They’re a business. It’s working for them. They’re making more money. Ethically it may not be kosher, but it’s their business.”

                Actually, it’s almost certainly a violation of basic contract law.

                If I put up a poster saying I’ll give anyone a reward of $100 for finding my cat, if you go out and find my cat, I’m contractually obligated to pay you $100. And putting a disclaimer at the bottom that I can change the terms of the contract at any time is routinely ignored by the courts–because if one party can change the terms of the contract at any time, then enforcing that would be legalized fraud. They spent time, effort, and money to find your cat, and they brought it to you in good faith, and now you owe them $100.

                In the case of a platform like YouTube, plenty of content creators created content for YouTube, only to have YouTube change the terms of the contract unilaterally and destroy them completely. YouTube built a business on the backs of content creators like the ones they deplatformed. You don’t get to build a business on the backs of content creators on the basis of what amounts to a fraudulent lost cat sign–collect the benefit of the cat–and then hand the content creators a big bag of nothing but a lost audience for their efforts.

                If YouTube decides they don’t want the cat after all, that’s one thing. Doesn’t mean they don’t owe the person who found the cat what was promised in the contract.

                1. They all say they can remove content as they see fit. Seems pretty tight to me.

                  1. Does it say they can deplatform you on the basis of standards that didn’t exist in the ToS at the time the content was originally created?

                    1. Yes. Everyone that has been kicked off of any website and sued has lost the case. A private company can choose to remove anything that you decide to host on their service. That’s how it should be.

                      The only stipulation may be if someone paid for advertising, was removed, and money was not refunded. I think a case is currently going through the courts.

    3. “and if people want to use encryption to protect themselves from cyber-criminals”

      Fraud and Identity theft are the types of cyber-crime that people are most likely to face. If you want to avoid being victimized, a little common sense and skepticism is required. And think before you click. Encryption is an unnecessary complication and not going to help a willing dupe.

      “However, if people want to use encryption to hide their communications from the government, that’s fine, too!”

      No. Two words: Meta data. The military has no idea of the actual content being communicated, but the meta data gathered can be enough to warrant death by drone.

  5. “Encryption is always going to be available to the sophisticated user.”

    And truly unbreakable encryption is always going to be available to the sophisticated user. One-time pads are trivially easy to implement, fairly easy to use, and unbreakable without access to the pad.

    The only difficulties are exchanging the pads, and securing the unused pads. Destroying pads after they are used is a good plan, too.
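The mechanics really are that trivial. A minimal sketch in Python, assuming the pad is truly random, at least as long as the message, and never reused (the function and variable names here are purely illustrative):

```python
import secrets

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    """XOR each message byte with the corresponding pad byte."""
    assert len(pad) >= len(message), "pad must cover the whole message"
    return bytes(m ^ p for m, p in zip(message, pad))

# XOR is its own inverse, so decryption is the same operation.
otp_decrypt = otp_encrypt

msg = b"flip the pancakes at midnight"
pad = secrets.token_bytes(len(msg))  # exchange securely, use once, destroy
ct = otp_encrypt(msg, pad)
assert otp_decrypt(ct, pad) == msg
```

Without the pad, the ciphertext is equally consistent with every possible plaintext of the same length, which is why the scheme is unbreakable in principle; the hard part, as the comment says, is distributing and protecting the pads.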

    1. Encryption on the internet requires enormous prime numbers, maybe 300 digits long, or longer. The strength of the encryption relies on the difficulty of finding these numbers. I would have thought that there must be a list of these big prime numbers somewhere on the net which code breakers can consult, making their work much easier.

      1. I would have thought that there must be a list of these big prime numbers somewhere on the net which code breakers can consult, making their work much easier.

        I’m sure you would have thought that, given your general inability to reason.

        1. “given your general inability to reason.”

          Can you articulate what you find unreasonable? A prime number will remain a prime number when added to a list. Even when it’s 300 digits long.

          1. There are more than 10^297 prime numbers that are 300 digits long. By comparison, there are about 10^80 atoms in the observable universe. You wouldn’t even have begun to “make a list” even if you could somehow use every single atom in the universe to represent a prime number.
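That count can be sanity-checked with the prime number theorem, which says the number of primes below x is roughly x/ln(x). A rough Python estimate (an approximation, not an exact count):

```python
import math

def approx_prime_count(digits: int) -> float:
    """Estimate how many primes have exactly `digits` digits (digits >= 2),
    using the prime number theorem: pi(x) ~ x / ln(x)."""
    ln10 = math.log(10)
    hi = 10.0 ** digits          # upper bound (exclusive)
    lo = 10.0 ** (digits - 1)    # smallest number with `digits` digits
    return hi / (digits * ln10) - lo / ((digits - 1) * ln10)

# About 1.3e297 primes have exactly 300 digits -- vastly more than
# the ~1e80 atoms in the observable universe.
print(f"{approx_prime_count(300):.1e}")
```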

            1. I think mtrueman is falling victim to something I used to see a lot when I taught physics. People in general have a hard time visualizing really big numbers. So when someone says 300 digits long, they really have no idea what that means. I got distracted by a few students regarding big numbers (I think we were discussing the size of the known universe or something) and I brought up how big a googol was. Then I mentioned a googolplex. I tried to explain how meaningless the concept of a googolplex was to anything beyond math trivia. A student challenged me that he could write out a googolplex. He truly couldn’t understand how many fucking zeroes there are.
              People get so used to hearing billions or trillions. But they generally don’t have any idea that a trillion written out is 1,000,000,000,000. And that is just 10^12.

              1. I understand large numbers as well as humans can reasonably be expected to. But I’m not a cryptographer. What mystifies me is how do cryptographers arrive at these prime numbers in the first place, and what’s stopping would-be code breakers from following in their footsteps?

                A few weeks back I wrote a little code for my Android to use with the “CLrepl” app:

(defun isprim (tar)
  ;; Trial division: test candidate divisors from (ceiling tar 2)
  ;; down to 2. Uses = for numeric comparison (eq is not guaranteed
  ;; to work on numbers) and labels instead of a nested defun
  ;; (which would redefine a global function on every call).
  (labels ((moop (count)
             (cond ((= count 1) t)
                   ((zerop (mod tar count)) nil)
                   (t (moop (- count 1))))))
    (moop (ceiling tar 2))))

                I figure cryptographers have to do something similar to come up with a number from which to start. And if they can do it, why not someone who would follow them, especially someone, like the government, who can surveil computer systems from the get-go before any encryption is implemented.

                1. I understand large numbers as well as humans can reasonably be expected to.

                  Your earlier comment suggests that you don’t.

                  1. “Your earlier comment suggests that you don’t.”

                    How about the comment you are responding to?

            2. So how do we arrive at these prime numbers in the first place when we want to encrypt? And what’s to stop government from finding out these numbers, other than the difficulties of a brute force search that you mention?

              1. The classic gentle introduction to the topic area is “Applied Cryptography” by Bruce Schneier. In short, cryptography doesn’t work the way your posts would seem to indicate that you think that it does, or perhaps guess that it does. Regardless, that book will answer the questions you’ve posed and much more. If you want something a bit less weighty (it’s a long-ish book), wikipedia has articles about prime number generation, cryptography, and steganography – and since these aren’t emotionally loaded topics (unlike political issues), wikipedia is actually not a bad initial resource for that kind of thing, though wiki vandals can strike anywhere.
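To make the answer to "how do they come up with the primes" concrete: implementations typically draw random odd candidates of the desired size and run a fast probabilistic primality test, such as Miller-Rabin, until one passes. The method is public knowledge; knowing it doesn't help an attacker, because the space of candidates is astronomically large. A sketch in Python (the parameter choices are illustrative, not production advice):

```python
import random

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % small == 0:
            return n == small
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # found a witness that n is composite
    return True

def random_prime(bits: int) -> int:
    """Draw random odd candidates with the top bit set until one passes."""
    while True:
        n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(n):
            return n

p = random_prime(128)  # real RSA moduli use primes of 1024+ bits
```

By the prime number theorem, roughly one candidate in a few hundred is prime at these sizes, so the loop terminates quickly in practice.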

      2. Some types of encryption on the internet, like public key encryption, do make use of large primes. But it’s not as simple as just having the prime number, because there’s modular math and very large exponents involved too. Having a table of prime numbers is not particularly helpful.

        Not all encryption on the internet relies on large primes, though. Elliptic curve cryptography doesn’t depend on factoring large primes.

        Generally, encryption depends on hard math problems. Modular exponentiation with large primes and elliptic curves share a property that is very valuable for cryptography purposes: the operations are relatively easy to perform and extremely difficult to reverse.

        One-time pads, though, barely require math at all, and without access to the key they generally cannot be broken.

      3. From a tutorial:

        It is very simple to multiply numbers together, especially with computers. But it can be very difficult to factor numbers. For example, if I ask you to multiply together 34537 and 99991, it is a simple matter to punch those numbers into a calculator and get 3453389167. But the reverse problem is much harder.

        Suppose I give you the number 1459160519. I’ll even tell you that I got it by multiplying together two integers. Can you tell me what they are? This is a very difficult problem. A computer can factor that number fairly quickly, but (although there are some tricks) it basically does it by trying most of the possible combinations. For any size number, the computer has to check something on the order of the square root of the number to be factored. In this case, that square root is roughly 38000.

        Now it doesn’t take a computer long to try out 38000 possibilities, but what if the number to be factored is not ten digits, but rather 400 digits? The square root of a number with 400 digits is a number with 200 digits. The lifetime of the universe is approximately 10^18 seconds – an 18 digit number. Assuming a computer could test one million factorizations per second, in the lifetime of the universe it could check 10^24 possibilities. But for a 400 digit product, there are 10^200 possibilities. This means the computer would have to run for 10^176 times the life of the universe to factor the large number.

        It is, however, not too hard to check whether a number is prime – in other words, to check that it cannot be factored. If a number is not prime, it is difficult to factor, but if it is prime, it is not hard to show that it is prime.

        So RSA encryption works like this. I will find two huge prime numbers, p and q, that have 100 or maybe 200 digits each. I will keep those two numbers secret (they are my private key), and I will multiply them together to make a number N = pq. That number N is basically my public key. It is relatively easy for me to get N; I just need to multiply my two numbers. But if you know N, it is basically impossible for you to find p and q. To get them, you need to factor N, which seems to be an incredibly difficult problem.

        [There’s a lot more, but basically realizing that there is a special mathematical relationship between two primes, p and q, and the product (p – 1)(q – 1), and how exponentiation and modulo arithmetic works…just knowing a bunch of large prime numbers is not particularly helpful…there are 24,739,954,287,740,860 prime numbers less than 10^18…As of December 2018 the largest known prime number has 24,862,048 decimal digits. ].
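The recipe above can be walked end to end with deliberately tiny primes. This is a classic textbook-sized example rather than the tutorial's own numbers, and it is trivially breakable at this scale; the point is only to show the (p - 1)(q - 1) relationship and the modular exponentiation in action (requires Python 3.8+ for the modular inverse via pow):

```python
# Toy RSA with tiny primes -- illustration only, utterly insecure.
p, q = 61, 53                  # the secret primes (private key material)
N = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # the (p-1)(q-1) quantity the tutorial mentions
e = 17                         # public exponent, chosen coprime to phi
d = pow(e, -1, phi)            # private exponent: inverse of e mod phi (3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, N)        # anyone who knows (N, e) can do this

def decrypt(c: int) -> int:
    return pow(c, d, N)        # only the holder of d (hence p and q) can

m = 65
c = encrypt(m)
assert decrypt(c) == m         # the round trip recovers the message
```

Breaking it means recovering d, which requires phi, which requires factoring N back into p and q: instant at 3233, hopeless (as the tutorial says) at 400 digits.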

        1. When I started to look into cryptography like RSA, or especially Diffie-Hellman, which I think is the more elegant, I was surprised to see how simple the math was. Modulo arithmetic and exponentiation shouldn’t be beyond the abilities of a motivated high schooler. I’ve never heard of elliptic curve encryption, so I’ll have to take a look.
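For what it's worth, the whole Diffie-Hellman exchange fits in a few lines. A toy sketch with tiny, purely illustrative parameters (real deployments use primes thousands of bits long):

```python
# Toy Diffie-Hellman key exchange -- tiny numbers for illustration only.
p = 23   # public prime modulus
g = 5    # public generator

a = 6    # Alice's secret exponent (never transmitted)
b = 15   # Bob's secret exponent (never transmitted)

A = pow(g, a, p)   # Alice publishes 5^6 mod 23 = 8
B = pow(g, b, p)   # Bob publishes 5^15 mod 23 = 19

# Each side raises the other's public value to its own secret exponent;
# both arrive at g^(a*b) mod p without ever sending the secrets.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob == 2
```

An eavesdropper sees p, g, 8, and 19; recovering the secret exponents from those is the discrete logarithm problem, easy at this size and intractable at real sizes.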

  6. Government officials keep trying to make us expose our data to them—and the criminals who ride on their coattails.

    the government *is* the criminal here.

    1. The criminal tie-in is a red herring. The prime problem is government. Always has been, always will be.

      You’re generally not going to get too much crap for shooting at a criminal breaking into your home (unless you’re in Massachusetts), but shooting at any government person is going to get you in hot water PDQ, even if any rational person would agree that it was justified.

  7. C’mon, what good is having a big, comprehensive government if it can’t track our movements and communications? We should be making it easier for our overlords to enforce right-thinking and quash dissidents.

  8. Wow. A whole article on encryption written by the most fly-over libertarian Reason has to offer and not one mention of free speech or the 1A. I wonder if we’d get an article from Tuccille about how the government shouldn’t be able to dictate how (not) secure the safe you lock your guns up in should be without mentioning the 2A? Good thing Reason has allowed the focus, definitions, and narratives to shift away from the 1A entirely in favor of section 230. I’m sure you guys will have lots of luck telling Congress they can’t re-write or reinterpret the law that they wrote.

    Jesus Christ has this magazine gone to absolute shit.

    1. I’m not sure what it has to do with the 1st amendment or section 230.

  9. Despite that history, Senators Lindsey Graham (R-S.C.) and Richard Blumenthal (D-Conn.) are floating a bill that would make tech companies “earn” Section 230 protection against liability for other people’s communications that pass through their platforms by adopting “best practices” that satisfy amorphous government standards.

    Because when I think “best practices”, I automatically think “Congress”.

  10. Encryption is becoming a thing. People want it. Services are offering it. What good is a backdoor if you don’t know who is sending or receiving? Anonymity is a thing. People want it. Services are offering it. Your backdoor only becomes a thing of value if you eradicate the anonymity.

  11. “By enabling dangerous criminals to cloak their communications and activities behind an essentially impenetrable digital shield, the deployment of warrant-proof encryption is already imposing huge costs on society,” Attorney General Bill Barr huffed last summer

    He needs to take that up with God. God created a universe with laws that make encryption easy.

    In any case, the arguments against encryption control are pretty much the same as the arguments against gun control. You can’t consistently oppose gun control and support encryption control.

    1. “God created a universe with laws that make encryption easy. ”

      She made a universe where meta data analysis is even easier. And meta data analysis is enough for a government to issue a sentence of death by drone. Your encrypted communications may still be intact, but what good is that if you’ve been blown to pieces by jack-booted government thugs.


  13. Those who would give up essential liberty, to purchase a little temporary safety, deserve neither liberty nor safety. Benjamin Franklin (1706-1790)

    1. By using private companies for your speech you’re giving up liberty to have them pay the hosting costs.

      Whether we like it or not, once someone provides a for-profit program and kids are using it to negotiate drug deals, the government pops its head in and says “Hey guys! What’s going on here?”

      Every fucking time.

      If anyone out there wants to start a tech business my advice is to not go public with your product until you figure out how to prevent people from using it to sell humans, drugs, or send illegal nudes. If you can’t figure out how to do that then accept that humans are a massive liability and the unknown variables aren’t worth it.

  14. Barr should be looking at the “dangerous criminals” in his own department before worrying about those outside of government.

