We're From the Government and We're Here to 'Ghost' Read Your Emails

Habitually untrustworthy snoops still demand we trust them to monitor our communications.


Professional snoops are willing to back off their holy war against encryption—with just a few minor tradeoffs, of course. All we need do is allow them to invisibly join our conversations at will as "ghost" parties to encrypted communications. It would put our governmental overseers in the position of, say, that creepy neighbor who sneaks into the house of the couple next door and hides in the pantry while they lock the doors in the mistaken expectation of spending some quality time by themselves.

If you don't find that proposal reassuring, you're not alone. But let's see what the snoops are up to.

"It's relatively easy for a service provider to silently add a law enforcement participant to a group chat or call," mused Ian Levy, technical director of Britain's Government Communications Headquarters' (GCHQ) National Cyber Security Centre, and Crispin Robinson, GCHQ's technical director for cryptanalysis, in an article published last November. "This sort of solution seems to be no more intrusive than the virtual crocodile clips that our democratically elected representatives and judiciary authorise today in traditional voice intercept solutions and certainly doesn't give any government power they shouldn't have."

That Levy and Robinson gloss over a few details is obvious even to non-technical readers. While they evoke old-school wiretaps as featured in Hollywood movies, the "ghost proposal" would require redesigning communications services from the ground up to defeat users' attempts to maintain a modicum of privacy.

"This proposal to add a 'ghost' user would violate important human rights principles," an international consortium of civil liberties organizations, tech companies, and security professionals objected in a recent open letter. The signatories went on to specify that the scheme floated by the two GCHQ officials would "pose serious threats to cybersecurity and thereby also threaten fundamental human rights, including privacy and free expression."

According to the open letter by the consortium, which includes heavy-hitters such as the Electronic Frontier Foundation (EFF), Human Rights Watch, Apple, Microsoft, Bruce Schneier, and Philip Zimmermann:

To achieve this result, their proposal requires two changes to systems that would seriously undermine user security and trust. First, it would require service providers to surreptitiously inject a new public key into a conversation in response to a government demand. This would turn a two-way conversation into a group chat where the government is the additional participant, or add a secret government participant to an existing group chat. Second, in order to ensure the government is added to the conversation in secret, GCHQ's proposal would require messaging apps, service providers, and operating systems to change their software so that it would 1) change the encryption schemes used, and/or 2) mislead users by suppressing the notifications that routinely appear when a new communicant joins a chat.
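The mechanics the letter describes are easy to see in miniature. In an end-to-end scheme, the sender's client wraps each message key for every member's public key, so a silently injected "ghost" key stays silent only if clients are also modified to suppress the membership notice. The following toy Python sketch (illustrative names only, no real cryptography) models both halves of the proposal:

```python
# Toy model of a "ghost" participant in an end-to-end encrypted group chat.
# Illustrative only: stand-in strings play the role of real keypairs.
import secrets

class Member:
    def __init__(self, name):
        self.name = name
        self.private_key = secrets.token_hex(16)      # stand-in for a real key
        self.public_key = "pub:" + self.private_key

def wrap_message_key(message_key, members):
    # One wrapped copy of the message key per member's public key:
    # whoever appears here can decrypt the conversation.
    return {m.public_key: message_key for m in members}

def membership_notices(old_members, new_members, suppress=False):
    # Honest clients tell users when someone joins; the proposal
    # requires this notice to be suppressed for the ghost.
    added = {m.name for m in new_members} - {m.name for m in old_members}
    return [] if suppress else [f"{name} joined the chat" for name in sorted(added)]

alice, bob = Member("Alice"), Member("Bob")
chat = [alice, bob]

# The provider silently injects an extra key in response to a demand.
ghost = Member("GCHQ")
new_chat = chat + [ghost]

wrapped = wrap_message_key(secrets.token_hex(32), new_chat)
assert ghost.public_key in wrapped            # the ghost can read everything

print(membership_notices(chat, new_chat))                  # ['GCHQ joined the chat']
print(membership_notices(chat, new_chat, suppress=True))   # []
```

The only way the second call returns nothing is for the client software to lie about who is in the chat, which is exactly the change the open letter objects to.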

Don't worry, add Levy and Robinson, "almost all users aren't affected by it."

That's true—if they're referring to the people you'd expect them to want to monitor.

As a 2016 report from Harvard's Berkman Center for Internet and Society pointed out, many people and businesses don't really care to thoroughly conceal their communications. Most of us, despite the existence of powerful encryption, are pretty easy pickings for snoops.

But some people, for reasons good and evil, are more privacy minded. For the likes of terrorists, criminals, journalists, and political activists, the Berkman report concludes, "communication channels resistant to surveillance will always exist…new services and software can be made available without centralized vetting."

ISIS, the terrorist organization, apparently developed its own encrypted chat app several years ago specifically to evade outside efforts to intercept communications.

Just weeks ago, the Justice Department boasted that an international law enforcement effort had brought down "a criminal enterprise that facilitated the transnational importation and distribution of narcotics through the sale of encrypted communication devices and services." Adds the press release, "the government conservatively estimates there were at least 7,000 Phantom Secure devices in use."

Does GCHQ really think ISIS and Phantom Secure are the types of organizations likely to cooperate with the ghost proposal? Of course not. It's big companies catering to the bulk of the population that will play along. Privacy-minded people, good and bad, will ignore ghost proposal requirements.

But this isn't the first time government snoops have leveraged fears of terrorists and criminals to sell the public on backdoors into communications systems that would ease mass surveillance of the general public.

"We do not want to do anything that would damage our own national security or public safety by spreading unbreakable encryption, especially given the international nature of terrorism," warned the Clinton administration in 1996. It touted an ultimately failed effort to peddle limits on encryption exports and promote a "key escrow" scheme that would give the government access to everybody's communications.

In 2010, the feds came up with the idea of requiring communications providers to redesign their systems to ease wiretapping (maybe Levy and Robinson are just resuscitating old ideas).

And let's not forget the FBI's wild inflation of the number of encrypted phones that have thwarted its search efforts—by a factor of about eight. "The government has long held discredited views about encryption. Now we see the FBI is struggling with basic arithmetic," Sen. Ron Wyden (D-Ore.) coldly responded.

Governments in the nominally free West have, so far, been largely unsuccessful in their efforts to gain backdoor access to encrypted communications. That's a good thing given that such deliberate security vulnerabilities would almost certainly be abused.

How can we be sure? Because it already happened: Built-in wiretapping access to cellphones was exploited to bug high-ranking Greek government officials 15 years ago.

And that's assuming the government means well.

"Any functioning democracy will ensure that its law enforcement and intelligence methods are overseen independently, and that the public can be assured that any intrusions into people's lives are necessary and proportionate," Levy and Robinson promise.

But these ghost proposal authors work for GCHQ, an agency that, along with the NSA, FBI, and others, was implicated by Edward Snowden's disclosures in electronic surveillance that even the intrusive European Union condemned as a violation of privacy rights. Who do they think they're fooling?

Given their history of spying on whoever they want, whenever they want, that GCHQ and other government snoops still want mandated backdoors into communications systems is convincing testimony as to the continued effectiveness of encryption. The abuses in which they've already engaged are good reason to deny them any further formal permission to eavesdrop on conversations.

The consortium is right to push back against the GCHQ ghost proposal as a threat to privacy and free expression. And the rest of us—those who aren't reassured by lurkers in the pantry—should do our best to keep our encryption current and beyond the reach of nosy ghosts.



  1. I ain’t afraid of no ghost!
    Ghost busters!

    1. “Parody” busters are clearly needed everywhere in our nation to control the legitimacy of email correspondence, especially on college campuses. In addition to the benefits produced by a firm regime of surveillance, training and hiring a large control staff would also do wonders for the economy. One thing is clear: the current status quo clearly can no longer be tolerated, so let’s get to work fast. There is, incidentally, legal precedent that could easily be adapted as training material. See the documentation of our nation’s leading criminal “satire” case at:


      1. Didn’t you get caught trying to rip someone off?

        1. Excuse me?

  2. An old quote by Glenn “Instapundit” Reynolds is still applicable:

    “The surveillance state is part of the state. Where surveillance is a priority – say, when political enemies are concerned – it’ll be ruthlessly efficient. The rest of the time, like when it involves protecting Americans from terrorists, it’s just another government job.”

    1. “The rest of the time, like when it involves protecting Americans from terrorists, it’s just another government job.”

      Done incompetently by a total moron (too stupid to get a decent job in the private sector)

  3. I’d like to point out that consumer choice has already pretty much defeated this kind of proposal–and that defeat is becoming more pronounced all the time.

    About six months ago, Australia announced that they wanted to do something similar, concentrating on the enforcement side of requests for encrypted data. They wanted to fine services AUS$10 million for failing to comply with government requests for encrypted data. Signal’s response to that is instructive here:

    “By design, Signal does not have a record of your contacts, social graph, conversation list, location, user avatar, user profile name, group memberships, group titles, or group avatars,” Joshua Lund, a Signal developer wrote. “The end-to-end encrypted contents of every message and voice/video call are protected by keys that are entirely inaccessible to us. In most cases now we don’t even have access to who is messaging whom.”

    —Ars Technica


    Dear Government:

    How are we supposed to comply with a subpoena for Ken Shultz’s data when we can’t differentiate between Ken Shultz’s data and everyone else’s? We don’t even know who Ken Shultz is. We don’t even know if Ken Shultz is on our service. If you want Ken Shultz’s data, why don’t you send a subpoena to Ken Shultz? Here’s a USB drive with all the data we have on Ken Shultz. Yes, it’s empty. Now we’ve complied with your subpoena.


    XYZ messaging service

    P.S. Fuck you!

    One thing that might be worse than the government wanting to violate our privacy rights is well-intentioned people depending on the government to protect their privacy for them–when they’re perfectly capable of protecting their own privacy by exercising their own freedom of choice.

    Download Signal from the app store. It’s free. It uses the contacts you already have on your phone. If you’re so fucking lazy that you can’t be bothered to download a free program to protect your privacy, why should the government or anyone else bother to protect your privacy for you?

    1. I was wondering the same. At the end of the day, cryptography is just math, which can be and is implemented by many open source and commercial software packages. There’s no real way to enforce that people don’t use it.

      1. And different companies are increasingly coding their products so as to avoid any kind of accountability for their content.

        Lavabit (the email service) is another example. Yeah, they famously shut their service down rather than continue to operate without their customers knowing that the federal government was subpoenaing their data. When the service came back online, it was with a new protocol that strips all identifying metadata from encrypted emails; furthermore, the encryption keys can never be subpoenaed, because Lavabit doesn’t have access to them. All of the encryption and decryption is done locally.

        Protonmail has end-to-end encryption, and part of their innovation is that they’re in Switzerland, where the privacy laws are such that, in order to comply with an international subpoena for communications (rather than bank records), the alleged crime needs to have happened in Switzerland. Meanwhile, their app is such that you hardly notice that encrypted emails aren’t generally “sent” the way we think of it. You get a link to the message that was sent to your email address, and when you sign in to read it in their app from your phone, the message is still on their site rather than sent to you through your ISP. (You can send encrypted email the old-fashioned way, too, if you want.)

        The point is, government officials’ ability to reach our data is in a phase where they may imagine they can do more than they’re actually capable of doing. Signal used domain fronting for a long time in countries where the application was blocked, and if that isn’t working as well as it used to, I suspect they’ll find a way around that. There’s always an action-and-reaction aspect to this, with the services trying to stay one step ahead of vicious government actors.

        The bigger obstacle to me seems to be adoption of these technologies. I wish more people cared enough about their privacy to install and use great programs that are as good or better, from a usability standpoint, than the programs people are using now. I certainly plan to shame people who want the government to solve their problems for them, when they could be doing for themselves what they want the government to do. It’s pathetic to hear people complain about their lack of privacy, refuse to do simple and free things to protect it, and insist on government interference in the economy, essentially, because they’re too lazy to think for themselves.

        1. Yep, I agree 100%, especially the last paragraph. I’ve been using Signal with friends for a while now, especially after I discovered how insecure Telegram is. Still, some seem to (for no particular reason) insist on continuing to use TG or some other insecure messenger.
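The observation above that cryptography is "just math" is easy to make concrete: a one-time pad is a few lines of standard-library Python, and with a truly random key as long as the message it is information-theoretically unbreakable. A minimal sketch (assuming nothing beyond the standard library):

```python
# One-time pad: encryption as plain arithmetic. With a random, single-use
# key as long as the message, XOR encryption is information-theoretically
# secure -- there is no pattern left for an eavesdropper to exploit.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the matching key byte.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))    # random pad, used exactly once

ciphertext = xor_bytes(message, key)
recovered = xor_bytes(ciphertext, key)     # the same operation decrypts
assert recovered == message
```

Nothing here depends on any company, service, or license, which is why mandates aimed at providers cannot reach people willing to write a dozen lines themselves.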

    2. Not a knock against Signal, but as I always say when people discuss encryption, you need to protect the endpoints as well.

      I strongly recommend Signal and WhatsApp, but just like Tor, don’t pretend you’re invulnerable because you’ve installed an app.

      Former Trump campaign chair Paul Manafort found this out the hard way recently, when the FBI obtained messages he’d sent over WhatsApp from the people who received them.

      In another current investigation, the FBI was able to access Signal messages sent by former Senate Intelligence Committee aide James Wolfe, and had at least some information about the encrypted messaging habits of New York Times reporter Ali Watkins, after the Justice Department seized her communications records as part of a leak investigation. Though it’s unknown how the FBI gained access to these encrypted chats, it wouldn’t necessarily have taken a crypto-breaking backdoor if investigators had device access or records from other chat participants.

      1. Yeah, that is actually a great point. The messages are obviously decrypted when they arrive at the device, so if the device is compromised, you’re in trouble. Likewise, if the device ends up in someone else’s possession, they can just scroll through the chats, no need for fancy math.

        1. Good encryption is rarely “broken”. The real players are always looking at the endpoints. How can they be compromised, what’s the weakness between the user’s eyes and the device he’s reading it on?

      2. My first guess would be that the government took their phones in physical form, and these guys didn’t do anything to protect their phones from anyone who might pick them up. Some people undo the access security because it seems like a pain in the ass. If I’m Manafort or somebody attached to the Senate Intelligence Committee, I’d be really careful about keeping my phone secure.

        The FBI shouldn’t have any more interest in me–unless they want to see my conversations with friends and family, which are pretty boring. In the future, I suspect that we may live to see the day that disagreeing with affirmative action, thinking that bakers shouldn’t be forced to bake cakes against their will, etc. could be used as criteria by the FBI to make calls on things like gun transfers, no fly lists, or opportunities for employment.

        How many progressives think that racists, sexists, or homophobes should be allowed to work in management?

        I appreciate the point about how no one is secure just because they installed a program, and socially engineered hacks are surely the biggest security vulnerability of all. If you don’t want to be burglarized, you can get as fancy as you want with the technology, but you also ought to lock your front door when you go on vacation.

        I hope my point stands: we probably already have enough choices available to us to keep our data as reasonably secure as we want it to be, without much help from the government at all. In fact, even if they hate the fact that we can secure our data from their prying eyes, I don’t think they can do much about it.

  4. Don’t worry, add Levy and Robinson, “almost all users aren’t affected by it.”


  5. “It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call,” mused Ian Levy,

    I made a few private inquires last autumn about having a Go Topless rally at Robinson’s Arch, and by the first Shabbat of my current visit to Israel …

  6. “We’re From the Government and We’re Here to ‘Ghost’ Read Your Emails.”

    That’s not how it goes.
    It goes like this: “We’re from the Government, and we’re here to fuck you.”

  7. A presumption implicit in this and similar articles and commentary is that there ought to be an absolute right to some sort of privacy, with details usually not provided by the claimant. That is not stated or implied by the Fourth Amendment or anything else in the US Constitution, which requires only that the government specify and limit the search or seizure objective and give sworn statements of probable cause to believe a crime has been committed or is about to be. That done, the government has the lawful authority to search or seize within the prescribed limits.

    And the government, now faced with nearly perfect encryption that impedes its ability to conduct some searches, wants a remedy that, as many have observed, is largely empty. Those who are honest will comply when presented with a request or warrant. Those who are unwilling to comply but fail to secure the information will have it examined anyway and, perhaps, used against them or others in court. The proportionately small to tiny remainder will decline to comply with the government’s demands, but might find themselves confined for contempt of court, during which they can contemplate the trade-off between exercising their power to deny the government’s lawful access and allowing it, with different and possibly more severe consequences. It is not clear that requiring legal encryption to provide for lawful government access will change much unless use of unapproved encryption carries fairly severe punishment.

  8. What, no mention of Carnivore?

    That said, the weak point in encryption is, and probably always will be, the humans. It might take you ten years to crack someone’s password. But it’ll take about two hours with them in a dark room.

