The CIA Can't Protect Its Own Hacking Tools. Why Should We Trust Government Privacy and Security Proposals?
The very idea that our intelligence agencies could keep encryption bypasses secret is absurd.

We are often told that law enforcement must have a way to get around strong encryption technologies in order to catch bad guys. Such a "backdoor" into security techniques would only be used when necessary and would be closely guarded so it would not fall into the wrong hands, the story goes.
The intelligence community does not yet have a known custom-built backdoor into encryption. But intelligence agencies do hold a trove of publicly unknown vulnerabilities, called "zero days," they use to obtain hard-to-get data. One would hope that government agencies, especially those explicitly dedicated to security, could adequately protect these potent weapons.
A recently released 2017 DOJ investigation into the 2016 breach of the CIA Center for Cyber Intelligence's (CCI) "Vault 7" hacking tools suggests that might be too big of an ask. Not only was the CCI found to be more interested in "building up cyber tools than keeping them secure"; the nation's top spy agency also routinely made rookie security mistakes that ultimately allowed personnel to leak the goods to WikiLeaks.
The released portions of the report are frankly embarrassing. The CCI cyber arsenal was not appropriately compartmentalized, users routinely shared admin-level passwords without oversight, there seemed to be few controls over what content users could access, and data was stored and available to all users indefinitely. No wonder there was a breach.
It gets worse. Because the CIA servers lacked activity monitoring and audit capabilities, the agency did not even realize it had been hacked until WikiLeaks published the material in March of 2017. As the report notes, had the data been quietly stolen by a hostile foreign government like, say, China, the CIA might still be in the dark about it. Might there be other unknown breaches that fit this bill?
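To make the point concrete, the kind of activity monitoring the report found missing does not have to be exotic. Here is a minimal, purely illustrative sketch of a tamper-evident audit log in Python (my own toy example, not anything the agency actually runs): each entry is chained to the previous one by a hash, so a deleted or edited record breaks the chain and is detectable.

import hashlib
import json
import time

class AuditLog:
    """Toy hash-chained (tamper-evident) activity log."""

    def __init__(self):
        self.entries = []
        self.prev_hash = "0" * 64  # placeholder "genesis" hash

    def record(self, user: str, action: str) -> None:
        entry = {"ts": time.time(), "user": user, "action": action,
                 "prev": self.prev_hash}
        entry_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append((entry, entry_hash))
        self.prev_hash = entry_hash

    def verify(self) -> bool:
        prev = "0" * 64
        for entry, stored_hash in self.entries:
            if entry["prev"] != prev:
                return False  # an entry was removed or reordered
            recomputed = hashlib.sha256(
                json.dumps(entry, sort_keys=True).encode()).hexdigest()
            if recomputed != stored_hash:
                return False  # an entry was altered after the fact
            prev = stored_hash
        return True

log = AuditLog()
log.record("analyst1", "copied tool archive to removable media")
print(log.verify())  # True until someone tampers with the log

Even something this crude would at least leave a trail when large volumes of tooling walk out the door; by the report's account, the CCI had nothing comparable.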
The report recommended several measures the CIA should take to shore up its internal defenses. Among the few that were not redacted: do a better job of protecting zero days and vetting personnel. Okay, so don't make all of the same mistakes again: got it.
Well, it looks like even this goal was too ambitious for the CIA. Intelligence gadfly Sen. Ron Wyden (D–Ore.), who first publicized the report, wrote a letter to Director of National Intelligence John Ratcliffe stating that "the intelligence community is still lagging behind" three years after the report was written. He demanded public answers for outstanding security problems in the intelligence community, such as the continued absence of basic practices like multi-factor authentication and email authentication protocols.
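For readers unfamiliar with the jargon in Wyden's letter, multi-factor authentication typically pairs a password with a short-lived code derived from a shared secret, as in the standard time-based one-time password scheme (RFC 6238). A rough sketch using only Python's standard library, offered purely to show how little magic is involved:

import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Generate an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # 30-second time step
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Example with a made-up shared secret; a real deployment provisions this
# per user and verifies the code server-side alongside the password.
print(totp("JBSWY3DPEHPK3PXP"))

That agencies entrusted with the nation's most sensitive tools reportedly skip even this is the point of Wyden's complaint.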
What a snafu. It is absurd enough that the CIA of all places cannot even implement basic password protection practices. But when intelligence hacking units cannot even manage to protect their own hacking tools, our troubles multiply.
The CIA is unfortunately not uniquely incompetent among the intelligence community. The National Security Agency (NSA) found itself the victim of a similar zero-day leak in the 2016 Shadow Brokers dump. These are just two incidents that the public knows about. A culture of lax security practices invites attacks from all kinds of actors. We don't know how many times such hacking tools may have been discovered by more secretive outfits.
Many policy implications follow. There is a strong case to be made that intelligence agencies should not hoard zero-day vulnerabilities at all but should report them to the appropriate body for quick patching. This limits their toolkit, but it makes everyone safer overall. Of course, foreign and other hostile entities are unlikely to unilaterally disarm in this way.
The intelligence community supposedly has a process for vetting which zero days should be reported and which are appropriate to keep secret, called the Vulnerabilities Equities Process (VEP). Agencies must describe a vulnerability to a board that decides whether it's dangerous enough to need patching or useful enough to retain for spying purposes.
For example, a vulnerability in some technology that is only used in China would probably be kept for operations. Theoretically, a vulnerability in some technology that is widely used in the United States would be reported for fixing to keep Americans safe. As these incidents show, this does not always happen.
The VEP is clearly insufficient, given these high-profile breaches. The very least the intelligence community can do is appropriately secure the bugs they've got. Efforts like Wyden's seek to impose more accountability on these practices.
There's a more general lesson about government efforts to improve security and privacy as well.
As implied earlier, we should strongly resist government efforts to compromise encryption in the name of law enforcement or anything else. Some of the most technically savvy government bodies cannot even secure the secret weapons they have never advertised. Can you imagine the attack vectors if they were publicly known to hold some master encryption-breaking technique?
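To see why, consider a toy key-escrow scheme, which is roughly what most "lawful access" proposals boil down to: every message key is also wrapped under a single government-held master key. The sketch below is my own illustration, assuming the third-party Python cryptography package and invented function names; the takeaway is that one leaked master key retroactively unlocks everything ever wrapped under it.

from cryptography.fernet import Fernet  # pip install cryptography

escrow_master = Fernet.generate_key()   # the closely guarded "backdoor" key

def encrypt_with_escrow(plaintext: bytes):
    message_key = Fernet.generate_key()               # fresh key per message
    ciphertext = Fernet(message_key).encrypt(plaintext)
    # The backdoor: the message key is also wrapped under the escrow key,
    # so whoever holds escrow_master can recover it later.
    wrapped_key = Fernet(escrow_master).encrypt(message_key)
    return ciphertext, wrapped_key

def decrypt_via_escrow(ciphertext: bytes, wrapped_key: bytes, master: bytes) -> bytes:
    message_key = Fernet(master).decrypt(wrapped_key)
    return Fernet(message_key).decrypt(ciphertext)

ct, wk = encrypt_with_escrow(b"ordinary citizen traffic")
# A single leaked value -- escrow_master -- opens every message ever sent this way.
print(decrypt_via_escrow(ct, wk, escrow_master))

An agency that cannot keep a zero-day cache off the front page would be asked to keep that one value secret forever.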
It also demonstrates the weaknesses of many top-down proposals to promote privacy or security. Government plans often attempt to sketch out master checklists that must be followed perfectly at every level to work. They can be time-consuming and burdensome, which means that personnel often cut corners and shirk accountability. Then, when disaster inevitably strikes, the conclusion is that "people didn't stick to the plan hard enough," not that the plan was unrealistic to begin with.
There isn't a lot that the public can do about seemingly out-of-control intelligence agencies failing to secure potent cyberweapons beyond making a fuss. "National security" and all that. But it does give us a powerful argument against granting these insecure intelligence bodies more power to break strong encryption. Governments can't even protect their secret cyberweapons. They almost certainly will not be able to protect a known backdoor into encryption.
users routinely shared admin-level passwords without oversight
lol. When everyone is an admin, no one is.
I'll just take this opportunity to point out that if someone has access to your computer/phone/device then they have access to any unencrypted data on that device. A login password (I'm looking at you Windows!) is less than worthless if the data isn't encrypted. Just FYI. Phones tend to be more secure because your login password does actually encrypt everything on the phone. But everything on your PC is just dangling out there for anyone to see if they have physical access to the machine, or if your network isn't secure.
Windows is fundamentally insecure. If it weren't for AutoCAD, I wouldn't have had any use for Windows for years. Now that AutoCAD has a web app, I can't imagine why I'd use Windows again. Lack of driver support on a laptop maybe?
Excuse me Ken. Have you heard of GNU/Linux - the free (as in beer and freedom) and open-source operating system that respects your rights as a user?
Not only have I heard of it. I've been using it for a long, long time.
Not a fan of Ubuntu, since they were caught with their hand in the cookie jar.
https://www.gnu.org/philosophy/ubuntu-spyware.en.html
"Vulnerabilities Equities Process." I can't think of a more striking example of bureaucratese utterly devoid of semantic meaning. It's linguistic sludge designed purely to cloak the true purpose.
That one jumped off the page at me too. Government employees do love their acronyms.
I'm a little disappointed that they couldn't come up with a word that starts with "I" instead of "Equities" so that they could have the acronym be VIP. That's just sloppy, although I guess I shouldn't expect anything more from them.
I still recommend using Signal for everything.
"By design, Signal does not have a record of your contacts, social graph, conversation list, location, user avatar, user profile name, group memberships, group titles, or group avatars. The end-to-end encrypted contents of every message and voice/video call are protected by keys that are entirely inaccessible to us. In most cases now we don’t even have access to who is messaging whom.
https://signal.org/blog/setback-in-the-outback/
There is no backdoor, and if there were one, it would lead to nothing but the time your account was activated and the time your account was deactivated. That's all they have to give a court.
It's honestly probably a good thing the CIA can't protect their own tools because, if they could, we would never know these tools exist at all, nor would we have any idea whether they were being used on us, which, as it turns out, other agencies have done.
If the CIA develops a tool, are they legally barred from sharing it with the NSA or FBI? I'd guess the answer is 'no, they are not'.
Because of hero Snowden, right? Bullshit. You people really are sheep who regurgitate what you're told. Everything you know or read about it was filtered through some person or organization with an agenda. All we have in America anymore are echo chambers, as this article and most others from Reason show.
Lots of conclusory language from someone who doesn't know what they're talking about.
The CIA are worse than nothing. They created computer viruses and hacks that they let escape. (The jury is still out on actual viruses; why was Ft. Devens shut down?) This vile agency needs to be abolished before they destroy all our American values.