The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
Keeper -- Loser, Weeper
Episode 210 of the Cyberlaw Podcast
In the news roundup, Nick Weaver, Ben Wittes, and I talk about the mild reheating of the encryption debate, sparked not just by renewed FBI pleading but by the collapse of the left-lib claim that building law enforcement access into IT devices is impossible because, uh, math. The National Academy report on encryption access has demonstrated that access is well within the zone of plausible technology policy, with support from a group of prominent tech experts, such as Ray Ozzie, all of whom know math.
Speaking of law enforcement, it was a good week for cybercrime enforcement. Nick and I touch on two victories for the good guys, with the Carbanak mastermind busted in Spain and Yevgeny Nikulin extradited to the US over Russian objections.
Meanwhile, DHS is moving forward on one of the more significant new efforts to control terrorist travel across borders -- using social media data. The agency will be requiring social media names (but not passwords) from visa applicants, according to a proposed rule now gathering comments. Maury Shenk, Ben, Nick, and I talk about the privacy and First Amendment issues implicated by the proposal. We don't agree on most of those issues.
But we find surprising unanimity in mocking Julian Assange for deservedly losing his internet access at the Ecuador embassy. The panel even endorses Matt Green's wicked suggestion for trolling Assange from the sidewalk outside Assange's Ecuadoran squat.
We close the news with a quick sack dance over the prone form of Keeper Security, which has dropped its libel suit against Dan Goodin and Ars Technica, probably because it was going to lose; the defendants' coverage of Keeper's serious security problems was straight and fair. Bottom line: there are plenty of good password managers; why use one whose management sues to suppress news about its product's security holes? When that sinks in, Keeper won't just be a loser; here's hoping it will be a weeper too.
Our interview with David Sanger covers the vulnerability of the US grid, the psychic income and electoral popularity that Vladimir Putin gets from crossing the West's red lines, and whether we'd be better off sparking an escalating set of cyberattacks now, rather than later when we'll be even more vulnerable.
If the last question reminds you that John Bolton will soon be the National Security Adviser, you're not alone. We take a few minutes off from cyberlaw to explore just what kind of national security adviser Bolton will be. My bottom line: better than his reputation, and maybe much better.
As always, The Cyberlaw Podcast is open to feedback. Send your questions and suggestions for interview candidates or topics to CyberlawPodcast@steptoe.com, or leave a message at +1 202 862 5785.
The Cyberlaw Podcast is hiring a part-time intern for our Washington, DC offices. If you are interested, visit our website at Steptoe.com/careers.
Download the 210th Episode (mp3).
Subscribe to The Cyberlaw Podcast here. We are also on iTunes, Pocket Casts, and Google Play (available for Android and Google Chrome)!
"by the collapse of the left-lib claim that building law enforcement access into IT devices is impossible because, uh, math."
Wouldn't the claim be, not that you can't build in a backdoor for law enforcement, but instead that having done so, you'd never be sure only law enforcement could use it? Because, you know, math doesn't work differently for cops than everyone else.
Bingo. Once the back door is built in for anyone, it's there for everyone. Because, math.
Oh, and we know that security agencies all have perfect intentions and would never abuse the privilege. Also, they have perfect internal security and would never suffer leaks, intentional or unintentional.
What nasty left-libs you guys are, especially you, Bellmore.
Up yours. You're as bad as Kirkland.
Not very gracious of you, Chem_Geek.
Can't take a joke, or a compliment?
OK. I get it. You're stupid and obnoxious and proud of it.
Jokes have limits. Calling someone a left-lib is a horrible thing to do. It's always in poor taste.
Wow, you really take redundancy seriously.
He was pointing out that left-lib was a random adjective to put into that sentence, not calling you a lib, Chem.
Indeed, the credential used to access the back door would be subject to being stolen from the government and used by bad actors. Depending on the model, this would either be merely catastrophic or catastrophic and unrecoverable.
Indeed, any sensible plan would have to explain what the disaster recovery plan was under various scenarios.
In many cases it wouldn't have to be stolen, because the bad actors would be in the government.
The claim is not that it is impossible to create a flawed encryption system, or one that has backdoors for the Government. The claim is simply the truth of reality: there is no way to prevent such a backdoor from being exploited by anyone else, hence the characterization that it is impossible to both have encryption and not have encryption. It is also important to realize that adversaries will simply elect to use encryption that does not have those flaws. And encryption really is nothing more than math, and you can't stop anybody from using math.
In fact, it has been proven (by the Snowden leaks) that some widely used encryption systems (Dual_EC_DRBG) did have deliberate flaws in them, put there by Government operatives (NSA) so that they could be broken. In older times, it required nation-state-level resources to exploit even such purposely, secretly flawed systems. But nowadays, anyone can do it. So there is no question that it is POSSIBLE to create and promulgate encryption that can be broken by the Government. You can even keep it secret, for a little while. It's just that doing this means that criminals, terrorists, and adversarial governments can also break the encryption. At the same time, they will use encryption that you can't break. So, is that a great idea?
Apparently, pointing out these inconvenient facts makes one a "nasty left-lib."
I keep hearing these claims - that you cannot protect these back doors.
I do not believe them, and I have enough of a background in cryptography to understand what is going on.
Consider, for example, a system where the decryption key is held, in three pieces, by three independent and relatively hostile interests. Say, an ACLU lawyer, an FBI lawyer, and an EFF lawyer. All have sworn to provide their piece of the key if they are shown credible evidence that the use is within previously agreed-upon parameters.
It is technologically possible to do this and do it perfectly. It has been done before - the master key to one of the major worldwide card encryption systems used this approach, back when I worked for them. There was never a breach.
No system is perfect, but never being able to get into the communications of dangerous individuals is also a far-from-perfect state.
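For readers who want to see what such a split looks like in practice, here is a minimal sketch, assuming a plain XOR-based 3-of-3 scheme (the share-holder labels just echo Mesoman's example, and the key is a stand-in for whatever escrow key the scheme protects). All three shares are required to rebuild the key; any two of them are indistinguishable from random noise.

```python
import secrets

KEY_BYTES = 32  # e.g., a 256-bit escrow key


def split_key(key: bytes) -> dict[str, bytes]:
    """XOR 3-of-3 split: two random shares plus one share derived from the key."""
    share_a = secrets.token_bytes(len(key))
    share_b = secrets.token_bytes(len(key))
    share_c = bytes(k ^ a ^ b for k, a, b in zip(key, share_a, share_b))
    return {"ACLU": share_a, "FBI": share_b, "EFF": share_c}


def recover_key(shares: dict[str, bytes]) -> bytes:
    """Recombining requires every share; any subset of two reveals nothing."""
    a, b, c = shares["ACLU"], shares["FBI"], shares["EFF"]
    return bytes(x ^ y ^ z for x, y, z in zip(a, b, c))


if __name__ == "__main__":
    escrow_key = secrets.token_bytes(KEY_BYTES)
    shares = split_key(escrow_key)
    assert recover_key(shares) == escrow_key
```

A production escrow design would more likely use a threshold scheme such as Shamir's secret sharing, so that the loss or deliberate destruction of a single share, the scenario raised a few comments down, does not lock the door forever.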
You mean like the FISA Court?
Yeah, perfect.
I was going to write a long response leaning on Buchanan's public choice theory, but yours was more to the point.
Mesoman, that's an interesting system. I don't know that we want to vest judicial-like powers in the ACLU and EFF, but so be it.
My question would be, if these are keys that are 'baked in' to each device, what happens when the ACLU reports a server breach that has compromised their 1/3rd of the key? Is there a method to revoke/replace them on device (hint: this is an infinite regress, since the key that authorizes management of the key is now morally equivalent to the key itself)? Can the backdoor now be disabled to ensure that unauthorized access no longer occurs?
What if the ACLU intentionally destroys their part of the key? Have they effectively locked the backdoor forever?
We all know that the only 100% secure encryption system is to completely randomize the entire packet's contents. It has the downside of being unusable but it has the upside of being completely secure. How much money might the government spend trying to find a way to crack it?
No, it's not unusable. You just have to trade one-time pads in advance.
One-time pads are utterly unbreakable; everybody avoids them just because they want unbreakable security that doesn't require meeting in person first.
They still have the potential for decryption, most obviously if the wrong person gets the pad. Only my method is 100% secure.
Besides, my comment was clearly a joke.
Or somebody could just read over your shoulder.
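For what it's worth, the one-time-pad point in this exchange is easy to make concrete. A minimal sketch, assuming a pad that is truly random, at least as long as the message, exchanged in advance, and never reused (the message text is purely illustrative):

```python
import secrets


def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    """XOR the message with a pre-shared pad; XOR again with the same pad to decrypt."""
    assert len(pad) >= len(plaintext), "pad must be at least as long as the message"
    return bytes(p ^ k for p, k in zip(plaintext, pad))


otp_decrypt = otp_encrypt  # XOR is its own inverse

message = b"meet me on the sidewalk outside the embassy"
pad = secrets.token_bytes(len(message))  # traded in person, ahead of time
ciphertext = otp_encrypt(message, pad)
assert otp_decrypt(ciphertext, pad) == message
```

Without the pad, every possible plaintext of the same length is equally consistent with the ciphertext, which is the sense in which the scheme is unbreakable; the practical objection, as noted above, is having to trade the pads in the first place.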
Are you the same Stewart Baker, JD, who worked for the NSA and DHS, and who was arguing for the deliberately flawed, government-backdoored encryption system (Clipper) back in 1993? Skating On Stilts?
Your general thesis is that we must surrender privacy for the sake of security from terrorism. So all citizens who feel like obeying a law that makes the use of encryption a crime could be surveilled. But terrorists probably won't obey those laws, so their communication will be secret. But since you can monitor every communication in the world, you will be able to detect (you hope) when someone is keeping a secret. (That is, the mere use of non-government-breakable encryption is itself a terrorist act.) Anyone who tries to keep a secret from the Government is a Terrorist. But don't worry, we will only go after the real terrorists. Because we know what each person is thinking and doing at all times, and we are Just and Wise gods who will Know when someone in our opinion is being Bad. And we won't hurt any good people. And only we will have this ability because only we know math.
I think I see a few problems in your storyboard.
That opening paragraph is so ridiculous it has more in common with a Poe than a lawyer.
Baker's argument is so poorly presented he can't even link to the position he's arguing against. Instead, his link for the "left-lib claim" points to someone else's better-thought-out argument for government backdoors.
The argument against government backdoors is not "uh, math", it's "uh, reality", the reality that it's impossible to build a backdoor for the government without leaving it open for everyone else. And if you think you'll hide the key I have an NSA leak to demonstrate otherwise.
Even with multi-part keys, there will be abuse. They make it sound like it would be extraordinary circumstances and an adversarial committee (that is, one with at least one privacy-oriented member) coming together to save the world. The reality is that even if the keys could be kept secure (highly dubious), the intent is that this backdoor be used routinely. Not just the NSA, not just the CIA, not just other secret agencies, but also the FBI, DHS/TSA, Border Patrol, DEA, and ffs probably the Department of Agriculture. As well as your local police. And your local politicians who control the police. That's how these things go, and all the police agencies clamoring for this are actually pretty up front about it. They're going to do it at routine traffic stops.
The process will be totally automated and no humans will be involved, unless you have the resources to appeal for a post-sentence review. (Hint: Your life will have been destroyed years before that happens.)
We have already seen the corruption of the FISA court, and that's just one time we know of because the target was the highest profile person in the entire world.
In what universe is outlawing the ability to keep a secret from the state not a totalitarian nightmare?
And btw I am an encryption expert, and total right-winger. So much for ridiculous assumptions.
It's nice to find silver linings, but I suggest the ransomware attack that shut down much of Atlanta, and still affects some systems, overshadows the noted cybercrime victories. Plus, more news of breaches. Then again, info theft is more of a dog-bites-man or student-shoots-up-school yawner these days.
One interesting side question: let's suppose for a moment that we 'nerd harder' and solve the problem (for the record, I am dubious) and it is now known and agreed (by some majority, obviously not 100%) that we will implement this for the US. That is to say, some cryptographic material belonging to the US government will be keyed into the devices.
Surely France, having a good (not perfect) record on civil rights, will also insist that devices sold in its country have a back door keyed to its authority. And so too Japan, Italy, and Poland. And then Ukraine. And then Russia, Iran, and China.
So then we have a dilemma. Who will decide which authorities are keyed in to the backdoor? Each jurisdiction could pass a law mandating its key for devices sold in its country, but devices travel across borders easily, so if you make a Russia-keyed version then the smart ne'er-do-wells will buy them elsewhere.
Unless of course, each device is keyed to all the government authorities around the world. In which case it becomes a bit of a problem that China can backdoor phones from Britain and Venezuela can backdoor phones from Canada.
And of course, the next logical move is that countries will mandate that phones sold in their country do not have any unapproved authorities keyed to the back door. Which brings us back to the shopping-trip problem.
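To make the "keyed to all the government authorities around the world" endpoint concrete, here is a rough sketch, assuming the third-party Python `cryptography` package, hypothetical authority names, and RSA escrow keys purely for illustration: the device wraps its per-message content key under every mandated escrow public key, and any single authority on the list can then unwrap and read on its own.

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Hypothetical escrow keypairs; in reality each government would publish only its public key.
authorities = {
    name: rsa.generate_private_key(public_exponent=65537, key_size=2048)
    for name in ["US", "FR", "CN", "RU"]
}

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# The device encrypts the message once under a fresh content key...
content_key = Fernet.generate_key()
ciphertext = Fernet(content_key).encrypt(b"an ordinary text message")

# ...and ships one wrapped copy of that content key per mandated authority.
wrapped_keys = {
    name: key.public_key().encrypt(content_key, oaep)
    for name, key in authorities.items()
}

# Any one authority can recover the plaintext without help from the others.
recovered = authorities["CN"].decrypt(wrapped_keys["CN"], oaep)
assert Fernet(recovered).decrypt(ciphertext) == b"an ordinary text message"
```

Every additional authority is just one more wrapped copy of the same key, which is why the only stable outcomes are the shopping-trip problem or the everyone-can-read problem.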
The obvious response (not saying it's a *good* response!) is that a message between systems would pass through a server that decrypts it using the source master key, then re-encrypts it using the destination master key.
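A crude sketch of that relay, assuming symmetric per-jurisdiction master keys (Fernet, purely as a stand-in, with hypothetical jurisdiction labels): the server decrypts under the sender's master key and re-encrypts under the recipient's, which means the plaintext necessarily sits in the clear on the relay at the handoff.

```python
from cryptography.fernet import Fernet

# Hypothetical per-jurisdiction master keys, all held by the relay operator.
master_keys = {
    "US": Fernet(Fernet.generate_key()),
    "RU": Fernet(Fernet.generate_key()),
}


def relay(blob: bytes, src: str, dst: str) -> bytes:
    plaintext = master_keys[src].decrypt(blob)  # plaintext is visible to the relay here
    return master_keys[dst].encrypt(plaintext)


outbound = master_keys["US"].encrypt(b"hello from a US-keyed phone")
inbound = relay(outbound, "US", "RU")
assert master_keys["RU"].decrypt(inbound) == b"hello from a US-keyed phone"
```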
And sent a conveniently unencrypted copy to the NSA, of course.
Why the NSA master key and not the KGB/GRU key as well?
Won't China pass a law forbidding devices in its territory from encrypting any messages to the GRU key?