Criminal Penalties for Companies that Won't Help Decrypt? Not Yet—But Keep Watching
The stick has been suggested. Now where is the carrot?


The Wall Street Journal appeared to have a scoop yesterday evening: Senate Intelligence Committee Chair Richard Burr (R-N.C.) was drafting legislation that would actually institute criminal penalties against companies (like Apple) that resist or refuse orders to decipher encrypted messages on the tech devices they've created.
Given that Apple CEO Tim Cook has been very vocal about his company's refusal to compromise iPhone security to help the FBI crack the phone used by deceased San Bernardino terrorist Syed Farook, the move appeared to be a threatening escalation of the conflict.
But it turns out such legislation is not happening … yet. A spokesperson for the senator said he was looking at tightening the rules surrounding encryption but was "not considering criminal penalties in his draft proposals."
Burr is famously one of those anti-encryption politicians who simply don't want to listen to debate about the potential dangers of forcing tech device companies to compromise their security at the demands of law enforcement and prosecutors. He wants to push forward with legislation without the proposed commission to explore solutions, saying, "I don't think a commission is necessarily the right thing when you know what the problem is. And we know what the problem is."
But Burr has absolutely no interest in engaging or even acknowledging the potential negative consequences of his solutions, as he makes abundantly clear in a USA Today commentary. All he cares about is Apple doing what it's told and helping the government fight crime. He dismisses any complaints by saying that Apple isn't being forced to decrypt the phone itself or provide an actual back door, a semantic argument that sidesteps the potential consequences of what Apple is actually being asked to do. Indeed, even after attempting to suggest this is an isolated case, Burr makes it clear that he and like-minded folks want Apple to do this on demand to help the government fight crime. Every time security is deliberately weakened like this, the risk rises that the mechanism for breaching phones will escape the control of those responsible and end up in the hands of criminals, hackers, or autocratic governments. (If you want to learn more about the risks of the government weakening encryption, read here.)
It's easy to suspect that the leak about possible criminal sanctions was a deliberate floating of the "stick" to punish stubbornness. Now that it's been presented, stay tuned for the carrot. Here's what consumers should be worried about: If the government forces Apple, Google, and other tech companies to compromise their security, it will also have to shield those companies from the consequences if (and when) this all goes sour. If the government makes Apple compromise its security, and the mechanism for doing so then gets out and Apple users are targeted for theft and fraud, the company would face tremendous liability. Say you have an iPhone. Apple creates a system to bypass your phone's security. Somebody uses that system to get into your phone's data and access your credit card information, for example. Wouldn't you want to hold Apple legally responsible for knowingly compromising your security?
One likely solution: Shield Apple and other companies like it from legal liability for breaches. Keep consumers from suing them. The government would pretty much have to do this if it makes tech company cooperation mandatory. This was the "solution" in the Cybersecurity Act of 2015, included and passed in the recent budget omnibus. It encourages companies to share user and customer data with government agencies in the name of fighting crime. But it also shields companies that participate from lawsuits over breaches, as both an incentive and a reward.
So as this fight plays out, definitely keep an eye out for that "compromise." I have doubts that companies like Google or Apple will willingly accept it, given how much their users value their security (Burr is losing a USA Today poll embedded within his own commentary). But liability protection could be written into legislation regardless, and it's consumers who will be screwed over, both by an increased risk of fraud and by the inability to turn to the courts to hold companies liable.
Simple solution:
Design your product so that the user creates/controls the keys.
You are welcome.
Isn’t that what Apple did?
Yes.
In this case, the only user who knows the passkey is dead.
That’s why I’m confused about what exactly the government wants. What can Apple do to this specific phone that the FBI can’t?
It can send a “malicious update” to the firmware of the phone.
Don’t you have to agree to the update and enter a password?
Does the owner of the phone have to accept the update, or is it an auto-update? My personal phone doesn’t update unless I accept the update. My work phone doesn’t update unless the admins push out the update to it. But the phones don’t automatically update when the manufacturer sends out an update.
We’re (mostly) smart people here; it’s in the source document and has been described online.
The FBI wants Apple to create a software tool that
1) is ‘locked’ to this specific phone’s identity. This is their sop to the tool ‘not being a backdoor’ – “see? it’s only going to work on this one phone!” Until you change the number written in the code to that of another phone, anyway. But that’s not the FBI’s problem.
2) will bypass the 10-tries-then-we-dump-your-data restriction. This will allow them to keep inputting potential *PASSWORDS* without danger of losing the encrypted data. Once you have the password to unlock the phone (and it’s a 4-6 digit password – 50% chance of cracking it in a day at the low end with a brute force attack), it’ll decrypt itself.
3) must eliminate any existing delays between password attempts that aren’t inherent to the hardware itself, without introducing new ones. Right now there’s around a 5-second delay between password attempts. I don’t know how much of that is because it takes that long to encrypt and compare the input password with the stored encrypted one, and how much is because there’s a ‘wait before trying again’ coded in.
4) must allow automated password attempts (not fat-fingering them in on the keyboard) through the physical ports or wifi/bluetooth.
So, see?! It’s absolutely not an encryption backdoor. It’s just a password backdoor. Totally different (as far as the law is concerned).
We don’t even know if this is *possible* yet. Apple’s fighting this under the assumption that it is, but that doesn’t mean it is.
Even if it is possible to do the above, that doesn’t mean it’s possible to *force* the phone to accept the update automatically. Especially while locked and encrypted.
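For anyone who wants to sanity-check that "crack it in a day" claim: a quick back-of-the-envelope sketch, assuming roughly 85 ms per attempt (the key-derivation figure mentioned further down the thread) and no software-imposed delays. The numbers are illustrative, not Apple's specs.

```python
# Rough brute-force timing for a numeric passcode, assuming ~85 ms
# per attempt (hardware key-derivation time) and no artificial delays.
PER_ATTEMPT_S = 0.085  # assumed figure, not an official Apple number

for digits in (4, 6):
    keyspace = 10 ** digits
    worst_s = keyspace * PER_ATTEMPT_S
    avg_s = worst_s / 2  # on average you find the PIN halfway through
    print(f"{digits}-digit PIN: worst ~{worst_s / 3600:.1f} h, "
          f"average ~{avg_s / 3600:.1f} h")
```

A 4-digit PIN falls in minutes; a 6-digit one averages about half a day, which roughly matches the "50% chance in a day" estimate.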
——————–
What I don’t understand is, if this is so important, why the FBI can’t copy the *whole* phone and run it in an emulator. Run multiple copies simultaneously, and then try out passwords to their hearts’ content.
Thanks for that. I didn’t know about the 10-attempt limit. That seems to be the most crucial part. If the password is only 4-6 digits (and they are just digits, right?), then it should be fairly trivial without that limit.
Run multiple copies simultaneously, and then try out passwords to their hearts’ content
I am asking Google that right now. Nothing yet, but I found this John McAfee article instead. He offers to crack the phone in 3 weeks for free. It’s an amusingly apocalyptic piece.
Looks like the FBI took down your link.
This is potentially helpful.
Fixed
This stackexchange is helpful
It seems like you can’t even copy the raw bits without knowing the PIN, or a lot of information about how the data are written. Not sure if that is true.
It’s not.
It might not be possible to simply connect a cable and image the phone like that, but it’s certainly possible to use other physical techniques to scan the structure of the phone’s memory modules.
Just like you can format your hard drive but forensic specialists can disassemble the platters and physically scan them to find traces of data that appear to be gone to the drive’s sensors.
Can you comment on this part?
And this
Dismounting and copying the physical data is just the first step.
Then you remount it inside a virtual iPhone. Put the encrypted data back in the simulated cells it belongs in. You don’t need to know *what* any particular block is, only be able to identify the block and where you copied it from.
Then you run your fake iPhone. If it erases the data, you just reset that virtual iPhone, recopy from the master you made, and do it again. And again, and again. And if you have the hardware to spare, you can do this multiple times simultaneously, rather than sequentially with just one physical phone.
It ain’t easy, it ain’t cheap, but once you have the capability then you don’t need to fight with Apple or employ legal trickery to do this.
It’s like making your own lockpicks instead of trying to compel Masterlock to come around and pick the lock for you.
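The mirror-and-retry idea can be sketched as a toy simulation. `FakePhone` and its 10-try wipe are simplified stand-ins made up for illustration, not real iPhone internals; the point is just the reset-from-master loop:

```python
import copy

class FakePhone:
    """Toy model of a phone that wipes itself after 10 bad passcode tries."""
    def __init__(self, pin, data):
        self.pin, self.data = pin, data
        self.tries, self.wiped = 0, False

    def try_pin(self, guess):
        if self.wiped:
            return None
        if guess == self.pin:
            return self.data
        self.tries += 1
        if self.tries >= 10:
            self.wiped = True  # data gone -- on THIS copy
        return None

def brute_force(master_image):
    """Restore a fresh copy from the master whenever the current one wipes."""
    phone = copy.deepcopy(master_image)
    for guess in (f"{n:04d}" for n in range(10_000)):
        result = phone.try_pin(guess)
        if result is not None:
            return guess, result
        if phone.wiped:
            phone = copy.deepcopy(master_image)  # "recopy from the master"
    return None

master = FakePhone(pin="7295", data="secret")
print(brute_force(master))  # → ('7295', 'secret')
```

The wipe only ever destroys a disposable copy, so the 10-try limit stops mattering; multiple copies could run the same loop over disjoint PIN ranges in parallel.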
Thanks
And this security-override software will be able to be installed on a presently locked-down phone.
There’s no possible way that could lead to something bad. After all, the intent is good.
I’ve been reading about this. At least one website claimed that it really could only be used on this phone.
The reason is that there are device-specific aspects that need to be coded into the firmware update. But then it also has to be signed by Apple with a secure key for the phone to accept the updates. So even if the firmware leaked, you’d have to change the device-specific aspects to use it on another phone, but then you would need a new secure key that only Apple can provide.
It really is the Apple key that is the crux of all this. And it sounds like those remain fairly secure.
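A minimal sketch of why a leaked, device-locked image can't simply be re-pointed at another phone. It uses an HMAC as a stand-in for Apple's actual code-signing scheme (which is public-key based), and every name and key here is hypothetical:

```python
import hashlib
import hmac

SIGNING_KEY = b"hypothetical-private-key"  # stand-in for Apple's signing key

def sign_firmware(image: bytes, device_id: str):
    """Bake the target device's identity into the image, then sign the whole thing."""
    blob = image + device_id.encode()
    sig = hmac.new(SIGNING_KEY, blob, hashlib.sha256).digest()
    return blob, sig

def phone_accepts(blob: bytes, sig: bytes, device_id: str) -> bool:
    """The phone checks the signature and that the blob names *this* device."""
    expected = hmac.new(SIGNING_KEY, blob, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected) and blob.endswith(device_id.encode())

blob, sig = sign_firmware(b"unlock-tool", "PHONE-A")
print(phone_accepts(blob, sig, "PHONE-A"))           # True
# Re-targeting the leaked blob at another phone breaks the signature:
tampered = blob.replace(b"PHONE-A", b"PHONE-B")
print(phone_accepts(tampered, sig, "PHONE-B"))       # False
```

Editing the device ID in a leaked image invalidates the signature, so you're back to needing the signing key, which only Apple holds.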
So while I’m still not 100% decided, the more I learn, the more I think that this might be a narrow enough and legitimate search, and that the FBI request isn’t unreasonable. But I don’t claim to understand everything, so maybe there is a larger risk that I haven’t seen addressed.
I think the problem is they’re asking for the *development* of a new tool. Not ‘put this hardware we give you onto your network’. Write, test, and deploy something that doesn’t exist at all right now. Oh, and eat the cost.
If nothing else, that (IMO) goes well beyond the scope of the All Writs Act.
So while I’m still not 100% decided, the more I learn, the more I think that this might be a narrow enough and legitimate search and that the FBI request isn’t unreasonable.
The request isn’t really dubious. It’s the court order that’s contentious/unreasonable.
I’m 100% sure Apple can unlock the phone. I’m 100% sure the government has the ability to ask and even the ability to make them. I’m -100% sure the government has the right to do so.
Especially, in a token “investigation for investigation’s sake” case when we have literally hundreds of prosecutors with thousands of similar phones lined up in investigation for crime/safety’s sake cases.
It’s far shorter than this. I think I read 85 milliseconds? Anyway, definitely not 5 seconds.
It gets longer after each try.
The 85 ms is the time it takes to compute the PIN key from the PIN.
But the phone enforces an escalating wait time with each wrong PIN. By the 9th try, it is one hour (apparently). And there is a feature that wipes the system-level key after a 10th bad attempt (though it is optional…not sure if they can tell if it is turned on in this particular phone, other than trying it).
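For a sense of how much that escalation costs an attacker, here's a small sketch using the lockout schedule as it's been reported publicly; the exact figures are an approximation for illustration, not verified against Apple's documentation:

```python
# Approximate escalating lockout delays (in minutes) imposed after the
# Nth consecutive wrong attempt -- illustrative figures only.
DELAYS_MIN = [0, 0, 0, 0, 1, 5, 15, 15, 60]  # attempts 1 through 9

total = sum(DELAYS_MIN)
print(f"Nine straight misses cost ~{total} minutes of forced waiting")
# With the wipe-after-10 option enabled, there is no tenth chance.
```

So even without the wipe, software delays turn a minutes-long brute force into something far slower, which is exactly why the FBI wants them removed.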
Exactly. Unless Apple somehow has the private key *for that phone*, there’s nothing they can do that the Federalistas already can’t.
You should at least understand what the government wants before announcing how simple it is to fix.
The government realizes Apple can’t give it the key. What they want is a custom iOS which disables the auto-wipe after 10 password failures and eliminates the pause between password attempts, so they can brute force the password.
Whelp, smartphones were fun while they lasted.
That’s a nice company you have there.
Too bad child pornographers love it so much.
Over the past few decades, Apple has often benefited from the protections of U.S. law. It has sought the protection of those laws when it is the victim of a crime. However, in this instance, the company seems to run from those same laws. Apple’s acceptance of the law appears to be uneven.
What a piece of shit. It sounds like a psychological trick an abusive husband would use against his wife.
Someone should explain to him what “consent of the governed” actually means (hint: it’s not voting).
He’s basically saying that government is nothing more than a protection racket which is true.
The Soviet farmer has often benefited from giving all his grain to the state. He has taken bread and milk from the state to feed his family. However, in this instance, the farmer seems to run from his duty to the state. His acceptance of resources while withholding the same appears to be uneven.
Sounds legit.
To be fair, there are abusive, manipulative women who play psychological tricks on their husbands, boyfriends, and sons. But I realize we live in a time and country where it’s borderline illegal to allege that women aren’t perfect and infallible.
Not if I knew they’d been enslaved into doing it. I’d want to hold the state responsible.
Richard Burr (R-N.C.) was drafting legislation that would actually institute criminal penalties against companies (like Apple) who resist or refuse orders
I trust Dick Burr — if that is really his name — was also drafting legislation that would actually institute criminal penalties against legislators who do not uphold their oath of office.
I’d love to see that motion made on the Senate floor immediately after Dick makes his.
Maybe I used to just be naive but it seems like there used to be a time when Americans were perfectly comfortable with not giving cops every power they asked for in the interest of individual rights and liberties. Was I just imagining that? If not, is it that people are bigger pussies or way dumber or both?
You were just imagining it.
The All Writs Act didn’t just suddenly show up.
The Third Party Doctrine establishes that the Constitution only protects *reasonable expectations* of privacy – if you don’t have a reasonable expectation, then the government can compel others to violate that privacy.
Warrantless searches and seizures can be conducted by Customs and Border Protection within 100 miles of a border.
None of this shit is new. Hell, the Patriot Act isn’t even new anymore.
Which would destroy the value of proprietary encryption relative to that of open source encryption, putting encryption further beyond the control of the government in the long run. But not before some lives are destroyed and the “if you have nothing to hide” argument repeated ad infinitum sends you into a rage coma.
Schadenfreude-ly a Bernie Sanders-led ‘War on Entropy’ doesn’t sound that far-fetched.
Oh, goodness. Just imagine the fun of bringing in Linus Torvalds, Richard Stallman and Eric S. Raymond to explain why they won’t unlock a Linux phone for los federales.
Raymond: “Seriously?”
Torvalds: “That’s the dumbest idea I’ve ever heard.”
Stallman: “[munches toenail]”
Judge: Bailiff, place these . . . men? Into custody. Yes, their beards too.
Linus would drop a fuck on there somewhere.
Apple should totally do this, then publish it as an API, using the Senator’s phone as the example in the documentation.
/sarc
Senator Burr eerily looks like Sen. Pat Geary from the Godfather Part II. He might even have the same alarming level of scum and villainy behind that facade of his. All that’s left for this scene would be an Al Pacino lookalike pulling the strings and an unfortunate accident in a Nevada motel.
I predict that the assholes will force Apple to do this and when they do the FBI won’t find jack shit on the phone. They’ll try to cover their asses with some vague lie about how they found something “that helps us connect the dots”.