Apple Versus the FBI
It's possible that the FBI is not primarily concerned with the particular evidence stored on the San Bernardino shooter's phone at all.


The technology industry is in an uproar over the revelation of a series of court orders directing Apple to compromise iPhone security to assist in the FBI investigation of deceased San Bernardino shooters Syed Rizwan Farook and Tashfeen Malik. In a public letter to Apple customers published last Wednesday, CEO Tim Cook strongly condemned the order to build a so-called "back door" for government access into encrypted technology. The FBI, on the other hand, counters that this request is a reasonable and narrow means to bring about justice for the victims of terrorism. Stripping away the emotional rhetoric from all sides, the core question is whether a company can be compelled to build a tool for law enforcement that will compromise the security of any of its devices.
In the short term, such a precedent could create significant ethical dilemmas for technology firms' customer and business relationships while opening the door to similar demands from foreign governments. In the long term, the FBI's efforts may ultimately—and ironically—undermine the agency's broader goal of accessing critical evidence by encouraging tech companies to build products that are impervious to infiltration by design.
This incident is only the latest conflict in a years-long encryption and security war raging between privacy- and security-minded groups and the law enforcement community. As more communications are digitized, authorities have been calling for industry assistance to build so-called government "backdoors" into secure technologies by hook or by crook.
Those in law enforcement fear a scenario in which critical evidence in a terrorism or criminal case lies beyond their reach because it is protected by strong encryption that conceals data from anyone but the intended recipient. Hence, leaders at agencies like the Department of Justice, the Department of Homeland Security, and the National Security Agency, along with President Obama, have weighed in against strong encryption.
The technology industry and privacy activists, in contrast, emphatically oppose any proposed government backdoor into encryption and security technologies. Civil liberties groups such as the Electronic Frontier Foundation (EFF) and American Civil Liberties Union (ACLU) guard the right to privacy as inalienable. Computer scientists, for their part, point out the catastrophic security vulnerabilities that such schemes would invariably generate. After all, malicious hackers or foreign governments could exploit these "back doors"—also known as "golden keys"—for their own malevolent interests. When it comes to the question of encryption, there is no tradeoff between privacy and security—weakening encryption weakens everyone's security.
Tensions temporarily thawed last fall, when the White House deescalated its push to require government access into encrypted technologies. However, the San Bernardino attacks in December of last year doused new fuel on this simmering conflict.
Since the shootings on December 2, 2015, law-enforcement authorities have been working to piece together Farook and Malik's motivations and collaborators by combing through their conversations in the months leading up to the attack. Authorities quickly obtained the personal cell phones and hard drive that the duo had attempted to destroy. Additionally, investigators retrieved from Farook's personal vehicle a work-issued iPhone that he had not bothered to destroy, in the hopes that it would contain further leads.
Because Farook was a county employee, the county was able to reset the phone's Apple ID password and thereby grant the FBI access to some associated iCloud data. But there was a big problem: The last data backup occurred on October 19, which means that any data from the period between that day and the attack was inaccessible to investigators. Had the password not been reset, engineers could possibly have connected to Apple's servers and executed a fresh iCloud backup (D'oh!), but now there may be data on the phone that will never make it to iCloud.
Investigators cannot simply guess the iPhone's PIN because of a security feature that wipes the phone's memory after too many false guesses. And because Apple instituted full-disk encryption on iPhones in 2014, traditional methods of data request and extraction were fruitless. So the g-men called in some judicial muscle and asked Apple to build a tool that would allow agents to unlock the device for a comprehensive search.
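To see why that wipe feature, rather than the passcode itself, is the real barrier, consider a rough sketch of the arithmetic (the per-attempt timing below is an illustrative assumption, not Apple's published figure): a four-digit passcode has only 10,000 possibilities, so without an attempt limit it falls to brute force in well under an hour.

```python
# Rough arithmetic on brute-forcing a numeric passcode, assuming a
# hypothetical fixed cost per hardware-bound attempt. The 80 ms figure
# is an illustrative assumption, not Apple's published number.
ATTEMPT_SECONDS = 0.08  # assumed time per passcode attempt

def worst_case_hours(digits: int) -> float:
    """Worst-case time to try every passcode of the given length."""
    keyspace = 10 ** digits
    return keyspace * ATTEMPT_SECONDS / 3600

for digits in (4, 6, 8):
    print(f"{digits}-digit passcode: up to {worst_case_hours(digits):.1f} hours")

# Under these assumptions: 4 digits is roughly 0.2 hours, 6 digits roughly
# 22 hours, 8 digits roughly 2,222 hours. The auto-wipe limit, not the
# passcode length, is what makes guessing impractical.
```

Under those assumptions, the ten-strike wipe is doing nearly all of the work; remove it, and a short passcode protects very little.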
According to CEO Tim Cook's letter, Apple has provided whatever lawful resources and direction that it could to law enforcement in the past and in the course of the current investigation. But when the FBI asked the company's engineers to craft a tool that would allow agents to bypass iPhone security, Cook balked.
It's not that the investigation or the request was unworthy in itself—Cook explains that Apple wanted to bring justice for the victims of this horrendous terrorist act just as much as anyone else. But to Apple, complying with such a request would constitute a fundamental violation of its customers' trust. Publicly compromising the integrity of its devices was simply a non-starter for the tech giant.
What's worse, Cook points out, it is highly unlikely that such a request would be a one-time deal for the FBI. Despite FBI Director James Comey's reassurances that his agency only seeks to access a single device rather than a broad "backdoor" into millions of devices, many in the security and privacy communities are deeply concerned by the precedents and vulnerabilities that this action could generate. Today, it's an obviously guilty terrorist's iPhone. Tomorrow, it could be an entire operating system—and there's no guarantee that the public would be made aware of such a backdoor.
Indeed, it's interesting that the FBI chose to pursue this tool in such a public fashion in the first place. According to The New York Times, Apple initially requested that the FBI not publicize its order to decrypt. (Some have speculated about whether Apple might have complied had the order not been publicized, or whether it merely wanted to challenge the order away from the court of public opinion. But that's a whole other story.) Although it would seem to be in the FBI's interest to manage this development behind the scenes, the agency curiously insisted on making the dispute public, thereby provoking Cook's public letter. This could indicate that the FBI wanted to make a salient precedent out of this order—and, as a cherry on top, perhaps embarrass the encryption evangelists at Apple by making them appear "weak on terrorism."
Looking back on the FBI's struggles with encryption, it is clear that the agency preferred a public, legislative mandate compelling technology firms to grant some kind of special access for law enforcement investigations. But unfortunately for the agency, actual examples of encryption thwarting a criminal investigation were for a long time stubbornly non-existent. Meanwhile, the technical arguments against breaking security technologies for law enforcement put the kibosh on any legislative proposals for a backdoor mandate. So the intelligence community decided to bide its time, waiting for the moment that some "terrorist attack or criminal element can be shown to have hindered law enforcement," in the words of a leaked memo from top intelligence attorney Robert S. Litt to his colleagues.
From the outside, it is clear that the tragic San Bernardino attacks could provide a perfect storm of emotion and technical haziness that is ideal for establishing a precedent. Indeed, legislators have again picked up the possibility of mandating federal access into secure technologies under threat of criminal penalty in light of this debacle. It is possible that the FBI is not primarily concerned with the particular evidence stored on this particular device at all. Rather, this could serve as an excellent public saga to spur the formal rule changes that law enforcement has desired the whole time.
Regardless of the FBI's true intentions, it appears that Apple will fight this order through the courts as much as it possibly can. But in terms of extra-judicial subversion, two can play that game. Technology companies may react to this FBI campaign in a manner that ultimately undermines law enforcement goals.
Already, Apple is talking about encrypting iCloud data in such a way that it could not comply with law enforcement warrants for information even if it wanted to. Other companies in the U.S. and abroad are no doubt observing this fracas with an eye to the future as well. The political climate is demonstrating more and more that trusted third parties are not only security holes, they are prime targets for government coercion and customer liability. It's not hard to understand why many companies will work to simply remove themselves from the equation.
Am I the only one who thinks that camera circle on the phone looks photoshopped in?
It's looks faker than Obama's birth certificate.
What's funny is my wife was born in Hawaii and her birth certificate form was identical down to the shade of green. I asked her where she got her fake birth certificate.
Guadalajara, those mexican forgers do a good job.
Is hers also signed by Ukelele?
It just says The Wookie
Obama's BC is real which is great because it proves he's ineligible because his father was a foreigner.
In the long term, the FBI's efforts may ultimately—and ironically—undermine the agency's broader goal of accessing critical evidence by encouraging tech companies to build products that are impervious to infiltration by design.
That would be a doggone shame.
Wouldn't it?
You just know the three-letter agencies are checking the papers every day for news of the invention of the first successful quantum computer. "Soon, we'll read EVERYTHING, you'll see."
Ok, Mr. Govt. We'll make your forensic tool. It'll cost 20 times our estimated cost, take 20 years and not work when we're done.
Oh, no, it'll work, but only for encryption developed prior to 2005
Get a Cherokee and a Choctaw in here now.
But it crashes when it hits Netscape.
"Ok, Mr. Govt. We'll make your forensic tool. It'll cost 20 times our estimated cost, take 20 years and not work when we're done."
This is what I'm not understanding.
Why can't Apple just say: (?)
"we don't have such a tool and have no design to build such a tool. So, we can't help you."
Of course, I'm not a tech guy.
I thought that they did say that.
Besides not being a tech guy, I don't pay attention.
But, I thought their stand was just refusal on principal, but not refusal due to incapacity.
ugg, principle, not principal. Of all the fuck ups to make on this website.
They did. The response seems to be "Fuck you. Do it."
Well, traitor Snowden backs Apple on this, so I think that's all we need to know about who's right on this one.
/Securi-State Neanderfuck
Giving people more reasons to not do commerce and banking online sounds like exactly what our economy needs. Just the fear that security is being weakened is a problem.
"Bank teller -- the job of the future!"
Get rid of cash. Bam, problem solved. *Runs into unforeseen oncoming train*
The FBI (and other state and federal agencies) is not merely interested in backdoors to security on consumer devices. It is hell bent on discouraging or neutering security on all consumer devices.
Those of us old enough should remember the attempt at preventing any keys larger than eight bits back in the 1990s. Significant encryption was really illegal then. I was convinced at the time that the government was not really interested in their ability to decrypt files, but in their ability to decrypt files and streams in real time. And now we know about the NSA's shenanigans.
Yes, government law enforcement wants to block any and all protection measures that would impede their access to yours and my data and communications. They are not so much interested in what is admissible in court. They want to know everything that is going on everywhere.
But that's not what this is about. The FBI just wants help with getting into the iPhone 5c, a phone that already has an (accidental) back door due to a design flaw by Apple. Complying with the court order in this case sets no precedent and doesn't break any future phones. But the unnecessary fuss Apple kicks up over this case actually threatens legislation intended to keep secure cryptography on phones legal. Apple's conduct is so counterproductive and illogical that you have to wonder what their real goals are.
Right. And the NSA doesn't collect and store all telephone metadata (and probably more).
This is not about access to a device used in a crime. It is about real-time monitoring of American citizens. They want real-time access like they had in the analog days. It's just that simple.
Oh, I have no doubt that that's what the FBI wants in general. But that isn't what this case is about. This case is about the FBI wanting access to a particular iPhone 5c for a particular case. Getting that access relies on a flaw in that particular phone and has no bearing on the security of other phones, and it sets no precedent that would limit the ability to create secure phones.
Except the precedent whereby Apple (or another tech company) could be forced to use its keys to sign software at the government's whim.
Meaning that once this software is created, the FBI could change the phone ID number in the software, go back to court and then get the court to order APPLE to sign the software again so the FBI can crack *that* phone.
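To make the commenter's point concrete, here is a minimal sketch of vendor-signed, device-targeted firmware. It is a toy model under stated assumptions: an HMAC stands in for Apple's actual public-key code-signing scheme, and the field names ("target_device", "disable_retry_wipe") are hypothetical. The device accepts anything carrying a valid vendor signature, so re-targeting the tool to another phone is just a matter of changing one field and signing again.

```python
# Toy model of vendor-signed, device-targeted firmware. An HMAC stands in
# for the real public-key code-signing scheme; field names are illustrative.
import hashlib
import hmac
import json

VENDOR_SIGNING_KEY = b"vendor-private-key"  # held only by the vendor

def sign_update(payload: dict) -> dict:
    """Vendor-side: sign an update payload."""
    blob = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(VENDOR_SIGNING_KEY, blob, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": sig}

def device_accepts(update: dict, device_id: str) -> bool:
    """Device-side: trust anything the vendor signed that is addressed to this device."""
    blob = json.dumps(update["payload"], sort_keys=True).encode()
    expected = hmac.new(VENDOR_SIGNING_KEY, blob, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, update["signature"])
            and update["payload"].get("target_device") == device_id)

update = sign_update({"target_device": "DEVICE-001", "disable_retry_wipe": True})
print(device_accepts(update, "DEVICE-001"))  # True
print(device_accepts(update, "DEVICE-002"))  # False, until the vendor signs a re-targeted copy
```

The per-device restriction lives entirely in the signed payload, which is why critics argue a "one phone only" tool is one court order away from becoming any phone's tool.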
So what? The government can force a bank to open your safe deposit box or a landlord to open your apartment. Corporations being ordered by courts to provide keys to places where private information is stored isn't new, unusual, or dangerous per se.
The issue here is that Apple's product is supposed to provide a level of technical protection that it fails to provide, and the solution to that is to fix the technology.
This isn't merely a matter of the government seizing something that currently exists, but a matter of the government forcing a company to create something that does not currently exist so they can destroy people's security. This is extremely dangerous. Where does such a power end?
If the FBI had asked Apple to build a complex, general purpose tool to break security on well-designed iPhones, I would agree: that would be too much power.
But that's not what is happening. Apple is supposed to make a one line change to their OS, sign it, and upload it to the phone; they can make that change specific to that phone if they like. That's the kind of thing Apple developers do every day, it takes all of ten minutes, and it compromises nobody's security.
Apple screwed up with security on the iPhone 5c, otherwise the FBI couldn't make this demand of them. Apple should get massive bad publicity for this and then go on to fix their future phones so that they are bullet proof. Letting Apple get away with this screw-up by portraying the FBI as the bad guys is not helping anybody's security.
The "flaw" you speak of isn't a flaw. You could call it a security oversight if you wished. But if that's what it is, it's a security oversight inherent in any consumer product that has encryption.
Apple could make the phone impossible to brute force by requiring users to input 50-character passwords every time they unlocked their phones, but nobody would use iPhones then because it would be too inconvenient.
So the "flaw" you keep talking about is that Apple settled on a four-digit PIN (at minimum) to keep the phone locked and an auto-wipe of the phone's data if you try more than ten times to unlock with the wrong passcode. This is a great security feature if your phone falls into the wrong hands, and it's also convenient enough for users. That convenience of a manageable passcode length is the "flaw," and it's not something anyone else has figured a way around.
The FBI wants to get around this security by forcing Apple to turn off the auto-wipe feature so they can brute force the phone's PIN. The data on the phone itself might be encrypted anyway. Or there might not be anything there. The FBI doesn't care. They want this legal precedent established so they can force companies, Apple among them, to routinely compromise security features on their products whenever LE demands it. And it turns out they always demand it. They think they have a right to know everything.
And it will cost the companies money and customers.
Yes, and on a properly designed phone, the auto-wipe feature cannot be disabled by anybody, even the manufacturer, even with digital signatures and firmware updates. The fact that the FBI could get around this with Apple's help is exactly the design flaw, and it's a bad one.
No, it is not inherent. The PIN on your SIM card doesn't have this problem, for example. All that needs to happen is that the crypto chip inside the phone keep track of the number of PIN unlock attempts reliably and securely. But instead of keeping track of this in the crypto chip, on earlier iPhone models, Apple kept track of it in the OS. Apple understands that this is a flaw and that it's easy to fix because on later iPhones, they have actually fixed it.
Imagining that Apple's refusal to cooperate with the FBI is somehow going to preserve your privacy is naive. The FBI has the resources to do this without Apple's cooperation; it's just easier to ask Apple. And the NSA almost certainly has not only figured out how to do this, they probably also have Apple's source code and private keys.
@Rational Exuberance
Nice try troll boy. There is no 'flaw' in the security of the iPhone 5c. Later phones featured more security features as the software/hardware product offerings progressed. That's like saying a 90's computer was flawed because it only had a CD, but lacked a blu-ray player.
So yeah, then since the FBI already has magic to do what they want, the court order and theatrics are for what exactly? It's easier for the FBI to go public and then be completely exposed as bumbling ass clowns?
Because that's exactly what's happened: this weekend's revelations about their sloppy methods and Apple's existing cooperation and assistance have significantly poisoned public sentiment for the Feds on this matter.
This automatic notion that they 'probably' have the 'secure keys and source code' also sounds like pure horseshit, LEO romanticization fantasy. I don't know where you've been, but we've yet to see an example of a government agency that wasn't a near perfect example of total incompetence. How does the NSA get past that? Roswell alien technology?
That's a good analogy (90s computer CD...)
We cannot retroactively apply today's expectations of security to past products. Otherwise, we could be demanding Microsoft rewrite Windows XP from the ground up with modern security so that we can keep using it. Microsoft has the right to abandon WinXP and leave it in an "unsecured" state (by today's standards) because it is too costly to keep it secured.
Comey, is that you?
...CEO Tim Cook strongly condemned the order to build a so-called "back door" for government access into encrypted technology. The FBI, on the other hand, counters that this request is a reasonable and narrow means to bring about justice for the victims of terrorism.
Which are you going to believe? The powerful entity with an army of automatons under its control or the FBI?
Stripping away the emotional rhetoric from all sides, the core question is whether a company can be compelled to build a tool for law enforcement that will compromise the security of any of its devices.
I don't think that's the core question at all. The core question is who run Bartertown. Some of us hold to the idea that the core purpose of government is "to secure these rights", others (like I think every single candidate for President in both the GOP and the Democrat Party) think the core purpose of government is to keep us safe - whether that be safe from terrorists or evil corporations or hunger or hurtful speech or a million other hobgoblins. As soon as you start arguing about balancing your rights against the government's "needs" you've lost the argument.
It's disturbing how fitting the Bartertown analogy is.
Tina's character (whatever her name was) is an obviously charismatic leader kept in her position by a coalition of the majority of interests and providing an illusion of justice.
Master Blaster controls the methane plant, and every indication exists that the duo built it themselves, providing a productive output that is essential to the very existence of Bartertown.
Aunty Entity.
*Both* were essential to Bartertown's existence.
AA provides a minimal government (including a monopoly on violence and a court of last resort) to support the rest of Bartertown's functions. Because AA *owns* Bartertown, it's *her* gaff and everyone there uses it as a safe place to conduct business and pays a fee to do so. So she skims a little off the top of each transaction for ensuring that BT is a (relatively) safe place to do business.
Master-Blaster, not content with the profits from the business *he built*, is the one trying to *steal* the rest of Bartertown from Aunty Entity.
Reminds me of arguing with someone over the "Congress shall make no law..." and they dismissed my argument as being that of an ideologue unwilling to compromise and unable to see the trade-offs involved in absolutism. No, you half-wit, it's you who are the ideologue unwilling to compromise and see the trade-offs. "Congress shall make no law..." is the compromise - the freedom to do things is the freedom to do bad and stupid things, and that's the trade-off you gotta make for the freedom to do good and smart things. It takes an ideologue to insist we can have some utopian world where everybody is perfectly free to do only the latter and at the same time have a government wise enough, good enough and powerful enough to eradicate the former.
Going forward, couldn't a government that can force a technology company to essentially undermine their own product also simply outlaw encryption, at least for the average person on the street? It would be cold comfort knowing manufacturers were capable of making uncrackable products but that you were prohibited from getting them.
Consider that effective encryption was virtually illegal into the 1990s. First under HW Bush, then under Clinton, the FBI and other law enforcement agencies tried to stop improved encryption technologies but failed. After 9/11 they were able to get their entire wish list enacted that they'd been trying to push for many years. This situation is no different.
The encryption on the iPhone was already undermined, due to a design flaw in the phone. The FBI simply wants help taking advantage of that design flaw. Contrary to what Apple claims, the FBIs demand has no bearing on the security of future iPhones, provided Apple fixes the problem (which they should).
The encryption on the iPhone was already undermined, due to a design flaw in the phone.
If that is true, why do they need Apple's help? Seems like it isn't such a "flaw" if the FBI can't break it.
The flaw is easiest to exploit by Apple signing a firmware update for the iPhone. It can probably be exploited in other ways as well, but that would be a lot harder. The flaw isn't a big risk for regular phone users, which is probably why Apple didn't worry about it.
In fact, the design of the iPhone 5c is almost ideal for government surveillance: it is secure enough to protect users against casual spying, but it has a backdoor that Apple (and governments with the help of Apple) can exploit.
I may be feeding a troll here, but in the interest of possibly learning something new:
You are the first person I have seen that refers to a design flaw in the 5c that the FBI wants to exploit. Could you go into more detail?
The FBI wants to get into the iPhone 5c by disabling the limit on the number of PIN attempts through a firmware update so that they can brute force the PIN. A correctly implemented PIN system should not allow firmware updates of any form to reset the PIN count or even take place without unlocking the phone first. The best way of implementing this is to put the PIN verification and cryptographic keys into a secure computing element, making it impervious to any firmware updates and many other forms of attacks. Such secure computing elements are dirt cheap (every SIM card has one).
See also here:
http://www.counterpunch.org/20.....ere/print/
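A minimal sketch of the secure-element idea described in the comment above, with hypothetical class and method names (real secure elements enforce this in tamper-resistant hardware, not application code): the retry counter and the data-protection key live inside the element, and nothing an update to the main operating system can do will reset them.

```python
# Toy model of a secure element that enforces its own attempt limit.
# Names are hypothetical; actual secure elements do this in dedicated
# tamper-resistant silicon, not Python.
import secrets

class SecureElement:
    MAX_ATTEMPTS = 10

    def __init__(self, pin: str):
        self._pin = pin
        self._key = secrets.token_bytes(32)      # 256-bit key that never leaves the element
        self._attempts_left = self.MAX_ATTEMPTS

    def unlock(self, guess: str):
        """Return the data-protection key on a correct guess; count failures otherwise."""
        if self._attempts_left == 0 or self._key is None:
            raise RuntimeError("device wiped")
        if secrets.compare_digest(guess, self._pin):
            self._attempts_left = self.MAX_ATTEMPTS
            return self._key
        self._attempts_left -= 1
        if self._attempts_left == 0:
            self._key = None                      # destroy the key: the data is gone for good
            raise RuntimeError("device wiped")
        return None

# Crucially, no method here resets _attempts_left from the outside, so
# re-flashing the phone's operating system buys an attacker nothing.
```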
But the FBI's demand does have a bearing on the security of future iPhones - it's called "precedent". Once you establish that the FBI has a right to issue a Writ of Assistance to force Apple to help it break into an iPhone, you think Samsung can argue that doesn't apply to an Android phone? If the FBI can compel assistance in breaking into one iPhone it can compel assistance in breaking into all of them. If the FBI can do it, why can't the NSA, the CIA, the NYPD or Barney Fife?
The only reason a "Writ of Assistance" works in this case is because the cryptography on that phone is flawed. No manufacturer can help the government recover encryption keys from a correctly implemented system, and no "Writ of Assistance" is going to change that.
Calling a previous-generation device flawed because it doesn't have a newer model's security features is stupid. So you're not only a troll but a hack as well.
Case in point: you ascribe magical competence and superiority to the government regarding computers. That is contrary to all available evidence. They have macro communications decrypting capabilities, but for consumer devices, they are using the same monkey garbage that teens download off torrent sites. That's why they can't crack the phone, because the script kiddies haven't caught up with it, and the only way is to force Apple to write them the tools they want. Apple spends magnitudes more on R&D on their devices, and has oodles of top tier talent. The government can't compete, can't keep up, and that's why they are hyperventilating thru various blow holes trying to enslave private individuals to do their dirty work.
Except for the pesky precedent that it would set, and once set, would be used again and again.
The encryption isn't flawed; it's the password access which is flawed. Simple solution for everyone is to use a 50-digit password.
Which nobody would use. So Apple is stuck using what's kinda secure and what's reasonably convenient.
Rational Exuberance is right that the auto-wipe feature shouldn't be something that can be altered or disabled with a firmware update. You can expect the next iPhone to change that.
A 4-6 digit pin is perfectly fine and secure (the actual encryption key is 256 bits); you don't need 50 digit passwords. The problem with the iPhone 5c is that the limit on the number of unlock attempts can be reset in software. That's what allows the pin to be brute-forced.
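To unpack the point about the 256-bit key, here is a minimal sketch of deriving a strong key from a short PIN plus a device-bound secret. The iteration count and the "device UID" idea are illustrative assumptions, not Apple's published parameters: the PIN is only one input to the derivation, so off-device guessing is hopeless, and on-device guessing is exactly where the attempt limit has to hold.

```python
# Sketch of deriving a 256-bit key from a short PIN plus a device-bound
# secret. The iteration count and the fused "device UID" are illustrative
# assumptions, not Apple's published key-derivation parameters.
import hashlib
import os

DEVICE_UID = os.urandom(32)   # stands in for a secret fused into the hardware

def derive_key(pin: str) -> bytes:
    """Mix the short PIN with the device secret into a full-strength key."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 100_000)

key = derive_key("1234")
print(len(key) * 8, "bit key")   # 256 bit key, even from a 4-digit PIN
```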
Actually retard, it can't be reset with firmware.
No firmware exists to do it, and any firmware that could be hacked to do it, isn't securely signed by Apple.
Since there's no firmware or tool to do what you claim, there's no fucking key to that lock.
it IS you, Comey!
This is exactly what the whole damn argument is about. Forcing a technology company to undermine its own security is effectively outlawing encryption. If Congress can't or won't pass a law prohibiting encryption, the executive branch agencies can simply ignore Congress by forcing companies to give them the means to bypass the encryption, and if you can bypass the encryption then the message ain't fucking encrypted. The FBI wants encryption to be illegal and they don't care about some nonsense about Congress supposedly being the ones to make the laws.
Thread winner. This has always been the issue. Law enforcement agencies have been crippled (in their assessment) by increasingly cheap and effective encryption since the 1990s. They want to break the technological advantage that citizens have over them.
Law enforcement agencies have always claimed to be crippled by demands that they follow the Constitution. That pesky Constitution always gets in the way of catching the bad guys.
If Apple complies with the FBI order, it wouldn't "cripple" anything. The FBI is asking Apple to help exploit a flaw in a weak cryptosystem. Giving them that help doesn't change anything about the ability of people to build strong cryptographic systems that the FBI cannot break.
No matter how many times you say the same thing, I still think you're full of shit and either knowingly or unknowingly ignorant of how governments operate.
I come from a country where this wouldn't even be an issue because it would be clear that the company would have to comply. That's why I understand (unlike you) that "how the government operates" is irrelevant here and there is no point in debating this with the FBI; the only way to secure information on a phone is to make sure that the information is technically secure. As long as companies like Apple get away with selling devices with poor security and then engage in legal and political posturing to cover up their mistake, we won't have security.
So, I'm sorry to have to tell you: it's you who is "full of shit" and who is "ignorant of how governments operate," because you seem to delude yourself into believing that if you somehow stop the FBI through legal or political pressure, the government is magically going to stop invading your privacy. Stop being such a bloody fool.
Your last paragraph is the only one in the entire thread that actually makes some degree of sense. Having said that, one must always do what one can in the public eye to dissuade bad actors in government.
He comes from a country whose government ensures perfect technology designs?
That's 100% batshit crazy.
He's a moron who's read one dubious article, and thinks the FBI should get to do whatever it wants because privacy is moot and they will spy on everything.
Also 100% batshit crazy, and a completely pathetic quitter. Fuck off troll.
OMG, they wouldn't be bypassing the encryption; they would sideload a version of iOS that didn't have the "wipe drive" password protection. If the FBI guesses the password, the phone decrypts itself. Now if he used a strong password they'll never get in.
MY HERO!!!!!!!
Wonder what the Feds would do if the IOS developers refuse to play along.
Accuse them of being racists. What else?
That's what the shrieking eels do.
The feds go the "obstruction of justice - rape cages" route.
Nice company you have there. Shame if something were to happen to it.
Nice 2-party oligopsony ya gots dere... Be a shame if some thoid party was to suddenly find the means to muscle in on yer racket...
The iPhone 5c (and maybe some other Apple iPhones) is vulnerable to a particular form of snooping, based on modifying the operating system of the phone. That's a particular shortcoming of those phones. It's easy to avoid, and phones from other manufacturers don't have this problem. Apple itself looks like they are going to fix this in the iPhone 7. Complying with the court order on the San Bernardino phone doesn't set any precedent, nor does it create tools that will compromise privacy or security in the future. Apple should simply have complied, made sure the iPhone 7 didn't have this problem, and be done with it.
Instead, they are creating this tempest in a teapot and actually threatening the voices of reason in Congress who have been pushing for legislation to ensure that phones can use secure encryption. Apple's strategy is so crazy that one has to wonder whether they aren't secretly in league with anti-cryptography forces.
There's an added chapter for the next edition of United States of Paranoia!
I didn't say that they were actually secretly in league with anti-cryptography forces, just that it was so crazy.
My guess is that Apple just screwed up big time, first in the design of their phone, and then in handling this situation.
Apple should simply have complied, made sure the iPhone 7 didn't have this problem, and be done with it.
How the fuck is Apple going to make a phone that doesn't have the problem of the government insisting they have the right to demand access to it?
The FBI always has a right to demand access to evidence. The question is whether a company can actually provide that evidence. Right now, Apple can provide access to that evidence because they have a backdoor into the phone, and that's why Apple is in this predicament. If the phone was designed without a backdoor, then they could respond to the FBI demand by saying "sorry, there is no way of doing that" and that would be the end of it.
Right now, the FBI is not (and cannot) demand that future phones are designed to facilitate access for them. That's a legislative battle that rages on separately from the FBI demands in this particular case.
"The FBI always has a right to demand access to evidence"
No, they don't. Not legally at any rate. They can always want it, demanding it backed by the full force of the state is a different animal.
They can demand access to a suspect's safe deposit box, to their locked writing desk, to their journal, and to their medical records. Based on what legal theory shouldn't they be able to demand access to the contents of someone's phone?
Furthermore, what makes you think that denying them this access legally is going to make any difference anyway? The NSA and other agencies are going to get at this data if it is technically possible.
The only solution isn't the legal magic underwear you believe in, but sound, unbreakable technology. Such technology is cheap, simple, and widespread, and people like Tim Cook should have their feet held to the fire for not implementing it.
We get that you fault Tim Cook and Apple for not securing the 5c enough.
The problem is that gov't is not the entity that is supposed to hold Tim Cook's feet to the fire (referencing the quote, "full force of the state").
It is the CONSUMER that should hold Tim Cook's feet to the fire, via boycotts, and some consumers will do so, having seen this whole fight laid out in public.
The FBI has the ability to request a court provide them a warrant to search for evidence.
The problem in this case is that the FBI already has the evidence. What they lack is the ability to understand the evidence, and they are attempting to enlist Apple in converting the evidence into an accessible and understandable form.
Warrants were never intended to serve as a form of conscription, and that is the heart of the issue.
Governments don't have rights. Governments have powers. And no, they don't always have a legitimate authority to demand access to evidence.
No, all Apple can allegedly provide is a way to defeat the "wipe drive" password protection. If the guy used a strong password they might never get in.
A pin is not a password; a pin doesn't need to be particularly strong. A 4-6 digit pin is quite secure if implemented correctly.
The actual encryption key on the iPhone 5c is always 256 bits, no matter how long or short your pin is.
Yes, it absolutely would. The moment the government is able to compel a tech. company to circumvent their own security for the government's convenience then, by golly, the government will do it again... and again... and again regardless of whether that security is flawed to begin with or not.
Related: The hullabaloo about the phone wouldn't even be a thing if the county where Farook worked had installed a $4 application that allowed remote unlocking (since he was a county employee and the phone was provided to him through work).
From ABA Journal: http://www.abajournal.com/news.....iphone_but
Didn't someone remotely change the password on the phone, which is one reason we're in this situation now? Doesn't that suggest it was someone at the company, since it was a company phone?
Wouldn't the 13th Amendment ban on involuntary servitude trump the All Writs Act that allowed the govt to force them to work for it? Any thoughts on that?
That depends on corporate personhood, really.
Also, if you don't want that debate, the "except as punishment for a crime" clause gives the perfect cover by charging the company with something like contempt or obstruction.
This sounds like the "bootstrap" paradox in time travel.
Which came first, the involuntary servitude, or the punishment for a crime?
FBI: "You have to do this."
APPLE: "No I don't, unless being punished for a crime."
FBI: "THERE'S YOUR CRIME!"
Re: UnCivilServant,
No, it doesn't. A person writes code, not the company. Let's say Tim Cook changes his mind but his programmers don't. What is the FBI to do, arrest the programmers? Compel them to obey? This is the very reason the FBI is throwing a tantrum in front of the cameras and giving Apple meaningless deadlines, because they know they can't make the company make its people comply.
Agree. If the State Department can take 1.5 years to forward some emails to the FBI, I think Apple can take at least that long to complete a much harder task.
If Tim Cook changes his mind and the programmers don't, the programmers are in danger of losing their jobs.
Hasn't stopped the government from drafting men into the Army.
Most phones are already designed that way. That kind of security has been a goal going back to the first SIM cards. Apple simply screwed up with their phone software (and it looks like they were already planning on fixing that in upcoming models).
The collective reality of the public is likely being cleverly engineered by both Apple and the FBI working in collusion.
A billion-buck company exploiting the commercial platform provided by goddamn communists isn't going to give a fucking shit about liberty or privacy.
Boiling cauldrons in the governmental/corporate labs are blending obtuse elements on this one.
Another chapter for Jesse's book.
In 1984 Apple was the girl that threw a sledgehammer into the teeth of corporate fascism with the Superbowl Mac commercial. A single million-dollar donation to the LP--legal according to the DemoGOP Supreme Court--would cost less than a bunch of bootlicking lawyers, cause the looters to back away in horror, and net them more goodwill than bombing Nazi Germany in 1944!
Other than exquisitely designed mass-marketing methinketh Phillips groans mightily as he stretches the temporal utility of an ad beyond the impact of its 60 seconds.
One ironic bit that seems to get brushed over... this is a government-owned phone. They want to make civilians more vulnerable, but they are using an example that explicitly makes a government-owned system more vulnerable. Why would they ask to have their own security compromised?
It is a phone owned by a government.
It is not a government owned phone system and it is not the government that wants to force Apple to create a way to disable a security feature. The only slightly weird aspect is that the owner does not mind the phone being opened.
This must be a record number of links in one post.
That is all. I'm weary of this topic for now.
I don't remember anyone discussing the impracticality of the government's request. To rewrite the software and re-manufacture the hardware would take a considerable amount of time, and presumably testing to make sure it didn't have too many holes. And then, because those who worked on it would presumably have to have security clearances, wouldn't those employees need to be screened, etc.? We pay billions for sophisticated NSA and CIA software and hardware; let them figure it out. And even assuming they got what they wanted, we now know they wouldn't tell other agencies who needed it anyway.
I hear Bill Gates is saying Apple should comply. Of course he thinks that, because he knows it will hurt them.
What the FBI wants more than anything is the encryption code(s) that allows devices to recognize that code/updates are coming from a "trusted" source, ie. Apple.
I do not like what the FBI is doing, but frankly, the sooner we eliminate any notions of "trusted" sources the better off we will all be.
Apple shouldn't be in the business of providing us encryption, they should be in the business of providing products that allow us to use our own form(s) of encryption. You can argue that that would only result in the FBI seeking to strong arm someone else, but once installed the FBI would have a hard time even figuring out who to strong arm.
Don't most of us presume that if we commit a capital offense (or any other felony for that matter), law enforcement will pursue a search warrant through a judge for our cell phone, to look for evidence and investigative leads? Wouldn't they be irresponsible if they didn't.
I want to be a good Libertarian. I really do. But do you have to be so all fired ignorant when you write? Use the English language! Learn it! Embrace it! Use "doused" properly you idiot or find an editor - somewhere, apparently not at Reason - that will help you.
Doused has multiple meanings. It was used correctly here.
What makes governments dangerous? Power.
What does government surveillance do? Increases government power.
"You have nothing to worry about if you have done nothing wrong."
That depends on who is defining what is wrong and what is right.
Are political opponents doing something wrong?
Are unfavorable news reporters or agencies doing something wrong?
What happens when the President (any President) or his devotees, who can find out about anyone in the US, does not like someone and decides to do something about it; whether it be political or personal?
Government officials seem oblivious that the potential for abuse from these programs is astronomical. We can not have government surveillance that in the hands of less than desirable government officials (which is most of them) can silence or destroy dissenters and political opposition.
We already have a safe that cannot be unlocked even with a judge's order: it is our mind. Governments have not had a master key or backdoor key to our minds since the beginning of time, and we as a species have still managed to survive and multiply.
It is not about why they pass such ridiculous laws or the purity of their intentions, it is about what some future demented politician might use these laws for.
Somehow, for some reason, I tend to look askance at BOTH Apple and The FBI. If this makes me a bad citizen, so be it.
Even if the FBI's claims are correct and they want to look into this particular phone and only this phone, it appears that there are at least a couple of other law enforcement/prosecutorial types who have expressed their desires in this direction, and they seek more than a look at/into just one phone; it seems that they have a longish list. As for the FBI, do they simply seek entry into this one phone, as they claim, or do they seek to set a precedent that would have the broadest applications in the future? Let's not forget "tomorrow," which sometimes comes all too quickly. By the way, one might note that the actions/antics (your choice) of the president might have "poisoned the well of public trust."
Getting to Apple, I for one wonder as to the validity of its claims. For instance, re the data on individuals that it collects (and here I wonder about Google and others too), how can the individual determine the validity of the claims made re the security of personal data collected, and its use? So far as I can see, the individual cannot, and is left to take their word for it. To what extent is one willing to trust corporate entities such as Apple, given the history of corporate double talk and flat-out lies from the corporate world? That, Virginia, might also be a point of interest.