U.K. Demands Access to Any Apple User's Data, Anywhere in the World
The reported order from Britain's Home Office is further proof that governments pose a greater privacy risk than corporations.

The United Kingdom's Home Office is reportedly demanding that Apple, one of the world's largest tech companies, provide law enforcement access to its users' private data, not just in Britain but around the world.
European regulators are no strangers to making demands that the rest of the world has to live with. In 2022, the European Parliament mandated USB-C connections as the standard for data transfer, causing Apple to abandon its proprietary Lightning ports worldwide.
But the U.K.'s demand for access to Apple user data poses a grave threat to global digital security.
"Security officials in the United Kingdom have demanded that Apple create a back door allowing them to retrieve all the content any Apple user worldwide has uploaded to the cloud," The Washington Post reported earlier this month. "The British government's undisclosed order, issued last month, requires blanket capability to view fully encrypted material, not merely assistance in cracking a specific account, and has no known precedent in major democracies."
Neither Apple nor the government has confirmed the order, though the Post notes that the request was made "under the sweeping U.K. Investigatory Powers Act of 2016, which authorizes law enforcement to compel assistance from companies when needed to collect evidence" and "makes it a criminal offense to reveal that the government has even made such a demand."
"These reported actions seriously threaten the privacy and security of both the American people and the U.S. government," Sen. Ron Wyden (D–Ore.) and Rep. Andy Biggs (R–Ariz.) wrote last week in a letter to newly minted Director of National Intelligence Tulsi Gabbard. "If Apple is forced to build a backdoor in its products, that backdoor will end up in Americans' phones, tablets, and computers, undermining the security of Americans' data, as well as of the countless federal, state and local government agencies that entrust sensitive data to Apple products."
Wyden and Biggs asked that Gabbard "giv[e] the U.K. an ultimatum: back down from this dangerous attack on U.S. cybersecurity, or face serious consequences." Wyden also released a draft of a bill designed to close loopholes in U.S. law that could allow foreign governments to make such demands of American companies.
Apple devices use 256-bit AES encryption, currently among the most secure options available and considered unbreakable by brute force. With Apple's optional Advanced Data Protection, iCloud data is also end-to-end encrypted, meaning it can't be accessed, even by Apple, because the decryption keys exist only on a user's trusted devices.
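To see why "virtually unbreakable" is not hyperbole, a bit of back-of-the-envelope arithmetic helps. The sketch below assumes a hypothetical attacker testing one quintillion keys per second, a rate far beyond any real hardware today:

```python
# Rough arithmetic on the size of a 256-bit key space.
# The guess rate below is an assumed, deliberately generous figure.
keyspace = 2 ** 256
guesses_per_second = 10 ** 18  # hypothetical: one quintillion per second
seconds_per_year = 60 * 60 * 24 * 365
years_to_exhaust = keyspace // (guesses_per_second * seconds_per_year)
print(f"{years_to_exhaust:.1e} years to try every key")
```

Even at that fantastical rate, exhausting the key space takes on the order of 10^51 years, which is why attackers target keys and devices rather than the math itself.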
That level of security is comforting to users but vexing to law enforcement: Last year, police in Detroit complained that a new software update caused inactive iPhones to periodically reboot, making it more difficult for cops to unlock and access phones collected as evidence.
The feds are also particularly hostile to Apple's position of maximal user security. Multiple times in recent years, the FBI demanded access to locked devices to aid an investigation; each time, Apple refused. In one such case, law enforcement collected an iPhone owned by one of the perpetrators of the 2015 mass shooting in San Bernardino, California. The phone only allowed a certain number of incorrect passwords before it would lock out any further attempts, so the bureau sought a court order requiring Apple to unlock the phone.
"Because Apple has no way to access the encrypted data on the seized iPhone, the FBI applied for an order requiring Apple to create a custom operating system that would disable key security features on the iPhone," the Electronic Privacy Information Center wrote. Apple CEO Tim Cook said the FBI was asking the company "to write a piece of software that we view as sort of the equivalent of cancer."
A federal judge issued the order, which Apple contested, but the FBI later withdrew the request and said it was able to retrieve the information another way.
Still, calls persist for tech companies to develop a "backdoor" for encryption that would allow access to law enforcement but no one else.
"Technology has become a tool of choice for some very dangerous people," then-FBI Director James Comey said in 2014. He elaborated in 2017 that privacy and security stem from the "bargain" that "there is no place in America outside of judicial reach," and "widespread default encryption…shatters the bargain."
In 2015 Senate testimony, Comey said someone should create a way for law enforcement to circumvent encryption without compromising user security. "A whole lot of good people have said: It's too hard; that we can't have any diminution in strong encryption to accomplish public safety, else it'll all fall down and there'll be a disaster. And maybe that's so. But my reaction to that is, I'm not sure that we've really tried."
Speaking at the 2018 Aspen Security Forum, then-FBI Director Christopher Wray echoed Comey's call for some sort of encryption master key: "There's a way to do this. We're a country that has unbelievable innovation. We put a man on the moon….And so the idea that we can't solve this problem as a society—I just don't buy it."
If that sounds like too simple an explanation, there's a reason.
"One of the fallacies I think of this debate is that it's often framed as, we've solved so many hard technical problems….Surely if we can put a man on the moon, we can design a secure backdoor encryption system," computer science professor Matt Blaze said in 2015. "Unfortunately, it's not so simple. When I hear, 'If we can put a man on the moon, we can do this,' I'm hearing an analogy almost as if we're saying, 'If we can put a man on the moon, surely we can put a man on the sun.'"
U.S. and U.K. officials want someone to build an unbreakable lock and give law enforcement a key. That would be alarming enough with a physical lock, but building a backdoor into digital encryption renders the encryption itself inherently vulnerable.
In effect, there is no way to introduce an intentional flaw to data encryption that cannot be exploited by bad actors. A group of computer scientists studying government requests wrote in a 2015 article in the Journal of Cybersecurity that "such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend."
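The structural problem the researchers describe can be seen in a toy key-escrow sketch. Everything below is hypothetical and illustrative, not real cryptography: a "lawful access" master key wraps each user's session key, and a single leak of that master key exposes every escrowed key at once:

```python
import hashlib
import secrets

# Toy key-escrow scheme (illustration only, not real cryptography).
# Each user's session key is also "wrapped" under one law-enforcement
# master key so that lawful access is possible.
MASTER_KEY = secrets.token_bytes(32)

def wrap(session_key: bytes, master: bytes) -> bytes:
    # XOR against a hash of the master key stands in for real key-wrapping.
    pad = hashlib.sha256(master).digest()
    return bytes(a ^ b for a, b in zip(session_key, pad))

unwrap = wrap  # XOR wrapping is its own inverse

alice_key = secrets.token_bytes(32)
escrowed = wrap(alice_key, MASTER_KEY)

# The flaw: one leak of MASTER_KEY (an insider, a breach, a hostile
# state) exposes every escrowed key for every user, all at once.
leaked_master = MASTER_KEY
assert unwrap(escrowed, leaked_master) == alice_key
```

The weakness isn't in any one implementation detail; it's that the scheme concentrates the ability to decrypt everyone's data into one secret that must itself be stored and guarded somewhere.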
If the U.K.'s order stands, Apple will either have to weaken every user's security worldwide or cease operating in Europe altogether. Either move would be disastrous, either to the company's bottom line or to its users' data privacy. Since some U.S. officials seem to agree with the U.K.'s motivation, it's clear that governments pose a much graver threat to privacy than corporations.
England is as fascist as the Socialists they fought against in the last century
Wankers of Bellends, Tossers of Twat! The lot are Rubbish!, and I should know!
Twat tossing is a thing over there? Go ahead and link a video if you want.
There's a lot of that going around now, unfortunately.
See? Right here, this is the problem with borders.
If the U.K.'s order stands, Apple will either have to weaken every user's security worldwide or cease operating in Europe altogether.
So, when you're an American company, and you do business in Europe or other countries, you have to abide by... never mind.
You just want the US to lose WWI-part 2 for lack of shipbuilding capacity.
When I hear, 'If we can put a man on the moon, we can do this,' I'm hearing an analogy almost as if we're saying, 'If we can put a man on the moon, surely we can put a man on the sun.'"
To be fair, all the talk of fully-decentralized-while-still-solving-the-Byzantine-Generals-problem, perfectly secure, perfectly anonymous, perfectly verifiable/completely transparent, fully-documented ledger, contract/trust stores that are going to end governments and monetary policy as we know it doesn't help.
Though that's sort of the inverse of what they want. Or are you saying "it doesn't help" in the sense that they hear that, and then want the backdoor even more?
I'm saying that when his techbros gaslight people with promises of invisible pink unicorns killing all the evil tyrants but leaving all the benevolent techbros in charge, walking on the sun seems like a cakewalk.
Heil Starmer!
Apple already said it would withdraw from the UK market before they would concede to such a demand.
Cops hate Apple, good enough endorsement for me.
Hopefully
Hooray to Apple on this one. Their security features, like erasing the phone after 10 failed unlock attempts, are great.
It isn't looking good for the future of England. I don't know that they can right the ship at this point.
No pointy kitchen knives or scissors, no iPhones, and hammers have to be licensed and registered. It must be so safe there:
Then again....
https://www.statista.com/statistics/864736/knife-crime-in-london/#:~:text=The%20number%20of%20knife%20or,when%20there%20were%2015%2C928%20offences.
England has already fallen, to be replaced by an Islamic Republic.
England is part of the five eyes intelligence network.
It's a backdoor for US intel too.
Yup. The US government gets around laws against spying on innocent citizens by letting other countries do it and then buying the information.
And for criminals, who inevitably would find and exploit any back door.
The reported order from Britain's Home Office is further proof that governments pose a greater privacy risk than corporations.
Corporation: Give us all your data.
Us: Far out man... convenience.
Government to corporation: Give us all their data.
Corporation: Why... I never! Maybe if you asked nicer!
Reason: Damn government, keeps demanding all my data I willingly handed over to that stranger in a toga!
Corporations use your data to try to sell you stuff.
Governments use your data to rob you and lock you in a cage.
Sounds like an EXCELLENT reason to not give all your data to a stranger.
I mean, really, the issue is so simple that only a moron couldn't understand it.
Encryption is math, and only math.
Math is universal; it inherently works the same for everyone. No amount of technical innovation can invent mathematics that work differently for law enforcement than for everyone else.
So, there is no way to prevent anyone who has a decryption key from using it; the math cannot tell if the user is law enforcement or not.
Therefore, any encryption system that has a key installed for the use of law enforcement will become completely and permanently worthless the moment the law enforcement key is known to bad actors.
Human beings are not perfect; therefore one of them will make a mistake or be corrupted, and any law enforcement key will be leaked to bad actors, rendering the encryption system useless.
(Yes, in theory, the company could set up a system that encrypts each user's data under a different law enforcement key and hands over just that one user's key to law enforcement upon proper application. But someone would have to keep a record, somewhere, of those keys and which users they apply to. That record is now the highly valuable item bad actors will seek, and the entire system becomes useless the moment they acquire it.)
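The commenter's core point, that the math cannot tell who is holding the key, can be illustrated with a toy cipher. The names and scheme below are hypothetical; a simple XOR pad stands in for a real cipher:

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # One-time-pad-style XOR: a toy stand-in for a real cipher.
    # XOR is its own inverse, so this both encrypts and decrypts.
    return bytes(k ^ d for k, d in zip(key, data))

message = b"meet at noon"
user_key = secrets.token_bytes(len(message))
ciphertext = xor_cipher(user_key, message)

# The per-user escrow record described above: user -> lawful-access key.
escrow_record = {"alice": user_key}

# Decryption depends only on possessing the key. The computation is
# identical whether the caller is a court-authorized officer or a
# thief who stole the escrow record.
for holder in ("law_enforcement", "data_thief"):
    assert xor_cipher(escrow_record["alice"], ciphertext) == message
```

Whoever obtains the escrow record gets exactly the same access a warrant would grant, which is the whole objection in two lines of arithmetic.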
Put each user's record on a blockchain! Ooh, so techy!!
And people are concerned about DOGE being able to access their tax data.
It's realistic to be concerned about that also. DOGE teenagers could become insta-multi-millionaires selling data found in tax records. If they want audits get accountants. Accountants aren't as good at moving data around undetected.
The reported order from Britain's Home Office is further proof that governments pose a greater ~~privacy~~ risk of any negative outcome you are worried about than corporations.
There, fixed it.
Fuck you Britain. As the famous Texas flag said, "Come and Take It".
"U.K. Demands Access to Any Apple User's Data, Anywhere in the World."
Fuck the UK and their fascist government.
Now you know why sane American colonialists demanded independence in the late 18th century.
If the U.K.'s order stands, Apple will either have to weaken every user's security worldwide or cease operating in Europe altogether.
Why punish all of Europe for Britain's arrogance? Simply boycott Great Britain.
Come on. Can't we just give the back door key to authorized law enforcement officials? We all know that all of them are more saintly than Mother Teresa.
Maybe your back door. Mine remains heavily guarded.
Russia, China, Iran etc waiting to see if they should pass a similar law…
The Caliphate will not be denied.
I'm thinking Apple just needs to stall. The current UK government seems highly unstable and deeply unpopular.
The answer is Fahrenheit 451. Abandon creating and storing sensitive data. Use IT only for trivial matters.
I've looked at the USB-C and Lightning connector specs and there is no reason Apple cannot have both. However, the USB-C connector is light-years ahead of Lightning in all its technical attributes. It is also a non-proprietary connector. The only way the regulation could be bad is if the Lightning connector were forbidden, and it isn't. Both connectors can be used, or just USB-C.