Queen Elizabeth gave royal assent to the British Investigatory Powers Bill on Tuesday, the final step before the massive surveillance authorization becomes law in 2017. A new, deeper analysis of the final law by tech experts suggests there's more to fear than simply government access to citizens' browser history: the law may ultimately put everybody's data privacy and security at risk.
To refresh everybody's memory, the Investigatory Powers Bill—nicknamed by critics the "Snooper's Charter"—formally expands the British government's power to engage in online surveillance, provides rules for the bulk collection of citizen metadata, and grants the authority to hack into devices remotely. The law requires Internet Service Providers to store information about users' browser history for a year and to hand that information over to government officials when presented with a warrant. Essentially, the law formalizes some secretive surveillance methods, exposed by Edward Snowden, that the government was already using, though it also provides for some judicial oversight.
While the law is being sold as a way to keep the United Kingdom "safe" and to fight terrorism, the reality is that a whole host of government agencies that have nothing to do with national defense will also have access to this information: agencies that investigate fraud and handle taxation and licensing issues. It is abundantly clear to anybody familiar with the law that it is designed and intended to be used to investigate domestic crime, not just terrorism.
But there's more. Privacy advocates and tech companies had been fighting with the British government over the crafting of the law, particularly about the inclusion of mandates for encryption "back doors" so that government officials would not be stymied in their surveillance efforts.
While the new law doesn't officially mandate encryption back doors, U.K.-based tech media site The Register scoured the 300-page law and discovered, buried deep within, something just as bad. Government leaders will be able to serve a company with what's called a "technical capability notice," which can impose obligations and changes on its products (software, apps, whatever), including the "removal by a relevant operator of electronic protection applied by or on behalf of that operator to any communications or data."
That is to say: The law doesn't mandate encryption back doors outright, but it gives the government the authority to demand that specific companies remove the encryption protecting data. That means the British government expects all of these companies to have the capacity to break their own encryption on demand. So in reality, the law does mandate encryption bypasses and back doors for communication tools; it just allows the companies to maintain control over the "keys."
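To see why holding the keys makes compliance trivial, consider a minimal sketch of a service that encrypts stored messages with a key it keeps for itself. (This is a toy illustration, not real cryptography: repeating-key XOR stands in for a real cipher like AES, and all class and method names are invented for the example.)

```python
import secrets

class ProviderEncryptedStore:
    """Toy messaging store where the PROVIDER holds the encryption key."""

    def __init__(self):
        self._key = secrets.token_bytes(32)  # the service, not the user, holds this
        self._messages = {}

    def _xor(self, data: bytes) -> bytes:
        # Repeating-key XOR: a stand-in for a real cipher. Applying it
        # twice with the same key returns the original bytes.
        return bytes(b ^ self._key[i % len(self._key)]
                     for i, b in enumerate(data))

    def store(self, msg_id: str, plaintext: bytes) -> None:
        self._messages[msg_id] = self._xor(plaintext)

    def comply_with_notice(self, msg_id: str) -> bytes:
        # Because the provider holds the key, "removing electronic
        # protection" on demand is always technically possible.
        return self._xor(self._messages[msg_id])

store = ProviderEncryptedStore()
store.store("m1", b"meet at noon")
print(store.comply_with_notice("m1"))  # b'meet at noon'
```

The data is "encrypted," but only against outsiders: anyone who can compel the provider can compel decryption.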
If this sounds familiar to Americans, this part of the law has the same impact as the widely mocked legislation proposed by Sens. Dianne Feinstein (D-Calif.) and Richard Burr (R-N.C.) in the spring. In response to Apple's refusal to help the FBI decrypt the iPhone that was in the possession of one of the San Bernardino terrorists, the senators crafted the technologically illiterate "Compliance with Court Orders Act of 2016." Like the text of the British law, it doesn't order tech companies to create back doors for the government to bypass encryption, but it does require the tech companies themselves to bypass their own encryption when given a court order to do so.
What's the big deal? There is a simple truth, understood by everybody who works in the tech industry or writes about technology, that many government officials are either choosing to ignore or unwilling to accept: When a company creates encryption with a built-in bypass, there is no guarantee the bypass will stay in the hands of the company or that only the "right people" will gain access. Accidents happen. Espionage happens. We saw an example earlier this year, when an internal security bypass Microsoft used for testing operating systems accidentally got out into the hands of hackers. The hackers, who apparently had no malicious intent, used it to try to demonstrate to government officials and the FBI the flaws in demanding that software and tech devices have encryption that can be bypassed.
Everybody who isn't a citizen of the United Kingdom will still have to worry that the security of their own devices and communication tools could be deliberately compromised by tech companies building in encryption back doors so they're prepared to comply with British government data demands. The law also permits serving these notices on tech companies based outside the United Kingdom that operate within its borders. These demands affect people outside the United Kingdom in very significant ways.
Oh, and one other important detail: A company that receives one of these orders cannot tell anybody it has gotten one unless the government gives it permission. So a company could be made to compromise its own security and encryption without ever warning its own customers.
Since the law is not yet in operation, it's not yet clear what will happen to tech companies that implement "end-to-end" encryption, designed so that the company itself cannot access the data or information communicated through its apps or tools. The law does require the government official to consider the "technical feasibility of complying" with the demand.
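The end-to-end case can be sketched the same way. Here the key is exchanged between the clients and never touches the server, so the most the service can hand over is ciphertext. (Again a toy illustration: repeating-key XOR with a pre-shared key stands in for a real end-to-end protocol such as Signal's, and the names are invented for the example.)

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy repeating-key XOR; applying it twice with the same key decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

class E2EService:
    """The server: stores whatever bytes clients upload, holds no keys."""

    def __init__(self):
        self._blobs = {}

    def upload(self, msg_id: str, blob: bytes) -> None:
        self._blobs[msg_id] = blob

    def comply_with_notice(self, msg_id: str) -> bytes:
        return self._blobs[msg_id]  # ciphertext is all it has to give

# The key exists only on the two clients, never on the server.
shared_key = secrets.token_bytes(32)
service = E2EService()
service.upload("m1", xor(b"meet at noon", shared_key))

handed_over = service.comply_with_notice("m1")
assert handed_over != b"meet at noon"                    # server can't produce plaintext
assert xor(handed_over, shared_key) == b"meet at noon"   # only the clients can
```

Under this design, "removing electronic protection" isn't a policy the company can reverse; it's a capability the company simply doesn't have.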
Perhaps the law will push more tech companies to adopt end-to-end encryption so they can tell the United Kingdom that it's simply not "feasible" for them to comply with a decryption order. If so, it's appropriate, given the broad surveillance power the United Kingdom has claimed for itself, to start worrying whether the next step will be to prohibit end-to-end encryption on communication tools within the country.
Ultimately, such a law isn't going to be effective at stopping smart criminals intent on secrecy. There are plenty of third-party encryption tools created outside the United Kingdom, realistically beyond its control, that advertise the fact that they have no back doors to bypass. That means those really at risk of government snooping under this system are the average joes who don't think they'll ever be targets of this law because they aren't terrorists, and who don't realize the law is much, much broader than what they've been told.
This is perhaps why Snowden, in response to Donald Trump's election, noted that surveillance and online privacy issues are bigger than one particular politician or government.