Apple Delays Plan To Snoop on Users' Pictures
An encryption back door will lead to abusive authoritarian surveillance—even if you present it as a way to stop child porn.
A controversial plan by Apple to scan users' photos for the possible presence of child pornography has been shelved (for now) after tech privacy advocates warned of a future of oppressive government surveillance.
In early August, Apple announced it was rolling out a tool that would scan everybody's iPhone and iPad photos as they were uploaded to iCloud storage. Apple promised that no human would actually look at the photos unless an automated matching system determined that they corresponded to known images of child pornography. If the system found child porn, Apple would disable the account and send a report to the National Center for Missing and Exploited Children, a quasi-private nonprofit created and primarily funded by Congress.
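Apple's actual design relied on a proprietary perceptual-hashing scheme (NeuralHash) plus cryptographic machinery that isn't reproduced here, but the basic idea of checking uploads against a database of fingerprints of known images can be sketched roughly as follows. This is an illustrative, hypothetical example only; the directory name, the sample hash value, and the use of a plain SHA-256 digest as a stand-in for a perceptual hash are all invented for demonstration.

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints for known illegal images.
# In Apple's proposed system, this role was played by perceptual hashes
# supplied by child-safety organizations, not SHA-256 digests of exact files.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # made-up value
}

def fingerprint(photo_path: Path) -> str:
    """Return a SHA-256 digest of the file's bytes (a crude stand-in for a
    perceptual hash, which would also match resized or re-encoded copies)."""
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()

def scan_upload(photo_path: Path) -> bool:
    """Flag the photo if its fingerprint appears in the known-bad set."""
    return fingerprint(photo_path) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    # Hypothetical staging directory for photos queued for iCloud upload.
    for path in Path("icloud_upload_queue").glob("*.jpg"):
        if scan_upload(path):
            print(f"Match found, queueing report for review: {path.name}")
```

Nothing in such a pipeline is specific to child abuse imagery: swap in a different list of fingerprints and it flags whatever images the list's supplier wants flagged, which is precisely the concern critics raised.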
The announcement was greeted with shock and dismay by tech privacy and security organizations like the Electronic Frontier Foundation (EFF). One massive problem is that there's no reason to believe that, once implemented, this system would remain confined to child pornography. A technology that can be programmed to detect images of child pornography can be programmed to detect other types of images if Apple is pressured or ordered to do so by a government. Authoritarian leaders could use it to figure out who is transmitting images they don't approve of, including those associated with protests and activism. Will Chinese authorities demand to know who has images of Winnie the Pooh on their phones, given President Xi Jinping's apparent sensitivity to the comparison?
Apple had initially planned to implement the scanning system with an August patch. But on Friday, in a message to The Verge, the company announced it was delaying the plan. "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," a representative from Apple told The Verge.
Federal law already requires Apple to report Child Sexual Abuse Material (CSAM) to the authorities whenever the company finds any. But that same law clearly states that platforms and providers are not required to monitor users' content or "affirmatively search" for CSAM images, which is exactly what Apple was proposing to do.
Apple's plan made a policy of violating users' privacy just to make sure they weren't breaking the law. If the U.S. government actually attempted to scan all our photos just to be certain we weren't storing illegal images of child pornography, we would see it as the massive Fourth Amendment violation that it is. Government is not supposed to be looking through our private files absent suspicion of criminal activity and a warrant.
But data held by third parties lacks such strong constitutional protection, so the government pressures companies like Apple to hand over this information as a way of bypassing those safeguards. And in other countries, like China, where such protections are simply nonexistent, Apple has given governments even broader access to user information.
If Apple moves forward with this plan, the consequences could be huge and dangerous: a mechanism for worldwide mass surveillance, an official policy of an encryption back door designed not to look like one. The problem with back doors that bypass users' privacy protections is that once they exist, nobody can guarantee that only the "right" people with the "right" intentions will access them, or that they will do so only under lawful circumstances. Encryption back doors will ultimately be abused, both by criminals and by government officials (and sometimes those are the same people).
As such, EFF Executive Director Cindy Cohn responded to Apple's announcement by saying a delay isn't good enough. Her organization wants Apple to drop the program entirely:
The responses to Apple's plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.