The Volokh Conspiracy


Fourth Amendment

"Apple Plans to Scan US iPhones for Child Abuse Imagery"

From the Financial Times (Madhumita Murgia & Tim Bradshaw):

Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people's personal devices….

The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US….

The proposals are Apple's attempt to find a compromise between its own promise to protect customers' privacy and ongoing demands from governments, law enforcement agencies and child safety campaigners for more assistance in criminal investigations, including terrorism and child pornography….

"It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of … our phones and laptops," said Ross Anderson, professor of security engineering at the University of Cambridge.

Although the system is currently trained to spot child sex abuse, it could be adapted to scan for any other targeted imagery and text, for instance, terror beheadings or anti-government signs at protests, say researchers. Apple's precedent could also increase pressure on other tech companies to use similar techniques….

It would be important to learn just how much government pressure Apple was under to implement such a feature, or even whether the government actively solicited this (even in the absence of coercive pressure). Some courts have concluded that the Fourth Amendment applies even to private searches if the police "instigated" or "encouraged" the search, and the private entity "engaged in the search with the intent of assisting the police"; see also, for instance, this decision and this nonprecedential decision. The Supreme Court's Skinner v. Railway Labor Executives' Ass'n points in that direction as well (though the program there had some special features, such as removal of legal barriers to the searches). Other courts, though, conclude that mere "governmental encouragement of private 'searches'" isn't enough, and that the private search becomes government action covered by the Fourth Amendment only if there is compulsion (perhaps including subtle compulsion).

Note, though, that there's also a twist here: The Court has held that police drug dog sniffs of luggage aren't "searches" for Fourth Amendment purposes because they "disclose only whether a space contains contraband" (setting aside the possibility of drug dog error), and thus don't invade any legitimate privacy interest. Could hash-value-based scanning be treated the same way, so that even if Apple's program is deemed government action subject to the Fourth Amendment, it wouldn't count as a "search"? That's unsettled; see United States v. Miller (6th Cir. 2020):

Did the hash-value matching "invade" Miller's reasonable expectation of privacy? According to the Supreme Court, binary searches that disclose only whether a space contains contraband are not Fourth Amendment "searches." Illinois v. Caballes (2005). The Court has held, for example, that the government does not invade a reasonable expectation of privacy when a police dog sniffs luggage for drugs. United States v. Place (1983). Yet the Court has also held that a thermal-imaging device detecting the heat emanating from a house invades such an expectation because it can show more than illegal growing operations (such as the "hour each night the lady of the house takes her daily sauna and bath"). Kyllo v. U.S. (2001). Which category does hash-value matching fall within? Is it like a dog sniff? Or a thermal-imaging device? We also need not consider this question and will assume that hash-value searching counts as an invasion of a reasonable expectation of privacy. Cf. Richard P. Salgado, Fourth Amendment Search and the Power of the Hash, 119 Harv. L. Rev. F. 38 (2005).
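To make the technology concrete: hash-value matching reduces a file to a short digital fingerprint and then checks that fingerprint against a list of fingerprints derived from known illegal images. Here is a minimal sketch in Python; the SHA-256 hash and the placeholder database are my illustrative assumptions (deployed systems, such as Microsoft's PhotoDNA, use perceptual fingerprints that survive resizing and re-encoding, which an exact cryptographic hash does not):

```python
import hashlib
from pathlib import Path

# Hypothetical database of hex-encoded SHA-256 digests of known images.
# In practice the entries would come from NCMEC or a similar clearinghouse.
KNOWN_HASHES = {
    "0" * 64,  # placeholder digest, not a real entry
}

def file_sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_image(path: Path) -> bool:
    """Binary answer: does this file's fingerprint appear in the known set?

    Note the Fourth Amendment angle: the check reveals nothing about the
    file except its membership in the contraband list -- arguably more like
    a dog sniff than a thermal imager.
    """
    return file_sha256(path) in KNOWN_HASHES
```

The point of the sketch is the binary nature of the answer: the matching process learns whether the file is on the list, and nothing else about its contents.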

If any of you know more about the governmental involvement in this decision, or for that matter the broader state action law related to such searches, please let me know. Thanks to Christopher Stacy for the pointer.

UPDATE: Here's Apple's announcement:

Another important concern is the spread of Child Sexual Abuse Material (CSAM) online. CSAM refers to content that depicts sexually explicit activities involving a child.

To help address this, new technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.

Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user's account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.

This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.
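A technical aside on two of the cryptographic terms in Apple's announcement. "Private set intersection" lets two parties learn which items their sets share without disclosing anything else; blinding the hash list with a secret key is also what makes the on-device database "unreadable." Apple's announcement doesn't include code, so the sketch below is a classic Diffie-Hellman-style toy version, not Apple's actual protocol; the group parameters and the stand-in hash lists are my assumptions:

```python
import hashlib
import secrets

# Toy parameter: a Mersenne prime modulus. Real protocols use
# standardized elliptic-curve groups; this is purely illustrative.
P = 2**127 - 1

def hash_to_group(item: str) -> int:
    """Map an item (e.g., an image fingerprint) to a group element mod P."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

def blind(items, secret):
    """Blind a set of items with a private exponent."""
    return {pow(hash_to_group(x), secret, P) for x in items}

# Each side picks a private exponent.
apple_key = secrets.randbelow(P - 2) + 1
device_key = secrets.randbelow(P - 2) + 1

apple_db = {"hashA", "hashB", "hashC"}  # stand-in for the known-CSAM hash list
device_photos = {"hashC", "hashD"}      # stand-in for the user's photo hashes

# Round 1: each side blinds its own set and sends it across.
apple_blinded = blind(apple_db, apple_key)
device_blinded = blind(device_photos, device_key)

# Round 2: each side re-blinds what it received with its own key.
# Because exponentiation commutes, shared items collide.
apple_double = {pow(v, apple_key, P) for v in device_blinded}
device_double = {pow(v, device_key, P) for v in apple_blinded}

matches = apple_double & device_double
print(f"items in common: {len(matches)}")  # 1 (the stand-in 'hashC')
```

One caveat: this textbook version reveals the intersection to both sides, whereas Apple's variant, per the announcement, encodes the match result in the safety voucher rather than revealing it on the device.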
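"Threshold secret sharing" can be illustrated the same way. The textbook construction is Shamir's scheme: a secret (here, think of the key needed to read the safety vouchers) is split into shares such that any t of them reconstruct it and fewer than t reveal nothing. If each matching image contributes one share, Apple can recover the key only once an account crosses t matches, which is the behavior the announcement describes. This is my own minimal implementation over a small prime field, not Apple's code; the threshold and share counts are made up:

```python
import secrets

PRIME = 2**61 - 1  # field modulus (a Mersenne prime; illustrative size)

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` reconstruct it."""
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = secrets.randbelow(PRIME)      # stand-in for the voucher decryption key
shares = make_shares(key, threshold=30, count=100)

assert recover(shares[:30]) == key  # 30 matches: key recoverable
assert recover(shares[:29]) != key  # 29 matches: wrong, with overwhelming probability
```

Below the threshold, the shares are information-theoretically useless, which is what backs the announcement's claim that Apple "cannot interpret" the vouchers until an account crosses the line.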