Privacy

Lawsuit Challenges Clearview's Use of Scraped Social Media Images for Facial Recognition

Databases of involuntarily supplied identities make for a plug-and-play surveillance state.


Facial recognition technology is getting more sophisticated, more reliable, and more pervasive as the world eases its way toward becoming an all-encompassing surveillance state. That surveillance state doesn't even have to be built; it's increasingly ready for deployment as law enforcement agencies cut deals with private companies that have already assembled the tools and databases for use. As with cell phone tracking, that plug-and-play quality does an end-run around safeguards that, at least nominally, restrict government actors, and invites legal challenges based on civil liberties concerns.

"We're suing Clearview AI in California," Mijente, an immigrant rights group, announced March 9. "The facial recognition firm is dangerous. Its surveillance tool—used by 2,400+ policing agencies—chills free speech & endangers immigrants, protesters & communities of color. We won't be safe till it's gone."

Mijente joined with NorCal Resist and four individual activists in a lawsuit seeking "to enjoin Defendant Clearview AI, Inc. ('Clearview') from illegally acquiring, storing, and selling their likenesses, and the likenesses of millions of Californians, in its quest to create a cyber surveillance state."

"Clearview has built the most dangerous facial recognition database in the nation by illicitly collecting over three billion photographs of unsuspecting individuals," the plaintiffs add.

The lawsuit follows up on revelations from early last year that Clearview AI, which sells its facial recognition services to law enforcement agencies, populated its vast database by scraping images from social media services without the permission of either the posters or the hosting companies.

"Clearview AI, a tech startup, has created an app that enables law enforcement agencies to match photographs to its database of over 3 billion photos scraped from millions of public websites including Facebook, YouTube, Twitter, Instagram, and Venmo," Reason's Ron Bailey noted in January 2020. "For comparison, the FBI's photo database contains only 640 million images."

That presumptuous use of personal images breathed new life into the vestigial privacy concerns of social media executives. Companies including Facebook, LinkedIn, Twitter, and YouTube demanded that Clearview stop its invasive practices and delete the scraped images. So far, that hasn't happened.

Some government regulators also raised objections to the intrusive way the images were sourced. Clearview withdrew from the Canadian market after that country's privacy commissioner found that the company used personal photos without consent, in violation of the country's laws.

"It is completely unacceptable for millions of people who will never be implicated in any crime to find themselves continually in a police lineup," commented Daniel Therrien, Privacy Commissioner of Canada. Clearview's facial recognition tools had been in use by dozens of Canadian police agencies, including the Royal Canadian Mounted Police.

The Mijente/NorCal Resist lawsuit, which alleges that Clearview violated Californians' privacy rights under state law, follows legal action in Illinois by the American Civil Liberties Union. That lawsuit alleges "violation of Illinois residents' privacy rights under the Illinois Biometric Information Privacy Act (BIPA)."

The controversy over facial recognition comes to a head as the technology matures and becomes more reliable and easier to use. China widely deploys the technology as part of its efforts to monitor and control its population, and companies based there have become leaders in the field. Other countries have adopted China's facial recognition advances along with its totalitarian interest in identifying and punishing anti-government protesters.

Even in the United States, it's become routine to scan travelers; millions of flyers were compared by Customs and Border Protection against databases last year. The COVID-19 pandemic has spurred plans for "touchless" biometric passports, incorporating facial recognition technology at checkpoints to identify travelers.

The pandemic has also offered an opportunity to refine the technology so that it can more effectively identify subjects wearing masks, based only on the visible portion of the face around the eyes and the bridge of the nose. The Department of Homeland Security (DHS) is champing at the bit to deploy the new algorithms — initially for travelers, but ultimately for everybody who passes in front of an official camera.

"Without masks, median system performance demonstrated a ~93% identification rate, with the best-performing system correctly identifying individuals ~100% of the time," DHS boasted in January. "With masks, median system performance demonstrated a ~77% identification rate, with the best-performing system correctly identifying individuals ~96% of the time."

Access to vast databases of faces — whether derived by the FBI from driver's license records or scraped by Clearview from social media — allows surveillance efforts built on this rapidly advancing technology to compare images captured at checkpoints, or from (less reliable) scans of street scenes, against known identities.

Government agencies' use of social media images scraped by Clearview for facial recognition is reminiscent of their purchase of cell phone location data from marketing firms and telecommunications companies to track targeted individuals. In both cases, private firms compile information in ways that would raise eyebrows, or be explicitly forbidden, if done by government actors. In both cases, the over-clever end-run around civil liberties protections invites legal challenges.

"Our concern is that the Supreme Court rejected the Government's argument in Carpenter that [cell-site location information] is truly voluntarily provided to the phone carriers," the Treasury Inspector General for Tax Administration wrote last month of IRS use of cell phone location data purchased from private parties. "The Court's rationale was that phone users do not truly voluntarily agree to share the information given the necessity of phones in our society. Courts may apply similar logic to GPS data sold by marketers."

Plaintiffs in U.S. lawsuits, as well as regulators in Canada and elsewhere, say that similar concerns apply to facial recognition databases. People didn't post vacation photos so they could later be used by cops to identify suspects and scan crowd scenes. If the challenges prevail, they probably won't stop the advance of the surveillance state by themselves, but they may slow it down a bit.