Surveillance

NFL's Facial Recognition Technology Sparks Fears of Fan Surveillance

Personal data retained by government or private entities are always at risk of compromise, misuse, or access by law enforcement.


Earlier this month, a viral post on X claimed that the NFL would start using facial recognition technology (FRT) "to verify the identity of people entering [each] stadium." According to the rumor, the league had partnered with Wicket, a facial recognition software company, and would soon begin its fan surveillance program.

Wicket's chief operating officer, Jeff Boehm, told Reason in a statement that the company's FRT was being used to verify the identity of "credential holders seeking access to restricted areas of the stadium, not for ticket holders."

Boehm went on to say there were "no US Government agencies or departments currently utilizing Wicket in an operational role." He acknowledged that "Wicket has not made our source code available to any third parties."

Despite Boehm's assurances, video surveillance of fans—even if done by a private company—could be cause for concern. Once Wicket collects the data, where do they go? Who can access them?

Technology has become virtually synonymous with convenience. But what might seem convenient for an iPhone user—the ability to unlock your phone with a fingerprint or your face—can also be used by law enforcement in some jurisdictions to force you to unlock your phone against your will. This kind of scenario represents a direct threat to our Fourth and Fifth Amendment rights.

But police also have another way of getting your digital data, whether biometric or otherwise: They can buy them. With no federal legislation currently banning this practice, law enforcement agencies are always seeking out new sources of personal data on Americans that they can buy and store for future use.

While Wicket is a participant in the Face Recognition Technology Evaluation program through the National Institute of Standards and Technology, Boehm's response means that Wicket's FRT and related encryption code have not been subjected to an independent audit for accuracy and security.

Avoiding independent scrutiny is becoming increasingly difficult for commercial tech companies: major FRT vendors like Microsoft, IBM, Amazon, and Google have had their FRT failures publicly exposed, heightening concerns about the technology.

NFL Communications Director Tim Schlittner told Reason, "The NFL performed testing/trialing of Wicket to ensure its capability and accuracy. Our cybersecurity team also conducted a vendor risk assessment to ensure the technology met all of our requirements."

Yet the NFL has made public neither that testing nor its cybersecurity risk assessment of Wicket's system.

If FRT advances to the point where it can correctly identify a person with 100 percent certainty—regardless of race, facial features, gender, or any other factor that can produce false positives or false negatives—it could usher in a revolution in facility and critical systems security.

Imagine scenarios where airline aircrews, ground crews, maintenance personnel, food vendors, and others who work at airports are required to undergo FRT screening as a measure against threats. Similarly, on military bases, both uniformed and civilian personnel, along with their spouses, might be required to submit to FRT for the same reason.
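To see why those false positives and false negatives are so hard to eliminate at the same time, it helps to know that most FRT systems reduce each face to a vector of numbers (an "embedding") and declare a match when two embeddings are similar enough to clear a threshold. The following is a minimal illustrative sketch in Python with made-up numbers (it is not Wicket's or the NFL's actual system, and uses no real face data): set the threshold loosely and an unrelated look-alike can clear it, a false positive; set it strictly and a legitimate credential holder photographed in bad lighting gets rejected, a false negative.

```python
# Illustrative sketch only: a threshold-based face matcher and the two error
# types it can produce. Embeddings and thresholds below are invented.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(probe: np.ndarray, enrolled: np.ndarray, threshold: float) -> bool:
    """Declare a match whenever the similarity clears the threshold."""
    return cosine_similarity(probe, enrolled) >= threshold

# Made-up embeddings: the enrolled credential holder, the same person
# captured again in poor stadium lighting, and an unrelated look-alike.
enrolled    = np.array([0.90, 0.10, 0.40])
same_person = np.array([0.85, 0.15, 0.42])  # genuine holder, degraded capture
look_alike  = np.array([0.80, 0.20, 0.45])  # different person, similar features

for threshold in (0.985, 0.998):
    false_negative = not is_match(same_person, enrolled, threshold)  # real holder rejected
    false_positive = is_match(look_alike, enrolled, threshold)       # stranger admitted
    print(f"threshold={threshold}: "
          f"false negative={false_negative}, false positive={false_positive}")
```

Real systems use embeddings with hundreds of dimensions and carefully tuned thresholds, but the tradeoff is the same: tightening the threshold to keep impostors out also rejects more legitimate people, and the error rates can differ across demographic groups and imaging conditions.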

While technological perfection may not yet be possible, it is important to remember that regardless of the well-intentioned reasons FRT advocates provide, personal data held by government or private entities are forever at risk of compromise, misuse, or access by law enforcement.

Regarding whether the NFL shares data from information technology systems other than Wicket with law enforcement entities, Schlittner cited the NFL's existing privacy policy for its shop.

That policy currently allows for the sharing of user login and password information, precise geolocation data, and racial or ethnic information with law enforcement "as required by law or legal process," as well as "to anticipate, prevent, detect, and investigate suspected fraud, harassment, or other violations of any law, rule, or regulation" for "legal, security, or safety purposes."

The NFL will also sell that very same kind of data to law enforcement, according to the policy. For now, biometric data collected by the NFL—specifically "a scan of your facial geometry used to verify your identity"—are not up for grabs to the highest bidder. But given the incentives to monetize such data, NFL customers and fans would be well-advised to periodically check the NFL's privacy policy page for any changes.

There's no question that enactment of the Fourth Amendment Is Not For Sale Act (which passed the House 219–199 in April) would restore a proper constitutional "no probable cause, no warrant" standard for law enforcement access to commercially stored data. Yet even if the bill became law, it would do nothing to preclude the "garbage in, garbage out" problem associated with inaccurate FRT scans or other faulty biometric data.

It was a botched fingerprint analysis by the FBI 20 years ago that nearly sent innocent Oregon lawyer Brandon Mayfield to prison for years on bogus terrorism charges. More recently, false positive police FRT scans have led to personal and professional disaster for several people. According to the American Civil Liberties Union, "in nearly every one of those instances, the person wrongfully arrested was Black."

To prevent those kinds of errors and abuses, Congress should pass legislation mandating that FRT or any other biometric technology cannot be the sole basis for an arrest, as well as requiring periodic audits to purge questionable or demonstrably false biometric data from federal law enforcement databases. It's the kind of criminal justice reform that should be a high priority when the 119th Congress convenes in January 2025.