TSA's Facial Recognition Tech Raises Questions About Bias and Data Security
Analysts and lawmakers are concerned about a new TSA program that instructs passengers to insert their IDs into a machine and takes a picture of them.

The Transportation Security Administration's (TSA) trial rollout of biometric facial recognition technology at airport security checkpoints has raised questions about the risks it poses to travelers' privacy and the possibility of discrimination.
The TSA's program gives passengers the option to insert their ID into a machine and look at a camera until a screen below flashes "Photo Complete," after which the passenger proceeds through the checkpoint. According to Jason Lim, the TSA's identity management capabilities manager, the camera turns on only when a person places their ID in the scanner, and the images are deleted afterward.
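What the article describes sounds like a 1:1 verification of the live photo against the photo on the presented ID rather than a search of a larger gallery. The sketch below is a minimal, hypothetical illustration of that flow under that assumption; the embedding model, threshold, and function names are placeholders, not the TSA's actual system.

```python
# Hypothetical sketch of a 1:1 checkpoint verification flow, assuming the
# process described above: the camera activates only once an ID is scanned,
# the live capture is compared against the ID photo, and the live image is
# discarded afterward. The embedding model is a random stand-in.
import numpy as np

EMBEDDING_DIM = 512      # typical face-embedding size (assumption)
MATCH_THRESHOLD = 0.6    # illustrative cutoff; real systems tune this per model

def extract_embedding(image: np.ndarray) -> np.ndarray:
    """Placeholder for a face-embedding model; returns a unit vector."""
    rng = np.random.default_rng(int(image.sum()) % 2**32)
    vec = rng.normal(size=EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)

def verify(id_photo: np.ndarray, live_photo: np.ndarray) -> bool:
    """1:1 match: cosine similarity between the ID photo and the live capture."""
    similarity = float(extract_embedding(id_photo) @ extract_embedding(live_photo))
    return similarity >= MATCH_THRESHOLD

def checkpoint_flow(id_photo: np.ndarray, capture_frame) -> bool:
    """Camera runs only after an ID is presented; the live frame is dropped after use."""
    live_photo = capture_frame()   # camera turns on here
    try:
        return verify(id_photo, live_photo)
    finally:
        del live_photo             # "Photo Complete" -- local image discarded
```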
The TSA is testing the technology at 16 major airports, and the agency insists that the program is voluntary. However, in a March 14 interview with Kyle Arnold of The Dallas Morning News, TSA Administrator David Pekoske said that if the TSA gets its way, biometric screening technology will eventually not be optional.
But even without mandating facial recognition, fear of delays or poor treatment by TSA staff may lead travelers to submit when they would rather not. "Whenever there is a power imbalance between parties, consent is not really possible," says Meg Foster, a justice fellow with the Georgetown Law Center on Privacy and Technology. Foster also questions how realistic opting out is in practice: "How does TSA expect them to see and read an inconspicuous notice, let alone tell a TSA agent they want to opt out of face recognition? Especially if it may not be clear to them what the consequences of opting out will be."
The TSA stresses that, because of its use of advanced camera technology, there is no "discernable difference in the algorithm's ability to recognize passengers based on things like age, gender, race and ethnicity." But past research paints a different picture. In a 2018 study of three commercial gender classification algorithms, researchers with the Gender Shades Project at the Massachusetts Institute of Technology found that "darker-skinned females are the most misclassified group." An independent assessment by the National Institute of Standards and Technology in 2019 likewise found that face recognition technologies are "least accurate on women of color."
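The method behind those findings is simple to describe: instead of reporting one aggregate accuracy number, the studies break error rates out by demographic group. Below is a rough sketch of that kind of audit with fabricated records; the disparity in the sample is only loosely modeled on the ranges Gender Shades reported and is not data from either study.

```python
# Hypothetical demographic audit: compute misclassification rates per group
# instead of one aggregate number. The records below are fabricated for
# illustration; they are not data from Gender Shades or NIST.
from collections import defaultdict

def error_rate_by_group(records):
    """records: iterable of (group_label, predicted, actual) tuples."""
    errors, totals = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

sample = (
    [("lighter-skinned men", "m", "m")] * 99
    + [("lighter-skinned men", "f", "m")] * 1
    + [("darker-skinned women", "f", "f")] * 65
    + [("darker-skinned women", "m", "f")] * 35
)
print(error_rate_by_group(sample))
# {'lighter-skinned men': 0.01, 'darker-skinned women': 0.35}
```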
Additionally, the TSA puts travelers' privacy at risk by collecting personal data and sending it to the Department of Homeland Security, even though the agency asserts that the data is anonymized, encrypted, and eventually deleted.
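The agency's data-handling claim (encrypt the captured image, send it on for evaluation, delete the local copy) amounts, in rough terms, to something like the sketch below. The TSA's actual pipeline is not public; this uses the Fernet symmetric-encryption recipe from the third-party `cryptography` package purely as a stand-in, and every name here is illustrative.

```python
# Hypothetical sketch of the "encrypt, transmit, delete" claim. Requires the
# `cryptography` package; in practice keys would be managed by the agency,
# not generated at the checkpoint.
from cryptography.fernet import Fernet

def package_for_transmission(image_bytes: bytes, key: bytes) -> bytes:
    """Encrypt a captured image before it leaves the checkpoint."""
    return Fernet(key).encrypt(image_bytes)

key = Fernet.generate_key()
capture = b"stand-in bytes for a camera frame"
payload = package_for_transmission(capture, key)
# send(payload)  -- transmission to DHS is outside the scope of this sketch
del capture      # local copy deleted after packaging
```

Even in this best-case sketch, the image and the key exist somewhere during collection, which is the window critics point to.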
The Cybersecurity and Infrastructure Security Agency noted last year that the federal government is failing to recruit enough cyber talent, making the government's information security vulnerable to hacks. Recent years have seen numerous data breaches of federal agencies, including a breach of the U.S. Department of Transportation just last week.
As William Owen, the communications director at the Surveillance Technology Oversight Project, explains, "Even if the plan is to eventually delete all photos and IDs, the data collection during the current pilot program leaves more than enough time to put passengers' most sensitive information at risk."
These issues led a group of five senators to write a letter to Pekoske in February of this year about the TSA's "alarming use of facial recognition technology" at American airports. "Americans' civil rights are under threat when the government deploys this technology on a mass scale, without sufficient evidence that the technology is effective on people of color and does not violate Americans' right to privacy," the letter stated.
Sens. Jeff Merkley (D–Ore.), Cory Booker (D–N.J.), Bernie Sanders (I–Vt.), Ed Markey (D–Mass.), and Elizabeth Warren (D–Mass.) called for an immediate "halt" of the TSA's tech deployment until questions about discrimination, transparency, and data storage are answered. Similarly, Jeramie Scott of the Electronic Privacy Information Center recommended an outside audit to determine "that the technology isn't disproportionally affecting certain groups and that the images are deleted immediately."
Like all TSA techniques, facial recognition is just one more piece of security theater. Requiring passengers to partially undress, limiting how much shampoo they can fly with, and subjecting them to random searches have not made the skies safer. Using an algorithm to match their faces to government identification won't either. But unlike those other indignities, it will potentially expose them to a government data breach.
Here’s what happens when Reason lets an intern write an article: Opinions on the efficacy of a technology are based on four- and five-year-old studies that used old systems.
Face recognition really is objectively harder for dark skin tones.
They could use iris patterns, vein patterns, retinal patterns, or fingerprints instead; those don't depend on skin tone.
That is your concern? Priorities...
Why didn't you provide better evidence of the efficacy?
All you white people look alike
No, not at all.
Some of us are fat, some are really fat, and some are huge.
But enough about groomer Jeffy.
>>The Transportation Security Administration
Central Data Intake
Well played.
Pretty much my thought.
How long will it be before they (admit to) stop(ping) deleting the new pics and use them to continually update their database?
About three days after the program becomes involuntary.
The upside of this is, at least I won't have to pretend to appreciate the "good work" the first TSA goon is doing by checking "your papers, please".
Let's not stand in the way of AI research. Just... let it happen bruh.
This can't possibly blow up in our face.
Finally, scientific proof that they all look alike.
But enough about the Irish.
So, we've long since conceded the 4th, 9th and 10th Amendments and we're just concerned about data breaches of data the government has gathered on us.
So black face in the airport is now fine for libertarians. I have a flight coming up and don't want issues?
Shoe polish?
Maybelline? (unless there's a DMulvnry boycott)
"How does TSA expect them to see and read an inconspicuous notice, let alone tell a TSA agent they want to opt out of face recognition? Especially if it may not be clear to them what the consequences of opting out will be."
WTF do you think happens now at TSA checkpoints? You show your super-official photo ID and a human uses that to recognize your face. What is so different about a machine doing that?
And if you are paranoid about digital storage of your image, or just your presence at the airport, you probably don't want to know what is already happening.
Drive your car with fake license plates (have several sets) and pay for everything on your way with cash. Not surefire, but...
If the police pull you over they'll seize your cash because it's obviously for drugs.
That's true also for humans. Turns out, dark skin just makes facial recognition harder because... physics. That's why many populations of humans evolved towards light skin tones and why humans generally prefer light skin tones, across the globe, in all populations.
Is that the institutional racism stuff I keep hearing about?
"Is that the institutional racism stuff I keep hearing about?"
Personally, I am more concerned with the "institutional stupidity" I see everywhere, all around me, all the time.
I dunno. Is "natural evolution" an "institution"?
Just get rid of the TSA. "problem" solved.
It is a pain in the ass sometimes, but I have avoided all airports since they became constitution-free zones.
The TSA was just a federalization of airport security. There is no need for it.
Facial recognition technology should only be used to harass White people (naturally, to weed out insurrectionists, Christians, and homophobes).
IOW, almost all white people.
Gawds I hate agreeing with these clowns, but something about a broken clock comes to mind.