Airports Are Experimenting with Facial Recognition Tech
The US Transportation Security Administration (TSA) confirmed in December that it would be testing a new facial recognition system at some of the nation’s largest airports.
Sixteen airports including Boston, Atlanta, Los Angeles, and Denver are already using the system – in which artificial intelligence (AI) determines whether a passenger’s face matches their photo ID.
Those who pass the test are allowed to continue towards their terminal, while those who are flagged as potential mismatches are directed to speak with a TSA agent. Though accuracy data has yet to be released, TSA Administrator David Pekoske claims the face-scanning system makes fewer mistakes than a human would.
Given the relative safety of flying as a method of transportation, it is unclear why the TSA feels the need to install this controversial system at airports. The only benefits I see here are time saved (for passengers) and the ability to operate with fewer staff (for the TSA).
TSA claims passengers will be allowed to opt out of the program, though it is unclear if travelers will feel as if they have a choice.
“What we often see with these biometric programs is they are only optional in the introductory phases – and over time we see them becoming standardized and nationalized and eventually compulsory,” explains Albert Fox Cahn, founder of the Surveillance Technology Oversight Project. “There is no place more coercive to ask people for their consent than an airport.”
Speaking with The Washington Post, Cahn also expressed concerns regarding racial bias: “I am worried that the TSA will give a green light to technology that is more likely to falsely accuse Black and Brown and nonbinary travelers and other groups that have historically faced more facial recognition errors.”
For years, studies have shown that facial recognition technology is less accurate when scanning Black, Brown, and Asian faces than white faces.
TSA official Jason Lim, who works directly with the AI program, says the agency is “very satisfied with the performance of the machine’s ability to conduct facial recognition accurately” and confirmed that “demographic equitability” is a “significant element in [the agency’s] testing.”
The AI system, formally known as Credential Authentication Technology (CAT), saw an increase in use during the pandemic when travelers sought out contactless options at airports. The TSA has already promised the public that it won’t store the biometric data collected through face scans, but there are no laws preventing the agency from doing so in the future.
As we’ve seen time and time again, government agencies that claim not to be storing data have that information stored somewhere. And with mass data collection comes the risk of infiltration by foreign players.
The use of CAT at airports is supported by President Joe Biden, who claims it would “enhance security effectiveness, operational efficiency, and the passenger experience.”
While this may be true, any benefits come at a steep price. Not only is this program a major breach of privacy for law-abiding citizens, but it carries a high risk of abuse. If this program is adopted nationwide, we can assume that the biometric data collected at airports will eventually be combined with government databases. As we’ve seen in China and Russia, this data can be used to identify and silence political opponents, dissidents, protestors, and critics. Though Sleepy Joe is unlikely to promote the use of personal data in such a way, there’s no telling what a future administration may choose to do with this sort of information.
Editor’s Note: You can say that, yes, the government has been discreetly using facial recognition for quite a while in airports. That is true. And you cannot fly without a government ID. But normalizing this gross breach of privacy is not the answer.
In my opinion, the purpose of airport security is to ensure that people do not take over the airplane. You do not need to identify someone to make sure they are not carrying weapons or explosives. If the machines are not able to tell, then we need better machines. An airline should be like a bus: you pay cash, you step on, and you step off someplace else.
I object to government agencies using airport security to restrict the travel of people they do not like. That has Big Brother (or CCP) written all over it.