Facial recognition surveillance is here — but privacy protections are not

On Wednesday, Congress held a hearing on the “problem of visa overstays,” focusing in large part on the use of biometrics to address it.
In a little-publicized section of Trump’s Immigration Executive Order, the president ordered the Department of Homeland Security (DHS) to “expedite the completion and implementation of a biometric entry exit tracking system.” The purpose of the tracking system is to confirm who is entering and exiting the country in order to, for example, identify people who have overstayed their visas. Congress has already provided up to a billion dollars over the next ten years to implement the program.

This is a massive effort to implement a biometric surveillance program that poses serious risks to our privacy and civil liberties. The increasing use of biometric data requires a meaningful look at how we should regulate the collection, use, dissemination, and retention of biometric information, and at the safeguards necessary to prevent biometrics from becoming the basis of a mass surveillance program. A large-scale biometric surveillance program would allow the government to identify whomever it wants, whenever it wants, without the person’s knowledge or consent, and without a reason for doing so.

Currently, Customs and Border Protection (CBP), the component of DHS responsible for implementing the Biometric Entry-Exit program, has run a series of pilots to test the program. In 2015, CBP ran a facial recognition pilot to test the technology on travelers entering the country. CBP is currently expanding pilots that test the use of facial recognition on travelers exiting the country.
CBS This Morning did a piece on airport facial recognition tests for which I was interviewed for. video: https://t.co/8roGAdZhkr
— Jeramie D. Scott (@JeramieScott) June 16, 2017

Last year, CBP started testing part of the biometric exit program, known as the Traveler Verification Service (TVS), at Hartsfield-Jackson Atlanta International Airport. The agency will now expand the initial implementation of TVS to Washington Dulles International Airport and Houston George Bush Intercontinental Airport this summer.
CBP is also partnering with airlines like JetBlue and Delta to implement face recognition technology at various points in airports. JetBlue is running a self-boarding program that uses facial recognition in lieu of checking boarding passes. Delta aims to use facial recognition as part of baggage drop-off.
The airlines are selling facial recognition as a convenience feature, but it is part of a larger effort by the government to implement a biometric surveillance program. And it’s not clear whether passengers realize what they are actually signing up for. Even if some passengers are aware, there is still a lack of information about the government’s biometric entry-exit program. We don’t know, for example:
  • How exactly do these biometric tracking systems work?
  • What were the detailed findings of the reports associated with the various pilots?
  • How expansive will the biometric entry-exit program become?
  • Will these biometric tracking systems move beyond ports of entry like airports?
  • How will CBP ensure that the collection and use of biometric data will not expand beyond the original purpose?
  • What privacy and civil liberties protections are currently in place?
The organization I work for, the Electronic Privacy Information Center (EPIC), is pursuing information about these programs to better inform the public. EPIC has filed Freedom of Information Act requests regarding CBP’s use of face recognition in the agency’s biometric entry and exit programs.
Trump administration wants all US citizens flying abroad to have facial recognition scans, worrying privacy groups. https://t.co/n8Wd9Bk6cL
— The Associated Press (@AP) July 12, 2017

Transparency about these biometric surveillance programs is essential. We are heading down a path where expanding surveillance is used as a panacea for societal problems — not better immigration, economic, or foreign policies, but the continued diminishment of our privacy and civil liberties in the name of security. Inevitably, a disproportionate share of the burden created by these programs will fall on religious, ethnic, and racial minorities, but the programs (and their possible expansion) will affect us all.
This will not end well unless we take a step back and have a serious debate about the circumstances under which we, as a society, want to allow the use of biometrics for surveillance. Perhaps even more important, we need to decide how to address the use of biometrics without defaulting to an implementation that creates a mass surveillance infrastructure.
CBP has indicated that it is deleting the face scans of U.S. citizens collected during the pilots the agency is running, but this decision could easily change, and probably will, once the program is fully implemented.
Additionally, the biometric data collected will inevitably end up in other agency databases, like the FBI’s Next Generation Identification biometric database. This already occurs with the Transportation Security Administration’s PreCheck program: when travelers sign up and submit their biometric information for PreCheck, the data is disseminated to and retained in DHS and FBI biometric databases for use as those agencies see fit.
Facial recognition rules could transform future of marketing: https://t.co/xxdIhbGQXA pic.twitter.com/jIEJvMOEt1
— The Hill (@thehill) March 18, 2016

Many, if not all, of these databases will be exempted from Privacy Act safeguards that require accurate information, the ability to correct the information in the database, and a restriction that the information collected be relevant to the purpose of the database.
All of this is possible because of the lack of laws and legal precedent regulating the collection, use, dissemination, and retention of biometric data. That is why, when CBP says it is dedicated to protecting privacy, it rings a little hollow: there is very little the agency could do at this point that would clearly violate any privacy-protecting law.
We need to implement regulations for the use of biometrics, particularly facial recognition and similar biometrics that can be captured remotely, without the subject’s knowledge, and on a mass scale. Biometrics like face recognition can easily be used for mass, indiscriminate surveillance of the public. Given those risks, we should not wait until a vast biometric surveillance infrastructure is in place to decide the policies governing its use.
Biometric surveillance poses serious privacy and civil liberties risks, and we need to respect those risks and address them now.
Jeramie D. Scott is the Electronic Privacy Information Center's (EPIC) National Security Counsel and Director of the EPIC Domestic Surveillance Project. His work focuses on privacy issues implicated by domestic surveillance programs with a particular focus on drones, cybersecurity, biometrics, and social media monitoring. Follow him on Twitter @JeramieScott.
The views expressed by contributors are their own and not the views of The Hill. 