Computers, Privacy & the Constitution

Government Usage of Facial Recognition Technology: A Threat to Privacy and Constitutional Rights

-- By MoneshDevireddy - 04 Mar 2024

Facial recognition technology (FRT) compares two or more images of faces to determine whether they depict the same person. Its contemporary use by government agencies has disturbing implications for society, threatening our reasonable expectation of privacy and other important constitutional rights.
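To make concrete what this comparison involves: modern FRT systems typically map each face image to a numeric "embedding" vector using a trained model, then declare two faces a match when their embeddings are sufficiently similar. The sketch below is a minimal, hedged illustration of that pipeline in Python; the embed() function is a toy stand-in (real systems use deep neural networks trained on millions of faces), and the similarity threshold is an arbitrary placeholder, not any agency's or vendor's actual parameter.

```python
import numpy as np

def embed(face_image: np.ndarray) -> np.ndarray:
    """Toy embedding: normalized, flattened pixel values. A real FRT
    system would use a deep neural network here; this crude stand-in
    only illustrates the shape of the pipeline."""
    v = face_image.astype(np.float64).ravel()
    return v / (np.linalg.norm(v) + 1e-12)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # 1.0 means the embeddings point the same way; lower means less alike.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def same_person(img_a: np.ndarray, img_b: np.ndarray,
                threshold: float = 0.95) -> bool:
    """One-to-one verification: do two photos show the same person?
    The threshold is illustrative; deployments tune it, trading
    false matches against false non-matches."""
    return cosine_similarity(embed(img_a), embed(img_b)) >= threshold

# Illustrative use with random arrays standing in for face photos.
photo_a = np.random.rand(64, 64)
photo_b = np.random.rand(64, 64)
print(same_person(photo_a, photo_b))
```

In deployed systems the threshold choice is consequential: lowering it yields more candidate matches, and more false matches, which matters when the output drives an investigative lead or an arrest.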

I. Facial Recognition Technology and Government Usage

FRT is widely used at the federal and state levels of government, from the FBI to local law enforcement agencies, and even at airports and other transit hubs to facilitate travel and protect national security. The FBI's facial recognition system draws on criminal and civil databases, including visa applicant photos, mugshots, and driver's license photos; because of that reach, law enforcement face recognition affects more than 117 million American adults, putting roughly one in two adults in a face recognition network. Despite these staggering statistics, FRT remains largely unregulated, and government agencies deploy it without democratic oversight or transparency about specific uses and technical limitations. That must change if we are to protect our civil liberties and constitutional rights.

II. Privacy

The Fourth Amendment protects our reasonable expectations of privacy by guarding against unreasonable searches and seizures by the government. Academics evaluating FRT's Fourth Amendment implications have come out on both sides, some concluding that FRT threatens our right to privacy and others that it does not. Those who take the latter position rely heavily on Katz v. United States, in which the Supreme Court observed that what a person knowingly exposes to the public is not subject to Fourth Amendment protection. On that view, we subject our faces to identification and examination whenever we step into public places, so facial recognition technology invades our privacy no more than would a passerby looking us over. And an individual vehemently opposed to even a brief scan of his or her face can don a mask as an extra precaution, a practice that has become a social norm since the COVID-19 pandemic.

But this cursory analysis does not paint the whole picture. With CCTV cameras recording much of our public movement, police body cameras, and other potential surveillance media, FRT can be put to far more invasive use. One such application has been termed "dragnet, real-time" use, in which law enforcement deploys FRT in a "suspicionless and surreptitious" way, scanning people's faces as they go about their day and matching them against databases of identifying information such as name, address, criminal history, and immigration status. This usage strains the Katz analysis: while we may consent to superficial inspection of our faces in public settings, we do not invite strangers to deduce intimate details about our work and personal lives from that inspection.
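To illustrate why dragnet use differs in kind from a passerby's glance, here is a hedged sketch of one-to-many identification: a face captured from a video frame is searched against an entire gallery of enrolled identities, each linked to the kinds of records described above. The GALLERY structure, its field names, and the threshold are hypothetical placeholders, not any real system's design.

```python
import numpy as np

# Hypothetical watchlist: each enrolled identity pairs a stored face
# embedding with the linked records a deployed system might surface.
GALLERY = {
    "subject_001": {"embedding": np.random.rand(128),
                    "record": {"name": "(name)", "address": "(address)",
                               "history": "(criminal/immigration records)"}},
    "subject_002": {"embedding": np.random.rand(128),
                    "record": {"name": "(name)", "address": "(address)",
                               "history": "(criminal/immigration records)"}},
}

def identify(probe: np.ndarray, threshold: float = 0.8) -> list:
    """One-to-many search: return every enrolled identity whose stored
    embedding is similar enough to the probe face, best match first."""
    hits = []
    for ident, entry in GALLERY.items():
        e = entry["embedding"]
        sim = float(np.dot(probe, e) /
                    (np.linalg.norm(probe) * np.linalg.norm(e)))
        if sim >= threshold:
            hits.append((ident, sim, entry["record"]))
    return sorted(hits, key=lambda h: h[1], reverse=True)

# Every face in every video frame can be run through identify().
print(identify(np.random.rand(128)))
```

Run continuously over live camera feeds, a loop like this turns every public appearance into a database query, which is precisely the suspicionless, surreptitious quality the dragnet critique targets.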

Other judicial holdings relevant to this issue include Carpenter v. United States, which recognized that the use of advanced technologies for prolonged and sustained surveillance of a person's public activities may raise Fourth Amendment concerns, and Kyllo v. United States, which held that a search occurs when the government uses a device not in general public use to obtain information it could not otherwise have gathered without intruding on a constitutionally protected area. More information about how exactly government agencies use FRT is therefore needed to determine whether that use fits within our jurisprudence concerning privacy and surveillance.

III. Equal Protection

Aside from Fourth Amendment concerns, facial recognition technology has serious potential to undermine our equal protection rights. The United States is no stranger to discriminatory applications of its policing power, and FRT could become the latest tool for law enforcement to perpetuate racist and anti-activist surveillance, widening pre-existing inequalities. Professor Amanda Levendowski terms this inequitable application of FRT, one that disadvantages certain minority communities, "deployment bias": allowing the government to "weaponiz[e] surveillance technologies, such as face surveillance, against marginalized communities…render[ing] their movements hypervisible to law enforcement." Granting the government this capability without proper safeguards threatens not only racial minorities but also other marginalized populations, such as undocumented immigrants (targeted by ICE) or Muslim citizens (targeted by, for example, the NYPD). Whether we attack FRT on equal protection grounds or on another theory, efforts must be made to curb the technology's capacity to deepen inequities in our society.

IV. The Road Ahead

It is now time to consider prophylactic measures that can prevent a quasi-Orwellian state of affairs in which the government can access intimate details of our personal, vocational, and criminal backgrounds as easily as it can run our license plates.

First, proper legislative oversight of FRT must be implemented before its use spreads further unchecked. Regulatory agencies should collaborate with experts from diverse fields, including law, ethics, and technology, to develop guidelines that prioritize privacy protection and mitigate discriminatory outcomes. These regulations should encompass transparent data collection practices, stringent security measures, and accountability mechanisms to hold entities responsible for FRT misuse or abuse.

Furthermore, enhancing transparency surrounding FRT is essential to build public trust and accountability. Entities deploying FRT should provide clear information about its usage, including the purposes, methodologies, and potential risks involved. Transparency reports detailing data practices, algorithmic biases, and performance metrics should be regularly published to enable independent auditing and scrutiny. Moreover, accountability mechanisms such as oversight boards or independent regulators should oversee FRT deployment to ensure compliance with legal and ethical standards.

Finally, respecting individuals' rights to autonomy and consent is fundamental to ethical FRT deployment. Entities collecting facial data should obtain informed consent from individuals, clearly outlining the purposes and scope of data usage. And individuals should have the right to access, correct, or delete their facial data held by FRT systems, empowering them to exercise control over their personal information.

