Facial Recognition Technology: Current Capabilities, Future Prospects, and Governance (2024)

Suggested Citation:"4 Equity, Privacy, Civil Liberties, Human Rights, and Governance." National Academies of Sciences, Engineering, and Medicine. 2024. Facial Recognition Technology: Current Capabilities, Future Prospects, and Governance. Washington, DC: The National Academies Press. doi: 10.17226/27397.

4

Equity, Privacy, Civil Liberties, Human Rights, and Governance

The implications of the use of facial recognition technology (FRT) for equity, privacy, civil liberties, and human rights are consequential, but the terms are contested, do not have fixed, universally accepted definitions, and overlap in important ways. In the following text, they are used to capture ways in which FRT can impact a core set of interests related to freedom from state and/or private surveillance, and hence control over personal information. Importantly, harm from surveillance is distinct from harms imposed by faulty or inadequate technical specifications and also distinct from harms that are measured in terms of their effects on diversity, equity, inclusion, and accessibility. In other words, although some potential FRT harms arise from errors or limitations in the technology, other potential harms arise and become more salient as the technology becomes more accurate and capable. Furthermore, it is important to emphasize that FRT can interfere with and substantially affect the values embodied in privacy, civil liberties, and human rights commitments without necessarily violating rights and obligations defined in current statutes or constitutional provisions.

This chapter considers the following related topics:

  • The intersection of FRT with equity and race,
  • Privacy and other civil liberties and human rights concerns associated with FRT use,
  • Governance approaches for addressing these concerns,
  • Governance issues raised by the use of FRT in criminal investigations, and
  • Approaches for addressing wrongful FRT matches or overly intrusive deployment of FRT.

EQUITY, RACE, AND FACIAL RECOGNITION TECHNOLOGY

FRT intersects with equity and race in several key ways, as follows:

  • FRT exhibits phenotypical variation in false positive (FP) match rates. As discussed in Chapter 2, FRT developed in a particular region tends to overrepresent particular phenotypes in its algorithmic training sets. Many FRT systems deployed in the United States are trained on imbalanced, disproportionately White data sets. As a result, these systems yield consistently higher FP match rates when applied to racial minorities, including Black, Native American, Asian American, and Pacific Islander populations. Although overall error rates are, in absolute terms, very low in the best systems today under ideal conditions, individuals in these populations are nevertheless at higher risk of being erroneously identified by certain facial recognition systems.
  • FRT provides law enforcement with a powerful new tool for identifying individuals more rapidly, at a distance, and at greater scale and thus, depending on where and how it is used, has the potential to reinforce patterns or perceptions of elevated scrutiny by law enforcement and national security agencies, especially in marginalized communities. Put bluntly, some communities may be more surveilled than others, and increased scrutiny can lead to neighborhoods being designated as high-crime areas, a feedback loop that can further justify use of FRT or other technologies that disproportionately affect marginalized communities. Moreover, the use of FRT has raised concerns in some communities—including Black, Hispanic, and Muslim communities—reflecting in part differential intensity of past interactions with law enforcement and other government authorities.
  • Several equity issues arise from the fact that reference galleries used by law enforcement—notably those based on mugshots—do not include every possible individual of interest for a scenario and may overrepresent and over-retain individuals from particular groups. This means that
    • Differential intensity of policing can lead to differential frequency of law enforcement contacts, which leads to a differential rate of representation in law enforcement reference galleries. This effect is compounded by the fact that mugshots are not removed when cases are dropped or lead to acquittals.
    • Differential representation in galleries increases the probability of an FP match—that is, anyone in the gallery could become an FP match, and being in the gallery at all is a precondition for a false match when the match score threshold is not set high enough to reject it. Conversely, not being in the gallery—because one has never had a law enforcement contact—not only makes the chance of an FP match zero but also makes the chance of a true match zero.
  • All six known cases in which wrongful arrests have been made on the basis of FRT involve Black individuals. These incidents likely represent a very small percentage of arrests involving FRT; comprehensive data on the prevalence of FRT use, how often FRT is implicated in arrests and convictions, or the total number of wrongful arrests that have occurred on the basis of FRT use do not exist. However, these cases have significance beyond what the numbers would suggest because they have occurred against a backdrop of deep and pervasive distrust of policing by historically disadvantaged and other vulnerable populations and because all of the reported wrongful arrests associated with the use of FRT have involved Black defendants. A brief summary of the cases follows:
    • Robert Williams was arrested in 2020 for a 2018 theft of watches on the basis of an FRT identification made from a screen capture of security camera footage. He was detained for nearly 30 hours before being released on a personal bond. The detective working the case subsequently determined that Williams was not the person captured in the footage.1,2
    • Nijeer Parks was arrested by police in New Jersey in 2019 after an erroneous FRT identification. He spent 11 days in jail after being charged with aggravated assault, unlawful weapons possession, using fake identification, shoplifting, marijuana possession, resisting arrest, and leaving the scene of a crime, as well as with nearly striking a police officer with a car. He faced up to 25 years in prison before he was able to produce evidence that he was 30 miles away when the crime occurred.3
    • Michael Oliver was arrested by Detroit police in 2019 on charges of stealing a cellphone. The investigator used FRT to identify Oliver as the suspect from video of the theft. It quickly became clear, however, that a misidentification had occurred, because Oliver has visible tattoos on his arms while the individual filmed stealing the phone had none.4

___________________

1 T. Ryan-Mosley, 2021, “The New Lawsuit That Shows Facial Recognition Is Officially a Civil Rights Issue,” MIT Technology Review, April 14, https://www.technologyreview.com/2021/04/14/1022676/robert-williams-facial-recognition-lawsuit-aclu-detroit-police.

2 K. Johnson, 2022, “How Wrongful Arrests Based on AI Derailed 3 Men’s Lives,” Wired, March 7, https://www.wired.com/story/wrongful-arrests-ai-derailed-3-mens-lives.

3 K. Hill, 2020, “Another Arrest, and Jail Time, Due to a Bad Facial Recognition Match,” New York Times, December 29, https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html.
    • Randal Reid was arrested in 2022 while driving to his mother’s home in DeKalb County, Georgia, on a warrant issued in Louisiana on suspicion of using stolen credit cards. At the time of his arrest, Reid had never been to Louisiana. He was released after 6 days in detention.5
    • Alonzo Sawyer was arrested in 2022 for allegedly assaulting a bus driver near Baltimore, Maryland, after FRT labeled him as a possible match to a suspect captured on closed-circuit television (CCTV) footage.6
    • Porcha Woodruff was arrested and held for 11 hours in Detroit in 2023 for carjacking and robbery, despite the fact that she was 8 months pregnant at the time of the crime and the perpetrator was not.7

Perhaps the most detailed record has been developed by the press in the Williams case. The Williams arrest (see Box 4-1) and other cases illustrate that a combination of overconfidence in the technology, use of low-quality probe or gallery images, and poor institutional practices can lead to significant adverse impacts. In the six cases, the consequences have included false arrest and imprisonment, legal costs, interruption of normal activities of life and work, and loss of employment. Although six known wrongful arrests may seem like a small number, the lack of adequate data on law enforcement use of FRT makes it challenging to place these serious errors in a broader context. One cannot say with any confidence whether these wrongful arrests are the only such examples or, instead, the tip of the iceberg. Nor can one say with assurance whether, or how much, the increased FP rate for phenotypically dark-skinned individuals contributed to these mistakes, although it is hard to accept as mere chance that all six publicized wrongful arrests associated with FRT involved Black individuals. In several of these cases, it appears that poor FRT procedures, inadequate training, and poor police investigative processes contributed to the erroneous arrests.
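The gallery dynamics discussed above (a false positive requires gallery membership, and overrepresentation in a gallery raises a group's aggregate exposure to false matches) can be illustrated with a small simulation. This is a purely hypothetical sketch: the impostor score distribution, the decision threshold, and the 3:1 gallery imbalance are illustrative assumptions, not parameters of any real FRT system.

```python
import random
from collections import Counter

random.seed(7)

def search(gallery, threshold):
    """Simulate a 1:N search with a probe who is NOT in the gallery.

    Each gallery entry receives a simulated impostor score; entries whose
    score meets the threshold come back as (false) candidate matches.
    """
    return [person for person in gallery
            if random.gauss(0.30, 0.12) >= threshold]

# Hypothetical gallery in which group A is overrepresented 3:1.
gallery = ([{"id": i, "group": "A"} for i in range(300)]
           + [{"id": i, "group": "B"} for i in range(300, 400)])

false_hits = search(gallery, threshold=0.55)

# Every false positive is, by construction, someone in the gallery...
assert all(person in gallery for person in false_hits)

# ...so a person absent from the gallery can never be falsely matched.
outsider = {"id": 999, "group": "B"}
assert outsider not in false_hits

# With a 3:1 imbalance, group A absorbs roughly three times the false
# hits in expectation, even though per-person risk is identical here.
print(Counter(person["group"] for person in false_hits))
```

In this sketch each gallery member faces the same per-person risk; the disparity in who bears false matches comes entirely from who is in the gallery and how often, which is the committee's point about differential representation. Raising the threshold shrinks the number of false hits for everyone, while changing gallery composition shifts who bears the residual risk.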

These intersections of FRT and race occur against a backdrop of historic and systemic racial biases that influence the development of technology. One commonly cited example with relevance to FRT is the history of film photography, which for many decades was calibrated for lighter skin tones (see Box 4-2). Although much work has been done in recent decades to address this bias, adequate lighting and contrast continue to be a challenge with darker skin tones.

___________________

4 E. Stokes, 2020, “Wrongful Arrest Exposes Racial Bias in Facial Recognition Technology,” CBS News, November 19, https://www.cbsnews.com/news/detroit-facial-recognition-surveillance-camera-racial-bias-crime.

5 K. Hill and R. Mac, 2023, “Thousands of Dollars for Something I Didn’t Do,” New York Times, March 31, https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html.

6 K. Johnson, 2023, “Face Recognition Software Led to His Arrest. It Was Dead Wrong,” Wired, February 28, https://www.wired.com/story/face-recognition-software-led-to-his-arrest-it-was-dead-wrong.

7 K. Hill, 2023, “Eight Months Pregnant and Arrested After False Facial Recognition Match,” New York Times, August 6, https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html.

CIVIL LIBERTIES, PRIVACY, HUMAN RIGHTS, AND FACIAL RECOGNITION TECHNOLOGY

“Civil liberties” is not a phrase found explicitly in the U.S. Constitution or any statute. It is used generally to capture a suite of fundamental rights and freedoms that protect individuals from unjust or oppressive government conduct. In the United States, civil liberties may be thought of as those rights associated with the federal and state constitutions. These include freedom of speech, freedom of assembly, freedom of the press, the right to privacy, and the right to due process when the government acts against a person. The term “human rights” is used globally to encompass a similar set of rights as captured in United Nations and other international agreements.

FRT has the potential to impact civil liberties and human rights because it changes the scale and cost of collecting detailed data about a person’s movements and activities. Without FRT, a person can be momentarily observed in public, but tracking that person’s movements extensively over time and space is expensive and difficult enough to be practically impossible without a technical affordance associated with the individual, such as a cellphone or license plate. The proliferation of cameras—including privately operated and law enforcement–operated CCTV cameras, doorbell cameras, and smartphones—can amplify the threat that FRT poses to civil liberties and privacy. Combined, these devices make it increasingly easy to identify people from images captured of their faces. When FRT data are associated with space and time, the technology can become a means to evaluate a person’s habits, patterns, and affiliations. Similar concerns have arisen with technologies such as license plate readers and cellphone location services. Some of the use cases identified in Chapter 3 may—depending on how they are implemented, used, and governed—implicate civil and human rights in concerning ways.

Privacy and Facial Recognition Technology

Privacy is commonly understood to include the right to control one’s own personal information. This includes all forms of personal data, including, at least to some extent, personal movement and behavior in the physical world and online. Of course, when people move around in public places, they can be observed. However, as was discussed in Chapter 1, FRT has the potential to further erode privacy in public spaces because it is inexpensive, scalable, and contactless and because it is very hard to avoid without masking one’s face. Such identification and tracking impinge on privacy because of what they can reveal about a person’s habits, behaviors, and affiliations that one reasonably expects not to be shared without permission. The potential to be tracked surreptitiously also unsettles widely shared expectations that one’s movements will not be tracked or controlled in public spaces, at concert venues, at schools, and so on, when one has not done anything unlawful. Defined in this way, privacy concerns itself less directly with the substance or subject matter of the information—whether about political or religious affiliations, financial data, medical information, or sexual or reproductive information—and more with the ability to preserve individual autonomy and freedom through the control of that information. This sense of autonomy, and hence control, includes the ability of persons to preserve their anonymity, as well as to control, at least to some extent, the circumstances and audiences to which personal information is revealed. Importantly, most people understand that giving up a little control merely by moving through a public space does not mean that they have acquiesced to a complete loss of control. The fact that some inferences can be drawn about a person who moves in public does not mean that there are no privacy interests to defend.

Privacy guarantees can be found in federal constitutional provisions related to freedom of speech and association, protection against unreasonable search and seizure, and substantive due process rights protecting privacy, family, and intimate associations. State constitutions can also provide privacy protections, sometimes to a greater degree than the U.S. Constitution. Federal and state statutes, such as the Privacy Act8 and the Health Insurance Portability and Accountability Act,9 can also provide legal protections of privacy interests against both private and government actors.

Indiscriminate use of FRT in public and quasi-public places can have significant impacts for privacy and related civil liberties. Indeed, the collection of images in public places that could be subject to FRT may deter people from exercising their civil rights. FRT can be used to scan lawful protests or other large gatherings for potential or known threats. However, in the process, data would be collected on individuals who raise no legitimate law enforcement concerns. The use of FRT to identify individuals in other public or quasi-public spaces raises similar concerns—especially absent regulation or other controls on how such information is collected, stored, and used. These concerns may be heightened in locations associated with religious, political, or medical practice. Moreover, the use of FRT in public or quasi-public spaces might also have particularly adverse consequences for the privacy of individuals such as informants, undercover agents, protected witnesses, and victims of abuse. Furthermore, collected data could conceivably be sold to foreign actors, increasing exposure for U.S. citizens while abroad.

___________________

8 Privacy Act of 1974, as amended, 5 U.S.C. § 552a.

9 Health Insurance Portability and Accountability Act, Pub. L. No. 104-191, § 264, 110 Stat. 1936.


Individuals can also apply FRT to an image using an online service such as PimEyes, allowing them to identify people in images obtained on the Internet or captured using smartphone, doorbell, and other cameras. Widespread availability of such a capability alters expectations about anonymity in public and private places and is especially troubling because it can be used to identify individuals for harassment, intimidation, stalking, or other abuse. Already, one can take a photograph of someone standing nearby or across the street, run it through PimEyes, and receive a small gallery of likely matches, permitting the potential identification of one stranger by another.

Privacy concerns have also been raised regarding how the data used in FRT systems are gathered. Although many law enforcement agencies likely rely on galleries of mugshots or driver’s license photos, the leading private FRT vendor, Clearview AI, compiles its FRT gallery by collecting public images from the Internet, including social media, without consent from the platform or the individuals pictured. To date, Clearview AI has built a database of more than 30 billion images. This practice has met with pushback from some governments. In 2022, the United Kingdom’s privacy watchdog, the Information Commissioner’s Office, ordered the company to “delete all data belonging to UK residents,” becoming the fourth country—following Australia, France, and Italy—to do so. Following a lawsuit filed by the American Civil Liberties Union under Illinois’s Biometric Information Privacy Act, which creates a private right of action, Clearview AI signed a settlement that permanently barred the company from selling its database to most private businesses. Despite this opposition, the company’s FRT systems are still frequently used by law enforcement across the country. According to Clearview AI, as of 2021, the company counts 3,100 law enforcement agencies as customers, along with the Army and the Air Force. In March 2023, the company reported that its database has been used nearly 1 million times by U.S. law enforcement.

In addition to general privacy concerns raised by inclusion in large databases, data in such centralized repositories are highly sensitive and may be an attractive target for exfiltration by third parties, including criminals and foreign governments. Indeed, it is potentially highly useful to adversaries of the United States.10 Protecting the security of such data is essential to protecting the national security of the United States and the privacy and civil liberties of Americans.

___________________

10 For example, a 2015 breach of data held by the Office of Personnel Management resulted in the exposure of data impacting 22.1 million Americans. The federal government and its data contractor agreed to a $63 million settlement with individuals whose personally identifiable information was stolen. See E. Katz, 2022, “A Judge Has Finalized the $63M OPM Hack Settlement. Feds Now Have Two Months to Sign Up for Damages,” Government Executive, October 26, https://www.govexec.com/pay-benefits/2022/10/judge-finalized-63m-opm-hack-settlement-feds-two-months-damages/378950.


Other Civil Liberties Concerns

FRT has been used by business owners to monitor customers and identify potential shoplifters, resulting in several cases of businesses using a false match from an FRT system as the basis for excluding or removing an individual. The prospect of authorities and property owners detaining an individual, or denying access to a store, venue, or other establishment, solely on the basis of an FRT match, without recourse, may in many circumstances be viewed as an unwanted expansion of state or private powers. For example, a 2020 investigation by Reuters found that Rite Aid had deployed FRT systems at more than 60 stores in predominantly low-income minority neighborhoods to assist in loss prevention.11 The investigation further identified cases in which false matches generated by the FRT system resulted in an individual being wrongfully asked to leave the store by Rite Aid management on suspicion of shoplifting.

Human Rights and International Perspectives

Human rights are rights enjoyed by all persons. The Universal Declaration of Human Rights,12 a key document setting forth fundamental human rights worthy of universal protection, was adopted in 1948 by the United Nations General Assembly. It provides a basic framework for later conventions and other legal instruments that have emerged in the development of international human rights law. These rights include the right to be free from “arbitrary interference with [one’s] privacy, family, home, or correspondence, [and from] attacks upon [one’s] honour and reputation.” Human rights principles are expected to be respected by both government and private actors. The United Nations’ Guiding Principles on Business and Human Rights, for instance, state that “business enterprises should respect human rights.”

The use of FRT is being questioned beyond the United States. In 2018, Big Brother Watch, a civil society organization, investigated the use of FRT by police departments in the United Kingdom, demonstrating how FRT “disproportionately misidentif[ies] minorities and women, and 95 percent of [UK] police’s matches have misidentified individuals.”13 In 2019, researchers with the University of Essex Human Rights Centre published a report on the deployment of live facial recognition (LFR) technology by the London Metropolitan Police Service, noting a “lack of publicly available guidance on the use of LFR.”14 In 2022, Chatham House published a report documenting a swift increase in “the deployment of facial recognition in public spaces for police surveillance” in Latin America without adequate regulations.15 In China, where the deployment of FRT has been particularly extensive (e.g., to track Uighurs through their daily lives in Xinjiang province), the Supreme People’s Court, in a “joint stance with Beijing’s top government bodies,” called for stronger consumer privacy protections from “unwarranted face tracking,” introducing new guidelines in 2021 requiring commercial venues to obtain “consent from consumers to use facial recognition,” to limit FRT use to “what is necessary,” and to protect consumers’ data.16,17

___________________

11 J. Dastin, 2020, “Special Report: Rite Aid Deployed Facial Recognition Systems,” Reuters, July 28, https://www.reuters.com/article/us-usa-riteaid-software-specialreport-idUSKCN24T1HL.

12 United Nations General Assembly, 1948, “The Universal Declaration of Human Rights,” https://www.un.org/en/about-us/universal-declaration-of-human-rights.

13 Big Brother Watch, 2018, Face Off: The Lawless Growth of Facial Recognition in UK Policing, London, England: Big Brother Watch, https://bigbrotherwatch.org.uk/wp-content/uploads/2018/05/Face-Off-final-digital-1.pdf.

14 P. Fussey and D. Murray, 2019, Independent Report on the London Metropolitan Police Service’s Trial of Live Facial Recognition Technology, Colchester, England: University of Essex Human Rights Centre.

THE GOVERNANCE OF FACIAL RECOGNITION TECHNOLOGY

The impacts of FRT on equity, privacy, and civil rights are greatest when images are indiscriminately collected, stored, and analyzed with little or no input, regulation, or oversight from individuals, communities, civil society organizations, or governmental bodies. FRT is difficult to govern because it raises many novel and complex legal questions. The complexity arises from the following:

  • Many actors are involved in FRT system design and development, the collection of images for training template extraction models, and deployment and use of FRT capabilities. Some of these activities raise unsettled legal questions that depend, in part, on where and how FRT is used (e.g., in a public or commercial space, by a private or a government actor, etc.); and
  • Regulation of FRT might take place at different levels of government (i.e., national, state, and local). Furthermore, at any given level, FRT might be subject to regulation by existing general laws (e.g., related to intellectual property, privacy, law enforcement), technology specific law or regulation, or both.

There are several pathways for federal regulatory action on FRT. First, a court might interpret the U.S. Constitution as providing limits on the government’s use of FRT or as providing constraints on state or national authority to regulate FRT. Constitutional law on both questions is unsettled, and there are no directly applicable or dispositive Supreme Court rulings. See the discussion below on how constitutional protections might apply.

___________________

15 C. Caeiro, 2022, Regulating Facial Recognition in Latin America: Policy Lessons from Police Surveillance in Buenos Aires and São Paulo, London, England: Royal Institute of International Affairs, https://doi.org/10.55317/9781784135409.

16 E. Dou, 2021, “China Built the World’s Largest Facial Recognition System. Now, It’s Getting Camera-Shy,” Washington Post, July 30, https://www.washingtonpost.com/world/facial-recognition-china-tech-data/2021/07/30/404c2e96-f049-11eb-81b2-9b7061a582d8_story.html.

17 National Institute of Standards and Technology (NIST), 2020, “Facial Recognition Technology (FRT),” February 6, https://www.nist.gov/speech-testimony/facial-recognition-technology-frt-0.

Second, Congress could enact a statute directly regulating FRT. However, although legislative proposals to regulate FRT have been introduced, none has been enacted into law.

Third, a federal agency could issue a regulation or initiate an enforcement action under a statute of general application (i.e., not related to FRT) to address both state and private uses of FRT. Alternatively, guidelines for use by federal agencies could be developed, potentially as directed by an executive order.

Legislative Approaches to the Governance of Facial Recognition Technology

U.S. Federal Law

Currently, no federal statute or regulation imposes a general constraint on the public or private use of FRT. However, there are existing agency authorities or legislative mandates that may have applicability to FRT in specific instances.

The Federal Trade Commission (FTC), for example, has used its authority under Section 5 of the Federal Trade Commission Act to regulate “unfair or deceptive acts or practices in or affecting commerce” to take action against a photo-app developer that allegedly deceived consumers about its use of FRT18—and could potentially address other FRT-related acts or practices.

Federal laws requiring privacy impact assessments and system of record notices impose transparency requirements on federal agencies that use FRT. For example, a May 2016 Government Accountability Office (GAO) report identified privacy and transparency concerns with the Federal Bureau of Investigation’s (FBI’s) use of FRT. In response, the FBI expedited work on system of record notices (which notify the public about the existence of systems and the types of data they collect) and privacy impact assessments (which examine how systems collect, store, manage, and share personal information).19

Other avenues for federal action include the establishment of rules for the procurement and funding of FRT and non-binding standard-setting activities (such as those of the National Institute of Standards and Technology [NIST]20).

___________________

18 Federal Trade Commission, 2021, “FTC Finalizes Settlement with Photo App Developer Related to Misuse of Facial Recognition Technology,” September 18, https://www.ftc.gov/news-events/news/press-releases/2021/05/ftc-finalizes-settlement-photo-app-developer-related-misuse-facial-recognition-technology.

19 Government Accountability Office, 2019, “Face Recognition Technology: DOJ and FBI Have Taken Some Actions in Response to GAO Recommendations to Ensure Privacy and Accuracy, But Additional Work Remains,” https://www.gao.gov/products/gao-19-579t.

20 NIST, 2020, “Facial Recognition Technology (FRT),” February 6, https://www.nist.gov/speech-testimony/facial-recognition-technology-frt-0.


Training data used by FRT algorithms may be protected by contract or privacy law, but the scope of these protections is unclear. Social media platforms have alleged that Clearview AI violated their terms of service by collecting facial images from the Internet; in response, Clearview AI asserted a First Amendment right to collect the images.21 Some have asserted that U.S. copyright law protects against the collection and use of facial images from the Internet. Open questions include whether such activities fall under the fair use exception or under special provisions that apply to providers of search engines and similar tools.

Under the Supreme Court’s interpretation of the free speech clause of the First Amendment, regulation of the commercial collection and use of data may be prohibited. In Sorrell v. IMS Health, the Court held that the sale, disclosure, and use of pharmacy records constituted First Amendment speech.22 The idea that information is speech23 gives private actors powerful support for the assertion that FRT development and deployment cannot be regulated. The First Amendment, however, protects only private speech, not the actions of government actors; Sorrell therefore does not preclude federal regulation of state actors such as state and municipal police agencies. It may, however, be constitutionally impossible to regulate elements of the private market, including firms that market their services aggressively to police.

A few bills have focused on regulating FRT, illustrating the concerns of some members of Congress. One example of proposed FRT-specific legislation is the Facial Recognition Act of 2022 (H.R. 9061), which was introduced but never received a committee vote.24 The bill focuses on the use of FRT by law enforcement. Rather than imposing a categorical ban, it would require, among other constraints, a judge-authorized warrant before conducting facial recognition searches, notice to individuals subject to FRT searches, and a ban on FRT searches using databases of illegally obtained photographs. The bill would also require law enforcement agencies to submit data annually about their use of FRT for audit by the GAO and require that FRT systems be tested annually against NIST’s benchmark for facial recognition for law enforcement. The bill also includes provisions for redress—including suppression of FRT results and any derivative evidence—in the event of improper use of FRT. Another example of legislation that has been introduced but thus far not acted on is a series of similar bills calling for a moratorium on federal law enforcement use of FRT, introduced most recently as the Facial Recognition and Biometric Technology Moratorium Act of 2023 (S. 681). More generally, proposed legislation to regulate artificial intelligence (AI) would address issues such as bias and civil rights compliance and would, if enacted, also have implications for the regulation of FRT.

___________________

21 See B.E. Devany, 2022, “Clearview AI’s First Amendment: A Dangerous Reality?” Texas Law Review 101(2):473–507.

22 Sorrell v. IMS Health Inc., 564 U.S. 552, 570 (2011).

23 Ibid.

24 See H.R. 9061, https://www.congress.gov/bill/117th-congress/house-bill/9061/text.


U.S. State Regulations

Illinois was the first state to regulate FRT through the 2008 Biometric Information Privacy Act (BIPA).25 BIPA regulates “the collection, use, safeguarding, handling, storage, retention, and destruction of biometric identifiers and information.” It prohibits private parties from collecting biometric identifiers or using information derived from biometric identifiers to create individual profiles without notification, consent, and specified disclosures. Furthermore, BIPA prohibits the sale of collected biometric identifiers and requires private parties to make public their data retention and destruction policies. Similar laws were enacted in Arkansas, California, Texas, and Washington.

Illinois’s and California’s statutes allow for a private right of action, but the costs of civil litigation often make it infeasible for individuals to bring suit. Only aggregate litigation, such as class actions, is likely to have positive expected value. As a result, in the absence of nonprofit legal assistance, individual remedies under these statutes will likely be pursued only rarely.

Lawsuits have been filed under BIPA but have ended in settlements rather than judgments. In 2020, for instance, Facebook settled a lawsuit alleging that its creation of face profiles violated Illinois’s biometric privacy law, changing its use of FRT as a result.26 In 2022, Clearview AI settled a lawsuit under the same state law.27 Clearview AI has suggested that it has a First Amendment defense to BIPA liability,28 but the soundness of this argument is unsettled because the case was not adjudicated.

Another potential avenue for state legislation is to regulate the use of FRT by law enforcement. For example, Maryland Senate Bill 192 would limit the use of FRT to serious crimes and threats to public safety or national security and prohibit use of FRT as the sole basis to establish probable cause.

U.S. Municipal Regulations

Local regulations governing surveillance technologies, including FRT, can mandate public approval of the acquisition and use of these technologies, require transparency or prohibit non-disclosure agreements, and confer legal standing on citizens to challenge violations of these rules. They can also impose notice requirements on private companies that use FRT as part of their business. Most municipalities have not, however, taken action to regulate the use of FRT. Where they have, efforts have typically taken two forms: (1) the creation of administrative agencies with responsibility for public

___________________

25 740 Ill. Comp. Stat. Ann. 14/15(b).

26 N. Singer and M. Isaac, 2020, “Facebook to Pay $550 Million to Settle Facial Recognition Suit,” New York Times, January 29, https://www.nytimes.com/2020/01/29/technology/facebook-privacy-lawsuit-earnings.html.

27 R. Mac and K. Hill, 2022, “Clearview AI Settles Suit and Agrees to Limit Sales of Facial Recognition Database,” New York Times, May 9, https://www.nytimes.com/2022/05/09/technology/clearview-ai-suit.html.

28 B.E. Devany, 2022, “Clearview AI’s First Amendment: A Dangerous Reality?” Texas Law Review 101(2):473–507.


surveillance technologies, including review of annual reports on the use of these technologies and issuance of new regulations, and (2) the use of city councils’ legislative and administrative functions to establish procedures for the acquisition and use of surveillance technologies.

There have been moves by municipal jurisdictions to categorically ban FRT. In 2019, the city of San Francisco banned the use of FRT by its public agencies.29 Under the city’s administrative code, it is unlawful for any public agency to “obtain, retain, access, or use” any FRT on “city-issued software or a city-issued product or device” or any information obtained from FRT.30

Since 2016, the American Civil Liberties Union (ACLU) has promoted a model bill for local governments interested in regulating surveillance technology in public hands. Called the Community Control Over Police Surveillance (CCOPS) bill, the model requires city council approval before the “acquiring or borrowing [of] new surveillance technology” and the issuance of a “Surveillance Impact and Surveillance Use Policy” for any proposed technology. As of 2023, at least 22 local governments, including Boston, Massachusetts; New York, New York; Detroit, Michigan; San Francisco, California; and San Diego, California, had adopted surveillance technology regulations using the ACLU model as a template, though with significant local alterations.31

The city of Oakland, California, has often been cited as a model for local governance of surveillance technologies.32 Oakland enacted surveillance technology regulations and created a separate Privacy Advisory Commission to advise the Oakland City Council on privacy issues. The council and the commission share responsibility for approving new purchases or uses of surveillance technologies by public agencies. If, for instance, the Oakland Police Department seeks to adopt or change a use policy for a surveillance technology, it must notify the commission and then present a surveillance impact policy and use report. The commission conducts public hearings, creates reports, and makes recommendations to the city council regarding the city’s acquisition and use of technology that “collects, stores, transmits, handles or processes citizen data.”33 The city council has final decision-making authority.34

___________________

29 Oakland, California, and Somerville, Massachusetts, have also passed local ordinances banning the use of FRT by public agencies. See Fight for the Future, “Ban Facial Recognition Map,” https://www.banfacialrecognition.com/map, accessed November 17, 2023.

30 City and County of San Francisco, 2019, “Administrative Code–Acquisition of Surveillance Technology: Board of Supervisors Approval of Surveillance Technology Policy,” Section 19B.2(d), https://sfgov.legistar.com/View.ashx?M=F&ID=7206781&GUID=38D37061-4D87-4A94-9AB3-CB113656159A.

31 M. Fidler, 2020, “Local Police Surveillance and the Administrative Fourth Amendment,” Santa Clara High Technology Law Journal, p. 546, http://dx.doi.org/10.2139/ssrn.3201113.

32 Ibid.

33 “Privacy Advisory Commission,” City of Oakland, California, https://www.oaklandca.gov/boards-commissions/privacy-advisory-board, accessed November 17, 2023.

34 City of Oakland, “Chapter 9.64: Regulations on City’s Acquisition and Use of Surveillance Technology.”


Regulation of FRT at the local government level faces some obstacles. Local ordinances result in policies and regulations that vary from city to city. In addition, creating a specialized administrative agency or regulations for surveillance technologies like FRT requires resources, including technical expertise, that many municipalities lack. Outside of large cities, an approach that relies on municipalities will necessarily leave oversight gaps.

There has also been some pushback, citing concerns about crime, against regulations limiting or banning the use of FRT by law enforcement.

International Law

Other countries are also grappling with whether and how to govern FRT. Perhaps the most ambitious attempt at regulation is contained in the European Union’s Artificial Intelligence Act, which would complement the EU’s General Data Protection Regulation and Law Enforcement Directive. The European Parliament adopted its negotiating position on the act in June 2023, and a provisional agreement between the Parliament and the Council on the final form of the act was announced on December 9, 2023, but the final text of the act had not been released as of this writing. The press release states the following regarding biometric identification systems:

Negotiators agreed on a series of safeguards and narrow exceptions for the use of biometric identification systems (RBI) in publicly accessible spaces for law enforcement purposes, subject to prior judicial authorisation and for strictly defined lists of crime. “Post-remote” RBI would be used strictly in the targeted search of a person convicted or suspected of having committed a serious crime.35

Constitutional Protections

Several provisions of the U.S. Constitution have potential relevance to FRT. The application of these provisions to the use of FRT is currently being studied, contested, and litigated. The most important are the Fourth Amendment’s protection against unreasonable searches and seizures, the Fifth Amendment due process right, the equality components of the Fifth and the Fourteenth Amendments, and the First Amendment’s free speech clause.36 It is important to note, however, that almost every constitutional prohibition applies only to “state action.”37 Although it can sometimes be unclear where state action ends and private action begins, the Constitution generally only applies when

___________________

35 European Parliament, 2023, “Artificial Intelligence Act: Deal on Comprehensive Rules for Trustworthy AI,” press release, December 9, https://www.europarl.europa.eu/news/en/press-room/20231206IPR15699/artificial-intelligence-act-deal-on-comprehensive-rules-for-trustworthy-ai.

36 It is also possible to imagine religious liberty challenges to the mandatory use of face verification. These, however, would not constitute a general regulation of the technology, and so are not addressed here.

37 Manhattan Cmty. Access Corp. v. Halleck, 139 S. Ct. 1921, 1928 (2019).


there is a federal or state official directly acting (and not, say, when a private actor voluntarily supplies information [such as footage or an identification] to a government actor). Nor do these protections apply when private actors act toward other private actors in a manner that would be unconstitutional had the government acted in the same manner.

Fourth Amendment

The Fourth Amendment is commonly associated with privacy from state intrusion, especially by law enforcement. One might presume that the amendment speaks to state use of FRT, but that presumption may not hold. The courts have not ruled on whether the state’s collection of facial images is a Fourth Amendment “search.” Unless this threshold condition is met, the amendment does not apply.

In the context of the Fourth Amendment, a search is understood to have occurred when there is a violation of a “person’s reasonable expectation of privacy.”38 It is not clear, however, whether a person has a reasonable expectation of identification privacy in a public setting. If something is already in plain view, “neither its observation nor its seizure would involve any invasion of privacy.”39 This plain view exception reflects the intuition that, when something can be lawfully observed by an official, there is no reasonable expectation of privacy. This suggests that one does not have a reasonable expectation of privacy in one’s facial features when in public. But how should one think about these issues when one’s facial features can be used to make an identification or to track movement?

The question thus arises as to whether the state collection of facial data in a public setting could ever trigger Fourth Amendment scrutiny. Given the reasonable expectations test and the plain view exception, it is doubtful that federal courts would proscribe the general use of public surveillance cameras40 (although lower federal courts disagree as to whether long-term surveillance of a home using a pole-mounted camera constitutes a Fourth Amendment search).41 Even that carve-out, however, would have a limited effect on FRT use because it would apply only to a small fraction of public video footage. Because the Fourth Amendment turns on how information is collected by the state actor rather than how it is used, analysis of public surveillance footage for identification purposes (with or without FRT) likely does not raise Fourth Amendment concerns.42

___________________

38 Katz v. United States, 389 U.S. 347, 360 (1967) (Harlan, J., concurring).

39 Horton v. California, 496 U.S. 128, 133 (1990).

40 Leaders of a Beautiful Struggle v. Baltimore Police Dep’t, 979 F.3d 219, 231 (4th Cir. 2020), on reh’g en banc, 2 F.4th 330 (4th Cir. 2021) (“Precedent suggests law enforcement can use security cameras without violating the Fourth Amendment.”).

41 See, for example, United States v. Tuggle, 4 F.4th 505, 511 (7th Cir. 2021), cert. denied, 212 L. Ed. 2d 7, 142 S. Ct. 1107 (2022).

42 The Fourth Amendment applies only to government “searches” and “seizures.” The surveillance of a public place is neither. When the government does not acquire information directly from a suspect, but from a third party, the Fourth Amendment is typically not implicated. For an exception, discussed later, see Carpenter v. United States, 138 S. Ct. 2206 (2018).


Importantly, the Fourth Amendment does not provide protection from all warrantless searches by the state. The Supreme Court has carved out, for example, an exception to the warrant requirement of the Fourth Amendment at the border, where officials “have more than merely an investigative law enforcement role”43 and greater power to search. Federal courts have hence upheld suspicionless searches of cellphones and laptops at the border that would be illegal if conducted in the course of ordinary policing.44 Furthermore, the Supreme Court has developed an “administrative search” doctrine that permits searches without probable cause or warrants for many regulatory purposes.45 Due to such carve-outs, the Fourth Amendment often provides weak privacy protection outside a crime-control context, and its application to immigration enforcement, in particular, is highly context dependent.

All this suggests that the Fourth Amendment might offer only limited protection against the use of FRT, particularly when it is deployed in proximity to the border. Importantly, the government could also employ FRT far from the border (e.g., around workplaces likely to employ noncitizens or on a subway to aid in the search for undocumented persons).

Another important question relates to instances where the government relies on a private party to deploy FRT. Under the “third-party” doctrine, Fourth Amendment protections do not apply when the government acquires records about a person from a third party—such as a bank or a telephone company.46 Hence, the state’s use of a private security firm’s footage would not trigger a Fourth Amendment concern because the state did not obtain data directly from the suspect. This distinction, however, is not absolute. In 2018, the U.S. Supreme Court created an exception to the third-party doctrine for the use of cellphone location data to pinpoint a suspect’s physical whereabouts over time.

In Carpenter v. United States, the Supreme Court ruled that “individuals have a reasonable expectation of privacy in the whole of their physical movements.”47 Reasoning from the Framers’ ambition to “place obstacles in the way of a too permeating police surveillance,” it expressed particular concern over the risk of “near perfect surveillance” by which police could—retroactively, if need be—“retrace a person’s whereabouts.”48 The Court also emphasized the “deeply revealing nature” of location data—its “depth, breadth, and comprehensive reach” and “the inescapable and automatic nature of its collection.”49 The Court frankly grappled with the way in which the diffusion of new

___________________

43 United States v. Montoya de Hernandez, 473 U.S. 531, 544 (1985).

44 Alasaad v. Mayorkas, 988 F.3d 8, 19 (1st Cir.), cert. denied sub nom. Merch. v. Mayorkas, 141 S. Ct. 2858 (2021) (holding, along with several circuit courts, that “basic border searches are routine searches and need not be supported by reasonable suspicion”).

45 See, for example, New York v. Burger, 482 U.S. 691, 703 (1987) (exempting regulatory inspections of automobile dismantling businesses from warrant and probable cause requirements).

46 United States v. Miller, 425 U.S. 435, 443 (1976).

47 Carpenter v. United States, 138 S. Ct. 2206, 201 L. Ed. 2d 507 (2018).

48 Ibid.

49 Ibid.


surveillance tools, coupled with novel analytic strategies, can expand the state’s power to acquire personal information. Many of these concerns, though, extend easily to private actors, which can also tap into broad information-gathering powers, even if the Fourth Amendment does not apply.

To date, Carpenter has not been extended to the use of FRT. Nevertheless, the U.S. Court of Appeals for the Ninth Circuit invoked Carpenter to reason that “the development of a face template using facial-recognition technology without consent … invades an individual’s private affairs and concrete interests.”50 Legal scholars have developed broad readings of Carpenter that would lead to more extensive regulation of FRT.51

The most natural application of Carpenter would be to FRT-based surveillance tools that focus on “prolonged tracking that can reveal intimate details through habits and patterns.”52 This understanding of Carpenter might mean that some facial identification use cases would be subject to a warrant requirement under the Fourth Amendment; it is less likely to include facial verification use cases.

A separate question arises as to whether an FRT “match” made by law enforcement may by itself constitute sufficient cause for either a brief investigative detention or an arrest. The Fourth Amendment requires that arrests be based on probable cause and that “no warrants shall issue, but upon probable cause, supported by oath or affirmation,” a legal standard that the U.S. Supreme Court has described as a “practical, nontechnical conception.”53 Law enforcement may subject a person to a brief investigatory detention based on the less demanding Fourth Amendment standard of reasonable suspicion.54 “Terry stops,” for example, allow police to detain a person briefly based on a reasonable suspicion of involvement in criminal activity. Arrests, in turn, permit the police to engage in other investigative activities, including searches incident to arrest.

Equality Under the Fifth and Fourteenth Amendments

A persistent concern about FRT relates to potential differential effects on racial and ethnic groups. The Fifth and the Fourteenth Amendments prohibit certain actions taken on the basis of race by the federal government and the states, respectively. Constitutional equality law, however, is not triggered by the creation of racial or ethnic disparities.55 A violation instead requires a particular showing of intent. A government decision-maker must have “selected or reaffirmed a particular course of action at least in part

___________________

50 Patel v. Facebook, Inc., 932 F.3d 1264, 1273 (9th Cir. 2019) (finding standing on this basis).

51 Even the most ambitious of these accounts recognizes “constitutional gaps in protective coverage requiring legislative action.” A.G. Ferguson, 2019, “Facial Recognition and the Fourth Amendment,” 105 Minnesota Law Review 1105, October 21, https://doi.org/10.2139/ssrn.3473423.

52 Leaders of a Beautiful Struggle v. Baltimore Police Dep’t, 2 F.4th 330, 341 (4th Cir. 2021).

53 Illinois v. Gates, 462 U.S. 213, 231 (1983).

54 Terry v. Ohio, 392 U.S. 1 (1968).

55 Washington v. Davis, 426 U.S. 229, 240 (1976).


‘because of,’ not merely ‘in spite of,’ its adverse effects upon an identifiable group.”56 Especially in criminal and immigration cases, the Court has created a set of presumptions and procedural rules that make it exceedingly hard for most litigants to prove improper intent.57 In addition, the Supreme Court has carved out a near-categorical prohibition on official decision-makers taking explicit account of race in their decision-making protocols.58 This is commonly known as the colorblindness mandate. In effect, these rules prohibit a narrow class of intentional or explicitly race-conscious or racially directed actions. Where the government uses a criterion (e.g., residential zip code) that closely correlates with racial identity, the doctrine’s application is less clear.

Under current constitutional equality doctrine, FRT is unlikely to face successful challenges. Specific FRT instruments may have racially disparate effects, but this is typically not because of an intention to harm a minority group, nor is race used as an explicit criterion in matching. Constitutional equality law, moreover, would not be violated if a policing agency were to use an FRT with racially disparate effects unless its choice were demonstrated to be “because of” and not merely “in spite of” these disparities. It would be very difficult under current law for a plaintiff to satisfy this burden. Moreover, certain race-conscious measures to mitigate those disparities may themselves run into constitutional objections.59 For example, the Court has invalidated an official decision to reject the outcome of an employment test with racially disparate effects, reasoning that abandoning an action because of racial disparities is itself a problematic race-conscious action.60 An agency concerned with avoiding racial disparities from an FRT instrument would therefore be advised to act up front by purchasing a tool that does not evince those gaps, rather than trying to rectify disparities after the fact.

Internal Law Enforcement Guidelines

Local law enforcement agencies can set administrative governance principles by establishing departmental rules and guidelines on the use of FRT. As an illustration, consider the New York Police Department’s (NYPD’s) internal guidelines for FRT use.61 FRT may be used by the NYPD only for a specified set of authorized uses, including to identify a person when there is “a basis to believe that such individual has committed, is committing, or is about to commit a crime.”62 The NYPD

___________________

56 Personnel Admr. v. Feeney, 442 U.S. 256, 279 (1979).

57 A.Z. Huq, 2019, “What Is Discriminatory Intent?” Cornell Law Review 103(5), https://scholarship.law.cornell.edu/clr/vol103/iss5/4.

58 Parents Involved in Cmty. Sch. v. Seattle Sch. Dist. No. 1, 551 U.S. 701 (2007).

59 A.Z. Huq, 2019, “Racial Equity in Algorithmic Criminal Justice,” Duke Law Journal 68(1043).

60 Ricci v. DeStefano, 557 U.S. 557 (2009); see also A.Z. Huq, 2019, “Racial Equity in Algorithmic Criminal Justice,” Duke Law Journal 68(1043) (discussing applications to other criminal justice algorithms).

61 NYPD Patrol Guide, Procedure No. 212-129 (3/1/2020).

62 Ibid.


guidelines also state that the determination of a possible FRT match alone “does not constitute probable cause to effect an arrest, or obtain an arrest or search warrant.”63

Local law enforcement–developed rules and guidelines, unlike administrative governance by city councils or local agencies, may be vulnerable to questions of legitimacy and independence. Moreover, there is a risk that some interests will not be systematically represented within existing review and decision-making processes, including the interests of communities most intensively subject to FRT tools. These concerns can be mitigated at least in part by deliberate efforts to engage stakeholders in the development of such guidelines.

Federal law enforcement agencies have also created internal agency guidance on the use of FRT, which is subject to review by agency and department general counsels and leadership. For example, the FBI’s Facial Analysis, Comparison, and Evaluation Services Unit (which uses reference databases containing hundreds of millions of faces, including driver’s license photos from more than a dozen states) and the Next Generation Identification-Interstate Photo System (which processes thousands of requests from state and local law enforcement agencies per month)64 are subject to internal regulations, though these are not public.

Governance by Private Entities

When local governments fail or choose not to adopt policies and regulations on FRT use, technology vendors can become the default rulemaking bodies. Vendors may impose non-disclosure agreements on contracting municipalities, creating problems of transparency and accountability. Contract terms imposed by vendors might, for instance, specify that data generated by FRT belong to the vendor rather than to the public agency or the city.65 This raises important transparency and equity concerns.

Private technology vendors may decide, as a policy matter, not to incorporate FRT into tools offered to law enforcement agencies. In 2019, Axon, the country’s largest supplier of police body-worn cameras and software, announced a moratorium on FRT in its devices.66 Such self-regulation has limits, however. The decision creates no legally enforceable rights or remedies for individuals or third parties should the company violate its own policies, and Axon could reverse course at any time. Furthermore, law enforcement agencies using Axon products could transfer data collected by those products to a third party for FRT analysis.

___________________

63 Ibid.

64 Congressional Research Service, 2020, “Federal Law Enforcement Use of Facial Recognition Technology,” CRS R46586, https://sgp.fas.org/crs/misc/R46586.pdf.

65 S. Gordon, “Milwaukee Committed to Shotspotter But Outcomes, Data Remain Elusive,” Wisconsin Public Radio, January 20, https://www.wpr.org/milwaukee-committed-shotspotter-outcomes-data-remain-elusive. (Reporting that data generated by gunshot detection system ShotSpotter are owned by the company and that “SST’s ownership of the data is written into the contracts it signs with law enforcement.”)

66 C. Warzel, 2019, “A Major Police Body Cam Company Just Banned Facial Recognition,” New York Times, June 27, https://www.nytimes.com/2019/06/27/opinion/police-cam-facial-recognition.html.


FACIAL RECOGNITION TECHNOLOGY IN CRIMINAL INVESTIGATIONS AND TRIALS

Once an FRT match has been made, there are a number of scenarios where the resulting match could be invoked or applied (e.g., in a criminal investigation or in the course of the proceedings of a criminal trial).

In criminal investigations, current best practice is to use FRT as one component in the development of investigative leads. This practice is reflected, for example, in guidelines from the Facial Identification Scientific Working Group (FISWG), whose members include a number of federal, state, and local law enforcement agencies as well as law enforcement agencies in Europe, the Americas, and Australia. The introduction to FISWG’s document on minimum training criteria for personnel who conduct facial comparisons using FRT states:

An automated FRS typically provides a list of candidates from a database in response to a facial image query. The user of an FRS and the personnel reviewing the results are required to be aware of the major elements and limitations of the facial comparison discipline and training in the use of available tools. Results from an automated FRS are used as investigative leads only and should be used in conjunction with additional resources.67

In legal settings, the use of the results of an FRT match is also subject to strict procedural constraints.

Fifth Amendment

Pursuant to the Fifth Amendment, prosecutors in a criminal action have a due process obligation to disclose to a defendant all evidence that is “favorable” and “material either to guilt or to punishment.”68 If a prosecution were to rely on evidence from an FRT match, the Fifth Amendment may require the prosecution to disclose “evidence of police misuse of facial recognition and poor algorithm quality.”69 At least one state appeals court has determined that the government was obligated to disclose detailed information about the FRT tool used to identify a suspect. Especially because FRT is a “novel and untested technology,” the court ordered the disclosure of the “identity, design, specifications, and operation of the program or programs used for analysis.”70

___________________

67 Facial Identification Scientific Work Group, 2021, “Minimum Training Criteria When Using Facial Recognition Systems,” Version 1.0, October 22, https://fiswg.org/fiswg_min_training_criteria_when_using_fr_systems_v1.0_2021.10.22.pdf.

68 Brady v. Maryland, 373 U.S. 83, 87 (1963).

69 J. Brown, 2022, “We Don’t All Look the Same: Police Use of Facial Recognition and the Brady Rule,” Federal Communications Law Journal 74(3):329–346.

70 State v. Arteaga, 476 N.J. Super 36, *61 (App. Div. 2023).


Evidentiary Issues

Although many of the currently known instances of FRT use involve the development of investigative leads, courts will need to determine whether and how FRT matches may be admitted as evidence. To resolve a disputed issue about novel scientific or technical information, a court may permit a party to introduce testimony by an expert witness. In assessing the reliability of expert testimony, a court may consider a variety of factors, including amenability to testing, whether there is a known error rate and standards governing the use of the technique in practice, whether the technique has been subject to peer review in scientific publications or otherwise, and whether the technique or method has general acceptance in the relevant scientific community.71 At least one state court has observed that there is “no agreement in a relevant community of technological experts that [FRT] matches are sufficiently reliable to be used in court as identification evidence,”72 but given the general willingness to permit prosecutors to introduce expert evidence in court, courts may at some point determine that FRT is sufficiently valid and reliable to be introduced as evidence of identification. It is also possible that the fact that FRT has played a role in an investigation may be permitted, not as independent evidence of identification, but as part of the “res gestae”—the background circumstances and explanatory narrative describing the events that led to the arrest.

If the result of an FRT were to be introduced in court as evidence of identification, it would be critical for the court to determine both that the technology itself is adequately valid and reliable—that it has, as the President’s Committee of Advisors on Science and Technology report on forensic science put it, “foundational validity”—and that it was applied reliably by an appropriately trained, competent analyst in this particular instance.73 Determining validity may also raise issues of access to technical details about the surveillance instrument, which may in turn raise access issues given potential non-disclosure agreements or trade secrets.74

___________________

71 Daubert v. Merrell Dow Pharmaceuticals, 509 U.S. 579 (1993); Kumho Tire Co. v. Carmichael, 526 U.S. 137 (1999).

72 People v. Reyes, 133 N.Y.S.3d 433, 436-437 (N.Y. County 2020).

73 Executive Office of the President, President’s Council of Advisors on Science and Technology, 2016, Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods, Report to the President, Washington, DC, https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/PCAST/pcast_forensic_science_report_final.pdf.

74 R. Wexler, 2017, “Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System,” Stanford Law Review.


ADDRESSING WRONGFUL MATCHES AND INTRUSIVE DEPLOYMENT OF FACIAL RECOGNITION TECHNOLOGY

Increasing use of FRTs in the public and private sectors raises questions about legal and administrative remedies for harms caused by the use of FRT. Courts may be asked to consider whether some FRT uses give rise to civil liability under traditional causes of action, and legislatures may wish to consider whether new legislation providing causes of action is warranted.

Individuals may seek legal relief in cases of mistaken FRT matches. Those harmed by mistaken FRT matches may rely on existing federal constitutional or civil rights causes of action, although their exact applicability in the context of FRT is not well defined. Federal law offers damages remedies and the possibility of injunctive relief when a constitutional rule such as the Fourth Amendment is violated. Furthermore, criminal defendants can ask that evidence gathered in violation of the Fourth Amendment be suppressed. But such remedies are, in practice, often not available because of a complex network of rules that limit the availability of damages or suppression except in instances where a government official has committed a particularly obvious and egregious violation of constitutional law. With new technologies, persons asserting a constitutional right must often point to previous judicial rulings to show specifically that a constitutional violation was especially egregious; it is not enough to point to a general, foundational ruling. But the hurdles to relief mean such rulings are thin on the ground.75

As a practical matter, state statutes currently offer the only meaningful relief for individuals harmed by FRT. While, as noted above, federal agencies such as the FTC might offer remedies for deceptive commercial practices and violations of federal statutory law, the remedies are often designed to prevent future illegal behavior, not to make whole those harmed by a new technology.76

When FRT use by private actors is perceived as unduly invasive, individuals may seek remedies in the form of common law–based privacy torts against those actors. Most states recognize tort causes of action—for example, for the public disclosure of private facts, intrusion upon seclusion, false light (spreading falsehoods about an individual), and appropriation of name or likeness. For instance, a person who experiences what is perceived as nonconsensual and highly invasive use of FRT by private actors may rely on the privacy tort of “intrusion upon seclusion.” Although there is no widely recognized general expectation of privacy in public, some courts have suggested there may be limited exceptions in ways that might apply to the FRT context. For instance, the New York

___________________

75 A.Z. Huq, 2015, “Judicial Independence and the Rationing of Constitutional Remedies,” Duke Law Journal 65(1).

76 The Everalbum settlement is an example of that sort of remedy.


State Court of Appeals opined that “overzealous” surveillance may be actionable when the information sought is “of a confidential nature” and the defendant’s conduct was “unreasonably intrusive.”77

For policymakers and organizations seeking to deploy and use facial recognition appropriately and safely, public transparency about the circumstances under which FRT is used is important. Furthermore, the disclosure of information regarding the technical performance of the deployed FRT system can create pressure on organizations to use top-performing algorithms and foster public confidence in the accuracy of these systems. Clear guidance on factors to consider when deploying FRT can help organizations identify use cases that may require more stringent safeguards. Training and certification programs for the personnel using and reviewing system outputs can ensure a uniform baseline of competence.

Systems can be designed to strengthen privacy protections, particularly with regard to the storage of reference galleries and probe images. For instance, reference galleries should always store templates, which are derived from face images, rather than the images themselves. Meanwhile, to prevent inappropriate use of probe images for searches beyond pre-defined operational needs, systems can be configured to automatically delete captured probe images at the end of a set, publicly disclosed retention period.
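The two design measures described above—storing only derived templates in the reference gallery and automatically purging probe images after a disclosed retention period—can be sketched in code. This is an illustrative sketch only, not a description of any deployed system; the class names, the template-extraction callback, and the 24-hour retention period are all assumptions introduced for the example.

```python
import time
from dataclasses import dataclass

# Assumed, illustrative retention period; a real deployment would use
# whatever period the operator has publicly disclosed.
RETENTION_SECONDS = 24 * 60 * 60


@dataclass
class GalleryEntry:
    subject_id: str
    template: tuple  # derived feature vector; the source image is not kept


class ReferenceGallery:
    """Stores only templates derived from face images, never the images."""

    def __init__(self):
        self._entries = []

    def enroll(self, subject_id, image_bytes, extract_template):
        # Derive the template, then discard the image: only the
        # template persists in the gallery.
        template = extract_template(image_bytes)
        self._entries.append(GalleryEntry(subject_id, template))


class ProbeStore:
    """Holds captured probe images and purges them past the retention period."""

    def __init__(self, retention_seconds=RETENTION_SECONDS):
        self.retention = retention_seconds
        self._probes = []  # list of (capture_timestamp, image_bytes)

    def add(self, image_bytes, now=None):
        self._probes.append(
            (now if now is not None else time.time(), image_bytes)
        )

    def purge_expired(self, now=None):
        # Delete every probe image older than the retention period and
        # report how many were removed.
        now = now if now is not None else time.time()
        kept = [(t, img) for t, img in self._probes if now - t < self.retention]
        removed = len(self._probes) - len(kept)
        self._probes = kept
        return removed
```

In this sketch the enrollment path never stores `image_bytes`, so a breach of the gallery exposes only templates, and a scheduled call to `purge_expired` enforces the retention limit on probe images without relying on operator discretion.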

Policy measures can help alleviate concerns related to the use of FRT. For instance, robust notice and consent practices could be enacted to notify individuals when their images might be stored in a reference gallery or used for training purposes and to give them a meaningful opportunity to opt out of image collection. Furthermore, deploying organizations and developers could adopt data policies that limit collection to strictly necessary purposes, tightly govern how those data are used, and limit the long-term retention and sharing of facial image data. In crafting policy, policymakers might consider the context in which FRT is deployed. For instance, policymakers could ask whether a given deployment results in a greater scope, scale, and persistence of record-keeping than existed without the use of FRT. Measures might be taken to ensure that there is adequate justification for a given deployment of FRT, that consideration is given to who will bear responsibility for protecting privacy, and that privacy protections for certain vulnerable groups are appropriate (e.g., domestic violence survivors, individuals enrolled in witness protection, and other groups who may be endangered by the sharing of their whereabouts). Privacy impact assessments are used by the federal government and other organizations as a structured approach for considering such questions and making the analysis available to the public.

___________________

77 Nader v. General Motors Corp., 255 N.E. 2d. 560, 567 (Ct. App. N.Y. 1970).


Several mitigation measures might help address civil and human rights concerns. For instance, disclosure requirements could be enacted wherein those deploying FRT must clearly and publicly state that FRT is in use and for what purposes. Industry codes of conduct could be developed to promote best practices. Tools such as export controls might be employed to restrict authoritarian regimes’ access to FRT.
