Suggested Citation:"Summary." National Academies of Sciences, Engineering, and Medicine. 2024. Facial Recognition Technology: Current Capabilities, Future Prospects, and Governance. Washington, DC: The National Academies Press. doi: 10.17226/27397.

Summary

Facial recognition technology (FRT) is an increasingly prevalent tool for automated identification and identity verification of individuals. Its speed and accuracy have improved dramatically in the past decade. FRT speeds up identification tasks that would otherwise have to be performed manually and, in many use cases, makes practical identification tasks that would be entirely infeasible without such tools.

FRT measures the pairwise similarity of digital images of human faces to establish or verify identity. It uses machine learning models to extract facial features from an image, creating what is known as a template. It then compares these templates to compute a similarity score. In one-to-one comparison, the claimed identity of a single individual is verified by comparing the template of a captured probe image with an existing reference image (is this person who they say they are?). In one-to-many comparison, an individual is identified by comparing the template of a captured face image to the templates for many individuals contained in a database of reference images known as a gallery (what is the identity of the unknown person shown in this image?).
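The two comparison modes described above can be sketched in a few lines of code. This is a minimal illustration only: the cosine-similarity measure, the 0.8 decision threshold, and the toy three-dimensional templates are assumptions made for the sketch, not properties of any particular FRT system, and the embedding model that would produce real (high-dimensional) templates is omitted entirely.

```python
import math

def cosine_similarity(a, b):
    # Templates are numeric feature vectors extracted by a face-embedding model.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def verify(probe, reference, threshold=0.8):
    # One-to-one comparison: is this person who they say they are?
    return cosine_similarity(probe, reference) >= threshold

def identify(probe, gallery, threshold=0.8):
    # One-to-many comparison: which gallery identity, if any, best matches
    # the probe? Returns None when no entry clears the threshold.
    best_id, best_score = None, threshold
    for identity, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id
```

In practice the threshold, the similarity function, and the template format are all set by the deployed system and tuned to its operating conditions; the structure of the two queries, however, is as shown.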

FRT accuracy is affected by image quality. Good quality is associated with cooperative capture in which the subject is voluntarily facing a good camera at close range with good lighting. Good lighting is especially important to give correct contrast in subjects with darker skin tones. Non-cooperative capture, in which subjects may not even realize that their image is being captured, such as images taken from security cameras, generally results in lower-quality images.

The attributes of FRT make it very useful in a number of identity verification and identification applications. These include the following:

  • FRT enables the processing of large numbers of individuals quickly. For example, at international entry points, FRT allows arriving passengers to clear passport control faster.
  • FRT makes it possible to identify high-risk individuals among large numbers of people entering a location without delaying others. FRT can, for example, be used to screen those entering a concert venue for individuals known to pose a threat to the performers.
  • FRT can be a powerful aid for law enforcement in criminal and missing person investigations because it enables investigators to generate leads using images captured at a crime scene. A number of law enforcement agencies have reported successful use of FRT to generate otherwise unavailable leads.
  • FRT can be especially convenient as a means of identity verification. For example, FRT allows a smartphone to be unlocked or a payment to be authorized without entering a passcode.

At the same time, FRT raises significant equity, privacy, and civil liberties concerns that merit attention by organizations that develop, deploy, and evaluate FRT—as well as government agencies, legislatures, state and federal courts, and civil society organizations (see the conclusions and Recommendations 3 and 4 in the following text). These concerns arise from such factors as FRT’s low cost and ease of deployment, its ability to be used by inexperienced and inadequately trained operators, its potential for surveillance and covert use, the widespread availability of personal information that can be associated with a face image, and the observed differences in false negative (FN) and false positive (FP) match rates across phenotypes and demographic groups.

These are not just abstract or theoretical concerns:

  • FRT can be a powerful tool for pervasive surveillance. Concerns about government, commercial, and private use are compounded by the potential to aggregate FRT matches over time to create a dossier of a person’s activities, preferences, and associations—as has been the case in some authoritarian regimes.
  • As FRT becomes more widespread and inexpensive, private individuals may have the means to use FRT against others in ways that raise troubling concerns about privacy and autonomy. Indeed, at least one online service already allows anyone to search for similar faces in a large gallery of images collected without explicit consent from the Web.
  • There are significant concerns about adverse equity and privacy impacts in the largely unregulated commercial sphere and the implications of collecting massive databases of face images without consent or other safeguards.
  • FRT has been implicated in at least six high-profile wrongful arrests of Black individuals. Although these incidents likely represent a small percentage of known arrests involving FRT, comprehensive data on the prevalence of FRT use, how often FRT is implicated in arrests and convictions, or the total number of wrongful arrests that have occurred on the basis of FRT use do not exist. Moreover, these incidents have occurred against a backdrop of deep and pervasive distrust by historically disadvantaged and other vulnerable populations of policing methods that have often included a variety of forensic, surveillance, and predictive technologies. The fact that all the reported wrongful arrests associated with the use of FRT have involved Black defendants exacerbates distrust of this technology. Concerningly, testing has demonstrated that FP match rates for Black individuals and members of some other demographic groups are relatively higher (albeit low in absolute terms) in FRT systems that are widely used in the United States.

Further compounding these concerns are many other potentially troubling uses—including uses that are technically feasible but not yet seen and uses that presently occur only outside the United States.

The National Academies of Sciences, Engineering, and Medicine undertook this study to assess current capabilities, future possibilities, societal implications, and governance of FRT. The study, sponsored by the Department of Homeland Security (DHS) and the Federal Bureau of Investigation, considers current use cases for FRT, explains how the technology works, and examines the legal, social, and ethical issues implicated by its use.

Deemed out of scope for this study are related computational techniques that classify a face image by category, such as race, gender, or age, or that identify specific activities, behaviors, or characteristics of an individual without leading to an identification or verification; such techniques are not normally considered facial recognition technology.


CONCLUSIONS

FRT has matured into a powerful technology for identification and identity verification. Some uses offer convenience, efficiency, or enhanced safety, while others—including ones already deployed in the United States—are troubling and raise significant equity, privacy, and civil liberties concerns that have not been resolved by U.S. courts or legislatures.

Concerns about the use of FRT arise from two (non-exclusive) factors that require different analysis and merit different policy responses:

  • Concerns about poor performance of the technology—for example, unacceptable FP or FN rates or unacceptable variation of these rates across demographic groups, especially in the case of poor-quality surveillance images.
  • Concerns about problematic use or misuse of the technology—for example, technology with acceptable technical performance sometimes produces societally undesirable outcomes as a result of either inadequate procedures or training for operating, evaluating, or making decisions using FRT or the deliberate use of FRT to achieve a societally undesirable outcome, including uses not foreseen by FRT developers or vendors.

That is, some concerns about FRT can be addressed by improving the technology, while others require changes to procedures or training, restrictions on when or how FRT is used, or regulation of the conduct that FRT enables. Furthermore, some uses of FRT may well cause such concern that they should be not only regulated but prohibited.

Currently, with a few exceptions, such as new department-wide guidance issued by DHS in September 2023, the nation does not have authoritative guidance, regulations, or laws that adequately address these concerns broadly.

Much progress has been made in recent years to characterize, understand, and mitigate phenotypical disparities in the accuracy of FRT results. However, these performance differentials have not been entirely eliminated, even in the most accurate existing algorithms. FRT still performs less well for individuals with certain phenotypes, including those typically distinguished on the basis of race, ethnicity, or gender.

Tests show that FN rate differentials are extremely small, especially with the most accurate algorithms and when both the probe and reference images are of high quality, but they can become significant when the images are not. FN matches occur when the similarity score between two different images of the same person is low. Causes include changes in appearance and loss of detail from poor image contrast. FN match rates vary across algorithms and have been measured to be higher by as much as a factor of 3 in women, Africans, and African Americans than in Whites. The algorithms that have the highest overall accuracy rates also generally have the lowest demographic variance. FN match rate disparities are highest in applications where the photographic conditions cannot be controlled; they are lower in circumstances with better photography and better comparison algorithms. The consequences of an FN match include a failure to identify the subject of an investigation or the need for an individual to identify themselves in another way, such as by presenting identity documents. Rate disparities mean, for example, that the burden of presenting identification or facing additional questioning currently falls disproportionately on some groups of individuals—including groups that have been historically disadvantaged and marginalized. Although this additional time and inconvenience may seem small in a single instance, the aggregate impacts to individuals who repeatedly encounter it and to groups disproportionately affected can be large.

Tests also show that for identity verification (one-to-one comparison) algorithms, FP match rates for certain demographic groups are relatively higher (albeit quite low in absolute terms), even with the best-performing facial recognition algorithms designed in Western countries and trained mostly on White faces, and even when both the probe and the reference images are of high quality. Demographic differentials present in verification algorithms are usually, but not always, present in identification (one-to-many comparison) algorithms as well.

FP matches occur when the similarity score between images of two different people is high. (Instances of FP matches can thus be reduced with a higher similarity threshold.) Higher FP match rates are seen with women, older subjects, and—for even the best-performing FRT algorithms designed in Western countries and trained mostly on White faces—individuals of East Asian, South Asian, and African descent. With current algorithms, FP match rate differences occur even when the images are of very high quality and can vary markedly across demographic groups, contrary to the intent of the developer. However, some Chinese-developed algorithms have the lowest FP rates for East Asian subjects, suggesting that the makeup of faces in the training database, rather than some inherent aspect of FRT, contributes to these results. FP match rate disparities can therefore likely be reduced by using more diverse data to train the models that create templates from facial images, or by training with a loss function that clusters faces within each demographic group, and separates the groups, more evenly. The applications most affected by FP match rate differentials are those using large galleries and in which most searches are for individuals who are not present in the gallery. FP rate disparities mean that members of some groups bear an unequal burden of, for example, being falsely identified as the target of an investigation.
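The threshold trade-off noted above can be made concrete with a small sketch. The similarity scores below are invented for illustration only; they do not come from any real FRT system:

```python
def error_rates(genuine_scores, impostor_scores, threshold):
    # FN rate: fraction of genuine (same-person) pairs falling below the threshold.
    fn = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    # FP rate: fraction of impostor (different-person) pairs at or above it.
    fp = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return fp, fn

# Invented scores: same-person pairs cluster high, different-person pairs lower.
genuine = [0.91, 0.88, 0.95, 0.72, 0.83]
impostor = [0.30, 0.55, 0.61, 0.20, 0.76]

low_fp, low_fn = error_rates(genuine, impostor, threshold=0.6)
high_fp, high_fn = error_rates(genuine, impostor, threshold=0.8)

# Raising the threshold trades false positives for false negatives.
assert high_fp < low_fp and high_fn > low_fn
```

In this toy data, raising the threshold from 0.6 to 0.8 eliminates the false positives but introduces a false negative, which is exactly the trade-off operators must weigh when tuning a deployed system.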

A final concern with FPs is that as the size of reference galleries or the rate of queries increases, the possibility of an FP match grows, because there are more potential templates that can return a high similarity score to a probe face. Some facial recognition algorithms, however, adjust similarity scores in an attempt to make the FP match rate independent of the gallery size.
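The gallery-size effect described above can be quantified under a simplifying assumption: if each non-mated comparison is treated as an independent trial with a fixed per-comparison FP rate f, the probability that at least one of N gallery entries produces a false match is 1 - (1 - f)^N. Real systems violate the independence assumption to varying degrees, so this is an illustration of the trend, not a model of any deployed system:

```python
def prob_any_false_positive(per_comparison_fp_rate, gallery_size):
    # Probability that at least one of gallery_size independent non-mated
    # comparisons exceeds the match threshold: 1 - (1 - f)^N.
    return 1.0 - (1.0 - per_comparison_fp_rate) ** gallery_size

# Even a tiny per-comparison rate compounds across a large gallery.
for n in (1_000, 100_000, 10_000_000):
    print(n, prob_any_false_positive(1e-6, n))
```

With a per-comparison rate of one in a million, a thousand-entry gallery yields roughly a 0.1 percent chance of some false match per search, while a ten-million-entry gallery makes a false match nearly certain, which is why very large galleries demand much stricter thresholds or score normalization.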

With respect to the need for regulation of FRT, the committee concluded that an outright ban on all FRT under any condition is not practically achievable and may not be desirable to all, but that restrictions or other regulations are appropriate for particular use cases and contexts.

At the same time, the committee observes that because FRT has the potential for mass surveillance of the population, courts and legislatures will need to consider the implications for constitutional protections related to surveillance, such as due process and search and seizure thresholds and free speech and assembly rights.

In grappling with these issues, courts and legislatures will have to consider such factors as who uses FRT, where it is used, what it is being used for, under what circumstances it is appropriate to use FRT-derived information provided by third parties, whether FRT use is based on individualized suspicion, intended and unintended consequences, and susceptibility to abuse.

As governments and other institutions take affirmative steps through both law and policy to ensure the responsible use of FRT, they will need to consider the views of government oversight bodies, civil society organizations, and affected communities to develop appropriate safeguards.

Study committee members all agreed that some use cases of FRT should be permissible, that some use cases should be allowed only with significant limits or regulation, and that others likely should be altogether prohibited. But committee members did not reach a fully shared consensus on precisely which use cases should be permitted and how permitted uses should be regulated or otherwise governed, reflecting the complexity of the issues raised; their individual assessments of the risks, benefits, and trade-offs; and their perspectives on the underlying values. However, the committee is in full agreement with the following recommendations.

MITIGATING POTENTIAL HARMS AND LAYING THE GROUNDWORK FOR MORE COMPREHENSIVE ACTION

RECOMMENDATION 1: The federal government should take prompt action along the lines of Recommendations 1-1 through 1-6 to mitigate the potential harms of facial recognition technology and lay the groundwork for more comprehensive action.


RECOMMENDATION 1-1: The National Institute of Standards and Technology should sustain a vigorous program of facial recognition technology testing and evaluation to drive continued improvements in accuracy and reduction in demographic biases.

Testing and standards are valuable tools for driving performance improvements and establishing appropriate testing protocols and performance benchmarks, providing a firmer basis for justified public confidence, for example by setting an agreed-on baseline of performance that a technology must meet before it is deployed. The National Institute of Standards and Technology's (NIST's) Face Recognition Technology Evaluation has proven valuable for assessing, and thereby propelling, advances in FRT performance, including by increasing accuracy and reducing demographic differentials.

NIST would be a logical home for such activities within the federal government, given its role in measurement and standards generally and FRT evaluation specifically.

The following two subrecommendations apply to law enforcement’s use of FRT to identify suspects in criminal investigations.


Even if not subject to federal grant conditions, state and local agencies should adopt these standards.

RECOMMENDATION 1-5: The federal government should establish a program to develop and refine a risk management framework to help organizations identify and mitigate the risks of proposed facial recognition technology applications with regard to performance, equity, privacy, civil liberties, and effective governance.

Risk management frameworks are a valuable tool for identifying and managing sociotechnical risks, defining appropriate measures to protect privacy, ensuring transparency and effective human oversight, and identifying and mitigating concerns around bias and equity. A risk management framework could also form the basis for future mandatory disclosure laws or regulations. Current examples of federally defined risk management frameworks include NIST’s Cybersecurity Framework and NIST’s Artificial Intelligence Risk Management Framework. NIST would be a logical organization to be charged with developing this framework given its prominent role in FRT testing and evaluation as well as in developing risk management frameworks for other technologies.

Some issues that might be addressed by the framework are

  • Technical performance—including accuracy and differential performance across standardized demographic groups, quality standards for probe and reference images, and adequate indication of the confidence of reported matches.
  • Equity—including the extent to which there are statistically and materially significant differences in error probabilities for different demographic groups, the extent to which these are attributable to technical characteristics or other factors (e.g., the manner in which an FRT tool is used), and the parity of use among different populations.
  • Privacy—including privacy protection for faces used in training the template extraction model, whether use of FRT significantly increases the scope or scale of the identification being performed, or other adverse privacy impact.
  • Data collection, disclosure, use, and retention policies for both subject and reference images and templates—including data retention policies to limit, for example, inappropriate use of probe images for searches beyond pre-defined operational needs.
  • Data security and integrity—including adequately protecting information in FRT training data sets and reference databases from exfiltration and misuse.
  • Civil liberties—including whether FRT is being used to control access to a public benefit or service and whether the use of FRT will have a reasonably foreseeable negative impact on the exercise of civil rights, such as free speech or assembly, whether by individuals or groups.
  • Governance—including whether there is an important public interest or legitimate business purpose; who decides whether and how to deploy FRT, and who assumes the risks and accrues the benefits of its use; consultation with the public at large or with affected groups, and meaningful consideration of results; and appropriate safeguards, oversight, and quality assurance.
  • Disclosure—including meaningful public disclosure about where, when, and for what purpose the system is used. Transparency and standardized reporting become more important in use cases where there are greater consequences for mistakes and errors.
  • Consent—including whether consent is opt-in or opt-out and whether consent is meaningful and uncoerced, and in the case of mandatory use, whether the justification is clear and compelling.
  • Training—including what sort of capabilities or competencies the operator of an FRT system, and those using its output, need to demonstrate and whether the training or certification regimes meet the needs of the system usage.
  • Human-in-the-loop—including whether there is an individual responsible for all significant decisions made on the basis of an FRT match.
  • Accountability—including who is responsible for addressing systematic technical issues with an FRT system, the manner in which it is used, ethical and societal concerns that arise from the social environment in which it is used, and whether and how frequently audits are conducted.
  • Adverse impacts and their distribution—including the potential adverse impacts of an FP or FN match in the proposed use, identifying who bears the consequences of those impacts, and indicating whether costs are borne primarily by the individual subject or the operator of the technology.
  • Recourse—including whether recourse mechanisms provide redress proportional to potential consequences, whether they are available to individuals who will experience adverse outcomes, and whether the organization has a mechanism for receiving complaints.

Note that some of the issues listed here cut across most, if not all, FRT use cases, while others are specific to particular use cases.

RECOMMENDATION 1-6: The federal government should support research to improve the accuracy and minimize demographic biases and to further explore the sociotechnical dimensions of current and potential facial recognition technology uses.

Public research organizations, such as NIST, already undertake important work in setting benchmarks and evaluating the performance of FRT systems. Additional government support could help NIST answer important questions on the performance of FRT systems in non-cooperative settings, how to improve data sets to both preserve privacy and promote equity in the performance of FRT tools, and how best to continue recent work on characterizing, understanding, and mitigating phenotypical disparities. To understand better how to responsibly deploy FRT while protecting equity, fairness, and privacy, NIST, DHS’s Maryland Test Facility, or a similarly well-suited institution should conduct research on

  • The accuracy of FRT systems in a variety of non-optimal settings, including non-optimal facial angle, focus, illumination, and image resolution.
  • The development of representative training data sets for template extraction and other methods that developers can safely apply to existing data sets and models to adjust for demographic mismatches between a given data set and the public.
  • The performance of FRT with very large galleries (i.e., tens or hundreds of millions of entries), to better understand the impacts of FP and FN match rates as the size of galleries used continues to grow.

To advance the science of FRT and to better understand the sociotechnical implications of FRT use, the National Science Foundation or a similar research sponsor should support research on

  • Developing privacy-preserving methods to prevent malicious actors from reverse-engineering face images from stored templates.
  • Mitigating FP match rate variance across diverse populations, and building better understanding of the levels at which residual disparities will not significantly affect real-world performance.
  • Developing approaches that can reduce demographic and phenotypical disparities in accuracy.
  • Developing accurate and fast methods for directly matching an encrypted probe image template to an encrypted template or gallery—for example, using fully homomorphic encryption.
  • Developing robust methods to detect face images that have been deliberately altered by either physical means such as masks, makeup, and other types of alteration or by digital means such as computer-generated images.
  • Determining whether FRT use deters people from using public services, particularly members of marginalized communities.
  • Determining how FRT is deployed in non-cooperative settings, public reaction to this deployment, and its impact on privacy.
  • Determining how FRT may be used in the near future by individuals for abusive purposes, including domestic violence, harassment, political opposition research, etc.
  • Determining how private actors might use FRT in ways that mimic government uses, such as homeowners who deploy FRT for private security reasons.
  • Researching future uses of FRT, and their potential impacts on various subgroups of individuals.

FOSTERING TRUST AND MITIGATING BIAS AND OTHER RISKS

RECOMMENDATION 2: Developers and deployers of facial recognition technology should employ a risk management framework and take steps to identify and mitigate bias and cultivate greater community trust.

RECOMMENDATION 2-1: Organizations deploying facial recognition technology (FRT) should adopt and implement a risk management framework addressing performance, equity, privacy, civil liberties, and effective governance to assist with decision making about appropriate use of FRT.

Until the recommended risk management framework is developed, the issues listed in Recommendation 1-5 may serve as a useful point of departure. Future standards documents may also provide relevant guidance.


Such practices are imperative to help address mistrust about bias in FRT’s technological underpinnings and to respond to broader mistrust, especially in communities of color, about the role of technology in law enforcement and similar contexts.

ENACTING MORE COMPREHENSIVE SAFEGUARDS


An additional set of issues with respect to inclusion in galleries relates to the collection and use of images gathered from websites and social media platforms—both whether it is appropriate to use these without consent or knowledge and the implications of including low-quality or synthetic images collected in this manner. Under current law, the fact that a gallery was created by harvesting facial images from the Web in violation of platforms’ terms of service does not bar its use. Of course, Congress, a state legislature, or even a policing authority could promulgate a new rule barring the use of FRT applications developed without consent from those whose data is used for training.

Precisely which uses are or are not allowed merits careful consideration by legislators and the public at large. The risk management framework discussed earlier may provide a useful tool for considering these questions.


* * *

FRT is a powerful tool with profound societal implications. It will be critically important to adopt a considered approach to its governance and future development.


Facial recognition technology is increasingly used for identity verification and identification, from aiding law enforcement investigations to identifying potential security threats at large venues. However, advances in this technology have outpaced laws and regulations, raising significant concerns related to equity, privacy, and civil liberties.

This report explores the current capabilities, future possibilities, and governance needs of facial recognition technology. It discusses the technology's legal, societal, and ethical implications and recommends ways that federal agencies and other organizations developing or deploying it can mitigate potential harms and enact more comprehensive safeguards.
