5

Conclusions and Recommendations

Facial recognition technology (FRT) has matured into a powerful technology for identification and identity verification. Some uses offer convenience, efficiency, or enhanced safety, while others—including ones already deployed in the United States—are troubling and raise significant equity, privacy, and civil liberties concerns that have not been resolved by U.S. courts or legislatures.

Concerns about the use of FRT arise from two (non-exclusive) factors that require different analysis and merit different policy responses:

  • Concerns about poor performance of the technology—for example, unacceptable false positive (FP) or false negative (FN) rates or unacceptable variation of these rates across demographic groups.
  • Concerns about problematic use or misuse of the technology—for example, when technology with acceptable performance produces societally undesirable outcomes because of inadequate procedures or training for operating, evaluating, or making decisions using FRT, or when FRT is deliberately used to achieve an outcome not foreseen by developers or vendors.

That is, some concerns about FRT can be addressed by improving the technology while others require changes to procedures or training, restrictions on when or how FRT is used, or regulation of the conduct that FRT enables. Furthermore, some uses of FRT may well cause such concern that they should be not only regulated but prohibited.

TECHNICAL PERFORMANCE AND STANDARDS

Current top-performing facial recognition algorithms provide prompt, high-confidence matches when the probe image is obtained cooperatively and when the reference image is of high quality. Under these conditions and using today’s best face recognition algorithms, 99.9 percent of searches with a sufficiently clear face image will return the correct matching entry in a government database of 12 million identities in under a second.

Two key performance metrics are the FP and FN match rates; a schematic illustration of how these rates are tallied at a given decision threshold follows the definitions below.

  • An FP occurs when the technology erroneously associates the template of a probe image with a template in the gallery. In some cases, the individual photographed in the probe image may not even have a corresponding template in the reference gallery. Recent stories of false arrests enabled by FRT typically involve an FP match, as the image of an innocent person in the gallery is incorrectly matched to a probe image of a suspected perpetrator. As the size of reference galleries or the rate of queries increases, the possibility of a false match grows, because there are more potential templates that can return a high similarity score to a probe face. The FP rate will be very high for twins and other individuals with a close familial resemblance to the probe face.
  • An FN occurs when a probe image of an individual whose image is contained in the reference gallery returns no matches. For instance, when a passenger on a departing airplane is asked to present their face for recognition at the boarding gate, an FN may occur when the technology erroneously fails to identify the passenger in the gallery of individuals on the flight manifest. In this case, an FN may require the traveler to show photo identification.
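
To make these definitions concrete, the sketch below (hypothetical similarity scores and threshold; not any deployed system or vendor interface) tallies FN and FP rates from similarity scores for mated (same-person) and non-mated (different-person) comparisons at a chosen decision threshold.

```python
# Minimal illustrative sketch (toy values, hypothetical threshold): tally false
# negative and false positive rates from similarity scores at a decision threshold.
from typing import Sequence, Tuple


def error_rates(mated_scores: Sequence[float],
                nonmated_scores: Sequence[float],
                threshold: float) -> Tuple[float, float]:
    """Return (false_negative_rate, false_positive_rate) at `threshold`."""
    # False negative: a same-person comparison whose similarity falls below the threshold.
    fn_rate = sum(s < threshold for s in mated_scores) / len(mated_scores)
    # False positive: a different-person comparison whose similarity reaches the threshold.
    fp_rate = sum(s >= threshold for s in nonmated_scores) / len(nonmated_scores)
    return fn_rate, fp_rate


if __name__ == "__main__":
    mated = [0.91, 0.88, 0.52, 0.97]     # same-person comparisons (toy values)
    nonmated = [0.10, 0.35, 0.81, 0.22]  # different-person comparisons (toy values)
    print(error_rates(mated, nonmated, threshold=0.80))  # -> (0.25, 0.25)
```

Raising the threshold trades one error for the other: fewer non-mated comparisons clear it, but more mated comparisons fall below it.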

Matching performance will be worse when the probe image is obtained under suboptimal conditions (e.g., poor lighting) or when the reference image is outdated or of low resolution or contrast. Nevertheless, with the best available algorithms, as long as both eyes in a face can be automatically detected, a probe image can be matched to an individual with more than 99 percent accuracy.1 In many cases, even if only one eye can be detected, an image of an individual can still be matched with high accuracy; even profile-view images can often be correctly matched.

Much progress has been made in recent years to characterize, understand, and mitigate phenotypical disparities in the accuracy of FRT results. However, these performance differentials have not been entirely eliminated, even in the most accurate existing algorithms. FRT still performs less well for individuals with certain phenotypes, including those typically distinguished on the basis of race, ethnicity, or gender.

___________________

1 See the latest NIST FRTE report on 1:N matching, https://pages.nist.gov/frvt/reports/1N/frvt_1N_report.pdf.

Tests show that FN rate differentials are extremely small if both the probe and reference images are of high quality, but the differentials can become significant if they are not. FN matches occur when the similarity score between two different images of one person is low. Causes include changes in appearance and loss of detail from poor image contrast. FN match rates vary across algorithms and have been measured to be higher by as much as a factor of 3 in women, Africans, and African Americans than in Whites. The most accurate algorithms also generally have the lowest demographic variance. FN match rate disparities are highest in applications where the photographic conditions cannot be controlled and can be reduced with better photography and better comparison algorithms. The consequences of an FN match include a failure to identify the subject of an investigation or the need for an individual to identify themselves in another way, such as by presenting identity documents. Rate disparities mean, for example, that the burden of presenting identification falls disproportionately on some groups of individuals—including groups that have been historically disadvantaged and marginalized. Although this additional time and inconvenience may seem small in a single instance, the aggregate impacts on individuals who repeatedly encounter it and on groups disproportionately affected can be large.

FP matches occur when the similarity score between images of two different people is high. (The likelihood of an FP can thus be reduced with a higher similarity threshold.) Higher FP match rates are seen with women, older subjects, and—for FRT algorithms designed and trained in the West—individuals of East Asian, South Asian, and African descent. However, some Chinese-developed algorithms have the lowest FP rates for East Asian subjects. FP match rate differences occur even when the images are of very high quality and can vary markedly across demographic groups, contrary to the intent of the developer. FP match rate disparities can be reduced by using more diverse data to train the models that create templates from facial images or by training those models with a loss function that more evenly clusters and separates demographic groups. The applications most affected by FP match rate differentials are those using large galleries and where most searches are for individuals who are not present in the gallery. FP rate disparities will mean that members of some groups bear an unequal burden of, for example, being falsely identified as the target of an investigation.
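
As a sketch of how such differentials might be quantified in an evaluation (hypothetical group labels and toy scores; real evaluations such as NIST's use far larger, carefully curated trial sets), FP match rates can be disaggregated by demographic group and compared:

```python
# Illustrative sketch (hypothetical group labels, toy scores): disaggregate the
# false positive match rate by demographic group to surface differentials like
# those discussed above.
from collections import defaultdict
from typing import Dict, Iterable, List, Tuple


def fp_rate_by_group(nonmated_trials: Iterable[Tuple[str, float]],
                     threshold: float) -> Dict[str, float]:
    """nonmated_trials holds (group_label, similarity_score) for different-person pairs."""
    scores: Dict[str, List[float]] = defaultdict(list)
    for group, score in nonmated_trials:
        scores[group].append(score)
    # Per-group false positive match rate: share of different-person pairs at or
    # above the threshold.
    return {g: sum(s >= threshold for s in ss) / len(ss) for g, ss in scores.items()}


trials = [("group_a", 0.10), ("group_a", 0.35),
          ("group_b", 0.81), ("group_b", 0.22)]
print(fp_rate_by_group(trials, threshold=0.80))  # -> {'group_a': 0.0, 'group_b': 0.5}
```

The same disaggregation applied to mated comparisons yields per-group FN match rates of the kind discussed in the preceding paragraph.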

Tests also show that for identity verification (one-to-one comparison) algorithms, the FP match rates for certain demographic groups are relatively higher than for others (albeit very low in absolute terms), even when using the best-performing facial recognition algorithms designed in Western countries and trained mostly on White faces, and even if both the probe and reference images are of high quality.

A final concern with FPs is that as the size of reference galleries or the rate of queries increases, the possibility of an FP match grows, as there are more potential templates that can return a high similarity score to a probe face. Some face recognition algorithms, however, adjust similarity scores in an attempt to make the FP match rate independent of the gallery size.
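
A simple way to see this gallery-size effect: under the simplifying assumption that a 1:N search behaves like N independent non-mated comparisons, each with per-comparison false match rate FMR, the probability that at least one comparison returns a false positive (the false positive identification rate, FPIR) grows with gallery size roughly as

```latex
% Simplified independence model; real templates and scores are correlated,
% so this is an approximation, not a property of any particular algorithm.
\[
  \mathrm{FPIR}(N) \;=\; 1 - \bigl(1 - \mathrm{FMR}\bigr)^{N}
  \;\approx\; N \cdot \mathrm{FMR}
  \qquad \text{when } N \cdot \mathrm{FMR} \ll 1 .
\]
```

Under this model, a tenfold larger gallery yields roughly a tenfold higher chance of a false positive at a fixed threshold, which is why the score adjustments mentioned above aim to hold the effective FP match rate steady as galleries grow.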

RECOMMENDATION 1: The federal government should take prompt action along the lines of Recommendations 1-1 through 1-6 to mitigate potential harms of facial recognition technology and lay the groundwork for more comprehensive action.

RECOMMENDATION 1-1: The National Institute of Standards and Technology should sustain a vigorous program of facial recognition technology testing and evaluation to drive continued improvements in accuracy and reduction in demographic biases.

Testing and standards are valuable tools for driving performance improvements and establishing appropriate testing protocols and performance benchmarks; they provide a firmer basis for justified public confidence, for example, by establishing an agreed-on baseline of performance that a technology must meet before it is deployed. The National Institute of Standards and Technology’s (NIST’s) Facial Recognition Technology Evaluation has proven to be a valuable tool for assessing and thereby propelling advances in FRT performance, including by increasing accuracy and reducing demographic differentials. This work, and the trust it has engendered, provide the foundation for NIST to take on an expanded role in developing needed standards in such areas as evaluating and reporting on performance, minimum image quality, data security, and quality control.

NIST would be a logical home for such activities within the federal government given its role in measurement and standards generally and FRT evaluation specifically.

RISK MANAGEMENT FRAMEWORK

Organizations deploying FRTs face a complex set of trade-offs and considerations as they seek to use the technology fairly and effectively. To help manage these complex tradeoffs around privacy, equity, civil liberties, and technical performance, a framework that is specified in advance can help users identify and manage risks, define appropriate measures to protect privacy, ensure transparency and effective human oversight, and identify and mitigate concerns around equity. A framework can similarly assist bodies charged with oversight of FRTs, whether governmental agencies or civil society organizations, in making decisions about where the use of FRTs is appropriate and where it should be constrained. Such a framework could also form the basis for future mandatory disclosure laws or regulations.

RECOMMENDATION 1-5: The federal government should establish a program to develop and refine a risk management framework to help organizations identify and mitigate the risks of proposed facial recognition technology applications with regard to performance, equity, privacy, civil liberties, and effective governance.

Risk management frameworks are a valuable tool for identifying and managing risks, defining appropriate measures to protect privacy, ensuring transparency and effective human oversight, and identifying and mitigating concerns around equity. A risk management framework could also form the basis for future mandatory disclosure laws or regulations.2 Current examples of federally defined risk management frameworks include NIST’s Cybersecurity Framework and NIST’s Artificial Intelligence Risk Management Framework.

___________________

2 A recent Federal Trade Commission statement calls for assessment of risks.

NIST would be a logical organization to be charged with developing this framework, given its prominent role in FRT testing and evaluation as well as in developing risk management frameworks for other technologies.

A framework for the use of FRT might address the following:

  1. Technical performance

    1.1 Does the FRT perform with the accuracy of current state-of-the-art systems? Does it perform with adequate accuracy for the intended application?

    1.2 Does the FRT have differential accuracy rates across different demographic groups of concern that are as low as current state-of-the-art systems? Is the differential adequately low for the intended application?

    1.3 Does it conform to the prevailing technical standards at the time of deployment, such as those specified by NIST?

    1.4 Do the subject and reference images conform to appropriate standards for image quality to support a match at the intended level of confidence?

    1.5 Does the FRT system adequately communicate to users the confidence of a reported match?

    1.6 Does it provide users with sufficient contextual information to mitigate other kinds of error?

  2. Equity, privacy, and civil liberties

    2.1 Equity

    2.1.1 Does use of the FRT system result in statistically and materially significant differences in treatment across demographic groups? Is this attributable to technical characteristics (1.2) or to other factors?

    2.1.2 What steps have been taken to mitigate equity risks associated with using the technology in a specific use case?

    2.1.3 How are any of these differences assessed, reported, and disclosed?

    2.1.4 What training is being conducted to ensure that when in use, users understand FRT impacts on federally protected groups?

    2.1.5 What pre-deployment assessment of the FRT’s design has been done to mitigate risks around equity concerns?

    2.1.6 Whose images make up the training data, and in what contexts were the data collected (e.g., public or private databases)?

    2.1.7 Who is participating in the model’s design and evaluating outcomes for equity?

    2.1.8 Are the extracted data representative enough to avert potential errors in positive identification?

    2.1.9 What documentation is being gathered to audit for civil rights compliance and equity?

    2.1.10 What are the apparent and unintended sociotechnical outcomes of the FRT?

    2.2 Privacy

    2.2.1 Privacy of faces used in training the template extraction model

    2.2.1.1 Are privacy-preserving methods used, and if not, what other measures are taken to protect the privacy of people whose images were used?

    2.2.1.2 Are data used for training the template extraction model acquired with consent and in compliance with relevant user agreements? Will these data be purchased or sold without the consent of the individuals in the data set?

    2.2.1.3 Was the database constructed with data obtained in compliance with the terms of service for the data source?

    2.2.2 Are best practices for data security and integrity of FRT training data and reference databases—including adequately protecting information in FRT training data sets and reference databases from exfiltration and misuse—being followed?

    2.2.3 Have appropriate data collection, disclosure, use, and retention policies for both subject and reference images and templates been put in place to limit, for example, inappropriate use of probe images for searches beyond pre-defined operational needs?

    2.2.4 Does the use of FRT significantly increase the scope or scale of the identification being performed?

    2.2.4.1 In a world before FRT, would you have been identified in this setting?

    2.2.4.2 Does the use of FRT allow for identification on a scale that would have been impractical without FRT?

    2.2.4.3 Is the reference database being searched appropriate to the application? Is the search being performed in the smallest possible closed group?

    2.2.4.4 Would there have been a record kept of the identification, and for how long? Is this record-keeping consistent with the record-keeping without FRT?

    2.2.4.5 If FRT is being used for forensic purposes, is the record kept consistent with current forensic practice?

    2.2.5 Does the use of FRT lead to any other adverse privacy impacts?

    2.3 Civil liberties

    2.3.1 Is the outcome of this FRT being used to control access to a public benefit or service, and if so, does it accord with due process norms?

    2.3.2 Would the deployment of FRT in a given use case have a reasonably foreseeable negative impact on the exercise of civil rights, such as free speech or assembly, whether by individuals or groups?

    2.3.3 Is the use of FRT in compliance with existing civil rights laws?

    2.4 Surveillance (which implicates equity, privacy, and civil liberties concerns)

    2.4.1 Is the FRT being used by government actors, commercial interests, or private individuals? (Government and commercial uses of FRT may be more amenable to regulation and oversight than use by private individuals.)

    2.4.2 Is FRT applied to images collected retrospectively, live, or prospectively? (The use of retrospective images may mean that the subjects’ images were collected without notice or consent that FRT use was contemplated at the time of collection.)

    2.4.3 Is FRT applied for mass surveillance or individually targeted use? Is its use limited or indefinite in duration? (Indiscriminate or indefinite use of FRT on large crowds may pose greater threats to civil rights and civil liberties than the use of FRT to identify one or several individuals based on individualized suspicion.)

    2.4.4 Is the FRT application susceptible to uses constituting harassment, abuse, or new opportunities for criminal or civil harm? (Current or future FRT applications may, for instance, invite private individuals to identify persons in sensitive situations, track their movements, or endanger their safety.)

    2.4.5 Is the FRT application intended to be used covertly or transparently, particularly in places traditionally deemed public? If notice is provided, is the context such that it is reasonable to expect people to be able to make a choice about using such locations?

    2.4.6 Is the FRT application being used for exclusionary, adversarial, or punitive purposes, or is it likely to be so used?

    2.4.7 Is the FRT application being used against communities or in places that have historically experienced abusive or disproportionate surveillance practices, or is it likely to be so used?

    2.4.8 Do those who believe they have been subjected to a mistaken FRT match have a means of redress (e.g., administrative complaints, legal causes of action, etc.)?

  3. Governance

    3.1 Public interest or legitimate business purpose

    3.1.1 For government uses, is there an important public interest? Does FRT clearly enable that interest to be better served? What costs are imposed, and has every effort been made to minimize them?

    3.1.2 For commercial and other private uses, is there a legitimate business purpose?

    3.1.3 Is FRT being used for cases beyond the stated purpose?

    3.1.4 What safeguards exist against unauthorized uses?

    3.2 Decision-making about deployment

    3.2.1 Who decides whether and how to deploy the technology?

    3.2.2 Who will be operating the technology?

    3.2.3 Does the organization deploying and operating the FRT bear the risks, or are the risks externalized?

    3.3 Community and stakeholder engagement

    3.3.1 What consultation is done with the public at large or specific potentially affected groups?

    3.3.2 Has the consultation engaged with a sufficiently large and representative set of individuals?

    3.3.3 Have the results of the consultation been meaningfully considered (and at a minimum, have any changes been made) in determining whether deployment is appropriate, and whether safeguards are needed?

    3.4 Safeguards and oversight

    3.4.1 Who is responsible for ensuring that appropriate safeguards are in place and being followed?

    3.4.2 Does the system produce a record that can be used ex post for system verification and evaluation?

    3.4.3 Are safeguards, such as access controls or audit trails, in place to prevent unintended use—and if such use occurs, to impose appropriate penalties?

    3.4.4 Does the system keep biometric data separate from non-biometric data?

    3.4.5 Does the entity using FRT adhere to quality management and assurance practices per the ISO 9000 standards?

    3.5 Disclosure

    3.5.1 Is there meaningful public disclosure about where, when, and for what purpose the system is used, or has a clear and compelling justification been offered for why such disclosure is not needed?

    3.5.2 Is there a clear and publicly accessible data retention policy for both subject and reference images? Will the data be sold or transferred to another entity? Is this narrowly tailored to the stated purpose, and is this properly disclosed?

    3.5.3 In data retention systems, are sufficient guardrails established regarding the sharing and retention of images for purposes other than the reason for the original retention?

    3.6 Consent

    3.6.1 Is the FRT system opt-in? If it is opt-in, is the opt-in mechanism uncoerced? If it is an opt-out application, is the opt-out mechanism meaningful? (Analogous questions arise with both consent for the use of an FRT system and consent for one’s face to be included in a reference gallery.)

    3.6.2 If FRT is mandatory (i.e., there is no opt-in or opt-out), is there a clear and compelling justification?

    3.6.3 Are individuals in practice able to consent to the proposed use? Are individuals reasonably able to understand the implications of consent? If individuals were given the option not to consent, what fraction of them would refuse in this application?

    3.6.4 Are there procedures in place for persons who cannot consent by law (e.g., minors, etc.)?

    3.6.5 Were the reference images captured appropriately—that is, with consent or per legitimate government authority? Is there a protocol for eliminating reference images that are gathered without proper and lawful authority?

    3.7 Training

    3.7.1 What sort of capabilities or competencies does the operator of an FRT system need to demonstrate? How are these updated as new capabilities are added to an FRT system?

    3.7.2 Do the training or certification regimes adequately mitigate the risks of system use?

    3.8 Human-in-the-loop

    3.8.1 Is an identified individual responsible for all significant decisions or actions made on the basis of an FRT match result?

    3.9 Accountability

    3.9.1 What is the expected (positive) outcome or possible adverse outcome for an individual? What is the cost or consequence to an individual of an adverse outcome?

    3.9.2 Are appropriate (i.e., commensurate with cost/consequence) recourse/redress mechanisms available to individuals who will experience adverse outcomes?

    3.9.3 Does the organization using FRT have a mechanism for receiving complaints? Is it easy for individuals experiencing issues with the FRT system to find and use the complaint mechanism?

Note that some of the issues in this list cut across most if not all use cases, while others depend on the particular use case.

APPLYING THE FRAMEWORK TO REAL-WORLD USE CASES

The framework outlined in the preceding section is intended to identify issues that arise from the use of FRT in specific contexts. To show how the questions delineated in the risk management framework can provide helpful insight in concrete situations, this section applies portions of the framework to four of the use cases introduced in Chapter 3—employee access control, aircraft boarding, protest surveillance, and retail loss prevention—and develops a set of potential best practices for each. These applications are brief and certainly do not consider every element of the risk framework, but they illustrate how a framework such as that suggested above can draw attention, in particular use cases, to key design and use issues that may enhance or detract from important values, like privacy and transparency. Encouraging (or requiring) that a framework be used to assess any given FRT invites organizations to, in essence, “show their work” and thus enhances transparency and, in many instances, can lead to greater care in system design.


Use of Facial Recognition Technology for Employee Access Control

Applying the risk management framework to the use of FRT for employee access control suggests that the following considerations—with respect to image collection, use, and retention; disclosure and consent; and fallback or alternative procedures—are of particular importance.

Image Collection, Use, and Retention

  • Ensure that probe image collection is limited to select check-in locations such as a building entrance or security checkpoint. This helps guarantee that images are only collected when operationally necessary—that is, when an employee presents themselves for access to the facility.
  • Ensure that probe image retention periods are strictly limited. For instances of controlling access to a facility, there is less need to keep the image for a long period of time. If, during the retention period, a probe image needs to be accessed and checked again (e.g., in case the employer wishes to determine whether a person was incorrectly granted access), administrators should seek organizational approval to access the image, documenting a specific purpose for which the image is needed.
  • If an organization must share a probe image with another organization such as law enforcement, share only relevant probe images when data are requested and ensure that recipients also have adequate safeguards in place to limit the retention and use of images.
  • Collect reference images when employees are hired and update them periodically in response to changes in the face from aging and to the technical needs of new systems.
  • Store reference images in a secured system for managing access control and do not distribute or store them externally.
  • Purge retained images after a set period of time when an employee leaves the organization or when a new reference image is collected.

Disclosure and Consent

  • Ensure that cameras used to collect probe images are highly visible and feature signage detailing the purpose of the use of FRT and how captured images are used and retained.
  • Organizations can assume consent from their employees and make enrollment mandatory, given the legitimate business purpose of regulating access to the workplace, but they bear the responsibility for protecting reference images from disclosure.

Fallback or Alternative Procedures

  • Use manual identification as a failsafe if an FRT system fails to verify the identity of an employee so that the employee is not incorrectly denied access.
  • Use manual identification to regulate access for authorized visitors and other non-employees, from whom the organization may not have obtained implied consent as a condition of employment.

Use of Facial Recognition Technology for Aircraft Boarding

Applying the risk management framework to the use of FRT as an alternative to other methods of identity verification when boarding an aircraft suggests that the following considerations—with respect to image collection, use, and retention; disclosure and consent; fallback procedures; and equity—are of particular importance.

Image Collection, Use, and Retention

  • Point equipment capturing probe images away from areas where passengers congregate to prevent the inadvertent photographing of any passenger who chooses to opt out of facial recognition.
  • Retain reference images for limited time periods as established by local or federal regulations. Note that a long-term record of a passenger’s identity is kept regardless of whether a passenger presents a boarding pass or uses facial recognition. However, associating an individual’s identity with a flight does not require the long-term storage of biometric data.
  • Require administrative approval and documentation if these data are to be kept for an extended period of time or shared with a third party.
  • Share only relevant probe images when data are requested by law enforcement investigators and ensure that recipients also have adequate safeguards in place to limit the retention and use of images.
  • Use the reference gallery of passengers3 included in the manifest only for the purpose of boarding an aircraft, and terminate access to the gallery once the aircraft has departed (unless it is needed by an international entity receiving the passengers).

Disclosure and Consent

  • Ensure that cameras used to collect probe images are highly visible and feature signage detailing the purpose of the use of FRT and how captured images are used and retained.

___________________

3 The reference gallery is collected from the Customs and Border Protection’s Traveler Verification Service.

  • Notify passengers of their right to opt out of facial recognition screening and establish alternate procedures to ensure that those opting out are not significantly delayed or inconvenienced.

Fallback Procedures

  • Maintain existing procedures for verifying a passenger’s claim to board an aircraft—for example, the ability to scan boarding passes and check physical documents—for passengers who choose to opt out of FRT identification.

Equity

  • Collect statistics on whether members of particular demographic groups experience different FN match rates—that is, instances where individuals must physically present identification—and report the resulting aggregate time and inconvenience burdens.

Use of Facial Recognition Technology to Surveil a Protest

Applying the risk management framework to the use of FRT to surveil a protest suggests that the following considerations—with respect to image collection, use, and retention and disclosure and consent—are of particular importance.

Image Collection, Use, and Retention

  • Strictly limit law enforcement image collection to defined public safety purposes so as to avoid a chilling effect on First Amendment rights.
  • Use FRT only to identify individuals suspected of engaging in criminal behavior.
  • Ensure that probe image retention periods are strictly limited to the time reasonably needed to conclude any criminal investigations that arise from an event.

Disclosure and Consent

  • Develop and make publicly available policies that define the specific circumstances under which images are collected at public protests or submitted for FRT matching.

Use of Facial Recognition Technology to Assist in Retail Loss Prevention

The use of FRT for retail loss prevention differs from the use cases above because it takes place in a context where video surveillance has been widely used for decades. Applying the risk management framework to this use case suggests that the following considerations—with respect to image collection, use, and retention; disclosure and consent; and verification of an FRT match—are of particular importance:


Image Collection, Use, and Retention

  • Include in the reference gallery of known shoplifters only individuals arrested for relevant offenses committed in nearby geographic locations and within a set period of time.
  • Before sharing a face image of a shoplifter known to one retailer with other retailers, consider whether the consequence of exclusion from multiple stores is warranted by the shoplifting threat the individual poses.

Disclosure and Consent

  • Post prominent signs indicating that video surveillance and FRT are being used to identify known shoplifters and describe store procedures for handling customers identified using FRT as known shoplifters.

Verification of a Facial Recognition Technology Match

  • If FRT identifies a customer as a known shoplifter, before taking action to remove the customer, dispatch a security guard or other store employee to obtain a government-issued photo identification from the customer and verify that the FRT identification was correct.

USE OF FACIAL RECOGNITION FOR LAW ENFORCEMENT INVESTIGATIONS

Applying the risk management framework to the use of FRT in law enforcement investigations suggests that it is important that (1) only validated (or certified, if a certification regime is established) FRT systems are used by law enforcement; (2) there is adequate training of users; (3) potential uses are defined and disclosed; (4) there is appropriate disclosure to an individual when FRT is at least one of the factors that has been used to identify them; (5) there are appropriate limits on law enforcement use that balance citizen privacy protections with public safety needs; and (6) there is adequate consideration given to the potential for disproportionate impacts on marginalized communities.

The committee offers the following recommendations to assist with the development of guidelines for responsible use of FRT by law enforcement and for law enforcement recipients of federal funding for FRT system deployment.


Even if not subject to federal grant conditions, state and local agencies should adopt these standards.

RESEARCH AND DEVELOPMENT

Public research organizations such as NIST already undertake important work in setting benchmarks and evaluating the performance of FRT systems. Additional government support could help NIST answer important questions on the performance of FRT systems in non-cooperative settings, how to improve data sets to both preserve privacy and promote equity in the performance of FRT tools, and how best to continue recent work on characterizing, understanding, and mitigating phenotypical disparities.


RECOMMENDATION 1-6: The federal government should support research to improve the accuracy of facial recognition technology, to minimize demographic biases, and to further explore the sociotechnical dimensions of current and potential facial recognition technology uses.

To understand better how to responsibly deploy FRT while protecting equity, fairness, and privacy, NIST, the Department of Homeland Security’s Maryland Test Facility, or a similarly well-suited institution should conduct research on

  • The accuracy of FRT systems in a variety of non-optimal settings, including non-optimal facial angle, focus, illumination, and image resolution.
  • The development of representative training data sets for template extraction and other methods that developers can safely apply to existing data sets and models to adjust for demographic mismatches between a given data set and the public.
  • The performance of FRT with very large galleries (i.e., tens or hundreds of millions of entries), to better understand the impact on FP and FN match rates as the size of galleries in use continues to grow.

To advance the science of FRT and to better understand the sociotechnical implications of FRT use, the National Science Foundation or a similar research sponsor should support research on

  • Developing privacy-preserving methods to prevent malicious actors from reverse-engineering face images from stored templates.
  • Mitigating FP match rate variance across diverse populations and building better understanding of the levels at which residual disparities will not significantly affect real-world performance.
  • Developing approaches that can reduce demographic and phenotypical disparities in accuracy.
  • Developing accurate and fast methods for directly matching an encrypted probe image template to an encrypted template or gallery—for example, using fully homomorphic encryption.
  • Developing robust methods to detect face images that have been deliberately altered by either physical means such as masks, makeup, and other types of alteration or by digital means such as computer-generated images.
  • Determining whether FRT use deters people from using public services, particularly members of marginalized communities.
  • Determining how FRT is deployed in non-cooperative settings, public reaction to this deployment, and its impact on privacy.
  • Determining how FRT may be used in the near future by individuals for abusive purposes, including domestic violence, harassment, political opposition research, etc.
  • Determining how private actors might use FRT in ways that mimic government uses, such as homeowners who deploy FRT for private security reasons.
  • Researching future uses of FRT and their potential impacts on various subgroups of individuals.

BIAS AND TRUSTWORTHINESS

RECOMMENDATION 2: Developers and deployers of facial recognition technology should employ a risk management framework and take steps to identify and mitigate bias and cultivate greater community trust.

FRT has engendered mistrust about bias in its technological underpinnings and broader mistrust, especially in minority communities, about the role of technology in law enforcement and similar contexts.

RECOMMENDATION 2-1: Organizations deploying facial recognition technology (FRT) should adopt and implement a risk management framework addressing performance, equity, privacy, civil liberties, and effective governance to assist with decision making about appropriate use of FRT.

Until the recommended risk management framework is developed, the issues listed in Recommendation 1-5 may serve as a useful point of departure.


Such practices will help address mistrust about bias in FRT’s technological underpinnings and broader mistrust, especially in minority communities, about the role of technology in law enforcement and similar contexts.

POTENTIAL EXECUTIVE ACTION AND LEGISLATION

An outright ban on all FRT under any condition is neither practically achievable nor necessarily desirable to all, but restrictions or other regulations are appropriate for particular use cases and contexts.

Concerns about the impacts of FRT intersect with wider questions about how to protect consumer privacy, where and how to limit government surveillance that could infringe on civil liberties, and more generally how to govern and regulate a proliferation of artificial intelligence and other powerful computing technologies.

RECOMMENDATION 3: The Executive Office of the President should consider issuing an executive order on the development of guidelines for the appropriate use of facial recognition technology by federal departments and agencies and on addressing equity concerns and the protection of privacy and civil liberties.

Comprehensively addressing such questions, especially with respect to nongovernmental uses, may require new federal legislation.


Because FRT has the potential to enable mass surveillance of the population, courts and legislatures will need to consider the implications for constitutional protections related to surveillance, such as due process, search and seizure thresholds, and free speech and assembly rights.

In grappling with these issues, courts and legislatures will have to consider such factors as who uses FRT, where it is used, what it is being used for, under what circumstances it is appropriate to use FRT-derived information provided by third parties, whether its use is based on individualized suspicion, its intended and unintended consequences, and its susceptibility to abuse, while courts will have to determine how constitutional guarantees around due process, privacy, and civil liberties apply to the deployment of FRT.

As governments and other institutions take affirmative steps through both law and policy to ensure the responsible use of FRT, they will need to take into account the views of government oversight bodies, civil society organizations, and affected communities to develop appropriate safeguards.
