
Airport Biometrics: A Primer (2021)

Chapter 3 - Legal, Policy, and Privacy Review

Suggested Citation:"Chapter 3 - Legal, Policy, and Privacy Review." National Academies of Sciences, Engineering, and Medicine. 2021. Airport Biometrics: A Primer. Washington, DC: The National Academies Press. doi: 10.17226/26180.


Legal, Policy, and Privacy Review

Summary

To pursue biometrics as potential solutions to challenges in aviation, industry stakeholders need to understand the legal, policy, and privacy issues associated with the use of biometric data. The two predominant policy and legal issues associated with the use of biometric data (including facial recognition) are protection of privacy rights and the inconsistent treatment of biometrics under a myriad of international, federal, and state laws.

This chapter explains basic constitutional principles associated with the collection of biometric data; the few federal laws applicable to the use of biometric technology and the proliferation of inconsistent state laws; ongoing federal government activities and the role of airport operators, carriers, and tenants, including any litigation risks; privacy principles applicable to the aviation industry; significant court decisions, particularly those eroding traditional precedents because of the use of heightened or pervasive biometric technology; penalties and costs of litigation for violations of law; increasing collaboration of international organizations’ initiatives; best practices, applicable standards, and privacy-by-design principles; and international regulations that offer airport operators and stakeholders the safest course of action in their roles as airport operators, transportation industry providers, and retailers.

While the predominant concern under the U.S. Constitution is the right to privacy, another concern is that, in the absence of a comprehensive national law, individual states have passed laws, a handful of which limit or prohibit the use of biometric data specifically, some of which address sensitive personal data, and all of which require action and notices upon breach of the data storage and retention systems. Many of these laws impose sanctions for failing to meet requirements pertaining to collecting and protecting biometric data.

Another emerging trend is the development of laws and restrictions (some with a broad reach) by other countries, the EU, and international organizations that may affect airport operators and airlines in the United States and include the assessment of penalties for noncompliance.

There are many sources of best practices for the protection of personal privacy. These include privacy by design, the 2019 EU Data Protection Guidelines, and fair information practice principles, which offer guidance on transparency, adoption of policies and practices that incorporate privacy protections, communication tips, and similar practices.

Introduction

To pursue biometrics as potential solutions to challenges in the aviation industry, it is first necessary to understand the legal, policy, and privacy issues associated with the use of biometric data. The two predominant policy and legal issues associated with the use of biometric

data (including facial recognition) are protection of privacy rights and the inconsistent treatment of biometrics under a myriad of international, federal, and state laws.

Protection of privacy rights is the most prevalent concern surrounding the use of facial recognition. Some contend that the use of facial recognition is a violation of privacy rights, that such use may infringe on constitutional rights, and that systems where biometric data are stored provide inadequate legal and security protections from breaches, abuses, and criminal activities (Wong 2020). As discussed in the subsection on state laws, the majority of recent lawsuits assert violations of state law by private-sector businesses (Kugler 2019). Additionally, notwithstanding the increased efficiencies and security benefits associated with facial-recognition technology, media reports have highlighted concerns over the effectiveness of the technology, with claims that facial recognition has a high rate of misidentification and bias (see, e.g., Singer and Metz 2019; Harwell 2019). Further, federal studies report that differing algorithms result in differing degrees of misidentification and bias (National Institute of Standards and Technology 2019). Amid a recent wave of criticism associated with allegations of racially based policing and the potential abuse by law enforcement engaged in surveillance of crowd demonstrations, Amazon, Microsoft, and IBM have either suspended or terminated production and sale of the technology (Greene 2020).

A recent legal and policy trend in the area of biometrics has been a proliferation of state laws that address a wide range of biometric technology issues such as data breaches, use restrictions, and bans on police use. The legal landscape with respect to state laws is non-uniform and in flux.
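The misidentification rates discussed above are conventionally quantified as a false non-match rate (FNMR, genuine users wrongly rejected) and a false match rate (FMR, impostors wrongly accepted). A minimal sketch of how the two rates are computed, using entirely hypothetical counts that are not drawn from any NIST or other published evaluation:

```python
# Hypothetical evaluation counts for a face-matching algorithm.
# These numbers are illustrative only.
genuine_attempts = 10_000    # comparisons of a person against their own template
false_non_matches = 250      # genuine comparisons the system wrongly rejected
impostor_attempts = 100_000  # comparisons against someone else's template
false_matches = 30           # impostor comparisons the system wrongly accepted

fnmr = false_non_matches / genuine_attempts   # false non-match rate
fmr = false_matches / impostor_attempts       # false match rate

print(f"FNMR = {fnmr:.4%}, FMR = {fmr:.4%}")
```

The two rates trade off against each other via the match threshold: loosening the threshold lowers FNMR but raises FMR. Published benchmarks show this trade-off varying considerably from one vendor’s algorithm to another, which is the variation in accuracy and bias the federal studies cited above describe.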
Another area of legal divide reflected in both federal and state laws is whether the entity collecting and using the biometric data is a federal or state agency or a private-sector or commercial entity. There are no uniform legal requirements establishing how biometric data must be protected, which parties are responsible for protecting the data, and what types of data are entitled to protection under what circumstances. Thus, the state of the law with respect to the collection, use, protection, retention, and disclosure of biometric data is unsettled and evolving.

Notwithstanding the state of the law, changes and advancements in the technology itself arguably serve to mitigate litigation risks. For example, in the past, facial images were retained in mass storage as photographs, whereas now facial templates are stored as numbers using sophisticated algorithms and, in most cases, are encrypted. With the use of facial templates and encryption, some concerns about data breaches have been addressed. Additionally, the courts have begun to debate and distinguish between capturing a photograph versus applying a biometric template to a facial image, which brings the image within the scope of a state biometrics law.1

In recent years, state legislatures, privacy advocates, and free Internet proponents have focused on cybercrime (i.e., hacking and misuse of data) and debated the need for (or opposition to) additional security measures, such as encrypting personal data, to lessen the risk of access to, and ease of illegal use of, the data held in mass storage or jpeg files (Marks 2020). As a general rule, use of biometric data for the management of employees and for commercial operations is principally governed by state law (Carrero 2018).
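The shift from stored photographs to encrypted numeric templates can be illustrated with a short sketch. Everything below is hypothetical: real systems derive much longer vectors (often 128 to 512 floating-point values) from a proprietary embedding algorithm, and they use industrial-strength ciphers such as AES, for which the one-time-pad XOR here is only a toy stand-in:

```python
import secrets
import struct

# Hypothetical 8-dimensional face template. The image itself is not
# stored -- only these numbers, from which the photo cannot be viewed.
template = [0.12, -0.48, 0.91, 0.05, -0.33, 0.77, -0.09, 0.64]

# Serialize the floating-point vector to bytes for storage.
raw = struct.pack(f"{len(template)}f", *template)

# Toy one-time-pad encryption (XOR with a random key), standing in for
# the AES-class encryption a production system would use.
key = secrets.token_bytes(len(raw))
ciphertext = bytes(a ^ b for a, b in zip(raw, key))

# Only a holder of the key can recover the template; the ciphertext
# alone reveals neither the template nor the face.
plain = bytes(a ^ b for a, b in zip(ciphertext, key))
recovered = struct.unpack(f"{len(plain) // 4}f", plain)
assert all(abs(r - t) < 1e-6 for r, t in zip(recovered, template))
```

The design point relevant to litigation risk is that a breach of such a store exposes encrypted numbers rather than a gallery of photographs, though whether that distinction matters legally depends on the statute at issue.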
There is no comprehensive federal law governing the collection and use of biometric data or industry employees’ and consumers’ privacy.2 Under a “patchwork”3 of federal laws, there are provisions aimed at providing privacy protections for individuals’ data in specific industry sectors, such as limitations on federal agencies’ collection of personal information (if relevant and necessary to comply with a legal mandate),4 protections for medical and financial data,5 and limited enforcement action for unfair and deceptive practices.6 In the absence of any overarching federal framework, states have been enacting or amending laws to protect their residents’ privacy regarding biometric data and, particularly, facial-recognition data.7

Airport operators and stakeholders, such as airlines and vendors, not only must comply with applicable federal and state laws that apply to biometric collection and use for employees and consumers but, in certain circumstances, must confront conflicting laws from multiple states and other countries or international organizations. Airport operators face differing risks and requirements depending on whether they are using biometric data collection for employee timekeeping, access to restricted areas, or other airport operational purposes, or are dealing with passengers to facilitate travelers’ experience or retail purchases.

What follows in this chapter is a summary of key legal and policy issues and trends associated with the use of biometrics in the aviation environment. (Additional detailed discussion can be found in Appendices K and L.) In particular, the chapter highlights:

• Basic constitutional principles associated with the collection of biometric data;
• The few federal laws applicable to the use of biometric technology and the proliferation of inconsistent state laws;
• Ongoing federal government activities and the role of airport operators, carriers, and tenants, including any litigation risks associated with governmental and commercial activities using biometrics;
• Privacy principles applicable to the aviation industry;
• Significant court decisions, particularly those eroding traditional precedents because of the use of heightened or pervasive biometric technology;
• Penalties and costs of litigation for violations of law;
• Increasing collaboration of international organizations’ initiatives;
• Best practices, applicable standards, privacy-by-design principles, and international regulations that offer airport operators and stakeholders the safest course of action in their roles as airport operators, transportation industry providers, and retailers; and
• Guidance on mitigation of risk of litigation and
findings that support a responsible framework for use of biometrics in the airport environment.

Key Takeaway: This section provides a basic outline of the patchwork of federal and state laws and touches on areas relevant to aviation stakeholders.

Interplay of U.S. Constitution, Federal Laws, and State Laws

The legal concern most frequently raised in connection with the use of biometrics, and in particular facial recognition, is the right to privacy. The body of law defining the right to privacy is an amalgam of concepts evolving from multiple sources: common law; the Fourth, First, and Fifth Amendments to the U.S. Constitution; and applicable federal and state laws.8 In other words, there are many sources (and types) of privacy rights that may be implicated depending on the particular use of facial recognition, who is using it, and the nature of the violation asserted (Intille 1999). What follows is a discussion of those legal considerations in relation to the use of biometrics (especially facial recognition) in an airport environment.

The Right to Privacy Under the U.S. Constitution

The U.S. Constitution protects privacy from governmental9 invasion, and this includes state and local authorities operating in the airport environment. Protection of a person’s right to be left alone by private parties is left largely to the laws of the individual states.10 While the Constitution does not explicitly mention any right of privacy, the Supreme Court has declared that the right of privacy is a fundamental right guaranteed by the Constitution.11

Privacy advocates have challenged the collection and disclosure of biometric data under the provisions of the Fourth and Fifth Amendments of the Constitution.12 The Fourth Amendment to the U.S. Constitution states, “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause . . .” and the Fifth Amendment states, “No person . . . shall be compelled in any criminal case to be a witness against himself. . . .”

The question of whether the collection of facial biometrics constitutes a search for purposes of the Fourth Amendment would have been answered, until recently, by the application of historical precedent that no person has a reasonable expectation of privacy in his or her face.13 However, recent lower-court decisions suggest this may not be true for the pervasive or intrusive use of facial-biometrics technology for identification or authentication purposes.14

The use of facial-recognition technology by government actors can be analyzed in three distinct aspects:

• The initial collection of biometric data,
• The request for identification, and
• The comparison between the stored data15 and data presented by an individual on a particular occasion.

The first is likely consistent with the Fourth Amendment.
The second and third aspects are settled areas of law: A request for identification does not, by itself, constitute a Fourth Amendment seizure,16 and “the process of matching one piece of personal information against government records does not implicate the Fourth Amendment.”17

“The ultimate touchstone of the Fourth Amendment is reasonableness.”18 Generally, reasonableness requires that, whenever a government actor conducts a search or seizure of a person or his or her property, that official must have a warrant based on probable cause and issued by a judge or magistrate.19 To determine whether a search occurred under the scope of the Fourth Amendment, a court initially will examine the search under tests formulated by the Supreme Court.20 One test looks at whether the government physically intrudes (i.e., trespasses),21 and the other at whether the intrusion violates a person’s reasonable expectation of privacy.22,23 If a court determines that the intrusion falls within the scope of Fourth Amendment protection, it examines whether the search complied with the scope of the terms of the warrant, if one was obtained, or whether a recognized exception to the warrant requirement applies (e.g., administrative search, border search, exigent circumstances).24

Public Spaces and the Fourth Amendment

Historically, the Supreme Court has determined that certain activities are not searches or protected by a constitutional right to privacy: those related to persons on public streets and beaches,25 conveyances on public roads,26 prisoners in jail,27 and information on the exterior of mails.28 Further, the Supreme Court has recognized that a temporary detention of a person for police questioning in connection with an airport search was reviewable under a lower standard than the rule announced in Terry v. Ohio29 (requiring reasonable suspicion that the person is involved in criminal activity for a temporary detention by a law enforcement officer).30 The Court found the temporary detention, although constituting a seizure under the Fourth Amendment, was constitutional because of the officer’s articulable suspicion and the public interest involved in the suppression of serious crime.31

In recent years, however, the courts and, in some cases, the Supreme Court, have wrestled with the traditional framework about public spaces when faced with advances in technology

that pose a potential for unreasonable intrusion into a person’s privacy.32 The Supreme Court noted that technological advances provide “access to a category of information otherwise unknowable,”33 and “implicate privacy concerns” in a manner as different from traditional intrusions as “a ride on horseback” is different from “a flight to the moon.”34 The trend from recent Supreme Court cases focused on GPS and cell phone tracking appears to signal that, where the use of heightened technology is more intrusive than traditional forms of observation, courts may find the tracking and identification of persons to be unreasonable under the Fourth Amendment.35

Exceptions to the Fourth Amendment Warrant Requirement

Even where the Fourth Amendment text would otherwise suggest its application, the Supreme Court has recognized certain governmental activities as reasonable per se under the Fourth Amendment. Because these activities are deemed reasonable per se, they are regarded as exceptions to the warrant requirement of the Fourth Amendment.
For example, routine searches of persons and items at the border by duly authorized officers [e.g., CBP and Immigration and Customs Enforcement (ICE)] are deemed reasonable and may be conducted without suspicion.36 The border search exception is not limited to searches that occur at the border itself but includes searches that occur at the “functional equivalent” of the border (e.g., at an airport inspecting and clearing arriving international passengers).37

Courts have noted that there is a diminished expectation of privacy at airports.38 In that context, the courts have held that routine airport screening searches are reasonable under the special needs exception and do not require a warrant, individualized suspicion, or consent,39 provided that they are properly calibrated to their purpose and not in furtherance of traditional law enforcement goals.40 Under the special needs doctrine, courts assess the constitutionality of the challenged conduct by weighing the government conduct—in light of the special need and against the privacy interest advanced—through an examination of three factors: (1) the nature of the privacy interest involved, (2) the character and degree of the governmental intrusion, and (3) the nature and immediacy of the government’s needs and the efficacy of its policy in addressing those needs.41

Liability

For alleged violations of the Fourth Amendment, persons can sue state and local government actors under 42 U.S.C. §1983, which provides a private cause of action against state officers for violations of federal civil rights.42 With respect to alleged unlawful searches or seizures by federal officers, the Supreme Court in Bivens v. Six Unknown Agents held that an individual could sue for monetary damages for a violation of the Fourth Amendment.43

Private Party Acting as an Agent of the Government

Traditionally, searches by private parties do not trigger the Fourth Amendment unless the party is acting at the direction of a government actor.44 In other words, a search conducted by private individuals at the instigation of a government officer or authority constitutes a governmental search for purposes of the Fourth Amendment. Where a private party assists or acts at the direction of a federal agency in performing an activity lawful under an exception to the Fourth Amendment warrant requirement, such as the border search exception (i.e., CBP or ICE) or special needs exception (i.e., TSA), the private party’s assistance would fall within the protected scope of the exception.45 For example, courts have indicated that there is a diminished expectation of privacy at airports,46 and thus, searches conducted by airport personnel in

accordance with a federally approved security plan are deemed to fall within the special needs exception to the Fourth Amendment.47

Recently, there has been increased attention and scrutiny where private parties are engaged in voluntary data sharing with law enforcement. Most cases decided in this area focus on the degree of governmental influence over the initial collection (Brennan-Marquez 2018). At least one commentator has suggested that the analysis for determining whether a private party’s voluntary data sharing with law enforcement is sufficient to make that party an agent of the government should include consideration of the following four factors:

• First, whether data sharing is repeated or, instead, spontaneous;
• Second, whether the data transfer was aimed to assist law enforcement;
• Third, how powerfully equipped the private actor is to perform data surveillance; and
• Fourth, whether law enforcement practice has evolved to reflect the availability of privately collected data (Brennan-Marquez 2018).

DNA Identification as a Case Study

Although courts have not yet had the opportunity to examine the governmental use of facial-recognition technology,48 recent case law on the collection of DNA and the creation of DNA profiles provides a useful framework for understanding how courts will likely address such conduct in the future. Ultimately, the similarities and differences between DNA profiling and the use of facial-recognition technology arguably demonstrate the latter’s legality. (See Appendix K for a case study analyzing courts’ treatment of DNA.)

Federal and State Privacy Laws Relevant to Airport Operators and Stakeholders

Generally, federal privacy laws (1) govern a federal agency’s collection and use of biometric data, or (2) protect consumers’ privacy with respect to the collection and use of their biometric data and provide them a cause of action.
Key Takeaway: Aviation stakeholders should note that even if you are doing business in a state that does not have a law expressly limiting the collection and use of biometric data, such as facial recognition, but you collect and use biometric data on a resident of a state that does, you may be required to implement changes and follow the requirements of the resident’s state or potentially face stiff penalties.

There are a number of federal privacy laws applicable to the collection and use of biometric data by federal agencies, as well as to specific activities by various industries in distinct business sectors, some of which are relevant to airport operators and stakeholders. Interestingly, a 2015 U.S. Government Accountability Office (GAO) report observes that there are no federal laws restricting the capture of facial images except with respect to matters associated with minors.49

With respect to legal protections pertaining to commercial uses of biometric data, including facial-recognition technology, federal laws can be divided into three broad categories that address privacy and consumer protection for: “(1) the capture of facial images; (2) the collection, use, and sharing of personal data; and (3) unfair or deceptive acts or practices, such as failure to comply with a company’s stated privacy policies.”50

The overarching federal law governing federal agencies’ collection, use, and disclosure of personally identifiable information (PII), such as biometric data, is the Privacy Act of 1974.51 Other federal laws pertinent to or regulating the collection and use of biometric data are the Federal Trade Commission Act (addressing unfair or deceptive acts) and the Public Health

Service Act (setting forth authorities to control public health emergencies such as COVID-19). (See Appendix K for a detailed discussion of federal laws governing privacy in specific sectors.)

U.S. Congress Legislative Activity

The 116th Congress commenced on January 3, 2019, and ended on January 3, 2021. During the 2-year period that a Congress is in session, a bill may be introduced at any point, and it remains eligible for consideration until the Congress ends or adjourns sine die (i.e., adjourns without specifying a return date).52 Since January 2019, approximately 70 bills have been introduced that propose to protect consumers’ privacy or impose limits on the private sector’s or government agencies’ collection and use of biometric data.53 Critics of the collection and use of biometric data and privacy advocates have voiced the need for national legislation to establish a comprehensive framework for government and commercial use and collection of biometric data (Schubert 2020; Cunningham 2019). By the end of the session, most of the bills had not been voted out of committee, and only a handful of generic provisions applicable to Department of Defense (DoD) activities within defense authorization acts (FY21) were enacted.54 It is worth monitoring activity in the 117th Congress, which commenced in January 2021.

Survey of State Laws

In the absence of a comprehensive federal law regulating the collection and use of biometric data, states have enacted basic laws protecting citizens’ and consumers’ privacy, most of which do not expressly specify coverage for biometric data. Distinct from specific provisions governing biometric data, there are many state laws protecting personal data. In some cases, where there is no express mention of biometrics, biometric data are regarded as a subset of personal data (Mandler et al. 2017).
In other cases, however, the law distinguishes between personal identifying information and biometric data by treating the latter as a unique category of sensitive personal information.55 Even among the handful of states that have enacted laws expressly governing the use of biometric data, the definitions of biometric data are not uniform, and the requirements imposed on companies’ use of biometric data are inconsistent, which adds to the uncertainty for private companies seeking to ensure compliance with applicable laws.56 A discussion of state laws (from Texas, Washington, California, New York, and Nevada) that expressly govern biometric data can be found in Appendix K.

State laws are particularly relevant in circumstances where airport operators and stakeholders are collecting personal data and using biometrics, such as facial recognition, to manage their workforces and for commercial or transportation services for travelers. Four types of state laws are described in the following (as well as in Figure 3-1):

• Some 19 states regulate the collection and use of facial-recognition data (and an even larger number protect such data under a broad definition of personal information) (National Conference of State Legislatures 2015);
• All 50 states, plus the District of Columbia, Puerto Rico, Guam, and the U.S. Virgin Islands, require private companies or governmental entities to notify residents of a data breach (National Conference of State Legislatures n.d.; Ramirez 2019);57
• Approximately 35 states, plus the District of Columbia and Puerto Rico, mandate that entities destroy (or otherwise make inaccessible) personal information that is no longer of use or after a specified period of time (National Conference of State Legislatures 2019a); and
• A handful of states prohibit state and local agencies from collecting, using, sharing, and retaining biometric data, particularly facial-recognition data, except in limited circumstances (National Conference of State Legislatures 2019a).

50 Airport Biometrics: A Primer

State laws vary widely in their protections, penalties for violations, and requirements for businesses.58 According to at least one article, as of May 2020, it is legal in 44 states to identify an individual using images taken without consent while the individual is in public, whereas New York, California, Washington, Illinois, Nevada, and Texas do not allow it for commercial use (Thales 2020) (see Figure 3-2). Several states, however, either expressly define personal information to include biometric data or have laws that have been interpreted to provide such protection.59 Conversely, several states have determined that the definition of personal information excludes biometric data.60 In 2019 alone, legislatures in 25 states considered biometric privacy bills (National Conference of State Legislatures 2015).

Illinois
Enacted in 2008, the Illinois Biometric Information Privacy Act (BIPA) is the first law to expressly address biometric data.61 BIPA requires employers and private entities to make written privacy policies publicly available and to provide written notice to, and obtain written consent from, the employees from whom biometric information is being collected and retained, and it imposes limitations on the purpose of the collection and the period of storage. It also prohibits profiting from biometric data, allows only a limited right to disclose collected data, and sets forth data protection obligations for businesses (Hodges and Mennemeier 2020). Significantly, BIPA requires companies to adhere to reasonable standards while possessing the data, creates a private cause of action, and authorizes enforcement by the state's attorney general.62 Of note, BIPA carries strong penalties: $1,000 for each negligent violation and $5,000 for each willful or reckless violation.63 Pursuant to a private cause of action, plaintiffs can recover injunctive relief and any actual damages (that exceed the prescribed penalties).
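Because BIPA's penalties accrue per violation and per person, exposure compounds quickly in the class-action context. The arithmetic below uses the statutory amounts stated above; the class size and violation counts are hypothetical, chosen purely for illustration.

```python
# Statutory amounts from the text above: $1,000 per negligent
# violation, $5,000 per willful or reckless violation.
NEGLIGENT_PENALTY = 1_000
RECKLESS_PENALTY = 5_000

def bipa_exposure(class_size, negligent_per_person=0, reckless_per_person=0):
    """Rough statutory-damages exposure for a hypothetical class."""
    per_person = (negligent_per_person * NEGLIGENT_PENALTY
                  + reckless_per_person * RECKLESS_PENALTY)
    return class_size * per_person

# Hypothetical: a 2,000-employee class, one negligent violation each
# (e.g., collecting fingerprints for timekeeping without written consent).
exposure = bipa_exposure(2_000, negligent_per_person=1)  # $2,000,000
```

Even a single negligent violation per class member produces seven-figure statutory exposure, which helps explain the settlement amounts discussed next.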
Figure 3-1. Summary of state laws related to the collection and use of biometric data.

Legal, Policy, and Privacy Review 51

From 2018 to 2019, more than 200 lawsuits were filed for alleged violations of BIPA, and the number is reportedly rising (Prescott 2020). Illinois state courts have rendered decisions on whether a technical violation of BIPA is sufficient to pursue a case, since federal courts and state courts define injury differently for purposes of determining standing to bring a suit in the respective courts.64 To date, most of the cases under BIPA have been class actions targeting employers' use of biometric technology at work. These lawsuits are not only increasing in number but are expensive and difficult to defend against. For example, in 2020, Facebook offered to settle a class action lawsuit (in California) for alleged violations of BIPA for $650 million (after the trial judge initially rejected the proposed $550 million settlement) (Shibu 2020). In May of 2020, plaintiffs filed lawsuits against Facebook in Texas and Arizona, adding to the many other suits pending nationwide, and the company is reportedly facing billions in damages (Justia n.d.). There is also uncertainty with respect to the application of BIPA (and other states' biometric privacy laws) under circumstances where airport stakeholders are partnering with government agencies, such as CBP under the TVS program. Factors bearing on applicability, compliance, and potential liability include whether the state law addresses only commercial uses of biometric data and excludes government agencies from its scope, as is the case with all of the biometric state laws discussed in this section (with the exception of the recent Washington law), and whether the airport stakeholder can be viewed as acting as an agent of the government by collecting facial images and transmitting them to CBP (and not retaining them); the analysis becomes complicated if the stakeholder also collects the data for a dual business purpose, such as facilitating the boarding process.
(See Appendix K for a discussion of other state laws.)

Figure 3-2. Six states in the U.S. regulate commercial use of biometric data.

Most recently, similar legislation has been introduced in several other states (e.g., Arizona, Florida, and Massachusetts).65 States have also enacted laws to protect the privacy of minors and students.66 In addition to the proliferation of state laws enacted or under consideration, local jurisdictions have approved ordinances banning the use of facial-recognition technology by city government agencies and, in particular, police forces (Hudgins 2019). It is anticipated that this trend will continue as privacy concerns increase.

Supremacy Clause and Conflict of Laws
The Supremacy Clause in the U.S. Constitution provides that even if a state statute is enacted in the execution of acknowledged state powers, state laws that "interfere with, or are contrary to the laws of Congress" must yield to federal law.67 In the event of a conflict between an existing or future federal law governing the use of facial-recognition technology and a state law, federal law is deemed to preempt state law in three situations (Vogel 2015): (1) when the federal law specifically states that it preempts conflicting state laws;68 (2) when the federal law purports to "regulate conduct in a field that Congress intended the federal government to occupy exclusively";69 and (3) when a state law "actually conflicts with federal law . . . the [Supreme] Court has found pre-emption where it is impossible for a private party to comply with both state and federal requirements. . . ."70,71 By way of example, this issue has been at the heart of lawsuits and commentaries challenging the Government's Secure Communities program, which seeks to mandate the sharing of fingerprints collected by state and local authorities as a means of checking immigration status and enforcing federal immigration laws. State and local authorities enacted policies and laws seeking to defy the mandate.
While Congress has the power to preempt state law, it is not always clear where to draw the line between immigration enforcement and the exercise of general police powers (McCauley 2018). With respect to a conflict between state laws, a federal court analyzing the issue "ordinarily must follow the choice-of-law rules of the state in which it sits."72 Where a commercial entity includes within its terms of service that any litigation will be conducted in a chosen state, the court will uphold that term if the chosen state has a "substantial relationship" to the parties and the transaction, unless the decision would be contrary to a fundamental policy of the alternate state.73 Numerous commentaries have recommended enactment of comprehensive national legislation regulating privacy protections in connection with the collection and use of biometric data, particularly in the face of conflicting state laws on data privacy (Schubert 2020; Cunningham 2019). Several challenges render this a difficult task. Should national legislation preempt all the state laws in effect and set the bar for minimal requirements? How would the law make clear to all stakeholders what the specific requirements and obligations are, and would it distinguish between personal information and sensitive personal data? Should national legislation incorporate state law provisions pertaining to consent, opting out, correction of data, enforcement, and so forth? Thus, while there has been a call for passage of a federal law governing privacy and personal information, the drafting, scope, and construction of a comprehensive framework addressing these and other related issues pose a daunting task.
Additional Legal and Policy Considerations

Surveillance Versus Identification
As noted earlier, there is a long-standing history of the use of stationary security cameras at airports, retail establishments, subways, and public streets.74 The courts have generally found that individuals do not have a reasonable expectation of privacy in public places and

have tended not to view public surveillance as a search governed by the Fourth Amendment. In a recent decision regarding the use of four fixed automatic license plate readers to surveil the ends of a bridge, a state court stated that, while the limited use of the technology did not constitute an unreasonable search, it struggled with where to draw the line.75 Part of the challenge the courts face is assessing society's reasonable expectation of privacy where heightened technology influences or shapes those expectations, particularly with concerns centered around pervasive police presence.76 Legal and policy challenges associated with private-sector use are on the rise where security cameras are linked to some form of facial-recognition technology.77

Bias
While the intersection of facial-recognition technology and discrimination, misidentification, and bias has occasionally cropped up in criminal prosecutions, policing, or cases involving false arrest,78 there has been little litigation against private employers applying the technology where the claim is one of bias or discrimination. The litigation risk would most likely arise in the context of contract constraints or violation of state laws (Bickel et al. 2003). The issue, however, is front and center in assessments of the technology. In a number of studies, the National Institute of Standards and Technology (NIST) found that the accuracy of facial-recognition technology depends on factors such as the specific vendor, whether the purpose is verification or identification, and whether the algorithm is one-to-one or one-to-many (Grother et al. 2019a). Further, another NIST report acknowledged that the type of recognition algorithm used (i.e., one-to-many identification or one-to-one verification) in various facial technologies resulted in a higher incidence of false positives and false negatives for certain ethnic, gender, and age groups (Grother et al. 2019b; Grother et al. 2019c). The NIST study also reported that the impacts of false positives and false negatives varied greatly depending on the use or purpose (e.g., some capable of mitigation by a second attempt, some presenting a security concern and requiring additional follow-up, some leading to false accusation, and some inconveniencing commercial applications such as access to phone use) (Grother et al. 2019b; Grother et al. 2019c). That same study concluded that advances in the vendor technologies available have consistently improved upon mitigating bias and misidentification (Grother et al. 2019b; Grother et al. 2019c), and reinforced the finding in the NIST report that "massive gains in accuracy have been achieved in the last five years (2013–2018)."79 In February 2020, NIST indicated that it is in the process of assessing CBP's facial-recognition algorithm (Nash 2020).

Title VII Discrimination
Litigation may arise based on allegations that an employment practice, such as the collection and use of an employee's biometric data, implicates and violates religious beliefs.80 Courts may require employers to provide a reasonable religious accommodation, such as an alternative method of clocking in.81 Other employment considerations may apply, such as the necessity of reasonable accommodations under the Americans with Disabilities Act.82

Foreign Laws and Regulations
A short discussion of key foreign laws and regulations is included because they may involve challenging restrictions and legal risks to the use of biometrics by parties arguably subject to both U.S. and foreign laws and regulations.
Key Takeaway
States are considering and enacting laws limiting or governing the collection and use of biometric data at a rapid pace, so air industry stakeholders need to stay current with emerging privacy law trends and comply with enacted state laws and other nations' requirements if they collect and use, or propose to collect and use, biometric data about their customers.

Many countries have turned to the use of biometrics,

particularly facial-recognition technology, to perform verification functions associated with traveler facilitation programs, criminal investigations, and passport administration.83 The European Union (EU) General Data Protection Regulation (GDPR)84 was officially adopted on April 27, 2016, and entered into force on May 24, 2016; EU members were required to incorporate it into their national law by May 25, 2018 (see Figure 3-3). It offers insights into foreign trends and highlights potential conflicts for entities, such as carriers, subject to two sets of laws. The GDPR superseded the 1995 Data Protection Directive in the EU but, like that directive, carries extra-jurisdictional ramifications by requiring countries or entities to prove compliance with the GDPR before allowing the transfer of EU-held personal data (Cunningham 2019). The GDPR restrictions and requirements apply to entities processing personal data in three circumstances: (1) where the company processing the data is established in the EU; (2) where, regardless of where the company is located or where the processing occurs, the processing relates to an offer of goods or services to EU residents or monitoring of their behavior within the EU; and (3) where the processing occurs in a place where the GDPR applies as a matter of public international law (e.g., within an EU mission or consular post).85 Article 2 provides four exceptions to the GDPR's applicability, including when government agencies and law enforcement collect and process data for the prevention, investigation, detection, or prosecution of criminal offenses, for the execution of criminal penalties, or for preventing threats to public safety.86

Key Takeaway
Although the focus of much of this report is on the current U.S. legal and policy landscape, this short section describes the EU privacy rules and their potential impacts for U.S. businesses.

Source: Lord 2018.
Figure 3-3. A history of EU data protection regulations leading up to the GDPR.

Arguably, however, the GDPR may apply to companies that provide EU residents' data to national security

or law enforcement agencies. Further, it should be noted that the GDPR applies to personal data obtained from public sources (Heward-Mills 2020). The GDPR is intended to provide safeguards for consumer data security rights. It regulates the private sector with respect to the collection, use, retention, storage, and sharing of automatically processed personal data, including biometric data, defined as "special categories of personal data."87 Pursuant to Article 4(14), biometric data are classified as a special category and defined as "personal data resulting from specific technical processing relating to the physical, physiological, or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data." The processing of sensitive personal data is otherwise prohibited unless one of a number of enumerated exceptions applies, including explicit consent, specified public interest considerations, and certain exemptions in the fields of employment and social protection law.88 In essence, there are seven key requirements for organizations that process89 biometric data pertaining to EU residents, as shown in Figure 3-4. Significantly, the GDPR limits the ability to make cross-border transfers of personal data and generally requires a finding that the receiving country has adequate data protections in place (the so-called "adequacy decision").90 The EU had issued the United States "only a limited adequacy decision with respect to companies that registered voluntarily under the EU-U.S. Privacy Shield program that the U.S. Department of Commerce administered and whose principles the Federal Trade Commission enforced."91 This status recently changed. On July 16, 2020, the European Court of Justice handed down a ruling that could have significant impacts for U.S. companies that collect personal data, including biometric data.
"The Court of Justice invalidated Decision 2016/1250 on the adequacy of the protection provided by the EU-US Data Protection Shield."92 The court found that information on EU citizens, when transferred to U.S. servers, was not adequately protected from "government surveillance."93 Essentially, the court found that the data were not adequately protected from U.S. government authorities and that the arrangement failed to provide EU citizens with certain rights guaranteed under the GDPR.94 At a minimum, U.S. businesses that collect data on EU customers may need to assess their data protection policies and systems or risk severe sanctions. As of February 26, 2020, Google had been fined approximately $55 million for GDPR violations relating to unclear terms and for failing to provide valid consent.96 It should be noted that U.S. courts do not recognize, nor are they required to enforce, judgments for "the collection of taxes, fines, or penalties rendered by the courts of other [nation] states."97 On February 17, 2020, the EU's digital and competition chief told reporters that automated facial recognition breaches the GDPR because the technology fails to meet the regulation's requirement for consent (Macaulay 2020). Moreover, it was reported that the EU plans to regulate certain applications of facial-recognition technology in the future because it can violate EU subjects' privacy rights (Stupp 2020).

Although used in a majority of airports worldwide, biometric data systems such as fingerprint image and facial recognition technologies are subject to very strict conditions under the European 2016 General Data Protection Regulation (GDPR). Due to its sensitivity, prohibition of biometric data processing is the rule, except if strong safeguards are provided to protect personal data (express consent, proportionality, necessity, limited storage time are some of the key principles). Thus, while automatization of border control (i.e., PARAFE E-Gates or a CLEAR® security check point) is legitimate, and even, to a certain extent, improvement of passenger journey experience (facilitation), mass surveillance to monitor passenger behavior along his airport journey is usually banned. In addition, given the high financial penalties at stake, the GDPR extra-territorial scope is another crucial issue a non-EU stakeholder shall consider, in particular if it has an establishment based in the EU or if it provides services to individuals based in the EU (no matter the individual nationality).

Isabelle Lelieur, Partner and Avocat à la Cour at Chevrier Avocats, Paris, France (personal communication)

In 2018, the EU released Regulation (EU) 2018/1725 of the European Parliament and of the Council of 23 October 2018 on the protection of natural persons with regard to the processing of personal data by the EU institutions, bodies, offices, and agencies and on the free movement of such data, which repealed Regulation (EC) No 45/2001 and Decision No 1247/2002/EC.98 In other words, as a counterpart to the GDPR, the EU implemented rules to protect the privacy rights of its residents when dealing with public EU institutions. In addition to the EU, approximately 28 countries have enacted legislation or promulgated regulations governing biometric data (Thales 2020).99

Figure 3-4. Seven GDPR requirements.95

International Organization Activities
There are a number of international organizations working on, coordinating, and collaborating with others to support initiatives to achieve a more seamless travel experience in the commercial aviation industry. Significantly, most advocate for global standards in an effort to incorporate uniformity in international aviation.100 (See Appendix K for a discussion tracing the history of several international initiatives relying on the collection and use of biometrics, including, among others, the ICAO DTC; the UN High Commissioner for Refugees Population Registration and Identity Management EcoSystem; IATA and ACI Smart Security and New Experience Travel Technologies; and the WEF Known Traveller Digital Identity.)

Best Practices/Privacy by Design
There are many publications and organizations offering advice and recommendations on protecting the privacy of consumers' and employees' biometric data.101 In the early 1970s, a set of principles known as the Fair Information Practice Principles (FIPPs) was developed (Cate 2006) and evolved over the next several decades to provide a framework for the protection of personally identifiable information. (See Appendix L for a description of the FIPPs core principles.) During the mid-1990s, Ann Cavoukian introduced another widely accepted approach, privacy by design.102 Privacy by design garnered international acceptance upon the unanimous passage of a resolution by the International Assembly of Privacy Commissioners and Data Protection Authorities (2010).
The concept advocates the integration of privacy into data systems and technologies at every stage of the design process to reflect a "design thinking" perspective.103 In essence, the seven principles of privacy by design promote incorporation of privacy protections into the design and build of technology and information technology (IT) systems, as well as applicable policies and practices, including those governing retention and destruction. The principles stress that inclusion of these protections can be made without sacrificing functionality or security. Further, they advocate business accountability to the public for privacy protections and providing data subjects with avenues of redress and an "active role" in the collection, use, and protection of their data.

Key Takeaway
With the expected increase in the use of biometric data, including facial recognition, and the anticipated benefits and efficiencies from its use in the aviation industry, stakeholders need to find the right balance of privacy protections and technological applications.

The seven foundational principles for privacy by design are shown in Figure 3-5 and explained in the following.
1. Proactive not reactive. Builds privacy considerations into technologies, policies, practices, and so forth to prevent invasive events before they can occur.
2. Privacy as the default setting. Incorporates privacy protections automatically into IT systems or business practices to provide maximum safeguards by default.
3. Privacy embedded into design. Promotes the integration of privacy into IT systems and operations in a holistic way to ensure that privacy is a key foundational element without sacrificing functionality.
4. End-to-end security – full life-cycle protection. Extends privacy and security protections for the entire retention period until destruction.
5. Respect for user privacy – keep it user-centric. Aims to provide data subjects an active role in the collection, use, and protection of their data.
6. Full functionality – positive-sum, not zero-sum. Accommodates non-privacy objectives without trade-offs to achieve full functionality while protecting privacy; in other words,

this principle advocates the accommodation of all legitimate interests and objectives in a win–win manner, not through a dated either/or approach where unnecessary trade-offs are made, and supports the position that it is possible to have both privacy and security.
7. Visibility and transparency – keep it open. Assures users and providers that the collection and use of personal data comply with representations and objectives while providing for accountability through independent verification as well as avenues of redress (Cavoukian 2008).

In 2012, the Federal Trade Commission (FTC) issued a report on consumer data privacy (FTC 2012a)104 that recommended a number of best practices and addressed both substantive and procedural protections. Building on its 2010 preliminary report (FTC 2010), which had recommended that businesses build privacy protections into their operations as described by privacy by design, the 2012 FTC report proposed best practices that offer simplified choices giving consumers more meaningful control and that would increase the transparency of businesses' data collection and use practices (FTC 2012a; for a list of the practices recommended in the 2012 FTC report, see Appendix L). From a global perspective, it is worth noting recent EU developments, which include the 2019 EU Data Protection Guidelines (4/2019) on the GDPR requirements for Data Protection by Design and by Default (European Data Protection Board 2019).105 The draft guidelines (which were open for comment until January 2020) proposed measures and guidance addressing data protection by design, data protection by default, data subjects' rights, safeguard requirements, practical guidance on the application of the principles, and certification. Thus, the guidelines propose direction regarding what data protection obligations mean in practice and how to implement the data protection principles effectively.
Similarly, the EU published guidelines on consent describing the elements of valid consent [i.e., under GDPR Article 4(11); European Data Protection Board 2019]. The EU also published Guidelines (2/2020) on GDPR provisions on transfers of personal data between EU and non-EU public authorities and bodies (European Data Protection Board 2020b).

Figure 3-5. The seven principles of privacy by design.

Key Takeaway
Many of the principles of privacy protection (such as privacy by design) and the best practices discussed in this section have been incorporated into U.S. and international law, so compliance with these principles may place stakeholders on surer footing with the law.
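The "privacy as the default setting" principle discussed in this section translates directly into engineering practice: configuration objects should initialize with the most protective values, so that weakening any safeguard requires an explicit, affirmative choice by the data subject. The sketch below illustrates this; all field names are hypothetical and not drawn from any particular system.

```python
from dataclasses import dataclass

@dataclass
class SubjectPreferences:
    """Privacy-protective defaults: data sharing and marketing use
    are opt-in; minimization and scheduled deletion are on by default.
    All field names are hypothetical."""
    share_with_third_parties: bool = False   # opt-in only
    marketing_use: bool = False              # opt-in only
    collect_only_required_fields: bool = True
    delete_after_retention_period: bool = True

# A newly enrolled data subject gets maximum protection automatically,
# with no action required on their part.
prefs = SubjectPreferences()
```

The design choice here is that the zero-configuration state is also the maximally protective state, which is precisely what principle 2 requires.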

Commercial Developments
A recent initiative that focuses on transparency, consumer education, and the mitigation of privacy concerns is Digital Trust in Places and Routines (DTPR).106 DTPR is a collaborative project initiated by Sidewalk Labs, currently stewarded by Helpful Places, and joined by more than 100 participants to develop a standard that informs and advises individuals in simple language about complex forms of technology and data-collection activities. In particular, DTPR seeks to design transparent patterns and prototypes that entities can use to develop signage that conveys to members of the public information about the use and collection of digital technology.107 There are four components to DTPR: icons, a signage system (based on the icons), a "digital channel" (which, when a QR code is scanned, connects the individual with a website for additional information), and a set of definitions of key concepts, or taxonomy.108 Different icons convey different information: one, for example, conveys "the purpose of the technology; another, the logo of the entity responsible for the technology; and a third contains a QR code that takes the individual to a digital channel where they can learn more. In situations where identifying information is collected, a privacy-related, colored hexagon would also be displayed."109 The icons in Figure 3-6 are initial prototypes of a visual language for signage in the public realm that alerts the public to the presence of a digital technology. The black hexagons express the purpose of the technology, the blue and yellow hexagons show how identifiable information is used (see the online version of the report for the color figure), and the white hexagons display the entity responsible for the technology. Another white hexagon with a QR code and URL enables people to learn more.
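The four DTPR components map naturally onto a simple record structure that an operator could use to drive sign generation. The sketch below is purely illustrative; the field names, taxonomy values, and URL are hypothetical and are not drawn from the actual DTPR standard.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DtprSign:
    """Illustrative model of a DTPR-style sign: a purpose icon, the
    accountable entity, a digital channel, and taxonomy terms."""
    purpose_icon: str            # black hexagon naming the purpose
    accountable_entity: str      # white hexagon with the entity's logo
    digital_channel_url: str     # QR-code target for more information
    taxonomy_terms: List[str] = field(default_factory=list)
    collects_identifiable_info: bool = False

    def requires_privacy_hexagon(self) -> bool:
        # Per the description above, the colored privacy hexagon is
        # displayed only when identifying information is collected.
        return self.collects_identifiable_info

sign = DtprSign(
    purpose_icon="air_quality_sensing",
    accountable_entity="Example Airport Authority",
    digital_channel_url="https://example.org/dtpr/sensor-42",
)
```

Modeling the privacy hexagon as a derived property, rather than a free-standing flag, keeps the signage consistent with what the underlying technology actually collects.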
Source: Lord 2018.
Figure 3-6. Open-source icons aimed to facilitate digital transparency goals.

Findings
Significantly, the World Economic Forum (WEF) published a report in February 2020 in furtherance of an initiative launched in 2019 to design a governance framework for the responsible use of facial-recognition technology (World Economic Forum 2020c). The project applied a use-case approach focused on improving the air-passenger boarding experience. The WEF reports suggest a comprehensive approach to using not only facial-recognition technology but also other forms of biometric technology in the air environment (World Economic Forum 2020c). Components of the WEF framework include 10 principles for action,110 best practices,

an assessment questionnaire, and an audit framework (World Economic Forum 2020c). The two reports set forth a four-step approach to building the framework. Each step is intended to represent an additional level of commitment for achieving transparency and public trust. The December 2020 report focuses on auditing and certification not only to validate compliance with the principles for action, but also to build in accountability, monitoring, and opportunities for improving the flow management process (World Economic Forum 2020d). WEF, while noting that the framework may serve as a "blueprint" for other applications, concludes that the next steps of the pilot or "journey" are to:

Test the audit framework and certification scheme with industry actors, assess their relevance and the amount of work they create for actors seeking certification, and review them based on the observed results. If successful, this policy pilot will pave the way for the design of a standard for the responsible application of facial-recognition systems. Once the pilot project is completed, a multi-stakeholder coalition of actors committed to respecting and promoting this certification model will be formed (World Economic Forum 2020c).

The following are findings on actions drawn from legal research and best practices and designed in part to mitigate litigation risks as well as promote sound privacy practices.
This list is intended to identify considerations that may be factored into decisions to use or support the use of biometric data technology:
• Determine and assess whether the application of a type of biometric technology is, as a matter of policy and operations, key to your business as an employer supervising a workforce (e.g., timekeeping, access to restricted areas) or a member of the aviation transport industry providing a service to customers;
• Consult with legal advisers and understand legal (both federal and state) requirements and restrictions (particularly those with broad reach and severe penalties) that govern the collection and use of personal data, especially biometric data, and appreciate that laws may impose different requirements depending on the type of collection and use of specific biometric data;
• Review existing company policies, establish a compliance plan, and designate a compliance officer responsible for ensuring that applicable policies, practices, and systems meet privacy principles and legal requirements;
• Make sure that policies address obligations and issues of (advance) notice, (affirmative/written) consent (and the ability to opt out), sharing with third parties, the ability to access and amend personal data, and why your company collects specific data and the purposes for which it uses it;
• Focus on and include policies and practices that allow for reasonable accommodation as the law may require (e.g., religious reasons, Americans with Disabilities Act);
• Identify and address any applicable union issues;
• Inform employees and the public in clear and plain terms about your company's privacy policies and practices, including measures to protect the data, since communication is critically important;
• Meet with privacy and civil liberties groups, as warranted, and legislative representatives to inform them of specific programs and protections, to appreciate any privacy concerns, and to avoid misunderstandings or unwarranted criticism;
• Ensure that IT systems safeguard personal data, limit others' access, and are governed by practices limiting the collection of data to that which is authorized and necessary, retaining data for only as long as necessary, and providing for appropriate means of storage and disposal of data;
• Prohibit the sale or transfer of personal data to third parties without the express consent of the data subject;

• As a precaution, ensure that a mechanism is in place for notification of data breaches; and
• Stay current on changing laws (both domestic and international), try to anticipate foreseeable changes, and seek advice from counsel to appreciate and address any impacts on your company’s use of biometric data.111

Endnotes

1. See, e.g., In re Facebook Biometric Info. Privacy Litigation, 326 F.R.D. 535, 540 (N.D. Cal. 2018); Rivera v. Google Inc., 238 F. Supp. 3d 1088, 1090 (N.D. Ill. 2017).
2. Id.; see also Kosseff 2019.
3. This descriptive term has been used in over 400 legal articles. LEXIS NEXIS advanced search (5/19/2020).
4. Privacy Act of 1974, 5 U.S.C. § 552a (2018).
5. Health Insurance Portability and Accountability Act of 1996 (HIPAA) (codified at 42 U.S.C. § 1320d-6); Gramm-Leach-Bliley Act of 1999 (GLBA), 15 U.S.C. § 6805(a).
6. Federal Trade Commission Act, 15 U.S.C. §§ 41–58 (2018).
7. See, e.g., Tex. Bus. & Com. Code Ann. § 503.001 (West 2017); 2017 Wa. ALS 299; 2017 Wa. Ch. 299; 2017 Wa. HB 1493.
8. 1 Data Privacy, Protection, and Security Law § 1.02 (2020).
9. See Katz v. United States, 389 U.S. 347, 350–51 (1967).
10. Id.
11. Griswold v. Connecticut, 381 U.S. 479, 484 (1965).
12. See Griswold, 381 U.S. at 484 [quoting Boyd v. United States, 116 U.S. 616, 630 (1886)] (describing the Fourth and Fifth Amendments as “protection against all governmental invasions ‘of the sanctity of a man’s home and the privacies of life.’”).
13. United States v. Dionisio, 410 U.S. 1, 14 (1973) (“No person . . . can reasonably expect that his face will be a mystery to the world.”); see also Maryland v. King, 569 U.S. 435, 476 (2013) (Scalia, J., dissenting) (noting that taking a person’s photograph is not a Fourth Amendment search).
14. See, e.g., Patel v. Facebook, Inc., 932 F.3d 1264, 1273 (9th Cir. 2019) (recognizing a concrete privacy interest in one’s biometric face template), cert. denied, 140 S. Ct. 937 (2020). Although Patel does not squarely address the reasonableness of the use of facial recognition technology under the Fourth Amendment, its Article III standing analysis necessarily involves an examination of the privacy interests of persons subject to facial recognition technology, and such interests logically reflect those same persons’ expectations of privacy.
15. Note that there is no Fourth Amendment right to expunge government records of one’s identity. Johnson v. Quander, 440 F.3d 489 (D.C. Cir. 2006), cert. denied, 549 U.S. 945 (2006). Although indefinite retention of data deeply concerns other countries, U.S. courts have generally not shown the same level of concern. See, e.g., Gubala v. Time Warner Cable, Inc., 846 F.3d 909, 912 (7th Cir. 2017) (holding that the mere retention of data does not in itself cause an injury-in-fact absent a substantial risk of disclosure); United States v. Hasbajrami, 945 F.3d 641, 670, 670 n.20 (2d Cir. 2019) (noting that “[s]torage has little significance in its own right,” but caveating that “[t]he considerations might be different if the storage involved data responsive to a warrant and retained for the purpose of a domestic criminal prosecution”).
16. INS v. Delgado, 466 U.S. 210, 216 (1984).
17. Quander, 440 F.3d at 498 (citing Arizona v. Hicks, 480 U.S. 321 (1987)); see also King, 569 U.S. at 465 (“[O]nce respondent’s DNA was lawfully collected the STR analysis of respondent’s DNA . . . did not amount to a significant invasion of privacy that would render the DNA identification impermissible under the Fourth Amendment.”).
18. Riley v. California, 573 U.S. 373, 381 (2014) (quoting Brigham City v. Stuart, 547 U.S. 398, 403 (2006)).
19. Id. at 382.
20. United States v. Moore, 381 F. Supp. 3d 139 (D. Mass. 2019) (citing United States v. Bain, 874 F.3d 1, 11–12 (1st Cir. 2017)) (law enforcement’s use of a pole camera to record the homeowners’ comings and goings constituted a search).
21. Florida v. Jardines, 569 U.S. 1, 5 (2013).
22. Katz v. United States, 389 U.S. 347, 360 (1967).
23. Later cases analyzing Katz and drawing on Justice Harlan’s concurrence in the case have explained that, under this approach, the Fourth Amendment protects legitimate or reasonable expectations of privacy where: (1) “the individual, by his conduct, has exhibited an actual (subjective) expectation of privacy,” and (2) “the individual’s subjective expectation of privacy is one that society is prepared to recognize as reasonable.” Smith v. Maryland, 442 U.S. 735, 740 (1979) (internal quotation marks omitted) (quoting Katz, 389 U.S. at 361 (Harlan, J., concurring)); Mariko H., Privacy in Public Spaces: The Reasonable Expectation of Privacy Against the Dragnet Use of Facial Recognition Technology, 49 Conn. L. Rev. 1591 (2017).
24. See Howick, J. L., The Fourth Amendment and Airports (2016), ACRP 1101, https://www.nap.edu/catalog/23500/the-fourth-amendment-and-airports.
25. Kowalski v. Scott, No. Civ.A.02-7197, 2004 WL 1240658, at *1 (E.D. Pa. May 26, 2004), aff’d, 126 Fed. App’x 558 (3d Cir. 2005).
26. United States v. Knotts, 460 U.S. 276, 281 (1983); United States v. Jones, 565 U.S. 400, 430 (2012) (noting that monitoring of a person’s movements is reasonable where short-term).
27. Hudson v. Palmer, 468 U.S. 517 (1984) (no reasonable expectation of privacy).
28. United States v. Ramsey, 431 U.S. 606, 619 (1977).
29. Terry v. Ohio, 392 U.S. 1 (1968).
30. Florida v. Rodriguez, 469 U.S. 1 (1984).
31. Id.
32. Kyllo v. United States, 533 U.S. 27, 34 (2001); United States v. Jones, 565 U.S. 400, 416, 428 (2012) (GPS monitoring for extended periods of time); Carpenter v. United States, 138 S. Ct. 2206, 2215 (2018) (technological advances in tracking cell-site location information); Riley v. California, 573 U.S. 373, 386 (2014) (modern cell phone storage of “vast quantities of personal information”).
33. Carpenter, 138 S. Ct. at 2218.
34. Riley, 573 U.S. at 393.
35. But see United States v. Moore-Bush, 963 F.3d 29 (1st Cir. 2020) (applying the Katz principle and holding that use of a pole camera comports with the rule that “a person does not have a reasonable expectation of privacy in the actions he or she exposes to the public view”); Koops, Bert-Jaap, et al., Location Tracking by Police: The Regulation of ‘Tireless and Absolute Surveillance’, 9 U.C. Irvine L. Rev. 635 (March 2019) (suggesting that factors such as duration, intensity, use, and active generation of data are relevant to an assessment of privacy intrusion by police tracking).
36. United States v. Ramsey, 431 U.S. 606 (1977).
37. See Almeida-Sanchez v. United States, 413 U.S. 266, 272–273 (1973); Ramsey, 431 U.S. at 606, 610 n.2.
38. Cassidy v. Chertoff, 471 F.3d 67, 76 (2d Cir. 2006) (noting that “society has long accepted a heightened level of security and privacy intrusion with regard to air travel”); United States v. Herzbrun, 723 F.2d 773, 775 (11th Cir. 1984); United States v. Albarado, 495 F.2d 799, 805 (2d Cir. 1974); United States v. Skipwith, 482 F.2d 1272, 1275 (5th Cir. 1973); United States v. Davis, 482 F.2d 893, 910 (9th Cir. 1973); United States v. Hartwell, 296 F. Supp. 2d 596, 602–05 (E.D. Pa. 2003); People v. Hyde, 524 P.2d 830 (Cal. 1974).
39. United States v. Aukai, 497 F.3d 955, 960 (9th Cir. 2007) (“The constitutionality of an airport screening search, however, does not depend on consent. . . . Rather, where an airport screening search is otherwise reasonable and conducted pursuant to statutory authority, 49 U.S.C. § 44901, all that is required is the passenger’s election to attempt entry into the secured area of an airport.”); United States v. Marquez, 410 F.3d 612, 617 (9th Cir. 2005) (noting that airport searches are conducted for the parallel purposes of “prevent[ing] passengers from carrying weapons or explosives onto the aircraft” and “deter[ring] passengers from even attempting to do so”).
40. Although there is no Supreme Court holding directly on point, Supreme Court dicta and lower courts are all in general agreement. See, e.g., Chandler v. Miller, 520 U.S. 305, 323 (1997) (“We reiterate, too, that where the risk to public safety is substantial and real, blanket suspicionless searches calibrated to the risk may rank as ‘reasonable’—for example, searches now routine at airports and at entrances to courts and other official buildings.”); see also Corbett v. TSA, 767 F.3d 1171, 1179 (11th Cir. 2014) (holding that suspicionless scans for explosives and pat-downs are reasonable administrative searches); Ruskai v. Pistole, 775 F.3d 61, 77 (1st Cir. 2014) (same, for just pat-downs); Electronic Privacy Information Center v. DHS, 653 F.3d 1 (D.C. Cir. 2011) (same, for just scans for explosives).
41. Cassidy, 471 F.3d at 75.
42. Harlow v. Fitzgerald, 457 U.S. 800 (1982).
43. Bivens v. Six Unknown Named Agents of Fed. Bureau of Narcotics, 403 U.S. 388, 389 (1971).
44. See, e.g., United States v. Jacobsen, 466 U.S. 109, 115 (1984) (explaining that private searches do not trigger the Fourth Amendment unless the private actor was operating as an agent or instrument of law enforcement at the time of the disputed conduct); Skinner v. Ry. Labor Executives’ Ass’n, 489 U.S. 602, 614 (1989); Coolidge v. New Hampshire, 403 U.S. 443, 487 (1971).
45. See United States v. Momoh, 427 F.3d 137 (1st Cir. 2005) (noting that even if the private party was acting as a government agent, her search of defendant’s package would have been permissible under the “border search” exception); United States v. Rodriquez, 596 F.2d 169 (6th Cir. 1979) (private search of package and subsequent examination of contents valid under plain view); see also Gonzales v. FedEx Ground Package Sys., No. 12-CV-80125-RYSKAMP/HOPKINS, 2013 WL 12080223, at *1 (S.D. Fla. Aug. 1, 2013) (holding that 19 U.S.C. § 507 protected FedEx acting reasonably to assist CBP officers).

46. Cassidy v. Chertoff, 471 F.3d 67, 74 (2d Cir. 2006); Brees v. HMS Global Mar. Inc., 431 F. Supp. 3d 1207 (W.D. Wash. 2020).
47. The Supreme Court has not directly addressed the application of the special needs exception to airport administrative searches, but, in dicta, the Court signaled the likely validity of its reasonableness in two cases: see Chandler v. Miller, 520 U.S. 305, 323 (1997) (suspicionless searches “may rank as ‘reasonable’—for example, searches now routine at airports”); City of Indianapolis v. Edmond, 531 U.S. 32, 47–48 (2000) (“holding also does not affect the validity of border searches or searches at places like airports and government buildings, where the need for such measures to ensure public safety can be particularly acute.”).
48. In a review of case law to determine any suits filed against the FBI, CBP, or TSA regarding the use of facial recognition, the only cases located to date concerned actions filed under the Freedom of Information Act for agency records pertaining to CBP’s use of facial recognition technology. See, e.g., ACLU v. DHS, Case 1:20-cv-02213 (S.D.N.Y. March 12, 2020); EPIC v. CBP, No. 19-cv-689 (D.D.C. March 12, 2019) (settled on April 24, 2020).
49. U.S. Gov’t Accountability Office, GAO-15-621, Facial Recognition Technology (2015), at 28 (referencing the Video Voyeurism Prevention Act of 2004, codified at 18 U.S.C. § 1801); the list would also include the Children’s Online Privacy Protection Act (15 U.S.C. §§ 6501–6506).
50. Id.
51. 5 U.S.C. § 552a (2018).
52. Constitutional Topic: How a Bill Becomes a Law, https://www.usconstitution.net/consttop_law.html (last visited Aug. 3, 2020).
53. See www.congress.gov (last visited July 24, 2020). See, e.g., Privacy Bill of Rights, S. 1214, 116th Cong. (2019); Consumer Data Privacy and Security Act, S. 3456, 116th Cong. (2020); American Data Dissemination Act, S. 142, 116th Cong. (2019); Data Care Act of 2019, S. 2961, 116th Cong. (2019); Digital Accountability and Transparency to Advance Privacy Act, S. 583, 116th Cong. (2019); H.R. 3900, 116th Cong. (2019); Privacy Score Act of 2020, H.R. 6227, 116th Cong. (2020); Office of Biometric Identity Management Authorization Act of 2019, H.R. 1729, 116th Cong. (2019).
54. www.congress.gov (last visited 03/21/2021).
55. Mini-Symposium on Comprehensive Data Privacy Reform Legislation in the United States: Data Privacy and the Financial Services Industry: A Federal Approach to Consumer Protection, 24 N.C. Banking Inst. 527 (March 2020); General Data Protection Regulation (GDPR), Commission Regulation 2016/679, 2016 O.J. (L 119) 1, Arts. 4 and 9.
56. Id., at fn. 4.
57. See also Oregon’s Consumer Information Protection Act (OCIPA, ORS 646A-600, et seq.), effective 01/01/2020.
58. For a list of state laws, see Ramirez 2019.
59. See, e.g., Ark. Code Ann. § 4-110-103(7) (effective July 2019, Arkansas revised its breach notification law to include biometric data within its definition of personal information); Louisiana Revised Statutes 51:3071, et seq. (2018) (amending its definition of personal information under its breach law to extend protections to biometric data and include a private cause of action); N.C. Gen. Stat. § 14-113.20(B); Iowa Code Ann. § 715C.1(11)(a)(5) (West 2014) (defining personal information to include “unique biometric data, such as a fingerprint, retina or iris image, or other unique physical representation or digital representation of biometric data”); Neb. Rev. Stat. Ann. § 87-802(5)(a)(v) (West 2016) (defining personal information as including “unique biometric data, such as a fingerprint, voice print, or retina or iris image, or other unique physical representation”); Wis. Stat. Ann. § 943.201(1)(b)(13) (West 2017) (defining biometric data as “including fingerprint, voice print, retina or iris image, or any other unique physical representation”).
60. See National Conference of State Legislatures 2019b (listing state bills introduced, including instances where proposed amendments to the definition of personal information failed to pass, e.g., Florida FL HB 1153).
61. 740 Ill. Comp. Stat. 14/15 (2013) (BIPA).
62. Id.
63. Id.
64. See Bryant v. Compass Grp. USA, Inc., 958 F.3d 617 (7th Cir. May 5, 2020) for a discussion of standing requirements and decisions (standing found in a case where plaintiff alleged violation of BIPA for the failure to comply with statutory requirements associated with an account to biometrically access a vending machine).
65. H.R. 2478, 50th Leg., Second Reg. Sess. (Ariz. 2012); H.R. 1153, 2019 Sess. (Fla. 2019); S. 120, 191st General Court (Mass. 2019).
66. See, e.g., National Conference of State Legislatures 2019b.
67. Gibbons v. Ogden, 22 U.S. 1, 211 (1824).
68. See English v. Gen. Elec. Co., 496 U.S. 72, 78–79 (1990).
69. Id. at 79.
70. Id. (citing Fla. Lime & Avocado Growers, Inc. v. Paul, 373 U.S. 132, 142–43 (1963)).

71. For an extensive discussion of the Supremacy Clause and the doctrine of preemption, see State v. Martinez, 896 N.W.2d 737 (Iowa 2017).
72. Palomino v. Facebook, 2017 U.S. LEXIS 2971 (N.D. Cal. 2017) (quoting Atl. Marine Const. Co. v. U.S. Dist. Court for W. Dist. of Texas, 134 S. Ct. 568, 582, 187 L. Ed. 2d 487 (2013)).
73. Id., at 7–8.
74. See, e.g., Iraola 2003.
75. Commonwealth v. McCarthy, 142 N.E.3d 1090, 1106 (Mass. 2020) (determining that it could not say precisely how detailed a picture of the defendant’s movements must be revealed to invoke constitutional protections).
76. Id.; see also Blitz 2004 (comprehensive discussion of the evolution of privacy protections in the law).
77. Bah v. Apple Inc., No. 19-cv-3539 (PKC), 2020 WL 614932, at *1 (S.D.N.Y. Feb. 10, 2020). In a nightmare of a case, Ousmane Bah, a teenager, lost his learner’s permit, which was subsequently used by a person or persons unknown to commit thefts from Apple stores in four states. Bah was arrested for the thefts based on facial recognition misidentification information “linking” him to the thefts, which Apple provided to law enforcement. Bah has sued for false arrest, defamation, and malicious prosecution.
78. Id.; see, e.g., Hamann and Smith 2019.
79. NIST.IR 8271, at 2.
80. See, e.g., EEOC v. Consol. Energy, Inc., 860 F.3d 131 (4th Cir. 2017), cert. denied, 138 S. Ct. 976 (2018).
81. Id. at 143.
82. Nat’l Fed’n of the Blind v. United Airlines, Inc., 813 F.3d 718 (9th Cir. 2016) (referencing 14 CFR § 382.57, providing that generally biometrics may not be the only means of user identification or control); see also EEOC v. Orion Energy Systems, 208 F. Supp. 3d 989 (E.D. Wis. 2016) (no violation of the ADA since the biometric health screening assessment was voluntary).
83. See Countries Applying Biometrics, https://en.wikipedia.org/wiki/Countries_applying_biometrics (list of various countries’ biometric laws); Carrero 2018 (citing examples involving Canada, Brazil, Australia, and Argentina); World Economic Forum 2017 (detailing various country programs using biometrics).
84. See Commission Regulation 2016/679, 2016 O.J. (L 119) 1, Art. 3.
85. GDPR, art. 3(1)–(3); Recital 25; Determann 2020, pp. 236–237; GDPR, Art. 3.
86. GDPR, art. 23.
87. GDPR, arts. 2 and 4; see also art. 5 (setting out the principles governing the processing of personal data).
88. GDPR, art. 9(2)–(4).
89. GDPR, art. 4(2) (processing means collecting, recording, organizing, structuring, storing, adapting, using, sharing, etc.); id., art. 33; id., art. 15; id., art. 7; id., art. 20; id., art. 25; id., arts. 37–39; Biometric Data and Data Protection Regulations (GDPR and CCPA), Thales (June 27, 2020), https://www.thalesgroup.com/en/markets/digital-identity-and-security/government/biometrics/biometric-data.
90. See GDPR, art. 45(1).
91. European Commission, Commission Implementing Decision (EU) 2016/1250, 2016 O.J. (L 207) 11. See also Determann 2020, p. 238.
92. Court of Justice of the European Union Press Release No. 91/20 (July 16, 2020), https://curia.europa.eu/jcms/upload/docs/application/pdf/2020-07/cp200091en.pdf.
93. Id.
94. Id.
95. GDPR, art. 4(2) (processing means collecting, recording, organizing, structuring, storing, adapting, using, sharing, etc.); art. 33; art. 15; art. 7; art. 20; art. 25; arts. 37–39; Biometric Data and Data Protection Regulations (GDPR and CCPA), Thales (June 27, 2020), https://www.thalesgroup.com/en/markets/digital-identity-and-security/government/biometrics/biometric-data.
96. Shyy 2020 (concluding that “the GDPR is both ineffective in protecting consumer privacy and burdensome on businesses of all sizes,” at 157).
97. Restatement (Third) of the Foreign Relations Law of the United States, § 483 (Am. Law Inst. 1987); see Wallace 2019 for an extensive discussion of comity and the principles associated with the enforcement of the GDPR in the United States.
98. See Parliament and Council Regulation 2018/1725, 2018 O.J. (L 295) 39, available at https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32018R1725&from=EN.
99. For a comprehensive list of foreign data privacy laws, see Greenleaf 2019.
100. See, e.g., ICAO n.d.-a.
101. See, e.g., National Institute of Standards and Technology 2020; U.S. Department of Homeland Security 2008a; Mandler et al. 2017; Amelung 2019.
102. Cavoukian is credited with developing this approach while Information and Privacy Commissioner of Ontario (Cavoukian 2009).
103. See Cavoukian 2009; Wong 2020.

104. See also Federal Trade Commission 2012b.
105. See also European Data Protection Board 2020a. For a comprehensive summary of the elements of data protection by design and by default, see www.twobirds.com/en/news/articles/2019/global/edpb-publishes-guidelines-on-data-protection-by-design-and-by-default.
106. See Designing for Digital Transparency in the Public Realm, https://sidewalklabs.com/dtpr/.
107. Id.
108. Id.
109. Id.
110. Those 10 principles for action are (1) Proportional use of facial recognition systems, (2) Risk assessment, (3) Bias and discrimination, (4) Privacy by design, (5) Performance, (6) Right to information, (7) Consent, (8) Information display, (9) Right of access to vulnerable groups, and (10) Alternative option and human presence. World Economic Forum 2020c, p. 16.
111. For examples of other checklists, see, e.g., Prescott 2020; Information Commissioner’s Office n.d.


Biometrics is one of the most powerful, but misunderstood technologies used at airports today. The ability to increase the speed of individual processes, as well as offer a touch-free experience throughout an entire journey is a revolution that is decades in the making.

The TRB Airport Cooperative Research Program's ACRP Research Report 233: Airport Biometrics: A Primer is designed to help aviation stakeholders, especially airport operators, to understand the range of issues and choices available when considering, and deciding on, a scalable and effective set of solutions using biometrics. These solutions may serve as a platform to accommodate growth as well as addressing the near-term focus regarding safe operations during the COVID-19 pandemic.
