
Airport Biometrics: A Primer (2021)

Chapter: Appendix K - Legal, Policy, and Privacy Review

Suggested Citation: "Appendix K - Legal, Policy, and Privacy Review." National Academies of Sciences, Engineering, and Medicine. 2021. Airport Biometrics: A Primer. Washington, DC: The National Academies Press. doi: 10.17226/26180.


Legal, Policy, and Privacy Review

The Supreme Court has recognized the common law root of the right to privacy.113 There are four well-established common law privacy torts: (1) unreasonable intrusion upon someone's seclusion, (2) appropriation of a person's name or likeness, (3) unreasonable disclosure of private facts, and (4) publicity that unreasonably places another in a false light.114 The factual scenario most likely implicated is the unreasonable intrusion upon someone's privacy, but even this is unlikely unless facial-recognition technology is deployed in circumstances where the intrusion is "especially private" (e.g., a bathroom).115 The Supreme Court has decided that while not expressly set out in the Constitution, privacy protections are implicit or within the "penumbra" of several provisions of the Constitution, including the prohibition of unreasonable searches and seizures under the Fourth Amendment and other amendments in the Bill of Rights.116 In addition, while beyond the scope of this report, the use of facial recognition combined with public surveillance can implicate First Amendment considerations (Zick 2007).
With respect to the issue of identification, the Self-Incrimination Clause in particular is implicated, but it does not protect against the compelled collection of physical evidence, such as blood, DNA, or fingerprints, or its introduction at trial.117 Rather, the evidence sought must be testimonial and incriminating, and the Supreme Court has held that the disclosure of one's identity "is likely to be so insignificant in the scheme of things as to be incriminating only in unusual circumstances."118 Thus, even without an individual's consent, the collection and disclosure of biometric data, which is neither testimonial nor incriminating, arguably does not violate the Self-Incrimination Clause, but lower courts have recently been split on the issue.119 Although the Supreme Court has not considered whether the specific use of facial-recognition technology might violate provisions of the Constitution, it has rendered decisions on the use of technology.120 For example, in Kyllo v. United States, the Supreme Court held that the use of thermal imaging technology to determine the amount of heat within a person's house (presumed to be associated with marijuana cultivation) was an unreasonable search, given the heightened privacy expectation associated with one's home.121 It may be argued, however, that the Court was more concerned with where the search occurred than with the technology used to conduct it (Haas 2019). In subsequent decisions where the facts focused on the use of technology for the collection of location data or cell phone information, the Supreme Court rendered decisions that eroded in part the doctrine that Fourth Amendment protections did not extend to public spaces or information received from third parties.
For example, in 2012, the Court decided that the installation of a GPS tracking device without a warrant violated the defendant's Fourth Amendment rights, holding that the physical installation of the device constituted an unlawful interference with the suspect's property interests in the vehicle.122 Six years later, the Supreme Court

decided that the government's acquisition of information from a third party on the defendant's cell-site location was a search under the Fourth Amendment for which a warrant supported by probable cause was required.123

Private Parties

As noted in Chapter 3, where a private party acts as an agent of the government, that conduct may implicate Fourth Amendment principles pertaining to searches. Examples where the courts have found that a search by a private party was conducted as an instrument of the government include where an airline employee, who had been an informant in the past and had been financially rewarded for his assistance, performed a luggage search,124 and where an airline employee conducted a search of luggage pursuant to airline and FAA procedures.125 The fact that the federal government has not compelled a private party to take a particular action does not by itself establish that such action is a private one.126 For example, encouragement or endorsement of a certain action by authorizing it and removing all legal barriers to it "suffice[s] to implicate the [Constitution]."127 The issue of whether a company that sells facial-recognition technology to the government can be considered a state actor (and subject to Fourth Amendment rules to protect privacy) has arisen in a number of cases brought against Clearview AI.128 In essence, the complaints filed against Clearview AI contend the company was acting as an agent of the federal government by:

• Allegedly scraping billions of facial images from the Internet,
• Performing facial scans of those images, and
• Creating a biometric database that allowed users of the database to immediately identify a member of the public merely by uploading a person's image to the database.129

DNA Identification as a Case Study

Although courts have not yet had the opportunity to examine the governmental use of facial-recognition technology,130 recent case law on the collection of DNA and
creation of DNA profiles provides a useful framework for understanding how courts will likely address such conduct in the future. Ultimately, the similarities and differences between DNA profiling and the use of facial-recognition technology arguably demonstrate the latter's legality. This case law is the product of disputes arising from the DNA Analysis Backlog Elimination Act of 2000 (DNA Act), which created the Combined DNA Index System (CODIS).131 CODIS, which is supervised by the FBI, connects DNA laboratories at the local, state, and national levels; collects DNA profiles from arrestees, convicted offenders, and forensic evidence found at crime scenes; and sets uniform standards for DNA matching.132 Although the DNA used for such matching is often referred to as a "genetic fingerprint," it does not have any association with a genetic disease or any other genetic disposition.133 Thus, while it is functionally equivalent to a fingerprint, the information in the database is useful only for "human identity testing."134 However, that is not to say that the Supreme Court has not been divided over the government's asserted purpose for DNA identification. In Maryland v. King, a close 5-4 decision from 2013, the Court held that the use of a cheek swab to obtain an arrestee's DNA sample and the DNA identification of the same arrestee as the perpetrator of an unsolved crime from 6 years earlier was reasonable under the Fourth Amendment.135 Justice Kennedy reasoned that the fact that the DNA identification search was part of a routine booking procedure precluded any need for a warrant.136

Next, Justice Kennedy balanced the government's legitimate interests against the degree to which the search intrudes upon an individual's privacy, finding that the former's interest in verifying the identity of a person in detention; determining the risks such person presents to facility staff, the existing detainee population, and new detainees; ensuring the accused person's availability for trial; deciding whether the individual can safely be released on bail; and preventing the detention of innocent people outweighs the privacy intrusions from DNA identification.137 Specifically, Justice Kennedy noted that the intrusion of a cheek swab to obtain a DNA sample is a minimal one because its context, police custody, diminishes privacy expectations; it is brief, painless, and does not break the skin; and the sample reveals only an individual's identity, not any genetic traits.138 Thus, he concluded that the search in this case, unsupported by any individualized suspicion beyond that which supported the arrest, was reasonable.139 In his dissent, Justice Scalia responded that the Court's assertion that "DNA is being taken, not to solve crimes, but to identify those in the State's custody" "taxes the credulity of the credulous."140 He maintained that the majority uses "identify" to mean finding out what unsolved crimes the arrestee has committed, since, in this case, what the DNA match identified was a previously taken sample from an earlier crime.141 In fact, King's identity was already known when he was charged, before the DNA database returned a match.142 Consequently, because suspicionless searches have been permitted only when the primary purpose of the search was not to detect evidence of ordinary criminal wrongdoing, DNA identification searches must, Justice Scalia contended, require some level of individualized suspicion.143 It is well established that suspicionless searches of all persons (such as fingerprinting) for ordinary
law-enforcement purposes would violate the Fourth Amendment.144 Extrapolating from both King opinions, the use of facial-recognition technology to identify air passengers is arguably even more safely within the ambit of Fourth Amendment reasonableness than DNA identification. Unlike DNA identification, identification via facial recognition would be for non–law enforcement investigatory purposes, including verifying that the passenger may lawfully enter or exit the country and ensuring the safety and security of other passengers, the airplane crew, and society at large. Moreover, if suspicionless searches of passengers' persons and property can already be lawfully conducted at airports as lawful border searches or administrative searches, a search that is more limited in scope, not physically intrusive, and functionally equivalent to the current practice of examining a passenger's passport is consistent with the Fourth Amendment.

Survey of Federal Privacy Laws Relevant to Airport Operators and Stakeholders

Generally, federal privacy laws (1) govern a federal agency's collection and use of biometric data, or (2) protect consumers' privacy with respect to the collection and use of their biometric data and provide them a cause of action. There are a number of federal privacy laws applicable to the collection and use of biometric data by federal agencies as well as to specific activities by various industries in distinct business sectors, some of which are relevant to airport operators and stakeholders. Notably, a 2015 GAO report observes that there are no federal laws restricting the capture of facial images except with respect to matters associated with minors.145 With respect to legal protections pertaining to commercial uses of biometric data, including facial-recognition technology, federal laws can be divided into three broad categories that address privacy and consumer protection for "(1) the capture of facial images; (2) the collection,

use, and sharing of personal data; and (3) unfair or deceptive acts or practices, such as failure to comply with a company's stated privacy policies."146 The overarching federal law governing federal agencies' collection, use, and disclosure of PII such as biometric data is the Privacy Act of 1974.147

Privacy Act of 1974

The Privacy Act of 1974 governs what information is collected, maintained, and used by federal agencies.148 Where a federal agency creates records retrievable by personally identifying information to satisfy requirements imposed by law,149 the Privacy Act of 1974 sets out the legal requirements governing the collection, use, retention, dissemination, disclosure, and maintenance of individuals' personal information. When seeking personal information from an individual, the act requires agencies to provide a Privacy Act Statement that informs the person about the authority for the collection, the purpose, the routine uses of such information, and the consequences should the individual decline to provide the information.150 In addition to the controls imposed on federal agencies, the act creates rights for the persons on whom the records are maintained.151 Generally, an agency must publish public notice in the Federal Register about the collection, maintenance, use, and dissemination of information about individuals maintained in a system of records (referred to as a System of Records Notice or SORN) and must conduct a PIA analyzing how the data are used and determining whether that use meets applicable legal and policy privacy requirements.152 Further, agencies may not disclose records on individuals without their consent unless one of 12 exceptions applies (such as for law enforcement and national security purposes).153 The act imposes recordkeeping, retention, and destruction requirements as well.154 There are civil and criminal penalties applicable to violations of the act.155 In addition to the application of
the Privacy Act, federal agencies' collection of biometric data, particularly facial-recognition data, is authorized or governed by laws, regulations, and policies pertaining to:

• Requirements on airport operators to establish airport security programs;156
• Requirements for anyone, including airport and air carrier employees, requiring unescorted access to aircraft and secure areas to obtain a SIDA badge;157
• The issuance of airman certificates;158
• The collection of biometric data from travelers for border control purposes;159
• The establishment of an improved visa issuance system, ESTA;160
• The creation of trusted traveler programs;161 and
• Travel security, defense, counterterrorism, and law enforcement.162

Even where federal agencies are using biometric or facial-recognition technology linked to sector-specific activities, there are gaps in the law's coverage of substantive and procedural protections for subjects' data. For example, CBP implemented a biometric exit system required by law, but it has not mandated cooperation from carriers or other authorities.163 After almost a decade of collecting biometrics (fingerprints) from arriving aliens,164 CBP launched the TVS in 2017 to capture facial images from passengers departing the United States.165 Under TVS, information from a passenger's check-in for a flight is used to compile a gallery of pre-existing photographs of the passenger, such as from visas or passports, and may include photographs from previous encounters with CBP or other DHS components. Prior to boarding, the camera takes a photograph

of the passenger, and TVS compares it to the photos in the gallery to verify the passenger's identity. In essence, TVS uses an algorithm to "perform both 1:N and 1:1 facial-recognition matching."166 Once confirmed, after a matter of seconds, the passenger is free to board, and CBP creates an exit record for the passenger.167 Photographs of U.S. citizens and exempt aliens are deleted by CBP within 12 hours of verification.168 CBP has indicated that U.S. citizens entering or departing the United States may opt out of having their picture taken and that the verification will be done manually by a CBP officer or an airline/airport representative.169 Further, according to a 2020 GAO report, foreign nationals may also opt out if an air carrier or third party conducts the facial-recognition verification.170 CBP, however, stores photographs of non-immigrant aliens and lawful permanent residents for up to 14 days in one database, and photos of "in-scope" travelers171 are stored for 75 years in another database. CBP stores biographic exit records for every traveler, regardless of citizenship or status. CBP stores biographic exit records of U.S.
citizens and lawful permanent residents for 15 years and exit records of non-immigrant aliens for 75 years.172 No photos are shared with travel stakeholders, only the results of the biometric match (or no match).173 In an effort to assess and address CBP's use of TVS, the DHS Privacy Office sought a review and recommendations from the DHS Data Privacy and Integrity Advisory Committee (DPIAC).174 The DPIAC made a number of recommendations addressing transparency with respect to notice (both substantive and procedural); consultation with the scientific community to determine how many years facial images remain reliable/accurate; data minimization; steps to ensure greater data accuracy (including training partner airlines on use of the technology); and accountability and auditing.175 While CBP's TVS has not been free from criticism, as of July 2020, only two lawsuits had been filed in connection with the use of its facial-recognition technology for border entry/exit purposes. Both complaints were filed in connection with requests under the Freedom of Information Act for records pertaining to CBP's TVS.176 Similarly, TSA has been developing biometric solutions to perform travel document checks for passengers proceeding through security checkpoints. In 2018, the agency released its TSA Biometrics Roadmap for Aviation Security & the Passenger Experience. The Biometrics Roadmap outlines four goals:

• Partner with CBP on biometrics for international travelers,
• Operationalize biometrics for TSA PreCheck travelers,
• Expand biometrics to additional domestic travelers, and
• Develop support infrastructure for biometric solutions (TSA 2018b).

In collaboration with CBP, TSA is deploying Credential Authentication Technology, which is used to authenticate the ID credential presented by the passenger (TSA 2020c).
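The two matching modes the TVS documentation attributes to its algorithm, 1:1 verification (does a live photo match one claimed identity?) and 1:N identification (which gallery entry, if any, matches a live photo?), can be illustrated with a minimal sketch. This is purely illustrative and is not CBP's implementation: real systems use learned face embeddings, and the feature vectors and threshold below are hypothetical stand-ins.

```python
# Illustrative-only sketch of 1:1 vs. 1:N facial matching.
# Embeddings and threshold are hypothetical, not operational values.
from math import sqrt

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm

def verify(live, claimed_template, threshold=0.9):
    """1:1 match: compare a live capture against one enrolled template."""
    return cosine_similarity(live, claimed_template) >= threshold

def identify(live, gallery, threshold=0.9):
    """1:N match: return the best-scoring gallery identity above threshold, or None."""
    best_id, best_score = None, threshold
    for identity, template in gallery.items():
        score = cosine_similarity(live, template)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Gallery compiled from pre-existing photos (e.g., passport or visa records).
gallery = {
    "passenger_a": [0.9, 0.1, 0.2],
    "passenger_b": [0.1, 0.8, 0.5],
}
live_capture = [0.88, 0.12, 0.21]  # photo taken at the boarding gate

assert verify(live_capture, gallery["passenger_a"])        # 1:1 succeeds
assert identify(live_capture, gallery) == "passenger_a"    # 1:N finds the match
```

The design point worth noting is that 1:N search is just repeated 1:1 comparison plus a best-score selection, which is why systems like TVS can shrink the effective N by pre-building a flight-specific gallery from check-in data.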
Lastly, the authors note that even in the absence of any lawsuit challenging the FBI's direct use of facial-recognition technology, there are numerous cases reporting challenges to the use, reliability, or introduction into evidence of the results of the agency's matches during criminal proceedings.177

The Federal Trade Commission Act and Deceptive and Unfair Practices

A core mission of the FTC is consumer protection. The Federal Trade Commission Act of 1914 (FTCA), as amended, authorizes the FTC to, among other things:

• Prevent and address unfair or deceptive acts or practices;
• Seek monetary damages and other relief for conduct injurious to consumers;
• Prescribe rules defining with specificity acts or practices that are unfair or deceptive, including the ability to establish requirements designed to prevent such acts or practices;
• Conduct investigations relating to the organization, business, practices, and management of entities engaged in commerce; and
• Make reports and recommendations to Congress and the public.178

Within the scope of the FTCA is the authority to combat unfair and deceptive practices regarding the collection and use of biometric data.179 The FTC's authority, however, is limited. For example, it may not require retailers to have privacy policies, only that if they do, they may not engage in unfair or deceptive practices (Zimmerman 2018; Voss and Houser 2019). Under the FTCA, the FTC can determine that an act is "unfair" if it includes such acts or practices involving foreign commerce that:

• Cause or are likely to cause reasonably foreseeable injury within the United States, or
• Involve material conduct occurring within the United States.180

The FTCA further provides that: "The Commission shall have no authority . . . to declare unlawful an act or practice on the grounds that such act or practice is unfair unless the act or practice causes or is likely to cause substantial injury to consumers which is not reasonably avoidable by consumers themselves and not outweighed by countervailing benefits to consumers or to competition."181 As an example of the exercise of its enforcement authority, the FTC recently amended a 2012 settlement order involving Facebook's alleged deceptive practice concerning consumers' control over the privacy of their personal information (i.e., sharing of users' personal data with third parties without their knowledge or consent).
On April 28, 2020, the FTC formally approved an order revising the settlement, under which Facebook agreed to pay $5 billion in penalties, make changes to its approach to privacy management, and be subject to FTC monitoring (FTC 2020).182 Finally, as discussed infra, while the FTC has not issued specific regulations governing the use of biometric data technology (Stewart 2019b; Wright 2019), it has undertaken to issue guidance, interpretive statements, and other similar public pronouncements to advise employers on privacy protections and best practices (Stewart 2019b; Wright 2019).

Authorities to Control Travel During a Pandemic

Federal law authorizes the Centers for Disease Control and Prevention (CDC) to exercise broad powers to conduct or require cooperative efforts to prevent the introduction, transmission, and spread of quarantinable and serious communicable diseases such as COVID-19.183 The CDC may require the assistance of Customs and Coast Guard officers as well.184 Generally, this authority extends to screening, isolating, quarantining, and "plac[ing] a person under surveillance" if the CDC director "has reason to believe" that the "arriving person" is infected with or has been exposed to a communicable disease listed under Executive Order 13295.185 Subject to certain limitations precluding the denial of entry to U.S. citizens and legal permanent residents, the scope of the CDC's authority and ability to screen persons arriving at ports of entry and engaged in interstate travel is clear (see CDC 2020). The CDC surveils or quarantines persons believed to have been exposed to a disease but who are not yet ill, and it isolates persons infected with a communicable disease.186 Of note, TSA may assist

the CDC by screening passengers in the interest of aviation safety, maintaining a do-not-board list (identifying persons identified by the CDC as public health threats),187 and canceling flights (if the CDC has determined that a person aboard the aircraft has been exposed to or infected with a pandemic disease).188 Similarly, airlines may refuse to board passengers infected with communicable diseases in accordance with DOT regulations (i.e., when the decision is based on "reasonable judgment that relies on current medical knowledge or on the best available evidence" that the person poses a "direct threat" to the health and safety of others).189 The law is less clear with respect to exit screening for outbound international flights. Although untested in the courts, arguably the authority to prevent the introduction, transmission, and spread of quarantinable and serious communicable diseases would support screening of departing persons, including contact tracing of infected travelers who have been in contact with other persons in the United States or who might shortly return to the United States, or to comply with WHO regulations.190 In late May of 2019, the CDC issued a checklist for states' health departments to design a contact tracing plan (CDC 2019a). Also, the CDC website on contact tracing is frequently updated to provide links to resources, tools, and guidance for public health staff and others (CDC 2019b). Note that health data are heavily regulated under federal and state law, with extensive privacy protections.191

U.S. Congress Legislative Activity

In its 116th session, Congress took up a number of bills proposing to legislate the use of biometric data, and there was every expectation that the next Congress would entertain similar legislation (see Figure K-1). Two bills merit monitoring.
The bill that most clearly implicates the use of facial recognition and the interests of airport stakeholders is the Commercial Facial Recognition Privacy Act (S. 847). The bill would, subject to certain important exceptions, generally prohibit the commercial use of facial-recognition technology to identify and track consumers without the end users' consent. As proposed, the bill would place limitations on the third-party sharing of collected facial biometric data, would require certain entities to meet minimum data security standards (to be published by the FTC in consultation with NIST), and would treat violations as unfair or deceptive acts or practices under the FTCA. The bill, which includes several exemptions, would except "security applications" that use the technology for loss prevention or to detect and prevent criminal activity.192 Another significant development is Senator Sherrod Brown's announcement on June 18, 2020, that he intends to introduce the Data Accountability and Transparency Act of 2020, a comprehensive privacy and data protection bill. The bill would not only create a privacy agency but would also prohibit private companies and government agencies from collecting personal data unless it is "strictly necessary" to carry out one of a few specified purposes. More significantly, it would ban the use of facial surveillance technology (EPIC 2020). The impacts on the use of facial-recognition technology could be enormous. This broad, sweeping bill would appear to prohibit any data aggregator (which includes federal agencies) from using facial-recognition technology or collecting, using, or sharing any personal data obtained from facial-recognition technology. The topic of facial recognition has also received attention in several Congressional hearings. The House Committee on Oversight and Reform has held three hearings on facial recognition.193 Three key concerns were identified: accuracy, transparency, and protection of civil liberties.194

Figure K-1. How a bill becomes law.

222 Airport Biometrics: A Primer

Survey of State Laws

This section contains a survey and description of state laws that specifically govern the collection and use of biometric data, including facial-recognition technology (see Figure K-2).

Figure K-2. States with state laws surveyed.

Illinois

Over a 2-year period (2018 to 2019), more than 200 lawsuits were filed for alleged violations of BIPA, and the number is reportedly rising (Prescott 2020). In 2019, the Illinois Supreme Court held in Rosenbach v. Six Flags Entertainment Corporation that a plaintiff need not allege an actual injury to qualify as an "aggrieved" person under specific provisions of BIPA, and it allowed the suit for penalties and damages to proceed.195 In that case, the Six Flags theme park took a minor's thumbprint for a season pass and future reentry to the theme park, without his consent and without informing him of the purpose of the data collection.196 The decision is significant not only because it recognizes technical violations as sufficient to show harm under BIPA, but also because it potentially lowers the bar for litigants bringing lawsuits in federal courts, which require a "case or controversy" showing to establish standing.197

In 2016, a plaintiff sued Google, claiming that the company's photo-tagging feature scanned her facial features from a photograph and created a facial template from the photograph in violation of BIPA.198 The court held that biometric information obtained from a photograph qualified as a biometric identifier under BIPA, rejecting Google's argument that only face scans taken in person would qualify as biometric identifiers under the statute.199 The case was subsequently dismissed for lack of proof of a concrete injury.200

Following on the heels of Rosenbach, users of Facebook living in Illinois brought suit in California for violations of BIPA associated with Facebook's use of facial-recognition

technology to "tag" persons in users' pictures.201 The Ninth Circuit Court of Appeals determined that Facebook's collection, use, and storage of biometric identifiers without a written release violated the procedural protections provided under BIPA, emphasizing that this was the type of harm targeted by the act, and thus found the plaintiffs had articulated "a concrete and particularized harm, sufficient to confer . . . standing."202 Potentially even more important was the Appellate Court's rejection of Facebook's argument that BIPA should have no extraterritorial application to actions Facebook undertook in another state. The Court opined: "it is reasonable to infer that the [Illinois] General Assembly contemplated BIPA's application to individuals who are located in Illinois, even if some relevant activities occur outside the state."203 The general rule with respect to the application of a state's law outside the state is that, absent a clear intent expressed in the provisions of the statute, courts will generally not find extraterritorial application of the law and will limit its scope to things or persons within the state.204

Not only are the courts wrestling with the extraterritorial application of BIPA, they are also considering lawsuits against third parties that have derived facial "geometries" from photographs, despite BIPA specifically excluding photographs from the definition of a biometric identifier subject to the law's provisions.205

In addition to litigation over the specific terms in BIPA, two decisions subsequent to Rosenbach are worth mentioning because they bear on the prerequisites to bringing suit: Miller v. Southwest Airlines206 and Crooms v. Southwest Airlines Co.207 In both cases, airline ramp agents from Midway Airport alleged that Southwest Airlines' practice of scanning and using their fingerprints for timekeeping purposes violated BIPA.
The Seventh Circuit in Miller and the District Court in Crooms held that because unions represented the plaintiffs' interests under a collective bargaining agreement, Illinois could not divest the union of its role. Ultimately, the courts decided that federal law208 required the plaintiffs to bring their claims before an adjustment board (and not in federal court) and thus affirmed dismissal of their claims.209

As these cases illustrate, there are many unsettled questions, particularly with respect to the interplay of federal and state law, including the degree of harm a plaintiff needs to show to have standing to pursue a case in federal court versus the technical violations under state law sufficient to bring a case under BIPA. As one court recently noted, federal courts and state courts define injury differently for purposes of determining standing to bring a suit in the respective courts.210

To date, most of the cases under BIPA have been class actions targeting employers' use of biometric technology at work. These lawsuits are not only on the rise but are also expensive and difficult to defend against. For example, in 2020, Facebook offered to settle a class action lawsuit (in California) for alleged violations of BIPA for $650 million (after the trial judge initially rejected the proposed $550 million settlement) (Shibu 2020). In May 2020, plaintiffs filed lawsuits against Facebook in Texas and Arizona, adding to the many other suits pending nationwide, and the company is reportedly facing billions in damages (Justia n.d.).

There is also uncertainty with respect to the application of BIPA (and other states' biometric privacy laws) under circumstances where airport stakeholders are partnering with government agencies, such as CBP under the TVS program.
Factors bearing on applicability, compliance, and potential liability include whether the state law addresses only commercial uses of biometric data and excludes government agencies from its scope (as do all of the biometric state laws discussed in this section, with the exception of the recent Washington law), and whether the airport stakeholder can be viewed as acting as an agent of the government by collecting facial images and transmitting them to CBP (and not retaining them). The analysis becomes more complicated if the stakeholder also collects the data for a dual business purpose, such as facilitating the boarding process.

Texas

The Texas biometric privacy law, like that of Illinois, requires persons or entities that collect biometric data to inform individuals before capturing the biometric data and to obtain the individual's consent.211 Unlike the Illinois law, the Texas statute does not require a written release. The Texas law, however, like the Illinois law, does prohibit the sale of biometric information, and it similarly sets restrictions on the storage of such information.212 Lastly, although modeled after BIPA, it lacks any private cause of action, relying instead on the state's attorney general to enforce the law and, as appropriate, impose sanctions.213

Washington

Washington state's biometric privacy statute entered into effect in 2017 and regulates commercial uses of biometric data by requiring notice and consent and limiting the ability to sell the data.214 Washington's law also does not include photographs, video or audio recordings, or facial geometry as biometric identifiers.215 Unlike the Texas law, it does not require that consent to the collection of biometric data be in writing.
Further, in a significant departure from its Illinois and Texas counterparts, the Washington state law carves out an exemption to biometric data collection and storage: businesses may collect and store such information without providing notice and obtaining consent so long as the information is collected for "security purposes,"216 including collection, storage, and use of the information for purposes of preventing shoplifting, fraud, and theft.217 As in Texas, there is no private cause of action, and enforcement is delegated to the state attorney general.218

On March 12, 2020, the governor signed a new facial biometric law, backed by Microsoft, the purpose of which is to limit state and local government authorities' use of facial-recognition technology.219 Reportedly, Microsoft, Amazon, and Comcast met with state legislators to address concerns with earlier versions of the bill, and Microsoft issued a statement endorsing it.220 The act will go into effect in 2021.

California

The California Consumer Privacy Act (CCPA), which entered into effect on January 1, 2020,221 provides consumer and employee rights and regulates commercial uses of biometric data by including it in the definition of personal information.222 California's law applies on a somewhat more limited scale than the Illinois, Texas, and Washington laws: it targets any company that both (1) operates in California and (2) either makes at least $25 million in annual revenue, gathers data on more than 50,000 users, or makes more than half its money from user data.223 The law treats biometric information, including images of one's face, as personal information and provides rights to consumers to protect that information.
The CCPA provides consumers the right to know what a company is collecting and why, including information about the sale of their personal information; the right to opt out of the sale of their personal information to third parties; the right to have their data deleted, with some exceptions; and the right to equal service and pricing even if they exercise their privacy rights (Shank 2019). The law provides a limited private cause of action against businesses that fail to "implement and maintain reasonable security procedures and practices appropriate to the nature of the information."224 A unique provision of the CCPA requires consumers to give the company notice before initiating any legal action; this 30-day period allows the company time to cure the violation, and if the violation is cured, no action may be pursued.225

In sum, with respect to the privacy laws enacted in Texas, Washington, and California, there are variances with respect to:
• The definition of biometric data (e.g., application to the image/data collected, but not to the analysis of the data),226 and
• The scope of coverage (commercial purposes but not extending to data collected for security purposes, arguably excluding timekeeping uses).227

New York

New York amended its existing data-breach notification laws with its 2019 Stop Hacks and Improve Electronic Data Security (SHIELD) Act, which went into effect in March 2020. The SHIELD Act broadens the definition of private information to include biometric information and protects the private information of state residents by requiring businesses to implement and maintain information security protocols.228 Previously, New York had passed limited biometric legislation, §201-a, which applies specifically in the employment context and prohibits fingerprinting "as a condition of securing employment or of continuing employment."229 The SHIELD Act extends protections to any state resident and extends the obligation to provide notification of a data breach under New York's breach notification law beyond the previously limited class of affected persons and businesses that conducted business in New York. The law, as amended, does not expressly provide for a private right of action, but it authorizes the state's attorney general to pursue fines, raising the maximum fine for failing to notify those affected from $150,000 to $250,000 (The National Law Review 2020).

Nevada

Effective October 2019, Nevada enacted SB 220, which amended existing state law to require operators of websites and online services to post privacy notices on their websites; Nevada thereby became the first state to provide consumers with the ability to opt out of the sale of their personal information.230 Although similar to the CCPA, SB 220 is narrower in scope (Kohne et al. 2019).
Most recently, similar legislation has been introduced in several other states (e.g., Arizona, Florida, and Massachusetts).231 States have also enacted laws to protect the privacy of minors and students.232 It is anticipated that this trend will only continue as privacy concerns increase.

Municipalities

Recently, municipal bans on the use of facial-recognition technology have sprung up in cities across the country. Cities with such bans include San Francisco, Oakland, Boston, Somerville (MA), Cambridge (MA), Northampton (MA), Portland (ME), and Brookline (MA) (Hudgins 2019). Other cities are considering similar measures; California, New Hampshire, and Oregon have already passed laws that ban the use of facial-recognition technology in police body cameras; and New York and New Jersey appeared poised to do the same (Read 2020; Johnston 2019; Jarmanning 2020). As a general rule, municipal governments have cited concerns about potential abuses of facial-recognition technology, particularly with respect to surveillance activities; impacts on minorities' freedom of association; and imperfections in, and the invasiveness of, the technology, including bias.233

International Organization Activities

There are a number of international organizations working, coordinating, and collaborating with others to support initiatives to achieve a more seamless travel experience in the air environment. Significantly, most advocate for global standards in an effort to bring uniformity to international aviation.234

Beginning in the late 1960s, ICAO began work on developing a machine-readable travel document (MRTD) and in 1980 produced Doc 9303 (currently in its seventh edition: ICAO 2015), setting specifications for passports and other travel documents. Over the next couple of decades, ICAO adopted and incorporated facial-recognition technology as a core concept for MRTDs, leading to the promotion of its use for e-passports (ICAO 2007). ICAO initiated an ongoing project operating the ICAO Public Key Directory (PKD), a central repository for exchanging the information required to authenticate e-passports. Overall, the objective is to advocate for, and assist in, creating a globally interoperable system [section 3.2 of Doc 9303 (ICAO 2015)]. Specifically, ICAO is focused on the DTC, which could serve as an e-passport (i.e., extracted data) or be issued in parallel to, or in replacement of, a physical e-passport (Cole 2019). Recently, ICAO published guidelines pertaining to the DTC (ICAO 2020).

In an example of an alternate use of biometrics for an identity purpose, the UN High Commissioner for Refugees (UNHCR) in 2018 issued its Strategy on Digital Identity and Inclusion, a global effort to encourage and support countries in "giving everybody access to a legal and digital identity" (UNHCR 2018). UNHCR indicated that refugees and other displaced persons represent a "marginalized" sector of a population in greatest need of an identity to benefit, for example, from the receipt of basic assistance, protection, and relief services.
States in turn benefit from the registration of stateless persons, refugees, and displaced persons, gaining a clearer picture of those residing in their territory. "UNHCR assists member states in ensuring that refugees and asylum seekers, stateless persons, and other forcibly displaced are—digitally speaking—not left behind" (UNHCR 2018). In furtherance of this effort, UNHCR began rolling out the Population Registration and Identity Management EcoSystem (PRIMES), which, using biometrics, provides a platform for all UNHCR registration and identity management tools; a consolidated global database is in place. Many of the pilots testing PRIMES are being conducted in partnership with governments, academic institutions, international organizations, and private-sector companies to establish proofs of concept, use cases, and processes and to test IT infrastructure.

In the 1990s, IATA and ACI launched a joint initiative, "Smart Security," focused on improving the passenger flow process, particularly through security checkpoints. Smart Security sought to address anticipated growth in passenger traffic by strengthening security, increasing operational efficiencies, and improving the passenger experience with a more seamless approach (ACI n.d.). Over the years, the initiative evolved to combine IATA's Checkpoint of the Future and ACI's parallel Better Security program into a roadmap for streamlining the passenger flow process, a key component of which is the integration of innovative biometric technology. As of 2018, more than 100 airports worldwide had implemented a variation of Smart Security (Smiths Detection 2018).

In December 2006, the Simplifying Passenger Travel Interest Group, of which ACI is a founding member and which is made up of airlines, airports, and others, published its recommendations for the ideal process flow (IPF) for the travel of passengers through an airport.
It proposed that many of the processes be automated using biometrics, incorporating ICAO standard biometrics for the passenger identification process in the IPF and at all airports (ACI 2006). Building on earlier efforts such as Smart Security and digital identity management, in 2018 IATA announced the OneID initiative, which proposed the use of biometric technology to establish a “secure, seamless, and efficient journey” (ICAO 2019) to facilitate the passenger

experience through security touchpoints by "sharing a single set of passenger identity information among authorized stakeholders in accordance with data privacy rules" (ICAO 2019). The initiative includes the concept of a DTC, the global policy and technical specifications of which are being shaped and developed by ICAO. Generally, the concept has been described as a "virtual credential, derived from and linked to an issuing document, securely stored on a mobile device or in the cloud and accessed via biometric authentication, essentially enabling document-free travel" (Entrust Datacard 2019). Significantly, the initiative recognizes that a globally coordinated "horizontal" approach is needed to replace the disjointed, fragmented response to date (IATA 2018a). OneID expressly includes, within its desired end state, a plan to have OneID coincide with IATA's NEXTT vision, described next (IATA 2018b).

IATA and ACI are jointly pursuing the NEXTT initiative, the goal of which is to make the most of the latest technological advances to transform airports; improve the travel experience from check-in to boarding, including the cargo transport experience; and develop a "common vision to enhance the on-ground transport experience, guide industry investments, and help governments improve the regulatory framework" (Appleton 2019). In response to concerns about siloed sector developments, IATA and ACI broadened the focus of the initiative to "business and cultural change as well as how airports and airlines can develop their operations to achieve additional efficiencies and ultimately increase the capacity" (Appleton 2019). The initiative highlights the benefits of "off-airport" processes, use of city centers, and interconnected systems, and it includes incorporation of biometric technology without necessarily advocating any single option or solution.
Per IATA, "[t]he concept involves the use of a trusted digital identity, biometric recognition technology, and a collaborative identity management platform" (IATA 2020).

Similarly, the WEF has promoted the KTDI, and in March 2020 it published a white paper offering guidance and recommendations on specifications and a facial-recognition framework for industry (World Economic Forum 2020b). To promote "responsible use" of facial-recognition resources, WEF developed, with a diverse group of stakeholders, a set of principles (and best practices) designed to inform decision makers and product teams on issues pertaining to bias mitigation, proportional use of the technology, privacy, accountability, consent, right to accessibility, children's rights, and alternative options, to name a few (World Economic Forum 2020c). Specifically, the Netherlands and Canada are conducting an ongoing pilot to test KTDI aspects in a "real-life, cross-border context" (World Economic Forum 2020c) and to further inform future pilots and implementation. The white paper is intended to document the standards, open specifications, capabilities, functionalities, and industry best practices captured by the initial pilot and "provide guiding principles for the KTDI concept and any related future pilots toward the end-state vision of global interoperability" (World Economic Forum 2020c).

This synopsis illustrates how various initiatives have evolved. More importantly, the organizations pursuing initiatives to design biometric identity and authentication and air transport process improvements have clearly made substantial efforts to produce coordinated pilots and strategies (e.g., NEXTT) integrating and incorporating the efforts of multiple stakeholders, and that trend is expected to continue.

Endnotes

113. See U.S. Dep't of Justice v. Reporters Comm. for Freedom of the Press, 489 U.S. 749, 763 & n. 15 (1989) (recognizing the common law's protection of a privacy right).
114.
Restatement (Second) of Torts, § 652A (Am. Law. Inst. 1977).
115. Rivera v. Google, Inc., 366 F. Supp. 3d 998 (N.D. Ill. 2018); Jacobson v. CBS Broad., Inc., 19 N.E.3d 1165, 1180 (Ill. App. Ct. 2014).

116. Griswold v. Connecticut, 381 U.S. 479, 483-485 (1965).
117. Schmerber v. California, 384 U.S. 757, 764 (1966).
118. Hiibel v. Sixth Judicial Dist. Court, 542 U.S. 177, 191 (2004).
119. Compare, e.g., United States v. Wright, 431 F. Supp. 3d 1175 (Nev. D. Ct. 2020) (forcing the defendant to unlock his phone with his face violated the Fifth Amendment) with State v. Andrews, 2020 N.J. LEXIS 898 (Aug. 10, 2020) (court held compelled disclosure of a cell phone passcode did not incriminate the defendant as the passcode was not substantive information or a clue to the crime).
120. Kyllo v. United States, 533 U.S. 27, 34 (2001).
121. Id., at 40.
122. United States v. Jones, 565 U.S. 400 (2012).
123. Carpenter v. United States, 138 S. Ct. 2206 (2018); see also Riley v. California, 573 U.S. 373 (2014) (police officers could not search defendants' cell phones without a warrant under the Fourth Amendment exception for a search incident to an arrest).
124. United States v. Walther, 652 F.2d 788, 793 (9th Cir. 1981).
125. People v. Owens, 184 Cal. Rptr. 509 (Cal. Ct. App. 1982).
126. Skinner v. Ry. Labor Executives' Ass'n, 489 U.S. 602, 615 (1989).
127. Id., at 616.
128. Calderon v. Clearview AI, Inc., 20 civ. 1296 (CM), 20 civ. 2222 (CM), 20 civ. 3053 (CM), 20 Civ. 3104 (CM), 20 Civ. 3481 (CM), 20 Civ. 3705 (CM), 2020 WL 2792979, at *1 (S.D.N.Y. May 29, 2020); Class Action Complaint, Mutnick v. Clearview AI, Inc., No. 1:20-cv-00512, 2020 WL 378474 (N.D. Ill. Jan. 22, 2020).
129. Id.
130. In a review of case law to determine any suits filed against the FBI, CBP, or TSA regarding the use of facial recognition, the only cases located to date concerned actions filed under the Freedom of Information Act for agency records pertaining to CBP's use of facial recognition technology. See, e.g., ACLU v. DHS, Case 1:20-cv-02213 (S.D.N.Y. March 12, 2020); EPIC v. CBP, No. 19-cv-689 (D.D.C. March 12, 2019) (settled on April 24, 2020).
131. 34 U.S.C.
§§ 12592-93 (2017); id. §§ 40701-44 (2019).
132. Maryland v. King, 569 U.S. 435, 444-45 (2013). As Judge O'Scannlain has helpfully explained: CODIS can be used in two different ways. First, law enforcement can match one forensic crime scene sample to another forensic crime scene sample, thereby allowing officers to connect unsolved crimes through a common perpetrator. Second, and of perhaps greater significance, CODIS enables officials to match evidence obtained at the scene of a crime to a particular offender's profile. In this latter capacity, CODIS serves as a potent tool for monitoring the criminal activity of known offenders. United States v. Kincade, 379 F.3d 813, 819-20 (9th Cir. 2004).
133. Id. at 818 n.4.
134. King, 569 U.S. at 445.
135. Id.
136. King, 569 U.S. at 448. "Seemingly" because, as Justice Scalia himself notes, "the opinion does not really contain what you would call a rule of decision," Justice Kennedy at various points relies on the search incident to arrest exception and the special needs exception to justify warrantless DNA identifications.
137. Id. at 465.
138. Id. at 461–64. Interestingly, Justice Kennedy also noted that "[t]he special needs cases, though in full accord with the result reached here, do not have a direct bearing on the issues presented in this case, because unlike the search of a citizen who has not been suspected of a wrong, a detainee has a reduced expectation of privacy." Id. at 463.
139. Id. at 465–66.
140. Id. at 466.
141. Id. at 474.
142. Id. at 474–75.
143. Id. at 468.
144. United States v. Mitchell, 652 F.3d 387, 411 (3d Cir. 2011) (citing Hayes v. Florida, 470 U.S. 811, 813–18 (1985); Davis v. Mississippi, 394 U.S. 721, 727 (1969)), cert. denied, 565 U.S. 1275 (2012); see also Green v. Berge, 354 F.3d 675, 680 (7th Cir. 2004) (Easterbrook, J., concurring) ("What is 'reasonable' under the fourth amendment for a person on conditional release, or a felon, may be unreasonable for the general population.
Just as parolees’ homes may be searched without a warrant or probable cause, while both are required to search a free person’s home, so it may be that collection of DNA samples from the general population would require person-specific cause—or at least a ‘special need’. . . .”). 145. U.S. Government Accountability Office 2015, at 28 [referencing the Video Voyeurism Prevention Act of 2004, codified at 18 U.S.C. § 1801)], but also would include the Children’s Online Privacy Protection Act in that list (15 U.S.C. §§ 6501-6506). 146. Id.

147. 5 U.S.C. § 552a (2018).
148. Id.
149. Examples of systems or programs specifically related to border enforcement and aviation operations for which federal agencies collect biometric data include IDENT, the DHS Automated Biometric Identification System (formerly US-VISIT), E-Verify, TWIC, the CBP Airport Security Program, TVS, Global Entry/Pre✓, and Passport.
150. 5 U.S.C. § 552a(e)(3).
151. Individuals have the right to access records containing information about themselves, to amend incorrect information, and to sue the agency for violations of the statute. 5 U.S.C. § 552a(d) and (g).
152. E-Government Act of 2002 (Pub. L. 107-347), codified at 44 U.S.C. § 3601 et seq.
153. 5 U.S.C. § 552a(j) and (k).
154. Id. § 552a(e).
155. Id. § 552a(g) and (i).
156. 49 U.S.C. § 44903 (2018).
157. 49 CFR 1540.5; see also 19 CFR 122.181-188 (CBP Airport Security Program for unescorted access to CBP areas).
158. 49 U.S.C. § 44703 (2018).
159. 8 U.S.C. §§ 1181, 1185, and 1221 and 19 U.S.C. § 1433 (IDENT, US-VISIT, Traveler Verification Service); Intelligence Reform and Terrorism Prevention Act of 2004 (Public Law 108–458; 118 Stat. 3638); Implementing Recommendations of the 9/11 Commission Act of 2007 (Public Law 110–53; 121 Stat. 266); see also Pope, Carra, Biometric Data Collection in an Unprotected World: Exploring the Need for Federal Legislation Protecting Biometric Data, 26 J.L. & Pol'y 769 (2018) (comprehensive discussion of DHS agencies' use of biometric data).
160. Section 711 of the Implementing Recommendations of the 9/11 Commission Act of 2007, Pub. L. 110-53.
161. 8 U.S.C. § 1365b (2018), 49 U.S.C. §§ 114 note and 44919.
162. See, e.g., USCIS Fact Sheet (June 2020), https://www.uscis.gov/sites/default/files/USCIS/Refugee%2C%20Asylum%2C%20and%20Int%27l%20Ops/Refugee_Screening_and_Vetting_Fact_Sheet.pdf.
163. See 8 U.S.C.
§ 1365b (2018); Biometric Exit Frequently Asked Questions (FAQs), https://www.cbp.gov/travel/biometrics/biometric-exit-faqs (last modified May 15, 2020).
164. The DHS Office of Biometric Identity Management (OBIM) was created in March 2013, replacing the United States Visitor and Immigrant Status Indicator Technology (US-VISIT) Program; its purpose is to use biometrics to establish and verify the identity of foreign nationals who apply for visas and seek to enter the United States. https://www.dhs.gov/obim.
165. Biometric Exit Frequently Asked Questions (FAQs), https://www.cbp.gov/travel/biometrics/biometric-exit-faqs (last modified May 15, 2020); see also DHS Data Privacy and Integrity Advisory Committee (DPIAC), Report 2019-01, Privacy Recommendations in Connection with the Use of Facial Recognition Technology (Feb. 2019).
166. GAO-20-568 at 14, Facial Recognition Technology (Sept. 2020).
167. Supra note 165, at 37.
168. Id.
169. Id.; see also Transportation Security Administration and U.S. Customs and Border Protection: Deployment of Biometric Technologies, Cong. Rept. (Aug. 2019), https://www.tsa.gov/sites/default/files/biometricsreport.pdf.
170. Supra note 165.
171. See 8 CFR 235.1(f) (defining "in-scope travelers" as "any person who may be required by law to provide biometrics upon entry into the United States pursuant to 8 CFR 235.1(f)(ii), or upon exit from the United States pursuant to 8 CFR 215.8. "In-scope" travelers include any alien other than those specifically exempt as outlined in the CFR.
Exempt aliens include: Canadian citizens under Section 101(a)(15)(B) of the Immigration and Nationality Act who are not otherwise required to present a visa or be issued a Form I-94 or Form I-95; aliens younger than 14 or older than 79 on the date of admission; aliens admitted on A-1, A-2, C-3 (except for attendants, servants, or personal employees of accredited officials), G-1, G-2, G-3, G-4, NATO-1, NATO-2, NATO-3, NATO-4, NATO-5, or NATO-6 visas, and certain Taiwan officials who hold E-1 visas and members of their immediate families who hold E-1 visas unless the Secretary of State and the Secretary of Homeland Security jointly determine that a class of such aliens should be subject to the requirements of paragraph (d)(1)(ii); classes of aliens to whom the Secretary of Homeland Security and the Secretary of State jointly determine it shall not apply; or an individual alien to whom the Secretary of Homeland Security, the Secretary of State, or the Director of Central Intelligence determines it shall not apply." at 4 (fn. 9).
172. Privacy Impact Assessment for the Traveler Verification Service, DHS/CBP/PIA-056, at 21, November 14, 2018, https://www.dhs.gov/sites/default/files/publications/privacy-pia-cbp030-tvs-november2018_2.pdf;

Report 2019-XX of the DHS Data Privacy and Integrity Advisory Committee (DPIAC): Privacy Recommendations in Connection with the Use of Facial Recognition Technology, as approved in Public Session, https://www.dhs.gov/sites/default/files/publications/DPIAC%20DRAFT%20Biometrics%20Recommendation%20Report%20v4_02.06.2018.pdf.
173. Supra note 165.
174. Supra note 172.
175. Id.
176. See, e.g., ACLU v. DHS, Case 1:20-cv-02213 (S.D.N.Y. Mar. 12, 2020); EPIC v. CBP, No. 19-cv-689 (D.D.C. Mar. 12, 2019) (settled on April 24, 2020).
177. See, e.g., United States v. Martinez, No. 04 CR 543, 2019 LEXIS 192708 (N.D. Ill. Nov. 6, 2019).
178. 15 U.S.C. §§ 41-58, as amended.
179. Pub. L. No. 63-203, § 5 [codified at 15 U.S.C. § 45(a)].
180. Supra note 178, § 45(a)(4).
181. Id. § 45(n).
182. For a summary of Facebook's history with the FTC, see Wilson 2019.
183. 42 U.S.C. §§ 264-65. See also 42 U.S.C. § 268(b) (requiring Customs and Coast Guard officers to aid in the enforcement of quarantine rules and regulations).
184. See 42 U.S.C. § 268(b) (2018).
185. 42 CFR § 71.32(a).
186. See CDC 2020.
187. 49 U.S.C. § 114(m) and § 106(l)(4).
188. 49 U.S.C. § 44905(b).
189. 14 CFR 382.21 and 382.19(c)(1).
190. The World Health Organization, of which the U.S. is a member, has issued binding regulations, the International Health Regulations (IHR), which require, among many things, that members detect, monitor, and respond effectively to disease outbreaks that could spread internationally. World Health Assembly Res. 58.3, arts. 6-7, 9, 13 (May 23, 2005), available at https://www.who.int/ihr/publications/9789241580496/en/.
191. Health Insurance Portability and Accountability Act (HIPAA), Pub. L. 104-191 (1996), particularly 42 U.S.C. § 1320d–6 (penalties for wrongful disclosure); see also Determann 2020.
192. See Neuberger 2019; see www.congress.gov (last visited July 24, 2020). See, e.g., Privacy Bill of Rights, S. 1214, 116th Cong.
(2019); Consumer Data Privacy and Security Act, S. 3456, 116th Cong. (2020); American Data Dissemination Act, S. 142, 116th Cong. (2019); Data Care Act of 2019, S. 2961, 116th Cong. (2019); Digital Accountability and Transparency to Advance Privacy Act, S. 583, 116th Cong. (2019); H.R. 3900, 116th Cong. (2019); Privacy Score Act of 2020, H.R. 6227, 116th Cong. (2020); Office of Biometric Identity Management Authorization Act of 2019, H.R. 1729, 116th Cong. (2019).
193. Facial Recognition Technology (Part I): Its Impact on Our Civil Rights and Liberties (May 22, 2019); Facial Recognition Technology (Part II): Ensuring Transparency in Government Use (June 4, 2019); and Facial Recognition Technology (Part III): Ensuring Commercial Transparency & Accuracy (Jan. 15, 2020). https://www.youtube.com/watch?time_continue=7&v=2dpazLVUo_w&feature=emb_title.
194. Id.
195. Rosenbach v. Six Flags Entm't Corp., 129 N.E.3d 1197 (Ill. 2019).
196. Id. at 6-9.
197. Patel v. Facebook, Inc., 932 F.3d 1264 (9th Cir. 2019), cert. denied, 140 S. Ct. 937 (2020).
198. Rivera v. Google, Inc., 238 F. Supp. 3d 1088, 1091 (N.D. Ill. 2017).
199. Id., at 1095.
200. Rivera v. Google, 366 F. Supp. 3d 998, 1014 (N.D. Ill. 2018).
201. Id.
202. Id., at 1276; see also Jackson 2020.
203. Patel, 932 F.3d at 25. But see Rivera v. Google, 366 F. Supp. 3d 998 (N.D. Ill. 2018) (a showing of a concrete injury is required to establish standing for a violation of BIPA).
204. BIPA May Apply to Clearview AI's Creation of Biometric Data, Law360 Expert Analysis (February 18, 2020).
205. 740 ILCS 10 ("Biometric identifiers do not include . . . photographs . . ."); see, e.g., Mutnick v. Clearview AI, Inc. et al., No. 1:20-cv-00512 (N.D. Ill. January 22, 2020); Vance v. IBM Corp., No. 1:20-cv-00577 (N.D. Ill. January 24, 2020).
206. Miller v. Southwest Airlines Co., 926 F.3d 898 (7th Cir. 2019).
207. No. 19-cv-2149, 2020 WL 2404878, at *1 (N.D. Ill. May 12, 2020).
208. The Railway Labor Act, 45 U.S.C. § 151 et seq.
209.
Supra notes 206 and 208; Crooms, 2020 WL 2404878.
210. See Bryant v. Compass Grp. USA, Inc., 958 F.3d 617 (7th Cir. May 5, 2020), for a discussion of standing requirements and decisions (standing found in a case where the plaintiff alleged a violation of BIPA for the failure to comply with statutory requirements associated with an account to biometrically access a vending machine).
211. Tex. Bus. & Com. Code Ann. § 503.
212. Id.
213. Id.
214. Wash. Rev. Code § 19.375 (excludes physical or digital photographs, as well as video or audio recordings, from the definition of “biometric identifier”); Wash. H.B. 1493 (2017).
215. Wash. H.B. 1493 (2017).
216. Id. § 3(4).
217. Id. § 4.
218. Id. § 5.
219. Wash. S.B. 6280 (2020).
220. See, e.g., Crawford 2020 and Hofer 2020.
221. Cal. Civ. Code § 1798.100 et seq.
222. Id.; for a detailed summary, see Zaller 2020.
223. Cal. Civ. Code § 1798.140(c).
224. Id. at § 1798.150: (a)(1) Any consumer whose nonencrypted and nonredacted personal information, as defined in subparagraph (A) of paragraph (1) of subdivision (d) of Section 1798.81.5, is subject to an unauthorized access and exfiltration, theft, or disclosure as a result of the business’s violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information may institute a civil action for any of the following: (A) To recover damages in an amount not less than one hundred dollars ($100) and not greater than seven hundred and fifty dollars ($750) per consumer per incident or actual damages, whichever is greater. (B) Injunctive or declaratory relief. (C) Any other relief the court deems proper. (2) In assessing the amount of statutory damages, the court shall consider any one or more of the relevant circumstances presented by any of the parties to the case, including, but not limited to, the nature and seriousness of the misconduct, the number of violations, the persistence of the misconduct, the length of time over which the misconduct occurred, the willfulness of the defendant’s misconduct, and the defendant’s assets, liabilities, and net worth. See also the Attorney General’s website at California Consumer Privacy Act (CCPA), https://oag.ca.gov/privacy/ccpa (last visited Aug. 4, 2020).
225. CCPA § 1798.150(a)-(b).
226. Tex. Bus. & Com. Code Ann. § 503; Wash. Rev. Code § 19.375 (excludes physical or digital photographs, as well as video or audio recordings, from the definition of “biometric identifier”); Cal. Civ. Code § 1798.100 et seq.
227. Id.; Wash. Rev. Code § 19.375.
228. S.5575A, 2019-2020 Leg. Sess. (N.Y. 2019), available at https://legislation.nysenate.gov/pdf/bills/2019/S5575A.
229. N.Y. Lab. Law § 201-a (McKinney 2019).
230. S.B. 220, 2019 Leg., 80th Sess. (Nev. 2019), available at https://www.leg.state.nv.us/App/NELIS/REL/80th2019/Bill/6365/Text.
231. H.R. 2478, 50th Leg., Second Reg. Sess. (Ariz. 2012); H.R. 1153, 2019 Sess. (Fla. 2019); S. 120, 191st General Court (Mass. 2019).
232. See, e.g., National Conference of State Legislatures 2020.
233. See, e.g., Gandhi 2020 and Ringrose 2019.
234. See, e.g., ICAO n.d.-a.


Biometrics is one of the most powerful, yet misunderstood, technologies used at airports today. The ability to speed up individual processes, as well as to offer a touch-free experience throughout an entire journey, is a revolution decades in the making.

The TRB Airport Cooperative Research Program's ACRP Research Report 233: Airport Biometrics: A Primer is designed to help aviation stakeholders, especially airport operators, understand the range of issues and choices available when considering, and deciding on, a scalable and effective set of solutions using biometrics. These solutions may serve as a platform to accommodate growth as well as to address the near-term focus on safe operations during the COVID-19 pandemic.
