5 Privacy and Confidentiality
Pages 80-91



From page 80...
... , referred to throughout the report as P&R.1 Federal regulations require that ethical guidelines2 apply when humans are used as research subjects and that the level of risk is proportionate to the potential benefit for these subjects. Furthermore, the level of review
1 The following laws, regulations, and policies govern the Human Research Protection Program: (1)
From page 81...
... and the Army Analytics Group in 2006 is regulated by the Army Human Research Protection Office; data researchers are required to apply for access and explain the analyses they plan to conduct. The data are deidentified by removal of 16 of the 18 fields specified in the Health Insurance Portability and Accountability Act (HIPAA)
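A minimal sketch of field-suppression deidentification in the spirit of the HIPAA Safe Harbor approach described above. This is not the PDE's actual procedure; the field names below are hypothetical stand-ins for a subset of the 18 Safe Harbor identifiers.

```python
# Illustrative only: drop a representative subset of HIPAA Safe Harbor
# identifier fields from a record. Field names are hypothetical.
HIPAA_IDENTIFIER_FIELDS = {
    "name", "street_address", "phone", "fax", "email",
    "ssn", "medical_record_number", "health_plan_id",
    "ip_address", "biometric_id", "photo",
}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with identifier fields removed."""
    return {k: v for k, v in record.items() if k not in HIPAA_IDENTIFIER_FIELDS}

record = {"name": "Jane Doe", "ssn": "123-45-6789", "diagnosis": "J45.909"}
print(deidentify(record))  # {'diagnosis': 'J45.909'}
```
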
From page 82...
... Implicit in the FIPPs, the HIPAA Safe Harbor provisions, and the deidentification procedures carried out by the PDE is a separation between PII -- personally identifiable information -- and other kinds of information.
From page 83...
... To the greatest extent possible, computations should be carried out on encrypted data.6 The techniques of homomorphic encryption and secure multiparty computation, which permit a data analyst to run computations on data without having direct access to the raw data themselves, even when these data are shared among multiple parties, provide the effect of having the data held by a single trusted and trustworthy data curator who will carry out computations as instructed and release the results. Nonetheless, these techniques do not ensure privacy, in that they do not address the question of what can be safely released.
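To make the idea of computing on data no single party can read more concrete, here is a toy sketch of one secure multiparty computation building block, additive secret sharing. It is not a production protocol and is not the specific technique the report prescribes; it only illustrates how an aggregate can be computed while no participant sees another's raw value.

```python
# Toy additive secret sharing: each data holder splits its value into random
# shares; servers sum the shares they receive; only the aggregate is revealed.
import secrets

PRIME = 2**61 - 1  # arithmetic modulo a large prime

def share(value: int, n_parties: int) -> list[int]:
    """Split `value` into n additive shares that sum to value mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % PRIME

values = [12, 7, 30]                        # three data holders' private values
all_shares = [share(v, 3) for v in values]  # each value split across 3 servers
server_sums = [sum(col) % PRIME for col in zip(*all_shares)]
print(reconstruct(server_sums))  # 49: the total, with no raw value exposed
```
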
From page 84...
... . This guarantee is very strong: It says that even if the analyst knows a complete data set of n individuals and is given all the data of an n + 1st individual x, the analyst cannot determine whether the data set actually in use is D or is D′ = D ∪ {x}, the union of D and x.7 Data sets can teach us that smoking causes cancer and thereby result in a rise in insurance premiums for smokers.8 But this sort of impact will occur independent of whether any individual x joins or refrains from joining the data set.
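A minimal sketch of this guarantee using the Laplace mechanism for a counting query. The epsilon value and datasets are illustrative; the point is only that the noisy output distribution changes very little whether or not individual x is included.

```python
# Laplace mechanism for a count: adding or removing one person changes the
# true count by at most 1, so noise with scale 1/epsilon masks x's presence.
import random

def noisy_count(dataset: list, epsilon: float) -> float:
    scale = 1.0 / epsilon  # sensitivity of a count is 1
    # Difference of two exponentials with rate 1/scale is Laplace(0, scale).
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return len(dataset) + noise

D = ["record"] * 100          # data set without individual x
D_prime = D + ["x's record"]  # neighboring data set D' = D ∪ {x}

print(noisy_count(D, epsilon=0.5))
print(noisy_count(D_prime, epsilon=0.5))  # statistically hard to distinguish
```
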
From page 85...
... Thus, it is possible to combine simple differentially private computational primitives in order to obtain privacy-preserving algorithms for complex computational tasks while minimizing cumulative privacy loss -- just as in traditional algorithm design, simple computational primitives are combined in clever and creative ways to carry out complex computations while minimizing certain resources of interest, such as time, space, generalization error, and so on. Differential privacy degrades accuracy (often necessarily)
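A simple sketch of the composition idea: under basic sequential composition, the epsilon of each differentially private query adds to the cumulative privacy loss. The PrivacyBudget class below is hypothetical bookkeeping, not a standard library; advanced composition theorems give tighter bounds than the simple addition shown here.

```python
# Basic sequential composition: each query spends part of a total privacy
# budget, and the per-query epsilons simply add up.
class PrivacyBudget:
    def __init__(self, total_epsilon: float):
        self.total = total_epsilon
        self.spent = 0.0

    def charge(self, epsilon: float) -> None:
        if self.spent + epsilon > self.total:
            raise RuntimeError("privacy budget exhausted")
        self.spent += epsilon

budget = PrivacyBudget(total_epsilon=1.0)
for query_epsilon in (0.3, 0.3, 0.3):
    budget.charge(query_epsilon)   # cumulative loss: 0.9 of 1.0
# A fourth 0.3-epsilon query would exceed the budget and be refused.
```
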
From page 86...
... To remain current, to evolve toward being comprehensive, to accommodate contexts that were not previously anticipated, and to take into account new developments in the scientific community's constantly evolving understanding of data privacy risks and countermeasures (which may lead to either additions or deletions from the Safe Harbor list), the Safe Harbor list should be maintained by a periodically convened task force including
From page 87...
... THE FEDERAL STATISTICAL SYSTEM'S LEGAL AND GOVERNANCE POLICY
The Privacy Act of 1974 and the use of IRBs may not be sufficient to control, and more importantly leverage, the vast amounts of administrative and survey data about the military population that are collected by the Department of Defense and each of the military Services. These data have huge potential for use in cross-sectional and longitudinal analyses and could provide new insights into the military population as well as have spillovers for understanding microcosms of populations in the civilian sector.
From page 88...
... in her memorandum to heads of executive agencies for providing administrative data for statistical purposes, with a focus on repurposing data to minimize reporting burden and to protect privacy. CIPSEA provides a uniform set of confidentiality protections for information collected by statistical agencies for statistical purposes, at the same time keeping in place the stringent privacy laws governing many agencies.
From page 89...
... Under CIPSEA, only designated agencies and units can appoint "agents," who may then access the confidential data. CIPSEA is broad in that survey, interview, administrative, or other data provided to a statistical agency are protected under this statute, just as are individual data reported directly to the statistical agency (usually through surveys)
From page 90...
... 2014. Memorandum for the Heads of Executive Departments and Agencies: Guidance for Providing and Using Administrative Data for Statistical Purposes.
From page 91...
... 2011. "Comments on Advance Notice of Proposed Rulemaking: Human Subjects Research Protections: Enhancing Protections for Research Subjects and Reducing Burden, Delay, and Ambiguity for Investigators, Docket ID number HHS-OPHS-2011-0005." http://privacytools.seas.harvard.edu/files/privacytools/files/commonruleanprm.pdf?

