

Session 4: Data Use and Sharing and Technological Advancements
Pages 49-60

The Chapter Skim interface presents the passage algorithmically identified as the most significant on each page of the chapter.


From page 49...
... This balance determines the amount of protection that is accorded to the human subjects who provide the data.

ARCHIVING AND SHARING CONFIDENTIAL DATA IN THE SOCIAL SCIENCES

George Alter discussed issues that arise in the storing and sharing of confidential data, citing his experience as director of the Inter-university Consortium for Political and Social Research (ICPSR).
From page 50...
... Finally, many datasets have multiple levels -- data on student, teacher, school, and school district, for example, or on patient, clinic, and community -- which can make it possible to identify individuals by working down from the higher levels.

Protecting Confidential Data

With respect to protecting confidential data, Alter said, it is useful to think in terms of a framework that protects confidentiality through four complementary approaches: safe data, safe places, safe people, and safe outputs.
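The "working down from the higher levels" risk Alter describes can be illustrated with a toy sketch. All records, attribute names, and values below are hypothetical and are not drawn from the workshop; the point is only that each additional level of context an outsider knows shrinks the pool of matching records until a single individual remains.

```python
# Hypothetical multi-level education records (district > school > teacher).
# None of these identifiers or values come from the workshop report.
records = [
    {"district": "D1", "school": "S1", "teacher": "T1", "grade": 5},
    {"district": "D1", "school": "S1", "teacher": "T2", "grade": 5},
    {"district": "D1", "school": "S2", "teacher": "T3", "grade": 5},
    {"district": "D2", "school": "S3", "teacher": "T4", "grade": 5},
]

def narrow(pool, **known):
    """Keep only records consistent with attributes an outsider already knows."""
    return [r for r in pool if all(r[k] == v for k, v in known.items())]

# Each known higher-level attribute narrows the candidate pool:
pool = narrow(records, district="D1")   # 3 candidates remain
pool = narrow(pool, school="S1")        # 2 candidates remain
pool = narrow(pool, teacher="T1")       # 1 candidate: effectively re-identified
print(len(pool))  # 1
```

This is why, as the discussion notes, releasing data at multiple nested levels can defeat de-identification applied to any single level alone.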
From page 51...
... ICPSR requires researchers to provide a research plan, IRB approval, a data protection plan, rules of behavior, and an institutional signature. In the institutional agreement, the member institution must agree that if the consortium alleges a violation or breach of the agreement by the researcher, the institution will treat that as research misconduct and pursue that individual under its own research misconduct policies.
From page 52...
... For example, he explained, in the case of a national survey such as an opinion poll, which has very few identifying questions, "we certify the data as having very low risk of re-identification or harm" and provide it under a simple terms-of-use agreement, in which the user agrees not to re-identify anyone in the sample. For more complex data that have greater risks, ICPSR imposes a stronger user agreement and such technology as the virtual data enclave.
From page 53...
... Finally, there is the question of who is responsible for paying the costs of sharing confidential data. Often the institutions that pay for the data are willing to assume the cost of distribution, Alter said, "but I think for many things we are going to be moving to a situation where the data user, because using confidential data has special costs associated with it, is going to have to pay user fees for access to confidential data."

DATA-BASED DECISION MAKING FOR EDUCATION

Taylor Martin, of Utah State University, described her research in mathematics education and discussed what the proposed changes to the Common Rule could mean for education research.
From page 54...
... The "big vision" Martin described calls for using personalized learning, connected learning, and anytime/anywhere learning to interest kids in STEM topics and teach them about those topics, giving all children the opportunity to learn about math and science. Achieving that vision will be helped along by the growing presence of "big data." Martin characterized the present state of affairs as a "biggish data" world rather than a big data world, but believes that a world characterized not only by tremendously large amounts of data but also by rich data streams that provide a great ...

1 See, for example, the Scratch program, available at: http://scratch.mit.edu/about [June 2013]
From page 55...
... She shares with other speakers the goal of having more readable and understandable consent forms and agrees that continuing review should not be "one size fits all." She believes IRB forms should be simplified, and that multisite studies should have a single IRB. Focusing on the issue of information risk and educational data, Martin noted that she has been running education studies for 25 years and has kept her data stored in a locked filing cabinet.
From page 56...
... Harmonizing the Concept of Individually Identifiable Information

One of the key proposed revisions related to data protection in the ANPRM is that the Common Rule should adopt HIPAA standards regarding what constitutes individually identifiable information, a limited dataset, and de-identified information. Adopting the HIPAA definition of individually identifiable information would not be a major change, Bouregy said, because the current Common Rule definition is very similar.
From page 57...
... First, in her view, IRBs are not necessarily the best place for determining appropriate data security plans, but the proposed rule would require IRBs to become even more involved in data security plans than they are now. Under the proposed rule, even excused research would be subject to these data security standards, she added.
From page 58...
... In her view, the best approach would be to provide them with guidance concerning the appropriate data security plan for low-, medium-, and high-risk data.

Incorporating the HIPAA Breach Notification Requirement

Bouregy also discussed using HIPAA security and breach notification standards as the model for data protection schemes.
From page 59...
... The most relevant difference for social and behavioral researchers, however, may be that under the HIPAA approach the IRB would not have the ability to consider the context of a breach, which will influence both its significance and the value of providing notice of the breach. For example, if a researcher conducting a study in another country lost the data after returning to the United States, the risk to the subjects would likely be quite low.


This material may be derived from machine-read page images, and so is provided only to facilitate research.