
Appendix D: New Data, New Research Tools, New Ethical Questions
Pages 365-374



From page 365...
... and public officials are excited about the possibilities that big data and computational social science hold for enhanced analysis of open-source intelligence in particular, including intelligence gleaned from social media, sensor data, and other digital information produced by routine human actions and behaviors (Harman, 2015)
From page 366...
... NEW ETHICAL CHALLENGES
A primary issue for researchers working with large datasets is that such data raise new ethical questions that have not yet been systematically addressed. While SBS researchers have long been accustomed to addressing research ethics via the Common Rule, computational social science research transcends traditional human subjects protections and raises a number of new questions.
From page 367...
... PRIVACY
Traditional norms of privacy relevant to research ethics are oriented toward protecting individuals by ensuring that neither personally identifying information nor sensitive personal information will be exposed. Digital researchers typically deidentify the individuals represented in digital datasets, but even this approach does not guarantee that the subjects of digital research will remain fully anonymous, as a National Research Council (2008)
From page 368...
... But many Internet researchers believe that informed consent is unrealistic in the online domain for at least two reasons. First, it is not possible to obtain consent from the millions of web users whose digital traces are being studied. Second, it is a matter of some dispute whether the individuals whose data are studied deserve the same protections as human subjects (Barocas and Nissenbaum, 2014; Buchanan, 2017; Dittrich and Kenneally, 2012)
From page 369...
... For example, defenders of the Facebook study mentioned above, in which the emotional content of users' feeds was altered, insisted that the study was no different from the product testing commonplace in industry, and that such research is essential to developing sound algorithms. Detractors insisted that such research violated user expectations of informational and emotional autonomy on Facebook (boyd, 2006; Hancock, 2017)
From page 370...
... Social network research often categorizes and makes judgments about individuals and groups based on their relationships. These associations, whether false or accurate, can have material effects on the lives and well-being of those individuals categorized, particularly when the categories carry social stigma or imply that categorized individuals pose security threats because of their social networks (Lyon, 2007)
From page 371...
... The potential for security community surveillance afforded by digital data compounds the power imbalances already present in digital spaces. Failure to address ethical concerns can have a chilling effect on research.
From page 372...
... , in which data mining was used to monitor potential security threats, have reignited old fears that the American national security community is unjustly monitoring domestic communications. Previously documented abuses, such as the domestic surveillance of civil rights groups revealed by the Church Committee in 1975, form the backdrop for such concerns (Electronic Frontier Foundation, 2004; National Academies of Sciences, Engineering, and Medicine, 2016; Walsh and Miller, 2016)

