2 Theoretical Foundations from Ethical and Social Science Frameworks
Pages 14-28

From page 14...
... They provide a basis for determining ways to adapt the processes of design, development, deployment, evaluation, and monitoring of computing research for responsible computing, and thus help guide responsible downstream use of computing research in building the many products that are reshaping daily life. In particular, scholars with expertise in these areas can assist computing researchers in designing research projects that adequately meet societal constraints, norms, and needs.
From page 15...
... This chapter focuses instead on presenting fundamental ethical concepts, the very concepts from which such principles arise, but, more importantly, concepts that support the practical reasoning responsible computing research requires. Ethics provides tools for the moral evaluation of behaviors, institutions, and social structures. This section focuses on evaluation of behaviors, and Section 2.2 examines the roles of institutions and social structures.
From page 16...
... So, this report adopts a pragmatic distinction to facilitate ethical analysis of responsible computing research: the distinction between intrinsic and instrumental values.6 Intrinsic values are things that matter in themselves. Instrumental values are things that matter because they help us to realize intrinsic values.
From page 17...
... shared intrinsic values and illustrating how they are served by instrumental values specific to computing, this report offers computing researchers concepts with which to structure and understand both their own moral intuitions and the inevitable moral disagreements they will confront when assessing the sources of ethical and societal challenges discussed in Chapter 3 and the recommendations in Chapter 4.

2.1.2 Intrinsic Ethical Values

Autonomy and freedom -- Individuals have beliefs, plans, and goals, and autonomy is the ability to act on those beliefs by formulating plans to achieve goals.
From page 18...
... 2.1.3 Instrumental Ethical Values

Instrumental values are ethically important because they contribute to the realization of (or capability to realize) intrinsic values.
From page 19...
...  Explainability, interpretability, and intelligibility -- These concepts are grouped together as they have all been proposed as ways to promote understanding of increasingly complex computational ...
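The report does not tie these concepts to any particular technique. As one illustrative example only (an assumption for exposition, not a method the report prescribes), permutation feature importance estimates how much a trained model relies on each input feature by measuring how its score drops when that feature's values are shuffled:

    # Illustrative sketch of permutation feature importance, one common
    # explainability technique (assumed example; not drawn from the report).
    import numpy as np

    def permutation_importance(model, X, y, score_fn, seed=0):
        """Return, per feature, the drop in score when that feature is shuffled.

        `model` is assumed to expose a scikit-learn-style predict(X) method;
        `score_fn(y_true, y_pred)` returns a scalar score (higher is better).
        """
        rng = np.random.default_rng(seed)
        baseline = score_fn(y, model.predict(X))
        drops = []
        for j in range(X.shape[1]):
            X_perm = X.copy()
            # Shuffling column j destroys its information while keeping its distribution.
            X_perm[:, j] = rng.permutation(X_perm[:, j])
            drops.append(baseline - score_fn(y, model.predict(X_perm)))
        return np.array(drops)  # larger drop => the model leaned more on that feature

Such a score is only one narrow notion of "explanation"; which of explainability, interpretability, or intelligibility a given technique actually serves is exactly the kind of question the chapter's concepts are meant to help researchers reason about.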
From page 20...
... , and this applied concept is rooted in a deeper ethical concept, according to which the intrinsic values described above should be enjoyed securely -- that is, without worry that significant threats or harms might arise. Protection against novel or additional threats makes actual harm less likely, and so enables people to better realize their intrinsic values.
From page 22...
... Another insight from social science scholarship is that the effects of computing research results and the products it enables are influenced by social phenomena at multiple scales.14 Macroscale social phenomena include national laws, economic conditions, and shared political ideologies; mesoscale social phenomena include organizational cultures and institutional rules; and microscale social phenomena include interpersonal relationships and shared identities. These different scales of social phenomena interact, particularly because they might embody different core values.15 For example, micro-level interactions (such as the treatments a patient is offered by a clinician)
From page 23...
... Box-bounding makes it possible for human annotators to mark the borders between the features of an image to help computer vision methods develop the ability to distinguish among different objects, including faces. The second was the use of human work to label image recognition training data, harnessing human effort to assist machine learning systems.
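As a minimal illustration of what such human-labeled training data might look like (the record layout below is an assumption for exposition, not a format described in the report), each annotated image pairs an image reference with one or more labeled bounding boxes drawn by an annotator:

    # Illustrative sketch only: one plausible way to record a human annotator's
    # bounding-box labels for image-recognition training data.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class BoundingBox:
        label: str   # object class chosen by the annotator, e.g. "face"
        x_min: int   # pixel coordinates of the box corners
        y_min: int
        x_max: int
        y_max: int

    @dataclass
    class LabeledImage:
        image_path: str
        boxes: List[BoundingBox]

    # One annotated training example: the annotator marks where an object's
    # borders lie and which class it belongs to.
    example = LabeledImage(
        image_path="images/0001.jpg",  # hypothetical path
        boxes=[BoundingBox("face", 34, 50, 120, 160)],
    )

Each such record embodies human judgments (what counts as a "face", where its borders lie), which is why the chapter treats this labor as part of the sociotechnical system rather than a neutral input.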
From page 24...
... Perhaps the largest, most nebulous stakeholder is society itself, as it absorbs the many and varied applications of face recognition and computer vision technologies and the ways they are deployed by macro- and meso-scale organizations.18 The power to influence outcomes is not uniform among these stakeholders, and so they are not equally able to advance or advocate for their values.19 For instance, the political, cultural, and economic environments in which facial recognition development was embedded shaped decisions about what and whom to fund; in lab settings, Ph.D. students and professors have different degrees of power to set agendas or challenge the status quo; and marginalized groups who could speak to whether particular projects using image recognition are appropriate -- using faces to identify sexuality or gender, for example -- are likely neither present nor empowered to question such research and convince researchers to drop it.
From page 25...
... For example, the social phenomenon of gender identity -- people's everyday experience of gender -- is constantly evolving in interaction with the macro level of legal requirements, the meso level of cultural norms, and the micro level of self-esteem. Yet many applications of facial recognition technologies rely on classifying gender as a stable, binary category.
From page 26...
... For example, Illinois's passage of the Biometric Information Privacy Act in October 2008 added new constraints on private enterprises' collection, use, and sharing of biometric data from people in the state without their consent. A class action lawsuit filed in 2015, representing more than 1 million Facebook users in Illinois, won a $650 million settlement over the company's practice of tagging people in photos using facial recognition without users' consent; regulators in other jurisdictions have not agreed to restrict the private collection of biometric data, so people elsewhere have no such recourse.
From page 27...
... The sociotechnical perspective described in Section 2.2 and the ethical analyses of values and trade-offs described in Section 2.1, combined with methods of ethical reasoning and such social science methods as ethnographic observation, in-depth interviews, survey studies, and historical analysis, can support computing researchers in identifying and resolving the ethical and societal impact challenges that arise from introducing novel technologies into social worlds. Chapter 3 illustrates their use in identifying the underlying roots of such challenges.

