Suggested Citation:"1 Introduction." National Academies of Sciences, Engineering, and Medicine. 2022. Fostering Responsible Computing Research: Foundations and Practices. Washington, DC: The National Academies Press. doi: 10.17226/26507.

1 Introduction

1.1 THE NATURE OF COMPUTING AND COMPUTING RESEARCH

The fruits of computing research—a term used in this report to include research in computer science and engineering, information science, and related fields—have been increasingly woven into our personal and work lives and the constructed world around us, making us all stakeholders in how computing is used. The innovations enabled by computing research have improved the lives of individuals, the work of institutions and organizations of various sorts, and the functioning of governments and communities. For example, automobiles have been made safer with the introduction of antilock brakes and pedestrian detection, better weather forecasts help save lives and enhance crop yields, and medical discovery has been accelerated by ever-greater raw computational power along with advances in data analytics and modeling. At the same time, as regularly reported in both the general and scientific press, some uses of computing technology have raised concerns about individual and societal harms. For instance, the use of predictive analytics in administering criminal justice risks perpetuating structural biases in society. New communication and entertainment platforms have afforded new avenues for spreading misinformation and disinformation. Concerns such as these and others have led to increasing calls for governments to act both in making wiser decisions about their own use of computing technology and in revising regulations or implementing new ones. Furthermore, computing research has an obligation to support human flourishing, thriving societies, and a healthy planet. Unlike research in the natural sciences, computing research creates and studies computing artifacts—computer hardware and software and associated data, models, and algorithms—that are all human-made things.
Computing research thus continually creates new possibilities for human action and is fundamentally a human-inspired (and largely human-constrained) endeavor. Although computing research encompasses engineering, it differs from other engineering endeavors because it is limited more by human imagination than by the physical constraints often found in other areas of engineering. Frederick Brooks made similar observations about the nature of software programming in the title essay of The Mythical Man-Month:

In many creative activities the medium of execution is intractable. Lumber splits; paints smear; electrical circuits ring. These physical limitations of the medium constrain the ideas that may be expressed, and they also create unexpected difficulties in the implementation. . . . Computer programming, however, creates with an exceedingly tractable medium. The programmer builds from pure thought-stuff: concepts and very flexible representations thereof.1

There are, of course, some fundamental physical limits, such as energy, heat dissipation, and integrated circuit feature size, and some limits on what can theoretically be computed in a reasonable amount of time. Moreover, there are important resource and environmental constraints on what one should build—see Section 3.1.6.

1 Brooks, F.P., Jr. 1975. The Mythical Man-Month. 25th Anniversary Edition, 2000. Addison-Wesley.

PREPUBLICATION COPY – SUBJECT TO FURTHER EDITORIAL CORRECTION

To be sure, the nature of the medium is not quite as unconstrained as the quote above suggests. Not everything can be quantified or represented formally, and a particular programming language, like any technology, makes it easier to do some things and harder to do other things. Nonetheless, algorithms are extremely flexible, and computing technologies are used in many disciplines, scientific and otherwise, and in every sector of society. Many algorithms, models, and pieces of software can be used for a wide variety of purposes, some beyond their original design intentions. The barrier to embedding software and hardware in the real world is low and continually decreasing. Anyone with basic programming skills can build and deploy an app. This general-purpose nature means that computing research can affect a wide diversity of applications, contexts, and societal domains. Computing researchers thus need to be especially thoughtful, creative, and diligent when considering potential societal and ethical implications of their work, and they need to be assiduous in describing the intended uses and limitations of that work.2

1.2 THE NATURE OF THE TECHNOLOGY INNOVATION ECOSYSTEM

To understand what is needed for responsible computing research, one needs to understand the vibrant technology ecosystem in which computing research takes place. This ecosystem comprises universities (with a majority of support from government research sponsors), both small and large computing firms, and government laboratories. Especially in the United States, the ecosystem features rich interplay among academic researchers, industry researchers, and the creators of products and services. This ecosystem benefits greatly from multidirectional flows of ideas, artifacts, technologies, and people. These multidirectional flows speed up the pace of deployment of computing research, fueling transformative results across every sector of the U.S. economy.3

The time required for basic research ideas to have commercial impact varies widely. Sometimes it takes years or even decades, as innovations compound and mature, technology components become less expensive, and market needs emerge. Sustained and patient research investment is often needed to realize the full potential of computing research. For example, machine learning research started more than five decades ago, saw its first significant commercial applications in the early 1990s, and dramatically accelerated in impact a decade ago, enabled by a combination of new algorithms, new sources of training data from an increasingly interconnected and digitized society, and advances in computing hardware.4 Other times, the rich connections in the technology innovation ecosystem and the availability of funding from venture capital or well-resourced firms make it possible for research ideas to be deployed quickly and on a large scale.5 One recent example is the use of machine learning to predict advertising click-through rates: the idea made its way from a published paper to deployment by Facebook in six months.6 Few if any fields rival computing for the speed with which research advances can be deployed to be used by millions of people.

2 See, for example, Urbina, F., Lentzos, F., Invernizzi, C., et al. 2022. "Dual use of artificial-intelligence-powered drug discovery." Nature Machine Intelligence 4: 189–191. https://doi.org/10.1038/s42256-022-00465-9.
3 NASEM (National Academies of Sciences, Engineering, and Medicine), 2020, Information Technology Innovation: Resurgence, Confluence, and Continuing Impact, National Academies Press, Washington, D.C., https://www.nap.edu/catalog/25961.
4 NASEM, 2020, Information Technology Innovation: Resurgence, Confluence, and Continuing Impact, National Academies Press, Washington, D.C., https://www.nap.edu/catalog/25961, 48–50.
5 NASEM, 2020, Information Technology Innovation: Resurgence, Confluence, and Continuing Impact, National Academies Press, Washington, D.C., https://www.nap.edu/catalog/25961, 48–50.
6 The original paper is https://www.microsoft.com/en-us/research/publication/web-scale-bayesian-click-through-rate-prediction-for-sponsored-search-advertising-in-microsofts-bing-search-engine/. For a history of how this work was implemented at Facebook, see https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/.

That being said, the transfer of an idea from a research lab to a consumer-facing product is a dynamic process that is often beset by unexpected challenges. Pre-deployment, algorithms from a paper may be tweaked, extended, or eliminated from a system to better fit the exigencies of real-world environments. Post-deployment, systems may be altered in response to relevant advances from the research community, or because unexpected dangers are uncovered. The net result is that the technology transfer of an idea from research to industry is not a single-shot event, but rather a feedback process that unfolds over time and has many inputs. This "continuous integration" means that the work of computing researchers can have significant impacts on deployed systems even long after the systems are launched, and that research continues to be linked to downstream outcomes. The computing industry has considerable experience in translating research into practice, and its practitioners are closer to where technology uses and misuses occur. Companies have built processes and developed in-house expertise for identifying potential issues with new technologies that they can bring to bear as new technologies are deployed or integrated into products. These experiences and practices can be instructive to the computing research community.

Testing of the systems produced by computing research occurs partly under controlled conditions and partly in uncontrolled environments with unknown users. The latter has become increasingly consequential as computing technology is deployed more widely (at a larger scale and to more diverse populations). Full testing of an approach or artifact in the context of a particular application inherently involves people using the system in that situation (e.g., nurses using a new electronic health records system in a hospital). Doing this testing can be costly (the difficulty of recruiting the "human subjects") and complicated (because the testing itself may raise ethical issues). These issues create challenges for responsible computing research.

The ways a new computing technology is ultimately used may also differ greatly from the original intent of its inventors. For example, Web cookies were introduced in the mid-1990s as a way of maintaining state without storing information on a server so that users did not have to keep reentering the same information. Within a short time, however, third-party cookies were introduced as a way of tracking user activity across websites, almost immediately raising privacy concerns about a technology that was originally thought to be privacy protecting. A related issue is that algorithms or actual software artifacts are frequently used in application areas other than those contemplated in the original research.

1.3 THE NATURE OF THE COMPUTING RESEARCH ECOSYSTEM

Many actors participate in the multi-step translation of research results into deployed algorithms, devices, and systems, including researchers, research sponsors, entrepreneurs, investors, and corporate leaders. The roles played by the various participants in the research enterprise vary, as do the range of interactions they have with others, their capabilities for influencing outcomes, and the incentive structures that influence their choices. Indeed, incentive structures play a large role throughout the computing research ecosystem, from Ph.D. students' incentives for graduating and obtaining good positions, to faculty concerned about tenure and research funding, to startups aiming to establish a beachhead in the market, to giant corporations' incentives for maintaining their market position.
A well-known design principle in computing illuminates the importance of considering ethical and societal impact issues in research: it is much easier to design a technology correctly from the start than it is to fix it later. Furthermore, choices among research topics and research methods are determinative of possible computing technologies. In focusing on computing research, this study considers how ethical and societal impact challenges can be addressed at this consequential foundational stage, and the practical steps that computing researchers, the research community as a whole, research sponsors, and research-performing institutions can take toward fostering the development of computing technologies that more often serve social good and less often cause harm. Scientists and engineers have another important role as well: informing and educating future computing professionals about ethical and societal impact responsibilities and ways to meet them. The changes in computing education will make it more likely that the computing industry deploys computing technologies in an ethical and responsible manner.

Still, computing researchers cannot, by themselves, ensure that computing technologies are designed and used in ways that are ethical and responsible. Technologies can have many different uses, and it is the application and context of use that most directly create the ethical and societal impact challenges. Moreover, most innovations that lead to new capabilities, including those raising ethical and societal challenges, draw on a combination of many research results. Thus, science and engineering are necessary, but they alone do not suffice to address ethical issues, because, as Chapter 2 discusses, the design of computing technologies involves many trade-offs among values, preferences, and incentive choices. Such public policy choices (which include decisions about what technologies government acquires) are the proper realm of societies and communities in determining norms, and of governments in instating mechanisms to realize or enforce those norms, not of scientists and engineers. Nonetheless, scientists and engineers have important roles to play in ensuring that government decisionmakers have the information they need about the results of research to make wise choices. The recommendations of this study also aim to inform, complement, and support actions by government and companies.

How should computer scientists think about these challenges? This report argues that computer scientists must begin to treat ethical and societal impact considerations as first-order concerns.
In the same way that computer scientists understand that quantitative metrics such as classification accuracy, algorithm efficiency, and energy consumption are important, computer scientists must now also reckon with the ethical and societal impacts of deployed technologies that incorporate their research results. The report recommends practical steps toward doing so.

1.4 THE ROLES OF ETHICS AND SOCIAL SCIENCE IN COMPUTING

Ethics provides tools for the moral evaluation of behaviors, institutions, and social structures and for dealing with choices among and conflicts between values. Until relatively recently, many researchers and observers considered computing technologies to be value neutral. Few if any are. The design of new computing technologies, much as with technologies more generally, is always imprinted with the spectrum of values considered by the designer, which may not be broad enough to ensure a particular technology meets the needs of some stakeholders. Sometimes the value choices are intended. Very often they are not.7 Some of these values may be explicitly expressed while others may be implicit. Moreover, some of these values may be introduced during research, not solely in the translation of research into viable technologies. All technologies, and the research that enables them, create some opportunities and foreclose other possibilities.

Ethics is concerned with doing good as well as avoiding harm. Consideration of ethical and societal impacts in computing research thus includes both proactive research to create computing technologies that do (more) good and preventative work to anticipate, avoid, or mitigate harms. Doing good and avoiding harm interact: even if the intention is to design for some good, failure to consider the full context of use may risk harm. Consider, for example, that cell phone location tracking can allow family members to find wandering loved ones with dementia but also allow abusive spouses to find their victims.
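The earlier point that familiar quantitative metrics do not by themselves capture societal impact can be made concrete with a minimal sketch. All labels, predictions, and group assignments below are invented for illustration; the sketch simply shows how an aggregate metric such as classification accuracy can look acceptable while one group of users bears most of the errors.

```python
# Hypothetical labels and predictions for a binary classifier, split across
# two groups of users. All numbers are invented for illustration.
labels      = [1, 1, 0, 0, 1, 1, 0, 0]
predictions = [1, 1, 0, 0, 1, 0, 1, 0]
groups      = ["a", "a", "a", "a", "b", "b", "b", "b"]

def accuracy(y, p):
    # Fraction of examples where the prediction matches the label.
    return sum(yi == pi for yi, pi in zip(y, p)) / len(y)

overall = accuracy(labels, predictions)
by_group = {
    g: accuracy(
        [y for y, gg in zip(labels, groups) if gg == g],
        [p for p, gg in zip(predictions, groups) if gg == g],
    )
    for g in ("a", "b")
}

print(overall)   # 0.75: looks acceptable in aggregate
print(by_group)  # group "a" is served perfectly; group "b" gets coin-flip accuracy
```

A researcher who reports only the aggregate number has, in effect, made an unexamined value choice about whose errors count.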
Scholarship in the social and behavioral sciences provides methods for identifying ways that technologies interact with and affect people, their interactions, and their communities. Ethical issues arise not only at the personal level, but also for communities, and computing research and technology invariably impact not only particular people but groups and societies. For example, although social networks were envisioned as bringing people together, experience has shown that they can foster divisive environments. Insights and frameworks from the social sciences thus play key roles in understanding the nature of responsible computing research, and mechanisms to ensure it. Chapter 2 elaborates on these points and provides conceptual frameworks for considering ethical and societal issues related to computing research.

7 See, for example, Nissenbaum, H. 2001. "How computer systems embody values." Computer 34, no. 3: 120–129; Kling, R. 1978. "Value conflicts and social choice in electronic funds transfer system developments." Communications of the ACM 21, no. 8: 642–657; and Friedman, B., and D.G. Hendry. 2019. Value Sensitive Design: Shaping Technology with Moral Imagination. The MIT Press.

1.5 SOURCES OF ETHICAL AND SOCIETAL IMPACT CHALLENGES

As discussed at greater length in Chapter 3, the ethical and societal impact challenges raised by computing technologies arise from various sources. Some challenges derive from people, societies, and the functioning of governmental and other organizations—arising, for instance, from conflicting values and goals of different stakeholders. Others reflect values and social structures that counted as normal or typical in the culture at earlier points in history but no longer apply. Still others result from the ways existing societal and institutional arrangements collect and use data and from the ways technologies are deployed. Others reflect externalities such as the environmental impacts of the energy consumed by computing systems. Some ethically or societally adverse outcomes are the result of insufficient engagement with users and other stakeholders, inadequate attention to existing social relationships or institutional structures and practices, or failure to use best practices in design. In almost all cases, these issues do not result from deliberately unethical behavior on the part of computing researchers, but rather from such factors as a lack of knowledge or misaligned incentives. In addition to such societally rooted challenges, there are challenges that arise in the process of implementation or deployment.
For instance, mission or function creep may occur when a technology developed for one application is applied to a new problem or in a new context for which it is inadequate, inappropriate, or poorly considered. It is thus important, when reporting research results, for researchers to clearly present not only the contributions of their research, but also the contexts in which it was performed and in which the results were tested, as well as the limitations of which those using it in those or other contexts should be aware. Researchers also need to take reasonable steps, including following the best practices for design and systems development articulated in this report's recommendations, to anticipate other possible uses in their research and augment their work to address or at least identify potential concerns.

Computing researchers have certain obligations with respect to these challenges; other obligations necessarily fall on others. Researchers' responsibilities arise from their work being foundational—the first step in new technologies entering the world—and from their work having limitations that it is important to identify and explain. In cases in which challenges stem from poor decisions by those deploying the technologies—for example, decisions that do not appropriately trade off considerations of efficiency or accountability with other social values—mitigating certain harms will involve technology businesses behaving differently and require governments to regulate. In such situations, researchers may still have a role to play, as they could help illuminate trade-offs and limitations and champion (other) societal values. In many cases, this work will involve collaboration with ethicists and social and behavioral scientists.

1.6 A BRIEF HISTORY OF CONCERNS

Attention to the ethical and societal impact challenges posed by computing technologies dates back to the earliest days of computers.
One of the earliest works to identify societal and ethical issues is Norbert Wiener's 1950 book The Human Use of Human Beings: Cybernetics and Society, which drew attention both to the benefits to society of automation and to the risks of overreliance. In 1972, SRI International researcher Charles Rosen, as part of proposing a research program to advance automation technology, called for productivity to be redefined to "include such major factors as the quality of life of workers and the quality of products, consistent with the desires and expectations of the general public,"8 an early harbinger of concerns about the impacts of automation and computing culture that persist today. Another early example of attention to ethical issues is Joseph Weizenbaum's Computer Power and Human Reason: From Judgment to Calculation (1976), which distinguishes between deciding (something that can be computed) and choice (which requires judgment) and highlights such human qualities as compassion and wisdom that computers lack.

Other fields have confronted burgeoning societal and ethical implications of their research during this span of time. For instance, in biomedicine, the 1975 Asilomar Conference grappled with the public health and ecological implications of the then-new recombinant DNA technology, following calls for a voluntary moratorium on its use. The conference concluded that research should proceed only under strict guidelines. A few years later, the 1979 Belmont Report9 from a U.S. national commission articulated three principles for protecting human subjects in biomedical and behavioral research: respect for persons, beneficence, and justice. Starting in the late 1980s, the Human Genome Project of the National Institutes of Health and Department of Energy set aside 3 percent of its research budget for the study of the ethical, legal, and societal implications of the knowledge gained from the mapping and sequencing of the human genome.

The interdisciplinary field of inquiry that would come to be known as science and technology studies (or science, technology, and society studies) also started to take shape during this period. Questioning technological determinism (the view that technological advances determine the development of cultural values and social structure), it emphasizes understanding the development of technology in its social and historical context.
Subsequent scholarship emphasized that technologies, including computing technologies, should not be viewed as value neutral. Also observed decades ago were ways computing technology differed from prior technology revolutions. In a 1985 essay, James Moor cited the "logical malleability" of computers—they "can be shaped and molded to do any activity that can be characterized in terms of inputs, outputs, and connecting logical operations"—and anticipated that "in the coming decades many human activities and social institutions will be transformed by computer technology and that this transforming effect of computerization will raise a wide range of issues for computer ethics."10

The 1980s also saw the founding of Computer Professionals for Social Responsibility (CPSR), a nongovernmental organization that focused in its early days on the risks posed by growing use of software for military applications such as the Strategic Defense Initiative. CPSR's agenda soon broadened to look at issues that remain salient today: privacy and civil liberties, participatory design in the workplace, election systems, and encryption policy.11

From the 1980s onward, the use of computing has evolved markedly from use only by experts to use by nearly everyone. The 1990s saw a shift from primarily individual use (outside of some workplaces and institutions that were early adopters of computer networks) to highly interconnected use in which people's online activities connect with different people and systems. Further change came in the 2000s with the introduction of smartphones and other mobile technologies (which, with falling prices, have spread around the globe) and the growing embedding of computing technologies into the physical world. In just a few decades, computing technology has become a primary means by which people interact, a primary source of functionality and value in engineered systems, and an underpinning of every sector of the economy.
This radical change has engendered a whole range of new ethical and societal challenges. The development of the Internet and Web led to early concerns with, and responses to, abuse and manipulation of network communications. Researchers helped identify and, with some success, combat such attacks as network intrusions, email spam, phishing, advertising click spam, and Web spam.

8 Rosen, C.A. 1972. ACM '72: Proceedings of the ACM Annual Conference, Volume 1, August, 47–57, https://doi.org/10.1145/800193.805821.
9 Department of Health, Education, and Welfare. "The Belmont Report." https://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/read-the-belmont-report/index.html.
10 Moor, J.H. 1985. "What Is Computer Ethics?" Metaphilosophy 16, no. 4 (October): 266–275.
11 Computer Professionals for Social Responsibility, 2005, "CPSR History," http://cpsr.org/about/history/.

However, these efforts arguably did not represent enough cumulative effort given the breadth and depth of impacts being experienced today. At the same time, for much of the history of computing, the public, policymakers, and members of the computing research community have for the most part tended to emphasize positive societal and economic impacts.

Computing technology's spread has raised new issues and spurred growing recognition of the ethical and societal impacts that arise from computing research and technologies. One manifestation of this change is that universities are exploring new ways of incorporating ethics into computing courses and curricula.12 Another is that civil society organizations, other outside observers, and even former employees are calling attention to value trade-offs being made by industry that they characterize as harmful to society. A third is a blossoming of efforts to address some of these challenges. The discussions in Chapters 2 to 4 include references to many of these efforts. Furthermore, the National Defense Authorization Act for Fiscal Year 2021 (P.L. 116-283) signals congressional interest in the ethical implications of research in AI. Section 5401 states that it is the sense of Congress that "(A) a number of emerging areas of research, including artificial intelligence, have potential ethical, social, safety, and security risks that might be apparent as early as the basic research stage and (B) the incorporation of ethical, social, safety, and security considerations into the research design and review process for federal awards may help mitigate potential harms before they happen." In recent years, various governments, companies, and other institutions have adopted sets of ethical principles. These principles signal awareness of the issues, although many have not yet been backed by concrete actions.
1.7 CHARACTERISTICS OF RESPONSIBLE COMPUTING IN LIGHT OF THE UBIQUITY OF COMPUTING TECHNOLOGIES

For computing research to be responsible, it needs to be ethical and adhere to societal values and norms.13 As will be discussed further in Chapter 2, computing researchers are not free to choose norms (that is a societal prerogative) but need to be knowledgeable of them and take them into account in their research. Computing research must thus consider its potential societal impacts, especially now that computing technology is present throughout the daily life of individuals, communities (of work and of play), and society. Society also expects computing technologies to be trustworthy, transparent, and accessible, and designed in ways that ensure that users can understand and control what the technologies are doing on their behalf. These expectations have become ever more important with the increased complexity and scale of today's computing systems.

One might think these expectations apply only to computing systems research, but they apply as well to theoretical work. For example, choices made by researchers to improve the performance of a matching algorithm can raise significant societal and ethical questions when that algorithm is used, say, to optimize kidney exchanges for transplants. Factors that might at first appear entirely in the realm of theory, such as the design of an objective function to improve efficiency, might in the kidney transplant example favor individuals from some groups over others, notably depending on the way the algorithm resolves ties. Indeed, ethical and societal impact questions are not arising just in individual theory papers: an entire conference devoted to the topic, the Symposium on Foundations of Responsible Computing (FORC), has been held annually since 2020.

There are many ways that computing researchers can take the broader context into account:

• Sufficiently deep grounding in the intended application domains. Where computing research is concerned with use in a particular application or context, for example in such sectors as health care, education, or transportation, the research will benefit from researchers at a minimum engaging with experts in that application area and potentially including such experts as part of the research team.

• Serious consideration of potential uses and application domains beyond those originally contemplated, including both beneficial and problematic uses. Although it is not possible to predict all future uses, it is possible to increase the probability of finding the most likely ones by engaging with people knowledgeable about the original application domain and using well-established design methods. For example, a predictive algorithm used in pretrial detention decisions may not be appropriate for making parole decisions. A related issue is anticipating and warning against mission creep, where a technology developed for a narrow use is subsequently used in a much broader context (as in the use of cookies referenced above).

• Engaging with or otherwise considering the perspectives of all the stakeholders in the intended context of use, including not only direct users but also others who will be affected by its use, and considering the power differences among the stakeholders, including the researchers.

• Collaborating with scholars who have expertise in the social and behavioral sciences and the humanities, most notably in ethics and ethical reasoning.

12 Artificial Intelligence Index Report 2021, Stanford University Human-Centered Artificial Intelligence, https://hai.stanford.edu/research/ai-index-2021, p. 134; Mozilla Foundation, Responsible Computer Science Challenge Winners, https://foundation.mozilla.org/en/what-we-fund/awards/responsible-computer-science-challenge/winners/.
13 Although some current norms may need to be changed, that is a societal responsibility. Computing researchers can, of course, with their research decide to support such changes. Note also that it is possible for societal norms to be unethical. Such problems are for society to sort out, but computing researchers can call attention to such conflicts.

PREPUBLICATION COPY – SUBJECT TO FURTHER EDITORIAL CORRECTION
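The tie-breaking concern raised above can be made concrete with a deliberately simplified sketch. The data, groups, and allocation rules below are hypothetical illustrations invented for this example, not an actual organ-exchange algorithm: two allocation rules achieve exactly the same value of the efficiency objective, yet the tie-breaking rule alone determines which group of patients is served.

```python
# Hypothetical illustration: two donor kidneys and four equally compatible
# patients with identical waiting times, drawn from two made-up groups.
patients = [
    {"id": "P1", "group": "A", "wait_days": 400},
    {"id": "P2", "group": "A", "wait_days": 400},
    {"id": "P3", "group": "B", "wait_days": 400},
    {"id": "P4", "group": "B", "wait_days": 400},
]
N_KIDNEYS = 2

def allocate(patients, key):
    """Pick the top N_KIDNEYS patients under `key`. Python's sort is stable
    (even with reverse=True), so any remaining ties are silently broken by
    list position."""
    ranked = sorted(patients, key=key, reverse=True)
    return [p["id"] for p in ranked[:N_KIDNEYS]]

# Rule 1: rank by waiting time only. Every key ties, so list order decides,
# and both kidneys go to group A simply because group A was listed first.
by_wait_only = allocate(patients, key=lambda p: p["wait_days"])

# Rule 2: the same objective, but with a seemingly innocuous secondary sort
# key added. Now both kidneys go to group B instead.
by_secondary = allocate(patients, key=lambda p: (p["wait_days"], p["group"]))

def served_days(ids):
    """Total waiting time served: the 'efficiency' objective being optimized."""
    return sum(p["wait_days"] for p in patients if p["id"] in ids)

# Both rules score 800 patient-days, so the objective alone cannot
# distinguish them, yet the outcomes differ entirely by group.
```

The point is not this toy rule but that a detail of a theoretical formulation that looks ethically neutral, the order in which ties are resolved, can carry the entire distributive consequence of the algorithm. In a real allocation system such choices warrant explicit scrutiny.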
Relevant areas of expertise include the humanities, the social and behavioral sciences, and ethical reasoning, as well as any particular domain of intended use (e.g., health care) and possibly adjacent domains, to help identify other potential deployment settings. The report and its recommendations do not anticipate that computing researchers will become scholars or experts in any of these fields or domains. Rather, they can successfully incorporate such expertise into their projects by collaborating with people who have it. Doing so effectively entails that computing researchers acquire knowledge in some areas of the social and behavioral sciences and humanities, and also that humanities and social science scholars understand key computing concepts. Furthermore, successfully incorporating the requisite expertise into computing research projects may require not only the involvement of new kinds of expertise beyond that traditionally involved in computing research but also new kinds of projects that effectively leverage this expertise. The recommendations also describe steps that research institutions and research sponsors need to take to facilitate and support such efforts.

1.8 STUDY APPROACH

Responsibilities for computing technologies' effects on people and society progress from research to product and service deployment. Some responsibilities rest with researchers, for example in how they scope and structure projects, the diversity of perspective and expertise they engage, and how they report research results, including limitations and caveats. Other responsibilities rest with industry, for example in what technologies are deployed, how, and for whom. Still other responsibilities rest with government, which has responsibility for setting policy objectives, writing legislation, incentivizing desired behaviors, and formulating needed regulations.
The analysis and recommendations in this report are primarily aimed at the computing research ecosystem, comprising computing researchers, the computing research community, the scientific and professional societies in which they participate, other scholarly publishers, the public and private sector agencies and organizations that sponsor computing research, and the public and private sector institutions that perform computing research. The committee has attempted to formulate recommendations that will work for all research-producing institutions: small colleges, public colleges and universities, and private universities, and industry and government research organizations as well as academic ones. It fashioned them with the understanding that institutional resources for carrying out the recommendations will vary, and that some actors may need to step in to support others, such as scientific and professional societies assisting less-resourced institutions with access to scholars who have the requisite expertise on the societal and ethical aspects of research proposals and activities. Because there is little empirical data on the effectiveness of any approach to responsible computing research, these recommendations were developed primarily by considering leverage points in the research ecosystem, early efforts that appear promising, and expertise provided by social scientists and ethicists who served on the study committee and made presentations to the committee. Several recommendations also address the need for empirical evaluation and possible adaptation or revision of some recommendations based on experience implementing them. The recommendations are also designed for use by government funding agencies as well as industry and philanthropic research sponsors. In keeping with its statement of task, this report does not provide recommendations for government regulation of computing technologies, including corporate computing research, but it does discuss ways that the computing research community can help inform government action in this space. The report describes various ways in which addressing diversity, equity, and inclusion (DEI) in the computing research community is critical to ensuring responsible computing research. (The same holds true for the other disciplines and application domains involved in the research.)
The committee considers DEI a cross-cutting issue, and discussions of it permeated the committee's analysis and informed its recommendations. Although the report does not offer separate recommendations on DEI, these considerations are reflected throughout many of the recommendations. By defining practices through which more potential ethical and societal problems can be caught than at present, and better steps taken to mitigate or eliminate potential harms to individuals and society, the report's recommendations aim to foster responsible computing. They cannot, of course, ensure that every potential ethical or societal problem in the computing research ecosystem will be recognized and addressed. In some cases, the recommendations constrain research explicitly and call for direct action by computing researchers, while in others they speak to the roles researchers have in assisting those who deploy computing research outcomes and technologies to use them in ways that take into account their limitations as well as their strengths.
