
Emerging and Readily Available Technologies and National Security: A Framework for Addressing Ethical, Legal, and Societal Issues (2014)

Chapter: 5 An Analytical Framework for Identifying Ethical, Legal, and Societal Issues

Suggested Citation:"5 An Analytical Framework for Identifying Ethical, Legal, and Societal Issues." National Research Council and National Academy of Engineering. 2014. Emerging and Readily Available Technologies and National Security: A Framework for Addressing Ethical, Legal, and Societal Issues. Washington, DC: The National Academies Press. doi: 10.17226/18512.

5

An Analytical Framework for Identifying Ethical, Legal, and Societal Issues

This chapter presents a possible framework for identifying and assessing ethical, legal, and societal issues that may be associated with a given research effort. Derived from considering the sources of insight described in Chapter 4 and ELSI commonalities that appear in many of the technologies discussed in Chapters 2 and 3, the framework is an organized list of ELSI-related questions that decision makers could ask about the development of any technology or application. The framework has two equally important parts. The first part describes the parties that have a stake, either direct or indirect, in ethical, legal, and societal issues, and it poses questions that might be relevant to these stakeholders. The second part of the framework poses questions in relation to crosscutting themes that arise for many or all of these stakeholders. The chapter then illustrates a worked example of how the framework might be used in practice, and it puts the framework in context by considering its utility from a variety of perspectives. Note that the framework is offered as a starting point for discussion and is not intended to be comprehensive. It is useful primarily for raising ELSI concerns that might not otherwise have been apparent to decision makers.

The approach taken in this framework—posing questions that are useful in assessing ethical, legal, and societal issues in the context of R&D on emerging and readily available (ERA) technologies relevant to national security and providing some discussion of why answers to these questions may be relevant—is similar to the approach described in the framework for assessment of information-based programs offered in the 2008 National Research Council report Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment.1 That framework was intended to help public officials charged with making decisions about the development, procurement, and use of information-based programs determine whether such programs are effective in achieving their intended goals, compliant with the laws of the nation, and reflective of societal values. The Government Accountability Office has used that framework in assessing a number of programs.2

5.1 STAKEHOLDERS

The first component of the framework described in the present report is organized by stakeholder. That is, any given research project has a variety of stakeholders—parties that have an interest in the project because the project may, directly or indirectly, in the short term or in the long term, have a positive or negative impact on them. This report identifies as possible stakeholders in any research project those involved in or connected to the conduct of the research, the intended users of applications enabled by that research, adversaries against whom those applications may be directed, nonmilitary users of such applications, organizations, noncombatants, and other nations. Not all of these groups are necessarily stakeholders for any given research project or program, and an effort to identify the relevant stakeholder groups is therefore an essential part of any ELSI assessment.

In principle and in fact, ethical, legal, and societal issues affect many groups of stakeholders, many of which are described below. However, not every technology or application will touch the interests of every one of these stakeholders, and part of an analysis of ethical, legal, and societal issues for any given technology or application is to determine the relevant stakeholder groups. An additional analytical step is to determine how the interests of each of these groups should be weighed (e.g., equally or with some other weighting). The science of effective public participation is summarized in a recent National Research Council report.3
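
The weighting question in this last analytical step can be made concrete with a toy numerical sketch. Everything below—the stakeholder groups' impact scores, the weights, and the linear aggregation itself—is a hypothetical illustration for discussion, not a scheme endorsed or prescribed by the framework:

```python
# Toy illustration of aggregating assessed impacts across stakeholder
# groups. All scores and weights are hypothetical placeholders.

# Assessed net impact of a hypothetical application on each stakeholder
# group, on an arbitrary -1 (serious harm) to +1 (clear benefit) scale.
impacts = {
    "researchers": 0.2,
    "users": 0.6,
    "adversaries": -0.8,
    "noncombatants": -0.3,
    "other_nations": -0.1,
}

def aggregate(impacts, weights=None):
    """Weighted average of per-group impact scores.

    weights=None treats all groups equally; any other weighting is a
    normative choice that itself requires justification.
    """
    if weights is None:
        weights = {group: 1.0 for group in impacts}
    total_weight = sum(weights[g] for g in impacts)
    return sum(impacts[g] * weights[g] for g in impacts) / total_weight

equal = aggregate(impacts)
# A scheme that weights noncombatant interests more heavily:
protective = aggregate(impacts, weights={
    "researchers": 1.0, "users": 1.0, "adversaries": 1.0,
    "noncombatants": 3.0, "other_nations": 1.0,
})
print(f"equal weighting: {equal:+.2f}, noncombatant-protective: {protective:+.2f}")
```

The point of the sketch is only that the choice of weights changes the conclusion, which is why the framework treats the weighting itself as a question to be answered, not an input to be assumed.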

___________________

1 National Research Council, Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment, The National Academies Press, Washington, D.C., 2008, available at http://www.nap.edu/catalog.php?record_id=12452.

2 For example, see Government Accountability Office, 9/11 Anniversary Observations on TSA’s Progress and Challenges in Strengthening Aviation Security, GAO-12-1024T, Washington, D.C., 2012, available at http://www.gao.gov/products/GAO-12-1024T.

3 Thomas Dietz and Paul C. Stern, eds., Public Participation in Environmental Assessment and Decision Making, The National Academies Press, Washington, D.C., 2008, available at http://www.nap.edu/catalog.php?record_id=12434.


The sections below provide a brief description of stakeholder groups along with a number of ELSI-related questions that could apply to each group.

5.1.1 Those Involved in or Connected to the Conduct of Research

The conduct of research in many ERA technology and application domains raises ethical, legal, and societal issues that are most troublesome when the research itself affects humans. Those affected may include human beings deliberately and directly involved in the R&D, human beings not directly involved in the R&D, and human beings affected through changes in the environment that may occur as a result of the R&D.

In addition, a variety of different impacts may need to be considered—direct and indirect impacts on physical, emotional, or psychological health and well-being; infringements on civil rights; effects on economic status; and so on. For example, titration of a pharmaceutical agent to determine dose-response relationships is an essential element of research on such agents. In the context of incapacitating nonlethal weapons, titration is an issue in determining dosages that will incapacitate the largest percentage of individuals while remaining nonlethal to them. Mood-altering drugs may need to be tested to determine whether they have long-term effects.
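
The titration tradeoff described above can be illustrated with a simple quantitative sketch. The logistic dose-response model and all parameter values below are hypothetical placeholders, not data about any actual agent:

```python
import math

def logistic_response(dose, ed50, slope):
    """Fraction of a population responding at a given dose,
    modeled with a standard logistic (Hill-type) curve."""
    return 1.0 / (1.0 + math.exp(-slope * (math.log(dose) - math.log(ed50))))

# Hypothetical parameters: incapacitation is half-maximal at dose 10,
# lethality at dose 400 (arbitrary units), with an illustrative slope.
ED50_INCAPACITATION = 10.0
LD50_LETHALITY = 400.0
SLOPE = 2.0

for dose in (5, 10, 20, 50, 100):
    p_incap = logistic_response(dose, ED50_INCAPACITATION, SLOPE)
    p_lethal = logistic_response(dose, LD50_LETHALITY, SLOPE)
    print(f"dose={dose:>3}: incapacitated={p_incap:.2%}, lethal risk={p_lethal:.2%}")

# The safety margin of the agent is often summarized by the ratio
# LD50/ED50; a small ratio means there is no dose that incapacitates
# most people while remaining safe for nearly all of them.
print("LD50/ED50 ratio:", LD50_LETHALITY / ED50_INCAPACITATION)
```

The ethical problem is visible in the curves: raising the dose to incapacitate a larger fraction of the population also raises the fraction placed at lethal risk, so "nonlethal to everyone" and "effective on everyone" cannot in general both be achieved.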

But the impact on operators and users of technology is relevant as well. Soldiers with prostheses that enhance their function beyond normal human performance, or pilots of remotely piloted vehicles who execute their missions far from immediate danger, have a psychological relationship to their jobs different from that of soldiers who do not enjoy such advantages. Before widespread deployment of such technologies is contemplated, policy makers may wish to understand the psychological effects of such phenomena—raising the question of how such research might be conducted.

Matters such as the scope of populations to include as test subjects, the nature and duration of contemplated harms, and so on are well understood to be within the purview of mechanisms existing in the civilian sector for the protection of humans used as experimental subjects. For example, in testing incapacitants, the question of whether to include young children or the elderly or pregnant women in the test population would arise.

The Belmont report (described in Chapter 4) articulated three ethical principles that can be generalized to the conduct of most R&D: beneficence, respect for persons, and justice.4 The remainder of this subsection (Section 5.1.1) provides that generalization; readers interested in the original analysis should consult the Belmont report itself.

___________________

4 The Belmont report can be found at http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html.

Beneficence

In the context of conducting R&D, the principle of beneficence suggests that the research effort should maximize the benefits and minimize the harms that result. Some key considerations include the following:

• What defines “benefit” and “harm”? Note that a risk of harm is not necessarily the same thing as harm. How can an R&D effort benefit or harm research subjects? The investigator? Society at large?

• When R&D is being conducted for applications that are intended to harm an adversary, how can the nature and extent of harm be ascertained in research? Note that there are many kinds of harm that may be at issue, as suggested in the previous question. Harm may include physical, mental, emotional, financial, and psychological harms.

• How do the definitions of “benefit” and “harm” differ when different stakeholders are involved? For example, different criteria may apply for individuals indirectly affected by a project and for those directly affected as research subjects.

• How should benefits and harms to different stakeholder groups be determined, aggregated, and compared?

• Learning what the benefits of an R&D effort may be sometimes requires exposing stakeholders to some harm or risk of harm. How should learning about possible benefits be weighed against actual or possible harm?

As the Belmont report stated, “The problem posed by these imperatives is to decide when it is justifiable to seek certain benefits despite the risks involved, and when the benefits should be foregone because of the risks.”
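
One common (and deliberately incomplete) way to structure such a weighing is an expected-value calculation. The sketch below uses entirely hypothetical probabilities and outcome values; as the Belmont quote suggests, no single number settles the justifiability question:

```python
# Toy expected-value comparison of an R&D effort's possible benefits
# against its risks of harm. All probabilities and magnitudes are
# hypothetical. Note that a risk of harm is not the same as harm, and
# severe harms may warrant extra weight regardless of probability.

outcomes = [
    # (description, probability, value: positive = benefit, negative = harm)
    ("effort succeeds, clear benefit", 0.30, +100.0),
    ("no measurable effect",           0.50,    0.0),
    ("mild, reversible harm",          0.15,  -10.0),
    ("serious, lasting harm",          0.05,  -80.0),
]

expected_value = sum(p * v for _, p, v in outcomes)
worst_case = min(v for _, _, v in outcomes)

print(f"expected value: {expected_value:+.1f}")
print(f"worst case:     {worst_case:+.1f}")
# A positive expected value alone does not settle the question: whether
# it is justifiable to expose any subject to the worst case is a
# separate, ethical judgment.
```

The calculation makes explicit what the Belmont imperatives leave to judgment: a favorable aggregate can coexist with an individual risk that decision makers may still decide to forgo.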

Respect for Persons

In the context of conducting R&D, the principle of respect for persons suggests that the effort should obtain voluntary informed consent from parties that are directly involved in such research and act in the best interests of parties that are not capable of providing such consent (e.g., those indirectly affected by the research). Some considerations are as follows:

• What constitutes genuine “informed consent” when information derived from possibly sensitive intelligence sources is part of a threat assessment? For example, consider a research project to develop a vaccine against a particular biological agent. Specifics of the threat posed by the agent may well be derived from classified sources. How, if at all, is such information to be a part of any “informed consent” process?

• If parties directly involved in research related to a particular application are members of the U.S. armed forces, how and to what extent—if any—is there a conflict between their obligation to obey legal orders and their provision of informed consent on a voluntary basis? For example, Section 3 of Executive Order 13139 authorizes the President to waive informed consent for deployed military personnel for the administration of certain investigational drugs, provided that the President determines that obtaining consent is not feasible; is contrary to the best interests of the (service) member; or is not in the interests of national security.5 Have undue inducements been offered to persuade individuals to “volunteer”? What counts as an “undue” inducement?

• Who, if anyone, will speak for the best interests of parties that are not capable of providing informed consent? Almost by definition, such parties cannot articulate their own interests. For example, they may be physically or temporally distant (future persons, for instance), or they may be affected by the environmental consequences of certain R&D efforts. How should such concerns be identified, assessed, and ultimately weighed?

Justice

The principle of justice suggests that the benefits and burdens associated with R&D should be fairly distributed. To paraphrase the Belmont report, injustice occurs if some benefit to which a person is entitled is denied improperly or when some burden is imposed unduly. Some considerations include the following:

• On what basis are specific parties or groups of parties selected for direct involvement in a research effort? For example, why is one group rather than another chosen to be the pool of research subjects? Why is one geographical location rather than another the choice for situating a potentially dangerous research facility?

• How and to what extent, if at all, do national security considerations demand that certain groups (e.g., warfighters) accept an exceptional or a higher level of risk than that accepted by or imposed on other groups (e.g., civilians)?

• How and to what extent, if at all, should new knowledge derived from research be subject to restrictions on distribution? For example, should such knowledge be kept from certain allies or the rest of the world? Should it be restricted from public distribution? If so, why?

___________________

5 See http://www.gpo.gov/fdsys/pkg/FR-1999-10-05/pdf/99-26078.pdf.

5.1.2 Users of an Application

Users are the parties that are intended to use an application—those who make decisions about how and when the application is deployed and operated in the field, and those who use it based on those decisions.

• What could be the nature of the impact, if any, on users of an application? For example, the extended use of a particular application may cause physical damage (e.g., it may require a user to sit at a keyboard for extended periods of time and thereby cause repetitive stress injuries) or psychological stress (e.g., a weapons operator may feel stress if the concept of operations is something with which he is morally uncomfortable).

• What could be the cumulative impact, if any, on users of an application? For example, the insertion of one prosthetic implant may not be harmful, but the insertion of multiple implants or the use of a certain implant with certain drugs may be harmful. By definition, cumulative effects will appear only when the application in question interacts with other components in the user’s environment or biology.

• What could be the long-term impact, if any, on users of an application? The short-term impact on a user may be benign, but over the long term, the impact may be harmful. Hearing loss due to repeated exposure to loud noises is an example of such a long-term impact. The history of Agent Orange provides an example of long-term consequences.6

5.1.3 Adversaries

Adversaries are parties against which an application might intentionally be directed or parties that might seek to harm U.S. interests. Adversaries are not “stakeholders” in the traditional sense understood in domestic policy matters—obviously, one does not seek adversary input or agreement on weapons intended to affect them, for example. Nonetheless, adversaries certainly are parties that a research project might affect, and they do have interests that the law of armed conflict requires all nations to take into account.

___________________

6 Agent Orange was an herbicide/defoliant used by the U.S. military during the Vietnam War; exposure to it has been linked to numerous deaths and birth defects. See Le Cao Dai, Agent Orange in the Vietnam War: History and Consequences, Vietnam Red Cross Society, 2000. An Institute of Medicine (IOM) report addressing a number of ethical, legal, and societal issues related to Agent Orange is Institute of Medicine, Veterans and Agent Orange: Update 2010, The National Academies Press, Washington, D.C., 2011, available at http://www.nap.edu/catalog.php?record_id=13166.

Thus, considering adversary reactions to the use of new military applications against them is an important part of a framework for assessment of ethical, legal, and societal issues. These reactions fall into at least three general categories:

Adversary acquisition of similar applications for their own uses. The successful use of any new military application of technology is an affirmative demonstration of its feasibility and value, and often carries much more weight with policy makers than any report or study regarding its utility. For example, Stuxnet was the first known operational use of cyber weapons to cause physical damage to infrastructure.7 The possibility and feasibility of such an attack were discussed in many reports on cybersecurity, but Stuxnet galvanized the policy community as never before. U.S. use of remotely piloted vehicles in Afghanistan and Iraq has conclusively demonstrated their value in many battlefield situations, and dozens of nations are today pursuing the development of such systems for their own use. Further, such pursuits may from time to time result in systems that are even more advanced than those available to the United States. A final relevant point is that in using such applications against the United States, adversaries may not feel constrained in their observance of the law of armed conflict, as, for example, when they use human shields.

Adversary development of countermeasures that negate or reduce the advantages afforded by new military applications. For example, the microwave-based Active Denial System can be countered through the use of aluminum foil to protect exposed areas of skin.8 In some cases, a remotely piloted vehicle can be “spoofed” into thinking that its location is a long way from where it actually is.9 For those cases in which countermeasures are relatively easy and inexpensive to develop, the wisdom of pursuing a given application may be questionable unless the primary value of the application can be realized before countermeasures emerge. Indeed, ethical, legal, and societal issues may arise without the hoped-for benefits of an application ever having been realized.

___________________

7 The Stuxnet computer worm, first discovered in June 2010, was aimed at disrupting the operation of Iran’s uranium enrichment facilities. See http://topics.nytimes.com/top/reference/timestopics/subjects/c/computer_malware/stuxnet/index.html.

8 The Active Denial System (ADS) is a directed-energy nonlethal weapon first developed in the mid-2000s and designed for keeping humans out of certain areas. The ADS aims a beam of microwave energy at a target such as a human being, thus causing an intense burning sensation on the human’s skin. However, because the beam does not penetrate very far into the skin, it causes little lasting damage (no lasting damage in nearly all cases). The pain is intended to cause the human to turn away and flee the area.

9 Daniel P. Shepard, Jahshan A. Bhatti, and Todd E. Humphreys, “Drone Hack: Spoofing Attack Demonstration on a Civilian Unmanned Aerial Vehicle,” GPS World, August 1, 2012, pp. 30-33, available at http://radionavlab.ae.utexas.edu/images/stories/files/papers/drone_hack_shepard.pdf.

Adversary perceptions of a military application’s uses against them. The possible emotional and psychological reactions of an adversary to an application’s use span a wide range. At one end, an adversary may be so discouraged by the use of a very potent application that he simply loses the will to continue engaging in conflict. At the other end, an adversary may be so outraged and incensed by the use of a very potent application that he redoubles his hostile efforts and recruits others to his cause—such outcomes are made more likely when the use of such an application has caused nonnegligible collateral damage.

Some questions that arise from these kinds of adversary reactions include the following:

• What is the nature of the direct impact, if any, of use of an application against adversaries? Not all applications have a direct negative impact against adversaries—examples might include better battlefield medical care and sources of alternative fuel.

• How and to what extent can the application’s impact be reversed?

• How do considerations of symmetry apply? That is, what are the ELSI implications of an adversary pursuing the same technology development path as the United States? For example:

—Under what circumstances, if any, would an adversary’s use of the same application against the United States, its allies, or its interests be regarded as unethical?

—Assuming that the United States is conducting R&D on application X, how would the United States interpret the intentions of an adversary conducting similar research?

• In the long term, what is the impact of an application on adversary behavior and perceptions?

—How and to what extent could an adversary develop similar capabilities? What is the time scale on which an adversary could do so? How could an adversary use these capabilities? What advantages could an adversary gain from using these capabilities free of legal and ethical constraints?

—How do the benefits to the United States of pursuing a particular application unilaterally compare to the potential losses should an adversary develop similar applications in the future?


—What countermeasures might an adversary take to negate the advantages conferred by the application in question? How long would it take for the adversary to obtain those countermeasures? How, if at all, could the developed countermeasure be worse in some way from an ethical standpoint than the application itself?

—How could the application affect the adversary’s perception of the United States? For example, the application might instill a fear in the adversary that would inhibit the adversary from taking action against the United States, or it might instill a resentment or hatred that might inspire still others to take additional action against the United States.

—What, if any, could be the application’s effect on deterrence? Note that the United States justifies nearly all military programs by their putative enhancement of deterrence. But adversaries may not see U.S. military R&D activities in the same light, and in fact may initiate their own similar programs because the United States appears to be seeking a technological advantage.

—What effect, if any, could U.S. restraint in pursuing a particular application have on inducing an adversary to exercise similar restraint? A relevant precedent is the ban on assassinations promulgated by Executive Order 12333.10 The original rationale for this ban was the concern that in its absence, assassinations of U.S. political leaders would be legitimized.

—What, if any, opportunities for adversary propaganda could an application enable or facilitate? For example, how, if at all, could an adversary be able to point to a U.S. program as indicative of an immoral, unethical, and hostile stance toward it?

5.1.4 Nonmilitary Users

Military applications also sometimes have value to nonmilitary users. Changing the problem domain from a military to a civilian one can and often does raise other ethical and societal issues. Three of the most prominent nonmilitary problem domains are those of law enforcement, commerce, and the general public.

Law Enforcement

From a technical standpoint, many of the problems facing law enforcement have military or other national security counterparts. Such problems include those of personal protection, surveillance, and intelligence analysis. But law enforcement authorities, at least in the United States, operate under an entirely different legal regime than do military or other national security authorities, one premise of which is that residents of the United States enjoy certain rights that other groups (e.g., enemy combatants) do not have. For example, the U.S. military is legally permitted to participate in domestic law enforcement operations only at the request of civilian law enforcement authorities. Thus, a relevant question is the following:

___________________

10 See http://www.archives.gov/federal-register/codification/executive-order/12333.html.

• If the military application in question were deployed to support law enforcement operations, how and to what extent, if any, could such deployment raise ethical, legal, and societal issues that do not arise in the military context? Possible differences include the different legal authorities provided in Title 18, Title 10, and Title 50 of the U.S. Code (dealing with criminal law enforcement, military, and intelligence affairs, respectively), and possible restrictions imposed by the U.S. Constitution on the U.S. government acting domestically.

Commerce

Technologies developed for military applications sometimes have commercial and economic relevance. A good example is the evolution of packet-switched communications, originally developed by the U.S. Air Force to enhance the survivability of military communications networks,11 into the ARPANET (supported by DARPA) and then the Internet. Again, commerce in the private sector is a different problem domain and thus raises different ethical issues. A relevant question is the following:

• How and to what extent, if any, could a commercial adaptation of a military application raise ethical, legal, and societal issues that do not arise in the military context? Such issues might include access (which commercial companies might profit from government efforts to develop the application), accountability (public accountability regimes of private-sector companies differ from those of the government), and adoption of technologies by adversaries after commercialization (such adoption may differ from the adversary acquisition described above).

The General Public

Technologies developed for national security applications sometimes can be adapted for use by ordinary citizens, for uses both good and bad. For example, it is possible today to purchase over-the-counter a remotely piloted aircraft for a few hundred dollars. Controlled via Wi-Fi, this airframe—called a quadricopter—can stay aloft for about 20 minutes and has an onboard video camera whose uses are limited only by the operator’s imagination. A relevant question is thus:

___________________

11 Paul Baran, “On Distributed Communications: Summary Overview,” RM-3767-PR, Rand Corporation, Santa Monica, Calif., August 1964.

• How and to what extent could adaptations of a military application be used by ordinary citizens? What are the ELSI implications of such use?

5.1.5 Organizations

For the U.S. armed forces, military applications of technology do not exist in a vacuum. The introduction of new technologies into military organizations often has a significant impact on the practices, procedures, and lines of authority embedded in those organizations. Individuals make decisions about deployment and use, and these individuals are themselves embedded in organizations and are thus affected by the structure and culture of those organizations. Organizational structure and culture are the foundations of accountability and chains of command, and affect matters such as promotion, respect, levels of cooperation between units, and influence within a hierarchy. Organizations determine rules of engagement and other orders that specify the conditions under which various applications may be used.

For example, the significance of cyber conflict (in both its offensive and defensive aspects) has led the Department of Defense to establish Cyber Command, an entirely new element of U.S. Strategic Command that is likely to become a combatant command co-equal to the others. The U.S. Air Force is reorganizing itself to accommodate a large influx of pilots for remotely piloted vehicles, and such reorganization will inevitably have an impact on the Air Force’s organizational culture.

Introducing new technology that affords new capabilities often affects the assumptions on which an organization is structured, and thus may have implications for the organization. Relevant questions may include the following:

• How and to what extent, if at all, could a new military application influence or change traditional structures and mechanisms of accountability and responsibility for its use? For example, some applications are intended to drive certain kinds of battlefield decision making to lower ranks in the military hierarchy. How will the organization react to such tendencies? How, if at all, will accountability for the use of the application in question be maintained? Conversely, might the application make it less likely for someone in the lower ranks to raise questions about ethical use?


• Military organizations often place great value on personal bravery in combat. How and to what extent, if at all, could a technological application used in combat change such valuation?

• Promotions in many military organizations are based in part on command opportunities. How and to what extent, if at all, could an application change command structures? For example, will piloting a remotely piloted vehicle confer the same cachet and status as piloting a crewed air vehicle?

5.1.6 Noncombatants

Noncombatants are those who do not participate directly in hostilities, and they include bystanders on the battlefield, family members of combatants, civilians in nonbattlefield areas, civilians who may be affected by environmental damage, personnel from nongovernmental organizations, and future generations.

Questions relevant for noncombatants as stakeholders in the use of new military applications may include the following:

• How and to what extent could an application affect noncombatants on and off the battlefield? Although it is true that a weapon can be used in ways that cause excessive collateral damage and other ways that do not, a weapon that is inherently incapable of discriminating between combatants and noncombatants may well raise ethical, legal, and societal issues. This question is routinely asked in the Department of Defense laws-of-war review of new weapons systems, described in Chapter 7.

• How might the public at large perceive a given application? As noted in passing in Chapter 3, the Martens clause of the Geneva Conventions prohibits weapons whose use violates “the principles of humanity” and the “dictates of public conscience,” even if the precise meaning of this clause in the case of any given weapon is not necessarily clear. In so doing, this clause in principle gives standing to the public at large to object to the use of such weapons.

• How and to what extent could an application affect future generations? For example, could operating an application cause genetic changes in users? And what might be the effects of such operation on those targeted by the application?

• How and to what extent could the operation of an application—especially large-scale operations—harm the environment? For example, large-scale use of depleted uranium ammunition may have significant radiation effects on the environment in which it is used, potentially endangering individuals present in that environment now or in the future.


5.1.7 Other Nations

The behavior of the United States can affect perceptions and attitudes in other nations. For example, long-term allies who tend to share U.S. values may nevertheless disagree over the ethics of certain military applications, as noted in Section 1.6 (“What Is and Is Not Within the Scope of This Report”). Relevant questions may include the following:

• What, if any, could be the impact of a new military technology or application on political solidarity with the United States?

• How, if at all, could the technology or application raise questions about the strength of U.S. commitments to other nations or allies?

• What could be the impact, if any, of U.S. reluctance to share a technology or application with its allies?

• How, if at all, could a technology or application affect the willingness of allies to participate in coalition efforts with the United States if the latter uses this technology?

In addition to long-term allies, the United States must also consider its relationships with allies of convenience and non-aligned nations. Some relevant questions include:

• How and to what extent, if any, could U.S. restraint in pursuing a new military application induce other nations to exercise similar restraint?

• How and to what extent, if any, could an application help to compromise human rights if used by another nation on its own citizens?

5.2 CROSSCUTTING THEMES

The second component of the framework described in this chapter is a set of themes that cut across different stakeholder groups. That is, in some cases, similar ethical, legal, and societal issues appear in considering the perspectives of a number of stakeholders. This report identifies as crosscutting themes issues related to scale, humanity, technological imperfections, unanticipated military uses, crossovers to civilian use, changing ethical standards, ELSI considerations in a classified environment, and opportunity costs. Last, the sources of insight from Chapter 4 suggest other themes that from time to time cut across different stakeholder groups.

5.2.1 Scale

Against the backdrop of stakeholder concerns described above, it is helpful to keep in mind several dimensions of scale in thinking about ethical, legal, and societal issues both in research and in its applications and their use.

Societal Scope

In principle, an application might have an impact on a specific military situation, on the military as an institution, on the ways in which conflicts are prosecuted, on specific parts of society, or on large segments of society. Relevant questions regarding scope may include the following:

• How and to what extent, if any, could a change in the scale of deployment or use of a technology or application change an ethical calculation? For example, if an application provides value to one soldier, does the overall value or risk of the application increase if that application is used by many soldiers?

• How and to what extent, if any, are the costs of using a particular application transferred from its immediate users to other entities? Economics uses the label “externalities” for situations in which such costs, which may go beyond immediate financial costs alone, are transferred in this manner—with the result that the immediate users do not bear the full costs of their activities and therefore tend to overuse the resource in question.

• If an application becomes successful because of the increased functionality it affords to its users, and such functionality becomes essential for individuals participating in society, how and to what extent, if any, can the costs of obtaining an essential application be made broadly affordable so that all individuals can obtain its benefits equally? This question is especially relevant to military applications that turn out to have civilian utility, as might be the case for advanced prosthetic limbs originally designed to serve the medical needs of soldiers injured in battle.
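The externality point raised above can be illustrated with a small numeric sketch. The cost and benefit figures below are hypothetical, chosen only to show the direction of the effect: when part of the cost of an activity is shifted onto others, a user who sees only the private cost will use the resource more than one who bears the full social cost.

```python
# Illustrative sketch (hypothetical numbers): an "externality" arises when
# part of the cost of an activity falls on parties other than the user.
def usage_level(cost_per_use: float, benefit_of_first_use: float) -> int:
    """Return how many times a rational user employs the resource: use
    continues while the marginal benefit exceeds the cost the user sees.
    Benefit declines by 1.0 with each additional use (diminishing returns)."""
    uses = 0
    while benefit_of_first_use - uses * 1.0 > cost_per_use:
        uses += 1
    return uses

benefit = 10.0          # benefit of the first use, declining thereafter
full_cost = 6.0         # true cost per use, borne by society as a whole
external_cost = 4.0     # portion of that cost shifted onto others
private_cost = full_cost - external_cost  # cost the user actually sees

# Seeing only the private cost leads to heavier use than bearing the
# full (social) cost would: 8 uses versus 4 in this toy example.
assert usage_level(private_cost, benefit) > usage_level(full_cost, benefit)
```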

Degree of Harm

The degree of inadvertent or undesirable harm associated with an application may span a very broad range. Some research on or uses of one application may cause only minor and unmemorable inconvenience to those affected, whereas other research or uses involving another application may kill people or destroy property on a large scale. How and to what extent, if any, does the degree of inadvertent or undesirable harm compare to the benefits obtained from using that application?


The Nature of the Activity

From research to use is a very long path, and different ethical, legal, and societal issues arise as an activity moves through basic research, applied research, development, testing, deployment, and use.

• How does the scale of ethical, legal, and societal issues differ along the continuum from basic research to use of an application? How do the stakeholders and their interests change?

Timing Considerations

• What are the ELSI considerations in weighing short-term benefits against long-term costs and how does the scale of such benefits and costs affect these considerations?

5.2.2 Humanity

Many ethical issues raised by new technologies or applications revolve around what it means to be human. In some ways, this should not be surprising—the very purpose of tools (that is, technology) is to extend the abilities of humans. Today, the technologies of reading and writing are taken for granted as part of human existence, but Socrates noted around 370 BC that:

… this discovery of yours [writing] will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.12

In a military context, many applications of technology pose issues related to extending human capabilities. Prostheses could be developed to enhance human functions—physical functions such as lifting strength and running speed and sensory functions such as night vision and enhanced smell. Advances in neuroscience might be able to help soldiers process information more quickly, operate equipment through a direct brain-machine interface, and remember more information, or they might enable the creation of false human memories or make it possible to induce different emotional states (e.g., reduced or increased fear, feelings of anger or calm). Information technology underlies increasing automation of many functions previously delegated to people,13 but today and more so in the future, computers may make decisions that have traditionally been made by responsible humans in positions of authority.14

___________________

12 Alexander Nehamas and Paul Woodruff, eds., Plato’s Phaedrus, Hackett Publishing Company, Indianapolis, Ind., 1995.

More broadly, responsible stewardship for the humanity of the soldiers that the nation asks to go to war is an important concern for policy makers—and the technology of war may have an impact on that sense of humanity. Military psychologist David Grossman argues that the act of killing has important psychological effects on individuals that may affect their sense of humanity.15 Philosopher Shannon French argues that soldiers live by a “warrior’s code”—a code of values—about what is right and wrong in combat, and that this code is the shield that guards their humanity.16 She further argues that an individual’s sense of humanity and sense of himself or herself is endangered by, among other things, excessive distancing in war (e.g., the use of drones), dehumanization of the enemy, and the erosion of traditional warrior values.

In other words, asking soldiers to violate their code of values, explicitly or implicitly, and thus to act unethically is inherently harmful to these soldiers—not physically, but psychologically. In particular, asking soldiers to use weapons in an unethical manner or to use weapons that violate a soldier’s sense of his or her obligations under the code may make it harder to reconcile their actions with their values, and may ultimately impede their healthy transition out of combat and back into civilian life.

Questions relevant to concerns about technologies’ effects on individuals’ sense of humanity may include the following:

• How and to what extent, if at all, does a new military application compromise something essential about being human? How and to what extent, if at all, might users believe that the application is unethical?

___________________

13 For example, the World War II Baltimore-class cruiser (CA-68) displaced 13,600 tons and carried a crew ranging from 1650 to 1950 individuals. By contrast, the planned DDX Zumwalt-class destroyer (DDG-1000) is expected to displace approximately 14,500 tons and carry a crew of 140 individuals.

14 A critique of the idea that computers might replace human judges, for example, is found in Joseph Weizenbaum, Computer Power and Human Reason: From Judgment to Calculation, W.H. Freeman, San Francisco, 1976. A paper by law professor Anthony D’Amato advocates exactly this idea. See Anthony D’Amato, “Can/Should Computers Replace Judges?” Northwestern University School of Law, Evanston, Ill., 1977, available at http://scholarlycommons.law.northwestern.edu/cgi/viewcontent.cgi?article=1128&context=facultyworkingpapers.

15 Dave Grossman, On Killing: The Psychological Cost of Learning to Kill in War and Society, Little, Brown and Company, Boston, 1995 (hardback), 1996 (paperback, in 18th printing as of 2008).

16 Shannon French, Code of the Warrior: Exploring Warrior Values Past and Present, Rowman and Littlefield Publishers, Inc., Lanham, Md., 2003.

• How and to what extent, if at all, is the application invasive of the human body or mind? This question applies both to users and adversaries, although perhaps in different ways.

• How and to what extent, if at all, could use of an application tread on religiously or culturally sensitive issues (e.g., notions of “playing God” or using animal parts in humans (aka the “ick” factor))?

• Does a technology threaten to cede control of combat capabilities to nonhuman systems to an unacceptable degree?

5.2.3 Technological Imperfections

The first operational use of a military application almost never marks the end of development work on that application. First generations of an application are refined in subsequent iterations to address flaws in the application’s design and/or implementation that become apparent through operational use and to improve its capabilities above and beyond those afforded by the first generation.

Technological imperfections raise ethical, legal, and societal issues for a number of reasons. Under the pressure of delivering a potentially important new capability, applications developers may make choices that provide less safety, reliability, or controllability than subsequent generations will afford—and ethical, legal, and societal issues arise when an application affords less safety, reliability, or controllability than it could.

A different take on technological imperfection is that sometimes a technology’s potential is limited by exogenous factors. For example, nearly all “nonlethal” weapons can be lethal under some circumstances. This can be the case because, for example, the maximum “sublethal” dose (of energy or a chemical substance, for example) can vary from individual to individual owing to differences in physiology (so that what is sublethal for one individual is lethal for another) or because one individual is exposed to a given weapon more than another individual in a particular situation (e.g., he or she is closer to a weapon than someone else). Recognizing this point, many analysts use the term “less lethal” weapons. Whether genuinely nonlethal weapons can be developed (the paradigm of which is the “stun” setting on a Star Trek phaser) remains to be seen, but no plausible mechanisms have been identified to date that would underlie a nonlethal weapon’s operation.

To the extent that an application depends on information technology, the reality of all complex software is that it is flawed in some way. Flaws in the software may have an impact on the behavior of an autonomous system, and the resulting behavior may raise ethical, legal, and societal issues that properly written software would not raise. Such flaws may raise safety issues related to improper operation under certain circumstances.

An analogy exists with embedded control systems in automobiles and commercial transport aircraft. Vendors of these control systems make use of extensive quality control and testing measures during design and prototyping, but in neither case are the resulting systems flawless. Still, they are “safe enough” to meet the safety requirements for the corresponding applications. In other words, design quality and safety assessment have come into alignment, partly because the alignment process has been under way for decades. More generally, safety issues become resolved through a lengthy process of trial and error, adjustment, societal adaptation, and the like.

Standards of performance are also inherently social in nature. For example, the stated safety requirements of an application (e.g., that a new weapon must have at most a 1-in-10⁹ chance of harming its operator when it is fired) reflect not only technical inputs that determine what is possible but also economic and ethical judgments about how much safety can be obtained for different levels of expenditure.

Questions relevant to technological imperfections as they affect applications might include the following:

• Who decides the appropriate safety requirements associated with a new application?

• On what basis are such decisions made?

• What, if any, are the tradeoffs between an application’s functionality or use and the safety requirements imposed on it?

5.2.4 Unanticipated Military Uses

An application’s concept of operation is an articulation of how an application is expected to be used. But, of course, these expectations may not include all possible modes of usage for that application. In other words, there are generally a variety of unintended uses of an application that are not explicitly sanctioned by its proponents and that go beyond the stated concepts of operation.

For example, the discussion of nonlethal weapons in Chapter 3 suggests the possibility that nonlethal weapons could be used as a means for torture. Although such use is not part of the stated concepts of operation for these weapons, such unintended uses—if known—are properly part of an ELSI analysis of an application.

A relevant question may be the following:


• What military uses are possible for the application or technology in question that go beyond the stated concepts of operation? What are the ELSI implications of such uses?

5.2.5 Crossovers to Civilian Use

One obvious possibility for civilian use of new capabilities enabled by various emerging and readily available technologies is in law enforcement and domestic security. For example, some law enforcement authorities have argued for the use of certain autonomous systems (e.g., drones for surveillance, bomb disposal robots) and certain nonlethal weapons (e.g., tasers, dazzling lasers). When such resources are controlled by the Defense Department, their use by law enforcement authorities is limited by the Posse Comitatus Act (codified at 18 USC 1385), which prohibits the U.S. armed forces from taking part in domestic law enforcement, unless such actions are explicitly authorized by statute or the U.S. Constitution. (For example, 10 USC 371-381 explicitly allows the Department of Defense to provide federal, state, and local police with information, equipment, and training/expertise.) But law enforcement may not need to rely on DOD resources to gain access to the capabilities they afford. Indeed, vendors may well approach law enforcement authorities with proposals to sell versions of military applications customized for law enforcement purposes.

In any event, the civilian law enforcement use of an application originally intended to operate in a military context generally calls forth a different set of ELSI considerations. For example, legal restrictions on “unreasonable search” apply to the use of drones for law enforcement surveillance, but they do not apply in a military context. Rules of engagement for nonlethal weapons in a law enforcement context are very different from those that apply in a military context (for example, law enforcement officials are not generally given orders to “shoot on sight”). Onion routing and TOR—anonymizing technology developed by the Office of Naval Research—are used to advance U.S. foreign policy interests (such technology facilitates untraceable communications, which can be used by dissidents living under nondemocratic regimes).17 But onion routing has been used to conceal criminal activity in democratic nations as well.

Civilian uses of existing military applications are potentially broader than law enforcement. More speculative law enforcement applications include various neuroscience-based applications (e.g., functional magnetic resonance imaging) to detect deception. Nor need the applications be confined to law enforcement. In a health care context, physicians and others seek better prostheses to replace human functionality lost to accident or injury. Transportation companies (e.g., moving companies, taxi companies) may be able to use capabilities recently developed by DARPA for automated vehicle driving—a point suggesting that such technology has the potential to displace many jobs previously thought to require humans. Ordinary citizens could use inexpensive drone technology to follow children or to gather intelligence on spousal affairs.

___________________

17 “Onion Routing,” Onion-Router.net, available at http://www.onion-router.net.

All of these nonmilitary applications raise ethical, legal, and societal issues that do not appear in a military context, even if the technology needs only relatively minor changes in transitioning from military to civilian uses. For example, sophisticated prostheses generally provide more capability and thus cost more; increased costs are easier to accommodate politically when they support soldiers wounded in action than when they may be used to support civilians injured in the course of everyday life.

Such considerations related to civilian use of military applications raise the following questions:

• How and to what extent, if any, could civilian-oriented adaptations of military applications made widely available to citizens raise ethical and societal issues that do not arise in the military context? Consider also that “civilians” include both those using civilian adaptations and those who might be injured by such use.

• How fast should such military-to-civilian transfers of applications be made? What safeguards should be put into place before they are made? How should such safeguards vary with the technology involved?

5.2.6 Changing Ethical Standards

The use of a particular technology may well change the ethical standards associated with the problem being solved. For any given dimension of performance, is it sufficient that the standards for autonomous systems call for performance equal (on average) to what humans can do? Or should such systems be held to a much higher standard, perhaps a standard of near-perfection? Although the first (weaker) standard is an instance of technology not diminishing the degree of ethical behavior on the battlefield, it is also true that an ethically questionable action involving new technology will be subject to a high level of scrutiny and criticism, and this may be true even if the technology has built up a long record of ethically appropriate performance.

Thus, issues of legitimate expectations, due care, and reasonable control become relevant. The core recognition is that society expects that its institutions and their experts will field systems that include adequate control and continuing due care. Society has a minimum standard for control and due care, even if it is not articulated explicitly, the violation of which results in public disapproval. Certain kinds of technology or application are of deeper concern to the public and thus face more stringent scrutiny.

A second ethical standard—legally expressed in the laws of war—may also be affected by the availability of new weapons. One aspect of this issue is discussed in Chapter 4 under the general heading of force being a measure of last resort under principles of jus ad bellum—the availability of weapons that reduce the risks to decision makers may increase the likelihood of those decision makers deciding to use force.

A second aspect of this issue is that many new weapons are designed to be significantly more discriminating than those of earlier generations. Using such weapons is likely to produce less collateral damage, an outcome that serves the goals of the jus in bello law of armed conflict. At the same time, their availability may, under the principle of necessity, create obligations—ethical if not legal—to use such weapons rather than weapons that may be less discriminating. Many militaries reject such obligations, but militaries are only one of the stakeholders involved in ethical discussions.18

Such considerations regarding ethical standards raise the following questions:

• If an application is intended to address a military issue that previously had to be addressed by humans, what is the minimum standard of performance that the application must meet before it is deemed acceptable for widespread use?

• How and to what extent, if any, does a new application create new ethical obligations to use it in preference to older applications addressing similar problems that may raise ELSI concerns to a greater extent?

5.2.7 ELSI Considerations in a Classified Environment

Basic science is not tied to specific applications and is therefore usually unclassified, even if it is supported by the Department of Defense. But as a technology development gets closer to specific applications with military utility, the likelihood increases that such work will become classified, at least in certain instances. At the same time, it is often only in the context of specific applications that certain kinds of ethical, legal, and societal issues arise.

___________________

18 Weapons in this category include precision-guided munitions (e.g., “smart” air-to-surface bombs that can be remotely guided with high accuracy), low-yield nuclear weapons for use against hard or deeply buried targets, “smart” antipersonnel land mines that self-destruct or self-neutralize after a predetermined period of time, various nonlethal weapons, armed drones, and cyber weapons. See, for example, David Koplow, Death by Moderation: The U.S. Military’s Quest for Useable Weapons, Cambridge University Press, Cambridge, 2009.

This juxtaposition raises a dilemma that is unique to environments in which classified research is conducted—how to coordinate research in such environments when there may be different levels of secrecy associated with the research, and how to establish effective ELSI oversight in these environments. Staying abreast of developments and the associated benefits and risks can also be difficult for policy makers.

For example, certain neuroscience research is classified—and as noted in Chapter 2, even discussion of the ethics underlying such research may be classified. How can such work be reviewed? What is the appeals process for challenging classification designations that may have been assigned inappropriately?

Compartmented (special access) research is especially problematic. The DOD defines a special access program (SAP) as one involving “a specific class of classified information that imposes safeguarding and access requirements that exceed those normally required for information at the same classification level.”19 Access to SAPs for an agency’s senior leadership is an essential element of agency oversight, which would be diminished if “freelancing” on the part of agency staff were permitted.

Classified research raises a variety of ethical, legal, and societal issues. By definition, classification means that the research in question is not subject to peer review from the broad scientific community, and thus those doing the research cannot benefit from input and criticism from that broad community. Limiting such input increases the likelihood that erroneous or incomplete results obtained in classified research will not be identified as quickly. Furthermore, classified research often cannot directly help the broader community address its own research problems.

Other issues arise with restrictions on the involvement of non-U.S. citizens, such as foreign students in U.S. universities. Such restrictions generally arise from U.S. export control laws, which can treat the education of certain foreign students in certain disciplines as comparable to the actual export of technologies associated with those disciplines. Again, such restrictions prevent the research from benefiting from the largest possible talent pool.

These comments are not meant to imply that classified research is unnecessary or somehow wrong simply by virtue of its classification. But they do point out that there are downsides to classified research that must be taken into account in supporting such work.

___________________

19 See http://www.dtic.mil/whs/directives/corres/pdf/520507p.pdf.

Some questions that might be asked of classified programs include the following:

• How can research in a classified environment be reviewed for ELSI purposes?

• What is the appeals process for challenging classification designations that may have been assigned inappropriately? This question assumes that someone with the appropriate clearance would initiate the appeal. Otherwise, one could not know enough to make the challenge.

5.2.8 Opportunity Costs

The concept of opportunity cost reflects the reality that resources (time, talent, money) are finite and that not all valuable R&D can be conducted. Some considerations include the following:

• How should the value of an R&D effort be ascertained? Some approaches to valuation are quantitative and easily understood—how many lives or how many dollars might be saved by the effort in question? In other cases, it is not clear how to assign or to calculate value—how valuable is an increase in the probability that a given terrorist plot might be detected?

• Why is the R&D effort proposed more valuable than another effort whose cost and likelihood of success are comparable? For example, a program to reduce the cost of generating electricity for deployed forces at the end of a long supply chain may have to be weighed against a program to lighten the weight of body armor. On what basis should one program be chosen over another?

• How and to what extent does U.S. military effort in a selected R&D problem domain signal to adversaries that this domain may be a promising one for military applications?

5.2.9 Sources of Insight from Chapter 4

Chapter 4 describes a number of different possible sources of ELSI insight relevant to considering the impact of R&D on new technologies in a military context, including philosophical ethics and various disciplinary approaches to ethics; international law; sociology, anthropology, and psychology; scientific framing of research problems; the precautionary principle and cost-benefit analysis; and risk communication. Depending on the particular research effort at hand, one or more of these sources of insight might be regarded as a crosscutting approach offering questions relevant to decision making about how to proceed.

5.3 AN EXAMPLE OF USING THE FRAMEWORK

To show how the framework described above might be used in practice, the committee starts with a hypothetical scenario involving the need for decision making about R&D in a military context and then illustrates how the ELSI-related questions and thematic concerns raised in the sections above on stakeholders (Section 5.1) and crosscutting themes (Section 5.2) might apply.

5.3.1 A Hypothetical Scenario for Analysis

A hypothetical research scenario is as follows:

Josie Director has received a preliminary inquiry, with some supporting and confidential data, from a well-respected researcher who specializes in enhancing work performance in high-stress situations. The researcher believes that the data demonstrate that the drug compound he is testing will enable persons to stay awake and on task for up to a week—7 days and nights. The data show no detrimental effects once the administration of the drug ends. Director’s research portfolio focuses on performance enhancement under extremely stressful battlefield conditions. She has recently been involved in a high-level meeting at which operations personnel indicated the need for an intervention that could improve the capability of small groups of troops to fight and hold on in difficult terrain where reinforcements are not available.

How might Director respond to the researcher’s preliminary inquiry?

5.3.2 A Process for Identifying Ethical, Legal, and Societal Issues

The underlying premise of this report is that Josie Director is a scientifically knowledgeable and well-intentioned research director within a DOD science agency who is motivated to advance the frontiers of knowledge in the interests of U.S. national security and is also concerned about the ELSI dimensions of the work that she supports.

In this context, the framework offered in this report provides advice for Director if she wants to explore the latter concerns. The operative question concerns the extent and nature of any commonality between the technology and application in question and other technologies that raise ELSI concerns. Relevant characteristics include technological complexities and uncertainties as well as societal sensitivities about the technology and its application. There is considerable complexity and uncertainty with respect to the development and use of this technology, and about its long-term implications. Societal groups may be sensitive about administration of this type of medication in the circumstances in which it would be used, as well as about its likely broader penetration in society. These factors mean that ethical principles and expertise from a number of sources, including social science expertise, may be helpful in formulating a course of action.

At the highest level of abstraction, the National Academy of Engineering’s workshop report Ethics Education and Scientific and Engineering Research: What’s Been Learned? What Should Be Done?20 provides general guidance on what it means to “consider the ethics of doing scientific research.” That report identifies a number of useful steps:

• Framing the problem, including ethical dimensions and issues; recognizing it is an iterative process;

• Soliciting advice and opinions in the problem development phase and throughout the process as needed; developing communications strategies;

• Identifying relevant stakeholders and socio-technical systems; collecting relevant data about them;

• Understanding and evaluating relevant stakeholder perspectives;

• Identifying value conflicts;

• Constructing viable alternative courses of action or solutions and identifying constraints;

• Assessing alternatives in terms of consequences, public defensibility, institutional barriers, and so on;

• Engaging in reasoned dialogue or negotiations; and

• Revising options, plans, or actions.

Depending on the stakes and the nature of the decision required of Director, she might pursue these steps to varying degrees. For example, a decision about whether to encourage a researcher to submit a full proposal has lower stakes than a decision about whether to fund that proposal, although she would not want to encourage the submission of a proposal that is highly unlikely to be funded. Because it is assumed that Director is a well-intentioned manager who wishes to proceed in an ELSI-responsible manner, she will use her best judgment about how far to carry any of these steps and whether any of them can be carried out in parallel with a decision or must be completed before the decision is made.

___________________

20 National Academy of Engineering, Ethics Education and Scientific and Engineering Research: What’s Been Learned? What Should Be Done? Summary of a Workshop, Rachelle Hollander and Carol R. Arenberg, eds., The National Academies Press, Washington, D.C., 2009.

Framing the Problem

The scenario indicates that “Director’s research portfolio focuses on performance enhancement under extremely stressful battlefield conditions.” The portfolio contains a variety of kinds of projects, ranging from management strategies to bolster performance in small groups under stress, to new technologies to enable more timely and accurate communications, to administration of drugs such as mood stabilizers as well as performance enhancers. Questions arise in all of these modalities concerning effectiveness, negative side effects, and spillover. Director has not previously considered creating a portfolio component devoted specifically to experimental performance enhancement.

As a starting point for her framing of the problem, she might consider the four elements outlined in Box 5.1 that are often used to analyze the ethics of an action (or policy). Although there is no formulaic method for taking these elements into account, a serious consideration of the ethics of a given action will generally account for all of them. Alternative actions (including doing nothing) are likely to fare differently when these elements are taken into account, and an assessment of the various elements may well help a decision maker to compare the alternatives systematically. Revisiting these elements allows for reconsideration of prior decisions.

Soliciting Advice; Developing Communications Strategies

To go forward, Director might speak with trusted advisors and experts or convene an informal advisory group to provide background guidance in developing an options paper outlining what this proposal might consist of and accomplish. The paper could include identification and analysis of the ELSI questions that should be addressed in a decision to proceed, and of the societal concerns that may arise from undertaking such a program, as well as from its results. It should also outline a communications strategy for the effort. Her superiors and advisors can review and respond, and the document that results can be used to guide the effort.

The role of public communication often goes unaddressed in efforts to develop innovative technologies. Experimenters and innovators often regard public communication as something to avoid, whether out of a desire to protect intellectual property or a fear of public response. Yet such communication can yield large benefits in garnering public support and in forestalling public panic, fear, or calls for stopping the effort. It can also benefit supporters of a technological innovation by forcing them to confront associated problems that they would otherwise overlook.

Box 5.1 An Approach to Help Compare the Ethics of Different Policies or Actions

Decision makers must often make judgments about the ethics of different policies or actions. Four important standards or elements for analyzing the ethics of an action or policy include the following:

The foreseeable good and bad consequences of performing an action compared to alternative actions, including doing nothing. The qualifier “foreseeable” is necessary because by definition it is impossible to know all possible consequences until the end of time. But what counts as foreseeable depends on the probabilities of different consequences. It is also important to consider the values of different consequences. The values can be based on widely accepted, common human values such as the promotion of life, happiness, health, abilities, security, knowledge, freedom, opportunities, and resources. Of course, the weight given to these values will vary among people, but all rational people share them to some degree when it comes to themselves and the people they care about.

Relevant duties and rights. Under what circumstances should a duty or right be overridden? Duties and rights often arise from one’s designated roles—duties as an engineer, as a soldier, as a parent, as a human being, and so on. Here one needs to be concerned about setting precedents for future actions, particularly with respect to violations of rights and duties.

Assignment of responsibilities. People should know who is responsible for what. Thus, for example, actions that create moral hazards should be avoided if possible. Whistleblowers need protection. Avoiding these hazards and protecting whistleblowers allow people to fulfill their responsibilities.

Justice. Roughly, this criterion asks whether taking the action in question is fair to all relevant parties when all aspects of the situation are considered. If not, to whom is the action unfair and in what ways is it unfair?

These four elements cannot be considered or compared in an algorithmic fashion. But they provide an approach for systematically understanding similarities and differences in competing ethical claims, and they call attention to aspects of ethical action that have to be considered in justifying any given action or policy and when comparing a possible action with alternative actions (including doing nothing).

Identifying Stakeholders and Systems; Collecting Data

In the broadest sense, the stakeholders in this hypothetical case include those involved in and affected by the biomedical system of the United States. More particularly and directly, there is the entire hierarchy of the armed forces. Most directly, there are the men and women who are members of the groups in which the interventions would be tested and implemented. If there are to be initial small-scale clinical trials, these may involve civilian volunteers and the staff administering the experiment.

If a decision is made to proceed, data at issue here include evidence of safety and effectiveness (or lack thereof), negative or positive side effects, and spillover or the potential for it (both positive and negative). A communication campaign could develop useful data about the priorities for drug design for various populations as well as issues that need attention as drug use spreads beyond the target group.

Understanding and Evaluating Stakeholder Perspectives

A communications strategy can have an important payoff in collecting data about the reactions of different stakeholder groups—from the most general level, including patient advocacy and family groups, to military veterans, as well as active members at various levels of the military hierarchy. Civilian sectors such as the sports industry and the transportation industry might also be consulted. The results of the R&D could provide interesting information both for drug design and for implementation of trials for the drug’s use. The response of these stakeholders to the identification and assessment of ethical issues would also be useful, as the next section suggests.

Identifying Value Conflicts

People have given human enhancement technologies a mixed reception. Although there is little controversy about therapeutic interventions intended to overcome disease or disability, such is not the case for the wide range of interventions to bolster abilities and performance. In these cases, people are concerned that the enhancements might contribute to inequities between populations and that they might lend themselves to abuse. Performance-enhancing drugs might also spread into many different sectors of society, with uncertain implications. On the other side is the potential for enhancing performance and, in certain instances, limiting death and injury as a result. From this perspective, delay in development and use of these drugs for U.S. troops on the battlefield might be placing them at significant risk.

Questions concerning benefit and equity have been central in biomedical ethics, which asks if particular ethical principles can be satisfied and how they can be satisfied or reconciled in particular settings. If they can be, the focus can turn to how such research should be done to satisfy the principles—with what populations and protections, for whose benefit, and at what costs.

Constructing Viable Alternative Courses of Action; Identifying Constraints

Director has at least several viable alternatives to consider. She has already developed an active program to improve troop management as well as communications between troop members and between the members and other responsible military personnel—a program that is paying dividends on the battlefield. Is the likely benefit from augmenting her program to include the proposed new area worth the diversion of investment from these other areas? What information would help to address this question? What values need to be considered?

Assessing Consequences, Public Defensibility, and Institutional Barriers

There are ways for Director to test her moral intuitions about the activities she is supporting and the one she has under consideration. She can ask herself what guidance her colleagues or profession might provide. She can consult an ethics officer or the office of the general counsel in her organization. (There may not be an ethics officer, but legal advice is almost certainly available.) She can ask about potential harms, as noted above and taken up again below. She can ask whether she could comfortably defend the additional activities publicly and whether, should harm come to her as a result of one of these activities, she would still think it was good to have supported it.21

Undertaking a research program to augment human performance by using drugs raises ethical questions about the potential benefits, risks, and costs. Evidence of effectiveness is not beyond dispute, and there is considerable evidence that administration of certain drugs can lead to a variety of abuses. Even without abuse, a major concern is the equity implications of a system in which availability may be based on ability to pay. The benefits of performance enhancement, if they flow unevenly to the wealthy, may exacerbate existing inequalities. Further, dystopian fiction has long made a vice of the virtue of such interventions, pointing out that they may become required rather than elected.

On the other side, enhancements for certain limited purposes may receive a more positive reception, particularly if they can be shown to preserve lives and reduce injury. If this research is under consideration in military agencies, institutional barriers may be low. Should the decision of this program director about whether or not to proceed take account of the broader societal concerns, or should she set them aside? Surveys of her advisors, superiors, and segments of the broader public may provide interesting results that she can use to inform her decision. Openness about the research could provide some evidence as to whether the program has or can gain public acceptability.

___________________

21 Daniel A. Vallero, Biomedical Ethics for Engineers: Ethics and Decision Making in Biomedical and Biosystems Engineering, Academic Press, Waltham, Mass., 2011, p. 339 (citing M. Davis).

Biomedical ethics identifies significant ethical questions that will arise with the administration of an experimental drug to enhance performance in any setting, even away from the battlefield. Director and her advisors should consider the overarching principles from biomedical ethics—beneficence, nonmaleficence, justice, and respect for autonomy—as well as the rules of procedure intended to implement those principles. Application of these principles and procedures may result in a finding that it would be premature to proceed with use of this experimental intervention in battlefield contexts until further research results are available. Respect for autonomy and the associated requirements for informed consent would also have to receive special attention when drug testing moves to those contexts, in which standard voluntary informed consent will not be available, although it is possible that an acceptable facsimile can be created.

Engaging in Dialogue

If Director decides to proceed, she may be well served to initiate dialogues with the various stakeholders as noted above. These discussions may change the research design and implementation and thus may lead to different outcomes.

Revising Plans

Proceeding through the set of deliberations outlined above is likely to have resulted in various revisions to the decision as to whether and how to proceed. As further questions and issues arise, Director should revisit the previous steps and revise the effort(s) accordingly.

5.3.3 Questions Related to Stakeholders and Crosscutting Themes

As for the ELSI content that Director’s analysis may uncover, the questions in the “Stakeholders” and “Crosscutting Themes” sections above, perhaps in modified form, are relevant to issues that may emerge. The approach taken in the present section describes the parties that have a stake, either direct or indirect, in the treatment of ethical, legal, and societal issues, and it poses questions that might be relevant to these stakeholders. It then identifies some themes that arise for many or all of these stakeholders and raises related questions to suggest some ways to use the framework productively in identifying and assessing ethical, legal, and societal issues.

The issues range from those involving persons directly connected to the conduct of research to those involving more distant parties, such as nonmilitary users, organizations, and even nations. The discussion that follows extracts from the material in the preceding sections those questions that seem most relevant to Director’s circumstances.

Again, the working assumption for understanding the discussion above is that Director is a well-intentioned manager who wishes to proceed in an ELSI-responsible manner consistent with her responsibilities for advancing science and technology for national security purposes. She understands that exploring ethical, legal, and societal issues is a potentially unbounded enterprise, and that she is responsible for exercising her best judgment in determining how far to carry such exploration. She recognizes that exploring ethical, legal, and societal issues is often best done in parallel with the conduct of research, but that some level of preliminary exploration may be necessary to make such a determination.

Stakeholders

Research Performers, Subjects, and Users

Using the bioethical principles from the Belmont report (described in Chapter 4), Director would examine issues of beneficence, interpreted as maximizing benefits and minimizing harms. She might ask:

• What benefits and harms can arise for research subjects and performers? What are the costs and risks, and the potential benefits to military users and to society at large? How and to what extent might this application affect future generations? Are changes to military operations likely to arise, and how would they be accommodated? In the context of the drug compound in question, she might ask what it means to say “use of the drug exhibits no detrimental effects.” What specifically are the detrimental effects for which evidence was sought? How long was the drug used and in what dosages? How was it possible to rule out detrimental effects after long-term use? How might use of the drug affect the logistics chain needed to support soldiers who use the drug? For example, will they need to eat more food and consume more water while on the drug?

• Are adversaries likely to benefit from the results? How, if at all, could the drug be kept away from adversaries? Are U.S. military operations better off if both the United States and its adversaries have the drug? Why or why not?

• How should benefits and harms to U.S. and other stakeholder groups be determined, aggregated, weighed, and compared?

• Learning what the benefits may be sometimes requires exposing stakeholders to some harm or risk of harm. How should learning about possible benefits be weighed against actual or possible harm?

Another Belmont report principle requires respect for persons. Director would also have to consider how voluntary, informed consent would be obtained in trials or other circumstances where this intervention might be administered.

• If parties directly involved in such research are members of the U.S. armed forces, they could have an obligation to obey legal orders to participate in the research. Those giving orders would then have a conflict with the duty to provide for voluntary informed consent. If the commanding officers did not give an order, there would not be a conflict, but that might interfere with good research design.

• If the circumstances do not allow voluntary, informed consent, who, if anyone, will speak for the best interests of those parties? How should those speaking for the subjects or participants identify, assess, and weigh these interests?

The principle of justice suggests that the benefits and burdens associated with R&D should be fairly distributed.

• On what basis are specific parties or groups of parties selected for direct involvement in the research? Does the selection of these groups satisfy concerns for fair distribution of benefits and burdens? Should the compound be tested only in frontline soldiers? Only in ground soldiers (versus pilots or sailors)? Only in officers? Only in enlisted personnel? These different groups have different responsibilities in combat, and the drug might differentially affect abilities to execute such responsibilities.

• How and to what extent, if at all, do national security considerations demand that certain groups (e.g., warfighters) accept an exceptional or a higher level of risk than that accepted by other groups (e.g., civilians)? Does higher potential benefit justify increased risk?

• How and to what extent, if at all, should new knowledge derived from the research in question be subject to restrictions on distribution? Should it, for example, be kept from certain allies or the rest of the world? Should it be restricted from public distribution? If so, would the reasons withstand public scrutiny?

Actual use of the drug compound resulting from the research in question might raise other ethical, legal, and societal issues. But even though any need to address those issues is contingent on the success of the research, Director may wish to consider these issues at least in a preliminary fashion, taking into account the type, likelihood, and extent of these issues.

• What might the nature of the impact on users be? For example, how might the use of performance-enhancing drugs affect group cohesiveness? Under what circumstances, if any, might military users of the compound continue to have access to the compound once they are no longer directly participating in combat activities or have returned to civilian life?

• What, if any, could be the longer-term impacts on users? Such an inquiry could be addressed very broadly, but what is the appropriate scope of the inquiry for Director to consider? The inquiry is most relevant when the nature and the extent of potential risks and harms are greater than is apparent here. But given societal concerns about enhancement drugs, the benefits of considering this potential issue may outweigh the costs.

• Might an enhancement of the type being considered lead to heightening the stress that warfighters are under, for example by extending the period of time in which they are left in a battle zone?

Adversaries

The framework discussed in this chapter raises a set of questions about ethical issues that might be associated with the adoption of a technology or application by adversaries. In the case of the research posited in the scenario above, the questions do not appear to be grave ones. Is it likely that adversaries could or would easily adopt this innovation or develop countermeasures? Because the application does not have a directly harmful effect on adversaries, the answer to this question seems likely to be negative, or the consequences relatively trivial. Similarly, it seems unlikely that the United States would regard an adversary’s pursuit of this research as unethical, or that U.S. pursuit would have negative consequences for adversary behavior and perceptions or for propaganda against the United States. The answer to the question of whether other non-adversarial countries might pursue this research also seems likely to be negative, and if they did pursue it, the United States would likely be able to adopt the results easily.

Suggested Citation:"5 An Analytical Framework for Identifying Ethical, Legal, and Societal Issues." National Research Council and National Academy of Engineering. 2014. Emerging and Readily Available Technologies and National Security: A Framework for Addressing Ethical, Legal, and Societal Issues. Washington, DC: The National Academies Press. doi: 10.17226/18512.

Civilian Users, in the United States and Elsewhere

There is a strong likelihood that a performance-enhancing drug such as the one Director’s researcher would like to develop would be perceived as having benefits in a wide variety of applications. People often want to stay awake and alert and often find themselves in contexts where their ability to perform well is highly valuable and valued. Pressures to use the drug, and inequities in its availability, are likely.

A broader social conversation about the merits of investing in research on drug-based or drug-induced performance enhancement could be beneficial. What role might or should Director play in encouraging such a conversation? A wide variety of issues different from those pertinent to military contexts are likely to arise, including issues of access (which commercial companies might profit from government efforts to develop the application), accountability (public accountability regimes of private-sector companies differ from those of the government), and adoption of technologies by other nations, adversaries as well as allies, after commercialization (such uses may raise concerns very different from those arising in military situations).

Organizations

Director is concerned about enhancing performance, and she recognizes that this goal must take organizational as well as individual effectiveness into account. The introduction of this technology can have significant impacts on practices, procedures, and lines of authority in military organizations. Relevant questions may include the following:

• How and to what extent, if at all, could the application in question influence or change traditional structures and mechanisms of accountability and responsibility for its use? Could it drive certain kinds of battlefield decision making to lower ranks in the military hierarchy? How will the organization react to such tendencies? How, if at all, will accountability for the use of the application in question be maintained? Conversely, might the application make it less likely for someone in the lower ranks to raise questions about ethical use?

• Military organizations often place great value on personal fortitude and endurance in combat. How and to what extent, if at all, could the use of a performance-enhancing drug by soldiers on the battlefield change such valuation?

• Promotions in many military organizations are sometimes based on command opportunities. How and to what extent, if any, could the use of an enhancement drug change command structures? For example, might soldiers’ reduced need for sleep mean that units could be smaller? How might promotion opportunities be affected if commanders commanded smaller units?

Crosscutting Themes

Director can consider whether the crosscutting themes in the framework apply in the case of the proposed research on an enhancement technology. The themes are scale, humanity, unanticipated military uses, technological imperfections, crossover to civilian use, changing ethical standards, ELSI considerations in a classified environment, and opportunity costs. Although some of these issues are discussed above, particular aspects as they arise in some of these thematic areas are worth highlighting. (The discussion below omits mention of unanticipated military uses and changing ethical standards, illustrating the point that not all question categories in the framework are necessarily applicable.)

Scale

Large-scale deployment of the type of enhancement in question may have consequences that are difficult to predict. This possibility may increase the ethical difficulty of conducting a cost- or risk-benefit analysis that goes beyond the consequences to research subjects or to an experimental deployment. Will deployment impose costs on segments of society that do not receive the benefits? Is cheap access to the enhancement beneficial? If widespread deployment creates unexpected harms, can the performance-enhancing drug be recalled?

Humanity

Enhancement technologies often seem to raise ethical issues about what it means to be human. In a military context, might enhanced performance lead to decreased empathy and less group cohesion or solidarity?

Technological Imperfections

How much experimental iteration should occur before an innovation can justifiably be used in an operational context? What are the safety and efficacy criteria? Given the bodily intrusion involved, should these standards be higher than those for weapons or other technological innovations?

Crossovers to Civilian Use

How might the drug in question be used by civilians? How do the risks of using the drug vary in the population at large? What would the drug cost? If there is a limited supply of the drug, which civilians should receive the drug first?

ELSI Considerations in a Classified Environment

Director might be faced with considerations about whether certain neuroscience experimentation for purposes of human enhancement would or should be classified. Were the research to require classification, the questions of ethics in the research and deployment process would be complicated by secrecy requirements. She would need to be satisfied that agency procedures and oversight were sufficient to justify the classification and assure an ethical research process. The process would also affect her relationship to the research performer, who would have to adhere to secrecy restrictions. Community feedback for improvements would also be greatly limited if not totally unavailable. Although limiting access to the results of the research on a performance-enhancing drug has benefits as well as negative implications, the potential for public outcry should problems arise or security be breached may be greater.

Opportunity Costs

For Director, supporting the proposed research endeavor means that others will not be supported. Valuation of research options is never easy. Within the frame of a program to enhance battlefield performance, Director needs to consider and weigh her current priorities with and without this new possibility. Advice from a variety of sources can help in this assessment.

5.3.4 Developing a Future Course of Action

Director can use the framework offered in this report to identify ethical concerns and relevant questions associated with each stakeholder. She can use this knowledge to determine how and to what extent, if any, a program or project might be modified—or in extreme cases abandoned—because of ELSI concerns. The framework does not substitute for other processes and procedures that may be applicable for other reasons such as legal requirements. As she gains more experience with identifying and assessing ethical and societal issues, Director may well add to the framework and incorporate it into a standard procedure that can last throughout the lifetime of a given program.

What actions might emerge from the use of the analysis described above? The space of possible actions is large. For example, Director could:

• Decide that ethical questions, as well as insufficiencies in the data and in the demonstrated potential, mean that she should not encourage proceeding;

• Encourage research on the broader social and ethical implications of developing drugs with these characteristics;

• Examine the broader contexts in which such a drug is likely to be used;

• Involve operations in testing the intervention in battlefield conditions;

• Encourage the researcher to proceed with a limited effort, involving further testing on a larger population while continuing to monitor the previous subjects; and/or

• Encourage the researcher to submit a proposal to continue and expand the effort, without specifically raising ethical considerations.

Several of these options can be pursued at the same time and, undoubtedly, there are other options for proceeding as well. But the most important feature of this list is that there are more than two options—that is, Director has choices other than ignoring ethical, legal, and societal issues entirely or discouraging the researcher entirely.

Again, the working assumption for understanding the discussion below is that Director is a well-intentioned manager who wishes to proceed in an ELSI-responsible manner consistent with her responsibilities for advancing science and technology for national security purposes. She understands that exploring ethical, legal, and societal issues is a potentially unbounded enterprise, and that she is responsible for exercising her best judgment in determining how far to carry such exploration. As before, she recognizes that exploring ethical, legal, and societal issues is often best done in parallel with the conduct of research, but some level of preliminary exploration may be necessary to make such a determination.

5.4 THE FRAMEWORK IN CONTEXT

5.4.1 A Summary of the Framework’s Questions

This section pulls out of the “Stakeholders” and “Crosscutting Themes” sections all of the questions posed and discussed in the framework. (In the interest of brevity, explanatory material such as examples is omitted. Readers should refer to the sections above for such material.)

Questions of Relevance by Stakeholder

Researchers (and Those Otherwise Associated with Research)

• Beneficence

—What defines “benefit” and “harm”? (Note that a risk of harm is not necessarily the same thing as harm.) How can an R&D effort benefit or harm research subjects? The investigator? Society at large?

—When R&D is being conducted for applications that are intended to harm an adversary, how can the nature and extent of harm be ascertained in research? (Note that there are many kinds of harm that may be at issue.)

—How do the definitions of “benefit” and “harm” differ when different stakeholders are involved?

—How should benefits and harms to different stakeholder groups be determined, aggregated, and compared?

—How should learning about possible benefits be weighed against actual or possible harm?

• Respect for persons

—What constitutes genuine informed consent when information derived from possibly sensitive intelligence sources is part of a threat assessment?

—If parties directly involved in research related to a particular application are members of the U.S. armed forces, how and to what extent—if any—is there a conflict between their obligation to obey legal orders and their provision of informed consent on a voluntary basis? Have undue inducements been offered to persuade individuals to “volunteer”? What counts as an “undue” inducement?

—Who, if anyone, will speak for the best interests of parties that are not capable of providing informed consent? How should such concerns be identified, assessed, and ultimately weighed?

• Justice

—On what basis are specific parties or groups of parties selected for direct involvement in a research effort? For example, why is one group rather than another chosen to be the pool of research subjects? Why is one geographical location rather than another the choice for situating a potentially dangerous research facility?

—How and to what extent, if at all, do national security considerations demand that certain groups (e.g., warfighters) accept an exceptional or a higher level of risk than that accepted by or imposed on other groups (e.g., civilians)?

—How and to what extent, if at all, should new knowledge derived from research be subject to restrictions on distribution?

Users of an Application

• What could be the nature of the impact, if any, on users of an application?

• What could be the cumulative impact, if any, on users of an application?

• What could be the long-term impact, if any, on users of an application?

Adversaries

• What, if any, is the nature of the direct impact of use of an application against adversaries?

• How and to what extent can the application’s impact be reversed?

• How do considerations of symmetry apply? That is, what are the ELSI implications of an adversary pursuing the same technology development path as the United States?

• In the long term, what is the impact of an application on adversary behavior and perceptions?

—How and to what extent could an adversary develop similar capabilities? What is the time scale on which an adversary could do so? How could an adversary use these capabilities? What advantages could an adversary gain from using these capabilities free of legal and ethical constraints?

—What countermeasures might an adversary take to negate the advantages conferred by the application in question? How long would it take for the adversary to obtain those countermeasures? How, if at all, could the developed countermeasure be worse in some way from an ethical standpoint than the application itself?

—How could the application affect the adversary’s perception of the United States?

—What, if any, could be the application’s effect on deterrence?

—What effect, if any, could U.S. restraint in pursuing a particular application have on inducing an adversary to exercise similar restraint?

—What, if any, opportunities for adversary propaganda could an application enable or facilitate?

Nonmilitary Users

• Law enforcement

—If the military application in question were deployed to support law enforcement operations, how and to what extent, if any, could such deployment raise ethical, legal, and societal issues that do not arise in the military context?

• Commerce

—How and to what extent, if any, could a commercial adaptation of a military application in question raise ethical, legal, and societal issues that do not arise in the military context?

• The general public

—How and to what extent could adaptations of a military application be used by ordinary citizens? What are the ELSI implications of such use?

Organizations

• How and to what extent, if at all, could a new military application influence or change traditional structures and mechanisms of accountability and responsibility for its use? How will the organization react to such tendencies? How, if at all, will accountability for the use of the application in question be maintained? Conversely, might the application make it less likely for someone in the lower ranks to raise questions about ethical use?

• Military organizations often place great value on personal bravery in combat. How and to what extent, if at all, could a technological application used in combat change such valuation?

• Promotions in many military organizations are sometimes based on command opportunities. How and to what extent, if any, could an application change command structures?

Noncombatants

• How and to what extent could an application affect noncombatants on and off the battlefield?

• How might the public at large perceive a given application?

• How and to what extent could an application affect future generations? And what might be the effects of such operation on those targeted by the application?

• How and to what extent could the operation of an application—especially large-scale operations—harm the environment?

Other Nations

• What, if any, could be the impact of a new military technology or application on political solidarity with the United States?

• How, if at all, could the technology or application raise questions about the strength of U.S. commitments to other nations or allies?

• What could be the impact, if any, on U.S. reluctance to share a technology or application with its allies?

• How, if at all, could a technology or application affect the willingness of allies and nonaligned nations to participate in coalition efforts with the United States if the latter uses this technology?

• How and to what extent, if any, could U.S. restraint in pursuing a new military application induce other nations to exercise similar restraint?

• How and to what extent, if any, could an application help to compromise human rights if used by another nation on its own citizens?

Questions of Relevance by Crosscutting Issue

Scale

• Societal scope

—How and to what extent, if any, could a change in the scale of deployment or use of a technology or application change an ethical calculation?

—How and to what extent, if any, are the costs of using a particular application transferred from its immediate users to other entities?

—If an application becomes successful because of the increased functionality it affords to its users and such functionality becomes essential for individuals participating in society, how and to what extent, if any, can the costs of obtaining an essential application be made broadly affordable so that all individuals can obtain its benefits equally?

• Degree of harm

—How and to what extent, if any, does the degree of inadvertent or undesirable harm from an application compare to the benefits obtained from using it?

• The nature of the activity

—How does the scale of ethical, legal, and societal issues differ along the continuum from basic research to use of an application? How do the stakeholders and their interests change?

• Timing considerations

—What are the ELSI considerations in weighing short-term benefits against long-term costs, and how does the scale of such benefits and costs affect these considerations?

Humanity

• How and to what extent, if at all, does a new military application compromise something essential about being human? How and to what extent, if at all, might users believe that the application is unethical?

• How and to what extent, if at all, is the application invasive of the human body or mind?

• How and to what extent, if at all, could use of an application tread on religiously or culturally sensitive issues?

• Does a technology threaten to cede control of combat capabilities to nonhuman systems to an unacceptable degree?

Technological Imperfections

• Who decides the appropriate safety requirements associated with a new application?

• On what basis are such decisions made?

• What, if any, are the tradeoffs between an application’s functionality or use and the safety requirements imposed on it?

Unanticipated Military Uses

• What military uses are possible for the application or technology in question that go beyond the stated concepts of operation? What are the ELSI implications of such uses?

Crossovers to Civilian Use

• How and to what extent, if any, could civilian-oriented adaptations of military applications made widely available to citizens raise ethical and societal issues that do not arise in the military context?

• How fast should such military-to-civilian transfers of applications be made? What safeguards should be put into place before they are made? How should such safeguards vary with the technology involved?

Changing Ethical Standards

• If an application is intended to address a military issue that previously had to be addressed by humans, what is the minimum standard of performance that the application must meet before it is deemed acceptable for widespread use?

• How and to what extent, if any, does a new application create new ethical obligations to use it in preference to older applications addressing similar problems that may raise ELSI concerns to a greater extent?

ELSI Considerations in a Classified Environment

• How can research in a classified environment be reviewed for ELSI purposes?

• What is the appeals process for challenging classification designations that may have been assigned inappropriately?

Opportunity Costs

• How should the value of an R&D effort be ascertained?

• Why is the R&D effort proposed more valuable than another effort whose cost and likelihood of success are comparable? On what basis should one program be chosen over another?

• How and to what extent does U.S. military effort in a selected R&D problem domain signal to adversaries that this domain may be a promising one for military applications?

Questions of Relevance by Source of Insight

Chapter 4 describes a number of different sources of ELSI insight, and the discussion includes illustrative ELSI-related questions that may be derived from considering each of those sources.

5.4.2 Utility of the Framework

Readers of this report who identify ethical, legal, and societal issues inherent in the scenario described above (Section 5.3.1) that do not derive from use of the framework may be dismayed by that fact. Such dismay would foreshadow material presented in Chapter 6, which argues that a comprehensive identification of the ethical, legal, and societal issues associated with a given technology development is difficult indeed.

Put differently, the framework is itself a starting point for discussion and is not comprehensive. The framework provides some structure for thinking through various ethical, legal, and societal issues, but as the sampling of such issues across various technologies and applications suggests, it is not necessary to treat all issues in the framework as equally important for any given technology or application—judgment is necessary to make the most effective use of the framework. That is, different ethical, legal, and societal issues may come into play or a given ELSI concern may be significant to varying degrees depending on the technology in question. On the other hand, not considering any given element in the framework must itself be a thoughtful and defensible decision rather than a reflexive one—a good and plausible argument must be available as to why that particular element is not relevant.

As decision makers gain more experience with identifying and assessing ethical, legal, and societal issues, it should be expected that the content embedded in the framework will evolve. Years from now, it would be surprising indeed if the questions that policy makers posed regarding ethical and societal issues had not changed at all.

Policy makers might wish to use this framework for new or existing R&D programs or projects. In addition, it may be appropriate to apply this framework when some unanticipated application emerges. One might regard use of this framework as part of an ongoing process that lasts throughout the lifetime of a given program.

The purpose of this framework is not to impose compliance requirements on program managers, but rather to help them to do their jobs better and to help ensure that basic American ethical values are not compromised. The analytical framework is necessarily cast in somewhat broad and abstract terms because it is designed to apply to most R&D programs; consequently, not all questions in the framework will necessarily be relevant to any specific technology or application.

Furthermore, although it may not be likely that a contentious issue identified through this framework will be resolved in a decisive or final manner, this fact is not an adequate rationale for dismissing or ignoring the issues. Honest, well-reasoned analyses are useful to policy makers, even if they might be incomplete, and such analyses can be supplemented or corrected through adaptive processes as additional knowledge is gained over time, as discussed in Chapter 6.

As for the framework itself, the number of stakeholder groups and the number of crosscutting themes described in this chapter are both large, reflecting the breadth of possible technologies whose ethical, legal, and societal consequences must be considered and the large number of interested parties, as well as the diverse nature of the concerns that any given stakeholder may bring to bear. Indeed, the committee found that attempts to make these lists more concise—in general a worthwhile analytical goal—would constrain the intended broad applicability of the framework. In part, the framework fills the role of a checklist, a mechanism that is often used to remind decision makers to consider the possible relevance to the project at hand of a wide range of issues that may not be related to each other.

The framework provides information about ethical foundations and approaches that many people and organizations find useful in considering difficult questions about research for technological innovations, without choosing a particular orientation from among them. This approach recognizes that weighting different ethical constraints and opportunities is difficult and does not lend itself to an algorithmic decision-making procedure. Under some set of specific circumstances and technological characteristics, certain criteria may have priority, whereas under a different set of circumstances, different criteria may have priority.

At the level of generality at which this framework is cast, a few caveats are necessary. First, a full consideration of ethical issues sometimes produces a cacophony of methodologies and perspectives that leads to dissonance and controversy. Second, “societal” issues span such a broad range of possibilities that attempts to limit their scope inevitably generate questions about why a particular issue was included or excluded. Third, decision makers will surely face tradeoffs, satisfying no stakeholder fully in any ethically or societally controversial enterprise. Fourth, the framework does not provide a methodology for resolving or settling competing ethical claims, for choosing between ethical theories, or for providing specific answers to ethical questions, although it does call for decision makers to attend to a variety of ethical positions and approaches.

At the same time, the framework does not assume that “anything goes.” It posits that, through deliberation and discussion, it is often possible to identify which initial ethical positions are more well grounded and defensible and which are less so, and further deliberation and discussion may well lead these initial positions and decisions to evolve. Because such discussion increases the likelihood that major ethical, legal, and societal concerns will be identified before a given technology R&D program or project gets underway, and because casting the initial net broadly rather than narrowly helps to limit ELSI-related surprises, the committee believes that such a discussion is worthwhile as a part of any ELSI assessment.

The framework above is useful primarily for bringing ethical, legal, and societal concerns to the surface that would not otherwise have been apparent to decision makers and program managers. The ELSI-related questions included within the framework are intended to help decision makers develop useful knowledge on a variety of ethical, legal, and societal issues regarding specific military science and technology programs and projects. The framework was developed to apply to decision making in a U.S. context, although decision makers and program officials in other nations may nonetheless find parts of it useful.

In the end, the use of this framework can only provide input to decision makers, who will have to make judgments about how, if at all, to proceed with a particular R&D program or project, and such judgments should be undertaken after the decision makers have examined the issues posed by the framework rather than before. Different individuals may develop different answers to the various questions raised by the framework about a given technology, but the important aspect of this process is that the questions be asked and that a discussion take place.

This framework does not substitute for other processes and procedures that may be applicable for other reasons. In particular, program managers are obligated to conduct their programs in accordance with applicable law and regulation (such as the Common Rule,22 which sets forth federal policy for the protection of human subjects used in research). Judgments about the compliance of a specific program with applicable laws are beyond the scope of this report, although the report draws on relevant national and international standards in its discussion.

5.4.3 Identifying Fraught Technologies

Not all technologies or applications are equally fraught from an ELSI standpoint. Technologies or applications are likely to be highly fraught if they satisfy one or more of the following attributes:

• A technology or application that is relevant to multiple fields (for example, an enabling technology or application) will almost surely have more ELSI impact in the long run than one whose scope of relevance is narrow.

• A technology or application whose operation has the potential to result in intended or unintended consequences that could cause harm to people on a very large scale is likely to raise more ELSI concerns than one without such potential.

• A technology or application that challenges traditional (and often religious) notions of life and humanity, or appears to do so, is likely to raise more ELSI concerns. Under this heading are some concerns that the Wilson Center report describes as concerns over “nonphysical” harms.23

___________________

22 See http://www.hhs.gov/ohrp/humansubjects/commonrule/index.html.

A technology or application for which one of these statements is true warrants special consideration and effort to understand ELSI concerns; one for which more than one is true warrants such consideration all the more. Historical examples that have all of these attributes in some measure include genetic engineering and recombinant DNA research, and Chapter 2 highlights the current discussion of what synthetic biology, a similar kind of research, might produce and how its potential benefits are accompanied by a range of ethical, legal, and societal issues that its proponents have worked hard to address.
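The three attributes above amount to a simple additive screening rule: a technology satisfying any one attribute deserves special consideration, and each additional attribute satisfied makes it more fraught. As a purely illustrative sketch (the report does not prescribe any scoring scheme, and all names here are hypothetical), such a checklist might be expressed as:

```python
# Illustrative only: a minimal ELSI screening checklist based on the three
# attributes discussed above. Names and structure are hypothetical, not
# drawn from the report's framework.
from dataclasses import dataclass

@dataclass
class TechnologyProfile:
    name: str
    enabling_multiple_fields: bool    # relevant to many fields (enabling tech)
    large_scale_harm_potential: bool  # intended/unintended harm at large scale
    challenges_notions_of_life: bool  # "nonphysical" harm concerns

def elsi_fraughtness(profile: TechnologyProfile) -> int:
    """Count how many of the three attributes apply: 0 means less fraught;
    3 means the technology warrants the most special consideration."""
    return sum([profile.enabling_multiple_fields,
                profile.large_scale_harm_potential,
                profile.challenges_notions_of_life])

synthetic_biology = TechnologyProfile(
    name="synthetic biology",
    enabling_multiple_fields=True,
    large_scale_harm_potential=True,
    challenges_notions_of_life=True,
)
print(elsi_fraughtness(synthetic_biology))  # 3: highly fraught
```

Such a count is a conversation starter rather than a verdict; in the spirit of the framework, its purpose would be to flag technologies deserving deeper discussion, not to settle the question.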

5.4.4 Frequently Heard Arguments

Finally, it is helpful to address a number of frequently heard arguments about ethics as it applies to new military technologies. A common thread in the arguments discussed below is that they are often made with the intent or effect of cutting off debate or discussion about ethical issues.

An argument. U.S. adversaries are unethical, and so ethics should not be a constraint in using advanced weaponry against them. Moreover, because adversaries seek every advantage over the United States that they can obtain, the United States must do the same in any conflict with them.

Response. The United States has publicly committed to abide by certain constraints on how it engages in conflict regardless of how its adversaries behave; these commitments are embodied in domestic law that criminalizes violations of the Geneva Conventions by the U.S. armed forces and in certain treaties that the United States has signed and ratified. The real question is not whether we constrain ourselves ethically but how, under what circumstances, and with what decision-making procedures we do so.

An argument. U.S. adversaries will pursue all technological opportunities that serve their interests, and if the United States doesn’t pursue those opportunities as well, it will wind up being at a military disadvantage.

___________________

23 The full report can be found at http://www.synbioproject.org/process/assets/files/6334/synbio3.pdf.

Response. From the standpoint of decision makers, there is a world of difference between the possibility that technology X could provide military advantages and a clear demonstration that it does provide such advantages in specific and important operational scenarios; only the latter provides a proof of principle that technology X is worth a significant investment. This point suggests that it may sometimes make sense to separate two decisions: a preliminary decision to explore the value of a technology, and a more decisive one based on a demonstration of how the technology can confer military advantages.

An argument. We don’t know the significance of technology X, so we must work on it in order to understand its implications, and we would be unwise to give up on it without knowing if and how it might have value to the United States.

Response. This argument poses a false choice between ceasing all investigatory work on X and proceeding without any constraints at all. In fact, a variety of choices lie between these two extremes, the most significant of which is something along the lines of “proceed, but carefully.” Intermediate choices are addressed in Chapters 4 and 5 and in the recommendations made in Chapter 8.

An argument. Consideration of ethical, legal, and societal issues will slow the innovation process to an unacceptable degree.

Response. Although the argument is surely true in some cases, it is not true in all; whether it holds depends on the nature and extent of such consideration. Moreover, consideration of ethical, legal, and societal issues is hardly the only dimension of the military acquisition process that can slow it down. Finally, a small slowdown up front may well be worth the cost if it helps to prevent a subsequent explosion of concern that takes program managers by surprise.

An argument. Research on and development of defensive technologies and applications is morally justified, whereas work on offensive technologies is morally suspect.

Response. The categories of “offensive” and “defensive” technologies are not conceptually clear, because offensive technologies (that is, technologies that can kill or destroy) can be used for defensive purposes and, similarly, defensive technologies (that is, technologies that prevent or reduce death or destruction) can be used for offensive purposes. An example of the first is a defender’s use of an offensive weapon to destroy an incoming offensive weapon—in this case, the defender uses its offensive weapon to prevent or reduce the death and destruction that the attacker’s offensive weapon would otherwise cause. An example of the second is the use of a defensive system to protect an attacker that has launched a first strike—in this case, the attacker’s possession of a defensive system enables it to attack without fear of retaliation, thus increasing the likelihood that it will in fact attack. In short, the distinction between the two categories often fails in practice.

It should be stressed that the responses outlined above are not intended to dismiss any of the frequently heard arguments out of hand; each argument often contains at least a grain of truth worth considering. At the same time, those grains of truth should not be amplified to the point that they render discussion of ELSI considerations illegitimate. The short responses are intended essentially as points of departure for further dialogue.
