
3 Application Domains

Application domains are, by definition, associated with operational military problems. Solutions to these problems call for the application of various technologies, both foundational and specialized. Research and development work (applied research) on operational military problems is often classified.

This chapter addresses four application domains: robotics and autonomous systems, prosthetics and human enhancement, cyber weapons, and nonlethal weapons. For each, the relevant section provides a brief overview of the technologies relevant to that domain, identifies a few characteristic military applications within the domain, and addresses some of the most salient ethical, legal, and societal issues for that application domain. As with Chapter 2, the reader is cautioned that ELSI concerns are not handled uniformly from section to section—this lack of uniformity reflects the fact that different kinds of ethical, legal, and societal issues arise with different kinds of military/national security applications.

3.1 ROBOTICS AND AUTONOMOUS SYSTEMS

An autonomous system can be defined loosely as a system that performs its intended function(s) without explicit human guidance. The technology of autonomous systems is sometimes called robotics. Many such systems are in use today, for both civilian and military purposes, and more are expected in the future. And, of course, there are degrees of autonomy that correspond to different degrees and kinds of direct human involvement in guiding system behavior.

The overarching rationale for deploying such systems is that they might replace humans in performing militarily important tasks that are dangerous, tedious, or boring or that require higher reliability or precision than is humanly possible. If such replacement is possible, two consequences follow: (1) humans can be better protected and suffer fewer deaths and casualties as these important military tasks are performed, and (2) important military tasks can be performed with greater efficiency and effectiveness than if humans were directly involved.

3.1.1 Robotics—The Technology of Autonomous Systems

Computer systems (without the sensors and actuators) have always had a certain kind of “autonomous” capability—the term “computer” once referred to a person who performed computations. Today, many computer systems perform computational tasks on large amounts of data and generate solutions to problems that would take humans many years to solve.

For purposes of this report, an autonomous system (without further qualification) refers to a standalone computer-based system that interacts directly with the physical world. Sensors and actuators are the enabling devices for such interaction, and they can be regarded as devices for input and output. Instead of a keyboard or a scanner for entering information into a computer for processing, a camera or radar provides the relevant input, and instead of a printer or a screen for providing output, the movement of a servomotor in the appropriate manner represents the result of the computer’s labors.
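To make the analogy concrete, the following minimal sketch (in Python, with invented names; it is not drawn from the report) shows the sense-process-act loop in which a sensor stands in for an input device and an actuator stands in for an output device:

```python
import random

def read_camera() -> float:
    """Stand-in for a sensor: report the bearing to a detected object."""
    return random.uniform(-1.0, 1.0)

def drive_servomotor(command: float) -> None:
    """Stand-in for an actuator: the computed result becomes physical motion."""
    print(f"servo deflection: {command:+.2f}")

def control_step(gain: float = 0.5) -> None:
    bearing = read_camera()      # input arrives from a camera, not a keyboard
    command = -gain * bearing    # processing: steer so the object is centered
    drive_servomotor(command)    # output goes to a servomotor, not a printer

for _ in range(3):               # an autonomous system runs this loop continuously
    control_step()
```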

Autonomous systems are fundamentally dependent on two technologies—information technology and the technology of sensors and actuators. Both have developed rapidly. On the hardware side, the costs of processor power and storage have dropped exponentially for a number of decades, halving on the order of every 1 to 2 years. Sensors and actuators have also become much smaller and less expensive. On the software side, the technologies of artificial intelligence, statistical learning, and information fusion have advanced a long way as well, although at the cost of decreased transparency of operation in the software that controls the system.

Software that controls the operation of autonomous systems is subject to all of the usual problems regarding software safety and reliability—programming errors and bugs, design flaws, and so on. Flaws can include errors of programming (that is, errors introduced because a correct performance requirement was implemented incorrectly) or errors of design (that is, a performance requirement was formulated incorrectly or stated improperly).


To control an autonomous system, the software is programmed to anticipate various situations. An error of programming might be a mistake made in the programming that controls the response to a particular situation, even when that situation is correctly recognized. An error of design might become apparent when a system encounters a situation that was not anticipated, and as a result either does something entirely unexpected or improperly assesses the situation as one for which it does have a response, which happens to be inappropriate in that instance.
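As a hedged illustration (the scenario, sensor convention, and threshold are invented, not taken from the report), the two flaw classes might look like this for a controller required to brake whenever an obstacle is within 10 meters:

```python
def brake_with_programming_error(distance_m: float) -> bool:
    # Error of programming: the requirement ("brake within 10 m") is correct,
    # but the comparison was written backward, so the response to a correctly
    # recognized situation is wrong.
    return distance_m > 10.0

def brake_with_design_error(distance_m: float) -> bool:
    # Error of design: the code matches its stated requirement, but the
    # requirement never anticipated a sensor dropout, which this (invented)
    # sensor signals with the out-of-band value 9999.0. The system improperly
    # assesses the dropout as "nothing nearby" and fails to brake.
    return distance_m <= 10.0

print(brake_with_programming_error(5.0))   # False: fails to brake at 5 m
print(brake_with_design_error(9999.0))     # False: fails to brake on dropout
```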

Neuroscience may be an enabling technology for certain kinds of autonomous systems. Some neuroscience analysts believe that neuroscience will change the approach to computer modeling of decision making by disclosing the cognitive processes produced by millions of years of evolution, processes that artificial intelligence has to date been unable to capture fully. Such processes may become the basis for applications such as automatic target recognition. Even today, it is possible for automated processes to differentiate images of tanks from those of trucks, and such processes do not rely on neuroscience. However, neuroscience may contribute to an automated ability to make even finer distinctions, such as the ability to distinguish between friendly and hostile vehicles or even individuals.

In general, the logic according to which any complex system operates—including many autonomous systems—is too complex to be understood by any one individual. This is true for three reasons. First, multiple individuals may be responsible for different parts of the system’s programming, and they will not all be equally conversant with all parts of the programming. Second, the programming itself may be large and complex enough to make it very hard to understand all of how it works in detail. Third, the program may combine and process inputs (sometimes unique inputs that depend on the very specific circumstances extant at a given moment in time) in ways that no human or team of humans can reasonably anticipate. System testing is one mechanism that can provide some information about the behavior of the system under various conditions, but it is well understood that testing can only provide evidence of flaws and that it cannot prove that a system is without flaw.1
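The point about testing can be made concrete with a small invented example (not from the report): a test suite can only sample the input space, so it can reveal flaws but cannot demonstrate their absence.

```python
def clamp_throttle(x: float) -> float:
    """Intended behavior: confine a command to the range [0, 1]."""
    if x == 0.7303:              # a latent flaw triggered by one specific input
        return 1e6
    return max(0.0, min(1.0, x))

# Every sampled case passes, yet the system is not flaw-free.
for sample in [-1.0, 0.0, 0.25, 0.5, 0.9999, 1.0, 2.0]:
    assert 0.0 <= clamp_throttle(sample) <= 1.0
print("all tests passed, but the input 0.7303 was never exercised")
```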

It is worth noting that a flaw in the software controlling an autonomous system may be far more damaging than a flaw in software that does not control physical objects—in the latter case, a display may be in error (or indicate an error), whereas in the former case, the physical part of a system (such as a robotically controlled gun) may kill friendly troops.

___________________

1 National Research Council, Software for Dependable Systems: Sufficient Evidence?, The National Academies Press, Washington, D.C., 2007, available at http://www.nap.edu/catalog.php?record_id=11923. See also National Research Council, Summary of a Workshop on Software Certification and Dependability, The National Academies Press, Washington, D.C., 2004, available at http://books.nap.edu/catalog.php?record_id=11133. Real-time programming (the class of programming needed for robotics applications) is especially complicated by unanticipated “interaction” effects that are hard to detect by testing and also do not usually arise in non-real-time programming.

3.1.2 Possible Military Applications

Technologies for autonomous systems are the basis for a wide variety of real-world operational systems. Today, robots are available to clean pools and gutters, to vacuum and wash floors, and to mow lawns. Robotic dogs serve as personal companions to some children. Robots perform a variety of industrial assembly-line tasks, such as precision welding. A number of commercial robots have obvious military applications as well—robots for security patrolling at home have many of the capabilities that surveillance robots might need to help guard a military facility, and self-driving automobiles are likely to have much in common with self-driving military trucks. In a military context, robots also conduct long-range surveillance and reconnaissance operations, disarm bombs, and perform a variety of other functions, operating on land, in the air, and on and under the sea.

Perhaps the most controversial application of autonomous systems is equipping them with lethal capabilities that operate under human control. Even more controversial are systems whose lethal capabilities can be directed without human intervention. Some of these systems today include:2

• A South Korean robot that can deliver either a lethal or a nonlethal response and that has an automatic mode in which it is capable of making the engagement decision on its own.

• Packbots from iRobot that are capable of tasering enemy combatants; some are also equipped with the highly lethal MetalStorm grenade-launching system.

• The SWORDS platform, deployed in Iraq and Afghanistan, which can carry lethal weaponry (M240 or M249 machine guns, or a .50-caliber rifle). A newer version, the Modular Advanced Armed Robotic System (MAARS), is in development.

• Stationary robotic gun-sensor platforms that Israel has considered deploying along the Gaza border in automated kill zones, with machine guns and armored folding shields.

___________________

2 Ronald C. Arkin, unpublished briefing to the committee on January 12, 2012, Washington, D.C.; and Ronald C. Arkin, “Governing Lethal Behavior,” Proceedings of the 3rd International Conference on Human Robot Interaction, ACM Publishing, New York, 2008.


Are such systems new? In one sense, no. A simple pressure-activated mine fulfills the definition of a fully autonomous lethal system—it explodes without human intervention when it experiences a pressure exceeding some preprogrammed threshold. Other newer, fully autonomous systems are more sophisticated—the radar-cued Phalanx Close-In Weapons System for defense against antiship missiles and its land-based counterpart for countering rocket, artillery, and mortar fire are examples. In these latter systems, the fully autonomous mode is enabled when there is insufficient time for a human operator to take action in countering incoming fire.3

Other systems, such as the Mark 48 torpedo, are mobile—capable of moving freely (within a limited domain) and of searching for and identifying targets. A torpedo is lethal, but today human intervention is required to initiate weapons release. Much of the debate about the future of autonomous systems concerns the possibility that a system will deliberately initiate weapons release without a human explicitly making that decision.

Seeking to anticipate future ethical, legal, and societal issues associated with autonomous weapons systems, the Department of Defense promulgated a policy on such weapons in November 2012. This policy is described in Box 3.1.

3.1.3 Ethical, Legal, and Societal Questions and Implications

In some scenarios, the use of armed autonomous systems might not only reduce the likelihood of friendly casualties but also improve mission performance relative to typical or even possible human performance. For example, autonomous systems can loiter near a target without risk for much longer than is humanly possible, enabling them to collect more information about the target. With that information, a remote weapons operator can do a better job than a pilot flying an armed aircraft in the vicinity of the target of ascertaining the nature and extent of the likely collateral damage should he or she decide to attack; with such information, an attack can be executed in a way that does minimal collateral damage. A remote human operator—operating a ground vehicle on the battlefield from a safe location—will not be driven by fear for his or her own safety in deciding whether or not to attack any given target, and in this respect is more likely to behave in a manner consistent with the law of armed conflict than would a soldier in immediate harm’s way.

___________________

3 Clive Blount, “War at a Distance?—Some Thoughts for Airpower Practitioners,” Air Power Review 14(2):31-39, 2011, available at http://www.airpowerstudies.co.uk/APR%20Vol%2014%20No%202.pdf.


Box 3.1 Department of Defense Policy on Autonomy in Weapon Systems

Department of Defense Directive 3000.09, dated November 21, 2012, on the subject of “Autonomy in Weapon Systems” establishes DOD policy regarding autonomous and semi-autonomous weapon systems.

An autonomous weapon system is a weapon system that, once activated, can select and engage targets without further intervention by a human operator.

Human-supervised autonomous weapon systems are a subset of autonomous weapon systems: they are designed to select and engage targets without further human input after activation but nevertheless allow human operators to override operation of the weapon system and to terminate engagements before unacceptable levels of damage occur.

A semiautonomous weapon system is a weapon system that, once activated, is intended to engage only individual targets or specific target groups that have been selected by a human operator. In semiautonomous weapon systems, autonomy can be provided for engagement-related functions including, but not limited to, acquiring, tracking, and identifying potential targets; cueing potential targets to human operators; prioritizing selected targets; timing of when to fire; or providing terminal guidance to home in on selected targets. Semiautonomous systems also include fire-and-forget or lock-on-after-launch homing munitions that rely on tactics, techniques, and procedures to maximize the probability that only the individual targets or specific target groups explicitly selected by a human operator will be attacked. This provision allows weapons such as the United States Air Force Low Cost Autonomous Attack System loitering missile system to operate within a designated area in which only enemy targets are expected to be found.

Not covered by the policy are autonomous or semiautonomous cyber weapons, unguided munitions, munitions manually guided by operators, or mines.

The policy states that those who authorize the use of, direct the use of, or operate autonomous and semiautonomous weapon systems must do so in accordance with the laws of war, applicable treaties, weapon system safety rules, and the applicable rules of engagement (ROE). In addition, it directs that autonomous and semiautonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force. Autonomous and semiautonomous weapon systems should do the following:

• Function as anticipated in realistic operational environments against adaptive adversaries;

• Complete engagements in a timeframe consistent with commander and operator intentions and, if unable to do so, terminate engagements or seek additional human operator input before continuing the engagement; and

• Be sufficiently robust to minimize failures that could lead to unintended engagements or to loss of control of the system to unauthorized parties.

The policy permits the use of semiautonomous weapon systems to deliver kinetic or nonkinetic, lethal or nonlethal force in most combat situations, subject to the requirements described above regarding the laws of war and so on. The policy also permits the use of human-supervised autonomous weapon systems in local defense scenarios to select and engage (nonhuman) targets in response to time-critical or saturation attacks against manned installations and for onboard defense of manned platforms. (This provision allows systems such as the Phalanx Close-In Weapons System (CIWS) to operate in fully autonomous mode.) Last, it permits the use of autonomous weapon systems to apply nonlethal, nonkinetic force, such as some forms of electronic attack, against materiel targets.

The DOD does not currently possess autonomous weapon systems designed for use in scenarios other than those described in the previous paragraph. In the future, however, the acquisition of such weapon systems (that is, autonomous weapon systems designed for use in other scenarios) will be subject to two special additional reviews involving the Under Secretaries of Defense for Policy and for Acquisition, Technology and Logistics, and the Chairman of the Joint Chiefs of Staff. Before a decision to enter formal development, a review will ensure that the development plan meets the requirements of the policy described above. Before a decision to field such a weapon system, a review will ensure that the weapon to be fielded does meet those requirements and, further, that relevant training, doctrine, and tactics, techniques, and procedures are adequate to support its use.

Finally, in an acknowledgment that technology will inevitably evolve, the directive states that the policy will expire in 10 years (on November 22, 2022) if it has not been reissued, canceled, or certified current by November 22, 2017.

These advantages, including ethical ones, provide a strong incentive to develop and deploy autonomous systems. Despite such advantages, a variety of ELSI concerns have been raised about autonomous systems; these are discussed below.4

___________________

4 The concerns described below are drawn from a number of sources, including Patrick Lin, “Ethical Blowback from Emerging Technologies,” Journal of Military Ethics 9(4):313-331, 2010.

International Law

Autonomous systems—especially lethal autonomous systems—complicate today’s international law of armed conflict (LOAC) and domestic law as well. Some relevant complications include the following:

• Individual responsibility is one of the most important mechanisms for accountability under LOAC. However, an autonomous system taking an action that would be a LOAC violation if taken by a human being cannot be punished and is not “accountable” in any meaningful sense of the term. Behind the actions of that system are other actions of a number of human beings, who may include the system operator, those higher in the chain of command who directed that the system be used, the system developer/designer/programmer, and so on. How and to what extent, if any, are any of these individuals “responsible” for an action of the system?5

• How and to what extent can lethal autonomous systems distinguish between legitimate and illegitimate targets (such as civilian bystanders)? How and to what extent can such a system exercise valid judgment that “pulling the trigger” does not result in “excessive” collateral damage?

• How might autonomous systems contribute to a lowering of the threshold for engaging in armed conflict? Some analysts argue that the use of remotely operated lethal autonomous systems in particular emboldens political leaders controlling the use of such weapons to engage in armed conflict.6 The argument, in essence, is that nation X will be more likely to wage war against nation Y to the extent that nation X’s troops are not in harm’s way, as would be the case with weapons system operators doing their work from a sanctuary (e.g., nation X’s homeland) rather than in the field (that is, on the battlefield with nation Y’s troops). Under such a scenario, the use of force (that is, the use of such systems) is less likely to be a true act of last resort, and thus violates the “last resort” principle underlying jus ad bellum.

Impact on Users

The armed forces of the world have a great deal of experience with traditional combat, and still the full range of psychological and emotional effects of combat on soldiers is not well understood. Thus, there may well be some poorly understood psychological effects on soldiers who engage in combat far removed from the battlefield.

___________________

5 A military organization provides a chain of command in which some specific party is responsible for deciding whether a system or weapon is used, and if untoward things happen as the result of such use, the presumption is that this specific party is still responsible for the bad outcome. This presumption can be rebutted by various mitigating circumstances (e.g., if further investigation reveals that the weapon itself was flawed in a way that led directly to the bad outcome and that the responsible party had no way of knowing this fact).

6 See, for example, Peter Asaro, “Robots and Responsibility from a Legal Perspective,” Proceedings of the IEEE 2007 International Conference on Robotics and Automation, Workshop on RoboEthics, April 14, 2007, Rome, Italy, available at http://www.peterasaro.org/writing/ASARO%20Legal%20Perspective.pdf; Rob Sparrow, “Killer Robots,” Journal of Applied Philosophy 24(1):62-77, 2007; and Noel Sharkey, “Robot Wars Are a Reality,” The Guardian (UK), August 18, 2007, p. 29, available at http://www.guardian.co.uk/commentisfree/2007/aug/18/comment.military. Also cited in Patrick Lin, George Bekey, and Keith Abney, Autonomous Military Robotics: Risk, Ethics, and Design, California Polytechnic State University, San Luis Obispo, Calif., 2008.

For example, a 2011 report from the United States Air Force School of Aerospace Medicine, Department of Neuropsychiatry, on the psychological health of operators of remotely piloted aircraft and supporting units identified three groups of psychological stressors on these operators:7

• Operational stressors (those related to sustaining operations) include issues such as restricted working environments (e.g., ground control stations with limited freedom for mobility) and poor workstation ergonomics.

• Combat stressors (those that involve missions undertaken in direct support of combat operations) include the stresses induced in operators of remotely piloted vehicles who must manage their on-duty warrior role contemporaneously with the domestic responsibilities that arise from being stationed at home.

• Career stressors (those arising from the placement of individuals into positions requiring the flying of remotely piloted vehicles) include poorly defined career fields with uncertain career progression, especially for those who have previously qualified to pilot manned aircraft.

What is the psychological impact on a Navy pilot when a remotely piloted vehicle can land with ease on an aircraft carrier at night in a storm, or on a specialist in explosive ordnance disposal when a bomb disposal robot can disarm an improvised explosive device without placing the specialist at risk?8 How will such individuals demonstrate courage and skill to their superiors and colleagues when such technologies are available?

Humanity of Operators

In the context of armed remotely piloted vehicles (RPVs), concerns have been raised about psychological distancing of RPV operators from their targets. Quoting from a report of the UN Human Rights Council,9 “[B]ecause operators are based thousands of miles away from the battlefield, and undertake operations entirely through computer screens and remote audiofeed, there is a risk of developing a ‘Playstation’ mentality to killing.”

___________________

7 Wayne Chappelle et al., Psychological Health Screening of Remotely Piloted Aircraft (RPA) Operators and Supporting Units, RTO-MP-HFM-205, USAF School of Aerospace Medicine, Department of Neuropsychiatry, Wright-Patterson Air Force Base, Ohio, 2011.

8 Peter Singer describes individuals from the Foster-Miller Company in Waltham, Massachusetts, talking about the moment at which they decided to use robots for explosive ordnance disposal (EOD). Teams had received robots for EOD but were not using them. Then an incident occurred in which two EOD technicians were killed in Iraq, and the prevailing sentiment shifted quickly from “We leave the robots in the back of the truck” and “We don’t use them because we’re brave” to “You know what? We really do have to start using them.” See Robert Charette, “The Rise of Robot Warriors,” IEEE Spectrum, June 2009, available at http://spectrum.ieee.org/robotics/military-robots/the-rise-of-robot-warriors.

Others counter such notions by pointing out that killing at ever-larger distances from one’s target characterizes much of the history of warfare. Increasing the distance between weapons operator and target generally decreases the likelihood that the operator will be injured, and indeed there is no legal requirement that operator and target must be equally vulnerable.

Organizational Impacts

New technology often changes relationships within an organization. For example, the scope and nature of command relationships for the use of that technology are not arbitrary. Someone (or some group of individuals) specifies these relationships. Under what circumstances, if any, is an individual allowed to make his or her own decision regarding placement of a system into a lethal autonomous mode? Who decides on the rules of engagement, and how detailed must they be?

A second example of organizational impact is that autonomous systems reduce the need for personnel—in such an environment, what becomes of promotion opportunities, which traditionally depend in part on the number of personnel that one can command effectively? How do personnel needs affect the scale of financial resources required by an organization?

A third example is that a military organization built around the use of autonomous systems may be regarded differently from one organized traditionally. For example, it is worth considering the controversy over a proposal to introduce a new medal to recognize combat efforts of drone and cyber operators (Box 3.2). The proposal was intended to elevate the status of the operators, recognizing their increasing importance to modern combat. But the public reaction to the proposal reflected skepticism of the idea that a soldier who operates a drone or engages in cyber operations should be recognized and decorated in the same way as the soldier who risks his or her life in the actual theater of battle.

A final example of organizational impact is that autonomous systems raise questions regarding accountability. If an autonomous system causes inadvertent damage or death, who is accountable? What party or parties, for example, are responsible for paying punitive or compensatory damages? The party ordering the system into operation? The programmers who developed the controlling software? The system’s vendor? Is it possible for no one to be responsible? If so, why? What counts as sufficient justification?

___________________

9 Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Study on Targeted Killings, Human Rights Council, 84, UN Doc. A/HRC/14/24/Add.6, May 28, 2010, available at http://www2.ohchr.org/english/bodies/hrcouncil/docs/14session/A.HRC.14.24.Add6.pdf.

Box 3.2 The Distinguished Warfare Medal

In February 2013, then-Defense Secretary Leon Panetta proposed the “Distinguished Warfare Medal” to recognize drone operators and cyber warriors whose actions “contribute to the success of combat operations, particularly when they remove the enemy from the field of battle, even if those actions are physically removed from the fight.”1 While most agreed that electronic warriors deserve recognition for their contributions to war efforts, many were upset at the proposal that this medal would rank above the Bronze Star (awarded for heroic or meritorious acts of bravery on the battlefield) and the Purple Heart (awarded to soldiers who have been injured in battle). In addition, military decorations and recognition are important for promotions. The designation of the Distinguished Warfare Medal as higher than other medals awarded for physical valor in the theater of battle left many veterans feeling insulted and created a great deal of backlash from the Pentagon, veterans groups, and many members of Congress.

Shortly after taking office, Defense Secretary Chuck Hagel ordered a review of the new medal, resulting in a decision to replace the medal with a “distinguishing device” that would be placed on an existing medal to honor the combat achievements of drone and cyber operators. Such a distinguishing device would be similar to the “V” placed on the Bronze Star to indicate valor.

___________________

1 Lolita Baldor, “Pentagon Creates New Medal for Cyber, Drone Wars,” Associated Press, February 11, 2013, available at http://bigstory.ap.org/article/pentagon-creates-new-medal-cyber-drone-wars.

Technological Imperfections

Autonomous systems have been known to “go haywire” and harm innocents around them. Such problems obviously present safety issues. Moreover, how and to what extent are operators in the vicinity of an autonomous system entitled to know about possible risks? A pilot in an airplane that is partially out of control may be able to steer the airplane away from populated areas—what of the operator of a remotely piloted aircraft that is partially out of control? What are the responsibilities of the programmers of an RPV to prevent it from landing in a populated area?

Cybersecurity issues are also often overlooked in the rush to deploy first-generation technologies. In one instance, video feeds from RPV to operator were not encrypted, and adversaries could easily intercept the signals.10 In another instance, a group of university researchers took control of an unmanned aerial vehicle owned by their institution after the U.S. Department of Homeland Security asked them to demonstrate such a capability.11

Yet another issue is the ethical standard to which autonomous systems should be held. In particular, for any given dimension of performance, is it sufficient that they do better (on average) than humans do? Or should they be held to a much higher standard, perhaps one of near-perfection? Although the first (weaker) standard is an instance of technology enabling a greater degree of ethical behavior on the battlefield, an ethically questionable action by an autonomous system will nonetheless prompt criticism of the system’s autonomy as flawed and ethically improper—even if the system has built up a long record of ethically appropriate performance.

Adversary Perceptions and Use

To the extent that new technologies bring overwhelming advantages against an adversary, the adversary may well respond with behavior that we might regard as improper or unethical; for example, the adversary may use tactics (such as the use of civilians as human shields for military targets) that violate the laws of war. (Indeed, adversaries may use such tactics even without U.S. use of new technologies—but at the very least the new technologies may provide a post hoc justification for unethical tactics.)

In the case of armed remotely piloted vehicles, concerns have been raised that such use enables the insurgent adversary to cast itself in the role of underdog and the West as a cowardly bully that is unwilling to risk its own troops but is happy to kill remotely.12 Furthermore, and regardless of their perceptions of the United States, adversaries may also want to acquire and use such vehicles themselves. For example, terrorists could use small drones for assassination, and such drones could easily be used on U.S. soil.

___________________

10 Siobhan Gorman, Yochi J. Dreazen, and August Cole, “Insurgents Hack U.S. Drones,” Wall Street Journal Online, December 17, 2009, available at http://online.wsj.com/article/SB126102247889095011.html.

11 “Texas College Hacks Drone in Front of DHS,” RT.com, June 28, 2012, available at http://rt.com/usa/news/texas-1000-us-government-906/.

12 See paragraph 519 in Ministry of Defence, Joint Doctrine Note 2/11: The UK Approach to Unmanned Aircraft Systems, 2011, available at http://dronewarsuk.files.wordpress.com/2011/04/uk-approach-to-uav.pdf/.

Civilian Uses

Autonomous systems have a number of civilian applications. Law enforcement authorities can and do make use of RPVs for surveillance and of bomb disposal robots. Truck and car driving can now be automated under many circumstances,13,14 although such driving is not common today. Unpiloted airplanes may soon be used for transporting cargo. And criminals have used remotely piloted vehicles to remove stolen property from the site of a crime.15

Law enforcement authorities act domestically, and within the continental United States a variety of legal protections operate that do not apply overseas. Using technologies originally developed for military application (and in particular for use against non-U.S. citizens outside the borders of the United States) within the United States (e.g., for border surveillance or locating fleeing fugitives) raises a host of potential issues related to civil liberties. The issue is not so much whether these military systems can be usefully and practically employed to assist domestic law enforcement authorities (they do have potential value for certain applications) as what the scope, nature, extent, and conditions of such use should be. Put differently, the use of military systems in a domestic context raises ethical, societal, and policy questions that remain largely open at the time of this writing.

The law enforcement issues are only one policy element of domestic use. For example, liability issues concerning autonomous trucks and cars (technology for which was developed in part by DARPA) have yet to be worked out in any systematic way, at least in part because the authors of today’s laws did not contemplate such vehicles. Various regulatory issues related to the safe operation of autonomous vehicles (specifically, RPVs) are in the process of being addressed at the time of this writing.16 Finally, what of the use of such technologies by private citizens to spy on each other or to perform independent environmental monitoring?17

___________________

13 “Preparing for DARPA’s Urban Road Challenge,” Cnet.com, January 26, 2007, available at http://news.cnet.com/Preparing-for-DARPAs-urban-road-challenge/2100-11394_3-6153932.html.

14 “Google Driverless Cars: Genius or Frightening Folly,” Electricpig.co.uk, October 11, 2010, available at http://www.electricpig.co.uk/2010/10/11/google-driverless-cars-genius-or-frightening-folly/.

15 Singer reports on a Taiwanese gang that used tiny helicopters with pinhole cameras to carry out a jewelry heist and got away with $4 million in jewels. See “More Countries, Organizations Seeking to Use Aerial Drones for Peaceful, Nefarious Purposes,” October 26, 2011, available at http://www.pri.org/stories/science/technology/more-countries-organizations-seeking-to-use-aerial-drones-for-peaceful-nefarious-purposes-6639.html.

3.2 PROSTHETICS AND HUMAN ENHANCEMENT

To date, prostheses have been developed to replace lost bodily function, but in principle they could be developed to enhance human functions—physical functions such as lifting strength and running speed, and sensory functions such as night vision and enhanced smell.

3.2.1 The Science and Technology of Prosthetics and Human Enhancement

Prostheses are devices that are intended to replace missing human body parts. The discussion below focuses on prostheses that replace body parts that serve physical functions, such as vision or locomotion. Neural prostheses are addressed in the Chapter 2 section on neuroscience.

All prostheses have two components—an assembly (which may be biological and/or electromechanical in nature) and an interface to the human body to which the prosthesis is attached. The assembly replaces the missing part’s function and usually has several components (a structural sketch in code follows the list below):

• Sensors that provide information to the body about the assembly’s behavior, configuration, and state.

• Receivers that accept information from the body and thus provide guidance to the assembly about the body’s intention for the assembly.

• Actuators that produce the output of that assembly—forms of output are sometimes electrical (as in the case of a prosthesis for a sensory organ) or mechanical (as in the case of a prosthesis for a limb).

• A processing unit that controls the assembly’s operation.
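As a minimal structural sketch (with invented names and placeholder logic, not drawn from the report), the assembly and interface might be modeled like this:

```python
from dataclasses import dataclass

@dataclass
class Assembly:
    """The replacement part: sensors, receivers, actuators, processing unit."""
    sensor_reading: float = 0.0    # sensors: report the assembly's state to the body
    intended_motion: float = 0.0   # receivers: accept the body's intention
    actuator_output: float = 0.0   # actuators: mechanical or electrical output

    def process(self) -> None:
        # Processing unit: turn the decoded intention into an actuator command.
        self.actuator_output = 0.8 * self.intended_motion
        self.sensor_reading = self.actuator_output

class Interface:
    """Bidirectional link to the nervous system; decoding intention and
    encoding feedback is the poorly understood part discussed below."""
    def decode_intention(self, neural_signal: float) -> float:
        return neural_signal       # placeholder for the hard decoding problem
    def encode_feedback(self, sensor_reading: float) -> float:
        return sensor_reading      # placeholder for the hard encoding problem

limb = Assembly()
link = Interface()
limb.intended_motion = link.decode_intention(0.5)      # body -> assembly
limb.process()
feedback = link.encode_feedback(limb.sensor_reading)   # assembly -> body
```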

The interface transmits information from the assembly’s sensors to the body’s nervous system and from the nervous system to the assembly. But information flows in the human body are not encoded in forms that are well understood with today’s science. Today, a key factor limiting the development of prostheses—at least prostheses that are integrated into the human body and used in a highly natural way—is understanding these information flows: how to interpret the signals from the nervous system that indicate intentionality, and how to translate sensor information into forms that the human nervous system can usefully process.18

___________________

16 For example, the FAA Modernization and Reform Act of 2012 calls on the FAA to fully integrate unmanned systems, including for commercial use, into the national airspace by September 2015.

17 Siobhan Gorman, “Drones Get Ready to Fly, Unseen, into Everyday Life,” Wall Street Journal, November 3, 2010, available at http://online.wsj.com/article/SB10001424052748703631704575551954273159086.html.

As a general rule, today’s state of the art does not yield prosthetic devices that function nearly as effectively as the human parts they replace. For example, one state-of-the-art visual prosthesis enables a large number of its users to read large-font type and sometimes to recognize words.19 Considering that these individuals were previously unable to read at all, such a prosthesis is remarkable, but no one would argue that it comes close to being a serious replacement for a lost human eye.

3.2.2 Possible Military Applications

To date, prosthetic devices are under development only for the replacement of lost human function (e.g., a prosthetic limb), and as noted above, they are far from achieving such functionality. But there is no reason in principle that they cannot be designed to exceed human capabilities. Visual prostheses could be designed to see infrared light or to provide telescopic vision. Aural prostheses could be designed to provide better-than-normal hearing. A powered arm or leg prosthesis could be designed to have significantly greater strength than a human arm or leg. Some DARPA efforts have focused explicitly on human enhancement (e.g., increased strength,20 improved cognition,21 lowered sleep requirements22).

If the constraint on integration into the human body is relaxed, devices that replace and even augment human function—devices that have already been designed and tested, although they are not available for widespread use today—could come into use. For example, exoskeletons have been developed that can help disabled wheelchair-bound individuals leave their wheelchairs behind. Other exoskeletons have been developed to enable individuals to lift much heavier loads than would be possible for unassisted individuals. These latter devices have not been designed for use in direct combat—rather, they enable soldiers in the field to move and handle heavy logistic burdens more easily.

___________________

18 A second limiting factor is the energy storage capacity of reasonably sized batteries.

19 Lyndon da Cruz et al., “The Argus II Epiretinal Prosthesis System Allows Letter and Word Reading and Long-Term Function in Patients with Profound Vision Loss,” British Journal of Ophthalmology 97(5):632-636, 2013, available at http://bjo.bmj.com/content/early/2013/02/19/bjophthalmol-2012-301525.full.

20 See http://www.darpa.mil/Our_Work/DSO/Programs/Warrior_Web.aspx.

21 Mark St. John et al., “Overview of the DARPA Augmented Cognition Technical Integration Experiment,” 2007, available at www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA475406.

22 Sam A. Deadwyler et al., “Systemic and Nasal Delivery of Orexin-A (Hypocretin-1) Reduces the Effects of Sleep Deprivation on Cognitive Performance in Nonhuman Primates,” Journal of Neuroscience 27(52):14239-14247, 2007, available at http://www.jneurosci.org/content/27/52/14239.abstract.

3.2.3 Ethical, Legal, and Societal Questions and Implications

In a nonmilitary context, ethical, legal, and societal issues regarding prosthetics and human enhancement technology span a wide range, and some if not most of these issues spill over into the military context. Such issues include (but are not limited to):

• Exacerbation of economic inequalities due to the high cost of prostheses.

• Damage to solidarities and/or culture based on a group’s common experience with lost human function (as is the concern of many in the deaf community).

• Reducing the importance and value of human effort in improving human function (a particularly important point when considering enhancements). If anyone can become very fast, or very strong, or very smart simply by using a prosthetic device, how should we regard an individual who has expended a great deal of personal effort to become faster, stronger, or smarter?

The remainder of this section addresses a number of ethical, legal, and societal issues related to prosthetics and human enhancement that emerge in the military context.23

International Law

The Martens clause, contained in the 1977 Additional Protocol to the Geneva Conventions, in essence prohibits weapons whose use would violate the laws of humanity and the requirements of the public conscience. Established to ensure that weapons not explicitly covered by the conventions were not thereby automatically permitted, the Martens clause is broadly recognized as having no accepted interpretation. Nevertheless, some analysts argue that the existence of the Martens clause raises the issue of whether a highly enhanced human soldier engaging in combat might himself be such a weapon.24

___________________

23 Patrick Lin, “More Than Human? The Ethics of Biologically Enhancing Soldiers,” The Atlantic, February 16, 2012, available at http://www.theatlantic.com/technology/archive/2012/02/more-than-human-the-ethics-of-biologically-enhancing-soldiers/253217/.

Safety and Other Effects on the Recipients of Enhancements

Traditional biomedical ethics comes into play any time a foreign object or substance is introduced into the human body, and safety is one of its primary concerns. But when the body in question is that of a soldier—especially one who may go into combat and who functions within a military chain of command—how and to what extent, if any, should concerns about personal safety be weighed against the battlefield advantages that an enhancement may afford? And what happens if the enhancement is still in its early developmental stages, when the safety risks may be only poorly understood?

Safety risks may be compounded by exposure to cybersecurity threats. To the extent that these devices depend on information technology, they may be subject to cyber attacks that alter their function in dangerous ways or cause them to malfunction. Privacy, too, is an issue—how and to what extent are data associated with the use of these devices sensitive? Do such data constitute personal health information that requires special protection?

Reversibility is an ELSI concern as well. Can any deleterious effects of an enhancement on the human body be reversed by removing the prosthesis from the body? Should an enhancement be removed when a soldier leaves military service?

Last, what are the psychological effects of human enhancements that are integrated into the human body? How and to what extent, if any, do they change an individual’s conception of himself or herself? How long-lasting are such changes? What is the significance of such changes? Might enhanced soldiers take more personal risks? And how will unenhanced soldiers react to the availability of enhancements for others? For example, will unenhanced soldiers demand them for their own use?

Organizational Issues

How and to what extent, if at all, should a military organization regard enhanced soldiers differently from unenhanced soldiers? For example, what of:

• Expectations for combat behavior,

• Rates of promotion and decoration,

• Integration into existing military units,

• Needs for rest and recuperation, and

• Terms of service in the armed forces.

___________________

24 See Patrick Lin, Maxwell J. Mehlman, and Keith Abney, Enhanced Warfighters: Risk, Ethics, and Policy (Greenwall Report), California Polytechnic State University, San Luis Obispo, Calif., 2013, pp. 34-35, available at http://ethics.calpoly.edu/Greenwall_report.pdf.

Civilian Use

Use of prosthetic and enhancement technologies in the civilian sector raises a number of ethical, legal, and societal issues.

For individuals transitioning from military to civilian life, policy makers must ask whether prosthetic and enhancement technologies acquired in the military will remain with the individual. In some cases (e.g., prosthetic limbs that replace lost human function), there may be a social contract that allows these individuals to retain these devices. But should retiring soldiers be allowed to keep devices that enhance their performance? How well will such individuals integrate with civilian society?

Prosthetic and enhancement technologies also move the traditional boundaries separating disability from normal function and normal function from enhanced function—and sometimes certain legal categories are based on traditional boundaries. For example, being a member of a certain legal class (e.g., those individuals regarded as blind or deaf) may be an entitlement gateway for certain benefits; how, if at all, should prosthetic technology change an individual’s eligibility for those benefits?

Implanted devices retained by individuals may also subject them to certain restrictions, ranging from increased screening at airports to restricted travel to countries that may be on some “no-export” list. And do individuals actually own their prosthetic devices, in the sense of being allowed to control all uses of such a device? (For example, could they themselves modify it?)

Unanticipated Effects

In his presentation to the committee, Nick Agar of the Victoria University of Wellington introduced the notion that human enhancement technologies might have priming effects on their users. He illustrated the point by describing research on implicit memory effects—subtle and unconscious effects of prior stimuli on human behavior—citing the example of people reading lists of adjectives describing stereotypical attributes of the elderly and then displaying behaviors of the elderly such as stooped walking.25 In the case of enhancement technologies, Agar speculated that the priming effect might be driven by the stimuli of the technology’s function. For example, a prosthetic limb designed in part to serve as a weapon might have a subtle, ongoing priming effect on its bearer that makes him or her more aggressive.

3.3 CYBER WEAPONS

Cyber weaponry opens up a new dimension of warfare: it can target the critical infrastructure on which society will increasingly depend, and it generates vast increases in the cost both of defending systems and of developing countervailing attack technologies.

3.3.1 The Technology of Cyber Weapons

Cyber weapons are configurations of information technology (either hardware or software) that can be used to affect an adversary’s information technology systems and/or networks. Because such weapons are fundamentally based on today’s information technology, experts in the field understand the basic technological building blocks of cyber weapons well. That is, there are no “new” technologies that contribute uniquely to cyber weaponry, although new ways of using more mature technologies can certainly emerge. Furthermore, nonstate actors (e.g., terrorists, criminals, random hackers) can develop and/or use certain cyber weapons.

Cyber weapons gain their power and sophistication from two facts. First, the basic technological building blocks can be arranged in many different ways, and those arrangements are limited only by human creativity and ingenuity. Second, cyber weapons are generally designed to target systems that are complex and thus have many failure modes.

These two facts mean that cyber weapons can operate through mechanisms that are quite surprising and difficult to understand, and can thus appear to involve entirely novel capabilities (sometimes looking like “magic” to an uninitiated observer). In practice, these mechanisms will almost always take advantage of sometimes obscure or subtle weak points (that is, vulnerabilities) in a system or the socio-technical organization in which the system is embedded.

In addition, cyber weapons can be designed to be highly discriminating or highly indiscriminate in their targeting. As a general rule, highly discriminating cyber weapons (that is, weapons that affect only their specified targets and nothing else) are more difficult to design and implement than weapons that are more indiscriminate. Highly discriminating weapons also require a great deal of intelligence support for their use—and in the absence of adequate intelligence, the effects of using even a highly discriminating cyber weapon may cascade if previously unknown elements are connected (directly or indirectly) to the targeted system.

___________________

25 John A. Bargh, Mark Chen, and Lara Burrows, “Automaticity of Social Behavior: Direct Effects of Trait Construct and Stereotype-Activation on Action,” Journal of Personality and Social Psychology 71(2):230-244, 1996.

3.3.2 Possible Military Applications

Cyber weapons can be used to compromise the confidentiality of information, the integrity of information or software/programming, or the availability of IT-based services to the user, and also to forge authenticity (each category is illustrated in the code sketch after the list below):26

• Breaching the confidentiality of information refers to the ability to obtain from the targeted IT system information that the rightful owner or operator of that system would prefer to keep confidential. For example, an adversary listens to a Wi-Fi connection between a computer and a base station and is able to capture the data stream between them.

• Compromising the integrity of computer-represented data refers to changing or destroying information that its rightful owner wishes to keep intact. That data may be input to computer programs or machine-readable programs themselves. For example, a computer virus can erase all of the files on a user's hard drive.

• Denying the availability of IT-based services to users refers to preventing a user from obtaining the full value of his or her interactions with the computer. If the user finds the computer too slow to respond, or that it does not respond at all, availability has been denied. For example, a denial-of-service attack on an important Web site keeps legitimate and authorized users from accessing the services it provides.

• Forging authenticity refers to undermining the assurance that a message or transaction originated from the party claiming to have originated it. Forgery leads the receiver of the message or the other party in a transaction to believe that the sender or first party is who he claims to be, even if that is not true.
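As a purely illustrative complement to the integrity and authenticity items above, the sketch below shows the defender's side of those two properties; the key and messages are hypothetical, not drawn from this report. A cryptographic hash reveals whether data have been altered, and a keyed HMAC additionally ties the data to a secret key, so that an attacker who alters the data cannot also produce a valid authenticity tag.

```python
import hashlib
import hmac

SECRET_KEY = b"shared-secret"  # hypothetical key held by sender and receiver

def integrity_digest(data: bytes) -> str:
    # Any change to the data, however small, changes the SHA-256 digest,
    # exposing a compromise of integrity.
    return hashlib.sha256(data).hexdigest()

def authenticity_tag(data: bytes) -> str:
    # An attacker who alters the data cannot recompute a valid tag
    # without the secret key, exposing a forgery.
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

original = b"status report: all systems nominal"
tampered = b"status report: all systems FAILING"

assert integrity_digest(original) != integrity_digest(tampered)
assert not hmac.compare_digest(authenticity_tag(original),
                               authenticity_tag(tampered))
```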

Cybersecurity analysts distinguish between cyber exploitation and cyber attack. Cyber exploitation refers to activities involving the first bulleted item above (breaching confidentiality), cyber attack to activities involving the second, third, and fourth items above (compromising integrity, denying availability, forging authenticity). Many policy makers today believe that cyber exploitations conducted against the United States are a major threat to its economic security, and perhaps even more significant than traditional military threats.

___________________

26 This discussion of cyber weapons borrows liberally from National Research Council, Technology, Policy, Law, and Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities, The National Academies Press, Washington, D.C., 2009.

Cyber weapons can cause temporary damage or permanent damage. Some examples of temporary damage include denial-of-service attacks; operations that take advantage of bugs in a target system, causing a machine to crash and reboot at critical times (but leaving it otherwise unharmed); attacks that change the configuration of a system (e.g., to give false credentials that allow an intruder to gain access), and so on.

Examples of permanent damage include injection of commands into database queries to delete or alter data in the database, modification of programs to cause subtle and slow changes in databases such that all of the user’s backup files are corrupted and hence the entire database becomes unrecoverable for all practical purposes, and programs that destroy hardware (e.g., by repeatedly writing flash memories in a way that uses up their limited write cycles).

Another class of attacks targets not the computers per se but the physical devices that may be controlled by those computers. Computers often control equipment such as ultracentrifuges or refrigerators or diesel generators, and by introducing faulty programming into the computer controllers of the targeted equipment, it is possible to destroy or damage such equipment.27 Furthermore, it is sometimes possible to compromise the controlling computers in such a way that reinstallation of all of the original software does not restore the computer to its original state—that is, only a replacement of the corrupted computer would suffice to restore the controller to its original state.

A different class of attacks is designed not so much to reduce the actual functionality of the targeted IT systems or networks as to undermine the user's confidence in them or ability to use them. For example, a user can lose confidence in a system even if the actual damage to the system is relatively minor. (A calculator may provide an accurate answer to a given addition 99.9 percent of the time, but if the user does not know the precise circumstances under which it provides an inaccurate answer, he may well refrain from using it for any calculation at all.) Or an attack on an adversary's primary IT system may force him to use a backup system, which may well have less functionality or which the adversary may use less effectively.
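The calculator example can be quantified with a short illustrative computation (the error rate is the one hypothesized above; nothing here is measured data): even a 99.9 percent accurate device becomes untrustworthy once many results must all be correct.

```python
# Illustrative arithmetic for the "99.9 percent accurate calculator":
# with a small, unpredictable silent-error rate, the chance that a run
# of calculations is entirely correct collapses as the run grows.
ERROR_RATE = 0.001

for batch in (10, 100, 1000):
    p_all_correct = (1 - ERROR_RATE) ** batch
    print(f"{batch:>4} calculations: P(all correct) = {p_all_correct:.1%}")
# ->   10 calculations: P(all correct) = 99.0%
# ->  100 calculations: P(all correct) = 90.5%
# -> 1000 calculations: P(all correct) = 36.8%
```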

A cyber weapon of special power and significance is the botnet. Botnets are arrays of compromised computers connected to the Internet that are remotely controlled by the attacker. The attack value of a botnet arises from the sheer number of computers that an attacker can control—often tens or hundreds of thousands and perhaps as many as a million. Since all of these computers are under one party's control, the botnet can act as a powerful amplifier of an attacker's actions. Although botnets are known to be well suited to certain denial-of-service attacks, their full range of possible utility has not yet been examined.

___________________

27 For example, the Stuxnet computer worm, first discovered in June 2010, was aimed at disrupting the operation of Iran's uranium enrichment facilities. See http://topics.nytimes.com/top/reference/timestopics/subjects/c/computer_malware/stuxnet/index.html.
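A rough sense of the amplification a botnet provides can be had from a back-of-the-envelope calculation; the figures below are hypothetical assumptions chosen only for illustration, not estimates from this report.

```python
# Back-of-the-envelope illustration of botnet amplification.
# Both numbers are illustrative assumptions.
bots = 100_000        # compromised machines under a single party's control
per_bot_mbps = 2      # modest upstream bandwidth per machine, in Mbit/s

aggregate_gbps = bots * per_bot_mbps / 1_000
print(f"Aggregate capacity: {aggregate_gbps:,.0f} Gbit/s")
# -> Aggregate capacity: 200 Gbit/s, far more traffic than most single
#    organizations' Internet connections can absorb.
```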

3.3.3 Ethical, Legal, and Societal Questions and Implications28

The use of cyber weapons in conflict as a deliberate instrument of national policy raises a variety of ethical, legal, and societal issues.

International Law

Although the United States has stated its view that the law of armed conflict (LOAC) applies to cyberspace,29 this view has not been explicitly endorsed by all of the signers of the Geneva and Hague Conventions or the UN Charter. In addition, cyber warfare raises a variety of questions about how to interpret LOAC in any given scenario involving the use of cyber weapons.30 Moreover, even if LOAC does not apply in any given scenario, the principles underlying LOAC may still be relevant to the ethics of using cyber weapons in that scenario.

For example, the laws of war address the circumstances under which the use of force can be legally justified (also known as jus ad bellum and further discussed in Chapter 4). Some of the underlying principles include the following:

• Assignment of responsibility for a hostile act to the appropriate nation. In a cyber context, it may be difficult to ascertain the identity of the responsible nation. In some (perhaps many) cases, a hostile cyber operation may have been perpetrated by a subnational group, and the responsible party may not be a nation at all. This point suggests that it may be very hard to know the party against which a response should be targeted, or whether international law per se is even applicable.

___________________

28 See, for example, Patrick Lin, "Robots, Ethics, & War," The Center for Internet and Society at Stanford Law School, December 15, 2010, available at http://cyberlaw.stanford.edu/blog/2010/12/robots-ethics-war.

29 “International Law in Cyberspace,” remarks of Harold Hongju Koh, legal advisor of the U.S. Department of State, to the USCYBERCOM Inter-Agency Legal Conference, Ft. Meade, Md., September 18, 2012, available at http://www.state.gov/s/l/releases/remarks/197924.htm.

30 The most comprehensive source on this topic is Michael Schmitt (ed.), Tallinn Manual on the International Law Applicable to Cyber Warfare, available at http://www.nowandfutures.com/large/Tallinn-Manual-on-the-International-Law-Applicable-to-Cyber-Warfare-Draft.pdf.

• The fuzziness of the lines between cyber crime and cyber war, the former being a law enforcement matter and the latter being a matter of national security. Moreover, because the damage from an individual cyber attack can be very small, the precise point at which a set of many cyber attacks becomes a national security issue may be unclear.

The laws of war also address how opposing forces must behave in the conduct of conflict (known as jus in bello and further discussed in Chapter 4). Some of the principles include the following:31

• Differentiation between military and civilian targets. In general, ethical considerations suggest that only military entities should be targeted. A party aiming kinetic weapons often (indeed, usually) has reasonably direct confirmation that a given target is indeed military. But how does a cyber targeter know that a given computer is indeed a military computer? Any kind of computer can sit at a given Internet Protocol (IP) address, and the IP address assigned to a given computer is not necessarily static. In the absence of a machine-readable indication that any given computer is in fact a military computer, an intelligence collection effort must be undertaken to determine the extent to which the computer has military purposes (see the illustrative sketch following this list). What evidence and what degree of certainty in the intelligence information are sufficient to make a determination that a given computer is a valid military target?

• Avoidance of collateral damage. A second principle is that in attacking military targets, targeters should seek to avoid accidental, inadvertent, or undesired harm to civilians and their property. But a cyber attack may inflict damage on some civilian computers. What consideration should such damage receive in attack planning, especially if it does not result in death or physical destruction? Moreover, given that the success of many cyber attacks depends on good intelligence about their targets, how should commanders estimate likely collateral damage when good intelligence about newly discovered cyber targets is sparse?

• Cease-fire and conflict termination. What constitutes a cease-fire in cyberspace between two adversaries? How can the two sides in a cyber conflict negotiation meaningfully demonstrate their commitment to a cease-fire?

___________________

31 For further discussion, see Chapter 7 of National Research Council, Technology, Policy, Law, and Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities, The National Academies Press, Washington, D.C., 2009.
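To illustrate the identification problem flagged in the first bullet above, the hypothetical sketch below queries the only machine-readable label an IP address ordinarily carries, its reverse DNS name. That name is voluntary, frequently absent, and controlled by whoever runs the relevant DNS zone, so it is weak evidence at best about whether a host is military.

```python
import socket

def describe_host(ip: str) -> str:
    # Reverse DNS is self-reported and unverified: a hostname proves
    # little about a machine's actual role or ownership.
    try:
        hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
        return f"{ip} -> {hostname} (self-reported, unverified)"
    except (socket.herror, socket.gaierror):
        return f"{ip} -> no reverse DNS record (role unknown)"

# 192.0.2.1 is a documentation-only address (RFC 5737); real target
# characterization would require independent intelligence, not a lookup.
print(describe_host("192.0.2.1"))
```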

Outside the existing law of armed conflict, cyber weapons introduce the possibility that international law could or should evolve to manage new kinds of harm that might be caused through the use of such weapons. Neither LOAC nor any other international law prohibits the conduct of espionage. But this legal tradition evolved before deployments of information technology made it possible to find and exfiltrate much larger volumes of information, and in an era when information is a key coin of the realm, the large-scale exfiltration of important information from a nation surely raises a number of ethical, legal, and societal issues. Should exfiltration of information continue to be legal? If not, what kinds of and how much information exfiltration should be allowed under what circumstances? How might exfiltration be regulated or rules regarding exfiltration be enforced?

Domestic Law

The United States Code includes Title 10, which relates to military matters; Title 50, which relates to intelligence matters; and Title 18, which relates to law enforcement and criminal matters. But the nature of cyber weaponry is that military forces, intelligence agencies, and law enforcement agencies can all find value in the use of cyber weapons under certain circumstances, and the separate legal frameworks of Title 10, Title 50, and Title 18 inevitably leave gaps or result in a lack of clarity about which agencies of the U.S. government should take the lead regarding the use of cyber weapons in any given situation.

As one example of gaps in current domestic law, private-sector entities are prohibited by Title 18 from engaging in offensive operations in cyberspace to protect themselves. Whether or not this policy is wise and appropriate for the nation is subject to debate—but it is manifestly clear that current law forbidding private parties to engage in self-help in cyberspace was formulated many years before the issue attained its current significance.

Civilian Uses

Civilian users have plausible and legitimate uses for cyber weapons. The most common purpose is developing and testing cyber defenses. Penetration testing—a legitimate activity in which civilian enterprises test their cyber defenses for resistance to cyber attack—demands the use of cyber weapons comparable to those that might be used in a real attack. And the development of defenses against particular cyber attacks requires having the appropriate cyber weapons available for use in the development environment.
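A minimal sketch of this dual-use point follows (the host and port range are illustrative; this is not a tool from the report): the same reconnaissance step that a penetration tester runs against systems he or she is authorized to probe also appears in genuine attacks.

```python
import socket

def open_ports(host: str, ports: range, timeout: float = 0.5) -> list[int]:
    # Attempts a TCP connection to each port; connect_ex returns 0
    # when the connection succeeds, i.e., the port is open.
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:
                found.append(port)
    return found

# Scanning one's own machine is a legitimate self-test; the identical
# probe aimed at someone else's system would be attack reconnaissance.
print(open_ports("127.0.0.1", range(1, 1025)))
```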

Organizational Impacts

The use of cyber weapons as an instrument of government policy has many organizational implications. For example, organizations established to use cyber weapons must consider matters such as training; liability for any use of such weapons that harms innocent parties; recruitment (how to obtain personnel skilled in the use of such weapons who can be trusted to use them in the service of legitimate government goals); command and control and rules of engagement (how and under what circumstances cyber "shooters" receive orders to use their weapons, and whose authority is needed to issue such orders); and identification friend-or-foe (the process by which legitimate cyber targets are identified).

Adversary Perceptions

The alleged U.S. use of cyber weapons (alleged because such use has not been publicly acknowledged by the U.S. government) against Iran (the Stuxnet worm, as described in Footnote 27) has spawned concerns that cyber weapons released “into the wild” and then used against adversary targets will redound against U.S. interests in several ways. The first concern is that the use of such weapons by the United States legitimates them as an instrument of international conflict, and increases the likelihood that other nation-states will use them against the United States in a future conflict or disagreement. A second concern is that such use flies in the face of long-standing U.S. policy pronouncements about the value of a secure Internet environment for the entire world. Last, there is a concern that the code—the actual programming—can be reverse-engineered and then used by adversaries to develop cyber weapons of their own.

3.4 NONLETHAL WEAPONS

The U.S. Department of Defense defines “nonlethal weapons” as “weapons … designed and primarily employed to incapacitate targeted personnel or materiel immediately, while minimizing fatalities, permanent injury to personnel, and undesired damage to property in the targeted areas or environment. Non-lethal weapons are intended to have reversible effects on personnel or materiel.” Other terms used to refer to similar weapons include “less lethal,” “less than lethal,” “prelethal,” and “potentially lethal.”

3.4.1 The Technology of Nonlethal Weapons

The general class of nonlethal weapons includes a wide variety of technologies:

• Kinetic weapons are decidedly low-tech—bean-bag rounds for shotguns and rubber bullets for pistols have been used for a long time.

• Barriers and entanglements can be used to stop land vehicles moving at high speed (such as a car trying to speed through a checkpoint) or to damage propellers of waterborne craft.

• Optical weapons (e.g., dazzling lasers) use bright light to temporarily blind an individual—the individual must shut or avert his eyes to avoid pain. Such weapons are often used on individuals operating a vehicle, with the intent of forcing the driver to stop or flee.

• Acoustic weapons project intense sound waves in the direction of a target from long distances, and individuals within effective range feel pain from the loud sound.

• Directed-energy weapons that project millimeter-wave radiation can cause a very painful burning sensation on human skin without actually damaging the skin.32 Such weapons, used to direct energy into a large area, are believed to be useful in causing humans to flee an area to avoid that pain. Other directed-energy weapons direct high-powered microwave radiation to disrupt electronics used by adversaries.

• Electrical weapons (e.g., tasers and stun guns) use high-voltage shocks to affect the nervous system of an individual, causing him or her to lose muscle control temporarily. One foundational science for understanding such effects is neuroscience, as discussed in Chapter 2.

• Biological and chemical agents may be aimed at degrading fuel or metal, or may target neurological functions to incapacitate people, repel them (e.g., with a very obnoxious odor), or alter their emotional state (e.g., to calm an angry mob, to induce temporary depression in people). For the latter types of effects, neuroscience is again a foundational science.

• Cyber weapons are often included in the category of “nonlethal” weapons because they have direct effects only on computer code or hardware.

___________________

32 Directed-energy weapons with this effect are sometimes regarded as being weapons based on neuroscience, since they manipulate the central nervous system, even if the mechanisms involved are not chemically based. See, for example, Royal Society, Neuroscience, Conflict, and Security, Royal Society, London, UK, February 2012.

3.4.2 Possible Applications

Nonlethal weapons are intended to provide their users with options in addition to lethal force. Proponents of such weapons suggest that they may be useful in a variety of military engagements or situations that are “less than war,” such as in peacekeeping and humanitarian involvements, in situations in which it is hard to separate combatants and noncombatants, or in civilian and military law enforcement contexts such as riot control or the management of violent criminals. In such situations, the use of lethal force is discouraged—and so new nonlethal weapons (such as tasers) have tended to substitute for older nonlethal weapons (such as billy clubs).

A key question concerning nonlethal weapons in combat is their relationship to traditional weapons—are nonlethal weapons intended to be used instead of traditional weapons or in addition to traditional weapons? For example, an acoustic weapon can be used to drive troops or irregular forces from an area or to dissuade a small boat from approaching a ship. But it can also be used to flush adversaries out from under cover, where they could be more easily targeted and killed with conventional weapons. The latter uses are explicitly permitted by NATO doctrine on nonlethal weapons:

Non-lethal weapons may be used in conjunction with lethal weapon systems to enhance the latter’s effectiveness and efficiency across the full spectrum of military operations.33

So it is clear that in at least some military contexts, military doctrine anticipates that nonlethal weapons can be used along with traditional weapons. But it is also clear that they are not always intended to be used in this way.

Another issue is whether the availability of nonlethal weapons in addition to traditional weapons creates an obligation to use them before one uses traditional weapons that are (by definition) more lethal. On this point, NATO doctrine is also explicit:

Neither the existence, the presence, nor the potential effect of non-lethal weapons shall constitute an obligation to use non-lethal weapons, or impose a higher standard for, or additional restrictions on, the use of lethal force. In all cases NATO forces shall retain the option for immediate use of lethal weapons consistent with applicable national and international law and approved Rules of Engagement.34

___________________

33 Science and Technology Organization Collaboration and Support Office, Annex B: NATO Policy on Non-Lethal Weapons, available at http://ftp.rta.nato.int/public//PubFullText/RTO/TR/RTO-TR-SAS-040///TR-SAS-040-ANN-B.pdf.


3.4.3 Ethical, Legal, and Societal Questions and Implications

The diversity of nonlethal weapon types and of possible contexts of use complicates ethical analysis.

Controversy over Terminology

As suggested in the introduction to this section, the term "nonlethal weapon" is arguably misleading, because such weapons can indeed be used with lethal effects. The public policy debate over such weapons is thus clouded, because many of the issues that do in fact arise would not emerge if such weapons were always capable of operating in a nonlethal manner.

For example, how and to what extent, if any, should the intended targets of such weapons be taken into account in determining whether a weapon is “nonlethal”? The physical characteristics of the intended target must be relevant in some ways, but this requirement cannot mean that a machine gun aimed at an inanimate object should be categorized as a nonlethal weapon.

Are cyber weapons nonlethal? Yes, to the extent that they do not cause damage to artifacts and systems connected to their primary targets. But many cyber weapons are also intended to have effects on systems that they control, and malfunctions in those systems may well affect humans. Are antisatellite weapons nonlethal? Yes, since most satellites are unmanned. But if fired against a crewed military spacecraft, they become lethal weapons. Are chemical incapacitants nonlethal? Yes (for the most part), when they are used in clinically controlled settings. But the Scientific Advisory Board of the Organization for the Prohibition of Chemical Weapons concluded in 2011 that, given the uncontrolled settings in which such agents are actually used, “the term ‘non-lethal’ is inappropriate when referring to chemicals intended for use as incapacitants.”35

___________________

34 Science and Technology Organization Collaboration and Support Office, Annex B: NATO Policy on Non-Lethal Weapons, available at http://ftp.rta.nato.int/public//PubFullText/RTO/TR/RTO-TR-SAS-040///TR-SAS-040-ANN-B.pdf.

35 Scientific Advisory Board, Report of the Scientific Advisory Board on Developments in Science and Technology for the Third Special Session of the Conference of the States Parties to Review the Operation of the Chemical Weapons Convention, October 29, 2012, available at http://www.opcw.org/index.php?eID=dam_frontend_push&docID=15865.

Impact on Existing Arms Control Agreements

Certain nonlethal weapons raise concerns about eroding existing constraints associated with existing arms control agreements. One good example of such nonlethal weapons is that of biological or chemical agents that are intended to affect humans. The Biological and Toxin Weapons Convention forbids signatories from developing, producing, stockpiling, or otherwise acquiring or retaining biological agents or toxins “of types and in quantities that have no justification for prophylactic, protective or other peaceful purposes” and also “weapons, equipment or means of delivery designed to use such agents or toxins for hostile purposes or in armed conflict.”36

Similarly, the Chemical Weapons Convention (CWC) forbids parties to the treaty from developing, producing, otherwise acquiring, stockpiling, or retaining chemical weapons.37 Chemical weapons are in turn defined as "toxic chemicals and their precursors," except when they are intended for permissible purposes and acquired in the types and quantities consistent with the permissible purposes. A toxic chemical is one that through its chemical action on life processes can cause death, temporary incapacitation, or permanent harm to humans or animals. (Thus, incapacitating agents are included in the definition of "toxic chemicals," and the use of incapacitating agents is forbidden as a means and method of war.) Permissible purposes include "industrial, agricultural, research, medical, pharmaceutical or other peaceful purposes"; protective purposes (that is, purposes "directly related to protection against toxic chemicals and to protection against chemical weapons"); and law enforcement, including domestic riot control purposes. Signatories also agree not to use riot control agents as a means of warfare, where a riot control agent is an agent that "can produce rapidly in humans sensory irritation or disabling physical effects which disappear within a short time following termination of exposure."

Many issues regarding arms control turn on the specific meaning of terms such as “temporary incapacitation,” “other harm,” and “sensory irritation or disabling physical effects.” In addition, they depend on determinations of the intended purpose for a given agent (there is no agreed definition of “law enforcement,” for example).

Such definitional concerns have been particularly apparent in contemplating possible chemical weapons based on neuroscience (see the Chapter 2 section on neuroscience) that could create specific temporary effects in humans.

___________________

36 See http://www.un.org/disarmament/WMD/Bio/.

37 See http://www.opcw.org/chemical-weapons-convention/.

Although there is a broad consensus that the CWC prohibitions on using toxic chemicals in conflict extend to the use of incapacitating chemical agents (ICAs) in genuine combat situations, a number of countries, including the United States and Russia, have shown an active interest in ICAs for law enforcement and in situations such as counterterrorism where the lines between combat and law enforcement may blur. For example, even after the signing of the CWC, research has been proposed to develop "calmatives"—chemical agents that, when administered to humans, change their emotional states from angry to calm (as one possibility);38 such agents might be useful in reducing the damage that a rioting crowd might cause or in sapping the will of adversary soldiers to fight on the battlefield.

The first two CWC review conferences were unable to address the issue of ICAs. Although substantial discussion and debate during the third review conference in April 2013 clarified a number of national positions, a Swiss proposal to undertake formal technical discussions was not included in the final document. At the first meeting of the Organization for the Prohibition of Chemical Weapons (OPCW) executive council following the review conference, the U.S. ambassador stated:

… we too are disappointed that time ran out before final agreement could be reached on language relating to substances termed “incapacitating chemical agents”. The United States believes that agreement on language is within reach. We will work closely and intensively with the Swiss and other delegations so that this important discussion can continue. In this context, I also wish very clearly and directly to reconfirm that the United States is not developing, producing, stockpiling, or using incapacitating chemical agents.39

Beyond the debates over whether ICAs would be permitted in law enforcement, there is also concern that the use of such agents will undermine the fundamental prohibitions of the treaty. To the extent that some ICAs also fall under the provisions of the Biological Weapons Convention (BWC), the same concerns apply.

___________________

38 For example, the International and Operational Law Division of the Deputy Assistant Judge Advocate General of the Navy approved in the late 1990s a list of proposed new, advanced, or emerging technologies that may lead to developments of interest to the U.S. nonlethal weapons effort, including gastrointestinal convulsives, calmative agents, aqueous foam, malodorous agents, oleoresin capsicum (OC) cayenne pepper spray, smokes and fogs, and riot control agents (orthochlorobenzylidene malononitrile, also known as CS, and chloracetophenone, also known as CN). See, for example, Margaret-Anne Coppernoll, "The Nonlethal Weapons Debate," Naval War College Review 52:112-131, Spring 1999. In 2004, a Defense Science Board study on future strategic strike forces (available at http://www.fas.org/irp/agency/dod/dsb/fssf.pdf) noted that calmatives could have value in neutralizing individuals while minimizing undesirable effects.

39 Robert Mikulak, Statement by Ambassador Robert P. Mikulak, United States Delegation to the OPCW at the Seventy-Second Session of the Executive Council, OPCW EC-72/NAT.8, available at http://www.opcw.org/index.php?eID=dam_frontend_push&docID=16511.


The pressures placed on the CWC and the BWC by the possibility of developing chemically or biologically based incapacitating agents may point to a broader lesson. Arms control agreements are often signed in a particular technological context. Changes in that context, whether driven by new S&T developments or by new concepts of use for existing technologies, mean that treaties must strive to stay abreast of relevant advances in order to remain effective. In extreme cases, changes to a treaty's basic language, or the abrogation of existing legal mechanisms or the creation of new ones, might become necessary in response.40 This lesson suggests that even research on certain new technology developments may have ELSI implications for existing agreements long before such research bears fruit.

International Law

The BWC and the CWC are not the only legal frameworks that affect the potential development and use of nonlethal weapons. The law of armed conflict (specifically, Article 51 of Additional Protocol I to the Geneva Conventions) stipulates that civilians shall not be the subjects of attack. This is a key element of the principle of distinction, which distinguishes between members of a nation's armed forces engaged in conflict and civilians, who are presumed not to participate in hostilities directly and thus should be protected from the dangers of military operations.41 Although civilians (that is, noncombatants) have always contributed to the general war effort of parties engaged in armed conflicts (e.g., helped produce weapons and munitions), they have usually been at some distance from actual ground combat. Since the end of World War II, with the increase in civil wars relative to traditional interstate conflict and then the rise of nonstate actors, the assumption of separation has been increasingly challenged, and combatants and noncombatants are often intermingled.

___________________

40 Because science and technology are at the core of both treaties, both the CWC and the BWC call for regular review of developments in science and technology that could affect the future of conventions, both during the review conferences held every 5 years and in between (see National Research Council, Life Sciences and Related Fields: Trends Relevant to the Biological Weapons Convention, The National Academies Press, Washington, D.C., 2012). In 2012, for example, the Organization for the Prohibition of Chemical Weapons created a temporary working group on convergence to address the increasing overlap between chemistry and biology and how that affects the future of the CWC and the BWC. Members included a member of the staff of the BWC Implementation Support Unit and the chair of a major independent international review of trends in S&T for the seventh BWC review conference (see http://www.opcw.org/about-opcw/subsidiary-bodies/scientific-advisory-board/documents/reports/).

41 Nils Melzer, ”Interpretive Guidance on the Notion of Direct Participation in Hostilities Under International Humanitarian Law,” International Committee of the Red Cross, Geneva, Switzerland, 2009, available at http://www.icrc.org/eng/assets/files/other/icrc-002-0990.pdf.


The fact of intermingling is one rationale for the development of nonlethal weapons—the use of these weapons when combatants and noncombatants are intermingled is intended to reduce the risk of incurring noncombatant casualties. A common use scenario is one in which a soldier confronting such a situation is unable to distinguish between a combatant and a noncombatant, and uses a nonlethal weapon to subdue an individual. The rationale for nonlethal weapons is thus that if the individual turns out to be a noncombatant, then no harm is done, but if the individual turns out to be a combatant, then he has been subdued.

In this case, the argument turns on the meaning of the term “attack,” which is defined as an act of violence. For “nonlethal” weapons other than those covered by the CWC and BWC, is it an act of violence to use a weapon that causes unconsciousness? And if the answer is not categorical (that is, “it depends”), what are the circumstances on which the answer depends?

A second requirement of the law of armed conflict is a prohibition on weapons that are “calculated to cause unnecessary suffering.”42 In the 1980s and 1990s, a question arose over whether a weapon intended to blind but not kill enemy soldiers—by definition, a nonlethal weapon—might be such a weapon.

Box 3.3 recounts briefly some of the history of blinding lasers. At a high level of abstraction, lessons from this history suggest an interplay of ethical and legal issues. No specific international prohibitions against blinding lasers were in place in the early 1980s, and the United States sought to develop such weapons. However, over time, ethical concerns suggesting that blinding as a method of warfare was in fact particularly inhumane were one factor that led the United States to see value in explicitly supporting such a ban, first as a matter of policy and then as a matter of international law and treaty, even if blinding lasers themselves could arguably have been covered under the prohibition of weapons that caused unnecessary suffering.

Another distinct body of law, discussed further in Chapter 4, is international human rights law, which addresses the relationship between a state and its citizens rather than the relationships between states in conflict addressed by the law of armed conflict. Many analysts, but by no means all, believe, however, that international human rights law and international humanitarian law (that is, the law of armed conflict) are closely related.43 International human rights law is codified in a number of general treaties as well as in international agreements focused on particular issues.

___________________

42 Annex to Hague Convention IV Respecting the Laws and Customs of War on Land of October 18, 1907 (36 Stat. 2277; TS 539; 1 Bevans 631), article 23(e). Notably, neither the annex nor the convention specifies a definition for “unnecessary suffering.”


Among the provisions of international human rights law that could be relevant to nonlethal weapons are prohibitions on torture or on degrading or inhumane punishments. More general provisions, such as a fundamental right to life or to health, are also potentially relevant. Potential violations of international human rights law have been cited as part of the arguments against the use of incapacitating chemical agents,44 as well as against other forms of nonlethal weapons.

Safety

The extent to which a given weapon is nonlethal (or more precisely, less lethal) is often an empirical question. How might such weapons be tested for lower lethality? Animal testing and modeling do provide some insight, but high fidelity is sometimes available only through human testing. Laboratory testing conditions often do not reflect real-world conditions of use. In practice, then, certain information on lethality may be available only from operational experience—a point suggesting that the first uses of a given nonlethal weapon may in fact be more lethal than expected.
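One way to see why operational experience accumulates safety evidence so slowly (an illustrative statistical aside, not an analysis from this report): under the standard "rule of three" approximation, n uses with zero observed fatalities bound the per-use fatality rate only to about 3/n at 95 percent confidence.

```python
# "Rule of three": after n independent uses with zero fatalities, an
# approximate 95% upper confidence bound on the per-use fatality rate
# is 3/n. Illustrative only; real safety assessment is far more involved.
for n in (30, 300, 3000):
    print(f"{n:>5} uses, 0 deaths -> rate may still be up to ~{3 / n:.2%}")
# ->    30 uses, 0 deaths -> rate may still be up to ~10.00%
# ->   300 uses, 0 deaths -> rate may still be up to ~1.00%
# ->  3000 uses, 0 deaths -> rate may still be up to ~0.10%
```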

In the cases of the nonlethal weapons described above:

• Weapons that provide high-voltage shocks to an individual may cause serious injury or death if the person falls or if the person’s heart goes into cardiac arrest.

• Dazzling lasers may cause a driver to lose control of a vehicle by forcing the driver to shield his or her eyes, leading to injury or death as a result.

• Acoustic weapons can cause permanent hearing losses through repeated exposure.

• Chemical incapacitants can cause serious harm or death if overdoses occur or as the result of secondary effects (e.g., an incapacitated person who falls and hits his head on a rock).

___________________

43 See, for example, Robert Kolb, “The Relationship Between International Humanitarian Law and Human Rights Law: A Brief History of the 1948 Universal Declaration of Human Rights and the 1949 Geneva Conventions,” International Review of the Red Cross, No. 324, September 30, 1998, available at http://www.icrc.org/eng/resources/documents/misc/57jpg2.htm; Marco Sassoli and Laura Olson, “The Relationship Between International Humanitarian and Human Rights Law Where it Matters: Admissible Killing and Internment of Fighters in Non-International Armed Conflicts,” International Review of the Red Cross 90(871):599-627, September 2008, available at http://www.icrc.org/eng/assets/files/other/irrc-871-sassoli-olsen.pdf; and United Nations Office of the High Commissioner on Human Rights, “International Humanitarian Law and Human Rights,” July 1991, available at http://www.ohchr.org/Documents/Publications/FactSheet13en.pdf.

44 International Committee of the Red Cross, “Incapacitating Chemical Agents”: Law Enforcement, Human Rights Law, and Policy Perspectives, report of an expert meeting, Montreux, Switzerland, April 24-26, 2012, available at http://www.icrc.org/eng/resources/documents/publication/p4121.htm.

Box 3.3 On the Compliance of Lasers as Antipersonnel Weapons with the Law of Armed Conflict

In 1983, the New York Times reported that the U.S. Army was developing a weapon known as C-CLAW (Close Combat Laser Assault Weapon) that used low-power laser beams to blind the human eye at distances of up to one mile.1 Pentagon officials noted that the beam “would sweep around the battlefield and blind anyone who looked directly into it.”

In September 1988, the DOD Judge Advocate General issued a memorandum of law concerning the legality of the use of lasers as antipersonnel weapons.2 This memorandum identified the key law-of-armed-conflict issue as whether the use of a laser to blind an enemy soldier would cause unnecessary suffering and therefore be unlawful. The memorandum noted that blinding a soldier "ancillary to the lawful use of a laser rangefinder or target acquisition lasers against material targets" would be legal. If so, the memorandum argued, consistency requires that it must not be illegal to target soldiers directly with a laser. If it were otherwise, "enemy soldiers riding on the outside of a tank lawfully could be blinded as the tank is lased incidental to its attack by antitank munitions; yet it would be regarded as illegal to utilize a laser against an individual soldier walking ten meters away from the tank." The memorandum then noted that "no case exists in the law of war whereby a weapon lawfully may injure or kill a combatant, yet be unlawful when used in closely-related circumstances involving other combatants." The memorandum then concluded that a blinding laser would not cause "unnecessary suffering when compared to other [legal] wounding mechanisms to which a soldier might be exposed on the modern battlefield," and that thus the use of a laser as an antipersonnel weapon must be lawful.

However, in September 1995, the U.S. Department of Defense promulgated a new policy that prohibited "the use of lasers specifically designed to cause permanent blindness of unenhanced vision and supported negotiations prohibiting the use of such weapons" and continued training and doctrinal efforts to minimize accidental or incidental battlefield eye injuries resulting from using laser systems for nonprohibited purposes. One month later, the first review conference of the 1980 Convention on Certain Conventional Weapons adopted a protocol on blinding laser weapons, which the United States signed. Some of the issues raised in the lead-up to this conference included the desirability of a protocol to cover this issue; a debate over whether to prohibit blinding weapons per se or blinding as a method of warfare; and the possibility of a ban interfering with other military uses of lasers, such as the designation of targets.

In January 2009, the United States deposited its instrument of ratification for Protocol IV of the Convention on Conventional Weapons, which prohibits the employment of laser weapons “specifically designed, as their sole combat function or as one of their combat functions, to cause permanent blindness to unenhanced vision.”3 The protocol further prohibits the transfer of such weapons to any state or nonstate entity. However, it recognizes the possibility of blinding as “an incidental or collateral effect of the legitimate military employment of laser systems, including laser systems used against optical equipment,” and exempts such blinding from the prohibition of this protocol.

One analyst suggests that a major factor in the adoption of the protocol was the support garnered from a variety of nongovernment organizations, such as medical associations and national Red Cross and Red Crescent organizations.4 In addition, in May 1995, the European Parliament called on the Council of Europe to take action on the protocol. In the United States, Human Rights Watch (HRW)—an international nongovernmental organization—issued a report in May 1995 that documented U.S. efforts to develop military laser systems that were intended to damage optical systems and/or eyesight. Whether or not prompted by the HRW report, a number of influential U.S. senators and representatives shortly thereafter asked the administration to adopt a ban on blinding lasers.

___________________

1 See http://www.nytimes.com/1983/12/18/us/army-works-on-a-blinding-laser.html.

2 “Memorandum of Law: The Use of Lasers as Antipersonnel Weapons,” The Army Lawyer, DA PAM 27-50-191, November 1988, available at http://www.loc.gov/rr/frd/Military_Law/pdf/11-1988.pdf.

3 See http://www.state.gov/r/pa/prs/ps/2009/01/115309.htm.

4 Louise Doswald-Beck, “New Protocol on Blinding Laser Weapons,” International Review of the Red Cross, No. 312, June 30, 1996, available at http://www.icrc.org/eng/resources/documents/misc/57jn4y.htm. This article also provides some of the other information contained in this box.

Unanticipated Uses

Nonlethal weapons—at least some of them—raise issues that are not generally anticipated in the doctrines of their use. For example, although nonlethal weapons are often presented as a substitute for lethal weapons, they may in practice be a substitute for nonviolent negotiations—that is, they may be used to bypass the time-consuming process of negotiations.

Indeed, there are instances in which nonlethal weapons have been used in situations where otherwise no force at all (not merely no lethal force) would have been used.45

Building on this possibility, nonlethal weapons could be used as a means for coercion—that is, they might be used to torture an individual or to persuade an otherwise unwilling individual to cooperate. The nonlethality of some nonlethal weapons is premised on the ability of an individual to flee the scene of weapons use (as is true for nonlethal area-denial systems)—the weapon causes pain for an individual who is exposed to the weapon's effects, but the individual is free to leave the area in which the weapon causes these effects. But if the individual is not free to leave (e.g., by being restrained), an area-denial system could plausibly be used as an instrument of torture.

It is of course true that virtually any instrument can be used as an instrument of torture, which is prohibited under international law. In this context, a possible ELSI concern arises because certain nonlethal weapons technologies might be better suited for torture (if, for example, the use of a particular technology left no physical evidence of the torture).

___________________

45 In one study performed by the sheriff's office in Orange County, Florida, officers on patrol were equipped with tasers and were trained to use them. One immediate effect was that the number of citizen fatalities due to police action decreased significantly—the intended effect. A second immediate (and unanticipated) effect was a significant increase in the frequency of police use of force overall. That is, without tasers, there were most likely a number of situations in which the police would not have used force at all, but with tasers available, they were more willing to use force (nonlethal force, but force just the same) than before. See Alex Berenson, "As Police Use of Tasers Soars, Questions Over Safety Emerge," New York Times, July 18, 2004.

×
Page 114
Emerging and Readily Available Technologies and National Security: A Framework for Addressing Ethical, Legal, and Societal Issues

Emerging and Readily Available Technologies and National Security is a study of the ethical, legal, and societal issues relating to the research on, development of, and use of rapidly changing technologies with low barriers to entry that have potential military application, such as information technologies, synthetic biology, and nanotechnology. The report also considers the ethical issues associated with robotics and autonomous systems, prosthetics and human enhancement, and cyber weapons. These technologies are characterized by readily accessible knowledge, technological advances that can occur in months rather than years, a blurring of the line between basic and applied research, and high uncertainty about how these technologies will evolve and what applications they will make possible.

Emerging and Readily Available Technologies and National Security addresses topics such as the ethics of using autonomous weapons that may become available in the future; the propriety of enhancing soldiers' physical or cognitive capabilities with drugs, implants, or prosthetics; and what limits, if any, should be placed on the nature and extent of economic damage that cyber weapons can cause. The report explores three areas with respect to emerging and readily available technologies: the conduct of research; research applications; and unanticipated, unforeseen, or inadvertent ethical, legal, and societal issues. It articulates a framework to help policy makers, institutions, and individual researchers think through these issues as they relate to technologies of military relevance, and it recommends how each of these groups should approach such considerations in its research activities. In doing so, the report makes an essential contribution toward ensuring full consideration of ethical, legal, and societal issues in situations where rapid technological change may outpace our ability to foresee consequences.
