Suggested Citation:"F Glossary." National Research Council. 1991. Computers at Risk: Safe Computing in the Information Age. Washington, DC: The National Academies Press. doi: 10.17226/1581.

Appendix F Glossary


Access

A subject's right to use an object. Examples include read and write access for data objects, execute access for programs, or create and delete access for directory objects.

Access control

The granting or denying to a subject (principal) of certain permissions to access an object, usually done according to a particular security model.

Access control list

A list of the subjects that are permitted to access an object, and the access rights of each subject.
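
A minimal sketch of the idea in Python (the object, subject, and right names are invented for illustration):

```python
# Hypothetical access control list: for each object, a map from
# subjects to the rights each is granted on that object.
ACL = {
    "payroll.dat": {"alice": {"read", "write"}, "bob": {"read"}},
}

def is_permitted(subject, obj, right):
    """Return True if the ACL grants `subject` the given `right` on `obj`."""
    return right in ACL.get(obj, {}).get(subject, set())

print(is_permitted("bob", "payroll.dat", "read"))   # True
print(is_permitted("bob", "payroll.dat", "write"))  # False
```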

Access label

See Label.

Access level

A level associated with a subject (e.g., a clearance level) or with an object (e.g., a classification level).

Accountability

The concept that individual subjects can be held responsible for actions that occur within a system.

Accreditation

1. The administrative act of approving a computer system for use in a particular application. See Certification. 2. The act of approving an organization as, for example, an evaluation facility.


Administratively directed access control (ADAC)

Access control in which administrators control who can access which objects. Contrast with user-directed access control (UDAC). See Mandatory access control.

Assurance

Confidence that a system design meets its requirements, or that its implementation meets its specification, or that some specific property is satisfied.

Auditing

The process of making and keeping the records necessary to support accountability. See Audit trail analysis.

Audit trail

The results of monitoring each operation of subjects on objects; for example, an audit trail might be a record of all actions taken on a particularly sensitive file.

Audit trail analysis

Examination of an audit trail, either manually or automatically, possibly in real time (Lunt, 1988).

Authentication

Providing assurance regarding the identity of a subject or object, for example, ensuring that a particular user is who he claims to be.

Authentication sequence

A sequence used to authenticate the identity of a subject or object.

Authorization

Determining whether a subject (a user or system) is trusted to act for a given purpose, for example, allowed to read a particular file.

Availability

The property that a given resource will be usable during a given time period.


Bell and La Padula model

An information-flow security model couched in terms of subjects and objects and based on the concept that information shall not flow to an object of lesser or noncomparable classification (Bell and La Padula, 1976).


Beta testing

Use of a product by selected users before formal release.

Biba model

An integrity model in which no subject may depend on a less trusted object (including another subject) (Biba, 1975).


Capability

An authenticating entity acceptable as evidence of the right to perform some operation on some object.

Certification

The administrative act of approving a computer system for use in a particular application. See Accreditation.

CESG

The Communications-Electronics Security Group of the U.K. Government Communications Headquarters (GCHQ).

Challenge-response

An authentication procedure that requires calculating a correct response to an unpredictable challenge.
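
One common construction uses a keyed hash over a shared secret; the sketch below is illustrative (the key and parameters are invented, not from the report):

```python
import hmac, hashlib, secrets

# Both parties hold a pre-shared key (assumption for this sketch).
shared_key = b"example-shared-secret"

def respond(key, challenge):
    """Compute the response to a challenge: HMAC-SHA256(key, challenge)."""
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()

challenge = secrets.token_bytes(16)        # unpredictable nonce from the verifier
response = respond(shared_key, challenge)  # claimant proves knowledge of the key

# The verifier recomputes the expected response and compares in constant time.
ok = hmac.compare_digest(response, respond(shared_key, challenge))
print(ok)  # True
```

Because the challenge is fresh each time, replaying an old response does not work.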

Checksum

Digits or bits summed according to arbitrary rules and used to verify the integrity of data.
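
A toy example of one such rule, a byte sum modulo 256 (real systems typically use CRCs or cryptographic hashes):

```python
def checksum(data: bytes) -> int:
    """Toy checksum: sum of all bytes modulo 256. Illustrative only."""
    return sum(data) % 256

msg = b"hello"
print(checksum(msg))  # 20
# A single changed byte alters the sum, so simple corruption is often caught:
print(checksum(b"hellp") == checksum(msg))  # False
```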

Ciphertext

The result of transforming plaintext with an encryption algorithm. Also known as cryptotext.

Claims language

In the ITSEC, the language that describes the desired security features of a "target of evaluation" (a product or system), and against which the product or system can be evaluated.

Clark-Wilson integrity model

An approach to providing data integrity for common commercial activities, including software engineering concepts of abstract data types, separation of privilege, allocation of least privilege, and nondiscretionary access control (Clark and Wilson, 1987).


Classification level

The security level of an object. See Sensitivity label.

Cleanroom approach

A software development process designed to reduce errors and increase productivity (Poore and Mills, 1989).

Clear text

Unencrypted text. Also known as plaintext. Contrast with ciphertext, cryptotext.

Clearance level

The security level of a subject.

CLEF

In the ITSEC, a Commercial Licensed Evaluation Facility.

CoCom

Coordinating Committee for Multilateral Export Controls, which began operations in 1950 to control export of strategic materials and technology to communist countries; participants include Australia, Belgium, Canada, Denmark, France, Germany, Greece, Italy, Japan, Luxembourg, the Netherlands, Norway, Portugal, Spain, Turkey, the United Kingdom, and the United States.

COMPUSEC

Computer security.

COMSEC

Communications security.

Confidentiality

Ensuring that data is disclosed only to authorized subjects.

Correctness

1. The property of being consistent with a correctness criterion, such as a program being correct with respect to its system specification, or a specification being consistent with its requirements.

2. In ITSEC, a component of assurance (together with effectiveness).

Countermeasure

A mechanism that reduces a system's vulnerability to a threat.


Covert channel

A communications channel that allows two cooperating processes to transfer information in a manner that violates a security policy, but without violating the access control.

Criteria

Definitions of properties and constraints to be met by system functionality and assurance. See TCSEC, ITSEC.

Criticality

The condition in which nonsatisfaction of a critical requirement can result in serious consequences, such as damage to national security or loss of life. A system is critical if any of its requirements are critical.

Crypto-key

An input to an encryption device that results in cryptotext.

Cryptotext

See Ciphertext.


Data

A sequence of symbols to which meaning may be assigned. Uninterpreted information. Data can be interpreted as representing numerical bits, literal characters, programs, and so on. (The term is used often throughout this report as a collective, singular noun.) See Information.

Data Encryption Standard (DES)

A popular secret-key encryption algorithm originally released in 1977 by the National Bureau of Standards.

Delegate

To authorize one subject to exercise some of the authority of another.

Denial of service

Reducing the availability of an object below the level needed to support critical processing or communication, as can happen, for example, in a system crash.

Dependability

The facet of reliability that relates to the degree of certainty that a system will operate correctly.


Dependence

The existence of a relationship in which the subject may not work properly unless the object (possibly another subject) behaves properly. One system may depend on another system.

Digital signature

Data that can be generated only by an agent that knows some secret, and hence is evidence that such an agent must have generated it.

Discretionary access control (DAC)

An access-control mechanism that permits subjects to specify the access controls, subject to constraints such as changes permitted to the owner of an object. (DAC is usually equivalent to IBAC and UDAC, although hybrid DAC policies might be IBAC and ADAC.)

DTI

Department of Trade and Industry, U.K.

Dual-use system

A system with both military and civilian applications.


Effectiveness

1. The extent to which a system satisfies its criteria. 2. In ITSEC, a component of assurance (together with correctness).

Emanation

A signal emitted by a system that is not explicitly allowed by its specification.

Evaluation

1. The process of examining a computer product or system with respect to certain criteria. 2. The results of that process.


Feature

1. An advantage attributed to a system. 2. A euphemism for a fundamental flaw that cannot or will not be fixed.

Firmware

The programmable information used to control the low-level operations of hardware. Firmware is commonly stored in read-only memories (ROMs), which are initially installed in the factory and may be replaced in the field to fix mistakes or to improve system capabilities.


Formal

Having a rigorous respect for form, that is, a mathematical or logical basis.

FTLS

Formal top-level specification. (See "Security Characteristics" in Chapter 5.)

Functionality

As distinct from assurance, the functional behavior of a system. Functionality requirements include, for example, confidentiality, integrity, availability, authentication, and safety.


Gateway

A system connected to different computer networks that mediates transfer of information between them.

GCHQ

Government Communications Headquarters, U.K.

Group

A set of subjects.


Identity-based access control (IBAC)

An access control mechanism based only on the identity of the subject and object. Contrast with rule-based access control. See Discretionary access control.

Implementation

The mechanism that (supposedly) realizes a specified design.

Information

Data to which meaning is assigned, according to context and assumed conventions.

Information-flow control

Access control based on restricting the flow of information into an object. See, for example, Bell and La Padula model.

INFOSEC

Information security. See also COMPUSEC and COMSEC.


Integrity

The property that an object is changed only in a specified and authorized manner. Data integrity, program integrity, system integrity, and network integrity are all relevant to consideration of computer and system security.

Integrity level

A level of trustworthiness associated with a subject or object.

Integrity policy

See Policy.

ITAR

International Traffic in Arms Regulations (Office of the Federal Register, 1990).

ITSEC

The Information Technology Security Evaluation Criteria, the harmonized criteria of France, Germany, the Netherlands, and the United Kingdom (Federal Republic of Germany, 1990).


Kernel

A most trusted portion of a system that enforces a fundamental property, and on which the other portions of the system depend.

Key

An input that controls the transformation of data by an encryption algorithm.


Label

A level associated with a subject or object and defining its clearance or classification, respectively. In TCSEC usage, the security label consists of a hierarchical security level and a nonhierarchical security category. An integrity label may also exist, consisting of a hierarchical integrity level and a nonhierarchical integrity category (Biba, 1975).
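
The dominance check implied by this usage can be sketched in Python, with invented level numbers and category names:

```python
# A label is modeled as (hierarchical level, set of categories).
# Label A dominates label B when A's level is at least B's and
# A's category set contains B's. Values here are illustrative.
def dominates(a, b):
    (level_a, cats_a), (level_b, cats_b) = a, b
    return level_a >= level_b and cats_a >= cats_b  # >= on sets is superset

secret_nato = (2, {"NATO"})
confidential = (1, set())
print(dominates(secret_nato, confidential))  # True
print(dominates(confidential, secret_nato))  # False
```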

Letter bomb

A logic bomb, contained in electronic mail, that is triggered when the mail is read.


Level

1. The combination of hierarchical and nonhierarchical components (TCSEC usage). See Security level, Integrity level. 2. The hierarchical component of a label, more precisely referred to as "hierarchical level" to avoid confusion. In the absence of nonhierarchical categories, the two definitions are identical.

Logic bomb

A Trojan horse set to trigger upon the occurrence of a particular logical event.


Mandatory access control (MAC)

1. Access controls that cannot be made more permissive by users or subjects (general usage, roughly ADAC). 2. Access controls based on information sensitivity represented, for example, by security labels for clearance and classification (TCSEC usage, roughly RBAC and ADAC). Often based on information flow rules.

Model

An expression of a policy in a form that a system can enforce, or that analysis can use for reasoning about the policy and its enforcement.

Monitoring

Recording of relevant information about each operation by a subject on an object, maintained in an audit trail for subsequent analysis.

Mutual authentication

Providing mutual assurance regarding the identity of subjects and/or objects. For example, a system needs to authenticate a user, and the user needs to authenticate that the system is genuine.


NCSC

The National Computer Security Center, part of the National Security Agency, which is part of the Department of Defense.

Node

A computer system that is connected to a communications network and participates in the routing of messages within that network. Networks are usually described as a collection of nodes that are connected by communications links.


Nondiscretionary

Equivalent to mandatory in TCSEC usage, otherwise equivalent to administratively directed access controls.

Nonrepudiation

An authentication that with high assurance can be asserted to be genuine, and that cannot subsequently be refuted.


Object

Something to which access is controlled. An object may be, for example, a system, subsystem, resource, or another subject.

Operating system

A collection of software programs intended to directly control the hardware of a computer (e.g., input/output requests, resource allocation, data management), and on which all the other programs running on the computer generally depend. UNIX, VAX/VMS, and DOS are all examples of operating systems.

Orange Book

Common name for the Department of Defense document that is the basic definition of the TCSEC, derived from the color of its cover (U.S. DOD, 1985d). The Orange Book provides criteria for the evaluation of different classes of trusted systems and is supplemented by many documents relating to its extension and interpretation. See Red Book, Yellow Book.

OSI

Open Systems Interconnection. A seven-layer networking model.

Outsourcing

The practice of procuring from external sources rather than producing within an organization.


Password

A sequence that a subject presents to a system for purposes of authentication.
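
As an illustration (not the report's prescription), a system might store a salted one-way hash of each password rather than the password itself, so that a stolen password table is of limited use:

```python
import hashlib, hmac, os

def make_record(password: str):
    """Store a random salt and a slow salted hash, never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check(password: str, record) -> bool:
    """Recompute the hash from the candidate password and compare."""
    salt, digest = record
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

record = make_record("correct horse")
print(check("correct horse", record))  # True
print(check("wrong guess", record))    # False
```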

Patch

A section of software code that is inserted into a program to correct mistakes or to alter the program.


Perimeter

A boundary within which security controls are applied to protect assets. A security perimeter typically includes a security kernel, some trusted-code facilities, hardware, and possibly some communications channels.

PIN

Personal identification number. Typically used in connection with automated teller machines to authenticate a user.

Plaintext

See Clear text.

Policy

An informal, generally natural-language description of desired system behavior. Policies may be defined for particular requirements, such as security, integrity, and availability.

Principal

A person or system that can be authorized to access objects or can make statements affecting access control decisions. See the equivalent, Subject.

Private key

See Secret key.

Protected subsystem

A program or subsystem that can act as a subject.

Public key

A key that is made available without concern for secrecy. Contrast with private key, secret key.

Public-key encryption

An encryption algorithm that uses a public key to encrypt data and a corresponding secret key to decrypt data.


RAMP

Rating Maintenance Phase. Part of the National Computer Security Center's product evaluation process.

Receivers

Subjects reading from a communication channel.


Red Book

The Trusted Network Interpretation of the Trusted Computer System Evaluation Criteria, or TNI (U.S. DOD, 1987).

Reference monitor

A system component that enforces access controls on an object.

Requirement

A statement of the system behavior needed to enforce a given policy. Requirements are used to derive the technical specification of a system.

Risk

The likelihood that a vulnerability may be exploited, or that a threat may become harmful.

RSA

The Rivest-Shamir-Adleman public-key encryption algorithm (Rivest et al., 1978).
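
The scheme can be illustrated with textbook-sized numbers; real keys are hundreds of digits long, so this sketch is for exposition only:

```python
# Toy RSA. Key generation: pick primes p, q; the public key is (e, n),
# the secret key is (d, n), where e*d ≡ 1 (mod (p-1)(q-1)).
p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # secret exponent (modular inverse of e)

m = 65                       # a message encoded as a number < n
c = pow(m, e, n)             # encrypt with the public key
print(pow(c, d, n))          # decrypt with the secret key: 65
```

The same key pair can be used the other way around for digital signatures: transform with the secret key, verify with the public one.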

Rule-based access control (RBAC)

Access control based on specific rules relating to the nature of the subject and object, beyond just their identities—such as security labels. Contrast with identity-based access control. See Mandatory access control.


Safety

The property that a system will satisfy certain criteria related to the preservation of personal and collective safety.

Secrecy

See Confidentiality.

Secret

Known at most to an authorized set of subjects. (A real secret is possible only when the size of the set is one or less.)

Secret key

A key that is kept secret. Also known as a private key.

Secret-key encryption

An encryption algorithm that uses only secret keys. Also known as private-key encryption.


Secure channel

An information path in which the set of all possible senders can be known to the receivers, or the set of all possible receivers can be known to the senders, or both.

Security

1. Freedom from danger; safety. 2. Computer security is protection of data in a system against disclosure, modification, or destruction. Protection of computer systems themselves. Safeguards can be both technical and administrative. 3. The property that a particular security policy is enforced, with some degree of assurance. 4. Often used in a restricted sense to signify confidentiality, particularly in the case of multilevel security.

Security level

A clearance level associated with a subject, or a classification level (or sensitivity label) associated with an object.

Security policy

See Policy.

Sender

A subject writing to a channel.

Sensitivity label

A security level (i.e., a classification level) associated with an object.

Separation of duty

A principle of design that separates functions with differing requirements for security or integrity into separate protection domains. Separation of duty is sometimes implemented as an authorization rule specifying that two or more subjects are required to authorize an operation.

Shareware

Software offered publicly and shared rather than sold.

Signature

See Digital signature.

Simple security property

An information-flow rule stating that a subject at a given security level can read only from an object with a security label that is the same or lower (Bell and La Padula, 1976).
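
A sketch of this "no read up" rule with hierarchical levels only (categories omitted; the level names are illustrative):

```python
# A subject may read an object only if the subject's clearance is at
# least the object's classification.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def may_read(subject_clearance, object_classification):
    return LEVELS[subject_clearance] >= LEVELS[object_classification]

print(may_read("secret", "confidential"))  # True  (reading down is allowed)
print(may_read("secret", "top secret"))    # False (no read up)
```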


Smart card

A small computer in the shape of a credit card. Typically used to identify and authenticate its bearer, although it may have other computational functions.

Source code

The textual form in which a program is entered into a computer (e.g., FORTRAN).

Specification

A technical description of the desired behavior of a system, as derived from its requirements. A specification is used to develop and test an implementation of a system.

Spoofing

Assuming the characteristics of another computer system or user, for purposes of deception.

State

An abstraction of the total history of a system, usually in terms of state variables. The representation can be explicit or implicit.

State machine

In the classical model of a state machine, the outputs and the next state of the machine are functionally dependent on the inputs and the present state. This model is the basis for all computer systems.
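
A minimal sketch in Python (the machine, its alphabet, and its outputs are invented for illustration):

```python
# A classical (Mealy-style) state machine as a transition table:
# (current state, input) -> (output, next state).
# This machine outputs True whenever the input pair "ab" completes.
TRANSITIONS = {
    ("start", "a"): (False, "saw_a"),
    ("start", "b"): (False, "start"),
    ("saw_a", "a"): (False, "saw_a"),
    ("saw_a", "b"): (True,  "start"),
}

def run(inputs, state="start"):
    """Feed each input symbol through the machine, collecting outputs."""
    outputs = []
    for symbol in inputs:
        out, state = TRANSITIONS[(state, symbol)]
        outputs.append(out)
    return outputs

print(run("aab"))  # [False, False, True]
```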

STU-III

A secure telephone system using end-to-end private-key encryption.

Stub

An artifact, usually software, that can be used to simulate the behavior of parts of a system. It is usually used in testing software that relies on those parts of the system simulated by the stub. Stubs make it possible to test a system before all parts of it have been completed.

Subject

An active entity—e.g., a process or device acting on behalf of a user, or in some cases the actual user—that can make a request to perform an operation on an object. See the equivalent, Principal.


System

1. A state machine, that is, a device that, given the current state and inputs, yields a set of outputs and a new state (see State machine). 2. An interdependent collection of components that can be considered as a unified whole, for example, a networked collection of computer systems, a distributed system, a compiler or editor, a memory unit, and so on.


TCB

See Trusted computing base.

TCSEC

The Department of Defense Trusted Computer System Evaluation Criteria (U.S. DOD, 1985d). See Orange Book.

Tempest

U.S. government rules for limiting compromising signals (emanations) from electrical equipment.

Threat

The potential for exploitation of a vulnerability.

Time bomb

A Trojan horse set to trigger at a particular time.

Token

When used in the context of authentication, a physical device necessary for user identification.

Token authenticator

A pocket-sized computer that can participate in a challenge-response authentication scheme. The authentication sequences are called tokens.

Trapdoor

A hidden flaw in a system mechanism that can be triggered to circumvent the system's security.

Trojan horse

A computer program whose execution would result in undesired side effects, generally unanticipated by the user. A Trojan horse program may otherwise give the appearance of providing normal functionality.


Trust

Belief that a system meets its specifications.

Trusted computing base (TCB)

A portion of a system that enforces a particular policy. The TCB must be resistant to tampering and circumvention. Under the TCSEC, it must also be small enough to be analyzed systematically. A TCB for security is part of the security perimeter.

Trusted system

A system believed to enforce a given set of attributes to a stated degree of assurance (confidence).

Trustworthiness

Assurance that a system deserves to be trusted.

Tunneling attack

An attack that attempts to exploit a weakness in a system at a low level of abstraction.


User authentication

Assuring the identity of a user. See Authorization.

User-directed access control (UDAC)

Access control in which users (or subjects generally) may alter the access rights. Such alterations may, for example, be restricted to certain individuals by the access controls, for example, limited to the owner of an object. Contrast with administratively directed access control. See Discretionary access control.


Vaccine

A program that attempts to detect and disable viruses.

Virus

A program, typically hidden, that attaches itself to other programs and has the ability to replicate. In personal computers, "viruses" are generally Trojan horse programs that are replicated by inadvertent human action. In general computer usage, viruses are more likely to be self-replicating Trojan horses.

Vulnerability

A weakness in a system that can be exploited to violate the system's intended behavior. There may be security, integrity, availability, and other vulnerabilities. The act of exploiting a vulnerability represents a threat, which has an associated risk of being exploited.


Worm attack

A worm is a program that distributes itself in multiple copies within a system or across a distributed system. A worm attack is a worm that may act beyond normally permitted behavior, perhaps exploiting security vulnerabilities or causing denial of service.


Yellow Book

The Department of Defense Technical Rationale Behind CSC-STD-003-85 (U.S. DOD, 1985b). Guidance for applying the TCSEC to specific environments.


ZSI

Zentralstelle für Sicherheit in der Informationstechnik. The German Information Security Agency (GISA).
