3 Investing in Information Technology Research
Pages 28-96

From page 28...
... , the committee believes that there is enough overlap in the research problems and approaches to make it unwise to articulate a separate R&D program for each area. Although many areas of information technology research could be potentially valuable for counterterrorist purposes, the three areas described below are particularly important for helping reduce the likelihood or impact of a terrorist attack: 1.
From page 29...
... . While terrorist attacks and natural disasters have many similarities with respect to the consequences of such events, the issues raised by C3I for emergency response for terrorist disasters differ from those for natural disasters for several reasons.
From page 30...
... "Information fusion" promises to play a central role in countering future terrorist efforts. Information fusion is an essential tool for the intelligence analysis needed if preemptive disruption of terrorist attacks is to be successful.
From page 31...
... It is beyond the scope of this report to address such issues in detail, but policy makers should be cautioned that R&D is only the first step on a long road to widespread deployment and a genuinely stronger and more robust IT infrastructure.

3.1 INFORMATION AND NETWORK SECURITY

A broad overview of some of the history of and major issues in information and network security is contained in the CSTB report Cybersecurity
From page 32...
... 32 INFORMATION TECHNOLOGY FOR COUNTERTERRORISM

TABLE 3.1 A Taxonomy of Priorities for R&D
(Columns: Category; Criticality; Difficulty; Time Scale for Significant Research Progress and Deployment)

Improved Information and Network Security: High; Difficult; 5-9 years
  Detection and identification: High; Difficult; 5-9 years
  Architecture and design for containment: High; Difficult; 5-9 years
  Large system backup and decontamination: High; Difficult; 5-9 years
  Less buggy code: High; Very difficult; 5-9 years
  Automated tools for system configuration: High; Difficult; 1-4 years
  Auditing functionality: Low; Difficult; 10+ years
  Trade-offs between usability and security: Medium; Difficult; 5-9 years
  Security metrics: Medium; Difficult; 1-4 years
  Field studies of security: High; Easy; 1-4 years
C3I for Emergency Response: High; Difficult; 1-4 years
  Ad hoc interoperability: High; Easy; 1-4 years
  Emergency deployment of communications capacity: High; Easy; 1-4 years
  Security of rapidly deployed ad hoc networks: Medium; Difficult; 5-9 years
  Information management and decision-support tools: Medium; Difficult; 5-9 years
  Communications with the public during emergency: High; Difficult; 1-4 years
  Emergency sensor deployment: High; Easy; 1-4 years
  Precise location identification: Medium; Difficult; 5-9 years
  Mapping the physical telecommunications infrastructure: High; Easy; 1-4 years
  Characterizing the functionality of regional networks for emergency responders: High; Difficult; 1-4 years
Information Fusion: High; Difficult; 1-4 years
  Data mining: High; Difficult; 1-4 years
  Data integration: High; Difficult; 1-4 years
  Language technologies: High; Difficult; 1-4 years
  Image and video processing: High; Difficult; 5-9 years
  Evidence combination: Medium; Difficult; 1-4 years
  Interaction and visualization: Medium; Difficult; 1-4 years
Privacy and Confidentiality: High; Difficult; 1-4 years
Planning for the Future: Medium; Difficult; 10+ years
From page 33...
... Research in information and network security can be grouped into four generic areas: authentication, detection, containment, and recovery. A fifth set of topics (e.g., reducing buggy code, dealing with misconfigured systems, auditing functionality)
From page 34...
... 144-152 (hereafter cited as CSTB, NRC, 1999, Realizing the Potential of C4I); CSTB, NRC, 1999, Trust in Cyberspace.
From page 35...
... , and had a cost of ownership that was competitive with passwords. In practice, these desirable attributes often entail trade-offs with one another; one way to focus a research effort in authentication would be to address the reduction of these trade-offs.5

3.1.2 Detection

Even with apparently secure authentication processes and technologies, it might still be possible for an intruder to gain unauthorized access to a system, though with more effort if the system were more secure.
From page 36...
... Of course, such monitoring requires good characterizations of what "normal" behavior is and knowledge of what various kinds of behavior mean in the context of specific applications. Today, the major deficiency in this approach is the occurrence of too many false positives.
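The false-positive deficiency noted above can be made concrete with a toy sketch of statistical anomaly detection. Everything here is an illustrative assumption rather than anything from the report: the traffic numbers, the deviation-from-the-mean rule, and the thresholds are invented to show the trade-off between catching attacks and raising false alarms.

```python
import statistics

def train_baseline(samples):
    """Characterize "normal" behavior as the mean and spread of an
    observed metric (here, hypothetical requests per minute)."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(value, baseline, k):
    """Flag values more than k standard deviations from the mean.
    A smaller k catches more attacks but raises more false alarms."""
    mean, stdev = baseline
    return abs(value - mean) > k * stdev

# Illustrative "normal" traffic, then observations including a simulated
# attack spike (400) and a merely unusual legitimate reading (95).
baseline = train_baseline([100, 102, 98, 101, 97, 103, 99, 100])
observations = [101, 95, 104, 400]
for k in (1, 3):
    flagged = [v for v in observations if is_anomalous(v, baseline, k)]
    print(f"k={k}: flagged {flagged}")
```

With k=1 the legitimate readings 95 and 104 are flagged alongside the spike (false positives); with k=3 only the spike is flagged, but subtler attacks would slip through, which is exactly the tension the text describes.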
From page 37...
... (For example, the failure of a perimeter defense, such as a firewall, surrounding otherwise unprotected systems can result in an intruder's gaining full and complete access to all of those systems.) A system that degrades gracefully is more desirable; in this case, a successful attack on one part of a system results only in that part's
From page 38...
... Architectural containment as a system-design principle calls for the ability to maintain critical functionality (such as engine control on a ship) despite failures in other parts of a system.8 A sophisticated control system used during "normal operations" must be able to provide basic functionality even when parts of it have been damaged.9 Such an approach could be one of the most effective long-term methods for hardening IT targets that oversee critical operations.
From page 39...
... Another approach within this general area of containment is the de-

10 A subscription model calls for a user to register for service in some authenticated way, so that a site can distinguish that user from a random bad user. Because denial-of-service attacks depend on a flood of bogus requests for service, the availability of a database of registered users makes it easy to discard service requests from senders that are not registered, and those requests are likely to account for the vast majority of bogus ones.
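The subscription model just described can be sketched in a few lines. The user names and request format below are invented for illustration; the point is that membership in a registered-user set gives a cheap test for discarding the bulk of a bogus-request flood before it consumes real resources.

```python
# Hypothetical registered-subscriber database (in practice, populated
# through an authenticated registration step).
registered_users = {"alice", "bob", "carol"}

def filter_requests(requests):
    """Discard service requests whose sender is not a registered user."""
    return [r for r in requests if r["sender"] in registered_users]

# One legitimate request buried in a simulated denial-of-service flood.
flood = [{"sender": "alice", "op": "read"}]
flood += [{"sender": f"bot{i}", "op": "read"} for i in range(1000)]

accepted = filter_requests(flood)
print(f"{len(flood)} requests -> {len(accepted)} accepted")  # 1001 -> 1
```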
From page 40...
... In the end, this ability is critical to long-term deterrence. Given that penetration of computer and telecommunications networks is likely to continue despite our best efforts to build better perimeter security, more resilient and robust systems are necessary, with backup and recovery as essential elements.
From page 41...
... .

3.1.5 Cross-cutting Issues in Information and Network Security Research

A number of issues cut across the basic taxonomy of detection, containment, and recovery described above.
From page 42...
... Mitchell et al., 1997, "Automated Analysis of Cryptographic Protocols Using Murphi," IEEE Symposium on Security and Privacy, Oakland, available online at .
From page 43...
... Thus, better tools for formulating and specifying security policies and for checking system configurations quickly against prespecified configurations should be developed. Better tools for system and network operators to detect added and unauthorized functionality (e.g., the addition of a Trojan horse)
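One way to picture the configuration-checking tools called for above is a simple drift report against a prespecified baseline. The setting names below are hypothetical, and a real tool would also need a trustworthy way to read the live configuration; this sketch only shows the comparison step.

```python
def config_drift(expected, actual):
    """Compare a system's actual configuration against a prespecified
    baseline; report missing, unexpected, and changed settings."""
    missing = {k: v for k, v in expected.items() if k not in actual}
    unexpected = {k: v for k, v in actual.items() if k not in expected}
    changed = {k: (expected[k], actual[k])
               for k in expected.keys() & actual.keys()
               if expected[k] != actual[k]}
    return missing, unexpected, changed

# Illustrative baseline and an observed configuration that has drifted.
baseline = {"ssh.PermitRootLogin": "no", "firewall.enabled": "yes"}
observed = {"ssh.PermitRootLogin": "yes",   # flipped, possibly by an intruder
            "firewall.enabled": "yes",
            "telnet.enabled": "yes"}        # unauthorized added functionality
missing, unexpected, changed = config_drift(baseline, observed)
print("changed:", changed)
print("unexpected:", unexpected)
```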
From page 44...
... Research is needed to develop tools that can detect added unauthorized functionality.

Managing Trade-offs Between Functionality and Security

As a general rule, more secure systems are harder to use and have fewer features.23 Conversely, features such as executable content and remote administration can introduce unintended vulnerabilities even as they bring operational benefits.
From page 45...
... Notions such as calculating the return on a security investment, which are common in other areas in which security is an issue, are not well understood either, making quantitative risk management a very difficult enterprise indeed.26 Research is needed for developing meaningful security metrics.

26 Information on the economic impact of computer security is available online at
From page 46...
... 144-152; CSTB, NRC, 1991, Computers at Risk; CSTB, NRC, 1999, Trust in Cyberspace. 29For example, commercial needs for computer security focused largely on data integrity, while military needs for security focused on confidentiality, as noted in David Clark and David Wilson, 1987, "A Comparison of Commercial and Military Computer Security Policies," in Proceedings of the 1987 IEEE Symposium on Security and Privacy, IEEE, Oakland, Calif.
From page 47...
... National Academy Press, Washington, D.C., p. 39 (hereafter cited as CSTB, NRC, 1999, Information Technology Research for Crisis Management)
From page 48...
... Funding needs are further exacerbated by the difficulty of acquiring new digital systems that preserve backwards compatibility with existing legacy analog systems. (In New York City, the Fire Department relied on radios that were at least 8 years old and in some cases 15 years old, and a senior Fire Department official reported that "there [are]
From page 49...
... Also according to this article, the New York Fire Department had replaced at least some of its analog radios in early 2001 with digital technology better able to transmit into buildings; but after a few months, these new radios were pulled from service because several firefighters said they had been unable to communicate in emergencies. It is not clear whether or not a full "head-to-head" systematic comparison between the analog and digital systems was ever undertaken.
From page 50...
... 29. 38CSTB, NRC, 1999, Information Technology Research for Crisis Management, p.
From page 51...
... For more discussion, see Computer Science and Telecommunications Board (CSTB), National Research Council (NRC), 2002, Information Technology Research, Innovation, and E-Government, National Academy Press, Washington, D.C. (hereafter cited as CSTB, NRC, 2002, Information Technology Research, Innovation, and E-Government)
From page 52...
... 33. 48 CSTB, NRC, 1999, Information Technology Research for Crisis Management, p.
From page 53...
... For example, it is likely that some portion of the public networks would survive any disaster; emergency-response agencies could use that portion to facilitate

50 CSTB, NRC, 1996, Computing and Communications in the Extreme, p. 17; CSTB, NRC, 1999, Information Technology Research for Crisis Management.
From page 55...
... A disaster (or an attack on the communications of emergency responders in conjunction with a physical attack) is likely to destroy some but not all of the communications infrastructure in a given area, leaving some residual capability.
From page 56...
... 83. 54 CSTB, NRC, 1999, Information Technology Research for Crisis Management, pp.
From page 57...
... 39. 57 CSTB, NRC, 1999, Information Technology Research for Crisis Management, p.
From page 58...
... Unpublished paper, available online at . August 18.
From page 59...
... CSTB, NRC, 1999, Information Technology Research for Crisis Management, Appendix B.

62 CSTB, NRC, 1996, Computing and Communications in the Extreme, National Academy Press, Washington, D.C., p.
From page 60...
... , and find pathways through debris and rubble. Developing robust sensors for these capabilities is one major challenge; developing architectural concepts for how to deploy them and integrate the resulting information is another.67

64 CSTB, NRC, 1999, Information Technology Research for Crisis Management, p.
From page 61...
... 14. 72CSTB, NRC, 1999, Information Technology Research for Crisis Management, p.
From page 62...
... It may also be possible to develop technology to generate the data for accurate maps of a debris-strewn disaster location. Finally, keeping track of emergency responders' positions within a disaster area is an essential element of managing emergency response.
From page 63...
... Such integrated data can be particularly valuable for decision makers in law enforcement, the intelligence community, emergency-response units, and other organizations combating terrorism. Information fusion gains power and relevance for the counterterrorist mission because computer technology enables large volumes of information to be processed in short times.
From page 66...
... Intelligence agencies are routinely involved in information fusion as they attempt to track suspected terrorists and their activities. One of their primary problems is that of managing a flood of data.
From page 68...
...

3.3.1 Data Mining

Data mining is a technology for analyzing historical and current online data to support informed decision making. It has grown quickly in importance in the commercial world over the past decade, because of the increasing volume of online data, advances in statistical machine-learning algorithms for automatically analyzing these data, and improved networking that makes it feasible to integrate data from disparate sources.
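As a minimal illustration of the kind of statistical analysis involved, the sketch below scores new records by how rare their attribute values were in historical data. The records and fields are invented, and real data-mining systems use far richer models; this only shows the basic pattern of learning from history and scoring current data.

```python
from collections import Counter

def train(history):
    """Count how often each (field, value) pair occurs in historical records."""
    counts = Counter()
    for record in history:
        counts.update(record.items())
    return counts, len(history)

def rarity_score(record, model):
    """Score a record by its least-frequent attribute; a low score marks
    the record as unusual and worth an analyst's attention."""
    counts, n = model
    return min(counts[item] / n for item in record.items())

# Hypothetical historical transactions.
history = [
    {"origin": "domestic", "payment": "card"},
    {"origin": "domestic", "payment": "card"},
    {"origin": "domestic", "payment": "cash"},
    {"origin": "foreign",  "payment": "card"},
]
model = train(history)
print(rarity_score({"origin": "domestic", "payment": "card"}, model))  # 0.75
print(rarity_score({"origin": "foreign",  "payment": "cash"}, model))  # 0.25
```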
From page 69...
...

3.3.2 Data Interoperability

An inherent problem of information fusion is that of data interoperability: the difficulty of merging data from multiple databases, multiple sources, and multiple media. Often such sources will be distributed over different jurisdictions or organizations, each with different data definitions.
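A tiny sketch of the problem: two sources describe the same person under different field names, and a per-source mapping into a shared schema is needed before the records can be merged. The source names, field names, and schema below are all hypothetical; real interoperability work must also reconcile differing encodings, units, and semantics, not just field names.

```python
# Hypothetical per-source mappings into a shared schema.
SOURCE_A_MAP = {"subj_name": "name", "dob": "birth_date"}
SOURCE_B_MAP = {"full_name": "name", "birthDate": "birth_date"}

def normalize(record, field_map):
    """Translate one source's field names into the shared schema,
    dropping fields the schema does not cover."""
    return {field_map[k]: v for k, v in record.items() if k in field_map}

a_record = {"subj_name": "J. Doe", "dob": "1970-01-01"}
b_record = {"full_name": "J. Doe", "birthDate": "1970-01-01"}

merged = [normalize(a_record, SOURCE_A_MAP), normalize(b_record, SOURCE_B_MAP)]
print(merged[0] == merged[1])  # the two sources now line up for fusion
```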
From page 70...
... This is a good example of information fusion in which multiple representations of content are combined to reduce the effect of errors coming from any given source. The major limitation of present language and image technologies is in accuracy and performance: despite significant progress, these need to be considerably improved.
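One simple way to see how combining error-prone sources reduces the effect of any single source's errors is a noisy-OR fusion of independent confidence estimates. The 0.6 figures are illustrative, and the independence assumption rarely holds exactly in practice; this is a sketch of the principle, not of any system the report describes.

```python
def combine_independent(confidences):
    """Noisy-OR fusion: the probability that at least one of several
    independent, error-prone sources is correct about the same fact."""
    p_all_wrong = 1.0
    for p in confidences:
        p_all_wrong *= 1.0 - p
    return 1.0 - p_all_wrong

# A speech transcript, an OCR pass, and an image caption each support the
# same fact with modest confidence; their agreement is far more convincing.
print(combine_independent([0.6]))            # one source: 0.6
print(combine_independent([0.6, 0.6, 0.6]))  # three agreeing sources: ~0.94
```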
From page 71...
... A variety of visualization techniques have been developed for large-scale scientific applications, but more research is needed on techniques that are effective for visualizing huge amounts of dynamic information derived from unstructured data about people, places, and events. Such research is potentially valuable because it takes advantage of the human ability to recognize patterns more easily than automated techniques can.
From page 72...
... Most importantly, it is often not known in advance what specific information must be sought in order to recognize a suspicious pattern, especially as circumstances change. From the perspective of intelligence analysis, the collection rule must be "collect everything in case something might be useful." Such a stance generates obvious conflicts with the strongest pro-privacy rule "Don't collect anything unless you know you need it." Data mining and information fusion have major privacy implications, and increased efforts by commercial and government entities to correlate data with a specific person negatively impact privacy and data confidentiality.
From page 73...
... See Computer Science and Telecommunications Board, National Research Council.
From page 74...
... 2000. Summary of a Workshop on Information Technology Research for Federal Statistics.
From page 75...
... concerns about privacy arise from the fear that improperly disclosed information might be used to an individual's economic or legal detriment. Thus, a concern about privacy with respect to records of HIV status may be partly rooted in a fear that improper disclosure might result in the loss of health insurance or the denial of a job opportunity in the future; a concern about the privacy of one's financial records may be rooted in a fear that one could become a "mark" for criminals or the subject of unwarranted tax audits.
From page 76...
... First, robots currently exist to assist or replace emergency workers or military personnel in dangerous situations. For example, bomb squads use robots to inspect, open, and destroy or detonate suspicious packages; tethered or wireless robots, with receivers and transmitters, could respond to commands to disarm bombs.
From page 77...
... These devices could be integrated into objects or systems that are likely to last for long periods of time and must function under constraints such as limited power source, need for adequate heat dissipation, and limitations on bandwidth and memory. Moreover, because they protect a civilian rather than a military population (civilian populations are much less tolerant of false positives and much more vulnerable to false negatives)
From page 78...
... . This would include both the data-

82 The discussion in this section is adapted largely from CSTB, NRC, 1996, Computing and Communications in the Extreme, and CSTB, NRC, 1999, Information Technology Research for Crisis Management.
From page 79...
... For example, the mayor of a city might require information on whether to order an evacuation in response to a successful attack on a nuclear plant and thus would need information on what might happen at the plant on a time scale of 24 hours. On the other hand, a firefighter at the plant may need to know what might happen in the next hour.
From page 80...
... cryptanalysts broke the Soviet code because the Soviets began to reuse some of the key pads.83

83 Key pads are simply lists of random numbers, so there was no particular reason why the Soviets were forced to reuse them. Their reuse was a human error that violated the essential premise underlying the security of one-time pad encryption systems.
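The fatal consequence of pad reuse can be demonstrated in a few lines: XORing two ciphertexts encrypted under the same pad cancels the pad entirely, handing an attacker the XOR of the two plaintexts, which known-plaintext and statistical attacks can then unravel. The messages and key bytes below are toy values chosen for illustration.

```python
def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

pad = bytes([0x13, 0x37, 0x42, 0x99, 0x21, 0x55, 0x07, 0x6B])  # toy key material
p1, p2 = b"ATTACK A", b"RETREAT!"

c1 = xor_bytes(p1, pad)
c2 = xor_bytes(p2, pad)   # the mistake: the same pad is used twice

# The attacker never sees the pad, yet c1 XOR c2 equals p1 XOR p2.
leak = xor_bytes(c1, c2)
assert leak == xor_bytes(p1, p2)
```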
From page 81...
... It is simply not possible to have zero false positives and zero false negatives simultaneously, especially in a world filled with uncertainty, ambiguity, and noise. Most importantly, deliberate adversaries seek to cause and exploit false alarms.
From page 82...
... Then the human is suddenly and unexpectedly faced with an emergency, and because at this point everything is under human control, any failure is labeled "human error." By contrast, when systems are designed with a full understanding of the powers and weaknesses of human operators, the incidence of human error is greatly diminished. People must be given meaningful tasks.
From page 83...
... For example, in the September 11 attack on the World Trade Center, the medical response was hindered by the fact that the city's Office of Emergency Management (OEM), responsible for coordinating all aspects of a disaster response, was housed in Building 7 of the World Trade Center, one of the buildings that collapsed several hours after the airplane strikes.
From page 84...
... The result is that people write down their passwords on yellow Post-it notes and paste them on their terminals, where they are easy for unauthorized users to see. Biometrics provides another example.
From page 85...
... A terrorist who needed to know the security code to enter a secure building through the back door would call and say:

88 "The Complete Social Engineering FAQ." Available online at
From page 86...
... The reason is that helpful people often play a key role in getting any work done at all; thus, the research challenge is to develop effective techniques for countering social engineering that do not require wholesale attacks on people's tendency to be helpful.

Understand Bystander Apathy

As more people are involved in checking a task, it is possible for safety to decrease.
From page 87...
... An effective criminal or terrorist approach is to trigger an alarm system repeatedly so that the security personnel, in frustration over the repeated false alarms, either disable or ignore it, which is when the terrorist sneaks in.

Probe and Test the System Independently

The terms "red team" and "tiger team" refer to efforts undertaken by an organization to test its security from an operational perspective using teams that simulate what a determined attacker might do.
From page 88...
... The essential reason is that an attacker has the opportunity to attack any vulnerable point in a system's defenses, whether that point of vulnerability is the result of an unknown software bug, a misconfigured access control list, a password taped to a terminal, lax guards at the entrance to a building, or a system operator trying to be helpful. Over the years, tiger teams have been an essential aspect of any security program, and tiger-team tests are essential for several reasons: · Recognized vulnerabilities are not always corrected, and known fixes are frequently found not to have been applied as a result of poor configuration management.
From page 89...
... On the one hand, outsourced work represents a potential vulnerability to the company that uses such work, unless that company has the expertise to audit and inspect the work for security flaws. By assumption, a company that outsources work has less control over how the work is done, and the possibility of deliberately introduced security vulnerabilities in outsourced work must be taken seriously.
From page 90...
... For example, from a security perspective, termination of the access privileges of employees found to be improperly hired or retained must happen without warning them of such termination. On the other hand, due process may prevent rapid action from being taken.
From page 91...
...

3.6.3 Dealing with Organizational Resistance to Interagency Cooperation

An effective response to a serious terrorist incident will inevitably require the multiple emergency-response agencies to cooperate. Section 3.2.1 describes technical barriers to effective cooperation, but technological limitations by no means explain why agencies might fail to cooperate effectively.
From page 92...
... · In the aftermath, senior Fire Department and Police Department officials disagreed over the extent to which the departments were able to coordinate. A senior Fire Department official said that "there is no question there were communications problems [between the Fire Department and the Police Department]
From page 93...
... But in crisis, interagency differences impede interagency cooperation, and they cannot be overcome by fiat at the scene of the crisis. For example, a policy directive requiring that agencies adopt and use common communications protocols does not necessarily require emergency responders from different agencies to actually interact with one another while an emergency response is occurring.
From page 94...
... To achieve effective interagency cooperation in crisis, many things must happen prior to the occurrence of crisis, taking into account the realities of organizational resistance to interoperability. Such cooperation is likely to require: · Strong, sustained leadership.
From page 95...
... 92 Despite the creation of New York City's Office of Emergency Management in 1996 and expenditures of nearly $25 million to coordinate emergency response, the city had not conducted an emergency exercise between 1996 and September 11, 2001, at the World Trade Center (which had been bombed in 1993) that included the Fire Department, the police, and the Port Authority's emergency staff.
From page 96...
... These methods must minimize loads on human memory and attention and task interference while providing the appropriate levels of security in the face of adversaries who use sophisticated technologies as well as social engineering techniques to penetrate the security.

