Computer-Related Tools for Training and Operations
Those responsible for providing the medical response to a chemical or biological terrorist attack on a civilian population will face extraordinary crisis-control and consequence-management problems. Depending on the specific nature of the event (e.g., a threatened or an actual release of chemical or biological substances), these first responders may have to (1) immediately provide and coordinate adequate first aid and critical emergency medical assistance; (2) identify prospectively or retrospectively the location, type (chemical and/or biological), and mechanism of release, such as a stationary or mobile spray, an explosive device, etc. (the "source-term" in computer models), and construct a reasonable footprint for exposure (e.g., atmospheric dispersion over space and time) and potential doses; (3) conduct a hazard assessment and recommend practical intervention procedures (e.g., isolation, shielding, distribution of pharmaceuticals) to limit exposures and prevent further amplification of adverse health effects; and (4) determine the extent of physical contamination and then isolate and decontaminate property to restore and salvage landscapes, buildings, and transportation for rapid reuse. In parallel with these tasks, and perhaps even competing and conflicting with them, will be separate efforts devoted to collecting and preserving evidence in order to apprehend and prosecute the perpetrators. Therefore, the most effective action will depend on all first responders communicating and coordinating their actions, as well as working closely with federal, state, and local authorities, healthcare institutions, and even news services. Clearly, for civilian medical and law-enforcement first responders to address these
acts of terrorism optimally and rapidly, their collective efforts will need some choreographing, and they will have to react instinctively and collaboratively as they do in other emergency situations for which they have been adequately trained.
Fortunately, medical and other first responders can acquire these essential instinctive and collaborative reactions for responding to an actual or threatened chemical or biological terrorist act by enhancing their existing skills, knowledge, and abilities for dealing with more conventional disasters. However unlikely a chemical or biological terrorist act may be in any given locale, its potential impact makes it vital that the first-responder community, especially the principal decisionmakers, sustain such enhancements of existing capabilities. Accordingly, this section of the report identifies relevant computer-related tools and pertinent health-effects information that medical and other first responders could use for regular training or even for operations. These tools would also decrease the need for frequent participation in large exercises, which can be disruptive, logistically complicated, expensive, and unproductive.
Medical Vigilance and Dose Reconstruction
Extremely rare infections or chemical exposures, or alternatively, temporally or geographically unusual or uncommonly frequent adverse health effects, could serve as an early warning that a chemical or biological substance has been covertly released into a civilian population. Emergency care facilities are likely to be the sentinels for observing such effects in a population. Consequently, the medical community can actively contribute to the rapid identification of a chemical or biological release if it has at its disposal communication systems through which confirmed or suspected rare diagnoses can easily be reported to public health officials.
As mentioned in Chapter 5, Recognizing Covert Exposure in a Population, although epidemiological surveillance systems exist and public health authorities do compile some health-effects information (e.g., morbidity/mortality reports), the process is slow, somewhat isolated, and should be better networked so that data streams documenting rare events can be received, assimilated, and analyzed for trends far more rapidly. In fact, a computer network, combined with easily understood software, perhaps involving the Internet, and an approach that is similar to or is connected with the Program for Monitoring Emerging Diseases (ProMED) or the Global Infectious Disease and Epidemiology Network (GIDEON) could be designed for rapidly collecting diagnostic data from the medical community electronically, particularly from sentinel locations, such as emergency departments. These data could then be sent to a secure, centralized
electronic data collection point for compilation, prompt assessment, and distribution of results, along with the raw data, to local, state, and national levels for further analyses.
Although the centralized system and its analytical tools are not currently available, developing rapid assessment procedures for the accumulating data would be extremely valuable. Not only would such a system contribute to forensic epidemiology, especially as related to covert acts of biological and perhaps chemical terrorism, but it would also help in recognizing instances of emerging disease or infection. For example, a computerized analysis could be designed to promptly detect in the data any unusual disease or chemical-toxicity event(s), as well as those with particular characteristics related to specific chemicals or microorganisms, that might otherwise be ignored or left uncorrelated because of their infrequency or geospatial and/or temporal dispersion, and then alert public health authorities to the finding.
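As a minimal illustration of the kind of automated aberration detection described above, the following Python sketch flags a diagnosis category whose daily count greatly exceeds its recent baseline. The counts, the sentinel-site framing, and the three-standard-deviation threshold are all hypothetical simplifications of the methods an operational surveillance system would use:

```python
from statistics import mean, stdev

def flag_unusual_counts(history, today, threshold=3.0):
    """Flag a diagnosis category whose count today exceeds its recent
    baseline by more than `threshold` standard deviations -- a crude
    stand-in for operational aberration-detection methods."""
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return today > baseline  # any increase over a perfectly flat baseline
    return (today - baseline) / spread > threshold

# Hypothetical daily counts of an uncommon diagnosis reported by sentinel
# emergency departments over the previous ten days.
history = [0, 1, 0, 2, 1, 0, 1, 1, 0, 2]
print(flag_unusual_counts(history, today=9))  # a sharp spike triggers review
```

In practice such a rule would be applied per diagnosis category and per region, so that small clusters dispersed in space or time are not lost in aggregate totals.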
Should the reported symptomatology for certain individuals signal a possible covert release of chemical or biological substances in a civilian population, the computer system could also aid in determining the environmental medium of exposure (e.g., air, water, or food) and assist public health, law enforcement, and hazardous-material authorities in reconstructing the source, the footprint of the exposures, and the spectrum of doses. It could do this by modeling the applicable vectors of dispersion (e.g., wind speed and direction, water flows, transportation systems, or even the distribution of contaminated goods or services). Swift retrospective analyses of these data would enable public health authorities to more quickly deduce the source, isolate areas of exposure, locate contaminated property, and distribute available vaccines, antidotes, and useful information. Law enforcement personnel could also act to acquire evidence to catch the perpetrators, evidence that might otherwise become undetectable over time. The benefits of having such a system are clearly demonstrated by Meselson et al. (1994), in which the investigators effectively combined medical, biological, meteorological, and demographic data to demonstrate retrospectively and convincingly that anthrax released atmospherically from a Russian production facility was responsible for incidents of infection in the downwind community of Sverdlovsk in 1979.
Models Facilitating Assessment and Planning
Even if individual first responders can be adequately equipped and prepared to safely handle the physical and emotional hazards of attending to victims and their families following an act of chemical or biological terrorism, it is crucial that the efforts of these individuals be administered
and coordinated systematically and objectively. Such direction is necessary for minimizing or eliminating additional exposures, averting needless pain and suffering, and preventing any amplification of serious adverse health effects in the population or among first responders.
An important part of providing the guidance the first-responder community needs in order to react rapidly, competently, collaboratively, and instinctively during and after a chemical or biological terrorist event is a mechanism for establishing a clear understanding of how to adjust quickly to different environmental circumstances (e.g., meteorological conditions, hydrological events, and geophysical formations), human behavior (e.g., traffic and mass hysteria), infrastructure limitations (e.g., availability of hospitals, pharmaceuticals, and services), and communication interruptions (e.g., network breakdowns). Field exercises are one such mechanism, but these seldom can address more than one issue at a time, may be difficult to conduct frequently because of participants' scheduling constraints, and may be expensive because of the levels of personnel and equipment required. Advances in computers and software offer an alternative, making it possible to address the essential training and operational requirements more conveniently and cost-effectively. Such computer-related tools could also help formulate responses to unintended releases of conventional chemicals and hazardous materials. Thus, if developed, these tools could permit first responders to enhance and sustain their ability to assess and plan for a variety of situations. The computer models that could be used now and in the future are discussed next, along with the importance of an improved understanding of the toxicological properties of the chemical and biological substances that might be used in acts of terrorism.
During any threatened or actual act of chemical or biological terrorism, the immediate reaction of the first-responder community will be to identify the specific agent, determine the best methods for reaching and treating any exposed individuals, decide whether to evacuate any critically ill or other potentially susceptible members of the population (e.g., children, seniors, etc.), and consider the most appropriate ways to avert further exposures and casualties. These efforts will require an understanding of chemical and physical properties of the agent, its likely mechanism and location of release, its environmental transport and fate, and the acute and chronic health effects resulting from both low and high dose levels. This can involve individual or multiple exposure pathways (e.g., across skin, lungs, or gastrointestinal tract) and translate into a variety of symptoms, some that may not even require medical intervention.
As described in Chapters 4 and 6, which address detection and measurement of chemical and biological agents, research is under way in
analytical chemistry and genomics to provide advanced techniques and miniature devices that rapidly and accurately detect and recognize small concentrations of chemicals and microorganisms present in environmental and biological samples. First responders would then have at their disposal analytical devices for more rapidly determining the presence of a chemical or biological substance in a sampled environmental medium or biological fluid.
Once the agent is known, its toxicological and chemical properties will be of interest to those responding to the incident. Currently, this information can be obtained by verbal communication with local poison control centers and/or experts in the federal government, but eventually the information and experts might be even more promptly available by computer network (see Chapter 5). Much of the currently available toxicological information is not comprehensive for the chemical and biological substances considered in this report, and in most cases it documents only lethal dosages or acute effects by a specific exposure pathway. It is nevertheless reasonable to expect that even such limited information will be used and extrapolated, if necessary, pending the development of more precise and relevant data. In fact, this conclusion about the quality of the available toxicological information is consistent with the results of a review of the acute human-toxicity estimates for selected chemical-warfare agents performed by the Committee on Toxicology of the National Research Council (COT/NRC, 1997). Developing more information on the toxicological behavior of such substances, including physiologically based pharmacokinetic (PBPK) models for estimating biochemical metabolism, is necessary for understanding the full range of health effects likely to be seen during and after a release, especially those likely to occur from low-dose exposures that might not require extensive medical intervention. Furthermore, such information is valuable for providing realistic instructions for using protective equipment and for authorizing reentry into contaminated areas and the decontamination of property.
Where public event planning requires that consideration be given to preparing for chemical or biological terrorism, or there is advance knowledge of the likely location, timing, and type of such a terrorist act, then transport and fate modeling can be performed to determine the extent of the release and to identify the population likely to be exposed. The models that would be used for this purpose are those that assess the movement and dissipation of the agent and identify potential locations of serious exposures as well as surfaces of contamination. Combined with dose-response algorithms and demographic data, these models also can be used to translate concentrations in environmental media (e.g., air, water, and soil) into casualty maps.
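The translation from environmental concentration to expected casualties mentioned above is commonly done with probit-style dose-response ("toxic load") relationships. The sketch below illustrates the general form only; the coefficients a and b and the toxic-load exponent n are placeholders, not calibrated values for any real agent:

```python
import math

def casualty_fraction(conc_mg_m3, minutes, a=-9.8, b=2.0, n=2.0):
    """Probit-style toxic-load model: probit Y = a + b*ln(C^n * t), with the
    affected fraction given by the standard-normal CDF of (Y - 5), following
    the classical probit convention.  The coefficients a, b, and n are
    placeholders, not values for any real agent."""
    toxic_load = conc_mg_m3 ** n * minutes  # C^n * t
    y = a + b * math.log(toxic_load)
    return 0.5 * (1.0 + math.erf((y - 5.0) / math.sqrt(2.0)))
```

Applied cell by cell to a dispersion model's concentration field, together with gridded demographic data, a relation of this form is what turns predicted concentrations into the casualty maps described above.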
Scenarios for terrorist acts that involve the release of chemical or biological substances into ambient air are considered among the most likely, as this is an easy way for a terrorist to achieve dispersion, affect a large population, and gain attention. Therefore, models currently available or that are undergoing development primarily focus on identifying the consequences of a release into air.
For addressing a release directly into the atmosphere, the available models range from the simple (Gaussian puff models simulating advection and dispersion), which can be operated on a desktop personal computer, to the complex (three-dimensional, particle-tracking models that use real-time acquisition of local meteorological data and account for terrain). The latter are computationally intensive and require larger computers and specialists for their operation. However, to be applied correctly, even the desktop programs require that the user possess a good degree of familiarity with the software and its operation and a reasonable knowledge of the model's attributes and limitations. Mazzola et al. (1995) describe many of the atmospheric-dispersion models currently available and indicate whether they are governmental or commercial.
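The simple end of this spectrum can be captured in a few lines of code. The following sketch evaluates a single instantaneous Gaussian puff advected downwind at a constant wind speed; the linear growth of the dispersion coefficients with travel distance is a crude stand-in for the stability-class parameterizations that real models use, and all coefficient values are illustrative only:

```python
import math

def gaussian_puff(q_kg, x, y, z, t_s, u=3.0, ay=0.08, az=0.06, h=2.0):
    """Concentration (kg/m^3) at point (x, y, z) metres, t_s seconds after an
    instantaneous release of q_kg at the origin and height h metres.
    The puff centre advects downwind (+x) at wind speed u (m/s); the
    dispersion coefficients grow linearly with travel distance, a crude
    substitute for a stability-class parameterization."""
    xc = u * t_s                     # downwind position of the puff centre
    sig_h = max(ay * xc, 1e-3)       # horizontal spread (x and y directions)
    sig_z = max(az * xc, 1e-3)       # vertical spread
    norm = q_kg / ((2.0 * math.pi) ** 1.5 * sig_h * sig_h * sig_z)
    expo = ((x - xc) ** 2 + y ** 2) / (2.0 * sig_h ** 2) \
         + (z - h) ** 2 / (2.0 * sig_z ** 2)
    return norm * math.exp(-expo)

# Concentration at the puff centre 10 s after a 1-kg release, and 5 m off-axis.
print(gaussian_puff(1.0, 30.0, 0.0, 2.0, 10.0))
print(gaussian_puff(1.0, 30.0, 5.0, 2.0, 10.0))
```

Even this idealized form makes clear why user judgment matters: the answer is only as good as the assumed source-term, wind field, and spread coefficients.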
There are also other, more specialized models for describing the behavior of materials released into the airflow near buildings and into particular structures (e.g., buildings and subways). A description of the purpose of these models appears in a recent U.S. Department of Energy (DoE) publication to which several National Laboratory research groups contributed (ANL, LBNL, LLNL, and LANL, 1997). That document explains recent developments and plans by DoE researchers to direct their atmospheric-science and computer-simulation expertise toward improving transport and fate modeling for application to urban environments, building interiors, and subway systems. Such work is necessary because accurately predicting the transport and fate of material released above urban terrain, inside buildings, and within the subway systems of metropolitan areas requires finer scaling (e.g., finer grid resolution) and additional physical considerations (e.g., more complex fluid dynamics) than currently available regional-scale diagnostic models provide.
In applying any atmospheric-dispersion model, it is important that the source-term properties be reasonably well defined. Some atmospheric-dispersion models now available, for example, VLSTRACK (Bauer and Gibbs, 1996), have attempted to address specifically the likely methods used to release a chemical or biological substance and then attempt to describe adequately the resulting dispersive and advective nature of the release. Models of this type have in common that they can predict concentrations in air, and to some degree, the footprint of deposition during and after release. However, the current versions of such models address specific
physical processes, landscapes, and even sources of release, and one model may provide better results than another, depending on the situation being considered. Furthermore, requirements for model selection, including sophistication, accuracy, and computer power, may well depend not only on how the model addresses the type and nature of the release, but also on the degree to which the model can approximate the terrain over or through which the released material will disperse (e.g., simple or complex, rural or urban) and on the quantity of meteorological information that is needed and can be made available or approximated.
Along with VLSTRACK, there are other atmospheric-dispersion modeling programs. Some are designed to combine atmospheric-dispersion modeling with effects analyses to help emergency responders and decisionmakers address chemical releases. Among these is CAMEO (computer-aided management of emergency operations), which uses ALOHA (areal locations of hazardous atmospheres) as its atmospheric-dispersion model. CAMEO was developed by the U.S. Environmental Protection Agency and the National Oceanic and Atmospheric Administration (1996) to assess unintentional chemical releases. The program operates on a personal computer but would require some adaptation (e.g., specific information about the type of material released and the physical characteristics of the source-term) to address the release of a chemical or biological agent into an urban environment. Another modeling system for assessing potential hazards specifically related to the release of particular chemical, biological, and nuclear materials is HPAC (hazard prediction and assessment capability), which includes the Second-order Closure Integrated Puff (SciPUFF) model for assessment of atmospheric transport. This software also operates on a personal computer and is distributed by the Defense Special Weapons Agency (1997) to aid in hazard assessment when the atmospheric transport of a chemical, biological, or nuclear source-term can be estimated. The adequacy of any or all of these models for application by first responders requires further examination, including evaluation of their applicability to different situations and of the operational expertise they require.
Finally, one of the most sophisticated computer simulation programs for hazard assessment is the property of the DoE and is located in California at the Lawrence Livermore National Laboratory's National Atmospheric Release and Advisory Center (NARAC). The Center's primary responsibility involves predicting the dispersion of accidentally released radioactive materials, but the system can address a variety of other substances as well. Because this system uses real-time meteorological information, particle tracking, and accounts for complexity of terrain, it is a numerically complex tool that does not operate on a desktop personal computer and requires trained personnel for operation and interpretation.
The more information that can be provided about the source-term and meteorology, the more accurate the predictions of dispersion and effect will be and the faster such information can be obtained and fed back to the requester.
If any of the models just discussed are to be employed during a real emergency, their operation and limitations must be familiar to the users. In fact, NARAC represents a valuable tool to the DoE precisely because it is a dedicated operation whose operators practice regularly and can be called upon at any time of day. However, there is a cost associated with its operation because of its centralization, dedicated personnel, and extensive data requirements. Nevertheless, the system employed by NARAC personnel, with recent enhancements specifically addressing releases of chemical or biological agents, has been successfully employed to plan protection of the public at events of special significance (Ermak, 1998). Similarly, recent modifications to subway ventilation models (Policastro, 1998) and to indoor air models (Sextro, 1998) have made it possible to apply these models to releases of chemical or biological agents, although at this time only under certain conditions and primarily by the model developers. It is conceivable that emergency response organizations might solicit the use of these models to plan appropriate responses to releases of chemical or biological substances during special events. Unfortunately, such models may be too difficult or too costly for many communities to use. In such cases, it may be more appropriate for those communities to develop their own expertise with less sophisticated desktop computer models. With such expertise, these models could be used to produce contingency plans based on conservative parameter estimates, as well as conservative estimates of concentration levels over a landscape during real emergencies.
Training to respond to a chemical or biological terrorist act can involve any currently available atmospheric-dispersion modeling system and would help prepare the first-responder community intellectually to deal with unfamiliar situations. However, current computer models for addressing the transport and fate of substances are not interactive and do not reflect to any significant degree the movements of people in time or space during and after an event. To address this limitation, an effort is currently under way at the Lawrence Livermore National Laboratory to introduce realistic scenarios involving releases of chemical and biological substances into a conflict-simulation software system. This conflict-simulation software has been used successfully as a training tool for military actions on a battlefield (Sackett, 1996) and is undergoing proof-of-principle modification to include results from an atmospheric-dispersion program and relevant toxicological data, so it can
also address health consequences of chemical and biological releases. In the conflict simulation, multiple operators are allowed to role play interactively and move their personnel and equipment during the simulated incident. For example, the simulation software allows individuals and groups to be moved on a variety of urban landscapes and in the presence of different meteorological conditions (e.g., rain, wind, sunshine) and human activities (e.g., traffic patterns, mass exodus, and the actions of perpetrators and responders), all during a specifically designed scenario involving passage of an atmospherically dispersed substance. The dose-response algorithms that are introduced permit the model to provide information about declining performance or death likely to be observed in exposed individuals, both stationary and moving in and out of the cloud. This conflict-simulation model allows communication failures, changing weather conditions, and human-behavior patterns to be introduced into the scenario, giving users the opportunity to respond to situations changing dynamically and to immediately visualize the results of their acts of commission and omission.
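The coupling such a simulation requires between a dispersion calculation and moving individuals can be sketched simply: accumulate each person's dose by sampling the time-varying concentration field along that person's path. The toy concentration function and the walking paths below are hypothetical, not part of the conflict-simulation software itself:

```python
import math

def concentration(x, y, t_s, u=3.0, q=1.0):
    """Toy ground-level puff concentration (arbitrary units): the puff
    centre moves downwind (+x) at speed u and spreads as it travels.
    Purely illustrative; not a calibrated dispersion model."""
    sig = max(0.1 * u * t_s, 1e-3)
    r2 = (x - u * t_s) ** 2 + y ** 2
    return q / (2.0 * math.pi * sig ** 2) * math.exp(-r2 / (2.0 * sig ** 2))

def integrated_dose(path, dt=1.0):
    """Accumulate exposure for a person whose position each dt seconds is
    given by `path`, a list of (x, y) points: the time integral of the
    concentration sampled along the path."""
    return sum(concentration(x, y, i * dt) * dt
               for i, (x, y) in enumerate(path))

# Two hypothetical one-minute walking paths: one crosses the plume axis,
# the other stays well off to the side.
crossing = [(60.0, 30.0 - step) for step in range(60)]
offset = [(60.0 + step, 200.0) for step in range(60)]
print(integrated_dose(crossing) > integrated_dose(offset))
```

Feeding each person's accumulated dose through a dose-response relation at every time step is what lets such a simulation display declining performance or death for individuals moving in and out of the cloud.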
Support for Decontamination and Reoccupation Strategies
In many cases, unexposed and decontaminated populations will need access to safe zones and routes of evacuation. Additionally, contaminated people, structures, and landscapes will require cleanup. Certifying exposed facilities, structures, and vehicles as suitable for reuse, and individuals as adequately decontaminated, requires defining safe levels for the released substances and their degradation products. Quantitative health-risk assessments of these substances would generate that information. Such assessments would benefit greatly from, and represent another reason for obtaining, a better understanding of the toxicological and chemical behavior of the substances that could be released in an act of terrorism.
The Committee advocates that the following research and development efforts be undertaken to enhance and sustain the capabilities of the medical community to deal with chemical and biological terrorism. Such events, serious as they are, have a low probability of occurrence, but the products of these R&D efforts would also help to identify emerging infections and diseases and to respond to events involving hazardous substances released unintentionally in industrial settings.