Enhancing BioWatch Capabilities Through Technology and Collaboration: Proceedings of a Workshop (2017)

3

Recommendations from the Government Accountability Office Report and the Department of Homeland Security Response

The objectives of the workshop’s second session were to accomplish the following:

  • Review the findings and recommendations of the October 2015 Government Accountability Office’s (GAO’s) review of BioWatch system enhancements.
  • Review the Department of Homeland Security’s (DHS’s) response to the GAO recommendations and assessment of gaps in current BioWatch detection, early warning, and decision support capabilities, and discuss any possible gaps DHS has not considered.
  • Review available technological and non-technological capabilities for biosurveillance and reporting, including autonomous systems, as well as the assessment and testing of such systems, with a focus on the BioWatch program.

Timothy Persons, chief scientist at GAO, presented the findings in Biosurveillance: DHS Should Not Pursue BioWatch Upgrades or Enhancements Until System Capabilities Are Established (GAO, 2015), and Mike Walter delivered DHS’s response. A question-and-answer session followed each of the two presentations.

GAO FINDINGS AND RECOMMENDATIONS

Before discussing the findings and recommendations in the GAO report, Persons thanked Walter and his staff for cooperating fully with the GAO audit and the National Academies of Sciences, Engineering, and Medicine for holding this workshop as a safe place to discuss and try to resolve the issues raised by this GAO report. Persons also reminded the workshop that GAO is an independent, nonpartisan agency that serves Congress and helps support its Article One oversight functions.1 Although this oversight role receives most of the attention, GAO also tries to improve the government by providing foresight. The only thing GAO is against, said Persons, is ineffective government.

Since 2003, Persons explained, DHS has focused on acquiring an autonomous detection system to replace the current BioWatch Generation 2 (Gen-2) system, but has faced challenges in clearly justifying the BioWatch program’s needs and ability to reliably address those needs. He noted that an Institute of Medicine and National Research Council study released in 2011 (IOM and NRC, 2011) concluded that the rapid initial deployment of the BioWatch Generation 1 and Gen-2 technologies did not allow for sufficient testing, validation, and evaluation of their technical capabilities. In September 2012, GAO found that DHS approved the Generation 3 (Gen-3) system acquisition in October 2009 without fully developing critical knowledge that would help ensure sound investment decision making, pursuit of optimal solutions, and reliable performance, cost, and schedule information. Persons said GAO has found that other technology acquisition programs, such as the Advanced Spectroscopic Portal Monitor and the Transportation Security Administration’s advanced imaging technology, were challenged in similar ways.

Based on its findings, GAO recommended that before continuing the Gen-3 acquisition, DHS should conduct key acquisition steps, including reevaluating the mission need and systematically analyzing alternatives based on cost–benefit and risk information. DHS subsequently commissioned an analysis of alternatives, which was interpreted by DHS as showing that any advantages of an autonomous system over the current manual system were insufficient to justify the cost of a full technology switch. In April 2014, DHS canceled the Gen-3 acquisition. Having done so, DHS continues to rely on the Gen-2 system for early detection of an aerosolized biological attack.

The 2015 GAO study, said Persons, was requested by several congressional committees and subcommittees, as well as one senator, and was a systems engineering and technology-centered analysis and evaluation. The study did not evaluate the decision process that goes into declaring a BioWatch Actionable Result (BAR). The three research objectives of the study were to determine the following:

___________________

1 For more information on congressional oversight, see http://www.senate.gov/artandhistory/history/resources/pdf/CRS.Oversight.pdf (accessed October 4, 2016).

  1. To what extent has DHS assessed the technical capability of the currently deployed system (Gen-2) to detect a biological attack, which is necessary to inform decisions about upgrades and enhancements?
  2. To what extent did DHS adhere to best practices for developmental testing during Gen-3 Phase I, and what lessons can be learned as DHS considers upgrades to Gen-2?
  3. Which technology is currently most mature for an autonomous detection system as a possible upgrade from Gen-2, and what would the potential benefits and likely challenges be if DHS were to pursue an autonomous detection system in the near future?

The analysis, he explained, looked at the program on three levels: the components of the detector, the functionality of a single detector, and the functionality of an array of detectors (see Figure 3-1). Strategic thinking, he said, goes into each layer of the system.

Finding 1

GAO’s first finding was that DHS has not developed performance requirements that would allow for conclusions about Gen-2’s ability to detect attacks. Persons stressed that GAO was not critiquing how well the program collaborated with other agencies and levels of government or how well the decision-making process worked. He noted that DHS officials told GAO that the system’s operational objective is to detect catastrophic attacks, which DHS defines as attacks large enough to cause 10,000 infections, and they stated that the system is able to meet this objective. However, while DHS has commissioned some testing of the system’s performance characteristics, department officials told GAO they have not developed technical performance requirements that would enable them to interpret the test results and draw conclusions about the system’s ability to meet its operational objective.

FIGURE 3-1 The three levels at which the Government Accountability Office evaluated the BioWatch system.
SOURCE: Persons presentation, July 27, 2016.

The GAO review found that DHS conducted four key tests of the Gen-2 system. These included a system-level test using killed agents in a chamber and component-level tests of the filter wash and DNA extraction procedure conducted at the Dugway Proving Grounds in 2013, component-level tests of the aerosol collector plus filter wash performed at the Edgewood Chemical Biological Center in 2010, and component-level tests of the polymerase chain reaction (PCR) assays for sensitivity and specificity conducted over several years at Los Alamos National Laboratory. In addition, said Persons, DHS commissioned a demonstration of the system in an outdoor environment and conducts quality assurance tests on an ongoing basis. Both of these provide additional information about the system’s capabilities, but GAO did not include them in its list of key tests because neither was designed to produce estimates of key performance characteristics, including sensitivity, or to support conclusions about the types and sizes of attack the system can reliably detect. Persons explained that the outdoor demonstration, performed by the Naval Surface Warfare Center Dahlgren Division, involved releasing a simulant for one of the BioWatch threat agents and showed that the Gen-2 technology could successfully detect this simulant in an open-air environment. However, aerosol concentrations were not varied systematically and measured independently in such a way as to produce statistical estimates of the system’s sensitivity.
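To make concrete what a statistical estimate of sensitivity would involve, the sketch below fits a detection curve to hypothetical detect/no-detect outcomes at independently measured aerosol concentrations. The trial data, the logistic model, and the 90 percent detection threshold are all assumptions chosen for illustration; this is not BioWatch test data, nor the methodology GAO or DHS used.

```python
# Illustrative sketch only: fits a logistic detection curve to hypothetical
# chamber-test data (detect/no-detect outcomes at measured aerosol
# concentrations). The concentrations and outcomes are invented; this is not
# BioWatch test data or the GAO/DHS methodology.
import math

# Hypothetical trials: (concentration in arbitrary units, 1 = detected, 0 = missed)
trials = [(0.5, 0), (0.5, 0), (1.0, 0), (1.0, 1), (2.0, 1), (2.0, 0),
          (4.0, 1), (4.0, 1), (8.0, 1), (8.0, 1), (16.0, 1), (16.0, 1)]

def log_likelihood(a, b):
    """Log-likelihood of the model P(detect) = 1 / (1 + exp(-(a + b*log(conc))))."""
    total = 0.0
    for conc, detected in trials:
        p = 1.0 / (1.0 + math.exp(-(a + b * math.log(conc))))
        p = min(max(p, 1e-9), 1 - 1e-9)   # guard against log(0)
        total += math.log(p) if detected else math.log(1.0 - p)
    return total

# Crude maximum-likelihood fit by grid search, to keep the example dependency-free.
a_hat, b_hat = max(((a / 10.0, b / 10.0) for a in range(-50, 51) for b in range(1, 51)),
                   key=lambda ab: log_likelihood(*ab))

# Concentration at which the fitted curve predicts a 90 percent probability of detection.
c90 = math.exp((math.log(0.9 / 0.1) - a_hat) / b_hat)
print(f"fitted a={a_hat:.1f}, b={b_hat:.1f}; estimated 90% detection at about {c90:.1f} units")
```

The point of such a design is that systematically varied, independently measured concentrations let an analyst attach a curve, and a confidence statement, to the system’s probability of detection rather than a single pass/fail observation.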


GAO also found limitations and uncertainties in the four key tests of the Gen-2 system’s performance characteristics, particularly with DHS’s use of test chambers instead of operational environments and the use of simulants in place of live biothreat agents. Chamber environments, said Persons, are generally not designed to represent operational environments in such factors as air temperature, humidity, and even the presence of potential interfering agents in the air; he described them as well-controlled laboratory tests. Similarly, simulants may not mimic the biothreat agents that the system is designed to detect in all of the ways that matter for system performance, and as a result, the system might perform differently when presented with the target biothreat agents than when tested with simulants.

In addition, while one of the four tests assessed the performance of the whole Gen-2 system, the three other tests were limited by their focus on components of the system, including the aerosol collection component; the filter wash process; the DNA extraction process; and the analytical component. Persons noted that a National Research Council (NRC) committee in the report Review of Testing and Evaluation Methodology for Biological Point Detectors (NRC, 2005) determined it is uncertain whether test results from individual components of a biodetection system will accurately reflect the performance of the whole system. He added that DHS did take steps to mitigate the limitations associated with not testing the Gen-2 system in an operational environment with live biothreat agents, but these limitations could not be entirely eliminated. For example, the tests at the Dugway Proving Grounds included a direct comparison of live and killed agents, but this could be done only for the analytical component of the system. The Edgewood test of the aerosol collection components included variations in temperature and humidity, but only a small number of combinations of temperature and humidity were tested. Finally, the Los Alamos tests of the PCR assays included testing the assays with a set of environmental organisms and substances, but they did not generalize to other organisms and substances that might occur in BioWatch operational environments.

Finding 2

GAO’s second finding was that the modeling and simulation studies DHS commissioned have not directly and comprehensively assessed Gen-2 capabilities. In the absence of technical performance requirements for Gen-2, DHS officials said they have used modeling and simulation studies, commissioned from multiple national laboratories, to link test results to conclusions about the system’s ability to detect attacks. However, said Persons, these studies were not designed to directly and comprehensively assess Gen-2’s operational capabilities, and none of the studies that were provided or described to GAO incorporated specific test results, accounted for uncertainties in those results, and drew specific conclusions about the Gen-2 system’s ability to achieve the defined operational objective.

Persons said the national laboratories did the best they could with the information they had, but GAO found that the modeling and simulation studies were focused largely on the probability of detection by the national system as a whole and were less focused on what capabilities the detectors needed to provide results that would inform the decision framework. The modeling and simulation studies were designed to predict the capabilities of hypothetical biodetection systems similar to BioWatch with different performance characteristics and deployed in different ways, and they assessed possible trade-offs in deploying fewer detectors with higher sensitivity or deploying more detectors with lower sensitivity. The modeling and simulation studies also analyzed how these hypothetical systems would respond to simulated attacks of different sizes, using different agents, in different locations, and under different conditions, based on optimal locations rather than the actual locations of Gen-2 collector units. They also analyzed ranges of hypothetical system sensitivities rather than incorporating the results of the four key tests of the performance characteristics of Gen-2.

DHS commissioned additional modeling and simulation studies for the purpose of selecting sites for Gen-2 collector units. These studies included a test result that was meant to describe the sensitivity of the Gen-2 system, but this was an older result and represented only one of the five BioWatch threat agents. In addition, these studies used a measure of operational capability that did not directly support conclusions about the BioWatch objective of detecting attacks large enough to cause 10,000 casualties. Instead, the studies used a measure called fraction of population protected (Fp). Persons explained that Fp is a weighted probability of detection that represents a system’s probability of successfully detecting simulated attacks. Fp is calculated in a way that gives more weight to attacks that infect more people and less weight to attacks that infect fewer people.
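The weighting Persons described can be written compactly. The proceedings do not give the program’s exact formula, so the consequence-weighted form below is an illustrative assumption consistent with his description rather than the documented definition:

\[
F_p \;=\; \frac{\sum_{i} N_i \, P_{d,i}}{\sum_{i} N_i},
\]

where the sum runs over simulated attack scenarios i, N_i is the number of people scenario i would infect, and P_{d,i} is the modeled probability that the deployed detector array detects scenario i. Under this assumed form, the largest simulated attacks dominate the value of Fp.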

Summarizing the general limitations of the modeling and simulation studies, Persons said that none of them provided a full accounting of statistical and other uncertainties, which means decision makers have no means of understanding the precision or confidence in what is known about system capabilities. The studies did not incorporate information about uncertainties associated with estimates of the system’s limits of detection, and they did not account for uncertainty in some model inputs and assumptions, including estimates of how infectious each of the BioWatch threat agents is and how quickly each agent decays after it is released into the air. The researchers who conducted these studies reported that gaps in their knowledge of the correct dose–response relationship significantly limited their ability to predict the outcome of outdoor anthrax attacks.
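A dose–response relationship of the kind the researchers flagged as uncertain expresses the probability of infection as a function of inhaled dose. The proceedings do not say which model the laboratories used; one commonly used form, shown here only to illustrate why the uncertainty matters, is the exponential model

\[
P_{\text{infection}}(d) \;=\; 1 - e^{-r d},
\]

where d is the inhaled dose in organisms and r is the per-organism probability of initiating infection. Predicted casualty counts for an outdoor release depend directly on the assumed value of r, so even modest uncertainty in that single parameter produces large uncertainty in predicted outcomes, which is the limitation the researchers reported for outdoor anthrax attacks.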

With regard to the current system capabilities, GAO concluded that as a result of gaps and limitations in testing and analysis, considerable uncertainty remains as to the types and sizes of attacks that the Gen-2 system could reliably detect, and therefore DHS lacks the basis for informed cost–benefit decisions about possible upgrades or enhancements to the system. GAO also concluded that assessing the operational capabilities of the Gen-2 system against technical performance requirements directly linked to an operational objective, incorporating specific test results, and explicitly accounting for statistical and other uncertainties would help ensure that future investments are actually addressing a capability gap.

Based on those conclusions, GAO recommended that DHS should not pursue upgrades or enhancements to the Gen-2 system until it

  • establishes technical performance requirements, including limits of detection, necessary for a biodetection system to meet a clearly defined operational objective for the BioWatch program by detecting attacks of defined types and sizes with specified probabilities;
  • assesses the Gen-2 system against these performance requirements to reliably establish its capabilities; and
  • produces a full accounting of statistical and other uncertainties and limitations in what is known about the system’s capability to meet its operational objectives.

Finding 3

During a June 4, 2013, meeting conducted in collaboration with the National Academies, GAO identified eight best practices for developmental testing of binary threat detection systems.2 For the most part, said Persons, the BioWatch program partially aligned with these best practices when it was testing the Gen-3 system (see Table 3-1). Persons said the Gen-3 acquisition was canceled while GAO was conducting its assessment; however, GAO also identified several lessons DHS could learn by applying these practices more systematically to improve future testing and acquisition efforts (e.g., when testing possible upgrades or enhancements to Gen-2).

Persons noted that in recent years, DHS has canceled major acquisitions that GAO previously found could have been more rigorous in their test design or execution and that could not have performed as intended given that they were based on relatively immature technologies placed in an unsuitable operational context.

___________________

2 GAO convened a meeting of 12 experts with assistance from the National Academies in order to determine best practices on this topic. The meeting was held to inform GAO report 15-263, titled Combating Nuclear Smuggling: DHS Research and Development on Radiation Detection Technology Could Be Strengthened. For more information on the meeting, see Appendix II of the report at http://www.gao.gov/assets/670/668906.pdf (accessed October 10, 2016).


TABLE 3-1 DHS Alignment with Best Practices for Developmental Testing During Gen-3 Testing

Best Practice for Developmental Testing | Assessment
Practice 1: Ensure that accountability and engagement in developmental testing are commensurate with the amount of risk accepted | Partially aligned with the best practice
Practice 2: Include representatives from the user community in design and developmental testing teams to ensure acceptance of the system by the user community | Did not align with the best practice
Practice 3: Take a proper systems engineering view of the system prior to entering into any developmental test | Partially aligned with the best practice
Practice 4: Use statistical experiment design methodology to establish a solid foundation for developmental testing | Partially aligned with the best practice
Practice 5: Measure and characterize system performance with established procedures, methods, and metrics | Partially aligned with the best practice
Practice 6: Test to build resilience, especially in the development stages | Partially aligned with the best practice
Practice 7: Use developmental tests to refine requirements | Partially aligned with the best practice
Practice 8: Engage in a continuous cycle of improvement by (1) conducting developmental testing, (2) conducting operational testing, and (3) incorporating lessons learned | Fully aligned with the best practice

SOURCE: Persons presentation, July 27, 2016.

He stressed that this context is understandable given the threats the nation faces and the department’s desire to do what it can to meet the challenge of detecting these threats. “It is just that too often the technology can be oversold and underdelivering,” said Persons. He noted, too, that DHS responded positively to the recommendation to incorporate best practices and has done so with its technology development projects, including those aimed at enhancing BioWatch Gen-2 capabilities.

Finding 4

The fourth finding, which Persons characterized as a miniaturized technology assessment, was that PCR is the most mature technology available for an autonomous detection system, but there are still uncertainties about the benefits and challenges regarding how it would be used to support the BAR decision science framework.


TABLE 3-2 National Academies’ Evaluation of Technologies for Autonomous BioWatch

Characteristic | PCRa (Nucleic Acid Signatures) | Immunoassays and Protein Signatures | Genomic Sequencing | Mass Spectrometry
Technology Readiness Levelb | 9 | 6+ | 4 | 6
Sample Preparation Effort | Moderate | Low | Moderate | None
Sensitivity | High | Moderate | High | High
Specificity | High | Moderate | High | Moderate
Cost | Moderate | Low | High | Low
Stand-alonec | Yes | No | No | No

a PCR = polymerase chain reaction. PCR is currently used in the Gen-2 BioWatch system.

b Technology readiness level (TRL) is a method of estimating the maturity of a technology for a given purpose, ranging from TRL 1 to TRL 9. For example, a technology readiness level of 6 indicates the technology has been validated and is ready for testing in a setting representative of an operational environment. A readiness level of 9 indicates the technology has been deployed under operational mission conditions. Although the National Academies report assessed PCR at TRL 9—the highest TRL level—the PCR-based autonomous detection system tested under the Gen-3 acquisition was not assessed at that high a level for the whole system. We reported in 2012 that an independent technology readiness assessment rated all but one of the critical technology elements of that system as TRL 7—indicating a relatively high level of maturity for each technology element assessed—but lower than TRL 9.

c A technology is stand-alone if it can be used to detect and confirm the identity of an agent without the use of another technology.

SOURCES: Persons presentation, July 27, 2016; IOM and NRC, 2014.

The GAO report noted that while autonomous detection may provide benefits that include reduction in casualties and clean-up costs and greater cost efficiency, the potential benefits of an autonomous system for BioWatch depend on specific assumptions, some of which are uncertain.

In 2014, a National Academies workshop titled Technologies to Enable Autonomous Detection for BioWatch: Ensuring Timely and Accurate Information for Public Health Officials explored several technologies that could be used in an autonomous BioWatch system (IOM and NRC, 2014), and several discussants agreed that PCR was the most capable at the time of serving in this role (see Table 3-2). Persons noted in concluding his comments that an autonomous detection system faces five likely challenges to development in the near term. These include

  1. ensuring the detection technology is sensitive enough to identify targeted agents dispersed in the air;
  2. ensuring the system does not generate false positives;
  3. ensuring that the networked communications between the autonomous system and stakeholders are secure;
  4. managing the extent of the reported data and interpreting the data to determine appropriate follow-up actions; and
  5. overcoming the difficulty of keeping such a system operational over an extended period of time in dirty or extreme environments.

DISCUSSION

The discussion began with moderator Gerald Parker, associate vice president for public health preparedness and response, Center for Innovation in Advanced Development & Manufacturing, Texas A&M University Health Science Center, crediting the BioWatch program for bringing together public health, emergency management, law enforcement, and intelligence at the local, state, and federal levels to learn how to deal with information that is initially low confidence but that gains confidence as more information comes into the system once a BAR is declared. “This is probably one of the best examples of how a program and a technology has driven that collaboration in understanding how to deal with low-confidence information in a confident way,” said Parker. He believes, however, that if the program does not get the technology right, this collaborative spirit will disappear, which is why he considers the GAO audit of the program’s technology and the insights from that audit so important. Going forward, he said, the future of the program will depend on how the entire BioWatch community addresses the shortcomings noted in the GAO report.

James Lambert, research professor and associate director of the Center for Risk Management of Engineering Systems, University of Virginia, first noted that systems engineering is not a monolithic philosophy, but rather one in which there is a key schism. In the academic community, represented by the Institute of Electrical and Electronics Engineers and the International Council on Systems Engineering, systems engineering emphasizes requirements and decision sciences. However, in the practitioner community, which includes the U.S. military, the North Atlantic Treaty Organization, and others, systems engineering is more forward thinking and is willing to accept technologies whose missions will evolve in the face of ever-changing requirements. He suggested that the BioWatch community would benefit from keeping this more future-oriented philosophy in mind. Doing so led Lambert to consider the goal of keeping the casualty toll from any biological agent release below 10,000 and the role of technology in meeting that goal. He estimated that technology contributes perhaps 5 to 10 percent of the capability to meet that casualty goal, with intelligence, deterrence, and other national capacities playing larger roles. Lambert also wondered whether the same level of technology will be deployed universally across the country, or if some places will get a more advanced version of the technology depending on how resources are allocated. His final point was that he would like to see more of a focus on the time horizons—minutes, hours, or days—over which the technology provides the ability to make decisions.

Persons replied that it was not GAO’s position that DHS should not take risks or take the lead in developing technology. GAO’s message in this report, said Persons, was “with the risk that you have, how do you manage those so that you get that one-of-a-kind type of technology that only the government can do itself?” The market, he said, is not going to supply the needed technology without some incentive or some symbiosis with the federal government. The report, he added, was not meant to eviscerate the idea of risk. “It is how you manage it and how you test it that is the theme of the report,” explained Persons. Given the technology is not perfect, the question is how to have assurance up the decision chain if the technology is not trustworthy at the lowest level. With regard to how the technology is deployed across the country, Persons said that for other programs, GAO has endorsed piloting technologies in different parts of the country to see how they function under different meteorological conditions and in varied contexts. The same principle should be applied to BioWatch detectors, he said.

Persons and Walter had a brief discussion about the 10,000-casualty metric and the Fp metric. Walter reiterated that Fp is the fraction of the population that would be affected by a BioWatch-detected release. The 10,000-casualty metric was established by DHS’s Science and Technology Directorate (S&T) as a measure of a widespread, catastrophic bioterrorism event. The difference between these two measures, he explained, is that Fp addresses the monitoring and public health protection BioWatch provides to a given jurisdiction as opposed to addressing merely the size of an attack.

Walter then questioned why GAO determined that BioWatch has issues with developmental testing given that BioWatch does not do such testing. Developmental testing, he said, is the purview of S&T, and S&T uses probability of detection, not Fp, as a metric for determining how well the system works in development. BioWatch’s testing program looked at the independent performance of its collectors, and that, he explained, entailed conducting a whole system test against a killed agent to examine the entire process, from collection through analysis and identification, and a separate operational demonstration on a military base using a simulant to show that the system could detect and identify it. Nonetheless, Walter said, BioWatch was “deemed as unreliable by the GAO report because we had never conducted a live agent test in an operational environment to prove the system would work under conditions of operations.” In his opinion, conducting that type of test would be extremely irresponsible and akin to a bioterrorism attack; he noted that if he oversaw that type of exercise, there would be serious professional repercussions, including possible prison time.

Kathryn Brinsfield expressed this conundrum in a slightly different way. “We want to create an excellent program, but there is a certain amount of risk with the technology that will never be fully enough developed to be perfect. Short of actually releasing live agent in New York, which is probably not a good idea, from either GAO’s or the committee’s perspective, what are the other tests that could be done that would put the confidence necessary to say that we know it’s not perfect, but this is good enough for making accurate decisions?” Answering that question, she added, seems to be an unanswerable holy grail. Persons replied that he understands those comments and suggested that the issue is with the way S&T operates.

Matthew Davenport, program manager, DHS S&T, added that he sees the GAO report as directed more toward DHS, not BioWatch, in that BioWatch has operational requirements that set the stage for what DHS should be doing in the research and development environment and how it should approach testing the technologies it develops for deployment in the field. In the case of BioWatch, said Brinsfield, no one seems to be able to say what the optimal test for BioWatch should be, in part because the science does not yet exist to say what the perfect metric should be for detecting a biological agent in a way that would protect a certain percentage of the population.

Roger Pollok, interim assistant director of the Environmental Safety Division, San Antonio Metropolitan Health District, said his city has fully embraced BioWatch and has wanted to increase the size of the program. As a former laboratory director, he understands that the technology is not perfect, but is unsure how it could be better. “I am sure there’s a possibility, but how much money do you throw at the system to make it perfect?” he asked. He noted that he and his colleagues see BioWatch as an assessment tool and not the sole piece of information that would trigger activating the national public health emergency response system. “There are a lot of other things that go into play with this,” said Pollok. “It is getting on the telephone with your infection control practitioners in the hospitals and looking at the reportable diseases that are required to be reported by law. It is looking at syndromic surveillance systems and at what the threats are. There’s a much bigger picture.”

In that respect, added Davenport, one of the benefits of BioWatch has been that it led to the development of many of the protocols that have enabled decision makers at the local, state, and federal levels to understand how to deal with low-confidence information and try to increase confidence in that information. “That is a huge benefit that should not be overlooked,” he said. Julia Gunn, director, Communicable Disease Division, Boston Public Health Commission, agreed with that comment and said that perhaps what is needed is a more thorough examination of how the single data point from BioWatch sits within all of the other information becoming available and the analytics required to interpret the pooled pieces of information. “How do we build those systems to expand the utility of the technology to monitor air for biological agents and think about this as a system to improve public health decision making and response?” asked Gunn.

Mark Buttner, professor, Department of Environmental and Occupational Health, School of Community Health Sciences, University of Nevada, Las Vegas, said Las Vegas did not embrace BioWatch at first, but since embracing it, the city has come to regard the program as a great success. From his perspective, the city’s robust quality assurance program has generated a large database of false-negative and false-positive rates, and so he was curious about what additional performance requirements need to be specified and met before the program can pursue enhancements. He noted that the GAO report implies that the program should not pursue enhancements until the capabilities are fully established, but he believes one could argue that the program should continue to pursue enhancements while it establishes its capabilities.

Persons replied that GAO did not evaluate what frontline public health officials such as Buttner and Pollok think of the program. “We were just asking a simple engineering question of [whether you can] tell me the sensitivity and specificity in its operational environment,” said Persons. From an engineering perspective, the answer to that question does not currently exist. “I’m not denying that you all are saying, ‘It’s good enough for us,’” he added. “That is not the issue. That gap we are talking about is for numbers that say here is how the system works with sensitivity, specificity, limits of detection, and so on in its operational environment, not just the laboratory.”

Brinsfield, attempting to summarize the discussion to this point, said there are three issues:

  1. Simulants do not exactly mimic a real biological agent.
  2. Releasing a real agent in a real city is not feasible.
  3. At a certain point, spending money to make the system that much better does not make sense.

She then asked the assembled experts at the workshop for their ideas on a test DHS could run that would say, “We know we cannot answer your question exactly because that will never happen, but is this good enough?” She noted that the simulant test may not have been perfect, but a great deal of thought and environmental study went into planning that test to provide as much information as possible.


Lieutenant Michelle Hohensee, U.S. Public Health Service Officer and jurisdictional coordinator program lead, BioWatch program, OHA, DHS, said that in 2015 there was an uptick in Francisella tularensis, the agent that causes tularemia, and there were multiple BARs for this organism that were indicative of a public health issue in one of the BioWatch jurisdictions. “The fact that it did track what was going on at that particular location, to me, seems very operational even though we did not release something, it was aerosolized, and it was environmental, but it shows the system does work and that it does track [observations],” said Hohensee. Buttner replied that the limit to that test was that the amount of organism released and its source were not known, but Walter disagreed with that comment. “I think the fact that we have an outbreak of disease that is actually a vector-borne disease and the system is still sensitive enough to pick up the organism that is causing the disease in an aerosol in an operational environment demonstrates the efficacy of the system,” said Walter. “If we are that good and we know we can detect this organism during the outbreak of disease, it argues from an operational perspective that the system is effective.” Persons countered that “good” seems relative. “When you say ‘we are good,’ what is the threshold of good? That is the core issue I am trying to get at,” said Persons.

Daren Brabham, assistant professor, Annenberg School for Communication and Journalism, University of Southern California, commenting as someone with expertise in communications rather than lab science, said that it seems the BioWatch detector works well in the context of being an indicator that a process needs to start. From that perspective, he wondered why that is not good enough. He then noted that it appears that user testing was missing in the table that Persons presented on testing (see Table 3-1) and asked why user testing was missing at such a late stage of development; in most technology development efforts, users are involved along the way. He also suggested that a deep organizational communication study examining how these systems are used in different cities, why false positives happen, how the cities handle false positives, and other aspects of process and outcome could yield some important insights that would improve the overall program. Persons responded that nowhere in the GAO report is there a recommendation to develop a perfect system. The issue the report does raise, he reiterated, is that the metrics used to determine that the BioWatch detector is good enough are unclear.

John Vitko, Jr., remarked that some of the biggest uncertainties in the system arise not from the detectors, but from the ability to model air flows and other aspects of how a biological agent would disperse in a given microenvironment. In his opinion, the question that needs asking is how to balance refining the instrumentation further in the face of all of the other uncertainties in the system. Persons agreed that is a good question, and he reiterated that the comments in the GAO report were directed at DHS as a whole and not just at the BioWatch program office. Brinsfield added that the challenge for DHS is to balance how much money to spend improving the technology, developing better microenvironmental modeling, increasing public health’s involvement in the project, and so on to meet the goal of saving as many lives as possible in the event of a biological agent release.

Roger Pollok noted that there are two essential technology pieces in the BioWatch system: the collection system and the analytical system. Thanks to the large amount of money invested by medical testing companies, PCR works, he said. The question is whether the system can capture enough of the organism for PCR to detect it. To answer that question, he suggested using carbon-13 labeling to quantify the fraction of a test organism that the detectors capture, which would address at least one of GAO’s concerns.

Al Romanosky noted that the BioWatch system has generated a great deal of data since its inception. He suggested these data should be available for the scientific community to analyze to see if any insights can be gleaned from the data. In the same way, he said, the program should take a closer look at how the different pieces of the BioWatch system, from the detector to the local response, have worked as a means of identifying which parts are operating together most efficiently. An examination of how New York City responded to a recent Legionella outbreak, for example, produced some important lessons that are being applied to improve the public health response system there.

Concluding the discussion session, Brinsfield said, “What I get loud and clear from this discussion is that while we lack the perfect test and clearly lack the ability to do the perfect test, one way we could look at some of those issues is to figure out what data we have within the program that we could release to some of the partners and see what studies they would like to run on their own data.” She noted there will be jurisdictional issues to resolve for that to happen, but she did not see that as a major obstacle given the partnerships that exist between the BioWatch program and the local public health communities.

DHS RESPONSE TO THE GAO RECOMMENDATIONS AND AN ASSESSMENT OF GAPS IN CURRENT BIOWATCH CAPABILITIES

Providing an account of how the BioWatch program has addressed GAO’s recommendations, Walter first said that the GAO investigation was something to be taken advantage of, because it is always beneficial to have someone from outside the program, without a vested interest in any particular aspect of it, look at the way things are done and identify places that can be improved. The 18-month investigation, he said, produced three recommendations, related to establishing technical performance requirements, testing the system against those requirements, and accounting for uncertainties and limitations of the system. To address those three recommendations, BioWatch engaged three national laboratories—Argonne, Los Alamos, and Sandia—to complete the following tasks by April 30, 2016:

  • Identify and define an alternate BioWatch system performance measure other than fraction of population protected (Fp).
  • Develop a model for the alternate performance measure and run it using current BioWatch system performance parameters—the result of the modeling should yield a specific measurement for each jurisdiction.
  • Develop and document the new program performance measure, modeling, and outcomes, and provide a description of uncertainties and limitations associated with BioWatch program measures.

With regard to the fourth GAO recommendation, which was to use best practices to inform test and evaluation actions, Walter said BioWatch has adhered and will continue to adhere to acquisition management guidance provided by DHS, and is ensuring the recommended best practices are incorporated into its approach and documentation. He noted that BioWatch is working closely with DHS S&T on the best ways to do testing and evaluation going forward.

Walter then discussed several items in the GAO report with which BioWatch and DHS do not agree. To start, GAO concluded that DHS has not defined technical performance requirements that would link test results to conclusions about the types and sizes of attacks. To the contrary, said Walter, BioWatch uses Fp to make this linkage, where Fp considers multiple attack scenarios, as well as the known performance of the BioWatch system, and focuses on maximizing the number of lives protected. GAO also concluded that BioWatch does not incorporate empirical data gathered on the currently deployed system to inform modeling and simulation studies, when in fact BioWatch actively sponsors modeling and simulation studies that use a range of empirically derived sensitivity values. The GAO report also stated that DHS lacks reliable information about BioWatch Gen-2’s technical capabilities to detect a biological attack, when in fact, four independent tests have been conducted over the past 6 years that have tested all components of the BioWatch system, and Walter said these tests proved system reliability. In addition, the BioWatch Quality Assurance Program has analyzed more than 30,400 samples to monitor operations against performance benchmarks and requirements. “If someone can come up with a different test for us to use that is more definitive, I am happy to do it,” said Walter. “It may be, though, that this is one of those unresolvable issues, because no one is going to allow me to do bioterrorism in their city to prove a point.”
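Walter’s point about the quality assurance data suggests one way routine QA results could be summarized as performance bounds. The sketch below is illustrative only: the sample counts are invented, the split between blanks and spiked controls is an assumption, and the calculation is a standard exact bound for zero observed events rather than anything drawn from BioWatch documentation.

```python
# Illustrative sketch only: one way routine quality assurance results (blank
# filters and spiked positive controls) could be turned into confidence bounds
# on false-positive and false-negative rates. The sample counts below are
# invented and are not BioWatch QA data.

def upper_bound_zero_events(n_samples, alpha=0.05):
    """Exact one-sided upper confidence bound on an event rate when zero
    events were observed in n_samples independent trials."""
    return 1.0 - alpha ** (1.0 / n_samples)

blanks_analyzed = 30_000                   # hypothetical blank (negative control) samples
spiked_missed, spiked_total = 12, 5_000    # hypothetical spiked-control results

fp_upper = upper_bound_zero_events(blanks_analyzed)   # assumes no false positives were seen
fn_point = spiked_missed / spiked_total                # observed false-negative rate

print(f"95% upper bound on the false-positive rate: {fp_upper:.2e} per sample")
print(f"Observed false-negative rate on spiked controls: {fn_point:.2%}")
```

The general point is that large QA sample counts can support quantitative statements, such as an upper confidence bound on the false-positive rate, even when no failures have been observed.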

In multiple places in the report, GAO cited the limitations and uncertainties associated with testing a biological detection system, and Walter said these limitations and uncertainties are known to the program and are addressed in program documentation and other BioWatch-funded efforts. Walter said the assertion that GAO “discovered” these limitations and uncertainties could be construed as meaning that the BioWatch program was unaware of these issues, implies that the program is unreliable, and has the potential to unfairly undermine confidence in a national security program. “That is something, more than anything else, that we had a problem with,” said Walter. “We believe the program is reliable. We believe the system does work. We have had 160 BARs and the system has worked quite well.”

Walter also addressed the GAO recommendation to identify a metric other than Fp as a system performance measure. The three national laboratories identified the probability of detection of a biological aerosol attack that results in more than 10,000 infections (Pd-10K) as an alternate measure. Pd-10K, he explained, focuses on detecting catastrophic attacks and weighs all scenarios involving more than 10,000 people equally in order to maximize the number of scenarios covered. By contrast, Fp detects attacks of the greatest consequence in terms of the number of people infected, including catastrophic attacks, and maximizes the fraction of the population covered by detecting the most consequential scenarios.
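The difference between the two measures is essentially a difference in weighting, which the short sketch below makes explicit. The scenarios and detection probabilities are invented, and the Fp weighting shown is an assumption consistent with the workshop description rather than the program’s documented formula.

```python
# Illustrative sketch only: contrasts Fp (consequence-weighted probability of
# detection) with Pd-10K (probability of detecting attacks causing more than
# 10,000 infections). Scenarios and detection probabilities are invented.

# Each hypothetical scenario: (number of people infected, modeled probability
# that the deployed collector array detects the release).
scenarios = [
    (200,     0.15),
    (1_500,   0.40),
    (8_000,   0.70),
    (12_000,  0.85),
    (60_000,  0.95),
    (250_000, 0.99),
]

# Fp: weight each scenario's detection probability by its consequence.
fp = sum(n * p for n, p in scenarios) / sum(n for n, _ in scenarios)

# Pd-10K: average detection probability over catastrophic scenarios only,
# weighting every scenario above the 10,000-infection threshold equally.
catastrophic = [p for n, p in scenarios if n > 10_000]
pd_10k = sum(catastrophic) / len(catastrophic)

print(f"Fp     = {fp:.3f}  (dominated by the largest attacks)")
print(f"Pd-10K = {pd_10k:.3f}  (equal weight to all attacks above 10,000 infections)")
```

With consequence weighting, the largest simulated attacks dominate Fp, whereas Pd-10K treats every scenario above the 10,000-infection threshold the same.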

Though the national laboratories came up with the Pd-10K metric, they concluded that there is no significant difference between the Fp and Pd-10K performance measures, whether the new, experimentally derived Dugway Proving Ground sensitivity values for limits of detection are used or the sensitivity values used in current BioWatch models and simulations. The national laboratories noted that Pd-10K is not a meaningful metric for indoor facilities and subways because attacks that large would likely infect the entire facility or system and thus any collector architecture would detect it. In addition, Pd-10K is not a robust performance metric for determining the optimal number and placement of collectors in facilities or subways. Modeling with the two measures produced nearly identical results for four different outdoor jurisdictions, and the national laboratories concluded that neither metric is completely sufficient for measuring achievement of BioWatch’s ultimate mission objective, which is to reduce casualties and the economic and social impacts of a biological attack.

Based on the results from the national laboratories, Walter explained that BioWatch will continue to use Fp for collector siting purposes in outdoor, indoor, and subway locations. For outdoor deployments, BioWatch will characterize system performance using both Fp and Pd-10K, but it will continue to rely on Fp to characterize system performance for indoor facilities. He noted that prior to siting collectors, Monte Carlo simulations are performed with hundreds of thousands of simulated attacks, ranging from very small to very large quantities. “That will give us a pretty good idea on how to site these collectors indoors, which is becoming more and more important to the program,” said Walter.
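A toy version of the Monte Carlo siting exercise Walter described is sketched below. The grid geometry, the fixed detection radius standing in for a plume and collector model, and the scenario counts are all invented for illustration and bear no relation to actual BioWatch siting models or their scale.

```python
# Illustrative sketch only: a toy Monte Carlo siting exercise. Random releases
# are simulated on a simple grid "facility," and candidate collector locations
# are scored by the fraction of simulated releases that fall within a fixed
# detection radius. The geometry, radius, and counts are invented and do not
# reflect actual BioWatch siting models.
import random

random.seed(1)
GRID = 20                    # toy facility modeled as a 20 x 20 area
N_RELEASES = 20_000          # simulated release scenarios (real studies use far more)
DETECTION_RADIUS = 6.0       # stand-in for a plume reaching a collector

candidates = [(x, y) for x in range(0, GRID, 5) for y in range(0, GRID, 5)]
releases = [(random.uniform(0, GRID), random.uniform(0, GRID)) for _ in range(N_RELEASES)]

def coverage(collectors):
    """Fraction of simulated releases detected by at least one collector."""
    hits = sum(
        1 for rx, ry in releases
        if any((rx - cx) ** 2 + (ry - cy) ** 2 <= DETECTION_RADIUS ** 2
               for cx, cy in collectors)
    )
    return hits / len(releases)

# Greedy placement: repeatedly add the candidate location that raises coverage the most.
placed = []
for _ in range(3):           # hypothetical budget of three collectors
    best = max(candidates, key=lambda c: coverage(placed + [c]))
    placed.append(best)
    candidates.remove(best)
    print(f"placed collector at {best}, coverage now {coverage(placed):.2%}")
```

The actual siting studies rely on dispersion modeling and far larger scenario sets rather than a fixed radius, but the sketch conveys the general idea of scoring candidate placements against a large set of simulated releases.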

Walter said BioWatch is now working with Los Alamos National Laboratory to modify its modeling and simulation tool to automatically compute Pd-10K for BioWatch jurisdictions. Once this tool is optimized to BioWatch’s satisfaction, said Walter, it will be used whenever possible to determine the Pd-10K value for all BioWatch jurisdictions and report system performance using both Fp and Pd-10K. When possible, BioWatch will also use the experimentally measured sensitivity values from the Dugway Proving Grounds test, along with confidence intervals, for all applicable modeling and simulation. It will continue to follow established DHS acquisition policy for any future upgrades or changes to BioWatch technology. He added that BioWatch is working with S&T to find a technology that is not PCR, but as effective as PCR.

DISCUSSION

Lisa Gordon-Hagerty opened the discussion by asking Walter to discuss the timeline BioWatch is following to further characterize system performance and develop enhancements to the technology. Walter replied that for the modeling effort using Pd-10K for outdoor deployment, the program has funded Los Alamos National Laboratory and that should be completed in the 2017 fiscal year. Regarding technology acquisition, he said that the hope is to get through an initial test and evaluation effort with S&T in early to mid-2017. “That will be a technology demonstration or evaluation of technologies that we believe are mature enough for us to go into acquisition and deployment,” said Walter. The next step, which would be completed by the end of the 2017 calendar year, will be to take those test results and identify suitable companies to develop the technology for deployment. He noted that S&T has already funded the technology demonstration phase and that the current plan is to budget approximately $10 million at the beginning of 2018 for technical testing of up to three technology concepts developed by private-sector contractors and operational testing for as many as two technology concepts. BioWatch has already contracted with the Applied Physics Laboratory at Johns Hopkins University to serve as an independent test assessor. Operational testing, he noted, will take place in select jurisdictions and will be overseen by those jurisdictions.

Discussion also took place about the dangers of developing a technology and purchasing it at the same time, something that is not limited to the BioWatch program. During this discussion, Persons suggested that the way to fill the testing gap is to conduct modeling and simulation across all levels of the system. Brinsfield noted that BioWatch has done a good job testing the system with the available funds and that DHS has yet to determine how much technology enhancement will cost. Walter added that there is still a need for a policy decision that states the threat against which BioWatch is charged with defending.

The message Vitko got from the GAO report was that the modeling and simulation for BioWatch were conducted on an “ideal technology” rather than the system architecture in place, but Walter said that is not true. The siting models used to place the collectors rely on data from the Dugway tests and meteorological data from proposed locations at that time of year. “Short of conducting releases every day of the year under every sort of environmental condition, that is the best I can do,” said Walter.

Gunn asked if there is a searchable database for recording problems with specific detectors, and Walter replied that there is. “We track the performance of all the collectors on a daily basis. If there is a problem with a collector, then the field director is required to submit to the program, within 6 hours, [which] collector it was, what was not working, and then to put in what they call an exception report,” he said. That information, he added, has informed acquisitions. He also noted that BioWatch is introducing an automated sample tracking system that will tabulate these data.

Responding to another question from Gunn about whether the program reevaluates detector siting, Walter replied that his office responds to jurisdictional requests for reoptimization and modeling of the arrays deployed in the requesting jurisdiction. A reexamination in two jurisdictions where explosive growth has occurred, San Antonio and Austin, found that while their populations had increased, the metropolitan statistical area containing that population had not changed. BioWatch has also reappraised detector siting in Dallas–Fort Worth and is now looking at Las Vegas, given that the population there has expanded to new areas. These reappraisals are done on a case-by-case basis because they are not inexpensive to conduct, he explained. When Gunn suggested it might be appropriate to reappraise every jurisdiction according to a planned timetable, Walter said that would be a good idea, but the program lacks sufficient funds to do so. Brinsfield added that if the local BioWatch program partners want a systematic reappraisal, the funds to do so would have to be requested in future budget requests. Acknowledging the budgetary restrictions, Gunn argued that a more systematic reappraisal process could allow areas that had been overlooked to be reappraised instead of just those jurisdictions with the more “squeaky wheels.” Walter replied that the entire system was reoptimized in 2012, and he expressed concern that the jurisdictions would argue among themselves about the order in which they would be examined. He also reiterated that if a jurisdiction requests a reexamination the program will make it happen.

Gordon-Hagerty asked Walter whether there is a plan to expand the BioWatch system and, if so, whether there is a standard procedure for adding new jurisdictions that want to participate in the program. Walter replied that there is not a plan and that the BioWatch program needs to create a standard operating procedure for how it would expand, using as a model the deployments it has made for national events such as the Super Bowl or political conventions. He noted that when the system was deployed in Charlotte for the 2012 Democratic National Convention, city officials wanted to know why the system did not remain in place. Walter explained that while the program had the equipment to deploy, it did not have the $2 million to $4 million per year in its budget needed to maintain the system. The last jurisdiction added permanently to the program was Las Vegas in 2010, and additional cities have asked to join. Walter noted that part of the Gen-3 acquisition program called for doubling the size of the network and that returning to that plan might be feasible. Referring to any future expansion plan, Jonathan Greene said there is no methodology for deciding which cities should or should not be included. Brinsfield added that the original list of cities for BioWatch deployment was developed with significant input from Congress. She also said that, without a stratification of risk, DHS could develop a new list, but it would do so with significant input from risk assessment experts, including those at the Federal Emergency Management Agency (FEMA).

Mullins asked whether there has been a secondary evaluation of the 160 BARs reported over the 2003 to 2016 period. Walter replied that it is up to the laboratory to decide whether to culture the detected organism or do other follow-up examinations. He explained that the performance of the assays and the laboratories is tracked daily and that a standard set of controls, spiked samples, and blanks is run at each of the 29 laboratories around the country every time samples are collected and analyzed. He noted that the spiked samples contain DNA at a concentration that has a 95 percent chance of detection, so some laboratories will occasionally report a negative finding when in fact the sample was positive for the test DNA.
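
Walter’s point about the spiked positive controls can be made concrete with a little arithmetic. The sketch below is a back-of-the-envelope illustration, not a description of the program’s actual quality-control statistics; it assumes each of the 29 laboratories runs one spiked control per collection cycle and that results are independent, using the 95 percent per-control detection probability Walter cited.

# Back-of-the-envelope illustration of why occasional negative results on
# spiked controls are expected even when the assay performs as designed.
# Assumptions: 29 labs, one spiked control each per cycle, independent results,
# 95 percent detection probability per control (the figure Walter cited).
p_detect = 0.95
n_labs = 29

p_at_least_one_miss = 1 - p_detect ** n_labs
expected_misses_per_cycle = n_labs * (1 - p_detect)

print(f"P(at least one lab misses its control in a cycle) = {p_at_least_one_miss:.2f}")
print(f"Expected missed controls per cycle across all labs = {expected_misses_per_cycle:.2f}")
# Roughly a 0.77 probability of at least one miss per cycle, about 1.45 expected misses.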

PANEL DISCUSSION ON THE ALIGNMENT OF S&T FOCUS AREAS AND BIOSURVEILLANCE

With the groundwork laid by Persons’s summary of the GAO findings and Walter’s presentation of the DHS response to the GAO recommendations, a panel with representatives from DHS and the local jurisdictions provided a broader perspective on the issues raised previously. The panel members included Mark Buttner; Matthew Davenport; John Fischer, director of DHS S&T’s Chemical-Biological Defense Division; Julia Gunn; Lieutenant Michelle Hohensee; Roger Pollok; and Wendy Smith, epidemiology program manager for the Georgia Department of Public Health. The session was moderated by John Vitko, Jr., who asked the panelists a series of questions to stimulate the discussion.

Are there additional perspectives on the gaps or issues identified by either the GAO review or DHS’s response to that assessment? Is there anything you would like to add: insights into what has already been discussed or proposed, or issues you think should have been raised but were not? (Vitko)

Hohensee responded that one gap that had not yet been mentioned is that there are not many automatic data feeds in the BioWatch system that can submit data to a centralized database. Currently, most BioWatch operational data are entered manually into a decentralized data system, so one potential requirement for any technology enhancements could be the capability to submit data automatically. Hohensee also noted that the current portal for the BioWatch database is compliant with the Federal Information Security Management Act (FISMA) and that the department plans to move the portal from the “.org” environment to the DHS security environment. Ensuring that all future data feeds are compliant with FISMA and the DHS security environment will be a significant consideration when looking at technology enhancements.
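
To illustrate the kind of automatic data feed Hohensee described, the sketch below shows one way a monitoring site could push a structured record to a central collection service instead of relying on manual entry. Everything here is hypothetical: the endpoint URL, the payload fields, and the credential are placeholders rather than real BioWatch interfaces, and a real feed would have to meet the FISMA and DHS security requirements she mentioned, which this sketch does not address.

# Hypothetical sketch of an automated data feed: a site-level script builds a
# structured record and prepares to post it to a central collection service
# instead of relying on manual entry. URL, field names, and credential are
# placeholders, not real BioWatch interfaces.
import json
import urllib.request

record = {
    "jurisdiction": "EXAMPLE-CITY",                 # placeholder identifier
    "collector_id": "PSU-042",                      # placeholder collector
    "sample_collected_at": "2016-06-01T08:00:00Z",  # placeholder timestamp
    "status": "filter_retrieved",                   # stage in the daily workflow
}

request = urllib.request.Request(
    url="https://biowatch-central.example.gov/api/feeds",   # placeholder URL
    data=json.dumps(record).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <site-credential>",        # placeholder credential
    },
    method="POST",
)

# A real deployment would send the request to a FISMA-compliant service inside
# the DHS security environment, e.g., urllib.request.urlopen(request).
print(request.full_url)
print(request.data.decode("utf-8"))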

With regard to addressing some of the issues GAO raised about developmental testing, Davenport said that S&T has funded work at the U.S. Army Edgewood Chemical Biological Center (ECBC) to leverage some of the test capabilities it developed as part of the Joint United States Forces Korea Portal and Integrated Threat Recognition program. He noted that S&T and the DHS Office of Health Affairs (OHA) issued a series of Requests for Information covering as many as six technology categories that could be transitioned into the program in the near term and over the long term. For the near-term technologies, as well as some that Sandia National Laboratories and the BioWatch program identified through a market survey, S&T has provided funds to ECBC to conduct tests and generate independent performance data that can determine whether any of them meet BioWatch’s needs. Examples of such technologies include trigger and collector systems and portable handheld detectors that could be used for a presumptive field screen after a triggering event. Davenport said that S&T is now working with Walter and his team to develop what he called entrance and exit criteria for the testing program that will identify performance criteria and enable S&T to prepare standard operating procedures for test protocols.

Given that BioWatch fits into an overall biosurveillance system comprising multiple elements, how are risk and responsibility apportioned across that system? Who is responsible for making those determinations? (Vitko)

Before answering the question, Fischer reminded the workshop that BioWatch came to be because President George W. Bush declared in his 2003 State of the Union address that the United States would build a biosurveillance system. “That is not an efficient way to start anything,” said Fischer. In the Department of Defense (DoD), where he worked for 31 years before joining DHS, the procedure for any new system was to first develop the requirements for that system: what its function would be, what its performance characteristics would be, and how that performance would be measured. “That is not just for aircraft carriers or nuclear submarines,” said Fischer. “You want to buy a $2 item, you will have a requirement. If you do not, you do not buy it. It is that simple.” The result, he explained, is that any new program must develop a well-orchestrated, disciplined process for setting requirements and costs, a process that takes time. BioWatch, by contrast, was deployed in 3 months, and the result is a program with extremely high risk. “BioWatch has never been given the opportunity to sit back, take a deep breath, and ask what it should be doing differently now, how it makes improvements, how it can measure performance, and [how] it can measure progress,” said Fischer.

Today, he said, “BioWatch lives day to day because there is always a [congressional representative], a senator, or a staffer knocking on your door, demanding that something needs to be done immediately with no consideration for how long it should take to truly develop this system to serve its intended function. We really have to think long term and plan long term, but we have to react day to day,” said Fischer. He noted that 20 percent of his budget for next year is dedicated to biosurveillance and to responding to the program’s specific near-term requests, but there is little time to develop a long-term roadmap describing where the program should be in 10 years. That time horizon is important for this particular program because nothing in biosurveillance is commercially available. “Right now, with a very limited budget, with intense scrutiny, and with unrealistic expectations from Congress and [the Office of Management and Budget], we’re making incremental progress and incremental gains,” said Fischer. He noted that S&T can borrow some pieces that DoD is developing for its biosurveillance system, but a system to be deployed for military purposes has much different operational requirements and costs far more than a system intended for use in cities and inside buildings.

Fischer noted that S&T, working with OHA, has an incredibly aggressive plan to meet congressional demands. He believes, though, that if GAO conducted another review at this point, the result would not be favorable, because addressing the issues GAO raised will take time, and he suspects that time will exceed the patience of Congress. He also raised the point that neither BioWatch nor S&T has ever been asked how much money it needs to build the desired system; each is simply given a budget and told to make do with it. “We are in what I would call a very high-risk scenario right now and I personally am very worried,” said Fischer.

Returning to the question Vitko asked, Fischer suggested creating a program executive office in DHS for biosurveillance. This office would control the budget, serve as the acquisition official, and coordinate all activities among S&T, acquisitions, and BioWatch, which is the operational arm.

Does S&T get enough clarity about the technical performance requirements for systems you’re being asked to develop? (Vitko)

The short answer, said Fischer, is no. He added that the BioWatch program office is now developing an operational requirements document that would be used to develop key performance parameters and metrics. “Without those documents, there are no requirements,” said Fischer, who noted that within 1 year, the answer to that question will then be yes.

Given their experiences with BioWatch, do the local jurisdictions accept the validity of the technology and its potential utility? (Vitko)

Pollok responded that San Antonio has a high level of confidence in the program, particularly in the science underlying the technology. In his experience working with the BioWatch program office, he has seen no gap between the local and federal levels. “What I am seeing here today is a gap between the GAO and the program,” said Pollok. “To me, it seems that if you all would sit down and identify what the operational requirement is, the specifications, I think we could meet those requirements.” He suggested that a possible test material could be a fluorescent powder that behaves and looks like Bacillus anthracis or Francisella tularensis, which would allow a test of the operational system. He noted that the laboratory science has been proven repeatedly.

Buttner noted that communication and collaboration with stakeholders are important to acceptance of the program. As an example, when a new public health officer was hired in Las Vegas, the BioWatch jurisdictional coordinator and people from the national program office came to Las Vegas and educated the new officer and his staff. The result, said Buttner, was that Las Vegas went from having almost no interest in the program to what he characterized as almost total buy-in, including expansion of the local BioWatch advisory committee, increased engagement with the program office, and unification of all of the local responders. He credited improvements in the BioWatch program office’s communication and collaboration efforts for prompting this acceptance. He also noted that the jurisdictions have been invited to participate in the process of creating a BioWatch strategic plan, which includes communication as an underlying theme. One gap that Buttner said needs to be addressed concerns expansion of the BioWatch system: Las Vegas, he said, wants to expand its monitoring system to include new areas in and around the city where events take place nearly every weekend.

Smith said that Georgia’s experience with BioWatch has been a very positive one and that Georgia has come a long way since the early days of the program, when there was no plan for a response in the event of a BAR. The BioWatch national meetings have acted as great sources of information that she then uses to educate her stakeholders on the BioWatch advisory committee, whose 70-some members include officials from five public health jurisdictions and other partners. She said she has focused on helping the committee and her staff understand what a BAR really says and how to proceed once a BAR is declared. Another good source of information was the Naval Postgraduate School Homeland Security Program, which she attended along with her public health emergency preparedness director. She said she understands that BioWatch is just an environmental sampling program, but the health security advisory committee formed to support BioWatch in Georgia is the only place in the state where all of the stakeholders come together and have productive discussions about how to respond to a BAR, as well as to Zika or other biological threats. BioWatch, she said, has enabled a great deal of trust building across communities in Georgia.

What other important gaps, aside from those discussed in the GAO report, concern the local jurisdictions? (Vitko)

In Texas, said Pollok, there is a gap between the cities and the state, though that is not the result of anything the BioWatch program has done. Rather, some cities in the BioWatch network would prefer to keep the local meetings and conference calls as local events that do not involve state-level officials.

Davenport said that BioWatch should develop an operational information infrastructure that would describe what should happen in terms of information gathering and response once a BAR is declared. He noted that S&T has a program that works with state and local officials to identify their information-sharing needs and how best to share that information. He suggested that improving information flow among local and state public health and threat response programs and the local BioWatch advisory committee could hasten a coordinated response to a BAR. Adding to this sentiment, Hohensee said the information and data systems are largely decentralized, but creating a secure, centralized system would not only feed information on a BAR to every participating jurisdiction but would also enable mapping and other assessments that could trigger a broader, coordinated response to a BAR. Gordon-Hagerty said FEMA uses the WebEOC system, through which every state and city can report local data. Gunn added that DoD, which has an ongoing project for gathering and analyzing data from different locations, has noted the difficulty of bringing non-standardized data into a centralized system. She suggested that BioWatch would benefit from natural-language processing and machine learning capabilities for text data to accompany a centralized data platform.

Gunn agreed that data and information are the critical pieces of the larger BioWatch system. One gap that was obvious to her is that, given the current decentralized data system, DHS might not be receiving all of the critical local, and perhaps state, information, and even when that information does reach DHS, it may already have reached the local public. “Your ability to interpret information will become extremely difficult once things get out,” said Gunn. She also said there is a need for more practical modeling tools, as opposed to ones that require 24 hours of supercomputer time to run. Such models could provide crucial information to help decision makers interpret a BAR. An additional need, she said, is a public health workforce that can use these new data sources, interpret them, and identify hot zones for further investigation. Currently, public health school graduates are not well versed in using these new data sources.

Another gap, said Gunn, has to do with the increasing complexity of electronic health records, which are making it difficult to obtain information from the health care system. She also noted there are instances where negative information from BioWatch is also valuable. As an example, when the Boston Marathon bombing occurred, getting a negative result from BioWatch was valuable because it eliminated a bioterrorism threat from the concerns of first responders. She suggested this type of story should be shared with Congress.

Not all data are equally valuable, said Gunn, so there is a need for tools to create a data hierarchy. Growing volumes of social media data could be a valuable source of information, but there is still much to be understood about those data. The same could be said, she added, for environmental data. “How do we decide what is going to be informative, what’s going to be clutter, and then how do we train the next generation of the analysts who can look at what’s being displayed, and know where we should have some guidance?” asked Gunn.

Mark Buttner said that deploying detectors indoors in arenas and convention centers is something his community would like to see. The challenge from a BioWatch advisory committee’s perspective, though, is what to do with a BAR when a private company stakeholder needs to interpret that BAR and make decisions. He suggested that DHS should engage in educational outreach with various stakeholder groups on what a BAR means. He also believes there are ways of decreasing the time to declare a BAR that do not cost a great deal of money or require autonomous detection. An example he cited was “perhaps using triggers that tell us that there is a large plume of biological particles, and activate a field collection and an analysis so we can get that result faster.” In his opinion, steps also can be taken to optimize the PCR analysis component.
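
Buttner’s suggestion of using a trigger to launch a field collection sooner can be illustrated with a simple threshold rule. The sketch below is hypothetical and deliberately simplistic: real biological triggers rely on more sophisticated optical signatures and background modeling, and the particle counts, baseline window, and threshold multiplier here are invented for illustration only.

# Hypothetical illustration of Buttner's idea: a simple trigger that flags a
# sudden rise in airborne particle counts so a field collection and analysis
# can be started sooner. Counts, baseline window, and threshold are invented.
from statistics import mean, stdev

def plume_trigger(counts, baseline_window=12, sigma=5.0):
    """Return indices where a count exceeds the rolling baseline mean by
    `sigma` standard deviations."""
    alerts = []
    for i in range(baseline_window, len(counts)):
        baseline = counts[i - baseline_window:i]
        threshold = mean(baseline) + sigma * stdev(baseline)
        if counts[i] > threshold:
            alerts.append(i)
    return alerts

# Simulated minute-by-minute particle counts with a spike at the end.
counts = [100, 98, 103, 101, 99, 102, 100, 97, 104, 101, 99, 100, 350]
print(plume_trigger(counts))  # [12] -> would prompt a field collection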

From the local perspective, what future enhancements should BioWatch pursue? (Vitko)

Although not a member of the panel, Romanosky suggested that DHS and BioWatch may want to consult with a futurist to help plan for future enhancements, an idea with which Brabham agreed. Davenport responded that DHS has hired a consulting group that is examining technologies that could be available within 5 years as well as work under way in industry and academia that could be useful in the 10- to 20-year time frame. In the near term, S&T is looking at new microwave technologies, mass spectrometry, and metagenomic sequencing and is also exploring alternatives to PCR, though those technologies still face many challenges with regard to environmental modeling. Brinsfield added that she and her colleagues do spend time discussing what a platform change would entail given what is happening in the genetic engineering and sequencing space. Gunn noted that the demographics of the U.S. population are changing, so it may be necessary to have more sensitive technology, given that the very elderly or immunocompromised may be more susceptible to lower doses of an organism.

Are there tools or information beyond the immediate BioWatch BAR signal that might be useful for informing local responses? For example, are there things you want to explore further, actions you want to take, or major limitations you currently face where a technology or process might help? (Vitko)

Romanosky said he would like to acquire information from predictive modeling that would suggest whether specific populations are at risk, down to the zip code level, so that he could anticipate the penetration of illness at that location based on population demographics, pathophysiology, and disease characteristics. For example, in the aftermath of a powerful storm that knocked out power, he wanted to match the length of the power outage with the number of individuals presenting to emergency departments with heat-related illness, by zip code. Such information might allow him to model and anticipate the combined impact of a prolonged power outage, high temperatures, and the socioeconomic and health status of the population, and then deploy limited resources to the areas where they would have the greatest impact. Although such detailed demographic information about the distribution of chronic illnesses or other factors that would make someone more susceptible to a biological agent may not exist today, it should in the near future, and it would help guide a response not only to a BAR but also to emerging infectious diseases such as Zika. Centers for Medicare & Medicaid Services data on prescriptions and diagnoses at the Census tract level might be a good source of such information. Adding to what Romanosky said, Hohensee noted that DHS is developing a decision support tool that probably could use zip code–based information. This tool may be useful for a BAR and other public health triggers.
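
The kind of zip-code-level matching Romanosky described can be sketched as a simple join of two hypothetical datasets: outage duration by zip code and emergency department visits for heat-related illness by zip code. The values, field names, and scoring rule below are invented purely to illustrate the idea of ranking areas for limited resources; they are not drawn from any real system or dataset.

# Hypothetical sketch of Romanosky's idea: join outage duration with
# heat-related emergency department (ED) visits by zip code and rank areas
# for limited resources. All values and the scoring rule are invented.
outage_hours_by_zip = {"78201": 36, "78202": 4, "78203": 28}
heat_ed_visits_by_zip = {"78201": 22, "78202": 3, "78203": 9}

def priority_score(outage_hours, ed_visits):
    """Toy score: weight prolonged outages and observed illness equally after scaling."""
    return outage_hours / 24 + ed_visits / 10

ranked = sorted(
    (z for z in outage_hours_by_zip if z in heat_ed_visits_by_zip),
    key=lambda z: priority_score(outage_hours_by_zip[z], heat_ed_visits_by_zip[z]),
    reverse=True,
)
for z in ranked:
    print(z, round(priority_score(outage_hours_by_zip[z], heat_ed_visits_by_zip[z]), 2))
# Highest-priority zip codes print first; a real tool would add the demographic
# and health-status data Romanosky described.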

Agreeing with Romanosky, Smith said a great deal of work has been done on preparing to help vulnerable populations in the aftermath of Hurricane Katrina. “There is actually information that we might be able to get pre-event to load into such a system so we could have a clear understanding of our population prior to an event occurring,” said Smith. “Then it might be a little easier to monitor how things are happening at that very local level.” She also noted that syndromic surveillance programs can prioritize populations by zip code, demographic characterization, and chief complaint. The challenge with such systems, she said, is that they come in many forms and do not all generate the same information or provide access to the same information, which is also true of WebEOC. One strength of BioWatch is that everyone uses the same platform. Pollok said that every emergency management team in Texas is required to use WebEOC, which has its own mapping component that can generate a plume map for BioWatch in 20 minutes. The Texas deployment of WebEOC, which is in every hospital, school, and utility office, has a database that identifies which vulnerable individuals require a backup generator in case the power goes out. Brabham suggested that a dashboard application that consolidated information from WebEOC, BioWatch, and other useful data sources would be achievable and of value to the entire public health community. The challenge for such an application in general and WebEOC in particular, said Romanosky, is vetting the information that goes into these systems. Brabham replied that the key is to crosscheck information from different sources for consistency. Romanosky noted that researchers are using social media data to map the spread of disease through a population and that such data could be a useful source of validation information in the future.

Gunn provided the last comment for the panel. “Not all data is of equal value, and the most valuable data [are] going to be a confirmed ill person with the organism of interest,” said Gunn. However, she added, “When you get into that kind of protected health information, there are many legal constraints on sharing and vetting that on time.” She then said that many times when looking at a reportable condition, the geographic location of that case may not be available to the federal government, depending on state statutes. The same is true for syndromic surveillance, and it will be important going forward to identify those data that can be shared through the Centers for Disease Control and Prevention platform or WebEOC and that will be valuable for interpreting a BAR. Gunn suggested that some thought should be given to data-sharing requirements and legal relationships that need to be established ahead of time to facilitate information sharing. She acknowledged that some information may prove to be impossible to share. It will also be important to understand the details behind information so that it can be put in the right context, and to decide who should keep this information so decision makers can better discern their options when a BAR is declared.

CONCLUDING REMARKS OF THE DAY’S DISCUSSIONS

To close the first day of the workshop, Brinsfield and Gordon-Hagerty recapped the items they thought warrant further attention. Modeling was one item; what is needed is to determine which models are useful to state and local public health officials before, during, and after an event. Another request Brinsfield heard was the need for more tools and technology, particularly with regard to data collection. A clear need, she said, is for basic science data on the dose–response curves for special populations that may be more vulnerable to biological agents. In the absence of clear data, an option may be to convene an expert panel to make educated guesses about the susceptibility of specific populations. This information, she noted, can inform basic performance characteristics for the BioWatch system.

The discussion clarified that communication is an important and valued aspect of the program. One aspect that was not discussed, said Brinsfield, was how to coordinate incoming data and push them back out to decision makers. Along those lines, Gordon-Hagerty stressed the importance of sorting out how to share data and of identifying the limitations of datasets and how they are packaged. Both Brinsfield and Gordon-Hagerty said that various platforms can aid communication, but the challenge is determining how to use those platforms in terms of how data go into the platform, how data are validated, and how they are presented to the user. “Creating another six platforms will not necessarily get us where we need to go,” said Brinsfield. “What we need to figure out is how those platforms work together.”

The last item Brinsfield noted was the universal agreement that the BioWatch system warrants further testing. The problem DHS has, she said, is that the day-to-day operational management of the program cannot be set aside while testing is done to perfect the system. Gordon-Hagerty concluded the day’s proceedings by thanking Persons and Walter for their presentations and said she hoped those presentations helped everyone gain a better appreciation for and understanding of what the report was about, what the questions from Congress were focused on, and how DHS, OHA, and the broader U.S. government are going to tackle some of the issues that were raised.
