The best way to predict the future is to invent it.
This report is the second of two National Research Council (NRC) reports that investigate how the Department of Defense (DoD) can use long-term forecasting to reduce the type of surprise that can be caused by high-impact technologies. Both reports were prepared by the NRC’s Committee on Forecasting Future Disruptive Technologies, whose efforts were sponsored by the Office of the Director of Defense Research and Engineering (DDR&E) and the Defense Warning Office (DWO) of the Defense Intelligence Agency (DIA).
The first report, Persistent Forecasting of Disruptive Technologies, describes existing forecasting methodologies and critiques the latest, most innovative attempts to build comprehensive forecasting systems (NRC, 2010). It also discusses disruptive technologies as a source of surprise and suggests design features that should be incorporated into a next-generation forecasting system for disruptive technologies so that better predictions can be made of innovations arising from nontraditional sectors.
For this report, the committee was asked to outline one or more conceptual models that could be used to build a forecasting system for disruptive technologies based on the design features identified in the first report. The sponsor also requested that the committee provide evidence of the feasibility of creating the proposed system and that it recommend options for proceeding. These two reports are intended to help the DoD and the intelligence community (IC) develop a forecasting system that will assist in detecting and tracking global technology trends, producing persistent long-term forecasts of disruptive technologies, and characterizing their potential impact on future U.S. warfighting capabilities. The statement of task for this second report is given in Box 1-1.
To meet the goals of the statement of task for this second report, the committee met in San Francisco on November 5 and 6, 2009. On November 5, the committee convened the Forecasting Future Disruptive Technologies Workshop, a 1-day workshop at which it was joined by a panel of invited experts in related fields (see Appendix C for a complete list of workshop participants). On November 6, the committee held a closed meeting to develop the basis of this report, using outputs and comments from the workshop in addition to data and insights collected in all previous committee meetings (meeting dates and presentations are listed in Appendix B). This report reflects the information received during both phases of the study.
Statement of Task

The committee shall conduct a workshop to provide expert insight in designing a persistent forecasting system.* The committee will invite expert forecasters and users of forecasting systems, including:

The workshop will focus on the development of one or more conceptual high-level diagrams of a process that could be used to produce persistent forecasts of disruptive technologies. The final report will include transcripts of the workshop and copies of visualizations created during the workshop. The committee will comment on the insights gained from past committee meetings and the workshop and recommend options for future courses of action in the development of a persistent technology forecasting system.

From the 1-day workshop and the previous committee meetings, the committee had three broadly defined goals: (1) to develop further the structural framework for how to think about the problem of developing a long-term persistent forecast of disruptive technologies, (2) to create alternative models of what such a system might look like, and (3) to define actionable steps toward the development of such a system. Specifically, the committee's objectives for the workshop were the following:
Develop one or more high-level designs of potential approaches to a 1.0 version of the system.
Gain insights on how to approach the development of the system.
Estimate a gross level of effort for launching such a system.
Document the key insights from the sessions and workshop that could provide guidance for the development of the system.
The present report uses multiple methodologies to approach the development of a version 1.0 forecasting system model that specifically addresses the needs of the defense intelligence community. The sections below in this chapter introduce the context and bridge from the committee's work in its first report (NRC, 2010) on traditional forecasting processes to forecasting systems as conceived by the committee. Chapter 2 describes the results of experiments undertaken by three subgroups of the workshop to actually design forecasting systems that would meet the design criteria explored by the committee; the chapter also outlines a fourth system—a storytelling model suggested by an individual workshop participant. Chapter 3 evaluates and synthesizes the results of the experiments, describes the characteristics of a system that integrates the best attributes of the four design options, and recommends the next steps toward the development of the system. Appendix A contains biographical sketches of the members of the committee. Appendix B lists the presentations delivered to the committee throughout this project. Appendix C lists the experts who participated in the November 5 workshop. Appendixes D and E (on the CD enclosed with this report) provide the unedited transcripts of the workshop, and Appendix F presents graphics created as visualizations of the main ideas produced in the workshop. Because of the volume of the material they contain, Appendixes D and E do not appear in print form.
DEFINING “DISRUPTIVE TECHNOLOGIES”
As described in the first (NRC, 2010) report, the word “disruptive” connotes an interruption or upset to the orderly progression of an event, process, or activity, or a break in service. The word can also imply confusion or disorder, or a drastic alteration in structure. A disruptive technology is an innovative (although not necessarily new) technology that triggers sudden and unexpected effects. Disruptive technologies can have both negative and positive consequences. A disruptive technology can be an enabler (such as the automobile), can pose a threat (e.g., improvised explosive devices, or IEDs), or can have elements of both (e.g., the Internet). By contrast, “emerging technologies,” those that are currently gaining prominence or importance, may become disruptive early or late in their life span, or in a region far from their origin, or they may not become disruptive at all. Often, the potential disruptive impacts of a technology are not initially obvious but become evident in hindsight. For the development of this second report, Committee Chair Gilman G. Louie explained to workshop participants that a disruptive technology is characteristically hard to predict and by nature occurs infrequently, so it might be difficult to identify or foresee. Such a technology can cause an abrupt, revolutionary change to established technologies and markets; and while perhaps starting locally, it may significantly alter the balance of global power (in financial, military, security, trade, and scientific realms).
PITFALLS IN FORECASTING
Why do disruptions happen, and how can they be predicted? Looking at past events caused by disruptive innovations—the attacks of September 11, 2001, or the attack on Pearl Harbor, for example—it is immediately obvious that useful data were available that, if acted on, might have averted those surprise attacks. Analysis of disruptive events after the fact shows that the information necessary to predict the event was missed for a variety of reasons, including the following:
Not knowing enough to ask the right question,
Asking the right question but at the wrong time,
Assuming that the past is an indication of the future,
Mirroring (assuming that one’s beliefs are held by others),
Not having enough pieces of the puzzle to put together the whole picture because of information fragmentation,
Not being able to distinguish good from bad information amidst the noise of information overload,
Biases (institutional, communal, personal, etc.) in data evaluation, and
Lack of vision.
A NEED FOR ENHANCED FORECASTING OF DISRUPTIVE TECHNOLOGIES
Importance of Forecasting to the Department of Defense
The use of technological surprise was identified in the 2006 Quadrennial Defense Review (QDR) as one of four potential threat strategies that could challenge U.S. military capability. Forecasting disruptive technologies or events is important to the DoD for three reasons, as laid out by Alan Shaffer, director of Plans and Programs, DDR&E (Shaffer, 2005). First, in both corporate and military environments, staying current on technologies and anticipating potential disruptive influences are vital to staying competitive. Second, until recently, the United States minimized surprise by protecting its advanced-technology secrets. As other countries and non-state actors become more technologically sophisticated—as purchasers of commoditized technology or as developers or both—the United States can no longer assume technological leadership in every area of technology development or application that might be used for military purposes. In the new paradigm, the military needs to stay abreast of new technologies as well as of new applications being developed throughout the world in order to avoid military surprise. Third, although many believe that the United States does well at keeping abreast of big-platform and
dedicated military technologies, it can still be surprised by the application of commercially and publicly available technologies to create unanticipated disruptive military applications. The pervasiveness and effectiveness of IED attacks against the U.S. military in Iraq and the use of commercial airliners by terrorists against the U.S. homeland are examples of how applications of readily available commercial technologies can be used as highly disruptive military weapons and can surprise military planners.
Additionally, the DoD could use information obtained from forecasting activities to leverage emerging trends and to create asymmetric advantages for the United States. Disruptive technology forecasting need not be a purely defensive exercise.
A high-level disruptive technology forecasting system could be used by the DoD in the following ways:
To increase the lead time for stakeholders to plan and address potential disruptions;
To provide early indications of emerging and potentially disruptive technologies; and
To provide stakeholders with tools for prioritizing potential threats and allocating resources to increase the ability to capitalize on, protect against, or mitigate the impact of a potential disruption.
Observation. The Department of Defense needs one or more effective systems for forecasting disruptive technologies in order to reduce the surprise that such technologies can create.
An effective forecasting platform could help the DoD better prepare for potential disruptive technologies and enable it to proactively develop strategies for countering the negative effects of disruption. Given the typically long development cycle for counter-disruptive technologies in the United States, and the capability and speed with which some U.S. adversaries can develop new disruptive technologies, particularly those that leverage commercial technologies, the United States cannot afford to rely on a reactive strategy. It must proactively prepare for potentially highly disruptive technologies.
Technology and Disruption in the 21st Century
Many forecasters and technologists believe that the risk of global disruption is now greater than ever before. According to Irving Wladawsky-Berger (2008) of the IBM Academy of Technology, in a presentation to the committee:
There are a set of forces converging on organizations today—both business forces and technical possibilities—that are driving different choices about business designs and the underlying computing infrastructures. Those forces aren’t new. But in a networked world that’s always on, you feel these pressures more acutely and in real time. Because of the global marketplace and the Net, every institution has far greater contact with the world—access to more markets and information, exposure to more threats, and a rapid fire competitive environment. Those companies that lead their industries are the ones best able to adapt and build the right partnerships at this intersection of business and technology.
Experts in other disciplines agree. Ray Kurzweil, whom Bill Gates called “the best person I know at predicting the future of artificial intelligence,” proposes in The Singularity Is Near: When Humans Transcend Biology that the pace of change of technology “is exponential (that is, it expands by repeatedly multiplying by a constant) rather than linear (that is, expanding by repeatedly adding a constant)” (Hodgkinson, 2009; Kurzweil, 2005). See Figure 1-1. He postulates that at some point a “technological singularity” will occur where the level of human technology will become infinite or extremely high and artificially enhanced human/computer intelligence will synergistically transcend human biological intelligence. His description of the disruptive power of this technological synergy is very apt for disruptive technologies in general: “Exponential growth is deceptive. It starts out almost imperceptibly and then explodes with unexpected fury—unexpected, that is, if one does not take care to follow its trajectory” (Kurzweil, 2005, p. 8). Kurzweil’s vision for the future is valuable not so much for its accuracy as for its insight into the disruptive potential of technological confluences. While the principle of exponential growth does not have universal validity, it does have important application in technology forecasting.
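Kurzweil's distinction between the two growth regimes can be written compactly (the symbols here are generic, not taken from his book): a linear trend adds a constant c at each step, while an exponential trend multiplies by a constant r at each step.

```latex
\underbrace{x_{n+1} = x_n + c \;\Rightarrow\; x_n = x_0 + nc}_{\text{linear}}
\qquad
\underbrace{x_{n+1} = r\,x_n \;\Rightarrow\; x_n = x_0\,r^{\,n}}_{\text{exponential},\; r>1}
```

On a logarithmic axis the exponential trend appears as a straight line, which is how such trends are typically displayed in plots like Figure 1-1.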
Each specific technological area (mechanical computation, vacuum tubes, discrete transistors, integrated circuits) follows its own S curve: slow initial growth, followed by exponential growth, and then slowing and diminishing growth in capability. Kurzweil's point is that as a technology reaches maturity and its capabilities fall off the exponential growth curve, a new replacement technology emerges and continues the exponential growth. These exponential curves apply to many human-made technologies over hundreds (if not thousands) of years. The committee believes that examples include our capacity to record and store information (from cave wall, to written word, to the printing press, to computer storage, to the cloud). The same can be said of our capacity to compute, generate energy, increase the lethality of weapons, or produce food, to name a few examples.
The exponential advance of information technology (IT) is the result of each generation of IT going through the standard logistic performance curve and being replaced by a new generation as the older one becomes obsolete. Each new generation offers an order-of-magnitude improvement in performance over the generation that it replaces. In Figure 1-1, the increasing performance of computation is actually the envelope of the logistic curves for the successive underlying technologies over time. The transition from one logistic curve to the next is fertile ground for the emergence of new disruptive technologies.
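This picture can be stated formally with a generic logistic model (the parameters are illustrative, not drawn from the report): each technology generation i follows a logistic curve with its own ceiling K_i, growth rate r_i, and midpoint t_i, and the observed capability at time t is the envelope of the family.

```latex
P_i(t) = \frac{K_i}{1 + e^{-r_i\,(t - t_i)}}, \qquad P(t) = \max_i \, P_i(t)
```

When each new generation arrives before its predecessor saturates, the envelope P(t) can itself track an exponential for long stretches, which is precisely the behavior attributed to the succession of curves above.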
A system of forecasting of disruptive events needs to be developed to meet new realities that include exponential technological growth, globalism, commercialization, the rapid diffusion of technical knowledge, the viral application of technology, the proliferation of asymmetric and disruptive strategies, and changing global competitive forces. As a report by the National Intelligence Council (NIC) observed: “We see globalization—growing interconnectedness reflected in the expanded flows of information, technology, capital, goods, services, and people throughout the world—as an overarching ‘mega-trend,’ a force so ubiquitous that it will substantially shape all the other major trends in the world of 2020” (NIC, 2004, pp. 6-7).
Traditionally, U.S. military strength has been built on a foundation of global technological superiority owing to strong technical leadership and heavy investment by the government in large platforms. As stated in an earlier NRC report, Avoiding Surprise in an Era of Global Technology Advances, “These sophisticated platforms now require investments of tens of billions of dollars spread over decades—investment levels that few foes can match.
However, the defined lifespan of the advanced technology in these platforms can now be less than the development cycle” (NRC, 2005, p. 46). In the Cold War world of head-to-head, platform-to-platform conflict, it was possible for groups of experts to agree on a list of emerging technologies to watch. Given the experts’ knowledge of the adversary, the characteristic composition of these groups of experts—by gender (male), culture (Western, English-speaking), race (white), generation (older), organization (governmental), region (Northern Hemisphere), and sector (military), among others—may not have been a significant limitation. But the power of technological innovation no longer resides exclusively with the traditional military and economic superpowers, and developments can come from virtually anywhere.
While U.S. technological advances in areas such as stealth technologies and satellite imagery once afforded multi-decade military advantage, the rapid pace of technological innovation driven by the global commercial marketplace is shifting the advantage to those who rapidly adopt, exploit, and integrate evolving technologies. While defense-specific investments will continue to spawn important technological advances, U.S. technological superiority is no longer assured (NRC, 2005, p. 13).
The globalization of knowledge exchange and of commerce has lowered barriers and spread the distribution of technological innovation centers throughout the world.
No single nation—or company—can expect to innovate in isolation. That’s because the global adoption of the internet, as well as advanced pervasive technologies, have stripped away the traditional barriers to innovation—such as proximity of natural resources, geographic constraints, and access to both information and insight. (Wladawsky-Berger, 2008)
The recent global recession has brought into stark relief the interdependencies of global economies. Multinational corporations can take advantage of Japanese strengths in nanotechnology, optics, and electronic devices, or be consumers of top-notch high-technology manufacturing and design in photonics in China (discussed in detail in Chapter 5 of NRC, 2005). In such an interdependent world, the definitions of ally and foe can be very fluid and flexible. In the future, the successful leaders in the global marketplace will be highly adaptive, rapidly responsive to world trends, and adept at leveraging strengths through strategic partnering.
Global commercial forces will play a major role in guiding the development of most future technologies (e.g., information technology, biotechnology, microtechnology, nanotechnology, materials, and energy). As technological research and development become increasingly driven by market opportunities and market demand, technology developed specifically to meet a military need may become a niche market. This also means that military applications and uses of commercial technology will be increasingly innovative and important. A forecasting system for disruptive technologies must incorporate global market and investment trends in addition to tracking government-funded programs.
One important result of the recent growth in globalism and commercialism is that important indicators of potentially disruptive technological developments are likely to appear in open sources—on commercial Internet sites, at industrial fairs, at trade associations, among special-interest groups, on Web blogs, and on intranets of key corporations. Many of these technologies and applications will be developed and exploited by small or obscure start-up companies or in the research efforts of university students or postdoctoral researchers. The committee is concerned that in the future many of these technologies will be developed outside the United States. It was observed by several committee members that many technologies that become disruptive start off as outliers—innovations that are by definition outside the mainstream consciousness.
KEY REQUIREMENTS FOR SYSTEM MODELS
This section outlines some of the most important characteristics and design elements of a disruptive technology forecasting system identified in the committee’s first report (NRC, 2010) and during the workshop discussions. These characteristics were chosen by the committee for inclusion in the first report in part because they are believed to optimize the use of available technology to address the challenges mentioned earlier in this chapter.
To accommodate the changing global landscape, the committee believes that a good disruptive technology forecasting system needs to be open to the public and persistent. Chapter 5 of the committee’s first report identified persistence as one of the most important principles in a system for forecasting disruptive events (NRC, 2010).
Traditional forecasting systems provide technological snapshots that can quickly become outdated in the face of today’s realities. In many cases, these one-time predictions fail to develop technology roadmaps or to include a strategy for tracking and incorporating new signals that emerge after the prediction’s creation. Many technology forecasts fail to explore potential secondary effects or to consider the possibilities for enabling new disruptive applications that might arise from integrating multiple technology disciplines. To capture an evolving technology trend, a forecasting system must continuously scan, sample, question, and imagine from multiple sources of information; this is what makes it “persistent” (NRC, 2010). A key goal of a persistent system is to continuously improve forecasts on the basis of new data, signals, and participant input. A persistent system can look at trends and recalculate trajectories as data change or as new events appear on the horizon, generating new hypotheses.

If the queries posed to participants, the queries from participants to the organizers, the hypotheses proposed, and the scenarios developed are all saved and tracked, then the data behind these potential answers are not lost even when the answers themselves are discarded. For example, information developed later might create a situation that gives new meaning to scenarios previously considered impossible. It is difficult to ask all the right questions at the right time (Jonas, 2008). By storing each query, a previous query can be matched against a current one, which could be used in signal matching. For example, someone studying automotive design might be watching the falling cost of storing a unit of energy, believing that once the price drops below a certain threshold, the electric motor becomes viable as a principal form of vehicle propulsion. Another person might be studying the same measure of interest for its implications for technologies interfacing with the electric power grid. The queries themselves could be an important signal.
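The query-matching idea can be sketched in a few lines. This is a hypothetical illustration, not part of any system proposed in this report; the class name, the term-overlap measure, and the threshold of two shared terms are all assumptions made for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class QueryStore:
    """Persistent store of every query posed; matches across communities."""
    queries: list = field(default_factory=list)  # (community, set of terms)

    def add(self, community, text):
        terms = set(text.lower().split())
        # Past queries from *other* communities sharing >= 2 terms are
        # surfaced as a potential signal of convergent interest.
        matches = [
            (c, t) for c, t in self.queries
            if c != community and len(terms & t) >= 2
        ]
        self.queries.append((community, terms))
        return matches

store = QueryStore()
store.add("automotive", "cost per kWh of battery energy storage")
hits = store.add("power-grid", "falling cost of battery energy storage")
```

Here the overlap between two communities' queries about the same cost measure is itself surfaced as a signal, before either community has produced a forecast.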
As it generates interest among different communities, a persistent forecasting system can be built to serve many different customers, providing a continuously active, dynamic, and responsive platform. Although the sponsors are primarily concerned with national defense, disruptive technology forecasting is also useful in many types of business applications.
The outcomes of a persistent system differ from traditional methods of forecasting in the following ways:
The forecasts of a persistent system are current, based on the best knowledge available at any given time.
Data archives can be used as a repository for other forecasts. Currently, there is no single place within the DoD, much less the U.S. government, where technology forecasts from multiple agencies can be seen and referenced.
A persistent data-gathering platform can be incorporated into multiple programs and applications.
Rather than having a forecasting cycle, a persistent system allows forecasts to be generated or updated continuously as new signals appear.
As new technologies, communities, or applications evolve, they can be integrated into the system and applied to existing work.
As the historic base of data grows, this archive becomes a valuable resource with which to test and refine new forecasting tools and methods using backcasting (see the Glossary).
Openness and Crowdsourcing
In its first report, the committee focused on openness as an important attribute of a future forecasting system to allow the broadest possible collection of ideas, signals, and interpretations and to reduce bias (particularly Western bias) through diverse participation. Attendees at the workshop echoed the belief that input from a broad range of
participants is the key to tapping into far-flung signals indicating technology change. Crowdsourcing is “the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call” (Whitford, 2008); the term was introduced by Howe (2006). A forecasting system that incorporates crowdsourcing uses open participation to develop a wide range of scenarios and possible future narratives. Ideas can be drawn directly from the crowd using Web-based models that include participatory special-interest-community Web sites, predictive market tools, and interactive gaming. Crowdsourcing can also be used indirectly: sophisticated Web-based search algorithms can extract data of interest from such sources as blogs, professional association Web sites, competitions, and published literature.
Classical approaches and sources for forecasting, such as brainstorming by experts, market surveys, searches of published papers, and classic data collection can be combined with new enabling technologies and applications such as searching tweets for emerging disruptive ideas in the news. Disparate inputs inspire ideas and raise questions that feed back into iterative loops where the data are used to refine the information. For example, more interesting scenarios or narratives can be selected to be developed further and put back out to the larger crowd community in the form of an alternative reality game (ARG) to see how it plays out. Data from persistent Web crawling may also indicate a growing area of interest—a community that is attracting resources or an idea that is cropping up in interesting places—that should be explored further.
Some may consider the idea of an open forecasting system that includes crowdsourcing to be fairly radical. The committee’s rationale was that enabling real forecasting would require a very broad range of inputs from a wide variety of global populations, and that crowdsourcing using Web-based technologies is an effective means of accessing those populations. Research has shown that amateurs are capable of outperforming experts in forecasting (Önkal et al., 2003; NRC, 2010). The committee believed strongly that adding crowdsourcing techniques would broaden and complement the viewpoints and data that any collection and analysis team could access and prioritize. This could significantly improve the chances of discovering disruptive indicators before the disruptive technology emerges. The improvement would come not only from the complementary data found in open sources; crowdsourcing techniques may also prioritize technologies, applications, and hypotheses differently than traditional technology forecasting does.
Workshop participants expressed diverse views on the superiority of crowd analysis over expert analysis. Many of the participants recognized that crowdsourcing presents unique challenges not present in other forecasting approaches. For example, it can be noisy, uneven in quality, and self-reinforcing (crowd-generated ideas have included zero-point energy, human time travel, and faster-than-light travel). Nevertheless, expert analysis remains subject to errors of its own, such as mirroring, blind spots, and failures of imagination.
Proponents of crowdsourcing and of expert judgment at the workshop accepted the idea that both forms of analysis have a role in the forecasting process. Experts can analyze the plausibility of data feeds leading to particular scenarios or select high-impact scenarios for further exploration to discover possible paths that enable them. A key role for experts is to backcast: to begin with a projected future scenario and explore potential paths that could lead from the present to that future. The experts can either develop their own model or evaluate and validate potential paths suggested by the crowd. Fully developing a backcast model would entail laying out potentially important signposts, tipping points, thresholds, or measures of interest that could be persistently tracked for a convergence of effects. The implications and impacts of a particular scenario might not be immediately obvious, so both crowd and expert insight might be valuable. For example, if one only watches areas of interest to technology experts, minimalistic applications and technologies such as IEDs might be missed. The crowd, however, might not be able to roadmap the development of stealth technologies with the same level of precision as that offered by experts. Thus, the committee proposes that input from both experts and crowds be used to gain the strengths of both for better forecasts of disruptive technologies.
The stochastic model of creativity pioneered by D.K. Simonton (2003) states that the level of creative output can be traced directly to the breadth of the domain from which an innovator draws knowledge and the frequency with which creative efforts are attempted. Therefore, “a researcher who spends 40 hours per week engaged in theory development is more likely to develop an innovative idea than a researcher who spends only 5 hours per week on the same task … for creativity to flourish, both aspects of the creative process must be emphasized” (Fehr, 2009, p. 345).
Creativity and Tolerance for Failure
A forecasting system for disruptive technologies must be open to creative ideas and have tolerance for failure. By nature, a disruptive technology will be difficult to predict, and so most predictions will not be realized. Rather than predicting the occurrence of a specific innovation, a worthwhile forecast predicts the problems that will be solved with technology and the effects of different possible solutions. Therefore, it is important to develop a roadmap of many potential futures of concern and to collect signals to help track the development of key technologies that might enable a specific future.
In the opinion of the committee, the technologies that are the most disruptive are most likely to emerge from the unexpected application of both new and existing technologies. In many cases, it is the unexpected application of existing and well-understood technologies that can cause the greatest surprise and disruption. In these cases, it is the new application of technology rather than the novelty of the technologies involved that causes the disruption. Uncovering unexpected connections between existing technologies requires a tolerance for ideas that seem “crazy,” and it requires creative people to contribute seemingly unrelated data that may become a significant marker when combined with other data to form a pattern. While the interpretation of the significance of the patterns can be wrong, the knowledge of such patterns is important and can be refined as new signals, data, or events emerge. The creativity necessary to envision less obvious connections is also needed to place forecasts in believable contexts. However unlikely an interpretation, it must be included in a compelling narrative of a future scenario. As in the case of Cassandra, the Greek mythological character who was able to see the future but was cursed with the inability to convince others of the truth of her predictions, accurate forecasts are useless if they cannot inspire action before a disruption emerges.
Predictions Versus Roadmaps
An effective forecasting system does not necessarily predict the future accurately. In his foreword to The Knowledge Base of Futures Studies (Slaughter, 1996), James Dator proposed two hypotheses and postulates for forecasting the future, known as Dator’s law. The first hypothesis is that the future cannot be studied, because the future does not exist. He then postulates that the future cannot be predicted, but alternative futures can be forecast and preferred futures can be “envisioned, invented, implemented, continuously evaluated, revised, and re-envisioned” (Slaughter, 1996, p. xx). The second hypothesis is that any useful idea about the future should appear to be ridiculous from today’s point of view. A related postulate is that a future considered to be the most likely is probably one of the least likely futures. Therefore, “decision-makers, and the general public, if they wish useful information about the future, should expect it to be unconventional and even shocking, offensive, and seemingly ridiculous…. futurists have the additional burden of making the initially ridiculous idea plausible and feasible by marshaling appropriate evidence and weaving alternative scenarios on possible developments” (Slaughter, 1996, p. xx).
Traditionally, forecasting methodologies have focused on identifying technologies, but Dator’s law emphasizes the process of envisioning futures. The future will reflect the impacts of technologies on society, and it is these impacts, rather than the technologies that cause them, that are the essential and convincing elements in a narrative of a potential future.
A focus on technologies can also fail to incorporate other factors relevant to disruption. Some forces that enable or encourage disruptive events are not technological. For example, when Twitter was used to mobilize protests of the results of the 2009 Iranian presidential election, the power of this relatively simple new technology to effect political change by leveraging global social networks was made apparent to the world. Very simple and old technologies can also have dramatic impacts when applied to uses other than those originally intended.
A list of emerging technologies provides little basis for prioritizing their disruptive potential or for allocating resources to the most threatening scenarios, especially if those scenarios are seen as low-probability outcomes. A set of potential future roadmaps provides the insight needed to prepare for the unexpected. Like building an opening book for a chess program, a forecasting system can use roadmapping to map signals against important signposts and so help identify the potential emergence of an alternative future (see Box 1-2 for a more detailed example).
Learning from Blackjack and Chess
A good forecast should change the user's odds of success from completely random to slightly better than random. Think of it as card counting in blackjack: it does not guarantee that at any moment in time one is going to have a winning hand, but over the long term of playing the game out, one beats the house odds by shifting them just a little. And most importantly—and a lot of forecasters forget this—at the end of all the forecasts, what do we look for to see whether or not a forecast is coming true? What are the signals, what are the signposts, what are the thresholds, what are the tipping points that we should be out there listening and monitoring for to say “Oh! It’s happening”? So think of it as a chess game. You’re sitting there and you’re playing a grand master, and the grand master looks at the chessboard and in about 10 seconds says, “Oh, I see a pattern here. It just kind of looks like that game. I know my next eight moves.” A novice looks at the board and says, “I don’t know what to do next.” So an early warning system is having what can be called that opening book in a chess program. Now how can we fill that opening book, and increase that pattern recognition that allows somebody to say, “Hey, this might be coming true, that may not be coming true”?
SOURCE: Committee Chair Gilman G. Louie, adapted from the unedited workshop transcript in Appendix D, provided on the CD included in the back inside cover of this report.
A well-designed forecasting and warning system provides forecasters with the equivalent of an opening book. In certain cases, disruptive applications can be subtly hidden in second- and third-order effects. Like chess grand masters, experts are important to a forecasting process because of their ability to identify subtle patterns.
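The opening-book idea of mapping signals against signposts can be sketched in a few lines of code. This is a minimal illustration, not a proposed implementation; the scenario names and signposts below are entirely invented for the example.

```python
# Hypothetical sketch: incoming signals are matched against the signposts of
# several candidate roadmaps, the forecasting analogue of a chess opening book.
ROADMAPS = {  # all scenario and signpost names are invented for illustration
    "cheap-genome-editing": {"desktop sequencer under $1k", "open-source CRISPR kits"},
    "ubiquitous-drone-swarms": {"sub-$100 autonomy chip", "swarm-control standard"},
}

def score_scenarios(observed: set) -> dict:
    """Return the fraction of each scenario's signposts already observed."""
    return {name: len(signposts & observed) / len(signposts)
            for name, signposts in ROADMAPS.items()}

observed_signals = {"open-source CRISPR kits", "desktop sequencer under $1k"}
print(score_scenarios(observed_signals))
```

A scenario whose score approaches 1.0 is "coming true" in the sense of the box above and warrants closer monitoring; a real system would of course weight signposts and track them over time.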
A useful forecast will lay out several roadmaps of potential futures, with indicators and parameters that can be used to follow the progression of events and evaluate the likelihood of each scenario’s coming true. It should indicate the technological, societal, economic, or other tipping points or thresholds that might enable a disruptive scenario. The committee believes that it is important to include economic, social, and cultural experts along with scientists and technologists on a disruptive technology forecasting team.
FRAMEWORK FOR MODEL BUILDING
The workshop opened with a description of a framework for thinking about forecasting and a list of desired system attributes developed in the first report. The methodology flow diagram from that report (NRC, 2010, p. 59), reprinted here as Figure 1-2, illustrates some proven forecasting methodologies and provided a helpful example for the workshop participants to use when building their models. As shown in the diagram, forecasters begin by defining who the users are and what the mission is. Once the priorities of the mission are understood, potential sources of interesting information can be identified and pursued. Information can then be collected through active techniques such as data mining, interviewing, and data repository acquisition and licensing, as well as through more passive means such as monitoring and tracking. Generally, these sources include both data input and human input. The data then go through a data hygiene process, which restructures the data, eliminates duplication, and “cleans up” the remainder. Forecasters should be cautious about discarding data that seem irrelevant: “bad data” could prove useful in unexpected ways and could teach forecasters how to make better inferences.
Observation. Even data and hypotheses that are considered irrelevant or of poor quality could be useful sometime in the future in a persistent forecasting system. Selecting “good” and “bad” data too soon has the potential to introduce bias and blind spots into a forecast. All data, questions, and hypotheses generated should be preserved.
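The hygiene-and-preservation approach described above can be sketched as a store that normalizes and deduplicates records into a working set while keeping every raw record. This is an illustrative sketch only; the class and record contents are invented for the example.

```python
import hashlib
import re
from dataclasses import dataclass, field

@dataclass
class HygienePass:
    """Sketch of a data hygiene step that never discards data: raw records
    are preserved verbatim while a normalized, deduplicated working set is
    maintained alongside them (names and records are hypothetical)."""
    raw: list = field(default_factory=list)    # every record, kept verbatim
    clean: dict = field(default_factory=dict)  # deduplicated working set

    def ingest(self, record: str) -> None:
        self.raw.append(record)  # preserve the original, however "bad"
        normalized = re.sub(r"\s+", " ", record).strip().lower()
        key = hashlib.sha256(normalized.encode("utf-8")).hexdigest()
        self.clean.setdefault(key, normalized)  # duplicates collapse to one entry

store = HygienePass()
store.ingest("Quantum  radar prototype reported ")
store.ingest("quantum radar prototype reported")
print(len(store.raw), len(store.clean))  # prints "2 1"
```

Because nothing is deleted, a later change in understanding can trigger reprocessing of the raw store with different hygiene rules, which is what the observation above requires of a persistent system.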
One of the challenges inherent in any forecasting methodology is ensuring that the inputs used to derive the forecast are of the highest quality and drawn from a broad range of sources. This is especially true for persistent systems, which require continuous inputs. Systems that rely on experts can control the quality of input by selecting participants with the desired composition and quality of expertise and by controlling the quality of the questions being asked. Systems that rely on data can ensure quality by understanding the sources of the data and how the data were derived. Unlike other inputs, the quality of crowdsourced data is determined more by the structure of the system than by the quality of the individual participants: quality comes from the breadth and number of participants and from the structural approach used to oversee the system.
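The claim that crowd quality comes from breadth and numbers rather than from any individual participant can be illustrated with a simple simulation: many noisy, individually unreliable estimates average out to a far better aggregate. All numbers below are illustrative.

```python
import random
import statistics

# Illustrative sketch of the wisdom-of-crowds effect: each participant's
# estimate of some true quantity is individually poor, but the aggregate
# error shrinks as the number of participants grows.
random.seed(0)
TRUE_VALUE = 10.0

estimates = [TRUE_VALUE + random.gauss(0, 5.0) for _ in range(1000)]
typical_individual_error = statistics.mean(abs(e - TRUE_VALUE) for e in estimates)
crowd_error = abs(statistics.mean(estimates) - TRUE_VALUE)
print(round(typical_individual_error, 2), round(crowd_error, 2))
```

The caveat in the text applies: this averaging only helps when the participants' errors are independent, which is why structural oversight against shared bias and malicious coordination matters more than individual expertise.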
After collection, the data are processed through a number of possible mechanisms. The result of processing should be a portfolio of possible, but not necessarily probable, futures against which resources can be allocated for future tracking and reprocessing.
This methodology uses a traditional linear forecasting approach, which may be good for a one-time forecast but may be flawed for a persistent system. New tools should be able to replicate the processes that traditional forecasting systems use while incorporating rapid, continuous feedback to produce persistent, constantly updated forecasts. The committee acknowledges that the proposed model in the first report may not be the most accurate for a persistent system, but it provides a useful framework for the processes included within a single pass of a forecasting loop of a hypothetical persistent system.
INSIGHTS FROM THE WORKSHOP
After hearing briefly about the vision that the committee held for a forecasting system derived from its work on the first report and a brief description of the framework in Figure 1-2, the workshop attendees participated in a large-group discussion and then worked in subgroups to prioritize design elements; define input, output, and intermediate processes; and build models of potential forecasting systems for disruptive technologies incorporating both computer and human elements. The most salient insights and conclusions are described briefly in this section and in more depth in Chapter 3.
Flexibility and Leadership
The model-building exercise performed by the workshop participants illustrated that there are multiple viable approaches to building a high-level persistent forecasting system. Given the multitude of design options available, the committee concluded that it would be more productive to begin building a basic 1.0 system to test and develop than to spend excessive resources on planning a robust and complete system structure at the outset. The design could then evolve to incorporate additional desired elements and to meet user demands. Rapid and cohesive design evolution is unlikely to occur without strong leadership and vision; therefore, the selection of the core leadership team is one of the most important elements for determining the success of the system. Desirable qualities and experience for the leadership team are presented in greater detail in the descriptions of individual models in Chapter 2.
Another major concept reinforced by the workshop is the importance of a focus on developing narratives of the human use and impact of technology instead of a focus on specific technologies. It is the context of solving human needs that drives technology use. The system should therefore start by identifying big problems and big opportunities and using them to generate alternative scenarios and hypotheses from which relevant technologies can be derived. There are often multiple solution paths to solve a single problem, but the immediate and second-order effects of the different solutions might be very similar. Scenarios will also vary by region, as they will have distinct impacts and solutions in different locales. By emphasizing the narrative, forecasters avoid devoting too many resources to tracking “the wrong” solutions and technologies, stay focused on potential outcomes, and create compelling arguments that can be later presented to stakeholders.
A number of workshop attendees commented that the forecasting system should be built and funded according to a start-up model, with the government providing seed-level financing to build a working 1.0-level version of the system. Beginning the project with minimal resources forces the development team to make tough decisions up front and to focus the effort on developing and perfecting core system features. The leaders in charge of the system should be forced to seek additional outside funding sources, ensuring that the system is robust enough in the early stages to inspire confidence and attract sponsorship.
A number of risks could compromise a forecasting system. The committee believes that the following risks need to be considered and mitigated:
Technology and engineering risk: Some of the proposed systems depend on technologies such as databases, search engines, data mining, classifiers, data visualization tools, and Web tools. Although all of these individual technologies exist, developing a robust and reliable system that integrates these various technologies can be challenging even to an experienced technology team.
Data risk: The quality of the output is highly dependent on the quality of the input, so great care must be given to the selection of data sources. Operators must make sure that bias is either balanced or minimized in the data collection and analytical phases of the various approaches. Data need to be collected from a diverse set of sources so that they are representative of the range of technologies and impacts being forecast, and expert sources should be qualified.
Security vulnerability risk: The system must be secured from various forms of malicious attacks, including spoofing, purposeful bias creation, data corruption, unauthorized modification of the data, planting of information, and denial-of-service attacks. The system, especially the portions that are crowdsourced, must be architected with appropriate security, monitoring, and analytics.
Leadership and personnel risk: Start-up activities are inherently difficult. The quality of the leadership within a founding team has a major effect on the success or failure of most start-up activities. The vision, perseverance of the team, determination of purpose, and ability to execute are driven by the quality of leadership. Finding the right individual(s) to head up such an effort is critical to the success of the project.
Disruptive idea risk: As with many forecasts of disruptive technology, there may be a natural tendency to reject wild or challenging ideas as either impossible or highly improbable based on the knowledge, trends, and understanding at the time of the forecast. It is important that every forecast be stored for future consideration in case of a change in understanding or fact.
User risk: There are multiple user risks: (1) The system could suffer from lack of adequate participation by experts and/or the crowd. (2) Malicious users might corrupt the system. (3) Users might disclose copyrighted, proprietary, confidential, or classified materials. (4) There could be a failure to have enough diversity in the user community, resulting in a biased forecast.
Financial risk: Building a persistent forecasting system requires adequate and persistent funding. Diverse funding sources would reduce the risk of failure due to inadequate financial support.
Stakeholder risk: The final risk is that the forecasts produced by the system go unused or are inappropriately rejected by stakeholders and decision makers. Several factors can cause this:
An unactionable forecast: A forecast that is overly general, fails to assess potential impact, or otherwise provides inadequate insight into potential futures cannot support decisions.
An unbelievable forecast: It is difficult for stakeholders to accept a forecast that seems unbelievable or improbable because it challenges current beliefs. The more improbable the forecast, the greater the effort that must go into explaining how the alternative future could arise from the present. Roadmapping and narratives are important tools for mitigating this risk.
Inappropriate use of a forecast: Forecasters need to guard against the use of selective portions of a forecast to support a conclusion different from what the forecast intended. The forecast team should, whenever possible, review how stakeholders and decision makers are using the forecast and its conclusions.
Fehr, Ryan. 2009. Why innovation demands aren’t as conflicted as they seem: Stochasticism and the creative process. Industrial and Organizational Psychology 2: 344-348.
Hodgkinson, Mike. 2009. By 2040 you will be able to upload your brain … The Independent. September 27. Available at http://www.independent.co.uk/news/science/by-2040-you-will-be-able-to-upload-your-brain-1792555.html. Last accessed May 26, 2010.
Howe, Jeff. 2006. The rise of crowdsourcing. Wired. June 14. Available at http://www.wired.com/wired/archive/14.06/crowds.html. Last accessed May 27, 2010.
Jonas, Jeff. 2008. Macro trends and their national security consequences. Presented to the Committee on Forecasting Future Disruptive Technologies, May 29.
Kay, Alan. 1989. The best way to predict the future is to invent it. Stanford Engineering 1 (1, Autumn): 1-6.
Kurzweil, Ray. 2005. The Singularity Is Near: When Humans Transcend Biology. London: Viking Penguin.
NIC (National Intelligence Council). 2004. Mapping the Global Future: Report of the National Intelligence Council’s 2020 Project. Pittsburgh, Pa.: Government Printing Office.
NRC (National Research Council). 2005. Avoiding Surprise in an Era of Global Technology Advances. Washington, D.C.: The National Academies Press.
NRC. 2010. Persistent Forecasting of Disruptive Technologies. Washington, D.C.: The National Academies Press.
Önkal, D., J.F. Yates, C. Simga-Mugan, and S. Öztin. 2003. Professional vs. amateur judgment accuracy: The case of foreign exchange rates. Organizational Behavior and Human Decision Processes 91: 169-185.
Shaffer, Allan. 2005. Disruptive Technology: An Uncertain Future. Available at http://www.docstoc.com/docs/889200/Disruptive-Technology-An-Uncertain-Future. Last accessed May 6, 2009.
Simonton, D.K. 2003. Scientific creativity as constrained stochastic behavior: The integration of product, process, and person perspectives. Psychological Bulletin 129: 475-494.
Slaughter, Richard A. (ed.). 1996. The Knowledge Base of Futures Studies. Volume 1: Foundations. 3 vols. Hawthorn, Australia: DDM Media Group.
Whitford, David. 2008. Hired guns on the cheap: New online services can help you find freelancers for less. Fortune Small Business Magazine. Available at http://money.cnn.com/magazines/fsb/fsb_archive/2007/03/01/8402019/index.htm. Last accessed January 29, 2010.
Wladawsky-Berger, Irving. 2008. Complex digital systems in the knowledge economy: Some key grand challenges. Presented to the Committee on Forecasting Future Disruptive Technologies, May 28.