Formalized technology forecasting dates back to the years immediately following World War II, when the RAND Corporation developed the Delphi method, a structured process for eliciting collective expert opinions about technology trends and their impact (Dalkey, 1967). Gaming and scenario planning also emerged as important technology forecasting methods in the 1950s and experienced a dramatic increase in popularity during the 1970s. All of these methods, as well as other quantitatively oriented methods such as extrapolation and trend analysis, are in use today. All forecasting methods depend to some degree on the inspection of historical data. However, an exclusive reliance on historical data inevitably leads to an overemphasis on evolutionary views of innovation and leaves the user vulnerable to surprise from rapid or nonlinear developments.
Technology forecasts are widely used by governments, corporations, financial institutions, and the investment community. A useful forecast provides insights on potential future outcomes that lead to effective action in the present. A forecast of disruptive technologies is designed to reduce surprise by alerting decision makers and providing them with the tools needed to avoid unanticipated and perhaps catastrophic outcomes. It should supply decision makers with a range of possible alternative futures to assist them in allocating resources and making informed decisions.
One way that a forecast can support decision making is by providing a technological roadmap that can be used for tracking and planning and that alerts users to significant changes in the likelihood of a predicted scenario. A useful forecast must provide insight into the possible, not just the probable. Likewise, forecasts should be evaluated on their ability to capture high-impact, disruptive outcomes rather than on the ratio of correct-to-incorrect predictions that they make.
This committee’s first report studied the value of using forecasting methods that solicit input from the general public (NRC, 2010). The goal of soliciting public participation, or crowdsourcing, in a forecasting system is to cast a wide net that gathers a multitude of forecasts, signals, and opinions. This is especially important as technology innovation becomes more diverse and geographically diffuse in its approaches and as regional variations of technology applications flourish. Collaboration technologies, especially those that leverage the power of the Internet, can be used to discover expertise in unexpected places.
Experts are typically better than novices at judging the importance of new signals in an existing forecasting system (Enis, 1995). In the technology forecasting platforms examined in the committee's first report (X2, Tech-Cast, and Deltascan), the committee found that experts generally create high-signal, low-noise forecasts. However, other research (Önkal et al., 2003) suggests that experts are not necessarily better than the public at making forecasts.
Experts may not catch the full range of alternative solutions from fields outside their areas of expertise or from the reapplication of technologies developed to solve a different problem. Paradoxically, the specificity of knowledge required to achieve expert status can invalidate forecasts generated by experts alone (Johnston, 2003).
The term “disruptive technology” describes a technology that results in a sudden change affecting already-established technologies or markets (Bower and Christensen, 1995). Disruptive technologies can be defined beyond Christensen’s market-based conception as technologies and applications of technologies that can significantly influence the balance of global power. Disruptive technologies cause one or more discontinuities in the normal evolutionary life cycle of technology. This may lead to an unexpected destabilization of an older technology order and an opportunity for new competitors to displace incumbents. Frequently cited examples include digital photography and desktop publishing, as well as older innovations such as the automobile and the telephone.
Other disruptions can be caused by “reverse innovations” that can bring well-established technologies to markets and societies that previously did not have access to these technologies or could not afford them (Govindarajan, 2009). These innovations could be the result of breakthroughs in pricing, accessibility, distribution, business models, manufacturing, research and development (R&D), resource use, or ease of use. Many of these innovations are built around what has been labeled the Gandhian engineering concept of more (social value) from less (low technology, resource use, and cost) for more (dissemination) (Giridharadas, 2008). For disruption to take place, many of these innovations rely not just on low cost and affordability, but also on distribution to developing countries. Emerging markets can be sources of disruptive innovations (Bhan, 2010). Tata’s Nano, the One Laptop Per Child computer, and India’s AirTel are notable examples.
Disruptive technologies can impact society both positively and negatively. The nature of such impacts is greatly dependent on an individual’s point of view—a disruption that is harmful to some will benefit others. Given the ability of disruptive technologies to dramatically alter a competitive environment, displace incumbents, and impact society, there is a great need for technology forecasts (1) to help identify potentially disruptive technologies and (2) to contribute to the understanding of their potential disruptive effects. These two forecasting outputs are fundamental to producing a useful forecast.
This report is the second of two reports produced under the auspices of the National Research Council’s (NRC’s) Committee on Forecasting Future Disruptive Technologies, sponsored by the Office of the Director of Defense Research and Engineering (DDR&E) and the Defense Warning Office (DWO) of the Defense Intelligence Agency (DIA). This committee was established at the request of the sponsoring organizations to provide guidance on how to conduct long-term forecasting of disruptive technologies. The statement of task for the study is provided in Box S-1.
In its first report, Persistent Forecasting of Disruptive Technologies, the committee discussed how technology forecasts were historically made, assessed various existing forecasting systems, and identified desirable attributes of a next-generation persistent long-term forecasting system for disruptive technologies (NRC, 2010). In this, the second report, the committee was asked to attempt to sketch out high-level forecasting system designs that could satisfy the key design criteria of the forecasting system concept developed in the first report. The sponsor also sought further evaluation of the system attributes defined in the first report, and evidence of the feasibility of creating a system with those attributes. Together, the reports are intended to help the Department of Defense (DoD) and the intelligence community (IC) identify and develop a forecasting system that will assist in detecting and tracking global technology trends, producing persistent long-term forecasts of disruptive technologies, and characterizing their potential impact on future U.S. warfighting and homeland defense capabilities.
The committee identified three broadly defined goals for addressing its statement of task: to develop further the structural framework for how to approach the problem of developing a long-term persistent forecast of disruptive technologies, to create alternative models of what such a system might look like, and to define actionable steps toward development. To meet these goals, the committee held a one-day workshop with invited experts from related fields (see Appendix C for a list of participants), followed by a one-day closed meeting to analyze the
Statement of Task
The committee shall conduct a workshop to provide expert insight in designing a persistent forecasting system. The committee will invite expert forecasters and users of forecasting systems.
The workshop will focus on the development of one or more conceptual high-level diagrams of a process that could be used to produce persistent forecasts of disruptive technologies. The final report will include transcripts of the workshop and copies of visualizations created during the workshop. The committee will comment on the insights gained from past committee meetings and the workshop and recommend options for future courses of action in the development of a persistent technology forecasting system.
input from the workshop and previous committee meetings. This report reflects the information received during both phases of the study.
To gain practical information, the committee defined the following objectives for the workshop:
Develop one or more high-level designs of potential approaches to a prototype version of the system.
Gain insights on how to approach the development of the system.
Estimate a gross level of effort to launch such a system.
Document the key insights from the sessions and workshop that could provide guidance for the development of the system.
In the first part of the workshop, several key observations made by presenters and participants helped frame the need for and challenges of a persistent disruptive forecasting system:
War in the future may be very different from war as it is waged today. It may not involve the use of deadly force, or in the words of Committee Chair Gilman G. Louie, “things that go boom!”
New applications of technologies are hard to predict, and the pace at which new applications are being developed and adopted globally is ever faster.
The United States is not the sole creator, keeper, and distributor of high-quality technologies that have disruptive impact.
The world is “lumpy”: technologies impact different people and cultures differently. In different countries, different technology clusters have different priorities.
In many cases, it is more important to understand the impact of a technology than to understand the technology itself.
Many disruptive technologies are the result of new applications, or combinations of developed and well-understood technologies.
It is not just about high-tech. Low-tech innovations can have an even greater disruptive effect than advanced ones do, especially in developing countries.
A disruptive forecast cannot rely solely on expert advice. It is necessary to ask those who are most likely to be directly affected by future disruptive changes.
Technology lists produced by forecasts have limited value. Secondary effects also need to be explored.
Technology forecasts typically provide a snapshot of current thinking and are quickly rendered obsolete by new data and events.
To be of optimum value, a disruptive technology forecasting system must serve needs beyond those of the DoD and must be useful to other entities, including other countries.
Based on these observations and its prior work, and given the speed and quantity of today’s data flows, the committee became convinced that a system for forecasting low-probability, high-impact innovations must have a specific set of characteristics that differentiate it from most past forecasting methodologies: it must be persistent, open, and failure-tolerant and must operate in multilingual domains. Instead of making discrete predictions, it should include roadmaps that track the development of events as they occur.
To detect emerging trends, a successful disruptive technology forecasting system must continuously update and improve forecasts as new data become available. In this persistent system, the historical development of a forecast can be tracked and analyzed, creating valuable insights that can be used to improve later forecasts. This type of dynamic and responsive platform is attractive to many communities of users, giving it a more robust base of data and increasing its overall utility.
The outcomes of a persistent system differ from those of many traditional methods of forecasting in the following ways:
The forecasts of the persistent system remain current, based on the best knowledge available at any given time.
The system’s data archives can be used as a repository for other forecasts. Currently, there is no single place within the DoD, much less the U.S. government, where technology forecasts from multiple agencies can be seen and referenced.
A persistent data-gathering platform can be incorporated into multiple programs and applications.
Rather than having a forecasting cycle, a persistent system allows forecasts to be generated or updated continuously as new signals appear.
As new technologies, communities, or applications evolve, they can be integrated into the system and applied to existing work.
As the historic base of data grows, this archive becomes a valuable resource with which to test and refine new forecasting tools and methods using backcasting.
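The backcasting use of such an archive can be sketched in a few lines of code. This is a minimal illustration only: the signal weights, the threshold rule, and the smartphone example are all hypothetical assumptions, not part of any system described in this report.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Signal:
    observed: date   # when the signal entered the archive
    topic: str
    strength: float  # analyst-assigned weight in [0, 1] (illustrative)

def forecast(signals, topic, threshold=0.8):
    # Naive forecaster: flags a topic once cumulative signal strength
    # crosses a threshold. A real system would use richer models.
    return sum(s.strength for s in signals if s.topic == topic) >= threshold

def backcast(archive, topic, cutoff, outcome):
    # Replay only the signals that were available before `cutoff`,
    # then score the forecaster against the now-known outcome.
    visible = [s for s in archive if s.observed < cutoff]
    return forecast(visible, topic) == outcome

archive = [
    Signal(date(2005, 3, 1), "smartphone", 0.4),
    Signal(date(2006, 6, 1), "smartphone", 0.5),
    Signal(date(2007, 1, 9), "smartphone", 0.6),
]
# Would the method have flagged smartphones before 2007, given that they did emerge?
print(backcast(archive, "smartphone", date(2007, 1, 1), outcome=True))  # True
```

Because the archive preserves when each signal arrived, the same replay can be repeated with different forecasting rules to compare their hit rates on past disruptions.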
Openness and Crowdsourcing
Internet technology spreads knowledge virally, distributing it worldwide to people of all ages and cultural backgrounds. Technological change can come from unconventional and nontraditional sectors. Attendees at the workshop and the committee believe that input from a broad range of participants is key to tapping into far-flung
signals indicating technology change. Crowdsourcing is “the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call” (Whitford, 2008). The term was coined by Howe (2006). A crowdsourced forecasting system harnesses the creative ability and diversity of different global populations to develop the widest possible range of scenarios and potential future narratives. Ideas can be drawn directly from the crowd using established Web-based models such as participatory special-interest-community Web sites, predictive market tools, and interactive gaming. Crowdsourcing can also be used indirectly: sophisticated Web-based search algorithms can extract data of interest from such sources as blogs, professional association Web sites, competitions, or published literature.
Classical approaches and sources for forecasting, such as brainstorming by experts, market surveys, searches of published papers, and classic data collection can be combined with data obtained through crowdsourcing to create a richer base of knowledge. This could significantly improve the chances of discovering disruptive indicators before the disruptive technology emerges. Although forecasts that involve classified, proprietary, or private data cannot be crowdsourced and made public, the analysis of sensitive data can be run parallel to the process that uses crowdsourced data.
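At its simplest, combining crowdsourced and classical sources means pooling heterogeneous text streams and scanning them against a common watch list. The sketch below assumes a hypothetical watch list and toy documents; it stands in for the far more sophisticated extraction algorithms the text describes.

```python
import re
from collections import Counter

# Hypothetical watch list of technology terms (an assumption for illustration).
WATCH_TERMS = {"graphene", "crispr", "quantum"}

def extract_signals(documents):
    # Count watch-list term mentions across a mixed corpus of
    # crowdsourced posts and traditionally collected literature.
    counts = Counter()
    for doc in documents:
        for token in re.findall(r"[a-z]+", doc.lower()):
            if token in WATCH_TERMS:
                counts[token] += 1
    return counts

crowd = ["Saw a CRISPR hobby kit at a maker fair", "graphene batteries hype?"]
classical = ["Journal abstract: graphene supercapacitor electrodes"]
print(extract_signals(crowd + classical))  # Counter({'graphene': 2, 'crispr': 1})
```

The same counting interface can run separately over sensitive corpora, which keeps the parallel classified analysis structurally identical to the open one.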
Creativity and Tolerance for Failure
The unexpected application of existing and well-understood technologies can, in many cases, cause the greatest surprise and disruption. Often, surprise is not caused by a single new technology but by the application of a new technology in conjunction with an existing technology, or by a novel application of an old technology. Uncovering the connections between new technologies, old technologies, and current human needs requires a willingness to explore “crazy” ideas. Creative minds are needed to tease out useful information and find patterns among disparate sets of data. In a persistent system, the meaning of these patterns can be reinterpreted, in the light of earlier work, as new signals emerge. During this process, scenarios that initially seemed highly unlikely might emerge as relevant.
A forecasting system for disruptive technologies must tolerate failed forecasts. By nature, a disruptive technology will be difficult to predict, and so most predictions will not be realized. Rather than predicting the occurrence of a specific innovation, a good forecast predicts the problems that will be solved with technology and the effects of different possible solutions. Given the multitude of possible technological solutions to a single problem, predicting effects is an effective way to limit surprise and to measure the success of a forecast.
The integrity of a forecasting system should be protected through multiple stages of development and operation. Specific areas of risk that should be addressed include technology and engineering risk, data risk, security vulnerability risk, leadership and personnel risk, disruptive idea risk, user risk, financial risk, and stakeholder risk. Each of these risks should be considered and mitigated during the design and implementation of a persistent forecasting system, as discussed in Chapter 1.
Predictions Versus Roadmaps
A list of emerging technologies provides little basis for prioritizing their disruptive potential impact or allocating resources to the most threatening scenarios, especially if those scenarios are seen as low-probability outcomes. A set of potential future roadmaps provides the necessary insight to enable better preparation for the unexpected. A useful forecast will lay out several roadmaps of potential futures, with indicators and parameters that can be used to follow the progression of events and evaluate the likelihood of each scenario’s coming true. It should include the cultural, social, and environmental signals and signposts that might indicate a disruptive scenario.
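A roadmap of this kind can be represented as a scenario with weighted signposts whose observation raises a running likelihood estimate. The data structure below is a minimal sketch under stated assumptions: the scenario, weights, and additive update rule are all illustrative, and a real system would use a calibrated probabilistic model.

```python
from dataclasses import dataclass, field

@dataclass
class Signpost:
    description: str
    weight: float       # how strongly this signpost supports the scenario
    observed: bool = False

@dataclass
class Roadmap:
    scenario: str
    prior: float                          # initial likelihood estimate
    signposts: list = field(default_factory=list)

    def likelihood(self):
        # Crude running estimate: prior plus the weight of each
        # observed signpost, capped at 1.0.
        boost = sum(s.weight for s in self.signposts if s.observed)
        return min(1.0, self.prior + boost)

road = Roadmap("low-cost desktop gene synthesis", prior=0.05, signposts=[
    Signpost("benchtop synthesizer under $10k announced", 0.2),
    Signpost("hobbyist community exceeds 10k members", 0.1),
])
road.signposts[0].observed = True
print(round(road.likelihood(), 2))  # 0.25
```

Tracking many such roadmaps side by side lets decision makers watch an initially low-probability scenario climb as its signposts are observed.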
MODEL DESIGN OPTIONS DEVELOPED AT THE WORKSHOP
The workshop began with a group discussion of system goals and design features, followed by a moderated system design exercise. The workshop participants were divided into three small subgroups to facilitate participation by all group members. Each subgroup developed an option for system design, and a fourth option was submitted by workshop attendee Stan Vonog. The four options, as labeled by the committee, are described in the sections that follow.
Intelligence Cycle Option
The name Intelligence Cycle Option was given to the system design that uses an approach similar to the classic approach used by the intelligence community: hypothesize, task, collect, and analyze. This system design is organized around four functions:
The input of a “big question,”
Signal identification and hypothesis generation,
Hypothesis evaluation and testing, and
The authoring of potential future narratives.
The initial system input is a “high-level question” framed by the stakeholder, which is then used to initiate creative hypothesis generation fed by passive data collection (from movies, media, and online databases, for example) and active data-gathering (e.g., crowdsourcing, games). The hypotheses undergo evaluation by experts or outside participants in the science and technology, financial, and sociopolitical arenas or through data analysis or mechanisms such as games (by which scenarios can be tested) or focus groups. The output from these processes is shaped into a complete narrative of possible events and presented to stakeholders. It can then be used as input for further hypothesis development and analysis.
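The four-function loop of the Intelligence Cycle Option can be expressed as a chain of stages feeding one another. The sketch below is purely structural: the function names, placeholder data, and plausibility scores are illustrative assumptions, not components of the actual design.

```python
# A minimal sketch of the four-stage cycle; each stage's internal
# logic here is a placeholder for the processes described in the text.

def collect(question):
    # Passive and active data gathering keyed to the "big question."
    return [f"signal about {question}", f"crowd post on {question}"]

def hypothesize(signals):
    # Turn raw signals into candidate hypotheses.
    return [f"hypothesis: {s}" for s in signals]

def evaluate(hypotheses):
    # Expert, game, or focus-group vetting; here every hypothesis
    # receives a flat placeholder plausibility score.
    return [(h, 0.5) for h in hypotheses]

def narrate(scored):
    # Shape surviving hypotheses into a narrative for stakeholders.
    kept = [h for h, score in scored if score >= 0.5]
    return "Possible future: " + "; ".join(kept)

question = "ubiquitous cheap sensors"
story = narrate(evaluate(hypothesize(collect(question))))
# The finished narrative can itself seed the next iteration of the cycle.
```

The closed loop is the essential feature: the narrative output becomes input for further hypothesis development, mirroring the cycle described above.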
The Roadmapping Option for system design focuses on developing roadmaps of predicted events proceeding from the present to predicted future scenarios. The roadmaps are based on collected data, including observations of communities of interest. The key to this model is the generation of signposts that can be monitored as the roadmap progresses. The major elements of the system are as follows:
Selection of communities of interest to monitor;
Techniques for mapping, processing, and evaluating inputs; and
Communication to decision makers.
This design starts with the selection of existing communities that are in the process of experimenting to solve problems. The subgroup decided that for the 1.0 version, a limited number of communities of interest should be monitored in order to gain an understanding of their activities. These communities of interest should have considerable activity and resources—both human and capital—behind them.
Ideas collected from these communities plus other traditional data-gathering techniques are employed to develop future scenarios to be explored. Knowledge-discovery tools such as data mining, classifiers, and data-visualization tools can be used to assist forecasters in monitoring various communities of interest. The ideas generated by these communities then need to be filtered and spun into narratives. The predictions or hypotheses
generated from the narratives are correlated with current events, mapped to current trends and models, and explored by different communities to test their validity. Refined hypotheses or narratives are analyzed to determine the impacts of and paths to their realization. Experts backcast by predicting, based on a forecasted future event, a roadmap of how that event might occur, including the signposts and signals that would indicate progress. The forecasting system then searches for the identified signposts and signals. As data accumulate and correspond to the narratives, some will emerge as more relevant than others. Ultimately, the signals, signs, and scenarios and their impacts are reported to decision makers.
The Crowdsourced Option forecasting system is organized by input, analytical approaches, and outputs, with a focus on creating clear, actionable outputs in the form of reports. Its name reflects its use of open participation from the “crowd” (either the general public or targeted populations) to gather forecasting inputs. These inputs are analyzed in multiple ways, employing a combination of crowdsourcing techniques and expert analysis. The final analysis is done by an expert forecasting committee or their delegates. If this endeavor were conceptualized as a business, the expert forecasting committee would be the founding board. That committee, or its delegates, would respond to a specific query from a stakeholder or a sponsor. It would then be the responsibility of the expert forecasting committee to produce regular, systematic reports. Reports can also be made on interesting signals, events, or technologies that are independent of a customer query.
The public face of this crowdsourced option would be “Disruptipedia,” an online portal where data, information, live questions and responses, signals, signposts, forecasts, scenarios, and narratives would be displayed. To attract contributions from smart, observant, knowledgeable people, incentives could include real or virtual currency or the attention of influential people. The input could have an alternate use that would be beneficial to creative people—as a source for movie script ideas, for example. Such an arrangement could therefore be mutually beneficial to the forecasting committee members and the participants. Disruptipedia would serve as a living repository for the gathered information, so that consistent data, information, and language could be accessed through the decades as the system grew.
One final option, suggested by an individual workshop attendee, was inspired by attendee comments that forecasts should be contextualized by forming associated narratives of possible future events. The system is derived from a functional organization chart released by Walt Disney Studios in 1943 and is based on the storyboard process—bringing a story idea through production to the screen. The model focuses on the development of narratives from broad themes or “big questions.” Using this question as a central theme, a small set of potential scenarios is created that identifies possible contexts for exploration. Data relevant to those potential scenarios are then collected using both human- and machine-based methods. Next, the data undergo critical analysis by teams of scientific, technical, political, and economic experts who identify trends and form viable hypotheses, all of which are reported back to a story director. These hypotheses are applied to the initial scenarios to create output in the form of complete narratives that can be used in reports, demonstrations, and entertainment media. Notably, the Disney model incorporates no hierarchy that breaks operating divisions into separate “silos.” The chart lacks a chain of command and lines of authority. Instead, all staff positions serve to support a common work flow.
KEY OBSERVATIONS AND RECOMMENDATIONS
After the workshop, the committee met and discussed the data collected at previous meetings and the output from the workshop exercises. After considering the sum of collected ideas from these activities, the committee made the following key observations and recommendations for future courses of action in the development of a persistent technology forecasting system. Additional observations and recommendations are offered in Chapters 1 through 3.
Flexibility and Leadership
The model-building exercise performed at the Forecasting Future Disruptive Technologies Workshop convinced the committee that there are multiple viable approaches to building a persistent forecasting system. Given the multitude of design options available, the committee believes that building a minimal but functioning system (hereafter referred to as a 1.0 system) to test and develop would be a more productive next step than spending excessive resources on planning a complete system structure at the outset. A 1.0 system should embody six important functions: (1) needs definition, (2) collecting alternative futures, (3) developing alternative futures, (4) roadmapping, (5) engagement, and (6) feedback. Each option produced at the workshop contained at least one particularly well-thought-out element that merits special consideration.
Key Observation. The Department of Defense needs one or more effective forecasting systems of disruptive technologies to reduce surprise created by future disruptive technologies.
Key Observation. There is more than one way to build a forecasting system; each model has different strengths and weaknesses.
Key Recommendation. The 1.0 version of a forecasting system should employ the extensive passive and active data-gathering techniques employed in the Intelligence Cycle Option, using the data to develop roadmaps of potential futures with signals and signposts derived from data inputs (as seen in the Roadmapping Option). The end product of the system should include constant output and objective-driven output as described in the Crowdsourced Option. (Recommendation 2-1)
Key Observation. The illustrative models developed at the workshop indicate that the design and building of a 1.0 version persistent forecasting system for disruptive technologies are possible using existing technologies and forecasting methods and can be achieved within a reasonable time frame using a modest level of human and financial resources.
Key Observation. A disruptive technology forecasting system focuses on technological wildcards: innovations that have a low or unknown probability of development but, if developed, would have enormous impact.
Key Recommendation. A persistent disruptive forecasting system should be built to help the intelligence community reduce the risk of being blindsided by disruptive technologies. (Recommendation 3-9)
Another major concept reinforced by the workshop is the importance of a focus on developing narratives of the human use and impact of technology instead of a focus on specific technologies. It is the context of solving human needs that drives technology use. The forecasting system should therefore start by identifying big problems and opportunities and using them to generate alternative scenarios and hypotheses from which relevant technologies can be derived. There are often multiple solution paths to solve a single problem, but the immediate and second-order effects of the different solutions might be very similar. Scenarios will also vary by region, as they will have distinct impacts and solutions in different locales. To improve the robustness of the scenarios, it is important that there be regional representation in the creation of these scenarios and the development of the narratives. By emphasizing the narrative, forecasters avoid devoting too many resources to tracking “the wrong” solutions and technologies and instead stay focused on the potential effects of disruptive technologies. They also create compelling arguments that can later be presented to stakeholders.
Narrative is a useful and powerful tool that can augment and contextualize other forecasting tools and approaches. Both quantitative and qualitative forecasting approaches have a role in a robust forecasting system. Many of these approaches, discussed in detail in this committee’s first report (NRC, 2010), can and should be
integrated into the various model design options. The workshop participants and committee members focused on the importance of including the use of narrative to contextualize the impact of forecasted technologies and alternative futures derived from both qualitative and quantitative forecasting methods.
Key Observation. Beginning the forecasting process with narratives of potential futures rather than starting with a list of potential technologies produces more useful insights into possible outcomes.
Key Recommendation. The 1.0 version of a forecasting system should begin developing a forecast of future events or conditions by constructing structured narratives describing disruptive impacts within a specific contextual framework related to particular technology use. It should then use backcasting to roadmap potentially disruptive technologies and the triggers that enable these technologies, and then iterate the mileposts for the narrative. (Recommendation 3-1)
Key Recommendation. The responsible organization should develop a repository of narratives of potential futures, organized both globally and by region, that include potential economic, technological, and societal impacts. (Recommendation 3-2)
An internal DoD team will need to address the challenges of handling classified and compartmentalized data and scenarios, and create appropriate processes for including them in a repository.
Role of Government
The workshop attendees and committee members believe that a large portion of the forecasting system should be independent of the government in order to attract contributions of data from diverse sources and to maximize opportunities to collect innovative input. To generate data representing the widest possible variety of ages, regions, ethnicities, and points of view, the system should be open to the general public. An obvious U.S. government affiliation might deter participation, leading to biased forecasts. Therefore, the resources and talent for the design, building, and operation of the system should come primarily from outside the government.
Key Recommendation. Any forecasting system developed should be insulated to allow users to generate and investigate controversial or uncomfortable ideas. Participants and staff should identify the reasons that an idea is considered implausible and be able to understand what developments will be needed to arrive at that future. These developments should become signposts on the roadmap of the forecast. (Recommendation 3-4)
Key Recommendation. The Department of Defense and the intelligence community should consider using a separate, independent, multinational, multidisciplinary nonprofit or dot-org group to run the crowdsourced platform. The organization should be structured correctly from the beginning to ensure trust and good working relationships among staff. The crowdsourced platform should have its own separate governance with leadership representing multiple ethnicities and disciplines. (Recommendation 3-7)
Key Recommendation. A forecasting system should have two separate teams, one team working on the open external forecasting platform and another team developing an internal forecasting platform that services specific needs of an organization. The external team should encourage broad and open participation and exchange of ideas and scenarios from a broad range of participants and experts. The internal forecasting platform should address scenarios that are specific to the organization and may involve sensitive, proprietary, or classified scenarios and data that it is only willing to share with trusted parties. (Recommendation 3-8)
Funding and Management
The workshop attendees and committee believe that the forecasting system should be built and funded according to a start-up model, whereby the government provides seed-level financing to build a working 1.0 version of the system. Beginning the project with minimal resources forces the development team to make tough decisions up front and to focus the effort on developing and perfecting core system features. The leaders in charge of the system should be required to seek additional outside funding sources, ensuring that the system is robust enough in its early stages to inspire confidence and attract sponsorship. Once developed, the system needs to be able to sustain itself by providing enough ongoing value to attract continual sponsorship from both government and other parties (including governments, corporations, institutions, and organizations) to cover the cost of operating, maintaining, and improving the system.
Key Recommendation. The Department of Defense and the intelligence community should begin the process of building a persistent forecasting system by selecting leadership and a small, independent development team. The team should be given seed-level funding to establish an organizational structure and business plan and build a working 1.0 version of a disruptive technology forecasting system. The organization should then be expected to attract additional funds from domestic and foreign corporate, nonprofit, or government sources. (Recommendation 3-6)
Bhan, Nita. 2010. Emerging markets as a source of disruptive innovation: 5 case studies. Core 77 Design Magazine and Resource. February 3. Available at http://www.core77.com/blog/business/emerging_markets_as_a_source_of_disruptive_innovation_5_case_studies_15843.asp#more. Last accessed March 1, 2010.
Bower, Joseph L., and Clayton M. Christensen. 1995. Disruptive technologies: Catching the wave. Harvard Business Review. January-February.
Dalkey, Norman C. 1967. DELPHI. Santa Monica, Calif.: RAND Corporation.
Enis, C.R. 1995. Expert-novice judgments and new cue sets: Process versus outcome. Journal of Economic Psychology 16(4): 641-662.
Giridharadas, Anand. 2008. The making of Tata’s new car. New York Times. January 7. Available at http://www.nytimes.com/2008/01/07/business/worldbusiness/07iht-car.1.9051152.html. Last accessed March 1, 2010.
Govindarajan, Vijay. 2009. The case for “reverse innovation” now. Business Week. October 26. Available at http://www.businessweek.com/innovate/content/oct2009/id20091026_724658.htm. Last accessed March 1, 2010.
Howe, Jeff. 2006. The rise of crowdsourcing. Wired. June 14. Available at http://www.wired.com/wired/archive/14.06/crowds.html. Last accessed May 27, 2010.
Johnston, Rob. 2003. Reducing analytic error: Integrating methodologists into teams of substantive experts. Studies in Intelligence 47(1): 57-65. Available at https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/vol47no1/article06.html. Last accessed May 8, 2009.
NRC (National Research Council). 2010. Persistent Forecasting of Disruptive Technologies. Washington, D.C.: The National Academies Press.
Önkal, D., J.F. Yates, C. Simga-Mugan, and S. Öztin. 2003. Professional vs. amateur judgment accuracy: The case of foreign exchange rates. Organizational Behavior and Human Decision Processes 91: 169-185.
Whitford, David. 2008. Hired guns on the cheap: New online services can help you find freelancers for less. Fortune Small Business Magazine. Available at http://money.cnn.com/magazines/fsb/fsb_archive/2007/03/01/8402019/index.htm. Last accessed January 29, 2010.