Information technology (IT) underpins every sector of the economy—from telecommunications to commerce to health care to agriculture. IT-enabled innovation, advances in IT products and services, and increasing IT capabilities across many U.S. industries draw on a deep tradition of research, rely on sustained investment, and benefit from a uniquely strong partnership in the United States among government, industry, and universities. This IT innovation ecosystem, key features of which are depicted in Figure 2.1, fuels a virtuous cycle of innovation with growing economic impact.
This chapter discusses key lessons about this vibrant ecosystem, drawing on examples depicted in Figure 2.1 and documented in Appendix B (and distilled further in Chapters 3 through 5), as well as on results from prior Computer Science and Telecommunications Board studies.
One measure of the impact of investment in IT research and development (R&D) is its contribution to the creation of numerous U.S. firms with annual revenues in the billions of dollars and of entire new sectors that contribute billions more to the U.S. economy. Many of these firms are household names, and their products and services underpin the digital economy and, indeed, the economy more broadly. In 2019, the combined estimated annual revenue of the top 10 Fortune 500 technology companies was nearly $1 trillion1 and the combined estimated annual revenue of the top 10 Fortune 500 telecommunications companies was approximately $500 billion.2 These figures do not, however, capture the full economic impact of IT in automotive, health care, agriculture, entertainment, commerce, and other sectors that now depend heavily on IT.
The paths from basic IT research to economic impact have been driving innovation for at least six decades. Figure 2.1, an update of the 1995 "tire tracks" figure and subsequent 2003 and 2012 versions, illustrates how fundamental research in IT, conducted in industry and universities, has led to the introduction of entirely new product and service categories and capabilities. It reflects a complex research environment in which concurrent advances in multiple subfields (in particular, within computer science and engineering, but extending into other fields too, from electrical engineering to psychology and design) have been mutually reinforcing, stimulating, and enabling one another, leading to powerful IT innovations carried forward by top-performing U.S. firms.
Listed on the left side of Figure 2.1 are areas where major investments in fundamental research in the subfields of computing and communications have had the impacts shown on the right side of the figure. The red tracks represent university-based (and largely federally funded) research, and the blue tracks represent industry R&D (some of which is also government funded). The green lines represent products and services resulting from academic and industry R&D. These are mostly commercial products and services, but in some cases these are widely used open-source software artifacts that are key ingredients in further research and innovation.
The arrows between the tracks represent some salient examples of the rich interplay between academic research, industry research, and products and indicate the cross-fertilization resulting from multi-directional flows of ideas, artifacts, technologies, and people. These examples were selected by the committee as being
1 This figure includes revenues from Apple, Alphabet, Microsoft, Dell Technologies, IBM, Intel, HP, Facebook, Cisco Systems, and Oracle. See Fortune 500, 2019, “Sector: Technology,” https://fortune.com/fortune500/2019/search/?sector=Technology.
2 This figure includes revenues from AT&T, Verizon Communications, Comcast, Charter Communications, CenturyLink, Frontier Communications, Windstream Holdings, and Telephone and Data Systems. See Fortune 500, 2019, “Sector: Telecommunications,” https://fortune.com/fortune500/2019/search/?sector=Telecommunications.
representative of the interconnections in the IT innovation ecosystem and reflective of significant IT developments. The representative examples are documented in Appendix B.5 Notably, these flows are multi-directional: from academic work to industry research to industry products and services, but also from industry artifacts and research back to academic laboratories. Arrows spanning research areas provide only a few indications of the interdependence of research advances in various areas. Stepping back, these flows energize virtuous cycles of innovation across areas of IT research and industry activity.
The tracks connect at the center band of the figure to depict key IT innovations and outcomes. These underpin leading U.S. IT firms as indicated by the colored arcs. Increasingly, these firms have come to draw on the results of multiple research tracks.
In the right portion of the figure, these outcomes are linked to some major U.S. economic sectors that have been transformed by the cumulative impact of IT innovation coupled with domain-based innovation and expertise. This report describes this phenomenon as "confluence": multiple threads of IT-enabled innovation converging to have a transformative impact on a major industry sector. The selected companies and industries are illustrative of the now widespread impact of IT innovation in the U.S. economy. As described in Chapter 5, some confluence threads draw from IT companies providing key products and services, while other confluence threads reflect fundamental business transformations catalyzed by new computing capabilities woven together with deep domain knowledge and invention. The background ripples in the figure illustrate schematically how the left-to-right flow of innovation to impact is part of a larger, virtuous cycle dependent on use-inspired and use-informed interdisciplinary research.
Figure 2.1 is of necessity incomplete and representative in nature. For one, it captures only a small sample of significant academic and industrial IT research and the resulting IT products and services. For example, it does not chart the research leading to remarkable improvements in analog and digital semiconductors that have helped advance computing and communications capabilities. It would be nearly impossible to chart all of the important cumulative contributions of research and their
5 Some of the examples also appeared in earlier versions of this figure. However, reports prior to 2012 did not preserve documentation describing the developments. Consequently, many of the research developments and commercial impacts depicted are new to this report. Appendix B documents the developments represented by the arrows in Figure 2.1.
TABLE 2.1 Estimated 2019 Annual Revenue of Illustrative U.S. Information Technology (IT) and IT-Enabled Firms Shown in Figure 2.1 (in $billions)
| IT Firm             | Revenue | IT-Enabled Firm          | Revenue |
|---------------------|---------|--------------------------|---------|
| Amazon Web Services | 35.00   | Disney                   | 59.40   |
| Cisco               | 49.30   | National Football League | 14.50   |
links to today’s products, firms, and industries. For example, Google’s success can be attributed to advances in at least five research areas: networking and communications; systems and architecture; databases; artificial intelligence (AI), machine learning, and data science; and human-computer interaction. Likewise, Google’s investments in autonomous vehicles and home automation draw from a rich federally funded portfolio in robotics and cyber-physical systems. Amazon is an even more involved example and is the backdrop for the confluence narrative in Chapter 5 that unpacks a “simple” e-commerce transaction. In the entertainment industry, the rapid adoption of interactive technologies on top of broadband communications, cloud computing, dazzling graphics, engaging user experiences, and AI-driven personalization bolsters traditional companies such as Disney while enabling entrants such as Netflix, Amazon, and Hulu. In short, the question of what is an “IT company” has become much more complicated since the 2012 report, and the economic impact of IT capabilities extends well beyond this dynamic border.
This interplay across industries is mirrored by the cross-fertilization of research ideas within computing. Although the tracks in Figure 2.1 were chosen to illustrate, through prominent examples, how each selected research area is closely connected with major innovations stemming from that field, in reality each research area is linked in many ways to one or more industry areas. Research outcomes in one area have continued to affect and enable research in other areas. Several examples of this phenomenon are shown in the figure as arrows from one research area to another.
Furthermore, synergies among research areas often lead to surprising results and have impacts that were not originally intended or envisioned. This characteristic of technological innovation is most evident in the broad-based impact of computing research. Such research often starts as a search for fundamental knowledge, but time and again produces practical technologies that enable significant economic impact (Table 2.2). Moreover, these tracks also depict the impact of complementary use-inspired research stemming from increasing amounts of deep interdisciplinary collaboration between computing and other areas of science (e.g., psychology in human-computer interaction) and industry (e.g., health care making use of and helping advance many IT innovations).
TABLE 2.2 Linking Core Information Technology (IT) Research Areas to Examples of Economic and Confluence Impacts
| Core IT Research Areas | Fundamental Research Goals | Significant IT Innovation | Economic, Societal, or Confluence Impact |
|---|---|---|---|
| Networking, Communications | Reliable, scalable, manageable, tethered and untethered communications networks | Local area networks, Internet, wireless, broadband | Pervasive use of the Internet, the Web, and cell phones throughout society and economy; communications networks used to operate cars, airplanes, and ships; e-commerce, telehealth, teleconferencing |
| Systems, Architecture | Manage increasingly complex computers, storage devices, and distributed systems and enhance their performance | Smartphones, cloud, personal computing, microprocessors | Over 3 billion smartphones worldwide; cloud services; Web and search technologies; enterprise data sharing |
| Theory, Programming Languages | More effectively create software; understand the nature of computation and apply that understanding to create more efficient methods | Scalable, dependable, and agile software | Pervasive use of optimization, digital reconstruction, DNA sequencing, cryptocurrencies, and blockchain |
| Databases, Analytics | Manage, discover, locate, and analyze information | Enterprise software and systems | Widespread use of data sharing or data warehouses; precision medicine, electronic health records; precision farming |
| Security, Privacy | Protect networks and computers from disruption or theft or damage to the data they contain; allow people to control their personal information | Secure computing | Confidential Internet financial transactions |
| Robotics, Cyber-Physical Systems | Create systems incorporating sensors and actuators that operate autonomously or semi-autonomously in cooperation with humans; manage cyber-physical and physical-cyber interdependencies | Automation, robotics, sensors, control systems | Surgical robots, smart medical devices, adaptive cruise control, automated manufacturing and fulfillment centers, smart homes |
| Artificial Intelligence, Machine Learning, Data Science | Simulation of human-level intelligence, including language understanding, vision, learning, and planning | Speech and image recognition, reasoning, prediction, optimization | Medical diagnostics, sports coaching and training, crop management, predictive analytics |
| Graphics, Simulation | Display of images and movies; realistic modeling and simulation | Video and animation techniques; virtual, augmented, and mixed reality; GPUs | Video games, computer animated films, computer-aided design, advanced training tools |
| Human-Computer Interaction | Advances in theory, design, and technology to create usable, useful, and compelling computing experiences | Web, social media, mobile interaction tools and gesture interfaces, accessibility, interaction design | Productivity and collaboration tools, mobile apps, recommendation tools, user experience design |
A virtuous cycle of innovation has evolved as IT innovation has become increasingly woven into innovation across the economy. Overall, the timescale for translation from basic research to commercial impact varies widely. It is difficult to predict a priori which research will pay off rapidly, which will take time, and which will surge repeatedly. As new research emerges, rich connections in the ecosystem and the availability of funding—both venture capital and investment by existing firms—sometimes make it possible for research ideas to be quickly scaled up for commercial use.
At the same time, some payoffs emerge only after years or decades, as innovations compound, design and product knowledge mature, technology components become less expensive, and viable market paths emerge. One of the most important messages of Figure 2.1—which can be seen, for example, as arrows that span significant periods of time, representing research that has an impact on further research or products and services years later—is the unpredictable and sometimes long incubation period that separates the multiple phases of innovation: research exploration, initial commercial deployment, and eventual business breakthrough. The time from early research insights to successful product or service can run decades—a contrast to the more incremental innovations that tend to be publicized as evidence of the rapid pace of IT innovation. This observation speaks to the value of sustained funding and long research horizons in enabling many IT innovations.
To further explore these timescales and payoffs, this report focuses on two key patterns of innovation that connect investment in fundamental research with economic impact: resurgence and confluence.
The interests of researchers or funders can fall off in areas where progress has slowed, to be followed by a resurgence when new ideas or enablers emerge. Examples that are further described in Chapter 3 include the following:
- Early work in neural networks was later revived as the basis for deep learning, enabled by a combination of new algorithms, new sources of training data, and dramatic advances in computational power and memory capacity.
- Formal methods research was applied years later to improve software quality—enabled by research that improved the usefulness of verification and model-checking techniques and a growing appreciation by potential users of the power of formal analysis to prevent attacks and losses.
- 1960s-era industry work on virtual machines was aimed at sharing a single computer among many users. Academic research in the 1990s reinvented virtualization to allow a microprocessor to run multiple software environments, leading to the capabilities that make it possible to share computing resources within a data center, a vital foundation for cloud computing.
The canonical example for resurgence in research is the dramatic twists and turns that make up the history, and likely the future, of work in AI. This subfield
in computing has notably weathered multiple “winters” of disappointing results between highly promising “summers,” but recently has seen an explosion of research and commercial success in machine learning and deep neural networks that has had significant impact on IT itself as well as on the applications and services that this success has enabled. Chapter 4 portrays these seasons of AI research in several key areas: machine learning, reasoning, natural language processing, robotics, and computer vision.
There are other examples of resurgence throughout computing. For example, virtual and augmented reality is starting to be widely used, enabled in part by new headsets that are lightweight, untethered, and usable throughout large working volumes. Virtual reality research over five decades has advanced with enhancements to image processing and computer graphics, especially methods for rapidly rendering realistic scenes. As depicted in Figure 2.1 and detailed in Appendix B, many advances in computing research connect back to Ivan Sutherland's groundbreaking 1963 Massachusetts Institute of Technology doctoral research, the root of subsequent work in graphics and human-computer interaction (HCI), which catalyzed work years later across networking, systems, programming languages, and graphics.
This repeated pattern of resurgence has lessons for the researchers and funders seeking to understand the timing of major breakthroughs and economic payoffs from fundamental research. Sustained funding is needed to weather the winters and summers of research activity, and researchers should also be encouraged that persistence and inspiration from the past have their rewards.
IT innovations yield transformative results across the U.S. economy, fueling advances in sectors such as health care, agriculture, transportation, and commerce. A virtuous cycle of innovation has arisen and grown as IT innovation has become increasingly woven into innovation across the economy. This report calls this innovation phenomenon confluence because it relies on the bringing together of multiple streams of innovation in IT, innovation within sectors, and innovation in how IT is used to solve problems and create new capabilities in those sectors.
This report—particularly Chapter 5 and the examples in Figure 2.1—focuses on especially salient and more recent examples of confluence, pulling from more than a decade of mutually reinforcing innovation in health care, agriculture, and automotive sectors.
It is important to recognize that other important examples of confluence are so pervasive, and happened so long ago, that their impact is less evident. Highly innovative at the time, they have become today's "routine IT": essential capabilities now used to innovate across the economy and society. To capture a portion of this legacy, this report provides examples from the long history of research investments in networking, systems, architecture, programming languages, theory, security, AI, and HCI that intertwine as the backdrop of every consumer e-commerce transaction. Although AI is the canonical example of resurgence, the Internet (now 45 years old) is the canonical component that underlies many confluence successes.
In many instances of confluence, the curiosity and serendipitous discoveries that fueled early computing research are combined with innovations and insights from "use-inspired" research that often looks to other fields and applications for rich problems and opportunities for impact. This virtuous cycle, a push and pull between computing and other fields and industries, has repeatedly created a path for economic impact that runs from industry adoption of basic IT capabilities (e.g., databases, digitization of work products, and networking and communication) to more profound opportunities for transformation (e.g., new business models, data-driven insights, automation, optimization, and new ways to connect with consumers). Notably, the resurgence narrative of AI is also repeatedly found in confluence examples as data-driven insights transform industries across the board.
Confluence then fuels many IT "surprises" as fields combine in novel and unanticipated ways. Consider the development in recent years of online forums for patients with rare diseases. These forums would not be possible without the Internet, the Web, and desktop and mobile operating systems. Built on those technologies, and on a long history of government-funded research in online collaboration and advances in each of these areas, they have made it possible for patients to aggregate their information in a way otherwise not possible, creating new opportunities for clinical trials of potential treatments. These interactions foster further innovation; use-inspired research often yields new IT research insights and challenges.
This innovation relies on several factors. One is the ability to combine deep expertise about the domain in which IT is being applied with deep expertise in IT. A second is design and production knowledge, again combining knowledge of the application and of IT. A third is the development of new business models that take advantage of the capabilities afforded by IT. One example is the role of computational thinking as IT concepts are brought to bear on analyzing and reshaping information and workflows in other domains.
Perhaps the most important enabler of confluence is that the IT research community and research funders have embraced the conduct of and support for IT research that is deeply informed and inspired by challenges in other domains. Critically, these research interactions foster further innovation. Looking ahead, domain-centric challenges are shaping IT research “up and down the stack” as the research community now embraces work in application-specific computing systems, new kinds of data storage, input/output communication, and new computing architectures. This shift away from focusing solely on general-purpose computing platforms holds great potential.
The United States fosters a unique and powerful range of research investment and partnerships. These exist within a complex ecosystem encompassing university and industrial research enterprises, federal research funders, emerging start-ups and more mature technology companies, investors in innovative firms, communities that develop and support open-source software, and the regulatory environment and legal frameworks in which innovation takes place.
The federal government plays an essential role in sponsoring fundamental research in IT, largely based in universities, because its investments in long-term, fundamental research are an essential complement to industrial research, which reflects different goals and incentives, resulting in differences in style, focus, and time horizon. The federal role has coevolved with the development of IT industries. Its programs and investments have focused on capabilities not ready for commercialization and on emerging national needs alongside growing commercial capabilities, both of which are moving targets. Federal research funding complements rather than preempts industry investments in research.
Most often, the federal investment that contributed to the development of the IT capabilities at the center band of Figure 2.1 took the form of grants or contracts awarded to university and industry researchers by the Defense Advanced Research Projects Agency (DARPA) and other defense research agencies and/or the National Science Foundation (NSF), with the latter playing an increasingly dominant role in supporting academic IT research.6 A shifting mix of other funding agencies has also been involved, reflecting the missions of these agencies and their needs for IT. For example, the Department of Energy (DOE), the National Aeronautics
6 According to the NSF 2021 Federal Budget Request, NSF provides 87 percent of all federal funding for basic computing research.
and Space Administration (NASA), and the military services have supported high-performance computing, networking, HCI, software engineering, embedded and real-time systems, and other kinds of research. The National Institutes of Health invests in research in biomedical computing and robotics, and the Intelligence Advanced Research Projects Activity invests in areas such as data analytics, speech recognition, and language translation. Today, an array of federal agencies participates in the federal Networking and Information Technology Research and Development (NITRD) program and IT-related committees of the National Science and Technology Council, reflecting their interest in supporting advances in various aspects of computing and communications to fulfill their missions.
Universities are especially well suited for foundational research, including work on topics for which no immediate business purpose has been identified. Academic researchers carry out work at multiple scales, from individual principal investigators to multi-institution, multidisciplinary centers. The majority of funding for university-based research comes from the federal government, but universities also receive funding from industry for individual projects and centers as industry has come to better appreciate the value of academic research. Industry funding is growing in proportion at academic research centers, in part because federal funding has not kept pace. Maintaining a healthy balance of funding sources continues to be a challenge.
Among the distinguishing characteristics of universities is their ability to pursue foundational research—provided their sponsors are willing to take a similarly long-term perspective. Examples where research seemingly had no immediate application but ultimately had major impact include decades of research in number theory that ultimately gave rise to modern cryptography and 1960s work on perceptrons aimed at mimicking human brain activity, which ultimately gave rise in the 2010s to deep learning. Relatedly, collateral results, often unanticipated, can be as important as the anticipated results of research programs.
Universities are also well positioned to engage in transdisciplinary use-inspired research. They are able to bring together experts from diverse fields to work together on transdisciplinary societal challenges that implicate IT research, such as improving health care outcomes, the role of smart technologies in city systems, and even the future of sports. It should be noted, however, that universities must proactively incentivize and support transdisciplinary work. The organization of almost all universities into disciplinary silos can be a powerful impediment to the growing need for multidisciplinary collaboration.7 Universities can also act as a convener of
7 National Academy of Sciences, National Academy of Engineering, and Institute of Medicine, 2005, Facilitating Interdisciplinary Research, The National Academies Press, Washington, DC, https://doi.org/10.17226/11153.
companies within a particular industry sector—acting together to elevate a field and to spur innovation within that sector.
The IT sector invests an enormous amount each year in R&D. IT research both creates the virtual world and discovers truths intrinsic to it and its connection to the social and natural spheres. It is critical to understand, however, that the vast majority of corporate R&D has been focused on product and process development. This focus is what shareholders (or other investors) demand. It is much more difficult for corporations to justify funding long-term, fundamental research. In contrast, incremental research and product development can be performed in a way that directly benefits the sponsor. It can be done under wraps, and it can be moved into the marketplace more quickly and predictably. The level and focus of industry research support also tends to wax and wane according to industry trends and market conditions. By contrast, universities are often better positioned to carry out long-term, sustained research that has broad applicability. The resurgence narrative of AI provides many examples of bursty investments in AI development in industry, with long gaps where AI research was sustained in academic laboratories.
Economists have articulated the concept of appropriability to express the extent to which the results of an investment can be captured by the investor, as opposed to being available to all players in the market. The results of long-term, fundamental research are difficult for any single investor to appropriate, for several reasons: they tend to be published openly and thus become generally known; they tend to have broad value; it is difficult to predict in advance which investments will be important; and they may become known well ahead of their realization as a product, so many parties have the opportunity to incorporate the results into their thinking. Such innovations effectively "raise everyone's boat," much as do government investments in bioscience, health care, and other strategically important scientific disciplines. Consequently, private-sector firms have less incentive to invest significantly in such research, whose benefits could spread quickly to their rivals. Low appropriability also helps to explain why the companies that have tended to provide the greatest support for fundamental research are larger firms with leadership positions in their respective markets, which are best placed to capture a share of the broad benefits. This pattern holds even when the expected benefits to industry as a whole far exceed what would be needed to justify the expense. Hence, despite the economic growth of the IT industry overall, there persists a need for federal funding of early-stage, higher-risk research that may provide broad benefits.
Nevertheless, large IT companies also have strong incentives to share ideas and artifacts in order to gain mindshare and market influence around use of their tools and technology ecosystems, stoke general innovation that benefits both the company and society more broadly, and respond to competition for top talent who
demand such opportunities. For example, IT companies have a history of publishing innovative research in tech reports, conference papers, and journal articles, and they make industry-developed tools (e.g., Google’s BERT neural language models or Microsoft’s Z3 Theorem Prover) available for use by universities and start-ups.
Especially in recent years, there are some forms of computing research that prosper only with industry participation, because research advances depend on access to artifacts and other resources of a kind and scale not available outside of industry. For example, some research requires access to proprietary, often large, data sets that capture information from a huge segment of the world’s population. Mechanisms, such as visiting scientist appointments for faculty and students, help facilitate needed access. For academic researchers, such arrangements can be necessary to allow certain research ideas to be pursued. Industry in turn benefits from the intellectual contributions of faculty and students, both directly (the opportunity to recruit promising students) and indirectly (contributing to the talent pool that supports the industry).
Start-ups represent the other end of the spectrum. A hallmark of U.S. entrepreneurship, start-ups and start-up financing have facilitated the development of high-risk products as well as an iconoclastic, risk-taking attitude that contrasts with more traditional companies and managers in the IT business. But they rarely engage in research themselves. Thus, start-ups are notable for two reasons: first, although start-ups at least temporarily attract some researchers away from university-based research, they place them in a position to spearhead innovation, often based on their university work; second, notwithstanding the popular labeling of start-ups as "high-tech," they generally combine and apply the fruits of past research rather than generating new research capabilities and results.
Through the 1980s there were a significant number of U.S. industry research laboratories conducting fundamental research in IT, including Bell Labs, Xerox PARC, IBM, and Bellcore. As the industry matured, many of these laboratories became less focused on fundamental research and more focused on shorter-term advanced development. Another major laboratory, Microsoft Research, was founded in 1991. The patterns are different across IT firms. Google, for example, invests significantly in research, but its researchers are generally embedded in product teams; it has also invested in research-focused organizations such as DeepMind. There are also philanthropy-funded enterprises with roots in the IT industry that conduct foundational research, including the Allen Institute and OpenAI.
To augment their research activities, many IT firms also leverage the top IT research groups in the country by supporting university-based consortia. One advantage of this model is that the resulting intellectual property is not locked
up in corporate silos but rather is widely available for adoption by the broader IT ecosystem.
Research-enabled commercial developments also expand the possibilities for research, given that commercialization has led to substantial decreases in cost. Lower costs have allowed for much wider penetration of technology and have in turn greatly lowered the barrier to innovation, opening the door to a much wider range of both research and researchers.
The IT innovation ecosystem relies on a growing number of mechanisms to foster innovation, facilitate its uptake by potential users, and sustain interactions among existing and potential commercial users and the computing research community. Which mechanisms work best in which circumstances is a complex and, in many instances, open question that this report does not address.
A general strategy underlying these mechanisms is that research environments can provide a “time machine” that allows researchers, students, and industry partners to experience future visions of IT in the present. Examples include the major investment in supercomputers that provided access to powerful computational capabilities in advance of today’s cloud infrastructure; the research and education networks such as ARPANET, CSNET, and NSFNET that presaged today’s Internet; dedicated efforts to build personal computing experiences and networked machines for collaboration in advance of widespread adoption of personal computers; and current grand challenges and testbeds for autonomous vehicles. IT research invests significant resources to construct the future in the here and now to explore, invent, and refine ideas—and train others.
Many companies directly incorporate cutting-edge IT advances stemming from research, fueling U.S. leadership across many sectors. These transfers traditionally involve the movement of faculty and students from universities to industry, and they sometimes take place through other mechanisms, including open-source projects (see below). The most common form of technology transfer is the training of students as part of their educational experience. For undergraduates, much of this training takes place in classes but also occurs through research opportunities in faculty laboratories, while research activities are the sine qua non of a doctoral education. Many of the innovation jumps from academic research into industry laboratories and products depicted in Appendix B document the path of talented (mostly but not exclusively graduate) students. It is important not to underestimate the role of federal
funding of academic research programs in creating pivotal training opportunities that later translate into commercial impact. Government sponsorship of research in universities supports the education of many graduate students in computing, especially in doctoral programs—some of whom go on to be leaders in industry. There are many well-known examples, including Sergey Brin and Larry Page, co-founders of Google, as well as others noted in Appendix B. University-based research also provides formative experiences in computing for students who go on to careers involving computing but outside the computing industry itself. Together, these education and training paths contribute significantly to the IT talent relied on by industry, universities, and other parts of the economy.
Perhaps the best-publicized transfer mechanism is start-ups based on fundamental research undertaken in universities. These are fostered by several aspects of the U.S. ecosystem. First and foremost, major universities support technology transfer offices; technology can be transferred to a private firm through either exclusive or non-exclusive licensing of patents or copyrighted material. Additionally, many universities maintain policies that allow faculty to take leave or work part time off campus to engage in commercial endeavors. For start-up activities, the availability of venture capital and a thriving ecosystem for supporting small businesses are key. More recently, for many start-ups, the Internet has lowered the barrier to entry for interacting with customers and partners, and cloud infrastructure frees new firms from having to make up-front capital investments and makes it easier for them to scale up the IT aspects of successful ideas rapidly. As in many other sectors, taking IT innovation to scale requires significant investment. But in contrast to many other sectors, where the incubation cycle requires years of incremental development, testing, regulatory compliance, and rollout to achieve scale, IT frequently has a different dynamic and timeline.
Another model for translation and transfer is for faculty to take temporary industry positions, sometimes in industry-run laboratories established in close proximity to universities. The transfer of research ideas is bidirectional. Faculty bring talent and ideas to companies that contribute to the development of new products and services. They bring back to universities new insights about important technical problems that inspire new research. Industry research can take the lead when long-term research reaches a tipping point where real-world impact is in sight, and when shared industry needs (e.g., video codecs for Skype, Netflix, and others) drive the development of new algorithms and approaches (e.g., multimedia compression algorithms).
The open-source model provides an increasingly important framework for loosely coordinated work among multiple universities and companies toward a