Appendix A: Overview of Methodological Approaches, Data Sources, and Survey Tools
Pages 105-120



From page 105...
... 2. National Research Council, An Assessment of the Small Business Innovation Research Program: Project Methodology, Washington, DC: The National Academies Press, 2004.
3. The methodology developed as part of the first-round assessment of the SBIR program also identifies two areas that are excluded from the purview of the study: "The objective of the study is not to consider if SBIR should exist or not -- Congress has already decided affirmatively on this question.
From page 106...
... increase private-sector commercialization of innovations derived from federal R&D."5 The STTR program, on the basis of highly competitive solicitations, provides modest initial funding for selected Phase I projects (in most cases up to $150,000) for feasibility testing, and further Phase II funding (in most cases up to $1.5 million)
From page 107...
... Tools Utilized in the Current STTR Study

Quantitative and qualitative tools used in the current study of the STTR program include the following Academies activities:
• Surveys. An extensive survey of STTR award recipients as part of the analysis.
From page 108...
... continuing to support high risk research ... R&D?

Measures
• Peer-review scores, publication counts, citation analysis
• Sales, follow-up funding, other commercial activities
• Patent counts and other intellectual property; employment growth; number of new technology firms
• Innovative products resulting from SBIR/STTR work

Tools
• Case studies, agency program studies, study of repeat winners, bibliometric analysis
• Phase II surveys, program manager discussions, case studies, study of repeat winners
• Phase I and Phase II surveys, case studies, study of repeat winners
• Program manager surveys, case studies, study of repeat winners

Key Research Challenges
• Difficulty of measuring quality and identifying proper reference group
• Skew of returns; significant interagency and inter-industry differences
• Measures of actual success and failure at the project and firm levels; relationship of SBIR/STTR to federal and state programs in this context
• Major interagency differences in use of SBIR/STTR to meet agency missions

NOTE: Supplementary tools may be developed and used as needed.
From page 109...
... COMMERCIALIZATION METRICS AND DATA COLLECTION

Recent congressional interest in the SBIR-STTR programs has to a considerable extent focused on bringing innovative technologies to market. This enhanced attention to the economic return from public investments made in small business innovation is understandable.
From page 110...
... In this regard, the STTR program can benefit from access to the survey data. The survey work provides the quantitative data needed for an evidence-driven assessment and, at the same time, allows management to focus on specific questions of interest, in this case related to the operations of the STTR program itself.
From page 111...
... We also acknowledge that the survey data are likely affected by a deployment bias toward surviving firms. Very limited information is available about SBIR/STTR award recipients: company name, location, contact information for the PI and the company point of contact, agency name, and date of award (data on women and minority ownership are not considered reliable)
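To make this data limitation concrete, the record available for each award can be pictured as a small structure holding only the fields listed above. This is a minimal sketch, assuming illustrative field names and types rather than any agency's actual schema:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class AwardRecord:
    """Illustrative award record limited to the fields described as available."""
    company_name: str
    company_location: str
    pi_contact: str             # contact information for the principal investigator
    poc_contact: str            # contact information for the company point of contact
    agency: str                 # awarding agency name
    award_date: date
    woman_owned: Optional[bool] = None     # collected, but not considered reliable
    minority_owned: Optional[bool] = None  # collected, but not considered reliable
```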
From page 112...
... Finally, in its recent study of the SBIR program at DoD,9 the committee compared outcomes drawn from the Academies survey and the CCR database and found that, where the questions overlapped, outcomes were broadly similar even though the DoD database is constructed using a completely different methodology and is mandatory for all firms participating in the SBIR-STTR programs. Although equivalent cross-checks are not available for the other agencies, the comparison with CCR data provides a direct cross-check for one-half of all SBIR/STTR awards made and suggests that the Academies survey methodology generates results that can be extended with some confidence to the other study agencies.
From page 113...
... • Successful and more recently funded companies are more likely to respond. Research by Link and Scott demonstrates that the probability of obtaining research project information by survey decreases for less recently funded projects and increases with the size of the award.b Winners from more distant years are difficult to reach: small businesses regularly cease operations, are acquired, merge, or lose staff with knowledge of SBIR/STTR awards.
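The Link and Scott finding describes survey response as a probability that falls with the age of the award and rises with its size. A minimal sketch of how such a relationship might be expressed, assuming a logistic form with placeholder coefficients that are purely illustrative (the cited research estimates its own model; nothing here reproduces it):

```python
import numpy as np

def response_probability(years_since_award: float, award_amount: float,
                         b0: float = -0.5, b_age: float = -0.15, b_amount: float = 0.4) -> float:
    """Illustrative logistic model: response probability declines with award age
    and rises with (log) award size. Coefficients are placeholders, not estimates."""
    log_size = np.log1p(award_amount) - np.log1p(150_000)   # size relative to a typical Phase I award
    z = b0 + b_age * years_since_award + b_amount * log_size
    return 1.0 / (1.0 + np.exp(-z))

# A decade-old Phase I-sized award vs. a recent Phase II-sized award
print(response_probability(10, 150_000))    # lower expected response probability (about 0.12 here)
print(response_probability(1, 1_500_000))   # higher expected response probability (about 0.57 here)
```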
From page 114...
... Strict adherence to a revealed-preference paradigm could lead to misguided policy conclusions because the paradigm assumes that all policy choices are known and understood at the time an individual or company reveals its preferences, and that all relevant markets for such preferences are operational. See Gregory G
From page 115...
... e. Data from the National Research Council assessment of the SBIR program at NIH indicate that a subsequent survey taken 2 years later would reveal substantial increases in both the percentage of companies reaching the market and the amount of sales per project. See National Research Council, An Assessment of the SBIR Program at the National Institutes of Health, Washington, DC: The National Academies Press, 2009.
From page 116...
... TABLE A-2 Similarities and Differences: 2005 and 2014 Surveys (check-mark entries are not reproduced in this excerpt)

Respondent selection
• 2005 Survey: focus on Phase II winners; 2014 Survey: all qualifying awards
• PIs
• POCs
• Max number of questionnaires per respondent: <20 (2005); 2 (2014)

Distribution
• Mail (2014 Survey: No)
• Email
• Telephone follow-up

Questionnaire
• Company demographics: identical in both surveys
• Commercialization outcomes: identical in both surveys
• IP outcomes: identical in both surveys
• Women and minority participation
• Additional detail on minorities
• Additional detail on PIs
• New section on agency staff activities
• New section on company recommendations for SBIR/STTR
• New section on STTR
• New section capturing open-ended responses

Determining the Survey Population

Following the precedent set by both the original GAO study and the first round of Academies analysis, we differentiate between the total population of STTR recipients, the preliminary survey target population, and the effective population for this study, which is the population of respondents that were reachable.

Initial Filters for Potential Recipients

Determining the effective study population required the following steps:
• acquisition of data from the five study agencies covering record-level lists of award recipients during the relevant fiscal years;
From page 117...
... , and finally by random number; and
• elimination of records for which there were significant missing data.

This process of excluding awards, either because they did not fit the selection profile approved by the committee or because the agencies did not provide sufficient or current contact information, reduced the total STTR award list for the five agencies from 1,501 awards to a preliminary survey population of 1,400 awards.
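The reduction from 1,501 total awards to a preliminary population of 1,400 amounts to a sequence of record-level filters over the combined agency award lists. A minimal sketch in pandas, assuming hypothetical column names ('agency', 'fiscal_year', 'award_id', 'pi_email', 'poc_email', 'company_name') and illustrative year bounds; the actual extracts and filter criteria are the ones described in the text:

```python
import numpy as np
import pandas as pd

def build_preliminary_population(awards: pd.DataFrame) -> pd.DataFrame:
    """Apply award-level exclusion filters to a combined STTR award list (illustrative)."""
    df = awards.copy()

    # Keep awards from the five study agencies for the relevant fiscal years
    # (the year bounds here are illustrative placeholders).
    study_agencies = ["DoD", "NIH", "NASA", "DoE", "NSF"]
    df = df[df["agency"].isin(study_agencies) & df["fiscal_year"].between(1998, 2010)]

    # Drop records with significant missing data, e.g. no usable contact information.
    df = df.dropna(subset=["company_name", "pi_email", "poc_email"], how="any")

    # Sort by agency and fiscal year, and finally by a random number, then de-duplicate.
    rng = np.random.default_rng(0)
    df = (df.assign(_rand=rng.random(len(df)))
            .sort_values(["agency", "fiscal_year", "_rand"])
            .drop_duplicates("award_id")
            .drop(columns="_rand"))
    return df
```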
From page 118...
... TABLE A-3 2011-2014 STTR Survey Response Rates

Total awards: 1,501
Excluded from survey population: 101
Preliminary target population: 1,400
Not contactable: 807
  Bad emails: 266
  Bad phone: 518
  Opt outs: 23
Effective survey population: 593
Completed surveys: 292
Success rate (preliminary population): 20.9%
Success rate (effective population):
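The success rates follow directly from the counts in the table; a quick arithmetic check on the published figures:

```python
# Arithmetic check of the response rates in Table A-3.
total_awards = 1_501
excluded = 101
preliminary_population = total_awards - excluded                  # 1,400
not_contactable = 266 + 518 + 23                                  # bad emails + bad phone + opt outs = 807
effective_population = preliminary_population - not_contactable  # 593
completed = 292

print(f"Success rate (preliminary): {100 * completed / preliminary_population:.1f}%")  # 20.9%
print(f"Success rate (effective):   {100 * completed / effective_population:.1f}%")
```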
From page 119...
... These are technology-based companies at an early stage of development that have demonstrated the capacity to undertake challenging technical research and have provided evidence that they are potentially successful commercializers. Given that the operations of the SBIR-STTR programs are defined in legislation and limited by the Small Business Administration (SBA)
From page 120...
... TABLE A-4 STTR Responses by Year of Award (Percent Distribution)

Fiscal Year of Award    STTR (%)
1998
1999    0.7
2000    0.7
2001    4.8
2002    6.5
2003    5.5
2004    7.2
2005    14.4
2006    14.4
2007    19.2
2008    7.5
2009    7.2
2010    12.0
Total   100.0

BASE: ALL RESPONDENTS 292
SOURCE: 2011-2014 Survey.
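Because Table A-4 reports a percent distribution over a base of 292 respondents, approximate counts per award year can be recovered by multiplying through; the results are approximate because the published percentages are rounded:

```python
# Convert the Table A-4 percent distribution into approximate respondent counts.
pct_by_year = {
    1999: 0.7, 2000: 0.7, 2001: 4.8, 2002: 6.5, 2003: 5.5, 2004: 7.2,
    2005: 14.4, 2006: 14.4, 2007: 19.2, 2008: 7.5, 2009: 7.2, 2010: 12.0,
}
base = 292  # all respondents

counts = {year: round(base * pct / 100) for year, pct in pct_by_year.items()}
print(counts)                                   # e.g. 2007 -> about 56 respondents
print(round(sum(pct_by_year.values()), 1))      # 100.1, reflecting rounding in the published table
```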

