Appendix A: Overview of Methodological Approaches, Data Sources, and Survey Tools
Pages 225-241



From page 225...
... 2 National Research Council, An Assessment of the Small Business Innovation Research Program: Project Methodology, Washington, DC: The National Academies Press, 2004. 3 National Defense Authorization Act of 2012 (NDAA)
From page 226...
... 5 The SBIR/STTR programs, on the basis of highly competitive solicitations, provide modest initial funding for selected Phase I projects (up to $150,000) for feasibility testing, and further Phase II funding (up to $1 million)
From page 227...
... Publication of the 2004 Methodology The committee that undertook the first-round study and the agencies under study formally acknowledged the difficulties involved in assessing SBIR programs. Accordingly, that study began with development of the formal volume on methodology, which was published in 2004 after completing the standard National Academies peer-review process.
From page 228...
... The committee notes that while sales is a legitimate indicator of progress toward commercialization, it is not a reliable measure of whether commercial success has occurred. SOURCE: National Research Council, An Assessment of the Small Business Innovation Research Program: Project Methodology, Washington, DC: The National Academies Press, 2004, Table 1, p.
From page 229...
... Taken together with our committee deliberations and the expertise brought to bear by individual committee members, these tools provide the primary inputs into the analysis. We would stress that, for the first-round study and for our current study, multiple research methodologies feed into every finding and recommendation.
From page 230...
... Challenges in Tracking Commercialization Despite substantial efforts at DoE, described below, significant challenges remain in tracking commercialization outcomes for the DoE SBIR/STTR programs. These include the following: • Data limitations.
From page 231...
... BEYOND COMMERCIALIZATION METRICS Although Congressional interest has focused primarily on commercialization in recent years, it remains the case that there are four congressionally mandated objectives for the SBIR program, and that commercialization is only one of them. STTR adds additional objectives beyond commercialization.
From page 232...
... 10 The survey conducted as part of the current, second-round assessment of the SBIR program is referred to below as the "2014 Survey" or simply the "survey." In general, throughout the report, any survey references are understood to be to the 2014 Survey unless specifically noted otherwise. 11 Delays at NIH and DoE in contracting with the National Academies, combined with the need to complete work contracted with DoD, NSF, and NASA, led the committee to proceed with the survey at three agencies only.
From page 233...
... demonstrates that the probability of obtaining research project information by survey decreases for less recently funded projects and increases with the award amount. Nearly 75 percent of Phase II responses to the 2011 Survey (the population for which was awards made FY 1998-2007) were for awards received after 2003, largely because winners from more distant years are more difficult to reach: small businesses regularly cease operations, are acquired, merge, or lose staff with knowledge of SBIR awards.
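The relationship described above, with response probability falling as awards age and rising with award amount, can be sketched as a simple logistic model. The coefficients and function below are purely hypothetical illustrations, not estimates from the survey data.

```python
import math

def response_prob(years_since_award, award_amount_k,
                  b0=1.0, b_age=-0.15, b_amount=0.002):
    """Hypothetical logistic model of the chance a PI answers the survey.

    b_age < 0 reflects the finding that older awards are harder to reach;
    b_amount > 0 reflects the finding that larger awards respond more often.
    All coefficients are illustrative, not fitted values.
    """
    z = b0 + b_age * years_since_award + b_amount * award_amount_k
    return 1.0 / (1.0 + math.exp(-z))

# A 5-year-old award should be more reachable than a 15-year-old one,
# and a larger award more reachable than a smaller one of the same age.
p_recent = response_prob(5, 500)
p_old = response_prob(15, 500)
p_large = response_prob(10, 750)
p_small = response_prob(10, 150)
```

Fitting such a model to actual response data would let an evaluator quantify, rather than merely note, the nonresponse bias toward recent and larger awards.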
From page 234...
... Finally, the committee suggests that, where feasible, future assessments of the SBIR/STTR programs include comparisons with non-awardees, such as in matched samples (Azoulay et al., 2014) or regression discontinuity analysis (Howell, 2015)
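Regression discontinuity analysis of the kind cited above compares outcomes for applicants just above and just below an award cutoff. The sketch below is a minimal local-means version on entirely synthetic data; real implementations fit local regressions on each side of the threshold and choose the bandwidth systematically.

```python
def rd_local_mean_estimate(scores, outcomes, cutoff, bandwidth):
    """Crude sharp-RD estimate: mean outcome for applicants scoring just
    at or above the cutoff minus the mean for those just below it,
    within a window of +/- bandwidth around the cutoff."""
    above = [y for s, y in zip(scores, outcomes)
             if cutoff <= s < cutoff + bandwidth]
    below = [y for s, y in zip(scores, outcomes)
             if cutoff - bandwidth <= s < cutoff]
    return sum(above) / len(above) - sum(below) / len(below)

# Synthetic illustration: the outcome jumps by 2 at a hypothetical
# award cutoff score of 5.
scores = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
outcomes = [1, 1, 1, 1, 1, 3, 3, 3, 3, 3]
effect = rd_local_mean_estimate(scores, outcomes, cutoff=5, bandwidth=3)
```

The design's appeal for SBIR evaluation is that applicants near the cutoff are plausibly comparable, so the jump in outcomes at the threshold can be read as the award's effect.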
From page 235...
... Strict adherence to a revealed preference paradigm could lead to misguided policy conclusions because the paradigm assumes that all policy choices are known and understood at the time that an individual or firm reveals its preferences and that all relevant markets for such preferences are operational.
From page 236...
... It was also consistent with a previous GAO study, published in 1992, which surveyed awards made through 1987. The aim of setting the overall time frame at 10 years was to reduce the impact of difficulties generating information about older awards, because some companies and PIs may no longer be in place and because memories fade over time.
From page 237...
... Survey instrument comparison (excerpt):
• Max number of questionnaires: <20 / 2
• Distribution: Mail (no); Email (yes); Telephone follow-up (yes)
• Questionnaire: Company demographics, identical; Commercialization outcomes, identical; IP outcomes, identical; Women and minority participation, additional detail on minorities and additional detail on PIs; new section on agency staff; new section on company recommendations for SBIR; new section capturing open-ended responses
Determining the Survey Population: Following the precedent set by both the original GAO study and the first-round study of the SBIR program, we differentiated between the total population of awards, the preliminary survey target population of awards, and the effective population of awards for this study. Two survey response rates were calculated.
From page 238...
... This process of excluding awards, either because they did not fit the protocol agreed upon by the committee or because the agencies did not provide sufficient or current contact information, reduced the total award list provided by DoE from an initial list of 1,325 to a preliminary survey population of 1,077 Phase II SBIR and STTR awards. Secondary Filters to Identify Recipients with Active Contact Information: Identifying the Effective Population. This preliminary population still included many awards for which the PI contact information appeared complete, but for which the PIs were no longer associated with the contact information provided and hence were effectively unreachable.
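The secondary filtering step described above, dropping awards whose PIs have no working contact channel, can be sketched as a simple filter. The record structure and the `email_valid`/`phone_valid` flags are hypothetical stand-ins for the verification checks the study actually performed.

```python
def effective_population(preliminary_awards):
    """Keep only awards with at least one usable contact channel.

    'email_valid' and 'phone_valid' are hypothetical flags marking
    whether the PI's email or phone contact was confirmed to work.
    """
    return [a for a in preliminary_awards
            if a.get("email_valid") or a.get("phone_valid")]

awards = [
    {"id": 1, "email_valid": True,  "phone_valid": False},
    {"id": 2, "email_valid": False, "phone_valid": False},  # unreachable
    {"id": 3, "email_valid": False, "phone_valid": True},
]
reachable = effective_population(awards)
```

Separating the preliminary from the effective population in this way is what allows two distinct response rates to be reported.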
From page 239...
... . In addition, two voice mails were delivered to non-responding PIs of awards in the effective population, between the second and third and between the third and fourth rounds of email.
From page 240...
... Survey population and response rates for DoE:
• Total award list: 1,325
• Preliminary Population of Awards: 1,077
• Awards for which the PIs Were Not Contactable: No Email, 320; No Phone Contact, 263
• Effective Population of Awards: 494
• Number of Awards for which Responses Were Received: 269
• Response Rate, Percentage of Effective Population of Awards: 54.5
• Response Rate, Percentage of Preliminary Population of Awards: 25.0
SOURCE: 2014 Survey. 13 Many surveys of entrepreneurial firms have low response rates.
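The two response rates follow directly from the award counts reported above; a quick arithmetic check using the DoE figures:

```python
responses = 269
effective_pop = 494        # awards whose PIs had working contact information
preliminary_pop = 1_077    # awards passing the committee's protocol filters

# Rate against the effective population: 100 * 269 / 494 -> 54.5
rate_effective = round(100 * responses / effective_pop, 1)
# Rate against the preliminary population: 100 * 269 / 1,077 -> 25.0
rate_preliminary = round(100 * responses / preliminary_pop, 1)
```

Reporting both rates makes explicit how much of the apparent nonresponse is attributable to unreachable PIs rather than to contacted PIs declining to answer.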
From page 241...
... In response, we sought to develop a comparison group from among Phase I awardees that had not received a Phase II award from the three agencies surveyed in the 2011 Survey during the award period covered by that survey (FY 1999-2008)

