Suggested Citation:"3. DRAFT PROGRAM MANAGER SURVEY." National Research Council. 2004. An Assessment of the Small Business Innovation Research Program: Project Methodology. Washington, DC: The National Academies Press. doi: 10.17226/11097.

Draft Program Manager Survey

I Basic Schedule
II Outreach
III Topic Development
IV Phase I Selection
V Phase I Tracking
VI Phase I Program Characteristics
VII Phase II Application Support and Preparation
VIII Phase II Selection
IX Phase II Tracking
X Phase II Program Characteristics
XI Phase III Application Support and Preparation
XII Phase III Selection
XIII Phase III Tracking
XIV Phase III Program Characteristics
XV Outcomes Analysis
XVI Electronic Services

I Basic Schedule

Please identify completion dates for the following activities in the 2002-2003 cycle.

a Phase I
   Topics set
   Solicitation published
   Application deadline
   First-step review completed (for basic program compliance)
   In-agency review completed
   Outside reviews completed
   Initial selection completed
   Final selection completed
   Grant begins
b Phase II
   Phase I awardees invited to apply
   Pre-application workshops completed
   Application deadline
   In-house review completed
   Outside reviews completed
   Initial selection completed
   Final selection completed
   Grant begins

II Outreach

a How many outreach conferences does your staff attend each year? [measured in staff attendances]
b How important are the following elements of your outreach program? [sum to 100%]
   i SBIR conferences
   ii State conferences
   iii Academic conferences
c How much do you rely on your web site to provide basic information to applicants? [0-100%]

d Do you partner with the following to provide outreach services?
   i Business organizations
   ii State and other non-Federal government agencies
   iii Academic units
   iv Private firms
e What share of your work year is consumed by outreach activities? [0-100%]
f Can you identify the most successful outreach activities?
   i Those drawing the largest number of applicants
   ii Those that are most cost-effective

III Topic Development

a Who initially develops the topics for solicitation?
b Who edits or adjusts them?
c Who makes the final topic selection?
d What criteria are used to guide the development of topics? [please weight the influence of the following, summing to 100%]
   i Technical needs of the agency
   ii Cutting edge of the field
   iii Likely commercial technologies
   iv Other (describe)
e On average, what percentage of topics change substantially year on year?
f Is "topic management" (e.g., topic narrowing) used to help manage the number of proposals received?

IV Phase I Selection

a How many Phase I applications were received in
   i 2003
   ii 2002
   iii 2001
b Is there an initial determination that the proposal falls within the scope of the solicitation?
   i Yes/no
   ii If yes, who makes that determination?
c Is an initial technical assessment made in house?
   i Yes/no
   ii If yes, who makes that determination?
d Outside reviewers
   i Maximum number used for a proposal
   ii Minimum number used
   iii Sources of reviewers [please assign percentages, summing to 100%]
      i) Agency staff
      ii) Academics
      iii) Industry scientists
      iv) Other industry personnel
      v) Other
e Commercial review
   i Is a commercial review conducted for Phase I projects? (Y/N)
   ii If yes, who makes that determination?
      i) Agency staff
      ii) Academics
      iii) State or other government economic development officers
      iv) Other industry personnel
      v) Consultants
      vi) Other
f Do Phase I awards in practice range in amount, or are they almost always awarded at or near the maximum value (currently $100,000)?

g In scoring proposals, please assign relative weights to the following areas and subareas. Total should sum to 100%.
   i Technical merit
      i) Significant advance in field
      ii) Appropriate technical approach
      iii) Strength of scientific approach
      iv) PI qualifications
      v) Adequate facilities
      vi) Sufficient and qualified staff
   ii Commercial potential
      i) Market understanding
      ii) In-company commercial capacity
   iii Agency benefit
      i) Addresses identified agency technical/scientific need
      ii) Endorsed by relevant COTAR
      iii) Other program agency staff (e.g., procurement officers)
h Are all administratively acceptable proposals sent for outside review?
   i If not, who makes that decision?
   ii Which of the following criteria are used to make that determination?
      i) Obvious technical weakness
      ii) Not R&D
      iii) Other DOE criteria
      iv) Other DOE criteria
      v) Other DOE criteria
i Who initially scores and ranks proposals?
   i SBIR office staff
   ii Agency program staff
j Who makes the final selection of winners?
   i SBIR office staff
   ii Agency program staff
k What percentage of final scores deviate substantially from the average of outside reviewer scores (i.e., how much flexibility does the program officer have)?
   i 0-20%
   ii 21-40%
   iii 41% or more
l How is the funding for each topic or program area decided?
   i Strictly on the basis of funds to SBIR provided by that program
   ii By the SBIR office
   iii Other
m Who decides how to allocate that funding across winning proposals?
   i Allocations are for practical purposes fixed (very few deviate from the standard award)
   ii SBIR staff
   iii Agency program staff
n Are the following criteria known to selecting staff or reviewers? Do they play a role in selection?
   i Geographical location of proposed work
   ii Minority status of PI or proposing company

   iii Prior awards
   iv Outcomes from prior awards

V Phase I Tracking

a Is any contact maintained by SBIR staff with Phase I awardees during the course of Phase I?
b The final report for Phase I is sent to the following:
   i SBIR office staff
   ii The relevant agency technical contact
   iii Contracts office
   iv Other
c Is the final report assessed to evaluate Phase I outcomes? (yes/no)
d If yes, who makes that evaluation?
   i SBIR office staff
   ii The relevant agency technical contact
   iii Contracts office
   iv Other
e Are Phase I recipients ever surveyed for program satisfaction?
f If so, are results used for program modification? (please explain/give examples)

VI Phase I Program Characteristics

a Multiple awards
   i On average, what percentage of awards go to companies with no prior SBIR wins in your agency?
   ii On average, of the companies winning Phase I awards, how many have never won an award from your agency before?
   iii On average, how many awards does your biggest Phase I award winner receive?
b Minority/women-led companies. If known, what percentage of awards go to minority/women-led companies?
c Does your program have a fixed start and a fixed end date? If so, what are they for 2003?

VII Phase II Application Support and Preparation

a Do you directly solicit or encourage Phase I recipients to apply for Phase II awards?
b If so, do you solicit all Phase I awardees?
c How long before the Phase II deadline do you solicit interest? Do you provide any assistance with the development of a Phase II proposal?
   i Assistance with the business case
   ii Assistance with matching funds
   iii Assistance with technology partnering or other technology support
d Do you now plan to encourage non-Phase I companies to apply directly for Phase II?
e If that were permitted, would you support such a change of policy?
f What percentage of Phase I recipients apply for Phase II?
g Are Phase I recipients permitted to apply to subsequent Phase II competitions (a year or two behind their "cohort")?
   i If so, are there any limitations to the delay?

VIII Phase II Selection

a Is an initial technical assessment made in house?
   i Yes/no
   ii If yes, who makes that determination?
b Outside reviewers
   i Maximum number used for a proposal
   ii Minimum number used
   iii Sources of reviewers [please assign percentages, summing to 100%]
      i) Agency staff

      ii) Academics
      iii) Industry scientists
      iv) Other industry personnel
      v) Other
c Commercial review
   i Is a commercial review conducted for Phase II projects? (Y/N)
   ii If yes, who makes that determination?
      i) Agency staff
      ii) Academics
      iii) State or other government economic development officers
      iv) Other industry personnel
      v) Consultants
      vi) Other
d Do Phase II awards in practice range in amount, or are they almost always awarded at or near the maximum value (currently $750,000)?
   i If awards vary, please provide
      i) The average size of the awards for the most recent year
      ii) The number of awards not receiving the maximum amount
      iii) The number of awards greater than the standard maximum (i.e., more than $750,000)
e In scoring proposals, please assign relative weights to the following areas and subareas. Total should sum to 100%.
   i Technical merit
      i) Significant advance in field
      ii) Appropriate technical approach
      iii) Strength of scientific approach
      iv) PI qualifications
      v) Adequate facilities
      vi) Sufficient and qualified staff
   ii Commercial potential
      i) Market understanding
      ii) In-company commercial capacity
   iii Agency benefit
      i) Addresses identified agency technical/scientific need
      ii) Endorsed by relevant COTAR
      iii) Other program agency staff (e.g., procurement officers)
f Are all administratively acceptable proposals sent for outside review?
   i If not, who makes that decision?
   ii Which of the following criteria are used to make that determination?
      i) Obvious technical weakness
      ii) Not R&D
      iii) Other DOE criteria
      iv) Other DOE criteria
      v) Other DOE criteria
g Who initially scores and ranks proposals?
   i SBIR office staff
   ii Agency program staff
h Who makes the final selection of winners?

   i SBIR office staff
   ii Agency program staff
i What percentage of final scores deviate substantially from the average of outside reviewer scores (i.e., how much flexibility does the program officer have)?
   i 0-20%
   ii 21-40%
   iii 41% or more
j How is the funding for each topic or program area decided?
   i Strictly on the basis of funds to SBIR provided by that program
   ii By the SBIR office
   iii Other
k Who decides how to allocate that funding across winning proposals?
   i Allocations are for practical purposes fixed (very few deviate from the standard award)
   ii SBIR staff
   iii Agency program staff
l Are the following criteria known to selecting staff or reviewers? Do they play a role in selection?
   i Geographical location of proposed work
   ii Minority status of PI or proposing company
   iii Prior awards
   iv Outcomes from prior awards

IX Phase II Tracking

a Is any contact maintained by SBIR staff with Phase II awardees during the course of Phase II?
b The final report for Phase II is sent to the following:
   i SBIR office staff
   ii The relevant agency technical contact
   iii Contracts office
   iv Other
c Is the final report assessed to evaluate Phase II outcomes? (yes/no)
d If yes, who makes that evaluation?
   i SBIR office staff
   ii The relevant agency technical contact
   iii Contracts office
   iv Other
e Are Phase II recipients ever surveyed for program satisfaction?
f If so, are results used for program modification? (please explain/give examples)

X Phase II Program Characteristics

a Multiple awards
   i On average, what percentage of awards go to companies with no prior SBIR wins in your agency?
   ii On average, of the companies winning Phase II awards, how many have never won an award from your agency before (other than the related Phase I)?
   iii On average, how many Phase II awards does your biggest Phase II award winner receive in each year?
b Minority/women-led companies. If known, what percentage of awards go to minority/women-led companies?
c Does your program have a fixed start and a fixed end date? If so, what are they for 2003?

XI Phase III Application Support and Preparation

a Do you directly solicit or encourage Phase II recipients to apply for Phase III awards?
b If so, do you solicit all Phase II awardees?
c Do you provide any assistance with the development of a Phase III proposal?
   i Assistance with the business case
   ii Assistance with identifying and acquiring funding
   iii Assistance with technology partnering or other technology support
   iv Assistance with general marketing
   v Assistance with marketing within your agency
d Can Phase I companies skip directly to Phase III?
e What percentage of Phase II recipients apply for Phase III?

XII Phase III Selection

a Does your agency have a formal Phase III program, providing further funding or support for companies that have completed Phase II but are not quite ready for full commercialization?
b If so, does your agency provide funding?
   i If so, what is the average size of the Phase III award?
c Does the award require matching funds?
   i What is the required match?
   ii Is advantage given to companies which provide a higher match?
   iii Are there requirements or advantages attached to specific sources of the match (e.g., government agency funding, private venture money, etc.)?
d Is a further technical assessment made in house?
   i Yes/no
   ii If yes, who makes that determination?
e Are outside reviewers used for Phase III proposals?
If so,
   i Maximum number used for a proposal
   ii Minimum number used
   iii Sources of reviewers [please assign percentages, summing to 100%]
      i) Agency staff
      ii) Academics
      iii) Industry scientists
      iv) Other industry personnel
      v) Other
f Commercial review
   i Is a detailed review of commercial opportunities conducted for Phase III projects? (Y/N)
   ii If yes, who conducts that review?
      i) Agency staff
      ii) Academics
      iii) State or other government economic development officers
      iv) Other industry personnel
      v) Consultants
      vi) Other
g Do Phase III awards in practice range in amount, or are they almost always the same amount (and what is that amount)?
   i If awards vary, please provide
      i) The average size of the awards for the most recent year
      ii) The number of awards not receiving the maximum amount
h In scoring proposals, please assign relative weights to the following areas and subareas. Total should sum to 100%.
   i Technical merit
      i) Significant advance in field
      ii) Appropriate technical approach
      iii) Strength of scientific approach

      iv) PI qualifications
      v) Adequate facilities
      vi) Sufficient and qualified staff
   ii Commercial potential
      i) Market understanding
      ii) In-company commercial capacity
      iii) Advanced marketing and distribution plans
      iv) Existing marketing and distribution arrangements
      v) Further product development plans
   iii Agency benefit
      i) Addresses identified agency technical/scientific need
      ii) Endorsed by relevant COTAR
      iii) Other program agency staff (e.g., procurement officers)
i Is there a formal competition, or are proposals treated case by case?
j Are Phase III proposals subject to outside review? If so, to whom are they sent?
k Who initially scores and ranks proposals?
   i SBIR office staff
   ii Agency program staff
l Who makes the final selection of winners?
   i SBIR office staff
   ii Agency program staff
m How is the funding for each topic or program area decided?
   i Strictly on the basis of funds to SBIR provided by that program
   ii By the SBIR office
   iii Other
n Who decides how to allocate that funding across winning proposals?
   i Allocations are for practical purposes fixed (very few deviate from the standard award)
   ii SBIR staff
   iii Agency program staff
o Are the following criteria known to selecting staff or reviewers? Do they play a role in selection?
   i Geographical location of proposed work
   ii Minority status of PI or proposing company
   iii Prior awards
   iv Outcomes from prior awards

XIII Phase III Tracking

a Is any contact maintained by SBIR staff with Phase III awardees during the course of Phase III?
b Is there a final report for Phase III? If so, is it sent to the following:
   i SBIR office staff
   ii The relevant agency technical contact
   iii Contracts office
   iv Other
c Is the final report assessed to evaluate Phase III outcomes? (yes/no)
d If yes, who makes that evaluation?
   i SBIR office staff
   ii The relevant agency technical contact
   iii Contracts office
   iv Other
e Are Phase III recipients ever surveyed for program satisfaction?
f If so, are results used for program modification? (please explain/give examples)

XIV Phase III Program Characteristics

XV Outcomes Analysis

a Which of the following indicators of success do you regularly capture from your grantees or other sources? [for each indicator, note whether it is captured for Phase I, Phase II, and/or Phase III]

   Commercialization
      Actual sales of related products
      Expected sales of SBIR-related products or services
      Further development funding
      Investment in the company
      Business plan
      Marketing staff
      Distribution arrangements
      Licensing agreement
      Trademarks filed/granted
      Copyrights filed/granted
   Agency mission
      Knowledge adoption by agency
      Knowledge adoption by prime contractor
      Other agency indicators
   Field development
      Patents filed/granted
      Scientific publications
      Scientific conference presentations
      Other field development activities

b On what criteria is the success of the program office judged?
   i Efficient program management (grants made on time)
   ii Commercial outcomes
   iii Agency outcomes
   iv Customer (grantee) satisfaction

XVI Electronic Services

a Which of the following elements are available online at your agency?
   i Phase I application
   ii Phase I reporting
   iii Phase II application
   iv Phase II reporting
   v Survey capability
b What other services are available electronically?
c What other services would you like to make available electronically?

In response to a Congressional mandate, the National Research Council conducted a review of the SBIR program at the five federal agencies with SBIR programs with budgets in excess of $100 million (DOD, NIH, NASA, DOE, and NSF). The project was designed to answer questions of program operation and effectiveness, including the quality of the research projects being conducted under the SBIR program, the commercialization of the research, and the program's contribution to accomplishing agency missions. This report describes the proposed methodology for the project, identifying how the following tasks will be carried out: 1) collecting and analyzing agency databases and studies; 2) surveying firms and agencies; 3) conducting case studies organized around a common template; and 4) reviewing and analyzing survey and case study results and program accomplishments. Given the heterogeneity of goals and procedures across the five agencies involved, a broad spectrum of evaluative approaches is recommended.
