
Evaluation of the Minerva Research Initiative (2020)


4

Research Supported by the Minerva Program: Quantity and Quality

This chapter reviews the research output that has been supported by Minerva grants. To gain an understanding of this research, the committee relied on the information sources described in Chapter 2, including Department of Defense (DoD) documents and interviews with DoD staff, the grantee survey, the sponsored research administrators survey, discussions with national security experts and other stakeholders at public committee meetings, and the Minerva Conference. The committee also analyzed the outputs reported by the Minerva grantees.

The first section of this chapter describes the committee’s review and analysis of the grant outputs. The second section provides an overview of the perceptions of Minerva research among those who provided input to the committee. The chapter ends with a summary.

REVIEW OF OUTPUTS OF THE MINERVA RESEARCH INITIATIVE GRANTS

Assessment of research outputs often relies on both qualitative peer review and quantitative assessment using various types of bibliometric indicators (Hicks and Wouters, 2015; Moed, 2017). Strengths and weaknesses have been identified in both approaches (de Rijcke et al., 2016; Siler et al., 2015), but it can be argued that peer review is the stronger method for evaluating the outputs of individual researchers and small research units, while quantitative bibliometrics are more useful in assessing the outputs of large and heterogeneous groups of researchers (Sugimoto and Lariviere, 2018). Because the committee’s evaluation of the Minerva Research Initiative is targeted at the program level, it relies primarily on quantitative indicators to assess the program’s research outputs while also considering input from stakeholders for additional context. The committee considered a variety of additional approaches and analyses for carrying out this task but rejected them because of their limitations.

Accordingly, the committee’s review of the outputs of Minerva-sponsored research involved compiling lists of the outputs, coding and summarizing the outputs by type and subtype, and examining in greater depth the journals in which articles were published and the citations of those articles by other researchers as an indicator of contributions to the social science knowledge base. The committee also considered outreach and dissemination to nonacademic audiences by reviewing nonacademic publications reported by Minerva principal investigators (PIs). As discussed in Chapter 2, one of the challenges for the evaluation was that the Minerva program had been in existence for less than 10 years at the time this study was launched, and many of the grants were relatively recent and/or still active. The evaluation results could thus be affected because such grants likely have produced fewer outputs and fewer citations of their publications relative to grants initiated earlier.

Output Data and Coding

As part of the committee’s grantee survey, grantees were asked to provide lists of their Minerva-sponsored outputs, including peer-reviewed publications; any other publications (e.g., papers, manuscripts, reports, op-ed pieces); presentations (e.g., conference presentations, briefings, or testimony); and any other products, such as publicly available software, websites, databases, patents, licenses, or training materials, that resulted from their Minerva grant(s) (for details, see Chapter 2 and Appendix E).

As discussed in Chapter 2, to facilitate response to the survey, grantees could submit information about their outputs by either responding to the survey questions or uploading their curriculum vitae (CV), highlighting outputs that resulted from their Minerva grant. The survey questions asked about “peer-reviewed publications” and “other publications” (see Appendix E, Q17 and Q18). However, in the case of grantees who chose to upload a CV, this distinction often was unknown. Therefore, the peer-reviewed status of all journal publications was ascertained according to three library databases: Scopus, ProQuest, and Ulrich’s.
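To make the designation step concrete, the sketch below shows one way such a lookup could work. It is a minimal, hypothetical illustration: the journal names, the data structures, and the “listed as refereed by any of the three sources” rule are assumptions for exposition, and the report does not specify how disagreements among the three databases were reconciled.

```python
# Hypothetical sketch of the peer-review designation step; these sets and
# the "any source lists it" rule are illustrative assumptions, not the
# committee's actual data or reconciliation procedure.
REFEREED_BY_SOURCE = {
    "scopus": {"International Organization", "World Politics"},
    "proquest": {"International Organization"},
    "ulrichs": {"International Organization", "World Politics"},
}

def is_peer_reviewed(journal: str) -> bool:
    """Designate a journal as peer-reviewed if any source lists it as refereed."""
    return any(journal in listed for listed in REFEREED_BY_SOURCE.values())

print(is_peer_reviewed("International Organization"))  # True
print(is_peer_reviewed("An Unlisted Newsletter"))      # False
```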

The definitions of peer review used by these three sources are quite broad. Scopus, for example, defines peer review as including various strategies, such as main editor peer review, open peer review, single-blind peer review, and double-blind peer review (Scopus, 2019a,f). To designate journals as peer-reviewed, ProQuest uses Ulrich’s (Ulrich’s Web, n.d.), which uses the term “refereed” for such journals. Ulrich’s website states, “Refereed serials include articles that have been reviewed by experts and respected researchers in specific fields of study including the sciences, technology, the social sciences, and arts and humanities” (Ulrich’s Web, n.d.). It is possible, moreover, that some articles counted by the committee as published in a peer-reviewed journal may actually have been review or opinion articles that were not peer-reviewed. Additionally, rules for which types of articles are peer-reviewed vary across journals. For example, the website of Nature Research states that the journal subjects articles, letters, brief communications, technical reports, analyses, resources, reviews, perspectives, and insight articles to peer review, while the website of the journal International Organization states that it sends all submissions, except letters to the editor, to reviewers. In addition, some journals (e.g., the New England Journal of Medicine) include articles that are invited but may not be peer-reviewed. The committee did not carry out a more refined assessment of the peer-review policies of different journals (i.e., searching each journal’s website to identify which types of articles were peer-reviewed).

Finally, although books and conference proceedings may also be considered to be peer-reviewed, the committee limited the designation of peer-reviewed status for this evaluation to journals. This was primarily because standard library databases, such as Scopus (2019a) and Web of Science (Testa, 2017; Web of Science, 2019), cover limited types of books and conference proceedings; thus they could not be used to check the peer-reviewed status of all the books and conference proceedings reported by PIs. Appendix H describes additional coding procedures for publications and other types of outputs.

Number of Grantees Who Provided Information on Outputs

As reported in Chapter 2, 67 PIs provided information on outputs related to their Minerva grants. Four of these PIs stated that they were at a point too early in their grant to have produced outputs. Therefore, the following results pertain to information on outputs provided by 63 PIs. Six of these PIs reported only “other products” (i.e., publicly available software, websites, databases, patents, licenses, or training materials), but no publications or presentations. Therefore, the following summary of reported publications and presentations includes information from 57 PIs. This summary is followed by information on other products.

Summary of Minerva-Sponsored Publications and Presentations Reported by Principal Investigators

Table 4-1 provides a summary of the publications and presentations reported by PIs responding to the grantee survey. As discussed, the committee did not receive lists of outputs from all PIs, so the table represents a subset of all Minerva grants. (See Chapter 2 for further detail on this and other limitations of data on Minerva outputs obtained from the grantee survey.) Despite the limitations of the data, the committee’s focus on relatively more robust measures (such as medians and percentages rather than means and totals) should ensure that the analyses and conclusions concerning the outputs from Minerva grants are reasonably robust.

TABLE 4-1 Summary of Publications and Presentations Reported by Principal Investigators (PIs)

                                           All 57 PI Reports                            Nonzero PI Reports
Output Type                                Number of   Percent of    Range    Median    Total            Range    Median
                                           Outputs     All Outputs                      (Percent of 57)
Peer-reviewed publications                   152          13         0–18       2        39 (68%)        1–18      3
Other, non-peer-reviewed publications        333          28         0–124      1        36 (63%)        1–124     2.5
Books and book chapters                       62           5         0–12       0        21 (37%)        1–12      1
Conference proceedings                        28           2         0–18       0         5 (9%)         1–18      3
Publications in progress                      47           4         0–7        0        17 (30%)        1–7       2
Presentations                                582          48         0–80       5        44 (77%)        1–80      8
Total Publications and Presentations       1,204         100         1–133     10        57 (100%)       1–133    10


Columns 2 and 3 of Table 4-1 show the numbers of publications and presentations produced by Minerva grantees by type, and for each type, the proportion of the total reported by the 57 PIs who provided this information. Slightly more than half (52%) of all the outputs reported by these PIs consisted of written materials in the form of peer-reviewed journal publications, other publications, books or book chapters, and conference proceedings; the remaining 48 percent were presentations, including briefings and testimony. Columns 4 and 5 of Table 4-1 show the range and median number of each type of publication and presentation reported across the 57 PIs. The ranges are quite wide, and some output categories (other, non-peer-reviewed publications and conference proceedings) are dominated by one extraordinarily productive grantee.
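The committee’s preference for medians over means can be illustrated with a small numeric sketch. The counts below are invented for illustration (they are not the survey data), mimicking the 0–124 range for other publications in Table 4-1: a single extreme grantee pulls the mean far above the typical value, while the median is unaffected.

```python
import statistics

# Invented counts: six typical grantees plus one extreme outlier,
# echoing the 0-124 range reported in Table 4-1.
other_pubs = [0, 0, 1, 1, 2, 3, 124]

print(round(statistics.mean(other_pubs), 1))  # 18.7 -- pulled up by the outlier
print(statistics.median(other_pubs))          # 1 -- reflects the typical grantee
```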


Columns 6 through 8 of the table present the same information excluding the PIs who reported no outputs in a given category. While funding studies that ultimately lead to no outputs would be a concern, examining this additional information about the distribution of outputs is useful because, as noted, some of the grants included in the evaluation were in the very early stages after being awarded, and some were focused primarily on producing the types of outputs not included in this table. When zero observations are excluded (Column 6), slightly higher medians are seen in Column 8. For example, whereas the median number of peer-reviewed publications is 2 when all 57 PIs are counted, it is 3 when only PIs with at least one peer-reviewed publication are included (68% of the 57 PIs reported one or more peer-reviewed publications).

Whereas Table 4-1 provides summary data for all publications and presentations reported by all responding PIs, Table 4-2 provides summaries, by year of the PI’s grant (2009 to 2017), for all publications and presentations (Columns 2–4) and for the subset of peer-reviewed publications (Columns 5–7). The pattern of outputs is generally in the expected direction over time, from higher numbers produced in earlier years (except the initial startup years, 2009 and 2010) to lower numbers produced in more recent years, reflecting the more limited time available to recent grantees to publish and give presentations. This same general pattern can be seen in Appendix I, which provides a similar view of the range and median of each type of publication and presentation, broken out by year of the grant.
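The by-year summary in Table 4-2 is a straightforward grouped aggregation. The sketch below shows that aggregation on invented per-PI totals keyed by grant year; the values are illustrative stand-ins, not the actual survey responses.

```python
import pandas as pd

# Invented per-PI output totals keyed by grant year (not the survey data),
# aggregated the way Table 4-2 summarizes outputs by year of grant.
pi_outputs = pd.DataFrame({
    "grant_year": [2010, 2010, 2012, 2012, 2013],
    "n_outputs": [8, 30, 9, 133, 26],
})

summary = pi_outputs.groupby("grant_year")["n_outputs"].agg(
    ["sum", "min", "max", "median"]
)
print(summary)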

TABLE 4-2 Summary of All Publications and Presentations and Peer-Reviewed Journal Publications, by Year of Grant, Reported by Principal Investigators (PIs)

Year of Grant               All Publications and Presentations     Peer-Reviewed Publications
(No. of PIs Reporting)      Number     Range     Median            Number     Range     Median
2009 (3)                       236     1–124     111                   18     0–18      0
2010 (12)                      129     1–30        8                   30     0–5       2.5
2012 (5)                       237     9–133      24                   16     0–5       4
2013 (7)                       187     16–37      26                   36     0–15      5
2014 (8)                       189     3–45       20.5                 20     0–7       2
2015 (10)                      113     2–32        7.5                 24     0–6       2
2016 (7)                        87     2–52        6                    4     0–2       0
2017 (5)                        26     1–9         6                    4     0–4       0
Total, All Years (57)        1,204     1–133      10                  152     0–18      2

Journal-Level Impact Metrics for Peer-Reviewed Publications Reported by Principal Investigators

Journal-level impact metrics can be used to assess the quality and influence of journals, but some approaches to constructing such metrics, and some uses of them, have been criticized (Agarwal et al., 2016; Sugimoto and Lariviere, 2018). It has been argued, for example, that calculating citations over only a 2-year period, as some metrics do, provides inadequate time for fields such as the social sciences to accrue citations. Journal impact metrics also have been used inappropriately to assess individual researchers even though they were developed as journal-level indicators.

Although all journal-level impact metrics have limitations, two with relatively strong measurement properties are CiteScore and the Scimago Journal Ranking (SJR) (González-Pereira et al., 2010; James et al., 2018; Sugimoto and Lariviere, 2018). A journal’s CiteScore for 2017, for example, counts the citations made in 2017 to documents published by the journal in the previous 3 years (2014, 2015, or 2016), and divides the total number of citations by the number of documents published by the journal in those same years (Scopus, 2019d). The SJR divides the weighted number of citations received in a year by the number of documents published in the previous 3 years. The weights are intended to reflect the prestige of the journals in which the citations appear (Scopus, 2019e).
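Because CiteScore is defined arithmetically, the calculation can be illustrated directly. The sketch below applies the definition above to invented publication records; it illustrates the formula only, not Scopus’s implementation, and SJR differs in that it would additionally weight each citation by the prestige of the citing journal.

```python
# Minimal sketch of the CiteScore 2017 calculation described above,
# applied to invented publication records rather than real Scopus data.

def citescore(docs, score_year=2017, window=3):
    """Citations received in score_year by documents published in the
    preceding `window` years, divided by the count of those documents."""
    pub_years = range(score_year - window, score_year)  # 2014, 2015, 2016
    eligible = [d for d in docs if d["year"] in pub_years]
    cites = sum(d["cites"].get(score_year, 0) for d in eligible)
    return cites / len(eligible) if eligible else 0.0

journal_docs = [
    {"year": 2014, "cites": {2017: 10}},
    {"year": 2015, "cites": {2017: 4}},
    {"year": 2016, "cites": {2017: 1}},
    {"year": 2017, "cites": {2017: 5}},  # excluded: published in the score year
]
print(citescore(journal_docs))  # (10 + 4 + 1) / 3 = 5.0
```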

Consistent with recommendations in the literature to use multiple indicators (Hicks and Wouters, 2015; Moed, 2017), the committee gathered both CiteScore and SJR metrics for journals in which Minerva researchers published. These metrics were accessed through the National Academies’ research library via the Scopus database, which is considered to have wider coverage than Web of Science, including good coverage of the social sciences (Sugimoto and Lariviere, 2018). The committee used CiteScore and SJR as journal-level indicators only, not to assess individual researchers.

The detailed results of these metrics for the 112 journals in which the 57 Minerva PIs reported publishing are provided in two appendix tables. The table in Appendix J1 shows the CiteScore and SJR metrics for 82 peer-reviewed journals. The table in Appendix J2 lists the remaining 30 of the 112 journals, for which CiteScore and SJR journal-level metrics were not available. Of these 30 journals, 12 were found to be peer-reviewed according to the designation explained earlier, while 18 were found not to be peer-reviewed. Impact metrics may not be available for journals because they are not designated as peer-reviewed, are too new to have accumulated the data needed to calculate such metrics, or do not meet criteria for inclusion in the major library citation indexes that feed into the CiteScore or SJR databases.

Columns 1 and 2 of both appendix tables list the journal names and the number of Minerva-sponsored articles published in each. Across the 82 journals for which CiteScore and SJR metrics were available (presented in Appendix J1), 138 Minerva-supported articles were published, based on the data available to the committee from the grantee survey. Column 3 of Table J1 contains, for each journal, the subject field to which it is assigned by Scopus (Scopus, 2019g). Columns 4–6 (shaded) contain three metrics based on CiteScore: (1) the 2017 CiteScore; (2) a rank that allows for comparing journals within a subject field (e.g., 4/1029 means that a journal’s citation impact ranks 4th out of 1,029 journals in that journal’s subject field); and (3) a percentile that allows for comparing journals across subject fields1 (Scopus, 2019c). Columns 7–9 (unshaded) of Table J1 contain three similar SJR metrics. Instead of percentiles, SJR calculates quartiles, with Q1 being the highest. The same subject fields used to gather CiteScore metrics were used to gather SJR metrics. The table is sorted, first, by the CiteScore percentile, from highest (99th) to lowest, and then alphabetically by journal name.
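The relationship between a within-field rank such as “4/1029” and the cross-field percentile can be sketched as below. The rounding rule is an assumption for illustration; the text above does not give Scopus’s exact formula.

```python
# Sketch of converting a within-field rank to a cross-field percentile;
# the exact rounding convention is assumed, not taken from Scopus.
def rank_to_percentile(rank: int, journals_in_field: int) -> int:
    return int((journals_in_field - rank) / journals_in_field * 100)

print(rank_to_percentile(4, 1029))  # 99 -- the "4/1029" example above
```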

Table 4-3 summarizes the CiteScore and SJR rankings for the journals in which the 138 articles produced by Minerva-supported grants were published, displaying the number of articles for each combination of CiteScore decile and SJR quartile. The table shows that 46 percent (63 of 138) of the articles reported by PIs were published in journals that fell into the top CiteScore decile and the top SJR quartile. Another 29 percent (40 of 138) fell into the second or third CiteScore decile and the top SJR quartile. These findings reveal that articles resulting from Minerva-supported grants have been published in top-ranked journals.
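A cross-tabulation like Table 4-3 can be produced directly from article-level records. The sketch below uses a few invented records; the column names and values are illustrative, not the committee’s dataset.

```python
import pandas as pd

# Invented article records: each row is one article, labeled with the
# CiteScore decile and SJR quartile of its journal.
articles = pd.DataFrame({
    "citescore_decile": ["90-99", "90-99", "80-89", "70-79"],
    "sjr_quartile": ["Q1", "Q1", "Q2", "Q3"],
})

# Counts of articles for each decile-by-quartile combination, with totals.
print(pd.crosstab(articles["citescore_decile"],
                  articles["sjr_quartile"], margins=True))
```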

Table 4-4 shows the journal rankings for articles produced by Minerva-supported grantees when the articles are sorted into 13 subject fields.2 The field with the most articles—53—is political science and international relations.

___________________

1 According to Scopus (2019c), “CiteScore Percentile indicates the relative standing of a serial title in its subject field. The Percentile and Ranking are relative to a specific Subject Area.” In displaying percentiles, Scopus first shows the subject area in which the source performed the best, and in some cases it also displays other subject fields in which the journal performed second, third, or fourth best. For its evaluation, the committee used the percentile for the area in which the journal performed the best, except in three cases in which that area did not appear to reflect most closely the goals of the journal. For example, Public Choice was rated highest in the area of sociology and political science, but the journal emphasizes economics as its dominant subject area, so the second-highest percentile, in the field of economics and econometrics, was used.

2 In some cases, similar fields with only one journal were combined to form larger subject field groups. For example, the following eight subject fields that each have one or two journals were combined into one collapsed subject field called computer science, engineering, and mathematics: computer graphics and design, computer science application, general computer science, media technology, control and optimization, control and systems engineering, electrical and electronic engineering, and applied mathematics.


TABLE 4-3 Journal Rankings for Minerva-Supported Articles

                        Number of Articles, by Scimago Journal Ranking (SJR) Quartile
CiteScore Percentile      Q1     Q2     Q3     Q4     Total    Row Percentagea
90–99                     63      0      0      0       63        46
80–89                     28      9      0      0       37        27
70–79                     12      4      3      0       19        13
60–69                      2      2      0      0        4         3
50–59                      0      7      2      0        9         7
40–49                      1      1      1      0        3         2
30–39                      0      0      0      0        0         0
20–29                      0      0      0      1        1         1
10–19                      0      0      0      2        2         1
0–9                        0      0      0      0        0         0
Total                    106     23      6      3      138       100
Column Percentageb        77     17      4      2      100

aRow total ÷ 138.

bColumn total ÷ 138.

Of the 53 articles in that field, 55 percent were published in journals ranked in the top CiteScore decile, while 89 percent and 91 percent, respectively, were published in journals ranked in the top quartile according to CiteScore and SJR. Table 4-4 also reveals considerable variety in the subject fields of the journals in which articles produced by Minerva-supported grantees have been published—consistent with the intent of the Minerva Research Initiative to support interdisciplinary research.

Article-Level Impact Metrics for Peer-Reviewed Publications Reported by Principal Investigators

In addition to the quality of the journals in which Minerva researchers have published, the research can be assessed by examining the extent to which grantees’ publications are cited in the work of other researchers. As observed by Sugimoto and Lariviere (2018, p. 64), “For the past few decades scholarly impact has been defined as an effect upon the scientific community, as measured through citations.” Hicks and Melkers (2013) point to various theoretical interpretations of how citation counts measure scholarly impact (e.g., as a measure of importance, intellectual influence, or authoritativeness).


TABLE 4-4 Journal Rankings for Minerva-Supported Articles, by Subject Field

                                                                  Number and Percentage of Articles in Journals Ranked in
                                                      Number      CiteScore        CiteScore        SJR
Journal Subject Field (number of journals)            of Articles Top Decile       Top Quartile     Top Quartile
                                                                   N     %          N     %          N     %
Political Science & International Relations (n = 23)     53       29    55         47    89         48    91
Sociology & Political Science (n = 11)                   15       10    67         14    93         13    87
Computer Science, Engineering, Mathematics (n = 9)        9        5    56          7    78          8    89
General Social Sciences, General Arts & Humanities,
  Multidisciplinary, Development (n = 8)                 12        4    33          9    75          8    67
Demography (n = 5)                                        6        2    33          6   100          6   100
Law (n = 5)                                              12        6    50         10    83          9    75
Psychology, Health (n = 5)                                5        4    80          5   100          5   100
Anthropology and Cultural Studies (n = 4)                 4        0     0          1    25          0     0
Management Related, Public Administration (n = 4)         5        0     0          2    40          4    80
History (n = 2)                                           2        0     0          1    50          1    50
Economics (n = 2)                                         4        1    17          1    17          1    17
General Sciences (Agriculture, Biology,
  Earth, Planet) (n = 2)                                  2        1    50          2   100          2    50
Religious Studies (n = 2)                                 9        1    11          9   100          1    11
All Fields                                              138       63    46        114    83        106    77

NOTE: SJR = Scimago Journal Ranking.

They suggest that the field has converged in viewing citations as an indicator of impact through their “social and cognitive influence on subsequent research” (p. 7). According to these authors, analyses of citations should include normalization that accounts for subject field and year of publication, and the results should be summarized using percentile distributions.3


With such an approach, “analysts compare the actual citation counts to the rate to be expected” (Hicks and Melkers, 2013, p. 9) and are more likely to make correct assessments and fair comparisons (Bornmann and Marx, 2013).

Taking this approach, the committee used Scopus to obtain three article-level metrics: (1) counts of the number of citations for every article published by a Minerva grantee; (2) “benchmarked percentiles that show how citations received by each document compared with the average for similar documents, relative to a certain subject field” (Scopus, 2019b); and (3) the field-weighted citation impact (FWCI) metric, which is the “ratio of the total citations actually received to the total citations that would be expected based on the average of the subject field over a three-year window. A value greater than 1.00 means the document is more cited than expected according to the average” (Scopus, 2019b).
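The FWCI is a simple ratio, as the following sketch illustrates. The expected-citation figure here is an invented input; in practice Scopus derives it from field, publication-year, and document-type averages over the three-year window described above.

```python
# Minimal sketch of the field-weighted citation impact (FWCI) ratio defined
# above; the expected-citation value is an invented input for illustration.
def fwci(actual_citations: int, expected_citations: float) -> float:
    return actual_citations / expected_citations if expected_citations else 0.0

# An article cited 19 times where comparable documents average 10 citations:
print(fwci(19, 10.0))  # 1.9 -- cited more often than expected (FWCI > 1.0)
```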

Appendix K presents a table of the results for each of the 125 reported peer-reviewed Minerva publications for which article-level metrics could be found in Scopus,4 sorted by benchmarking percentile from high to low. Of these 125 publications reported by Minerva PIs, 20 had zero or one citation, in some cases probably because the article had been published very recently or was still forthcoming; these articles had FWCIs equal to zero. The median FWCI of the 125 articles was 1.9, implying that the median Minerva-sponsored publication has been cited nearly twice as often as would be expected given the global average of 1.0. Indeed, about two-thirds (85) of the 125 Minerva publications for which this information was available had an FWCI exceeding 1.0.

With regard to the percentiles, percentiles relative to subject fields were not assigned by Scopus for 22 of the 105 articles with nonzero FWCIs.5 The remaining 83 articles are presented first in the table in Appendix K, sorted by percentile in descending order. Of these, 29 percent fell in the highest decile (the 90th–99th percentile), 66 percent fell in the highest quartile (75th–99th percentile), and 86 percent were above the median, relative to their subject fields. These percentages are shown in Table 4-5, which provides a cross-tabulation of the Minerva articles by FWCI and benchmarking percentile.

___________________

3 Personal communication with Diana Hicks, March 25, 2019.

4 Journal-level metrics (CiteScore and SJR) were gathered from Scopus for journals in which 138 Minerva-supported articles had been published. However, the article-level metrics reported in this section could be found in Scopus for only 125 articles. The metrics for the remaining 13 articles could be missing for varied reasons (e.g., a published article title may have been somewhat different from that reported on the grantee survey).

5 Scopus explains that “citation benchmarking takes into account: (a) the date of publication, (b) the document type, and (c) disciplines associated with its source. Citation benchmarking compares articles within an 18 month window and is computed separately for each of its sources’ disciplines. The Citation Benchmarking only appears when compared to all three criteria. A minimum set of 500 similar articles are required” (Scopus, 2019b).


Table 4-5 also shows how the 22 articles with nonzero FWCIs but no assigned percentiles were distributed across FWCI categories.

Table 4-6 compares article-level benchmarking percentiles with journal-level CiteScore percentiles for the 83 articles for which both metrics were available. Twenty-two percent (18) of the 83 articles ranked in the top decile on both metrics, suggesting they were published in high-quality journals and had an impact on other researchers, as reflected by citations. Sixty-three percent (52) of the articles were in the top quartiles of both rankings. The table also shows some tendency for articles to be ranked higher on the journal-level metric than on the article-level metric.

TABLE 4-5 Summary of Article-Level Metrics for Minerva-Supported Articles

                          Number of Articles, by Field-Weighted Citation Impact (FWCI)
Benchmarking Percentile    ≥ 5    ≥ 2, < 5    ≥ 1, < 2    > 0, < 1    = 0    Total    Row Percentagea
90–99                       15        9           0           0         0      24        29
75–89                        1       18          11           1         0      31        37
50–74                        0        3           6           7         0      16        19
30–49                        0        0           1          11         0      12        14
Not Assigned                 9        6           6           1        20      42
Total                       25       36          24          20        20     125       100
Column Percentageb          20       29          19          16        16     100

aRow total ÷ 83; not calculated for the Not Assigned row.

bColumn total ÷ 125.

TABLE 4-6 Comparison of Article Impacts and Journal Rankings for Minerva-Supported Articles

                               Number of Articles, by Journal-Level CiteScore Percentile
Article-Level
Benchmarking Percentile         90–99    75–89    50–74    30–49    < 30    Total    Row Percentagea
90–99                             18        4        2        0       0       24        29
75–89                             17       13        1        0       0       31        37
50–74                              4        9        3        0       0       16        19
30–49                              6        1        5        0       0       12        15
Total                             45       27       11        0       0       83       100
Column Percentageb                54       33       13        0       0      100
Number Not Assigned a
Benchmarking Percentile           18       24        7        3       3       55

aRow total ÷ 83.

bColumn total ÷ 83.



Outreach and Dissemination to Nonacademic Audiences

As noted in the earlier section summarizing the publications and presentations reported by PIs, in addition to 152 peer-reviewed academic publications and 582 presentations, Minerva PIs reported 333 other publications, including non-peer-reviewed journal articles, working papers, paper series, research briefs, commentaries and op-ed pieces, blogs, and newsletters. At least some of these publications appear to represent outreach and dissemination efforts aimed at nonacademic audiences to build public understanding of Minerva-supported research and highlight its policy implications. These publications appeared in a wide array of outlets, such as The New York Times, The Washington Post, Foreign Affairs Snapshots, Scientific American, The Atlantic, Cipher Brief, The Wire, Maclean’s, The National Interest, and Politico, as well as international publications. Subject areas encompassed the breadth of Minerva grants since 2008 (see Appendix C for a list of grant titles). Examples of the subject areas covered are

  • uniting warring armies after civil war;
  • coups, elections, and the state of democracies;
  • the new dictators: threats, fears, and personalism;
  • humanitarian crises and refugee assistance;
  • cyberdeterrence, command, security, response, and partnerships;
  • the influence of world powers on developing or warring nations;
  • managing versus resolving conflicts;
  • what pirates want and where they strike;
  • terrorist groups, how they communicate, and female extremists; and
  • the weaponization of children.

Other Minerva-Sponsored Products Reported by Principal Investigators

This section describes other types of outputs and sharable resources reported by Minerva grantees, such as software, maps and mapping tools, websites, databases, patents and licenses, training materials, and models/methodologies. Thirty-three PIs reported developing a total of 85 such products during the course of their Minerva grants. Box 4-1 provides an overview of the wide variety of products developed, which may be as important as, or more important than, publications. The list is not exhaustive.


STAKEHOLDER AND EXPERT PERCEPTIONS OF THE QUALITY OF MINERVA RESEARCH

In addition to its analysis of the outputs generated by the Minerva grants, the committee reached out to a broad range of stakeholders and national security experts and solicited input on the Minerva program through a public comment mechanism in an effort to better understand perceptions of the program’s contributions to social science research on national security issues. These views are undeniably subjective, but they provide useful context for, and an additional dimension to, the bibliometric data reported above.

There appears to be broad agreement among both DoD staff and external stakeholders who provided input at the committee’s public meetings that the Minerva Research Initiative is a unique program, and that the Minerva grants have attracted some top scholars and produced some high-quality research. In recent years, the program has struggled with staffing challenges, which have resulted in delayed postings of the grant announcement and a corresponding decline in the number of white papers submitted (the number of white papers was 313 in 2016, 261 in 2017, and 192 in 2018). However, Minerva staff reported a gradual increase in the quality of the proposals received over the years.

Some of the Minerva program managers noted that many Minerva grantees have been particularly productive, and that the quality of their branch’s Minerva portfolio has been stronger than, or at least as strong as, that of the top grants funded through other programs within their branch. DoD senior leadership described the feedback they received about the Minerva research from both DoD staff and the broader national security community as indicating that the research has been well done, useful, and relevant. In a historical overview of the role of academic social science research in national security, Desch (2019, p. 236) observes that “Minerva attracted widespread interest among some leading social scientists.” Box 4-2 highlights projects that have been described by Minerva leadership as being among the studies shaping social science research on national security. A full list of the Minerva-funded projects is included in Appendix C.

At the same time, current and former DoD staff acknowledged that some of the projects funded by the program over the years have failed to meet DoD’s expectations. The reasons cited ranged from poorly formulated research questions to structural challenges (for example, some grants were too large to operate efficiently). As discussed in Chapter 3, Minerva award decisions are often based on a negotiation that balances the goals of basic research and policy relevance, as well as the goals of the service branches and others within DoD. The DoD interviews suggested that in some cases, perceived policy relevance was the consideration that tipped the decision in favor of funding specific projects, and some have argued that this is the main reason why some projects fell short on scientific merit and research quality. Other DoD staff expressed the view that the best way to make the program more useful would be to increase the policy relevance of projects, although they acknowledged that this is not the primary mission of the program.


Considering the unique contributions of the Minerva program, national security experts remarked that the program has been successful in facilitating interdisciplinary research. One perspective voiced by DoD staff was that the small but interdisciplinary and well-funded Minerva grants are a good example of the future of team science as an alternative to large, multidisciplinary, multi-institution research centers. Another DoD staff perspective was that Minerva projects are successful representations of a DoD-wide focus on such research areas as radicalization, stability and resilience, and geopolitical factors, in contrast to the somewhat narrower focus of other social science research programs in the service branches.

National security experts were generally in agreement that the Minerva program has remained true to its vision and objectives over the years. Given the relatively broad scope of the topics prioritized in the funding of projects, they did not raise specific concerns about those topics, but encouraged enough consistency from year to year to allow for a body of knowledge to develop.

With respect to ensuring that the benefits of the funded research are maximized, national security experts encouraged greater outreach, both within and outside of DoD, to communicate the research and its results and to better understand stakeholder needs. Foreign service officers and military academies were offered as examples of communities that could benefit from the research. The theme of greater outreach was echoed by representatives of professional associations, who urged the Minerva program to reach out more to their memberships.

SUMMARY AND CONCLUSIONS

The committee’s ability to systematically evaluate all the products of the Minerva grants over the years was limited, and no group could have served as a valid comparison group for the volume and quality of the outputs produced by the Minerva grantees. It is clear, however, that the Minerva program has supported research that has been published in top journals and cited much more often than would be expected given the averages for the respective social science fields. In addition to peer-reviewed publications and presentations at academic conferences, Minerva researchers reported outreach and dissemination to nonacademic audiences through research briefs, commentaries, and other publications on the context, meaning, and implications of their research. While journals focused on political science and international relations had the highest number of publications based on Minerva research, Minerva research has been published in journals representing all major social science fields, as well as other fields, such as computer science, engineering, and mathematics. The diversity of journals that have published Minerva research illustrates the program’s interdisciplinary nature. Likewise, the broad range of policy-relevant products and tools developed, such as software, maps and mapping tools, websites, databases, patents and licenses, training materials, and models/methodologies, indicates that the program values innovative outputs beyond publications and conference presentations.


The input received from stakeholders and national security experts highlighted that the Minerva Research Initiative is a unique program and that its grants have attracted some top scholars and produced some high-quality research that is useful and relevant. The Minerva program has remained true to its vision, funding research that addresses social science questions of interest to all of DoD and the broader national security community. However, the input the committee received from national security experts, professional associations, and DoD staff indicated that the research is not as widely disseminated or utilized as it could be. Further discussion of this issue and the committee’s recommendations for addressing it are included in Chapters 3 and 5.
