Ranking Vaccines: Applications of a Prioritization Software Tool: Phase III: Use Case Studies and Data Framework (2015)

Chapter: Appendix B: Committee's Response to the Use Case Scenarios Report

Suggested Citation:"Appendix B: Committee's Response to the Use Case Scenarios Report." Institute of Medicine and National Academy of Engineering. 2015. Ranking Vaccines: Applications of a Prioritization Software Tool: Phase III: Use Case Studies and Data Framework. Washington, DC: The National Academies Press. doi: 10.17226/18763.
B

Committee’s Response to the Use Case Scenarios Report

TABLE B-1
Consultant’s Feedback on Bugs and the Committee’s Action or Response

Each entry below gives the consultant’s observation and suggestion (from Table A-1), followed by the committee’s action or response.

1. Observation: Unable to save progress consistently.

Suggestion: The committee should continue to investigate this issue, given the frustration it can create, and should provide the ability to save progress consistently throughout the software.

Response: The software has been modified to allow users to print results showing all key parameters at any stage of the analysis. The print option also provides a log of the user’s preferences.

2. Observation: Disease burden percentages appear to add up to 100 percent, but the tool still does not accept them consistently.

Suggestion: The tool should either consistently allow one or more decimal places, specify the decimal place limit for percentages, or better inform the user about the data entry requirements.

Response: As part of the redesigned disease data entry page, the calculation verifying that the percentages add to 100 percent has been corrected to remove this bug.
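The rounding issue behind this bug is a common one: percentages entered to one decimal place may not sum to exactly 100.0 in floating-point arithmetic even when they display as 100. A minimal, hypothetical sketch of a tolerant validation check (not the actual SMART Vaccines code, whose internals are not shown in this report):

```python
# Hypothetical sketch: validating that disease burden percentages sum
# to 100. A strict equality test on floats can reject entries that
# display as 100; comparing within a small tolerance avoids the bug.
import math

def percentages_sum_to_100(values, tol=1e-9):
    """Return True if the entries sum to 100 within a small tolerance."""
    return math.isclose(sum(values), 100.0, rel_tol=0.0, abs_tol=tol)

# One decimal place each; the float sum may differ from 100.0 in the
# last binary digit, so a strict `sum(values) == 100.0` check is fragile.
entries = [33.3, 33.3, 33.4]
print(percentages_sum_to_100(entries))  # True with the tolerant check
```

An alternative is to parse the entries with Python’s decimal module, which represents one-decimal-place percentages exactly and makes a strict equality check safe.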

3. Observation: The total attribute acceptance limit is not clear, and the software run does not complete if the limit is exceeded.

Suggestion: The tool should inform the user of the 10-attribute limit, increase the limit, or both.

Response: When a user tries to enter more than 10 attributes, the limit is now specified, and the “Continue” button is disabled to prevent this action.

4. Observation: “Death” as an outcome is required even for diseases with no morbidities.

Suggestion: The tool should either inform users that “Death” is a required outcome or allow users to set their criteria in advance so that the tool only requires data for the specified criteria.

Response: As part of the disease page redesign, users now have the option of entering costs related to “Death” as a separate entry, distinct from the information needed for illness due to the disease.


TABLE B-2
Consultant’s Feedback on Additional Use Case Observations and the Committee’s Action or Response

Each entry below gives the consultant’s observation and suggestion (from Table A-2), followed by the committee’s action or response.

1. Observation: Age refinements are limited.

Suggestion: Create more granular age groups and allow these groups to be combined as necessary. Consider using the World Health Organization age group dataset.

Response: The committee has suggested making more refined age groups in future modifications. The original coarse-grained age groups were a choice of the Phase II committee (influenced by the approach taken by the Phase I committee), balancing precision against likely data availability. The need for more refined age groups arose from uses of the software that go beyond its original intent; in particular, it arose when users were attempting to select among existing vaccines for which vaccine developers had made highly age-specific recommendations for use. Because this extended use created the primary concern, the committee decided to focus on other software improvement priorities for SMART Vaccines 1.1.

2. Observation: Subpopulation choices are confusing.

Suggestion: The tool should inform users clearly that the subpopulation data pertain only to the specific disease and vaccine candidates under consideration and that the final results are based on the whole population. The committee should consider adding a full-population option for analysis.

Response: The relevant SMART Vaccines display pages now inform the user that the results pertain to the entire population. As with the previous issue, this comment emerged from an extended use in which the software was used to select among existing vaccines for deployment. The final SMART Scores are normalized to the entire population, and this has been emphasized in a note inside the software.



3. Observation: Attributes not required for a scenario are nonetheless required by the tool, which can adversely affect results.

Suggestion: There should be an option of “zero” or “NA” for attributes that are not required in all scenarios.

Response: Because of the basic structure of SMART Vaccines, users are led through the specification of populations, diseases, and vaccines (the Specify section) before they are asked to consider the Evaluate steps. Indeed, the committee envisions that the Specify steps will likely be undertaken by a technical support person, with a decision maker then participating in the Evaluate phase. Although the current structure does impose a data entry burden on users who can anticipate in advance the precise set of attributes that will enter the model, modifying the structure would create a programming task that exceeds the committee’s resources at this point. Thus, the committee chose to leave this issue for consideration in future versions of the program.

4. Observation: Data exist to calculate “total cost” offline, but the value must be entered into the tool manually.

Suggestion: Allow import of spreadsheet files, or request a “total cost” calculated offline to simplify data entry. To eliminate confusion, this value needs to be clearly defined. It would be better to ask for the “total cost” (assuming the user knows how to calculate this figure) and eliminate duplicate data entry or a separate file upload.

Response: The software has been simplified along the lines this suggestion calls for. The discussion in Chapter 2 of the report provides guidance to users on several ways to estimate treatment costs.

5. Observation: The field highlight and the cursor are both blue, which is confusing.

Suggestion: Changing the color of either the cursor or the highlight would eliminate the confusion and should be an easy fix.

Response: This issue, which the committee believes was specific to certain operating systems and browsers, has been fixed.



6. Observation: Attributes, Weights, and Priorities need to be reentered whenever a change is made.

Suggestion: Allow Attributes, Weights, and Priorities to be adjusted without reentering all the choices.

Response: While future versions of SMART Vaccines could offer this option, the committee believes the current structure is optimal for now because it requires users to verify that the selected attributes and weights remain correct after other data changes have occurred.

7. Observation: The user must select “Continue” on a previously completed screen instead of navigating from the top.

Suggestion: Eliminate the need to select “Continue” when returning to a screen that is already complete.

Response: The committee believes, as with the previous issue, that requiring use of the “Continue” button ensures that users do not inadvertently skip past choices that are no longer valid after other data changes have occurred.

8. Observation: Attributes with Yes/No inputs do not allow for granularity.

Suggestion: Change attributes with Yes/No inputs to a Likert scale to allow for more granularity, and provide guidance on adding user-defined attributes.

Response: The committee has concluded that further research is needed to address the question of granularity in Likert scales for user-defined attributes.

9. Observation: Vaccine-related complications should be a quantitative entity instead of an attribute.

Suggestion: Consider changing vaccine-related complications to a quantitative entity again, instead of an attribute.

Response: This issue, like others discussed above, arose in a creative extended use of SMART Vaccines in which the analysis focused on the deployment of existing vaccines. With existing vaccines, the details of vaccine complications are reasonably well known, which would make this improvement relevant. For to-be-developed vaccines (the originally intended focus of the program), the Phase II committee chose the current Yes/No description in the belief that the nature and extent of complications could not be known with any meaningful certainty. Thus, the current version retains the Yes/No descriptor; future enhancements could provide a richer alternative.



10. Observation: Multi-disease vaccine comparisons are difficult.

Suggestion: Consider eventually modifying the tool to allow multi-disease vaccines to be compared without treating the component diseases separately.

Response: The committee suggests that future enhancements to SMART Vaccines incorporate this change.

11. Observation: Additional clarity is required on data entry needs.

Suggestion: The committee should consider offering additional information through notes or tooltips so that users understand how these data entries work.

Response: The committee has made the best available use of pop-ups in the MATLAB platform to inform the user of data needs. Additional information on data needs is provided in the Phase II report, and those needs are elaborated on in this report. More user-friendly features are certainly possible in future versions of SMART Vaccines if those versions are built on a Web-based platform, which the committee suggests as the next logical step in the development of the tool.

12. Observation: Results or outputs cannot be saved.

Suggestion: Provide the ability to save the output results.

Response: As noted in the response to item 1 of Table A-1, the software now allows saving the results of any analysis by printing a table showing the “state of the software,” with the values of all key variables listed.

Comment

Many of the feature enhancement requests the committee received from the user groups arose from uses that went beyond the software’s intended purpose. Future users would probably benefit from specialized versions: one for prioritizing new vaccines and another for selecting among existing vaccines, or perhaps for comparing a vaccine against another public health intervention for the same disease. Future versions of SMART Vaccines could be designed to include more refined age brackets for disease burden and vaccine program implementation, more refined entry of vaccine-related complications, and more granular Likert scales for describing attributes.




