
Redesigning Continuing Education in the Health Professions (2010)

Chapter: Appendix A: Literature Review Tables

Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.

Appendix A
Literature Review Tables

Evidence on the effectiveness of continuing education (CE) and CE methods was identified through a literature review. Although not exhaustive, the review included a comprehensive search of the Research and Development Resource Base (RDRB), a bibliographic database of more than 18,000 articles from fields including CE, knowledge translation, interprofessional literature, and faculty development. Articles in the RDRB are culled from Medline, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), Excerpta Medica Database (EMBASE), Education Resources Information Center (ERIC), Sociological Abstracts, PsycINFO, Library Information and Science Abstracts (LISA), and business databases, as well as automatic retrieval of articles from journals dedicated to medical education (e.g., Journal of Continuing Education in the Health Professions, Medical Education, Studies in Continuing Education).

The RDRB was searched using keywords,1 and the search results were culled by two independent reviewers using an iterative approach. The studies collected were published between 1989 and April 2009.

1

Keywords used to search the RDRB included “patient participation,” “patient initiated,” “patient mediated,” “physician prompt,” “audit,” “feedback,” “checklist,” “checklists,” “protocol,” “protocols,” “reminder,” “reminders,” “academic detailing,” “simulation,” “simulations,” “lifelong learning,” “experiential,” “self-directed,” “reflection,” “problem based,” “model,” and “modeling.” These keywords were used alone or in combination.


Abstracts of search results were reviewed to eliminate articles that clearly did not pertain to CE methods, cost-effectiveness, or educational theory and to categorize the studies as informative, equivocal, or not informative of CE effectiveness. A wide range of designs was classified as informative, including randomized controlled trials, prospective cohort studies, observational studies, and studies with pre- and post-intervention assessment methodologies. Quantitative and qualitative approaches were included, and inclusion was not limited to studies with positive results. The most common reasons articles were classified as not informative were absence of a trial design, small sample size, and high likelihood of confounding factors in the design that could affect outcomes. The two reviewers independently classified abstracts and full texts of the articles and then compared their classification results. Interreviewer reliability was greater than 80 percent, and discrepancies were resolved by a consensus process. A third reviewer verified the results classified as informative or equivocal in a final round of detailed assessment of the study design, populations, intervention, type of outcome, and conclusions for each article. Systematic reviews and meta-analyses are included in Table A-1; studies and articles are included in Table A-2.
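The interreviewer reliability figure above is a simple agreement rate: the fraction of articles both reviewers placed in the same category. As an illustration only (the classifications below are hypothetical, not data from the review), the calculation can be sketched as:

```python
# Illustrative sketch, not from the report: simple percent agreement
# between two reviewers' article classifications. The appendix reports
# agreement above 80 percent; the categories mirror its
# "informative / equivocal / not informative" scheme.

def percent_agreement(reviewer_a, reviewer_b):
    """Fraction of articles both reviewers placed in the same category."""
    if len(reviewer_a) != len(reviewer_b):
        raise ValueError("reviewers must classify the same set of articles")
    matches = sum(a == b for a, b in zip(reviewer_a, reviewer_b))
    return matches / len(reviewer_a)

# Hypothetical classifications of ten abstracts
a = ["informative", "equivocal", "not informative", "informative",
     "informative", "not informative", "equivocal", "informative",
     "not informative", "informative"]
b = ["informative", "informative", "not informative", "informative",
     "informative", "not informative", "equivocal", "informative",
     "equivocal", "informative"]

print(percent_agreement(a, b))  # 0.8: two disagreements go to consensus
```

Articles on which the reviewers disagree (the two mismatches here) are the ones routed to the consensus process described above.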


Table A-1 begins below.


TABLE A-1 Summary of Systematic Reviews on Effectiveness of CE Methods

Reference

Purpose

Number of Studies, Inclusion Criteria, and Databases Searched

Reflection

Ruth-Sahd, L. A. 2003. Reflective practice: A critical analysis of data-based studies and implications for nursing education. Journal of Nursing Education 42(11):488-497.

* Identify common themes that emerge from data-based studies

* Identify implications for reflective practice in the field of nursing education

Sample: 20 articles, 12 doctoral dissertations, and 6 books

Inclusion criteria: Delineated methodology section; emphasis on reflective practice in an education setting; publication between 1992 and 2002; English language

Databases: CINAHL, Dissertation Abstracts International, ERIC, PsycINFO

Simulation

Issenberg, S. B., W. C. McGaghie, E. R. Petrusa, D. L. Gordon, and R. J. Scalese. 2005. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Medical Teacher 27:10-28.

Determine the features and uses of high-fidelity medical simulators that lead to the most effective learning (high-fidelity simulators are models, mannequins, or virtual packages that utilize realistic materials and equipment and incorporate feedback, computerized control, or other advanced technology)

Sample: 109 articles

Inclusion criteria: Empirical study; use of a simulator as an education assessment or intervention; learner outcomes measured quantitatively; experimental or quasi-experimental design

Databases: ERIC, Medline, PsycINFO, Web of Science, Timelit


Main Results

Limitations

* Conditions necessary for reflection to be successful:

  • Active motivation

  • Safe learning environments

  • Time availability

* Students require guidance about how to practice reflection

* No research on how unconscious knowledge is affected by reflective practice

* Lack of hypothesis testing in reviewed studies

High-fidelity simulators facilitate learning under certain conditions:

  • Repetitive practice

  • Used in conjunction with multiple learning strategies

  • Variety of clinical conditions captured

  • Controlled environment where errors can be made and corrected

  • Individualized learning where participants are actively involved

Heterogeneity of research designs, educational interventions, outcome measures, and time frame precluded data synthesis using meta-analysis


Reference

Purpose

Number of Studies, Inclusion Criteria, and Databases Searched

Sutherland, L. M., P. F. Middleton, A. Anthony, J. Hamdorf, P. Cregan, D. Scott, and G. J. Maddern. 2006. Surgical simulation: A systematic review. Annals of Surgery 243(3):291-300.

Evaluate the effectiveness of surgical simulation compared with other methods of surgical training

Sample: 30 trials with 760 participants

Inclusion criteria: Randomized controlled trial; assessing surgical simulation; measures of surgical task performance

Databases: Medline, EMBASE, Cochrane Library, PsycINFO, CINAHL, Science Citation Index

Reminders

Balas, E. A., S. M. Austin, J. A. Mitchell, B. G. Ewigman, K. D. Bopp, and G. D. Brown. 1996. The clinical value of computerized information services. A review of 98 randomized clinical trials. Archives of Family Medicine 5(5):271-278.

Determine the clinical settings, types of interventions, and effects of studies in randomized clinical trials addressing the efficacy of clinical information systems

Sample: 98 articles reporting on 100 trials

Inclusion criteria: Randomized controlled trial (RCT); computerized information intervention in the experimental group; effect measured on the process or outcome of care

Databases: Medline

Shea, S., W. DuMouchel, and L. Bahamonde. 1996. A meta-analysis of 16 randomized controlled trials to evaluate computer-based clinical reminder systems for preventive care in the ambulatory setting. Journal of the American Medical Informatics Association 3(6):399-409.

Assess the overall effectiveness of computer-based reminder systems in ambulatory settings directed at preventive care

Sample: 16 trials

Inclusion criteria: Randomized controlled trial; computer-based reminder; control group received no intervention

Databases: Medline, Nursing and Allied Health database, Health Planning and Administration database


Main Results

Limitations

Computer simulation generally showed better results than no training at all but was not superior to standard training (e.g., surgical drills) or video simulation

Insufficient evidence to evaluate types of simulation because outcomes were often not comparable across studies

Patient and physician reminders, computerized treatment planners, and interactive patient education can make a significant difference in managing care (P < 0.05)

Many trials evaluate the effect of information services on care processes as opposed to patient outcomes

* Computer reminders improved preventive practices for vaccinations, breast cancer screening, colorectal cancer screening, and cardiovascular screening

* Computerized reminders did not improve preventive practices for cervical cancer screening

Heterogeneity in study designs and the ways in which results were presented


Reference

Purpose

Number of Studies, Inclusion Criteria, and Databases Searched

Audit and Feedback

Jamtvedt, G., J. M. Young, D. T. Kristoffersen, M. A. O’Brien, and A. D. Oxman. 2006. Does telling people what they have been doing change what they do? A systematic review of the effects of audit and feedback. Quality & Safety in Health Care 15(6):433-436.

Review the effects of audit and feedback on improving professional practice

Sample: 118 trials

Inclusion criteria: Randomized controlled trials; utilized audit and feedback; objective measures of provider performance

Databases: Cochrane Library

Multifaceted Interventions and Reviews of Multiple Methods

Cheraghi-Sohi, S., and P. Bower. 2008. Can the feedback of patient assessments, brief training, or their combination, improve the interpersonal skills of primary care physicians? A systematic review. BMC Health Services Research 8.

* Review the efficacy of patient feedback on the interpersonal care skills of primary care physicians

* Review the efficacy of brief training (up to one working week in length) focused on the improvement of interpersonal care

Sample: 9 studies

Inclusion criteria: Randomized controlled trials; published in English; based on primary care practitioners and their patients; utilized patient feedback or brief training or a combination of these methods; outcome measure was a patient-based assessment of change

Databases: CENTRAL, Medline, EMBASE


Main Results

Limitations

* Effects of audit and feedback on improving professional practice are generally small to moderate

* Effects of audit and feedback are likely to be larger when baseline adherence to recommended practice is low and audit and feedback are delivered more frequently and over longer periods of time

* Lack of a process evaluation embedded in trials

* Few studies compare audit and feedback to other interventions

Brief training as currently delivered is not effective

* Limited evidence on the effects of patient-based feedback for changes in primary care physician behavior

* Evidence is not definitive due to the small number of trials

* Variation in training methods and goals

* Lack of theory linking feedback to behavior change


Reference

Purpose

Number of Studies, Inclusion Criteria, and Databases Searched

Davis, D., M. A. O’Brien, N. Freemantle, F. M. Wolf, P. Mazmanian, and A. Taylor-Vaisey. 1999. Impact of formal continuing medical education: Do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA 282(9):867-874.

Review, collate, and interpret the effect of formal continuing medical education (CME) interventions on physician performance and health care outcomes

Sample: 14 studies

Inclusion criteria: Randomized controlled trial of formal didactic and/or interactive CME; >50% physicians

Databases: RDRB, Cochrane Library, Medline

Forsetlund, L., A. Bjørndal, A. Rashidian, G. Jamtvedt, M. A. O’Brien, F. Wolf, D. Davis, J. Odgaard-Jensen, and A. D. Oxman. 2009. Continuing education meetings and workshops: Effects on professional practice and health care outcomes. Cochrane Database Systematic Reviews (2):CD003030.

To assess the effects of educational meetings on professional practice and health care outcomes

Sample: 81 trials involving more than 11,000 health professionals

Inclusion criteria: Randomized controlled trial of educational meetings that reported an objective measure of professional practice or health care outcomes

Databases: Cochrane Library

Grimshaw, J., L. Shirran, R. Thomas, G. Mowatt, C. Fraser, L. Bero, R. Grilli, E. Harvey, A. Oxman, and M. A. O’Brien. 2001. Changing provider behavior: An overview of systematic reviews of interventions. Medical Care 39(8 Suppl 2):II2-II45.

Identify, appraise, and synthesize systematic reviews of professional education or quality assurance interventions to improve quality of care

Sample: 41 reviews

Inclusion criteria: Interventions targeted at health professionals; reported measures of professional performance and/or patient outcomes; study design included explicit selection criteria

Databases: Medline, Healthstar, Cochrane Library


Main Results

Limitations

* Interactive CME sessions that enhance participant activity and provide the opportunity to practice skills can effect change in professional practice and, on occasion, health outcomes

* Didactic sessions did not appear to be effective in changing physician performance

* Limited number of randomized controlled trials and settings limits generalizability of findings

* The comparability of CME interventions is debatable due to the lack of comparability of reviewed interventions

* Educational meetings alone are not likely to be effective for changing behaviors

* The effect of educational meetings combined with other interventions is most likely small and similar to that of other types of CE, such as audit and feedback and educational outreach visits

* Heterogeneity in study designs and the ways in which results were presented

* Observed differences in changing behaviors cannot be explained with confidence

* Passive approaches generally ineffective

* Active approaches effective under some circumstances

* Multifaceted interventions more likely to be effective than interventions with one method

Lack of agreement within the research community on a theoretical or empirical framework for classifying interventions


Reference

Purpose

Number of Studies, Inclusion Criteria, and Databases Searched

Gross, P. A., and D. Pujat. 2001. Implementing practice guidelines for appropriate antimicrobial usage: A systematic review. Medical Care 39(8 Suppl 2):II55-II69.

* Conduct a systematic review of guideline implementation studies for improving appropriate use of antimicrobial agents

* Determine which implementation methods appear to improve the outcome of appropriate antimicrobial use

Sample: 40 studies

Inclusion criteria: Comparative study; quantitative data; English language; between 1966 and 2000

Databases: Medline

Lam-Antoniades, M., S. Ratnapalan, and G. Tait. 2009. Electronic continuing education in the health professions: An update on evidence from RCTs. Journal of Continuing Education in the Health Professions 29:44-51.

Update evidence from RCTs assessing the effectiveness of electronic CE (e-CE)

Sample: 15 studies

Inclusion criteria: Evaluated a CE intervention for any group of health professionals; intervention included a computer interface (CD-ROM or Internet); randomized controlled trial; published between 2004 and 2007

Databases: Medline, EMBASE, CINAHL

Marinopoulos, S. S., T. Dorman, N. Ratanawongsa, L. M. Wilson, B. H. Ashar, J. L. Magaziner, R. G. Miller, P. A. Thomas, G. P. Prokopowicz, R. Qayyum, and E. B. Bass. 2007. Effectiveness of continuing medical education.

Evidence report/technology assessment no. 149. AHRQ Publication No. 07-E006. Rockville, MD: Agency for Healthcare Research and Quality.

Synthesize evidence regarding the effectiveness of CME and differing instructional designs in terms of knowledge, attitudes, skills, practice behavior, and clinical practice outcomes

Sample: 136 articles and 9 systematic reviews

Inclusion criteria: Reporting on the effects of CME or simulation; written in English; contained original human data; included at least 15 fully trained physicians; evaluated an educational activity; published between 1981 and 2006; conducted in the United States or Canada; included data from a comparison group

Databases: Medline, EMBASE, Cochrane Library, PsycINFO, ERIC


Main Results

Limitations

* Multifaceted implementation methods most successful

* Individual implementation methods determined to be useful:

  • Academic detailing

  • Feedback from nurses, pharmacists, or physicians

  • Local adoption of a guideline

  • Small-group interactive sessions

  • Computer-assisted care

* Multimethod approaches make it difficult to determine which method(s) were critical for appropriate antimicrobial use

* Findings may not be generalizable because study conditions vary

* Positive effects of e-CE on knowledge sustained up to 12 months

* Positive effects of e-CE on practice sustained up to 5 months

* e-CE interventions that included only text reading passages were of limited effectiveness in changing knowledge or practice

None of the studies attempted to identify which components of a multifaceted intervention were responsible for effects

* CME effective in achieving and maintaining knowledge, attitudes, skills, practice behavior, and clinical practice outcomes

* Live media more effective than print; multimedia more effective than single-media interventions; multiple exposures more effective than a single exposure

* Firm conclusions not possible because of overall low quality of the literature

* Heterogeneity in study designs and the ways in which results were presented

* Limited evidence on reliability and validity of the tools used to assess CME effectiveness


Reference

Purpose

Number of Studies, Inclusion Criteria, and Databases Searched

Mansouri, M., and J. Lockyer. 2007. A meta-analysis of continuing medical education effectiveness. Journal of Continuing Education in the Health Professions 27:6-15.

Examine the effect of moderator variables on physician knowledge, performance, and patient outcomes

Sample: 31 studies

Inclusion criteria: Randomized controlled trial or before-and-after experimental design; participants were practicing physicians; focus on at least 1 of the 3 identified outcomes (physician knowledge, physician performance, patient outcome); adequate description of the intervention; quantitative analyses

Databases: Medline, ERIC

O’Brien, M. A., N. Freemantle, A. D. Oxman, F. Wolf, D. A. Davis, and J. Herrin. 2001. Continuing education meetings and workshops: Effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews (Online) (2).

Assess the effects of educational meetings on professional practice and health care outcomes

Sample: 32 studies

Inclusion criteria: Randomized trials or quasi-experimental studies; effect of lectures, workshops, and/or courses on clinical practice or health care outcomes

Databases: Cochrane Library, Medline, RDRB

Prior, M., M. Guerin, and K. Grimmer-Somers. 2008. The effectiveness of clinical guideline implementation strategies—A synthesis of systematic review findings. Journal of Evaluation in Clinical Practice 14(5):888-897.

Synthesize evidence of effectiveness of clinical guideline implementation strategies in terms of improved clinical processes and improved cost-benefit ratios

Sample: 33 systematic reviews that included 714 primary studies

Inclusion criteria: Generic implementation strategies; comparison study; measured clinical practice change and/or compliance; published between 1987 and 2007; English language

Databases: Medline, Amed, CINAHL, Academic Search Elite, Cochrane Library


Main Results

Limitations

* Larger effect size when the interventions are interactive (r = 0.33 [0.33]) and use multiple methods (r = 0.33 [0.26])

* Larger effect size for longer interventions (r = 0.33) and multiple interventions over time (r = 0.36)

* Smaller effect size for programs with multiple professions (r = −0.18) and a greater number of participants (r = −0.13)

Studies did not always provide

  • Descriptive information about the participants or the intervention

  • Statistical information using accepted standards for reporting

  • Numeric data

  • Validity and reliability data

  • Data about the period of time between the intervention and the measurement of performance

* Interactive workshops can result in changes in professional practice

* Didactic sessions alone unlikely to change professional practice

* Study design generally poorly reported, making it difficult to judge the degree to which results may be biased

* Substantial variation in the complexity of targeted behaviors, baseline compliance, and the characteristics of interventions

* Heterogeneity in study designs and the ways in which results were presented

* Implementation strategies where there was strong evidence of guideline compliance included

  • Multifaceted interventions

  • Interactive education

  • Clinical reminder systems

* Didactic education and passive dissemination strategies (e.g., conferences, websites) ineffective

* Implementation strategies varied and rarely comparable

* Cost-effectiveness analyses rare


Reference

Purpose

Number of Studies, Inclusion Criteria, and Databases Searched

Robertson, M. A., K. E. Umble, and R. M. Cervero. 2003. Impact studies in continuing education for health professions: Update. Journal of Continuing Education in the Health Professions 23:146-156.

* Determine if CE is effective and for what outcomes

* Determine what kinds of CE are effective

Sample: 15 syntheses

Inclusion criteria: Primary CE study; professionals’ performance and/or patient health outcomes considered; published since 1993

Databases: RDRB, Medline, ERIC, Digital Dissertation Abstracts

Steinman, M. A., S. R. Ranji, K. G. Shojania, and R. Gonzales. 2006. Improving antibiotic selection: A systematic review and quantitative analysis of quality improvement strategies. Medical Care 44(7):617-628.

Assess which interventions are most effective at improving the prescribing of recommended antibiotics for acute outpatient infections

Sample: 26 studies reporting on 33 trials

Inclusion criteria: Clinical trial; reports on antibiotic selection in acute outpatient infections; randomized trials, controlled before-and-after and interrupted time-series designs with at least 3 data points; English language

Databases: Cochrane Library, Medline

Tian, J., N. L. Atkinson, B. Portnoy, and R. S. Gold. 2007. A systematic review of evaluation in formal continuing medical education. Journal of Continuing Education in the Health Professions 27:16-27.

Improve CME evaluation study design by determining

  • Effects of using randomization strategies on outcome measurement

  • Reliability and validity of measurement

  • Follow-up period recommendations

Sample: 32 studies

Inclusion criteria: Randomized controlled trial or quasi-experimental trial; published between 1993 and 1999; primary studies; >50% physicians; CME intervention was didactic, interactive, or both

Databases: Medline, EBSCOhost


Main Results

Limitations

* CE can improve knowledge, skills, attitudes, behavior, and patient health outcomes

* Effective CE is ongoing, interactive, contextually relevant, and based on needs assessment

* Few primary studies addressed the impact of CE on patient health outcomes (and instead measured patient satisfaction)

* Focus on how CE affects individuals as opposed to teams or organizations

Multidimensional interventions using audit and feedback less effective than interventions using clinician education alone

* Sample size too small to conduct detailed analysis of all potential confounders and effect modifiers

* Heterogeneity in study designs and the ways in which results were presented

* A valid and reliable questionnaire addressing relevant variables is needed to allow comparison of effectiveness across interventions

* A minimum 1-year post-intervention follow-up is necessary to investigate sustainability of outcomes

Variation across study designs prevents comparing the effectiveness of CME programs


Reference

Purpose

Number of Studies, Inclusion Criteria, and Databases Searched

Tu, K., and D. A. Davis. 2002. Can we alter physician behavior by educational methods? Lessons learned from studies of the management and follow-up of hypertension. Journal of Continuing Education in the Health Professions 22(1):11-22.

Review the literature on the effectiveness of physician educational interventions in the management and follow-up of hypertension

Sample: 12 studies

Inclusion criteria: Use of replicable educational interventions; >50% physician involvement; objective measures of physician behavior change or patient outcomes; dropout rate of <30%; outcomes assessed for >30 days

Databases: PubMed, RDRB

Wensing, M., H. Wollersheim, and R. Grol. 2006. Organizational interventions to implement improvements in patient care: A structured review of reviews. Implementation Science 1(1).

Provide an overview of the research evidence on the effects of organizational strategies to implement improvements in patient care

Sample: 36 reviews

Inclusion criteria: Evaluated organizational strategies; published in 1995 or later; rigorous evaluations (e.g., randomized trials, interrupted time-series, controlled before-and-after, and prospective comparative observational studies)

Databases: PubMed, Cochrane Library


Main Results

Limitations

* Studies included 7 different educational interventions: reminders, formal CME, computerized decision support, printed materials, academic detailing, continuous quality improvement, and prompts

* Relatively small number of trials in each of the types of interventions

* Randomized trials using quantitative outcomes do not capture processes and dimensions of learning

* Professional performance was generally improved by revision of professional roles and utilization of computer systems for knowledge management

* Multidisciplinary teams, integrated care services, and computer systems generally improved patient outcomes

Heterogeneity in study designs and the ways in which results were presented


TABLE A-2 Literature Review on the Effectiveness of CE Methods

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Experiential and Self-Directed Learning

East, D., and K. Jacoby. 2005. The effect of a nursing staff education program on compliance with central line care policy in the cardiac intensive care unit. Pediatric Nursing 31(3):182-184.

Demonstrate the effectiveness of a self-study education module on nurse compliance with central line care policy

Sample: 20 registered nurses (RNs) in a 12-bed pediatric cardiovascular intensive care unit

Method: Quasi-experimental cohort study with pre- and post-test design

Outcome measures: Compliance with 10 central line policies; intravenous (IV) line audit tool used to collect data on 47 patients pre- and post-intervention

Duration: 7 months

Hewson, M. G., H. L. Copeland, E. Mascha, S. Arrigain, E. Topol, and J. E. Fox. 2006. Integrative medicine: Implementation and evaluation of a professional development program using experiential learning and conceptual change teaching approaches. Patient Education & Counseling 62(1):5-12.

Raise physicians’ awareness of, and initiate attitudinal changes toward, integrative medicine through a professional development program involving experiential learning

Sample: 48 cardiologists at an academic medical center

Method: Randomized controlled trial

  • Experimental group: participation in intervention

  • Control group: no intervention

Outcome measure: Self-reported knowledge, attitudes, likelihood of changing practice, and satisfaction

Duration: 8 hours


Description of Educational Method

Findings

Self-study module included a fact sheet and poster outlining proper care

Self-study had a statistically significant impact on staff compliance with central line policy (p < 0.001)

Professional development session in which participants participated in integrative medicine modalities (e.g., yoga, Reiki)

* Participant group had significant positive changes in their conceptions about and attitudes toward complementary and alternative medicine (CAM) after the program

* Physicians significantly increased their willingness to integrate CAM into their practice


Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Karner, K. J., D. C. Rheinheimer, A. M. DeLisi, and C. Due. 1998. The impact of a hospital-wide experiential learning educational program on staff’s knowledge and misconceptions about aging. Journal of Continuing Education in Nursing 29(3):100-104.

Examine the impact of participation in an experiential learning program about aging on hospital personnel’s knowledge and attitudes

Sample: 95 hospital employees (administrative, nursing, social work, occupational therapy, physical therapy, dietary, maintenance, and pastoral care)

Method: Cohort study with pre- and post-test design

Outcome measures: Knowledge gains as evidenced by improvement on a 25-question exam about the feelings of older people; bias as determined by responses on the exam

Duration: 2 hours

Love, B., C. McAdams, D. M. Patton, E. J. Rankin, and J. Roberts. 1989. Teaching psychomotor skills in nursing: A randomized control trial. Journal of Advanced Nursing 14(11):970-975.

Compare the effectiveness of teaching psychomotor skills in a structured laboratory setting with self-directed, self-taught modules

Sample: 77 second-year students in a baccalaureate nursing program in Ontario, Canada

Method: Randomized controlled trial

  • Experimental group: clinical laboratory training

  • Control group: self-directed learning

Outcome measure: Achievement as measured by the Objective Structured Clinical Examination (OSCE)

Duration: One clinical term


Description of Educational Method

Findings

One-hour role-play game designed for participants to experience and then reflect on their feelings toward older people

* Significant increase in scores between pre-test and post-test (F = 64.08, p < 0.0001)

* Negative bias scores decreased significantly from pre- to post-test (F = 23.86, p < 0.0001)

* Packets containing information on specific skills, definitions, resources, and problem-solving scenarios were distributed

* Learners watched expert clinicians

No difference between psychomotor skill performance of students who learned in a self-directed manner and those taught in a structured clinical laboratory


Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Russell, J. M. 1990. Relationships among preference for educational structure, self-directed learning, instructional methods, and achievement. Journal of Professional Nursing 6(2):86-93.

Analyze nurses’ preference for educational structure, self-directed learning, instructional method, and achievement on a written exam

Sample: 40 RNs in 8 community hospitals

Method: Randomized controlled trial

  • Experimental group: self-directed learning

  • Control group: lecture

Outcome measures: Scores on a 50-item post-test; scores on the Self-Directed Learning Readiness Scale

Duration: 1 week

Suggs, P. K., M. B. Mittelmark, R. Krissak, K. Oles, C. Lane, Jr., and B. Richards. 1998. Efficacy of a self-instruction package when compared with a traditional continuing education offering for nurses. Journal of Continuing Education in the Health Professions 18(4):220-226.

Determine whether a multimedia, self-instructional education package can produce learning results similar to those of a conventional CE conference

Sample: 63 RNs and licensed practical nurses (LPNs) in 2 rural regions in North and South Carolina

Method: Ecologic study

  • Experimental group: self-instructed course

  • Control group: traditional CE course

Outcome measure: Knowledge gains evaluated by a pre- and post-multiple-choice test

Duration: NA

Reflection

Forneris, S. G., and C. Peden-McAlpine. 2007. Evaluation of a reflective learning intervention to improve critical thinking in novice nurses. Journal of Advanced Nursing 57(4):410-421.

Determine if a reflective contextual learning intervention would improve novice nurses’ critical thinking skills during their first 6 months of practice

Sample: 6 novice nurse-nurse preceptor dyads at an urban acute care facility

Method: Qualitative case study

Outcome measures: Self-reported anxiety, influence of power, use of questioning, use of sequential thinking, use of contextual thinking

Duration: 6 months


Description of Educational Method

Findings

Self-directed group received reading materials, audio tapes, self-evaluation tests, case study analyses, and an instruction session for clarification

* No significant relationships found between exam scores and self-directed learning readiness (p = 0.24)

* No participant in the self-directed group chose to participate in an instructor clarification session

* 5-hour CE workshop delivered by a pharmacist

* Self-paced 6- to 10-hour instructional education package with videotapes, a workbook with case histories, and a textbook

* Both control and experimental groups had statistically significant improvement (t = 4.86, p < 0.0001 and t = −2.54, p < 0.18, respectively)

* Knowledge gains were not significantly higher for the control group

* Narrative journals

* Daily coaching to help incorporate critical thinking into practice

* Leader-facilitated discussion groups

* Lack of trust in one’s knowledge base influenced how an individual used critical thinking

* Thinking out loud allowed nurses to verbalize sources of knowledge and plan actions

* Contextual learning assisted in the development of critical thinking

* Sustainability of critical thinking skills post-intervention unknown


Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Mathers, N. J., M. C. Challis, A. C. Howe, and N. J. Field. 1999. Portfolios in continuing medical education—Effective and efficient? Medical Education 33(7):521-530.

Evaluate the effectiveness and efficiency of portfolios for the continuing professional development of general practitioners (GPs)

Sample: 32 general practitioners in Sheffield, UK

Method: Qualitative cohort study comparing traditional CME activities and portfolio-based learning

Outcome measures: Presence of defined learning objectives; hours of participation in CME activity

Duration: 12 months

Ranson, S. L., J. Boothby, P. E. Mazmanian, and A. Alvanzo. 2007. Use of personal digital assistants (PDAs) in reflection on learning and practice. Journal of Continuing Education in the Health Professions 27:227-233.

Describe the use of (1) personal digital assistants (PDAs) in patient care and (2) a PDA version of a learning portfolio intended to encourage documentation of reflection on practice and medical education

Sample: 10 physicians

Method: Case study

Outcome measures: PDA usage data; written comments in learning portfolios; self-reported PDA use information

Duration: 6 months

Academic Detailing

Doyne, E. O., M. P. Alfaro, R. M. Siegel, H. D. Atherton, P. J. Schoettker, J. Bernier, and U. R. Kotagal. 2004. A randomized controlled trial to change antibiotic prescribing patterns in a community. Archives of Pediatrics and Adolescent Medicine 158(6):577-583.

Examine the effects of academic detailing on community pediatricians’ prescription of antibiotics for children

Sample: 12 pediatric practice groups in the greater Cincinnati area

Method: Cluster randomized controlled trial

  • Experimental group: report cards and academic detailing visits

  • Control group: report cards only

Outcome measure: Antibiotic prescription rate pre- and post-academic detailing

Duration: 24 months


Description of Educational Method

Findings

3 small-group sessions with a CME tutor to use a portfolio-based learning route to

  • Identify individual educational needs

  • Develop strategies to meet these needs

  • Use reflection to modify objectives

Portfolio learners developed individual learning objectives and had flexibility in methods and timing

Physicians received a PDA preloaded with learning portfolio software and were individually trained in its use

* Use of the PDA associated with the value of information for making clinical decisions

* Use of the learning portfolio prompted physicians to reflect on changes in clinical practice

* Each group practice in the experimental group identified 1 leader to present academic detailing sessions to the practice on a monthly basis

* Quarterly report cards detailing antibiotic-prescribing data from each practice

* Academic detailing no more effective in reducing antibiotic use than the practice-specific report cards

* Antibiotic prescription rate decreased to 0.82 of the baseline rate for the experimental group (95% CI: 0.71-0.95) and to 0.86 of the baseline for the control group (95% CI: 0.77-0.95)


Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Goldberg, H. I., E. H. Wagner, S. D. Fihn, D. P. Martin, C. R. Horowitz, D. B. Christensen, A. D. Cheadle, P. Diehr, and G. Simon. 1998. A randomized controlled trial of CQI teams and academic detailing: Can they alter compliance with guidelines? Joint Commission Journal on Quality Improvement 24(3):130-142.

Determine the effectiveness of academic detailing techniques and continuous quality improvement teams in increasing compliance with national guidelines for the care of hypertension and depression

Sample: 15 small group practices at 4 Seattle primary care clinics

Method: Randomized controlled trial

  • Experimental groups: academic detailing; academic detailing and continuous quality improvement (CQI)

  • Control group: usual care

Outcome measures: Changes in hypertension prescribing; changes in blood pressure control; changes in depression recognition; changes in use of older tricyclics; changes in scores on the Hopkins Symptom Checklist depression scale

Duration: 29 months

Goldstein, M. G., R. Niaura, C. Willey, A. Kazura, W. Rakowski, J. DePue, and E. Park. 2003. An academic detailing intervention to disseminate physician-delivered smoking cessation counseling: Smoking cessation outcomes of the Physicians Counseling Smokers Project. Preventive Medicine 36(2):185-196.

Determine the effect of a community-based academic detailing intervention on the quit rates of a population-based sample of smokers

Sample: 259 primary care physicians and 4,295 adult smokers in Rhode Island

Method: Quasi-experimental trial

  • Experimental group: 3 counties received academic detailing visits

  • Control group: 2 counties with no intervention

Outcome measures: Measures of smoking behavior assessed at baseline and at 6, 12, 19, and 24 months

Duration: 24 months


Description of Educational Method

Findings

* 2 opinion leaders at each site conducted 15-minute academic detailing sessions

* On-site pharmacists conducted 2 sessions to discuss physician-specific prescribing patterns in comparison to peer prescribing patterns

* A CQI facilitator trained practice leaders in “plan, do, study, act” and the use of real-time data collection

* Academic detailing alone and CQI alone were generally ineffective in improving clinical outcomes

* Academic detailing was associated with decreased use of older tricyclics

* Use of CQI teams and academic detailing in combination increased percentage of adequately controlled hypertensives

* Resources provided to offices, including patient education resources, pocket cards, and desk prompts

* Practice consultants conducted 4-5 visits to offices in the intervention counties

Smokers who resided in intervention areas were more likely to report they had quit smoking than smokers who resided in control areas (OR = 1.35; 95% CI: 0.99-1.83; p = 0.057)
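Several findings in these tables report odds ratios with 95% confidence intervals, such as the Goldstein et al. result above (OR = 1.35; 95% CI: 0.99-1.83). As a minimal sketch of the underlying arithmetic (using hypothetical counts, not data from any study in this table), the Wald interval for an odds ratio from a 2 × 2 table can be computed as follows:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Wald 95% confidence interval for the odds ratio of a 2x2 table.
    a, b = outcome present/absent in the intervention group;
    c, d = outcome present/absent in the control group.
    z = 1.96 gives the conventional 95% interval."""
    odds_ratio = (a * d) / (b * c)
    # Standard error of the log odds ratio
    log_se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * log_se)
    upper = math.exp(math.log(odds_ratio) + z * log_se)
    return odds_ratio, lower, upper

# Hypothetical quit / did-not-quit counts, chosen only for illustration
or_, lo, hi = odds_ratio_ci(120, 880, 95, 905)
print(f"OR = {or_:.2f}, 95% CI: {lo:.2f}-{hi:.2f}")
```

A confidence interval whose lower bound dips below 1.0, as in the computed example and in the Goldstein et al. finding, corresponds to a result that narrowly misses statistical significance at the 0.05 level.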


Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Ilett, K. F., S. Johnson, G. Greenhill, L. Mullen, J. Brockis, C. L. Golledge, and D. B. Reid. 2000. Modification of general practitioner prescribing of antibiotics by use of a therapeutics adviser (academic detailer). British Journal of Clinical Pharmacology 49(2):168-173.

Evaluate the use of a clinical pharmacist as an academic detailer to modify antibiotic prescribing by GPs

Sample: 112 GPs in Perth, Western Australia

Method: Randomized controlled trial

  • Experimental group: academic detailing session

  • Control group: no intervention

Outcome measures: Total prescriptions; prescriptions for individual antibiotics before and after the intervention

Duration: 7 months

Kim, C. S., R. J. Kristopaitis, E. Stone, M. Pelter, M. Sandhu, and S. R. Weingarten. 1999. Physician education and report cards: Do they make the grade? Results from a randomized controlled trial. American Journal of Medicine 107(6):556-560.

Determine whether tailored educational interventions can improve the quality of care and lead to better patient satisfaction

Sample: 41 primary care physicians who cared for 1,810 patients at a large health maintenance organization

Method: Randomized controlled trial

  • Experimental group: ongoing education and academic detailing

  • Control group: ongoing education

Outcome measures: Provision of preventive care reported by patients and in medical records; patient satisfaction

Duration: 2.5 years


Description of Educational Method

Findings

* A panel of experts prepared a best-practice chart of recommended drugs for various infections

* A pharmacist visited each prescriber in the experimental group to disseminate the chart and discuss its recommendations

* Academic detailing decreased prescription numbers and costs

* Total cost of antibiotics prescribed by doctors in the control group increased by 48% from the pre- to post-intervention periods

* Costs for the experimental group increased by only 35%

* All physicians received mailed educational materials that contained overviews of preventive care services

* The experimental group received peer-comparison feedback and academic detailing from a pharmacist at 3 separate sessions

* Patient-reported preventive care measures did not align with medical records review data, resulting in an ambiguous effect of education, peer comparison, and academic detailing on preventive services

* Education, peer comparison, and academic detailing had modest effects on patient satisfaction


Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Mol, P. G. M., J. E. Wieringa, P. V. NannanPanday, R. O. B. Gans, J. E. Degener, M. Laseur, and F. M. Haaijer-Ruskamp. 2005. Improving compliance with hospital antibiotic guidelines: A time-series intervention analysis. Journal of Antimicrobial Chemotherapy 55(4):550-557.

Investigate the impact of a 2-phase intervention strategy to improve antimicrobial prescribing compliance with treatment guidelines

Sample: 2,869 patients treated with an antimicrobial agent at a teaching hospital in the Netherlands

Method: Interrupted time-series study

Outcome measures: Prescribing data collected at baseline, after update of guidelines, and at the conclusion of academic detailing

Duration: 25 months

Reeve, J. F., G. M. Peterson, R. H. Rumble, and R. Jaffrey. 1999. Programme to improve the use of drugs in older people and involve general practitioners in community education. Journal of Clinical Pharmacy & Therapeutics 24(4):289-297.

Determine the effect of educational materials and academic detailing sessions on GP prescribing patterns for older patients

Sample: 13 GPs in Australia

Method: Cohort study

Outcome measures: Scores on pre- and post-multiple choice tests; number of prescribed “indicator” medications

Duration: NA

Siegel, D., J. Lopez, J. Meier, M. K. Goldstein, S. Lee, B. J. Brazill, and M. S. Matalka. 2003. Academic detailing to improve antihypertensive prescribing patterns. American Journal of Hypertension 16(6):508-511.

Determine whether using academic detailing increased practitioner compliance with antihypertensive treatment guidelines

Sample: 5 Department of Veterans Affairs (VA) medical facilities

Method: Quasi-experimental design

Outcome measures: Antihypertensive prescribing patterns; blood pressures

Duration: 17 months


Description of Educational Method

Findings

* Sessions with users conducted to improve guidelines

* Antimicrobial guidelines were updated and disseminated in paper and electronic formats

* Academic detailing was used to improve compliance with the guidelines

* Updating guidelines in collaboration with specialists followed by active dissemination resulted in a significant change in the level of compliance

* Academic detailing did not lead to statistically significant changes in already high levels of guideline compliance

* Pharmacist-developed prescribing guidelines discussed at academic detailing sessions

* GP-conducted education sessions to interdisciplinary groups of practitioners

* Patient-held medication record distributed to elderly patients

* Significant decline in prescribing of psychoactive drugs (χ2 = 4.1, df = 1, p < 0.05) and nonsteroidal anti-inflammatory drugs (NSAIDs) (χ2 = 4.8, df = 1, p < 0.05)

* Patient-held medication records were useful in cueing discussions but time-consuming and infrequently used

* 1 pharmacist per VA facility was trained as an academic detailer

* Academic detailing included lectures, educational materials, provider profiling (one-on-one meetings), and group meetings

* Prescribing patterns more closely followed national recommendations with use of academic detailing

* Changes in prescribing patterns may have resulted from factors other than the intervention


Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Simon, S. R., S. R. Majumdar, L. A. Prosser, S. Salem-Schatz, C. Warner, K. Kleinman, I. Miroshnik, and S. B. Soumerai. 2005. Group versus individual academic detailing to improve the use of antihypertensive medications in primary care: A cluster-randomized controlled trial. American Journal of Medicine 118(5):521-528.

Compare group vs. individual academic detailing to increase diuretic and β-blocker use in hypertension

Sample: 9,820 patients with newly treated hypertension in a large health maintenance organization

Method: Cluster randomized controlled trial

  • Experimental groups: practices received group detailing; individuals received one-on-one academic detailing

  • Control group: no intervention

Outcome measures: Rates of diuretic or β-blocker use 1 and 2 years post-intervention; average per-patient cost of antihypertensive medications; rates of hospitalization; per-patient cost of the intervention

Duration: 3 years

Solomon, D. H., L. Van Houten, R. J. Glynn, L. Baden, K. Curtis, H. Schrager, and J. Avorn. 2001. Academic detailing to improve use of broad-spectrum antibiotics at an academic medical center. Archives of Internal Medicine 161(15):1897-1902.

Test the efficacy of academic detailing designed to improve the appropriateness of broad-spectrum antibiotic use

Sample: 51 interns and residents in 17 general medicine, oncology, and cardiology services at a teaching hospital

Method: Randomized controlled trial

  • Experimental group: academic detailing

  • Control group: no intervention

Outcome measures: Number of days that unnecessary levofloxacin or ceftazidime was administered; rate of unnecessary use of levofloxacin or ceftazidime

Duration: 18 weeks


Description of Educational Method

Findings

* Individual academic detailing entailed a single meeting of a physician-educator with a clinician to address barriers to implementing guidelines

* Small-group academic detailing with physician “idea champions”

* After 1 year, both individual and group academic detailing improved prescribing compliance by 13% over usual care

* By the second year following the interventions, effects had decayed

* Group detailing intervention ($3,500) cost less than individual detailing ($5,000); these intervention costs were of similar magnitude to the medication cost savings

* Peer leaders were trained in academic detailing through practice sessions using role play

* Academic detailing targeted to interns and residents who wrote an unnecessary order

* Length of stay, intensive care unit transfers, readmission rates, and in-hospital death rates were similar in both groups

* 37% reduction in days of unnecessary antibiotic use (p < 0.001)

* Rate of unnecessary use of the 2 target antibiotics reduced by 41% (95% CI: 44-78%, p < 0.001)


Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Van Eijk, M. E. C., J. Avorn, A. J. Porsius, and A. De Boer. 2001. Reducing prescribing of highly anticholinergic antidepressants for elderly people: Randomised trial of group versus individual academic detailing. British Medical Journal 322(7287):654-657.

Compare effect of individual vs. group academic detailing on prescribing of highly anticholinergic antidepressants in elderly people

Sample: 190 GPs and 37 pharmacists in 21 peer-review groups in the Netherlands

Method: Randomized controlled trial

  • Experimental groups: individual academic detailing; group academic detailing

  • Control group: no intervention

Outcome measure: Incidence rates calculated as the number of elderly people with new prescriptions of highly anticholinergic antidepressants

Duration: NA

Wong, R. Y., and P. E. Lee. 2004. Teaching physicians geriatric principles: A randomized control trial on academic detailing plus printed materials versus printed materials only. Journals of Gerontology Series A-Biological Sciences & Medical Sciences 59(10):1036-1040.

Compare the effectiveness of academic detailing with printed materials on promoting geriatric knowledge among physicians

Sample: 19 post-graduate trainees (residents and fellows) in British Columbia, Canada

Method: Randomized controlled trial

  • Experimental group: printed materials and academic detailing

  • Control group: printed materials

Outcome measures: Scores on pre- and post-multiple-choice tests

Duration: 12 months


Description of Educational Method

Findings

* A peer educator met individually with GPs to discuss guidelines and prescribing patterns from the past year

* Group academic detailing sessions were similar to the individual sessions and included group and individual performance data

* Individual and group academic detailing improved the clinical appropriateness of prescribing behavior

* Patients in both groups more likely to receive drugs that were less anticholinergic

15-minute face-to-face educational outreach with a specialist in geriatric medicine

Academic detailing plus printed educational materials demonstrated a trend toward increased knowledge retention (1.1 ± 1.3) compared with printed materials alone (0.0 ± 1.1, p = 0.053)


Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Simulation

Crofts, J. F., C. Bartlett, D. Ellis, L. P. Hunt, R. Fox, and T. J. Draycott. 2006. Training for shoulder dystocia: A trial of simulation using low-fidelity and high-fidelity mannequins. Obstetrics and Gynecology 108(6):1477-1485.

* Evaluate the effectiveness of simulation training for shoulder dystocia management

* Compare training using a high-fidelity mannequin with training using a traditional mannequin

Sample: 45 physicians and 95 midwives

Method: Randomized controlled trial

  • Experimental group: training with high-fidelity mannequins

  • Control group: training with traditional, low-fidelity mannequins

Outcome measures: Pre- and post-training delivery, head-to-body delivery time, use of appropriate actions, force applied, and communication

Duration: NA

Gerson, L. B., and J. Van Dam. 2003. A prospective randomized trial comparing a virtual reality simulator to bedside teaching for training in sigmoidoscopy. Endoscopy 35(7):569-575.

Compare the exclusive use of a virtual reality endoscopy simulator with bedside teaching for training in sigmoidoscopy

Sample: 16 internal medicine residents at an academic medical center

Method: Prospective randomized controlled trial

  • Experimental group: training using a virtual reality simulator

  • Control group: bedside teaching

Outcome measures: Score on 5 endoscopic evaluations based on procedure duration, completion, ability to perform retroflexion, and level of patient comfort or discomfort

Duration: 10 months


Description of Educational Method

Findings

* High-fidelity mannequin training incorporated force perception and occurred at a simulation center

* Low-fidelity mannequin training occurred at local hospitals

* Both high- and low-fidelity simulation were associated with improved successful deliveries pre- and post-training (42.9% vs. 83.3%, p < 0.001)

* Training with high-fidelity mannequins was associated with a higher successful delivery rate than the control (94% vs. 72%; OR: 6.53; 95% CI: 2.05-20.81; p = 0.02)

Residents had unlimited use of a virtual reality simulator that included

  • Didactic modules and practice cases

  • Virtual patients that complained when appropriate

  • Critique provided by simulator

  • No bedside teaching

* Simulator group had more difficulty with initial endoscope insertion and endoscope negotiation than control group residents

* Simulator group less likely to be able to perform retroflexion (mean score = 2.9) than the control group residents (mean score = 3.8) (p < 0.001)


Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Gordon, D. L., S. B. Issenberg, M. S. Gordon, D. Lacombe, W. C. McGaghie, and E. R. Petrusa. 2005. Stroke training of prehospital providers: An example of simulation-enhanced blended learning and evaluation. Medical Teacher 27(2):114-121.

Assess the effectiveness of a stroke course that incorporates didactic lectures, tabletop exercises, small-group sessions, and standardized patients (a type of simulation used to develop communication, interpersonal, and psychomotor skills)

Sample: 73 pre-hospital paraprofessionals participating in a stroke class

Method: Cohort study with a pre- and post-intervention design

Outcome measures: Scores on a pre- and post-multiple choice test; scores on 4 case scenarios as determined by clinician raters

Duration: 9 months

Grantcharov, T. P., V. B. Kristiansen, J. Bendix, L. Bardram, J. Rosenberg, and P. Funch-Jensen. 2004. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. British Journal of Surgery 91(2):146-150.

Examine the impact of virtual reality simulation on improvement of psychomotor skills relevant to the performance of laparoscopic cholecystectomy

Sample: 16 surgical trainees

Method: Randomized controlled trial

  • Experimental group: virtual reality training

  • Control group: no training

Outcome measures: Baseline and post-intervention time to complete the procedure, error score, and economy-of-movement score

Duration: 2 years

Quinn, F., P. Keogh, A. McDonald, and D. Hussey. 2003. A study comparing the effectiveness of conventional training and virtual reality simulation in the skills acquisition of junior dental students. European Journal of Dental Education: Official Journal of the Association for Dental Education in Europe 7(4):164-169.

Measure the effectiveness of exclusive use of a virtual reality simulator in the training of operative dentistry

Sample: 20 second-year dental undergraduate students in Dublin, Ireland

Method: Randomized controlled trial

  • Experimental group: trained solely by virtual reality

  • Control group: conventional training using a combination of virtual reality and clinical instruction

Outcome measures: Assessment of 2 class-1 cavities

Duration: NA

Description of Educational Method

Findings

Participants evaluated 2 standardized patients before the stroke course and 2 different standardized patients after the stroke course

Mean scores on case scenarios improved significantly, from 53.9% on the pre-test to 85.4% on the post-test (p < 0.0001)

Experimental group participated in 10 repetitions of each of 6 tasks on a virtual reality surgical simulator

* Experimental group performed laparoscopic surgery significantly faster than control group (p = 0.021)

* Experimental group showed significantly greater improvement in economy-of-movement scores (p = 0.003)

* Both groups carried out procedures on virtual reality-based training units

* The control group received feedback and evaluation from a clinical instructor

* The experimental group received real-time feedback and software evaluation from the virtual reality simulator

* The group trained exclusively on the virtual reality simulator scored worse on the cavity assessment

* 84% of participants did not believe exclusive virtual reality training could replace conventional training

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Schwid, H. A., G. A. Rooke, P. Michalowski, and B. K. Ross. 2001. Screen-based anesthesia simulation with debriefing improves performance in a mannequin-based anesthesia simulator. Teaching & Learning in Medicine 13(2):92-96.

Measure the effectiveness of screen-based simulator training with debriefing on the response to simulated anesthetic critical incidents

Sample: 21 first-year clinical anesthesia residents

Method: Randomized controlled trial

  • Experimental group: screen-based simulator

  • Control group: traditional handout

Outcome measures: Quantitative scoring on residents’ management of 4 standardized scenarios in a mannequin-based simulator

Duration: 2 years

Triola, M., H. Feldman, A. L. Kalet, S. Zabar, E. K. Kachur, C. Gillespie, M. Anderson, C. Griesser, and M. Lipkin. 2006. A randomized trial of teaching clinical skills using virtual and live standardized patients. Journal of General Internal Medicine 21(5):424-429.

Assess the educational effectiveness of computer-based virtual patients compared to standardized patients

Sample: 55 health care providers (RNs and physicians)

Method: Randomized controlled trial

  • Experimental group: training using 2 live, standardized patients and 2 virtual (web-based) cases

  • Control group: training using 4 live, standardized patients

Outcome measures: Knowledge and diagnostic scores assessed through clinical vignettes

Duration: 1 day

Description of Educational Method

Findings

* The simulator used a graphical interface and an automated record system to produce a detailed record of the simulated case

* The program included learning objectives and diagnostic and treatment suggestions

Residents who managed anesthetic problems using a screen-based simulator handled emergencies in a mannequin-based simulator (52.6 ± 9.9) better than residents who studied a handout (43.4 ± 5.9, p = 0.004)

* Virtual (web-based) standardized cases were conducted individually at a computer

* Live, standardized patient cases were faculty-facilitated, small-group sessions

* Experimental and control groups did not differ significantly in preparedness to respond (p = 0.61), to screen (p = 0.79), or to care (p = 0.055) for patients

* Improvement in diagnostic abilities was equivalent in both groups (p = 0.054)

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Reminders

Cannon, D. S., and S. N. Allen. 2000. A comparison of the effects of computer and manual reminders on compliance with a mental health clinical practice guideline. Journal of the American Medical Informatics Association 7(2):196-203.

Evaluate the relative effectiveness of computer and manual reminder systems on the implementation of clinical practice guidelines

Sample: 78 outpatients and 4 senior clinicians at an urban VA Medical Center

Method: Randomized controlled trial

  • Experimental group: computer reminder system

  • Control group: paper checklists

Outcome measures: Screening rates for mood disorder; completeness of the documentation of diagnostic criteria for patients with a major depressive disorder

Duration: 9 months

Chen, P., M. J. Tanasijevic, R. A. Schoenenberger, J. Fiskio, G. J. Kuperman, and D. W. Bates. 2003. A computer-based intervention for improving the appropriateness of antiepileptic drug level monitoring. American Journal of Clinical Pathology 119(3):432-438.

* Evaluate an automated, activity-based reminder designed to reduce inappropriate ordering behavior

* Determine the long-term benefit of continuous implementation of the reminder system

Sample: 1,646 serum antiepileptic drug (AED) test orders placed at a teaching hospital

Method: 2-phase randomized controlled trial

  • Phase 1: Experimental group: reminders; Control group: no intervention

  • Phase 2: After 3 months, all physicians received reminders

Outcome measures: Total number of AED orders; proportion of inappropriate orders; proportion of redundant orders

Duration: 4 years

Description of Educational Method

Findings

* The CaseWalker computer reminder system generated reminders to screen patients for mood disorders

* The CaseWalker system presented and scored diagnostic criteria for major depressive disorders and created progress notes

* Computerized reminders, compared with the paper checklist, resulted in a higher screening rate for mood disorder (86.5% vs. 61%, p = 0.008)

* Computerized reminders resulted in a higher rate of complete documentation of diagnostic criteria (100% vs. 5.6%, p < 0.001)
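
The diagnostic-criteria scoring performed by the CaseWalker system can be illustrated with a minimal, hypothetical sketch. The symptom names, the five-symptom threshold, and the core-symptom requirement follow the general DSM-style pattern for major depressive disorder; they are assumptions for illustration, not the actual CaseWalker algorithm.

```python
# Hypothetical sketch of DSM-style criterion scoring for major depressive
# disorder; not the actual CaseWalker algorithm.

CORE_SYMPTOMS = {"depressed mood", "loss of interest"}

def meets_mdd_criteria(endorsed_symptoms, threshold=5):
    """Return True if enough symptoms are endorsed, at least one of them core."""
    endorsed = set(endorsed_symptoms)
    return len(endorsed) >= threshold and bool(endorsed & CORE_SYMPTOMS)
```

A system of this kind could then generate a screening reminder for unscreened patients and assemble a progress note from the scored criteria.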

Educational messages reminded physicians of clinical guidelines when test orders may have been inappropriate or redundant

* During the 3-month period after implementation, 13% of ordered tests were canceled following computerized reminders; orders that appeared redundant had a 27% cancellation rate

* The cancellation rate was sustained after 4 years

* 19.5% decrease in AED testing volume despite a 19.3% increase in overall chemistry test volume

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Demakis, J. G., C. Beauchamp, W. L. Cull, R. Denwood, S. A. Eisen, R. Lofgren, K. Nichol, J. Woolliscroft, and W. G. Henderson. 2000. Improving residents’ compliance with standards of ambulatory care: Results from the VA cooperative study on computerized reminders. Journal of the American Medical Association 284(11):1411-1416.

Examine whether a computerized reminder system operating in multiple VA ambulatory care clinics improves resident physician compliance with standards of ambulatory care

Sample: 275 resident physicians caring for 12,989 patients at 12 VA medical centers

Method: Clinical trial

  • Experimental group: reminders

  • Control group: no intervention

Outcome measures: Compliance with 13 standards of care, tracked using hospital databases and encounter forms

Duration: 17 months

Dexter, P. R., S. Perkins, J. Marc Overhage, K. Maharry, R. B. Kohler, and C. J. McDonald. 2001. A computerized reminder system to increase the use of preventive care for hospitalized patients. New England Journal of Medicine 345(13):965-970.

Determine the effects of computerized reminders on the rates at which 4 preventive therapies were ordered for inpatients

Sample: 8 independent staff teams on the general medicine ward and 6,371 patients at an urban hospital

Method: Randomized controlled trial

  • Experimental group: reminders

  • Control group: no intervention

Outcome measures: Ordering rates for pneumococcal vaccination, influenza vaccination, prophylactic heparin, and prophylactic aspirin

Duration: 18 months

Description of Educational Method

Findings

* All residents attended a 1-hour session to discuss standards of care

* Residents in the experimental group had a training session to introduce them to the reminder system

* Experimental group had statistically significant higher rates of compliance than the control group for all care standards combined (58.8% vs. 53.5%; OR = 1.24; 95% CI)

* Percentage of compliance in the experimental group declined over the course of the study, even though the reminders remained active

* Computer-based order-entry work stations provided clinical decision support through rule-based reminders

* Physicians could accept or reject the reminders

Computerized reminders resulted in higher adjusted ordering rates for

  • Pneumococcal vaccination (35.8% vs. 0.8%, p < 0.001)

  • Influenza vaccination (51.4% vs. 1.0%, p < 0.001)

  • Prophylactic heparin (32.3% vs. 18.9%, p < 0.001)

  • Prophylactic aspirin (36.4% vs. 27.6%, p < 0.001)
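
The rule-based reminder logic described for this system can be sketched as a set of eligibility checks run at order entry. All patient fields and rule conditions below are illustrative assumptions, not the study's actual rules.

```python
# Hypothetical sketch of rule-based preventive-care reminders at order entry.
# Patient fields and rule conditions are illustrative, not the study's rules.

def preventive_care_reminders(patient):
    """Return reminder messages for therapies the patient appears eligible for."""
    rules = [
        ("pneumococcal vaccination",
         lambda p: p["age"] >= 65 and not p["pneumococcal_vaccinated"]),
        ("influenza vaccination",
         lambda p: p["flu_season"] and not p["influenza_vaccinated"]),
        ("prophylactic heparin",
         lambda p: p["bed_rest"] and not p["on_anticoagulant"]),
        ("prophylactic aspirin",
         lambda p: p["coronary_artery_disease"] and not p["on_aspirin"]),
    ]
    return [f"Consider {therapy}" for therapy, rule in rules if rule(patient)]

patient = {
    "age": 72, "pneumococcal_vaccinated": False,
    "flu_season": True, "influenza_vaccinated": True,
    "bed_rest": True, "on_anticoagulant": True,
    "coronary_artery_disease": True, "on_aspirin": False,
}
print(preventive_care_reminders(patient))
```

As in the study, each suggestion would only be presented to the physician, who could accept or reject it.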

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Dexter, P. R., F. D. Wolinsky, G. P. Gramelspacher, X. H. Zhou, G. J. Eckert, M. Waisburd, and W. M. Tierney. 1998. Effectiveness of computer-generated reminders for increasing discussions about advance directives and completion of advance directive forms: A randomized, controlled trial. Annals of Internal Medicine 128(2):102-110.

* Determine the effects of computer-generated reminders to physicians on the frequency of advance directive discussions between patients and their primary caregivers

* Determine the effects of computer-generated reminders to physicians on consequent establishment of advance directives

Sample: 1,009 patients and 147 primary care physicians at an outpatient general medicine practice

Method: Randomized controlled trial

  • Experimental group: computerized reminders

  • Control group: no intervention

Outcome measures: Discussion about advance directives determined by patient interview; completed advance directive forms

Duration: 9 months

Gill, J. M., and A. M. Saldarriaga. 2000. The impact of a computerized physician reminder and a mailed patient reminder on influenza immunizations for older patients. Delaware Medical Journal 72(10):425-430.

Examine the impact of a computer physician reminder in combination with a mailed patient reminder on the rate of influenza vaccinations for older adults

Sample: 344 patients 65 years and older in a large family medicine office

Method: Retrospective cohort study

Outcome measures: Rates of receipt of influenza immunization compared to the year before and after the interventions were implemented

Duration: 2 years

Description of Educational Method

Findings

* Advance directive forms were placed in the offices of all participating physicians

* Physician-investigators presented the concepts of advance directives at grand rounds and in face-to-face meetings with all physicians

* Experimental group physicians received reminders regarding advance directive discussions

* Physicians who received reminders discussed advance directives with more patients (24%) than control group physicians (4%) (OR = 7.7, 95% CI: 3.4-18, p < 0.001)

* Experimental group physicians completed advance directives with 15% of patients, compared with 4% in the control group (OR = 7.0, 95% CI: 2.9-17, p < 0.001)

* An electronic patient record system generated automatic reminders to the physician if the immunization had not been completed

* A mailed patient reminder was sent to encourage patients to schedule appointments for the immunization

Influenza immunization rates increased from 50.4% before the interventions to 61.6% after the interventions (p < 0.001)

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Hung, C. S., J. W. Lin, J. J. Hwang, R. Y. Tsai, and A. T. Li. 2008. Using paper chart based clinical reminders to improve guideline adherence to lipid management. Journal of Evaluation in Clinical Practice 14(5):861-866.

Apply a paper-based clinical reminder to improve the adherence to lipid guidelines

Sample: 198 patients with coronary heart diseases at a university hospital in Taiwan

Method: Randomized controlled trial

  • Experimental group: clinical reminder stamped on the paper chart

  • Control group: no intervention

Outcome measures: New lipid-lowering therapy prescriptions; composite result of lipid-lowering therapy or lipid profile checkup

Duration: 6 months

Iliadis, E. A., L. W. Klein, B. J. Vandenberg, D. Spokas, T. Hursey, J. E. Parrillo, and J. E. Calvin. 1999. Clinical practice guidelines in unstable angina improve clinical outcomes by assuring early intensive medical treatment. Journal of the American College of Cardiology 34(6):1689-1695.

* Determine the influence of clinical practice guidelines on treatment patterns and clinical outcomes in unstable angina

* Determine the effectiveness of guideline reminders on implementing practice guidelines

Sample: 519 patients with unstable angina at an academic medical center

Method: Interrupted time-series design

  • Experimental group: admitted after institution of guideline reminders

  • Control group: admitted before publication of guidelines

Outcome measures: Pharmaceutical treatments rendered; diagnostic or therapeutic procedures performed; major cardiac complications

Duration: 3.5 years

Description of Educational Method

Findings

* In the experimental group, a reminder was stamped in each medical chart

* The reminder indicated the current policy of statin reimbursement

* No difference between groups at the end of 6 months in new lipid-lowering therapy prescriptions (OR = 1.70, p = 0.248, 95% CI: 0.69-4.19)

* Composite result of lipid-lowering therapy or lipid profile checkup was significantly higher in the experimental group (OR = 2.81, p = 0.001, 95% CI: 1.57-5.04)

Dissemination of guidelines was ensured by a grand rounds lecture and by posting guideline reminders on all of the experimental group’s charts

* Experimental group patients received β-blockers (p = 0.008), aspirin, and coronary angiography (p = 0.001) earlier than control group patients

* Experimental group patients experienced recurrent angina (29% vs. 54%) and myocardial infarction or death (3% vs. 9%, p = 0.028) less frequently than control group patients

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Kitahata, M. M., P. W. Dillingham, N. Chaiyakunapruk, S. E. Buskin, J. L. Jones, R. D. Harrington, T. M. Hooton, and K. K. Holmes. 2003. Electronic human immunodeficiency virus (HIV) clinical reminder system improves adherence to practice guidelines among the University of Washington HIV study cohort. Clinical Infectious Diseases 36(6):803-811.

Examine adherence to HIV practice guidelines before and after implementation of an electronic clinical reminder system

Sample: 1,204 HIV-infected patients and 41 clinicians (physicians, nurse practitioners, and physician assistants) at an HIV clinic in an academic medical center

Method: Prospective before-and-after study

Outcome measures: Proportion of patients in care who undergo (1) monitoring of CD4 cell count, (2) HIV-1 RNA level, (3) prophylaxis for pneumocystis pneumonia, (4) MAC prophylaxis, (5) tuberculin skin testing, (6) cervical Pap smears, and (7) serological screening

Duration: 5 years

Koide, D., K. Ohe, D. Ross-Degnan, and S. Kaihara. 2000. Computerized reminders to monitor liver function to improve the use of etretinate. International Journal of Medical Informatics 57(1):11-19.

Determine whether computerized reminders during the process of prescribing can improve the use of drugs requiring prior laboratory testing

Sample: 1,024 prescriptions prescribed for 111 patients at a teaching hospital in Tokyo, Japan

Method: Interrupted time-series design to compare a pre-intervention period and a post-intervention period

Outcome measures: Change in proportion of appropriate prescribing; frequency of severe hepatotoxicity between pre- and post-intervention

Duration: 2 years

Description of Educational Method

Findings

An HIV disease-specific electronic medical record (EMR) enhancement provided clinicians with access to patient-specific information and a clinical reminder system

* More than 90% of patients received CD4 cell count and HIV-1 RNA level monitoring both before and after the intervention

* Patients were significantly more likely to receive prophylaxis (hazard ratio = 3.84; 95% CI, 1.58-9.31; p = 0.03), to undergo cervical cancer screening (OR = 2.09; 95% CI, 1.04-4.16; p = 0.04), and to undergo serological screening (OR = 1.86; 95% CI, 1.05-3.27; p = 0.03) after the reminders were implemented

* The computer alerted physicians when they submitted prescriptions that appeared inappropriate

* The physician could choose to proceed despite the alert or to cancel the prescription

* Appropriate prescriptions increased from 25.9% (127/491) in the pre-intervention period to 66.2% (353/533) in the post-intervention period (p < 0.0001)
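
The prescribing-time check described above can be sketched as follows; the 30-day testing window and the field names are illustrative assumptions, not the study's actual criteria.

```python
# Hypothetical sketch of a prescribing-time alert for a drug that requires a
# recent liver function test (LFT). The 30-day window is an assumed rule.
from datetime import date, timedelta

def check_prescription(drug, last_lft_date, today, max_age_days=30):
    """Return None if prescribing appears appropriate, else an alert message."""
    if drug != "etretinate":
        return None  # only etretinate requires the prior lab test here
    if last_lft_date is None or today - last_lft_date > timedelta(days=max_age_days):
        return "Alert: no recent liver function test on record for etretinate"
    return None

alert = check_prescription("etretinate", date(2000, 1, 2), date(2000, 3, 1))
```

As in the study, the physician would see the alert and could proceed anyway or cancel the prescription.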

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Morgan, M. M., J. Goodson, and G. O. Barnett. 1998. Long-term changes in compliance with clinical guidelines through computer-based reminders. Proceedings of the American Medical Informatics Association Annual Fall Symposium 493-497.

* Evaluate the effectiveness of computer-based reminders in improving compliance with preventive medicine screening guidelines

* Examine the long-term impact of these reminders

Sample: 24,200 patients and 20 primary care physicians

Method: Ecologic study with a 12-month period prior to introduction of reminders, a 12-month period after the reminders were in place, and 5 years later

Outcome measures: Changes in compliance rates for preventive screenings

Duration: 6 years

Nilasena, D. S., and M. J. Lincoln. 1995. A computer-generated reminder system improves physician compliance with diabetes preventive care guidelines. Proceedings of the Annual Symposium on Computer Applications in Medical Care 640-645.

Evaluate the use of computerized reminders for preventive care in diabetes

Sample: 35 internal medicine residents

Method: Randomized controlled trial

  • Experimental group: detailed patient-specific reports and encounter forms

  • Control group: blank encounter forms

Outcome measure: Average compliance score of all patients seen by a resident (compliance score based on the number of items completed in accordance with the guidelines divided by the total number of items recommended for the patient)

Duration: 6 months
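
The compliance score defined above (items completed in accordance with the guidelines divided by items recommended, averaged across a resident's patients) can be expressed directly; the guideline item names below are illustrative.

```python
# Sketch of the study's compliance score; guideline item names are illustrative.

def compliance_score(completed, recommended):
    """Fraction of recommended guideline items completed for one patient."""
    return len(set(completed) & set(recommended)) / len(recommended)

def average_compliance(patients):
    """Mean compliance score across all patients seen by a resident."""
    scores = [compliance_score(p["completed"], p["recommended"]) for p in patients]
    return sum(scores) / len(scores)

patients = [
    {"recommended": ["HbA1c", "foot exam", "eye exam", "lipid panel"],
     "completed": ["HbA1c", "foot exam"]},
    {"recommended": ["HbA1c", "eye exam"],
     "completed": ["HbA1c", "eye exam"]},
]
print(average_compliance(patients))  # (0.5 + 1.0) / 2 = 0.75
```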

Description of Educational Method

Findings

* Physicians were given a health maintenance report of preventive screening items at each patient visit

* EMR system was programmed to integrate 13 clinical guidelines

* Mean performance on 10 out of 13 health maintenance measures improved in the year following the integrated guideline report

* 5 years after introduction, improvement in mean performance persisted on 7 out of 13 measures and compliance improved for 1 additional measure

* Diabetes guidelines and encounter forms were incorporated in a computer program that served as a longitudinal patient database for storing clinical information

* The computer program output a health maintenance report for the physician, which was placed on the patient’s chart

* Clinical alerts about high-risk aspects of the patient’s profile were presented

Compliance with recommended care significantly improved in both the experimental group (38% at baseline, 54.9% at follow-up) and the control group (34.6% at baseline, 51% at follow-up)

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Rhew, D. C., P. A. Glassman, and M. B. Goetz. 1999. Improving pneumococcal vaccine rates. Nurse protocols versus clinical reminders. Journal of General Internal Medicine 14(6):351-356.

Compare the effectiveness of 3 interventions designed to improve the pneumococcal vaccination rate by nurses

Sample: 3,502 outpatients and 3 nursing teams at a VA ambulatory care clinic

Method: Prospective controlled trial

  • Experimental groups: comparative feedback and clinician reminders (Team A); compliance reminders and clinician reminders (Team B)

  • Control group: clinical reminders

Outcome measure: Vaccination rates

Duration: 12 weeks

Sarasin, F. P., M. L. Maschiangelo, M. D. Schaller, C. Heliot, S. Mischler, and J. M. Gaspoz. 1999. Successful implementation of guidelines for encouraging the use of beta blockers in patients after acute myocardial infarction. American Journal of Medicine 106(5):499-505.

Assess whether implementation of guidelines increases the prescription of β-blockers recommended for secondary prevention after acute myocardial infarction

Sample: 355 patients discharged after recovery from myocardial infarction from a teaching hospital in Geneva, Switzerland

Method: Ecologic study with 12-month control period and a 6-month guideline implementation period; a neighboring public teaching hospital was used as a comparison

Outcome measures: Prescription patterns for nitrates, β-blockers, combined β-blockers and angiotensin-converting enzyme (ACE) inhibitors, and ACE inhibitors alone; physician attitude survey

Duration: 18 months

Description of Educational Method

Findings

* Team A nurses received comparative feedback information on their vaccine rates relative to those of Team B nurses

* Team B nurses received reminders to vaccinate but no information on vaccination rates

* Nurses in all groups received clinician reminders

Vaccination rates for comparative feedback group and compliance reminder group were significantly higher than the 5% vaccination rate for the control group (p < 0.001)

* Short advisory statements regarding drug therapies were presented and distributed to all internal medicine and cardiology physicians

* Adherence was encouraged during large group meetings

* Guidelines were placed in the charts of all patients diagnosed with acute myocardial infarction

Implementation of guidelines significantly associated with prescription of β-blockers at discharge (OR = 10; 95% CI: 3.2-33; p < 0.001)

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Tang, P. C., M. P. Larosa, C. Newcomb, and S. M. Gorden. 1999. Measuring the effects of reminders for outpatient influenza immunizations at the point of clinical opportunity. Journal of the American Medical Informatics Association 6(2):115-121.

Evaluate the influence of computer-based reminders about influenza vaccination on the behavior of individual clinicians at each clinical opportunity

Sample: 23 physicians and 629 patients at an internal medicine clinic at an academic medical center

Method: Cohort study

  • Experimental group: computer-based patient record system that generated reminders

  • Control group: traditional paper records

Outcome measures: Compliance with a guideline for influenza vaccination behavior for eligible patients as evidenced by ordering of the vaccine, patient counseling, or verification that the patient had received the vaccine elsewhere

Duration: 4 years

Walker, N. M., K. L. Mandell, and J. Tsevat. 1999. Use of chart reminders for physicians to promote discussion of advance directives in patients with AIDS. AIDS Care 11(3):345-353.

Determine whether use of a physician chart reminder improves the rate of physician-initiated discussion and subsequent completion of advance directives in patients with AIDS

Sample: 74 patients with AIDS and 10 primary care physicians at a university-based hospital clinic

Method: Controlled trial

  • Experimental group: chart reminders

  • Control group: no intervention

Outcome measures: Rate of documentation of discussion of advance directives and rate of completion of an advance directive

Duration: 6 months

Description of Educational Method

Findings

Rule-based clinical reminders appeared on the electronic chart of a patient eligible for a recommended intervention

Compliance rates for computer-based record users increased 78% from baseline (p < 0.001) whereas rates for paper record users did not change significantly (p = 0.18)

Chart reminders were placed on medical records of experimental group patients at each clinic visit

* 12 of 39 (31%) experimental group patients and 3 of 35 (9%, p = 0.02) control group patients discussed advance directives with their physicians

* More patients in the experimental group completed advance directives (28% vs. 9%, p = 0.03)

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Weingarten, S. R., M. S. Riedinger, L. Conner, T. H. Lee, I. Hoffman, B. Johnson, and A. G. Ellrodt. 1994. Practice guidelines and reminders to reduce duration of hospital stay for patients with chest pain: An interventional trial. Annals of Internal Medicine 120(4):257-263.

Evaluate the acceptability, safety, and efficacy of practice guidelines for patients admitted to coronary care and intermediate care units

Sample: 375 patients with chest pain and 155 primary physicians at an academic medical center

Method: Prospective, controlled clinical trial

  • Experimental group: guideline reminders

  • Control group: no intervention

Outcome measures: Patient instability at discharge; patient survival, hospital readmission, and other problems 1-month post-discharge; patient health perceptions; patient rating of the quality of information received at discharge; total costs (direct and indirect)

Duration: 12 months

Protocols and Guidelines

Dexter, P. R., S. M. Perkins, K. S. Maharry, K. Jones, and C. J. McDonald. 2004. Inpatient computer-based standing orders vs. physician reminders to increase influenza and pneumococcal vaccination rates: A randomized trial. Journal of the American Medical Association 292(19):2366-2371.

Determine the effects of computerized physician standing orders compared with physician reminders on inpatient vaccination rates

Sample: 3,777 general medicine patients discharged during a 14-month period from an urban teaching hospital

Method: Randomized controlled trial

  • Experimental group: reminder team

  • Control group: standing-order team

Outcome measures: Vaccine administration

Duration: 14 months

Description of Educational Method

Findings

Physicians received concurrent, personalized written and verbal reminders regarding a guideline that recommended a 2-day hospital stay for patients with chest pain who were at low risk for complications

* Use of practice guidelines with concurrent reminders was associated with a 50-69% increase in guideline compliance (p < 0.001) and a decrease in length of stay from 3.54 ± 4.1 to 2.63 ± 3.0 days (95% CI)

* Intervention associated with a total cost reduction of $1,397 per patient (CI: $176-$2,618; p = 0.03)

* No significant difference found in complication rates, patient health status, or patient satisfaction

* For eligible patients in the standing order group, a computer system automatically produced a vaccine order at the time of discharge; nurses were authorized to administer vaccines in response to standing orders

* For eligible patients in the reminder group, a computer system produced a pop-up message with orders each time a physician began a daily order entry session

* Patients with standing orders received an influenza vaccine significantly more often (42%) than those with reminders (30%) (p < 0.001)

* Patients with standing orders received a pneumococcal vaccine significantly more often (51%) than those with reminders (31%) (p < 0.001)
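
The difference between the two arms can be sketched as follows; the field names and messages are illustrative assumptions, not the study's actual system.

```python
# Hypothetical sketch contrasting the two arms: a standing order creates the
# vaccine order automatically at discharge, while a reminder only surfaces a
# prompt that a physician must act on. Field names are illustrative.

def process_discharge(patient, arm):
    """Return the pending items generated for a patient at discharge."""
    if not patient["vaccine_eligible"]:
        return []
    if arm == "standing-order":
        # Order exists immediately; a nurse is authorized to administer it.
        return [{"item": "influenza vaccine order",
                 "requires_physician_action": False}]
    if arm == "reminder":
        # Only a prompt is shown; no order exists until a physician enters one.
        return [{"item": "reminder: consider influenza vaccine",
                 "requires_physician_action": True}]
    return []
```

Removing the physician-action step is the design difference to which the study attributes the higher vaccination rates in the standing-order arm.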

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Fakhry, S. M., A. L. Trask, M. A. Waller, and D. D. Watts. 2004. Management of brain-injured patients by an evidence-based medicine protocol improves outcomes and decreases hospital charges. Journal of Trauma 56(3):492-499.

Determine whether management of traumatic brain injury (TBI) patients according to established guidelines would reduce mortality, length of stay, charges, and disability

Sample: 830 patients with TBI

Method: Time trend analysis

  • Experimental groups: period of low guideline compliance; period of high guideline compliance

  • Control group: pre-guideline period

Outcome measures: Mortality; intensive care unit days; total hospital days; total charges; Rancho Los Amigos Scores; Glasgow Outcome Scale scores

Duration: 9 years

Audit and Feedback

Lobach, D. F. 1996. Electronically distributed, computer-generated, individualized feedback enhances the use of a computerized practice guideline. Proceedings of the American Medical Informatics Association Annual Fall Symposium 493-497.

Test the hypothesis that computer-generated, individualized feedback regarding adherence to care guidelines will significantly improve clinician compliance with guideline recommendations

Sample: 45 primary care clinicians at a clinic affiliated with an academic medical center

Method: Randomized controlled trial

  • Experimental group: biweekly e-mail with feedback on guideline compliance

  • Control group: no intervention

Outcome measures: Compliance with guideline recommendations for diabetic patients

Duration: 12 weeks

Description of Educational Method

Findings

* Standard orders were developed based on established guidelines

* Guidelines were implemented by trauma service team leaders

* From the pre-guideline period to the period of high compliance, ICU stay was reduced by 1.8 days (p = 0.021) and hospital stay by 5.4 days (p < 0.001)

* Overall mortality rate was reduced from pre-guideline period (17.8%) to period of high compliance (13.8%), but the result was not statistically significant (p > 0.05)

* On Glasgow Outcome Scale score, 61.5% of patients in high compliance period had a “good recovery” or “moderate disability” compared with 43.3% in pre-guideline period (p < 0.001)

* The study site used a computer-based patient record that runs a computer-assisted management protocol, which incorporates guidelines for diabetes mellitus on paper encounter forms

* E-mail was used to transmit clinical information

Experimental group had significantly higher guideline compliance (35%) than control group (6.1%) (p < 0.01)
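Individualized feedback of the kind Lobach studied can be generated mechanically from audit data. This is a sketch under assumed inputs — the clinician names and diabetes-care recommendations are invented — that tallies each clinician's guideline-compliance rate and formats a feedback line suitable for an e-mail report:

```python
from collections import defaultdict

# Each record: (clinician, recommendation, followed?) — hypothetical audit data
audit = [
    ("Dr. A", "HbA1c test ordered", True),
    ("Dr. A", "foot exam documented", False),
    ("Dr. A", "eye exam referral", True),
    ("Dr. B", "HbA1c test ordered", False),
    ("Dr. B", "foot exam documented", False),
]

def compliance_report(records):
    """Return {clinician: feedback string} with percent compliance."""
    tally = defaultdict(lambda: [0, 0])  # clinician -> [followed, total]
    for clinician, _, followed in records:
        tally[clinician][1] += 1
        tally[clinician][0] += int(followed)
    return {
        c: f"{c}: {100 * done / total:.0f}% of guideline recommendations followed"
        for c, (done, total) in tally.items()
    }

for line in compliance_report(audit).values():
    print(line)
```

In the study itself, messages like these were e-mailed to clinicians biweekly; the point of the sketch is only that the feedback content is a straightforward aggregation over audited encounters.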

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Multifaceted Interventions

Baker, R., A. Farooqi, C. Tait, and S. Walsh. 1997. Randomised controlled trial of reminders to enhance the impact of audit in general practice on management of patients who use benzodiazepines. Quality in Health Care 6(1):14-18.

Determine whether reminder cards in medical records enhance the effectiveness of audit and feedback in improving the care of patients on long-term benzodiazepine therapy

Sample: 742 patients taking a benzodiazepine in 18 general practices in Leicestershire, UK

Method: Randomized controlled trial

  • Experimental group: feedback plus reminder cards

  • Control group: feedback

Outcome measures: Number of patients whose care complies with each of 5 criteria

Duration: NA

Cleland, J. A., J. M. Fritz, G. P. Brennan, and J. Magel. 2009. Does continuing education improve physical therapists’ effectiveness in treating neck pain? A randomized clinical trial. Physical Therapy 89(1):38-47.

Investigate the effectiveness of an ongoing educational intervention for improving the outcomes for patients with neck pain

Sample: 19 physical therapists from 11 clinical sites in an integrated health system

Method: Randomized controlled trial

  • Experimental group: ongoing CE

  • Control group: no further education

Outcome measures: All patients treated by the physical therapists completed the Neck Disability Index and a pain rating scale before and after the ongoing intervention

Duration: 7 weeks

Description of Educational Method

Findings

* All practices received a copy of audit criteria justifying “must do” and “should do” priorities

* All practices received feedback comparing their performance to the criteria and to other practices

* The group receiving reminders had the reminders placed in the records of long-term benzodiazepine users

* Number of patients whose care complied with criteria rose after the interventions (OR: 1.46, 95% CI: 1.32-5.21)

* The increase was not statistically greater in practices receiving feedback plus reminders than in those receiving only feedback

* 2-day course on management of neck pain (for both control and experimental groups)

* Two 1.5-hour meetings to review the 2-day course, discuss management of specific cases, and co-treat a patient with neck pain in the therapist’s own setting (experimental group only)

* Patients treated by experimental group therapists experienced significantly greater reduction in disability during study period than those treated by therapists who did not receive ongoing training (mean difference = 4.2 points)

* Pain ratings did not differ for patients treated by the 2 groups

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Fjortoft, N. F., and A. H. Schwartz. 2003. Evaluation of a pharmacy continuing education program: Long-term learning outcomes and changes in practice behaviors. American Journal of Pharmaceutical Education 67(2).

Assess the long-term outcomes from a 3-month, curriculum-based pharmacy CE program on lipid management and hypertension services

Sample: 46 participants in a pharmacy continuing education course

Method: Cohort study with a pre- and post-test design

Outcome measures: Survey responses assessing participant knowledge on cognitive and psychomotor concepts; time spent providing clinical services

Duration: 3 months

Gonzales, R., J. F. Steiner, A. Lum, and P. H. Barrett, Jr. 1999. Decreasing antibiotic use in ambulatory practice: Impact of a multidimensional intervention on the treatment of uncomplicated acute bronchitis in adults. Journal of the American Medical Association 281(16):1512-1519.

Decrease total antibiotic use for uncomplicated acute bronchitis in adults

Sample: 93 clinicians (physicians, physician assistants, nurse practitioners, RNs) and 4,489 patients in 6 primary care practices

Method: Prospective, nonrandomized controlled trial with baseline and study periods

  • Experimental groups: full intervention; partial intervention

  • Control group: no intervention

Outcome measures: Antibiotic prescriptions for uncomplicated acute bronchitis during baseline and study periods

Duration: 15 months

Hobma, S. O., P. M. Ram, F. van Merode, C. P. M. van der Vleuten, and R. P. T. M. Grol. 2004. Feasibility, appreciation and costs of a tailored continuing professional development approach for general practitioners. Quality in Primary Care 12(4):271-278.

Study the feasibility and appreciation of a tailored continuing professional development (CPD) method in which GPs work in small groups to improve demonstrated deficiencies

Sample: 43 GPs in the Netherlands

Method: Cohort study

Outcome measures: Participation rates; costs per participant based on time invested by support staff, costs of materials, and time dedicated to the intervention; participant appreciation by self-reported Likert scale

Duration: 11 months

Description of Educational Method

Findings

* Self-study materials

* 3 live, interactive workshops with case discussion and physical assessment

* Improvements in participant knowledge base and skill were observed between pre- and post-survey administration

* No change in percentage of time spent providing clinical services observed at 6 months or at 12 months

* 2 practices received house- and office-based patient education materials, clinician education, practice-profiling, and academic detailing (full intervention)

* 2 practices received only office-based patient education materials (partial intervention)

* Substantial decline in antibiotic prescription rates at the full intervention site (from 74% to 48%, p = 0.003) but no statistically significant change at the control and partial intervention sites

* Compared with control sites, nonantibiotic prescriptions (cough suppressants, analgesics) and return office visits were not significantly different for intervention sites

* Assessment to select aspects of care in need of improvement

* Comparison of assessment scores to standards in a meeting with a trained peer; identification of personal improvement goals

* Program of self-directed learning via 7 small-group meetings with fellow GPs led by trained GP tutors

* Total costs were €117.56 per hour or €2700 per participant

* Video assessment was appreciated more than knowledge tests

* Written feedback was appreciated; oral feedback from trained peer contributed little

* Role of the tutor in group sessions was described as “invaluable”

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Lagerløv, P., M. Loeb, M. Andrew, and P. Hjortdahl. 2000. Improving doctors’ prescribing behaviour through reflection on guidelines and prescription feedback: A randomised controlled study. Quality in Health Care 9(3):159-165.

Study the effect on the quality of prescribing by a combined intervention of providing individual feedback and deriving quality criteria using guideline recommendations by peer review groups

Sample: 199 GPs in Norway

Method: Randomized controlled trial

  • Group 1: intervention focused on asthma prescribing, serving as the control group for urinary tract infection

  • Group 2: intervention focused on urinary tract infection prescribing, serving as the control group for asthma

Outcome measures: Difference in prescribing behavior between the year before and the year after the intervention; self-report of intent to change disease management approach

Duration: 21 months

Laprise, R. J., R. Thivierge, G. Gosselin, M. Bujas-Bobanovic, S. Vandal, D. Paquette, M. Luneau, P. Julien, S. Goulet, J. Desaulniers, and P. Maltais. 2009. Improved cardiovascular prevention using best CME practices: A randomized trial. Journal of Continuing Education in the Health Professions 29(1):16-31.

Determine if after a CME event, practice enablers and reinforcers addressing clinical barriers to preventive care would be more effective in improving adherence to cardiovascular guidelines than a CME event alone

Sample: 122 GPs

Method: Cluster randomized trial

  • Experimental group: CME event followed by practice enablers and reinforcers

  • Control group: CME event alone

Outcome measures: Proportion of patients undermanaged at baseline who received preventive care action

Description of Educational Method

Findings

* Participation in 2 peer meetings to discuss treatment guidelines and agree on common quality criteria for prescribing

* Prescription feedback provided to each GP

* Improved prescribing behavior in accordance with guideline recommendations

* Group discussion and feedback were well regarded by participants

Nurses visited GPs’ offices once a month to

  • Screen medical records for high-risk patients

  • Prompt physicians to reassess preventive care of these patients

  • Enclose a checklist in the patient chart with guideline reminders

Practice enablers and reinforcers following CME significantly improved adherence to guidelines compared to CME alone (OR = 1.78; 95% CI: 1.32-2.41)
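Odds ratios like the one reported here summarize a 2×2 table of outcome by study group. Below is a minimal sketch, using illustrative counts rather than the trial's data, of computing an odds ratio and its Wald 95% confidence interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = intervention, action taken; b = intervention, no action
    c = control, action taken;      d = control, no action
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only, not the trial's data
or_, lo, hi = odds_ratio_ci(120, 80, 90, 110)
print(f"OR = {or_:.2f}, 95% CI: {lo:.2f}-{hi:.2f}")
```

A confidence interval whose lower bound exceeds 1 — as in the trial's OR = 1.78 (1.32-2.41) — indicates a statistically significant effect favoring the intervention.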

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Martin, C. M., G. S. Doig, D. K. Heyland, T. Morrison, and W. J. Sibbald. 2004. Multicentre, cluster-randomized clinical trial of algorithms for critical-care enteral and parenteral therapy (ACCEPT). Canadian Medical Association Journal 170(2):197-204.

Test the hypothesis that evidence-based algorithms to improve nutritional support in the intensive care unit (ICU) would improve patient outcomes

Sample: 499 patients in 14 ICUs over an 11-month period

Method: Cluster randomized controlled trial

  • Experimental group: introduction of evidence-based recommendations

  • Control group: no intervention

Outcome measures: Days of enteral nutrition, length of stay in hospital, mortality rates, length of stay in ICU

Duration: 11 months

Monaghan, M. S., P. D. Turner, M. Z. Skrabal, and R. M. Jones. 2000. Evaluating the format and effectiveness of a disease state management training program for diabetes. American Journal of Pharmaceutical Education 64(2):181-184.

Determine whether a CE approach to disease management training in diabetes mellitus is an effective means of improving both cognitive knowledge and confidence levels of participants

Sample: 25 pharmacists participating in a training program

Method: Cohort study with pre- and post-intervention design

Outcome measures: Scores on a pre- and post-test examination; scores on a 15-item attitudinal questionnaire

Duration: 14 months

Description of Educational Method

Findings

Evidence-based recommendations were introduced via in-service education sessions, reminders by a local dietitian, posters, and academic detailing

* Compared with patients in control ICUs, patients in intervention ICUs received significantly more days of enteral nutrition (6.7 vs. 5.4 per 10 patient-days; p = 0.042), had a significantly shorter mean hospital stay (25 vs. 35 days; p = 0.003), and showed a trend toward reduced mortality (27% vs. 37%; p = 0.058)

* Mean stay in the ICU did not differ between control and experimental groups

Traditional lectures and small-group exercises in which participants obtained “hands-on” information related to the pharmacist’s role

* Cognitive post-test scores (68.8%) improved significantly (p < 0.001) over the pre-test scores (49.6%)

* Post-test scores on all 15 attitudinal items significantly improved over pre-test scores (p < 0.012)
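Pre/post cohort designs like Monaghan's are commonly analyzed with a paired t test on each participant's score change. This stdlib-only sketch uses made-up scores (the paper's individual-level data are not reproduced here):

```python
import math

def paired_t(pre, post):
    """Paired t statistic for post-minus-pre score differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Made-up exam scores (percent correct) for 5 participants
pre  = [48, 52, 45, 55, 50]
post = [70, 66, 65, 72, 71]
t = paired_t(pre, post)
print(f"t = {t:.2f} with {len(pre) - 1} degrees of freedom")
```

Pairing matters because each participant serves as his or her own control, so between-participant variation does not inflate the error term.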

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Naunton, M., G. M. Peterson, G. Jones, G. M. Griffin, and M. D. Bleasel. 2004. Multifaceted educational program increases prescribing of preventive medication for corticosteroid induced osteoporosis. Journal of Rheumatology 31(3):550-556.

Assess a comprehensive educational program aimed at increasing the use of osteoporosis preventive therapy in patients prescribed long-term oral corticosteroids

Sample: All patients admitted to the Royal Hobart Hospital, Australia; all physicians and pharmacists in 2 regions in Australia

Method: Controlled trial

  • Experimental group: geographic region received multifaceted educational program

  • Control group: geographic region received no intervention

Outcome measures: Evaluation feedback from GPs and pharmacists; drug utilization data

Duration: 17 months

Pronovost, P. J., S. M. Berenholtz, C. Goeschel, I. Thom, S. R. Watson, C. G. Holzmueller, J. S. Lyon, L. H. Lubomski, D. A. Thompson, D. Needham, R. Hyzy, R. Welsh, G. Roth, J. Bander, L. Morlock, and J. B. Sexton. 2008. Improving patient safety in intensive care units in Michigan. Journal of Critical Care 23(2):207-221.

Describe the design and lessons learned from implementing a large-scale patient safety collaborative and the impact of an intervention on teamwork climate in intensive care units

Sample: 99 ICUs across the state of Michigan over 24 months

Method: Cohort study of ICU teams

Outcome measures: Improvements in safety culture scores using a teamwork questionnaire; adherence to evidence-based interventions for ventilated patients

Duration: 17 months

Description of Educational Method

Findings

All GPs and pharmacies in the study area were sent educational materials and guidelines; received academic detailing visits and reminders; and were provided educational magnets for their patients

* Use of preventive therapy increased from 31% of admitted hospital patients taking corticosteroids to 57% post-intervention (p < 0.0001)

* Significant increase in the use of preventive therapy in the intervention region over the control region (p < 0.01)

* Collaborative project included group meetings and conference calls to share best practices and evaluate performance

* Partnership between hospital leadership, ICU improvement teams, and ICU staff to identify and resolve barriers

* Daily goals communication toolkits for staff education, redesign of work processes, and support of local opinion leaders

* Teamwork climate improved from baseline to post-intervention (t(71) = −2.921, p < 0.005)

* Proportion of ICUs with >60% consensus on good teamwork rose from 17% pre-intervention to 46% post-intervention

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Rashotte, J., M. Thomas, D. Grégoire, and S. Ledoux. 2008. Implementation of a two-part unit-based multiple intervention: Moving evidence-based practice into action. Canadian Journal of Nursing Research 40(2):94-114.

Examine the impact and sustained change of a 2-part, unit-based multiple intervention on the use by pediatric critical care nurses of guidelines for pressure-ulcer prevention

Sample: 23 pediatric critical care nurses in a Canadian pediatric ICU

Method: Cohort study

Outcome measures: Before-and-after measures of frequency of use of interventions as documented in patient records and by observation

Duration: 6 months

Richards, D., L. Toop, and P. Graham. 2003. Do clinical practice education groups result in sustained change in GP prescribing? Family Practice 20(2):199-206.

Determine whether a peer-led small-group educational program is an effective tool in changing practice when added to audit and feedback, academic detailing, and educational bulletins

Sample: 230 GPs in urban New Zealand

Method: Retrospective analysis of a controlled trial

  • Experimental group: audit and feedback, individual academic detailing, educational bulletins, and peer-led group academic detailing sessions

  • Control group: audit and feedback, academic detailing, and educational bulletins

Outcome measure: Targeted prescribing for 12 months before and 24 months after education sessions

Duration: 36 months

Description of Educational Method

Findings

* Part I targeted individuals with independent and group learning activities: laminated pocket guides, bedside decision-making algorithm

* Part II incorporated local and organizational strategies: unit champions, bedside coaching, development of standards

Significant change in implementation of 2 of 11 recommended practices following both interventions (p < 0.001)

* Clinical practice education groups met monthly

* GP-led discussion of evidence-based topics

* Individual prescribing data provided to each GP

* Peer-led small-group discussions had a sustained, positive effect on prescribing behavior that was in addition to any effect of the other educational methods (mean effect size = 1.20)

* Mean duration of significant effect was 14.5 months (CI: 95%)

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Saini, B., L. Smith, C. Armour, and I. Krass. 2006. An educational intervention to train community pharmacists in providing specialized asthma care. American Journal of Pharmaceutical Education 70(5):118-126.

Test the effect of an educational intervention on pharmacist satisfaction and practice behavior as well as patient outcomes

Sample: 27 pharmacists providing asthma care to 102 patients in Australia

Method: Controlled trial

  • Experimental group: educational intervention

  • Control group: no intervention

Outcome measures: Participant reactions gauged using a questionnaire; asthma severity; peak flow indices; medication costs per patient

Duration: 6 months

Schneeweiss, S., and S. Ratnapalan. 2007. Impact of a multifaceted pediatric sedation course: Self-directed learning versus a formal continuing medical education course to improve knowledge of sedation guidelines. Canadian Journal of Emergency Medical Care 9(2):93-100.

Evaluate the effectiveness of a sedation course in improving physicians’ knowledge of pediatric procedural sedation guidelines, relative to self-directed learning

Sample: 48 emergency staff physicians, fellows, and residents in a pediatric emergency department

Method: Randomized controlled trial

  • Experimental group: self-directed learning

  • Control group: formal, 4-hour course

Outcome measures: Scores on multiple choice pre- and post-intervention exam

Duration: 2 weeks

Description of Educational Method

Findings

Self-directed learning, small-group learning, and workshops with case studies in addition to asthma care training provided in a lecture

* Significant reduction in asthma severity in the experimental group (p < 0.001) vs. the control group

* In the experimental group, peak flow indices improved from 82.7% at baseline to 87.4% at the final visit (p < 0.001)

* Significant reduction in defined daily dose of albuterol used by patients (p < 0.015)

* The 4-hour course consisted of small-group and didactic instruction with case studies

* The self-directed group received a package with learning objectives, guidelines, a pocket card, and reading materials

The control group’s median exam score (83.3%; range: 75.8-96.5%) was significantly higher (p < 0.0001) than that of the experimental group (73.3%; range: 43.5-86.6%)

Reference

Study Purpose

Sample, Method, Outcome Measures, and Duration

Scholes, D., L. Grothaus, J. McClure, R. Reid, P. Fishman, C. Sisk, J. E. Lindenbaum, B. Green, J. Grafton, and R. S. Thompson. 2006. A randomized trial of strategies to increase Chlamydia screening in young women. Preventive Medicine 43(4):343-350.

Evaluate an intervention to increase guideline-recommended Chlamydia screening

Sample: 23 primary care clinics; 3,509 sexually active females ages 14-25

Method: Randomized controlled trial

  • Experimental group: enhanced guideline intervention

  • Control group: standard guideline implementation instructions

Outcome measures: Post-intervention Chlamydia testing rates

Duration: 27 months

Young, J. M., C. D’Este, and J. E. Ward. 2002. Improving family physicians’ use of evidence-based smoking cessation strategies: A cluster randomization trial. Preventive Medicine 35(6):572-583.

Evaluate a multifaceted, practice-based intervention involving audit, feedback, and academic detailing to improve family physician smoking cessation advice

Sample: 60 family physicians in Australia

Method: Cluster randomized controlled trial

  • Experimental group: multifaceted intervention

  • Control group: no intervention

Outcome measures: Delivery of smoking cessation advice determined by patient recall, physician report, and medical record audit; utilization of nicotine replacement therapies

Duration: 6 months

NOTE: NA = Not applicable.

Description of Educational Method

Findings

The enhanced guideline group used clinic-based opinion leaders, individual measurement and feedback, exam room reminders, and chart prompts

* Enhanced intervention did not significantly affect Chlamydia testing (OR = 1.08; 95% CI: 0.92-1.26; p = 0.31)

* Testing rates increased among women making preventive care visits in intervention vs. control clinics

* Audit and feedback conducted by a medical peer

* Medical record prompt in the form of Post-it notes on medical records

* Provision of additional resources for physicians and patients

* Significant increase in the experimental group over the control group in the use of nicotine replacement gum (p = 0.0002) and patches (p = 0.0056)

* No significant differences between groups in smokers’ recall or documentation in medical record of specific cessation advice

Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
×

This page intentionally left blank.

Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
×
Page 147
Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
×
Page 148
Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
×
Page 149
Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
×
Page 150
Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
×
Page 151
Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
×
Page 152
Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
×
Page 153
Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
×
Page 154
Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
×
Page 155
Suggested Citation:"Appendix A: Literature Review Tables." Institute of Medicine. 2010. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press. doi: 10.17226/12704.
