Proceedings of a Workshop—in Brief
Convened April 3, 2025
Advancing Health Care Professional Education and Training in Diagnostic Excellence
The Forum on Advancing Diagnostic Excellence at the National Academies of Sciences, Engineering, and Medicine hosted a hybrid public workshop on April 3, 2025, in collaboration with the Global Forum on Innovation in Health Professional Education to explore opportunities to reduce the potential for diagnostic errors by strengthening health care professional education (HPE) and training in the diagnostic process.1,2 The workshop explored potential ways to improve HPE and training, including strengthening diagnostic reasoning for trainees and educators, promoting the appropriate use of diagnostic tests and technologies to support clinician decision making, fostering patient-centered communication, and advancing interprofessional education. Andrew Bindman from Kaiser Permanente described the complex and collaborative nature of the diagnostic process, in which clinicians engage in information gathering, integration, and interpretation to determine a working diagnosis with the goal of reducing diagnostic uncertainty and developing a better understanding of a patient’s health problem (Figure 1). He highlighted the importance of ensuring that all health care professionals involved in the diagnostic process have the education and training needed to achieve an accurate and timely diagnosis. This workshop builds on the consensus report Improving Diagnosis in Health Care (NASEM, 2015) and a workshop series on Advancing Diagnostic Excellence.3 This Proceedings of a Workshop—in Brief highlights the presentations and discussions that occurred at the workshop.4
Health Professions Education
Catherine Lucey from the University of California, San Francisco, gave a keynote address to explore the opportunities and challenges in HPE and touched on how new technologies like artificial intelligence (AI) can serve as a tool to better assist clinicians in diagnosis.
1 The workshop agenda and presentations are available at https://www.nationalacademies.org/event/44094_04-2025_advancing-health-care-professional-education-and-training-in-diagnostic-excellence-a-workshop (accessed May 31, 2025).
2 Achieving excellence in diagnosis refers to going beyond avoiding medical errors to ensure diagnosis is safe, timely, effective, efficient, equitable, and patient-centered.
3 More information is available at https://www.nationalacademies.org/our-work/advancing-diagnostic-excellence-a-workshop-series (accessed May 31, 2025).
4 This Proceedings of a Workshop—in Brief is not intended to provide a comprehensive summary of information shared during the workshop. The information summarized here reflects the knowledge and opinions of the individual workshop participants and should not be seen as a consensus of the workshop participants, the planning committee, or the National Academies of Sciences, Engineering, and Medicine.

FIGURE 1 The diagnostic process. SOURCES: Presented by Andrew Bindman, April 3, 2025. NASEM, 2015.
HPE has adapted and integrated new technologies into education and practice for generations, said Lucey, and each innovation prompts support and early adoption by some as well as skepticism and caution from those concerned about the impact on clinical practice. She said AI holds promise for supporting diagnostic excellence among health professionals by automating administrative tasks; providing rapid access to information syntheses; interpreting large, structured data sets, particularly in the fields of radiology and pathology; and confirming or expanding diagnostic possibilities. However, AI is not well suited in its current form to replace human diagnostic reasoning, Lucey said, which depends on the clinician’s ability to gather, organize, prioritize, and synthesize highly unstructured data from patients seeking care. A better understanding of the diagnostic process—how expert diagnosticians think and make decisions—can help guide future studies on the human–AI interface in achieving the goal of diagnostic excellence, and she emphasized the importance of health professions educators actively collaborating with engineers and information technologists in this effort.
When discussing HPE design, Lucey suggested establishing diagnostic excellence as a core competency for all practicing clinicians with instructional strategies that help future practitioners to critically use and evaluate current and emerging tools and technologies. She also said that students will need to understand the principles of AI, including the risks, biases, and limitations, so that they can recognize when AI is wrong or not working appropriately. To effectively use AI in the diagnostic process, Lucey said that health professions students will need a strong foundation in diagnostic reasoning and an understanding of how expert diagnosticians think and decide. Lucey explained that the human brain excels in the diagnostic process (Figure 1) at the stage of information gathering, organization, and prioritization, noting that the human brain can do something that computers cannot—structure complex and ambiguous data so it can be acted upon, both in terms of gathering additional relevant data and determining next steps. She said research has found that experts store critical information about disease states in “illness scripts,” which are mental frameworks that experts use to organize knowledge about specific concepts that expand and become more detailed and elaborate as diagnosticians gain experience and encounter more and different cases. Expert diagnosticians use the storage
mechanism to select and organize patient information, develop patient problem representations, use an interplay between intuitive/automatic and analytic strategies to determine next questions and next steps, and engage in reflective reasoning to expand and organize knowledge as a form of deliberate practice. This Bayesian approach5 to hypothesis testing is quite different from how some AI systems operate, said Lucey. Health professions educators have used this knowledge of how diagnosticians think to develop educational strategies to improve competencies in diagnostic reasoning. Students are taught to organize their knowledge using the illness script framework, and to engage in continuous, iterative cycles of hypothesis testing to arrive at a prioritized differential diagnosis.
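The Bayesian updating described in footnote 5 can be sketched in a few lines of Python; the starting probability and likelihood ratios below are illustrative assumptions for this sketch, not figures from the workshop:

```python
from functools import reduce

def bayes_update(prob, likelihood_ratio):
    """One Bayesian update in odds form: multiply the current odds of a
    diagnostic hypothesis by the likelihood ratio of a new finding."""
    odds = prob / (1 - prob) * likelihood_ratio
    return odds / (1 + odds)

# Illustrative: start from a 10% working hypothesis and fold in three
# findings with likelihood ratios 9, 2, and 0.5 (a ratio below 1 argues
# against the hypothesis), mirroring iterative hypothesis testing.
final_prob = reduce(bayes_update, [9.0, 2.0, 0.5], 0.10)  # ≈ 0.5
```

Each new finding multiplies the running odds by its likelihood ratio, so a sequence of updates corresponds to the continuous, iterative cycles of hypothesis testing described above.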
Moving forward, there are a number of opportunities to use AI as a partner and effective tool in health care, said Lucey, such as improving the understanding of the origins of disease and helping to build diagnosticians’ illness scripts, recommending questions that might positively influence thinking around diagnosis or prevent premature diagnostic closure and other biases, coaching learners to identify gaps in their knowledge of disease or their information seeking strategies, identifying diagnostic errors before they reach the patient, and guiding patients as they seek information from trusted sources.
Teaching Diagnostic Reasoning
Strategies for Improving Diagnostic Reasoning
Joseph Rencic from Boston University (BU) Chobanian and Avedisian School of Medicine described some educational strategies to improve diagnostic reasoning. Rencic stressed that “knowledge matters,” especially knowledge that enables students to distinguish diseases that present with similar findings. The key disease-related knowledge that teachers should impart to learners includes understanding the who (is at risk for the disease), what (clinical findings are associated with the disease), when (does the disease tend to occur), and how (does it develop and progress within the body) of a disease. Rencic touched on how formulating a diagnosis includes two types of thinking: Type 1 is fast, intuitive, and autonomous; Type 2 is slow, analytical, and deliberate.6 He described instructional strategies at BU that teach diagnostic reasoning by developing both autonomous and analytical thinking. In the first 6 months, students focus on key concepts and vocabulary, while the next 18 months are used to develop the learner’s analytical thinking through problem-based, case-based, part-task, and whole-task practice. Students engage with clinical cases to learn the hypothesis-driven approach to data gathering and the process of diagnosis. In the third and fourth years of medical school, clinical experiences help students to build their illness scripts and transition from slow, analytical to fast, intuitive clinical reasoning.
Rencic described how the curriculum pushes students to build on existing knowledge of pathophysiology and to use team-based learning during the Advanced Integration Module at the end of the second year, wherein learners are given clinical cases in various specialties that could include multiple organ systems. For example, a vasculitis case would require the learner to draw on both pulmonary and renal knowledge. Rencic explained that learning diagnostic reasoning requires deliberate and effortful practice combined with meaningful feedback that allows learners to develop nuanced, accurate illness scripts that will eventually allow for unconscious competence during clinical practice. He noted that deliberate practice can be difficult to provide in medical school because of the randomness of clinical experiences and lack of meaningful feedback, and he suggested addressing these challenges through individualized learning plans, coaching, and virtual patient practice and assessment. Lastly, Rencic said the main focus of diagnostic reasoning teaching and assessment is to “help learners develop accurate and robust illness scripts and educational strategies to target both the analytical and the autonomous mind and perhaps thinking dispositions.”
Optimizing Trainee Feedback
Satid Thammasitboon from Texas Children’s Hospital provided an overview of different methods that can guide students as they learn about diagnostic reasoning, which included frameworks and tools to help teachers provide effective feedback. “The diagnostic reasoning process
5 A Bayesian approach is a probabilistic framework in which new information is combined with existing information: a prior probability is assumed and then continuously updated as new data arrive, yielding revised probabilities for the hypotheses or parameters of interest.
6 Type 1 and Type 2 thinking refer to two different modes of cognitive processing, based on the book Thinking, Fast and Slow by Daniel Kahneman.
is very complex,” said Thammasitboon, and likened it to a “black box” that is “mysterious, obscure, and sometimes closed off” and makes providing feedback to students challenging. Trainees may feel as if they are under examination or being asked to perform, and because diagnostic reasoning is closely tied with human intelligence, telling a student to “reason better” can be perceived as telling them to “be smarter.” One way to bridge the invisible divide between teachers and learners, he said, is through “learning conversations” that involve an alliance between teacher and trainee, an informed self-assessment in which trainees receive accurate and reliable information for them to process and conduct self-assessments, and a shared mental model. Thammasitboon went on to describe the Assessment of Reasoning Tool (ART) that was developed as a framework to scaffold and guide learning conversations around diagnostic reasoning. ART provides a structure, shared language, and theoretical clarity for these conversations and focuses on five domains upon which the learner is assessed: gathering data in a hypothesis-directed manner, articulating a complete and descriptive problem representation, articulating a prioritized differential diagnosis, directing evaluation or treatment toward high priority diagnoses, and demonstrating metacognition—the ability to think about one’s own thinking. Each domain contains specific descriptions of behavior that would demonstrate minimal, partial, or complete mastery of the domain. Thammasitboon explained that using ART for assessment gives the learner specific, consistent, and actionable feedback and facilitates conversations that lead to improved future performance. 
A study to assess the utility and feasibility of using the tool with interns in a pediatric hospital medicine rotation found that ART provided a guidepost for conversations about diagnostic reasoning that helped facilitate feedback on the diagnostic reasoning process for teachers and learners (Cohen et al., 2022). Thammasitboon noted that ART can be adapted for different learners or contexts by focusing on different domains. He concluded by highlighting that “Teachers need robust conversational strategies to support diagnostic reasoning across contexts and environments.” He also said, “Learning conversations cultivate a teacher-trainee alliance facilitating reflection that refines the trainee’s mental model leading to actionable next steps,” and tools like the “ART provide a structured yet flexible framework to guide diagnostic reasoning conversation with clarity, precision, and purpose.”
Remediation Strategies for Learners
Andrew Parsons from the University of Virginia (UVA) School of Medicine described remediation for learners struggling with diagnostic reasoning. Clinical reasoning deficits are common among medical students, residents, and fellows, said Parsons, suggesting that between 8 and 12 percent of learners will require formal remediation. Studies of three different medical schools found that between 25 and 45 percent of mixed learners (including medical students and residents) were referred to a remediation program (Guerrasio et al., 2014; Parsons et al., 2024; Warburton et al., 2017), with clinical reasoning as one of the most common reasons for referral. Parsons noted that while early identification of struggling students is critical to allow sufficient time for coaching and demonstration of competence, the assessments can be challenging: self-assessment is unreliable, teacher assessment is difficult, and assessment involves looking at both process and outcomes. Parsons and his colleagues additionally examined the other reasons for learners’ referrals in their evaluations, which included knowledge disorganization, inefficiency, indecision, not knowing what questions to ask, and “missing the big picture.” While experienced clinicians depend on rapid, intuitive thinking, Parsons explained that these learners necessarily rely on Type 2 analytical thinking as they gradually hone pattern recognition.
Parsons described the remediation program at the UVA School of Medicine, which begins with identifying the learners’ primary performance deficit followed by a targeted assessment to determine where to focus coaching. The program uses a diagnostic reasoning framework with five key components to help both learners and coaches understand the reasoning process and localize areas of difficulty: hypothesis generation, data gathering, problem representation, refining hypotheses, and working diagnosis. To diagnose a specific deficit, coaches use “case-based self-regulated learning microanalysis,” said Parsons, an approach that incorporates forethought, goal setting, and self-reflection. Prompts within these elements allow the coach and the learner to uncover specific issues, and a strategy is developed to address these
issues. For example, a learner who neglects to ask about defining features early in a clinical encounter, gives narrow differential diagnoses, fails to consider common diagnoses, or struggles to justify why they performed a specific exam maneuver may have a primary deficit in hypothesis generation. To improve their deductive reasoning and pattern recognition, the coach develops an intervention to scaffold creation of a differential diagnosis. Remediation in diagnostic reasoning relies on deliberate practice, said Parsons, with well-defined learning objectives and a clear goal. In the program at UVA, strengthening deliberate practice comes from a time-intensive process that begins in a simulated environment and moves to an authentic clinical environment, using appropriately challenging and variable case examples, providing feedback at all levels, and focusing on developing diagnostic reasoning in multiple contexts. Parsons shared resources that describe the case-based coaching approach for diagnostic reasoning remediation, including “Remediation in Medical Education” (Connor et al., 2014) and “Clinical Reasoning: Coaching the Struggling Medical Learner” (Parsons and Warburton, 2019).
Optimizing the Use of Diagnostic Tests and Technologies
Leveraging Artificial Intelligence
Cornelius James from the University of Michigan Medical School presented on leveraging AI and machine learning to support diagnostic reasoning, and broadly defined AI as using computers to perform tasks that require objective reasoning and understanding. He described the three epochs of AI, which include early rules-based AI tools that could not adapt easily to real-world complexities; a second epoch in which AI tools were task specific, could utilize both structured and unstructured data, and could handle more diverse situations; and finally, the most recent epoch that uses tools such as ChatGPT and other generative AI, where models are trained on a large body of data and can interpret and interact with text-based and image-based data.
Despite the many advances in AI, research has shown mixed results on its use in diagnosis, said James. One study found that radiologists’ diagnostic accuracy in reading chest radiographs improved when aided by an accurate AI model but worsened when aided by an inaccurate AI model (Yu et al., 2024), and another study of clinicians similarly found that an increase or decrease in diagnostic accuracy was largely dependent on the quality of the AI model rather than the clinician’s expertise (Jabbour et al., 2023). Additionally, context can influence an AI’s predictive accuracy. James highlighted a paper on how an AI model trained to predict sepsis performed well in the setting in which it was developed but did not perform as well outside of this setting (Wong et al., 2021). A 2019 study found racial bias in a model that identified patients who would benefit from a complex care program and disproportionately recommended White patients over Black patients (Obermeyer et al., 2019). The model used higher health care costs based on frequency of health care use to identify patients with greater need for the program without considering the systemic factors that may affect how different populations engage with and utilize the health care system.
Clinician interaction with AI is on a continuum, from assistive algorithms that analyze data and provide risk scores or highlight areas of concern to autonomous algorithms that analyze data and make recommendations for action with or without clinician backup, said James. Given this spectrum of AI interaction, he stressed the importance of preserving patient and clinician autonomy and determining what it means to keep humans “in the loop.” James described a framework for patients and clinicians to work with AI in a triadic patient–clinician–AI team in which all parties augment and collaborate with each other. Figure 2 shows the continuous interaction among all three, and James explained how patients might use AI to explore potential diagnoses through ChatGPT, how AI might assist clinicians using AI scribes, and how all these interactions ultimately augment and influence AI. With this relationship, it will be necessary to determine when cognitive offloading is appropriate and identify the potential consequences. In addition, James suggested that current and future clinicians will need to reskill, upskill, and deskill in order to succeed. He mentioned preparing learners by integrating AI into multiple courses, including health systems science, evidence-based medicine, clinical skills, and clinical reasoning, and he encouraged educators to develop foundational knowledge around AI. Lastly, James provided an example of a curriculum that incorporates AI called DATA or “data-augmented technology assisted medical decision making,” which was developed by an interdisciplinary team of medical educators, computer scientists, lawyers, and others and teaches learners how to use AI in their diagnostic reasoning.

FIGURE 2 The triadic patient–clinician–AI team. SOURCES: Presented by Cornelius James, April 3, 2025. James et al., 2023.
Diagnostic Stewardship Interventions
Valerie Vaughn from the University of Utah School of Medicine described diagnostic stewardship interventions to ensure appropriate use of diagnostic testing. In real-world clinical settings, clinicians must balance competing demands while seeing multiple patients and working with incomplete or fragmented information, yet they remain ultimately responsible for deciding on the right test, the right treatment, and the right diagnosis for each patient.
Diagnostic stewardship is a program or strategy designed to optimize the process of diagnostic testing to reduce diagnostic error, said Vaughn, and was used as part of an effort to reduce misdiagnosis and mistreatment of infections, thereby reducing unnecessary antibiotic use. One common diagnostic error involving antibiotic use is the misdiagnosis of asymptomatic bacteriuria as a urinary tract infection (UTI). Vaughn published a study of diagnostic stewardship programs aimed at reducing testing and inappropriate antibiotic use, which found that these programs led to a 37 percent decrease in misdiagnosis of asymptomatic bacteriuria (Vaughn et al., 2023). She explained that preventing an inappropriate test from being conducted in the first place is a more effective way to reduce antibiotic use than addressing the results after the fact.
Vaughn illustrated this with the example of an older adult with dementia admitted to the hospital with a vague complaint, such as altered mental status, who arrives alone and is unable to provide any history or detailed information. After a physical exam does not reveal any obvious issues and laboratory findings are nonspecific, a positive urinalysis may trigger a diagnosis of UTI. Yet, because the pretest probability of a positive urinalysis in an older adult patient coming from a nursing home is already very high, Vaughn said the test is essentially “useless,” and the clinician may miss the possibility that the patient started a new medication or is dehydrated or constipated, all conditions that can contribute to an altered mental state. In cases such as these, a diagnostic stewardship program could have prevented the unnecessary urinalysis and allowed the diagnostic pathway to continue.
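Vaughn’s point about pretest probability can be made concrete with likelihood-ratio arithmetic; the sensitivity, specificity, and pretest probability below are illustrative assumptions for this sketch, not data presented at the workshop:

```python
def posttest_probability(pretest_prob, sensitivity, specificity):
    """Probability of disease after a positive test, via Bayes' theorem in
    odds form: posttest odds = pretest odds * LR+, with LR+ = sens / (1 - spec)."""
    lr_positive = sensitivity / (1 - specificity)
    pretest_odds = pretest_prob / (1 - pretest_prob)
    posttest_odds = pretest_odds * lr_positive
    return posttest_odds / (1 + posttest_odds)

# Hypothetical numbers: urinalysis detects bacteriuria well (sensitivity 0.9),
# but asymptomatic bacteriuria is so common in this population that half of
# patients WITHOUT a true UTI also test positive (specificity 0.5), giving a
# likelihood ratio of only 1.8.
print(posttest_probability(0.20, 0.90, 0.50))  # ≈ 0.31, barely above the 0.20 pretest estimate
```

When a result is nearly as likely with or without the target condition, the likelihood ratio approaches 1 and the test adds little information, which is why preventing such a test can accomplish more than interpreting it.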
Vaughn noted the importance of preserving clinician autonomy when considering diagnostic stewardship approaches and suggested electronic medical record nudges and reminders to reduce reflexive antibiotic use. She also highlighted how diagnostic stewardship is a behavioral
and systems approach to encourage high-value diagnostic testing with the goals of improved test selection, reduced repeat and low-value testing, and minimized harm from false-positive results or overdiagnosis. As the number of diagnostic tests proliferates in medicine, said Vaughn, it will be critical to apply diagnostic stewardship principles to ensure thoughtful and deliberate use of these tools and to tailor the strategies to the specific care setting or health condition to improve diagnostic excellence.
Simulation Learning Interventions
Leah Burt from the University of Illinois Chicago described the role and impact of simulation-based education in improving diagnostic competency, offered insights on the creation and implementation of effective simulation, and discussed assessment and evaluation of its effectiveness. She said simulation has several advantages compared to traditional clinical education and is an ideal learning environment for diagnostic reasoning, noting that it can be purposeful and standardized. Patient availability and complexity vary considerably in traditional clinical settings, and time constraints or disruptions can limit purposeful feedback in a clinical learning environment. In contrast, Burt explained that simulated cases can be calibrated to a range of clinical presentations, diagnostic complexities, and workflow challenges. These controlled experiences allow learners to make and correct errors without harming patients, which makes learners more open to taking risks and accepting challenges, and they allow for more timely and specific feedback as well as the ability to engage in metacognitive reflection, which increases self-awareness.
When describing effective simulation design, Burt noted that a simulation objective may focus on diagnostic competency or diagnostic processes rather than just diagnostic accuracy because competencies and processes encompass broader sets of essential, foundational skills. This approach facilitates emphasis on effective communication, teamwork, and the many other building blocks of diagnostic excellence. A scaffolded approach to simulation design allows highly effective, diagnosis-focused simulations to be integrated through the entirety of education programs, including acquiring foundational skills (e.g., hypothesis-driven history), practicing contextual application (e.g., hypothesis-driven histories in patients with typical and atypical symptoms), and strengthening critical thinking and improving teams and systems (e.g., hypothesis-driven history in a patient with atypical symptoms during an encounter with multiple interruptions). Faculty serve different roles through this process, said Burt, from directive teacher to individualized coach to facilitator of learner-driven reflection. She emphasized the importance of training faculty to serve as coaches. Simulation provides an opportunity for feedback from multiple sources to be integrated into a holistic view of performance. Learners can get feedback from different groups (peers, experts, and standardized patients) that provide complementary perspectives and useful frames of reference for the learner. In addition to feedback and reflection, Burt suggested performance assessment needs to be conducted repeatedly throughout the process of developing diagnostic competence. “Repeated practice alone is unlikely to significantly enhance student learning to the extent possible without integration of purposeful assessment and constructive feedback to guide improvement,” said Burt. 
In addition, she highlighted the importance of learners being familiar with assessment criteria and performance expectations, and having opportunities to ask questions and clarify uncertainties, which helps to create a safe and transparent learning environment where a student can effectively track their progress. Burt concluded by discussing several areas where AI can assist with the assessment and feedback process: evaluating learners’ decision making; allowing learners to try out diagnostic decisions and observe real time physiological responses; tracking metrics such as time, diagnostic accuracy, and error rates; and providing customized feedback and identifying specific areas for improvement.
Fostering Patient-Centered Communication
Addressing Implicit Bias
Monica Lypson from Columbia University described faculty development approaches to addressing implicit bias in clinical encounters. Both learners and faculty have biases in their thinking, said Lypson, which is why faculty development is essential for ensuring that faculty understand how bias affects diagnostic reasoning and can help learners address and mitigate bias in their practice. Implicit bias can contribute to health disparities, which disproportionately impact minority populations. In order for faculty to teach learners how to address bias in the
diagnostic process, Lypson suggested that faculty first be aware of their own biases and ways of thinking. The potential for bias can influence rapid decision making in Type 1 thinking, and Type 2 thinking allows for reflection and can mitigate biases, said Lypson. Faculty have an important role in helping learners improve clinical decision making and diagnostic equity by balancing these types of thinking and recognizing when to slow down and engage in Type 2 thinking, she said (Kahneman and Frederick, 2002). Lypson described a simulation study from her own work designed to assess the influence of racial implicit bias on physician communication skills in which physicians evaluated a standardized patient presenting as either Black or White (Gonzalez et al., 2024). She noted that such simulations can inform interventions aimed at addressing implicit bias. Lypson further stressed the importance of faculty encouraging reflection and teaching learners to consider alternative diagnoses, using decision-support tools instead of relying on assumptions, and prompting learners to ask themselves, “Would my diagnosis change if this patient looked different?” (Gonzalez et al., 2021).
Lypson shared a list of tips for teaching faculty about bias in diagnosis that she and her colleagues developed, including the suggestions to create a safe learning environment, flatten the hierarchy by encouraging open dialogue, normalize bias while reducing self-blame, integrate the science of bias, create activities that embrace discomfort, encourage critical reflection, explore structural and institutional biases, and reinforce bias recognition. She emphasized the importance of faculty being trained on bias before teaching learners and incentivizing the adoption of these bias trainings to improve diagnosis by rewarding faculty who demonstrate excellence in teaching diagnostic reasoning. Lypson encouraged educators to address implicit bias in their interactions with learners by prompting learners to slow down, challenging assumptions in case discussions, and modeling bias-aware diagnostic thinking. In closing, Lypson said, “Small changes in how we teach diagnostic reasoning can create a lasting impact on patient care equity.”
Communicating Diagnostic Uncertainty
Danielle McCarthy from the Northwestern University Feinberg School of Medicine discussed strategies on how clinicians can effectively communicate diagnostic uncertainty to patients to reduce distress and ensure that patients are safe, heard, and informed. Though the clinician experience is critical to the diagnostic process, it is important to consider the patient’s experience as well, especially in environments such as the emergency department (ED), where patients are seeking immediate answers for what is causing their symptoms, often with fears of serious and life-threatening diagnoses. ED clinicians can usually rule out immediately dangerous problems through examination and testing but may not be able to provide a complete answer to the patient about their condition. McCarthy highlighted that about 37 percent of discharged ED patients leave with a symptom-based diagnosis (Wen et al., 2015). For example, a patient with a symptom of chest pain who receives tests that rule out serious conditions may leave with a diagnosis of “chest pain.” While this may be a valuable visit from the clinician’s perspective, a patient without a clear diagnosis may feel their clinician did not believe them, suspect something is being hidden or information is withheld, and may leave with more uncertainty and unresolved fears that could lead to return visits to the ED.
McCarthy described a study that aimed to define key principles of competency in communication of uncertainty, develop a curriculum to teach effective patient communication regarding diagnostic uncertainty, and test the efficacy of the curriculum in establishing competency among resident physicians. The goal of the research was not to eliminate uncertainty but to equip clinicians with a conversational framework that helps patients understand and manage the uncertainty and feel safe after discharge. A toolkit developed for the study included a 21-item checklist informed by an expert panel on diagnostic uncertainty and health care communication, a literature review, and patient perspectives.7 McCarthy highlighted five items unique to the context of diagnostic uncertainty and examples of what an ED health care provider might say:
- Reassure: Your condition is not “life threatening” or “dangerous.”
- Be honest: “I don’t know what is causing your symptoms.”
- Validate and normalize: “I believe your symptoms are real”; “We often don’t find an answer in the ED.”
- Role of the ED: “Our job in the ED is to make sure that there is nothing life threatening, but we do not always have an answer.”
- Address unmet needs: “Was there anything else you were expecting?”
7 More information on the uncertainty communication toolkit is available at https://research.jefferson.edu/connected-care-center/uncertainty-communication-toolkit/curriculum-materials.html (accessed June 10, 2025).
McCarthy noted that while this checklist is aimed at ensuring a smooth discharge, residents are taught to begin the conversation early during the patient visit. For example, a clinician might tell the patient that they are ordering tests to look for life-threatening conditions while noting it is common for tests to come back normal. The clinician can then outline what would follow if none of the tests indicate a dangerous condition. In addition to the Uncertainty Communication Checklist, the toolkit developed by McCarthy and her colleagues contains standardized patient training videos, simulation cases, educational modules, and an online game. McCarthy concluded by saying that having this conversation about uncertainty early helps the patient feel reassured, understood, and more prepared for what comes next.
Improving Diagnostic Communication for Older Adults
Alberta Tran from MedStar Health presented approaches to improve diagnostic communication and quality for older adults. Tran emphasized how diagnostic errors in older adults stem from a complex interplay of structural, clinician, and patient-level challenges. Fragmented health care and insufficient geriatric training present challenges on a structural level; communication barriers, age bias, and overdiagnosis and screening dilemmas may present challenges on the clinician level; and patient-level challenges may include polypharmacy and prescribing cascades, atypical presentations, comorbidities, communication barriers, and functional decline. Tran illustrated the complexities of these challenges through a case example of a 74-year-old man who comes to the ED with abdominal pain, nausea, and vomiting. The patient has a history of hypertension, hyperlipidemia, and diabetes, but examination and testing show no abnormalities. The patient also mentions that knee pain has been bothering him and makes it difficult to walk. The physician prescribes medication for the nausea, orders an immediate outpatient CT scan, and tells the patient that there’s “not much to worry about,” along with plans to follow up with more testing. Hearing and understanding that there is “not much to worry about,” the patient delays the scan for six days because of his knee pain; when the scan is eventually performed, it reveals a gangrenous gallbladder that requires emergency surgery. This case example, said Tran, was inspired by a real diagnostic error case written by nurses about how to work up nondescript abdominal pain, and it is case studies like this that enable learners to step back, identify diagnostic challenges from patients’ perspectives, and look for opportunities for improvement in the diagnostic process.
Engagement between patient and clinician exists along an “engagement ladder,” said Tran, who described a figure showing how a patient’s engagement can be advanced until patient and clinician collaborate toward diagnostic safety excellence (Figure 3). At the lowest levels, patients are passive recipients of health care whom clinicians educate or tell what to do. In the middle levels, clinicians inform, consult, and engage with patients in their care, and at the highest levels, patients and clinicians are equal and reciprocal co-producers and co-designers of care. Tran emphasized the importance of moving older adults up this ladder to achieve meaningful engagement, and one strategy is to equip them with tools to support participation in the diagnostic process. Tran and colleagues adapted a tool developed by the Agency for Healthcare Research and Quality (AHRQ) called “Be the Expert on You”8 based on feedback from focus groups of older adults, which resulted in a two-page clinical encounter sheet for patients to use. Feedback from participants was overwhelmingly positive—more than 70 percent of older adult patients found it helped them organize their thoughts and improved communication with the clinician. Clinicians were also satisfied—83 percent felt their patients were effectively communicating their health needs, compared to 33 percent before the intervention (Tran et al., 2025).
SOURCES: Presented by Alberta Tran, April 3, 2025. Epstein et al., 2024; created by Think Local Act Personal and the National Co-production Advisory Group, n.d.
Tran suggested key areas for improving diagnostic communication and quality for older adults: teach trainees to take a patient-centered approach to health care, including asking patients what matters to them rather than only addressing their chief medical complaint; meaningfully involve other diagnostic team members, including patients’ families and caregivers, with evidence-based tools; ensure patient-facing materials and communications are age-friendly and appropriate by asking older adult patients to review materials; increase geriatric competencies in education and training programs; and foster a culture of learning and improvement across education and practice settings.
8 For more information on the AHRQ toolkit on engaging patients to improve diagnostic safety, see https://www.ahrq.gov/diagnostic-safety/tools/engaging-patients-improve.html; and https://www.ahrq.gov/sites/default/files/wysiwyg/patient-safety/resources/diagnostic-toolkit/10-diagnostic-safety-tool-patient-note-sheet.pdf (both accessed June 12, 2025).
Promoting Interprofessional Education
Interprofessional Competencies in Nursing Education
Kelly Gleason from the Johns Hopkins School of Nursing discussed interprofessional competencies to improve diagnostic safety through nursing education, practice, and regulation. She described three areas of competencies (individual, team-based, and system-related) in the diagnostic process—demonstrating clinical reasoning to reach a justifiable diagnosis as an individual, partnering effectively in an interprofessional diagnostic team, and identifying and understanding systems factors that facilitate and contribute to timely and accurate diagnoses and error avoidance.
Gleason said that pre-licensure nursing education explicitly names these competencies, and the core competencies defined by the American Association of Colleges of Nursing include interprofessional partnerships, system-based practice, demonstration of clinical judgment, and contribution as a team member to the formation and improvement of diagnoses. While these competencies exist in the curriculum, she said there is a “crisis in competency” in nursing with a disconnect between educational standards and the readiness of new nurses entering the field. A 2021 article found that 2020 nursing graduates had lower clinical judgment and reasoning scores compared to previous years’
graduates,9 with as many as 40 percent failing to recognize a patient’s urgent problem, 50 percent failing to intervene appropriately, and fewer than 10 percent possessing safe clinical judgment skills (Kavanagh and Sharpnack, 2021). Additionally, Gleason and her team analyzed malpractice claims that named a nurse as the primary responsible party and found that breakdowns in communication with health care providers were significant factors in the likelihood of death and in higher health expenses (Gleason et al., 2021). Adding to this complexity, nursing scope-of-practice language varies by state—some states expressly prohibit nurse participation in diagnosis, some specifically allow nursing diagnosis but not medical diagnosis, and a few do not restrict nurses from contributing fully to a patient’s diagnosis.
To strengthen patient safety, nursing competency, and participation in the diagnostic process, Gleason offered several strategies. First, she emphasized a need for consistent language across states to clearly define the expectations for nurses in recognizing, interpreting, and communicating diagnoses. Second, Gleason said new graduate nurses can be coached and evaluated using models such as Tanner’s Clinical Judgment Model,10 which guides nurses through recognizing changes in a patient’s condition, interpreting the cause, responding, and reflecting on the outcomes in “the same way that we're giving feedback on dressing changes, on medication safety.” Third, she advocated for incorporating team-based interprofessional training, such as the AHRQ evidence-based program TeamSTEPPS (Team Strategies and Tools to Enhance Performance and Patient Safety),11 which takes place in real-world settings, rather than simulation, and optimizes teamwork. Finally, Gleason suggested shifting nursing education away from the traditional cohort-based, one-size-fits-all format toward a more adaptive, interprofessional, and competency-based format.
Virtual Interprofessional Education Consult
Raine Osborne from the University of North Florida presented how meaningful interprofessional training can engage different health care professionals, blend different perspectives and professional lenses, and create a better diagnostic model that centers the patient and the patient’s priorities. Often patients are “at the center of two different diagnostic perspectives coming from different [health care professional] silos,” said Osborne as he described the divide between medical professions that center the patient’s health condition versus rehabilitation, which centers on the patient’s activity and functional limitations. An additional barrier to collaboration is that interprofessional education typically occurs early in training, said Osborne, such as in medical school or physical therapy school, when learners are building skills and knowledge but are not yet practicing in clinical settings as health care providers.
One way to bridge the perspectives across health professions is through a team-based, interprofessional diagnostic approach. Osborne shared details of his team’s study on using Virtual Interprofessional Education (VIP) Consult to encourage and facilitate interprofessional practice for learners who are at the residency and fellowship stage,12 as they begin to build their practice patterns. This pilot project brought together residents from family medicine, physical therapy, and occupational therapy; mental health graduate students; and nutrition and dietetics interns. An orientation session allowed participants to discuss their competencies and become familiar with others’ roles and diagnostic lenses. From there, interdisciplinary teams held a virtual consult session with a real patient to take a history and ask questions, and then the team debriefed and began a care plan. A second consultation was held to follow up with the patient and for the team to give their recommendations. Finally, the group presented their work and filled out a post-questionnaire. Participants in the study reported increased confidence in key interprofessional competencies, such as communicating roles and responsibilities to other health care professionals; articulating how the team collaborated to provide care; sharing information with team members in accessible, jargon-free language; and working effectively with a range of health care professionals to meet specific patient care needs. Osborne said that participants saw many benefits of working interprofessionally, with learners reporting that working with other health care professionals created a more holistic and comprehensive view of the patient, allowed patients to voice their concerns in general rather than specific to one discipline, and helped to identify critical issues that a single clinician may have missed. Importantly, said Osborne, participants were made aware of other health care professional perspectives and considered how they might integrate these perspectives into their own work with patients.
9 Data in this study were based on the Performance Based Development System, a validated test developed by Dorothy del Bueno to assess new registered nurses in critical thinking skills, technical skills, and interpersonal skills. See Anthony, C. E., and D. Del Bueno. 1993. A performance-based development system. Nurs Manage 24(6):32-34.
10 See Tanner, C. A. 2006. Thinking like a nurse: A research-based model of clinical judgment in nursing. J Nurs Educ 45(6):204-211.
11 More information is available at https://www.ahrq.gov/teamstepps-program/index.html (accessed June 5, 2025).
12 Study began in 2023 and is currently ongoing. See https://scholars.unf.edu/en/projects/virtual-interprofessional-vip-consultation-experience-exploration (accessed June 6, 2025).
Interprofessional Partnership to Advance Care and Education Unit
Sarah Hallen from MaineHealth presented on Interprofessional Partnership to Advance Care and Education (iPACE), a patient-centered care delivery model in which a high-performing interprofessional team partners for bedside rounds with the patient and their family to co-develop a care plan that is clearly communicated as one message to both the patient and care team. Grounded in the idea of “one team, one round, one message,” the model includes structured and scheduled interprofessional bedside rounds, interprofessional educational sessions, and ongoing assessment of the model and systems. This approach enabled residents to fully participate in safety and quality improvement initiatives that had traditionally been driven by nurses and floor staff with limited involvement from residents. The model was piloted at Maine Medical Center in 2017 in an 11-bed inpatient internal medicine teaching unit, and Hallen reported that outcomes were positive, with families and patients reporting high satisfaction, improved team communication and well-being, reduced length of stay and costs, and residents receiving more detailed and actionable feedback.
Given the positive results of the pilot project, Hallen and her colleagues set out to scale and disseminate the program, while also recognizing the program would need to be adapted to consider the different needs and environments across medicine. They identified the interprofessional bedside round as the key element of iPACE, centering the clinician, another health care professional team member, and the patient, noting the importance that these rounds address a problem or patient care need that is significant to the care team members and their patients. Hallen shared a successful adaptation and implementation of the iPACE model in a community hospital in Maine in which the hospital leveraged their existing team members—health care provider, nurse, patient, and (sometimes) pharmacist—and targeted issues of length of stay, patient experience, and staff engagement. The implementation of the adapted iPACE model yielded significant improvements in communication among patients and their physicians and nurses, better engagement among staff, and decreased length of stay for patients by 12 percent (Hallen et al., 2020). Hallen said other potential benefits of models like iPACE include improved care efficiency, the benefit of focusing on patients and families, and access to high-functioning interprofessional bedside teams to implement patient quality and safety initiatives.
Despite the benefits, there are barriers to implementing the iPACE model, said Hallen. Health care providers are often not trained to round in this way, may perceive interprofessional bedside rounds as less efficient, and may fear losing the patient’s confidence in their care if clinical teams hold open discussions at the bedside. However, Hallen said research does not show interprofessional bedside rounds are less efficient than traditional rounds and highlighted one study that found they were shorter than traditional rounds (Monash et al., 2017). Interprofessional rounds distribute time differently, said Hallen, as interns and residents in the iPACE model spend more time interacting with patients and more time in interdisciplinary discussion. Preliminary findings suggest that the iPACE model leads to fewer interruptions because care team members are all on the same page. An interprofessional model may also improve health care provider wellness and decrease burnout, said Hallen. In contrast to the fear that patients will lose confidence in their care team, research has found that patients have positive perceptions of being included in discussions of their care plan and experience increased trust in health care providers (Mastalerz et al., 2025). Hallen concluded by saying iPACE is “scalable, adaptable and can be implemented using existing resources but should be distributed differently and ideally supported by individualized team workflow modifications.” She also
said the model has measurable benefits and promotes transparency and teamwork at the bedside while building trust between patients and their health care teams and health system.
Envisioning the Future of Diagnostic Excellence
In the final session, panelists highlighted key themes from the workshop. Andrew Olson from the University of Minnesota Medical School moderated the discussion, emphasizing that “one of the most pressing issues in health care is how we arrive at accurate, timely, equitable, and safe diagnoses for our patients.”
Teaching, Learning, and Assessing Diagnosis
Sydney Look-Why from BU Chobanian and Avedisian School of Medicine talked about her experience as a medical student learning the diagnostic process. During the first two years of medical school, “You think of a symptom, and you think what diagnoses can be associated with these symptoms, and it's very linear,” she said. But once learners transition into patient care in the third year, information about patients arrives at different times and is sometimes ambiguous enough to support multiple diagnoses because “none of it is clean and clear,” Look-Why said. She emphasized that the clerkship period in the third and fourth years of medical school presents an opportunity to develop the knowledge and skills of balancing and prioritizing information in order to create a differential diagnosis. She also stressed that the most helpful instructors are those who admit their mistakes and are transparent and reflective about their own process of diagnostic reasoning.
Eric Holmboe from Intealth noted the workshop’s focus on how an individual learns to diagnose, commenting that the diagnostic process is a “deeply human activity.” He recognized the importance of tools and concepts like AI, distributed cognition, and situativity theory,13 which can help augment the diagnostic process. Holmboe suggested taking a different approach to the traditional way of assessing competencies in diagnosis, particularly the reliance on multiple-choice exams that may not capture the complexity of the diagnostic process. Feedback loops are also needed in the system, said Holmboe, because too often clinicians make diagnostic judgments about a patient’s condition without ever learning the outcome of the patient’s case or whether their initial diagnosis was correct, leaving them with no way to improve. He said one way to bridge this gap would be to develop dashboards for diagnostic performance, such as a dashboard that tracks key data points like the preliminary diagnosis at time of admission, the final discharge diagnosis, and information about any changes. Olson added that this type of systematic feedback is standard in other fields, such as air traffic control and meteorology, and even professional sports teams review playback footage to reflect on their performance.
Mark Graber from Community Improving Diagnosis in Medicine followed up on the idea of diagnosis as an individual pursuit. He described a study of ED physicians showing that diagnostic errors were reduced by nearly one-third when physicians took a few minutes throughout the day to discuss their active cases with peers. Though this study demonstrated success in reducing diagnostic errors through team collaboration, medical schools still largely focus on the role of the individual in the diagnostic process, minimizing the role of peers, patients, nurses, and others. Graber suggested that medical education needs to provide more opportunities for teamwork and team diagnosis. Olson agreed and emphasized the importance of each team member’s contribution throughout patient care (e.g., obtaining and administering medication or starting intravenous therapies), saying, “I’m a hospitalist, and I’m completely worthless without everybody else [on the hospital team].” Olson added that it is easier and more straightforward to assess an individual’s performance and much harder to assess a team, so there is a need to develop assessment tools as the field calls for increased interprofessional teamwork and team diagnosis.
Grace Sun from the National Organization of Nurse Practitioner Faculties and the University of Texas at Tyler agreed with the need for health professionals to work in teams during the diagnostic process. Diagnostic reasoning involves a comprehensive assessment of not just medical information, Sun said, but also “social determinants of health, health literacy, health equity, and effective communication.” Given this wide breadth of information and data, she suggested multidisciplinary collaboration both within and outside of health care, including professionals such as social scientists, computer scientists, data scientists, and engineers. Sun said this type of collaboration is needed to achieve safe, effective patient care with sound diagnostic reasoning. Look-Why supported the need for a team approach, noting that the process of developing as a health care professional entails a lot of “trial and error,” and having a supportive team around the learner is essential for growth.
13 Situativity theory refers “to theoretical frameworks which argue that knowledge, thinking, and learning are situated or located in experience.” See https://pubmed.ncbi.nlm.nih.gov/21345059/ (accessed June 10, 2025).
Importance of Timely Diagnosis
A workshop attendee raised the issue of prolonged diagnostic delay among patients with rare diseases—who often wait seven years on average for a diagnosis, sometimes due to systemic issues in interoperability or lack of communication between clinicians and specialists—and asked the panelists for their thoughts on how HPE could put more focus on the importance of getting a timely diagnosis. Graber responded that “timeliness is the black box of diagnosis,” noting that there are no clear guidelines on how long a diagnosis should take. He suggested that professional societies could address this gap by creating benchmarks for timeliness and determining what resources are needed to support timely diagnosis for conditions within their specialty areas. Holmboe agreed and said there is a need to teach learners to be open to diagnostic uncertainty, to think beyond common diagnoses, and to reach out for other expertise when necessary. Sun agreed that current medical education discourages physicians from exploring rare conditions—so-called “zebras”—saying that physicians are taught to assume common explanations and diagnoses. Sun suggested health professions students learn active listening and use more holistic approaches in gathering a wider, more comprehensive breadth of information rather than defaulting to the most likely diagnosis. AI could serve as a diagnostic support tool, Sun added, because AI can collect and synthesize massive amounts of information that clinicians may not have time or capacity to sort through on their own. Olson observed that both premature and delayed diagnoses can lead to negative consequences, making the “correct” timing for a diagnosis a “challenging question.” He suggested that there may be strategies to identify patterns of care in the electronic health records to flag potential issues with a diagnosis and identify when clinicians should investigate further.
Importance of Reflection
Sun observed that excellence in diagnostic reasoning is a process and encouraged learners to understand the importance of the iterative, reflective process of diagnosis. Sun also suggested that faculty further develop their own skills in diagnostic reasoning, diagnostic stewardship, coaching, effective communication, and interprofessional collaboration in order to teach learners. She commented, “We have to teach the teachers to have a growth mindset and a lifelong learning perspective.” Both learners and teachers need to engage in deliberate, interactive reflection around their own cognition and their interactions with patients, she said. As AI proliferates throughout the health care system, critical analysis and reflection will become even more important, she emphasized.
Olson noted that the concept of “reflection” is sometimes misunderstood as a self-indulgent practice rather than a critical cognitive task of examining experience to reorganize knowledge and inform future behavior. Although there is a science of reflection, it can be notably absent in the clinical environment, and, despite feedback on other aspects of performance, the time and space for meaningful reflection are often treated as a luxury. Holmboe echoed the sentiment and added that reflection needs to be deliberate, systematic, and active. Faculty cannot assume learners are reflecting on their own and must create a space and structure for reflective opportunities. Simulated clinical environments are built specifically to offer opportunities for deliberate reflection, but there is also a need to facilitate reflection in the real world. Holmboe and Sun agreed the investment is worthwhile because, while it takes time to allow learners to reflect, there may be a greater cost and potential for substantial future harm if insufficient reflection leaves learners ill-prepared for independent practice. Graber added that reflection is a skill needed across all health professions, and it could serve as a crucial place of intersection for interprofessional education.
To close the workshop, Bindman offered his reflections and thoughts on how much diagnostic education has changed since his time in medical school. The diagnostic process is more explicitly discussed, more tools are available, and the way diagnosis is talked about with patients has changed. However, training as a diagnostician is still largely
seen as an individual pursuit. Developing a physician is an expensive endeavor, said Bindman, and it is worth considering whether there are ways to do things differently. “Are there better ways to teach and assess learners to create better and smarter clinicians who can tackle some of the major issues in the health care system?” Bindman asked. Bindman also said the “North Star” for HPE should not be better exam scores but creating better clinicians who can change clinical operations in ways that matter to patients. Bindman noted that participants at the workshop have wisdom, expertise, and respect for one another, and he urged them to act as a “mighty engine” to empower students in ways that will ultimately improve the public’s health and safety.
Suggestions from workshop participants for improving diagnostic excellence through health professional education and training are outlined in Box 1.
References
Cohen, A., M. Sur, C. Falco, G. Dhaliwal, G. Singhal, and S. Thammasitboon. 2022. Using the assessment of reasoning tool to facilitate feedback about diagnostic reasoning. Diagnosis 9(4):476–84.
Connor, D. M., C. L. Chou, D. L. David, and A. Kalet. 2014. Remediation in medical education: A mid-course correction.
Epstein, H. M., H. Haskell, C. Hemmelgarn, S. Coffee, S. Burrows, M. Burrows, I. Corina, T. Giardina, B. Z. Hose, K. M. Smith, W. Gallagher, and K. Miller. 2024. The patient’s role in diagnostic safety and excellence: From passive reception towards co-design. Rockville, MD: Agency for Healthcare Research and Quality.
BOX 1 SUGGESTIONS FROM INDIVIDUAL WORKSHOP PARTICIPANTS TO ADVANCE HEALTH CARE PROFESSIONAL EDUCATION AND TRAINING IN DIAGNOSTIC EXCELLENCE
Improving Diagnostic Reasoning
- Teach learners how to apply and balance Type 1 (fast, intuitive) and Type 2 (slow, analytical) thinking to improve diagnostic reasoning (Lucey, Lypson, Rencic).
- Explicitly make space and time for learners to engage in deliberate self-reflection and meaningful feedback on performance and on the process of diagnostic reasoning (Burt, Lypson, Parsons, Sun, Thammasitboon).
- Provide learners with deliberate and scaffolded opportunities to practice diagnostic reasoning, in both simulated environments and clinical environments (Burt, Parsons, Rencic, Thammasitboon).
- Train faculty on topics such as Type 1 and Type 2 thinking, reflection, implicit bias, communication, coaching, and interprofessional collaboration before teaching learners (Look-Why, Lypson, Sun).
- Use frameworks such as the Assessment of Reasoning Tool to structure and guide learning conversations between teachers and trainees and provide effective feedback on diagnostic reasoning (Thammasitboon).
- Strengthen deliberate practice in simulated and clinical environments, use appropriately challenging and variable case examples, and provide feedback at all levels to help learners develop skills in diagnostic reasoning (Burt, Parsons).
Enhancing Clinical Competencies
- Establish diagnostic excellence as a core competency for all practicing clinicians (Lucey).
- Increase geriatric competencies in education and training programs to advance diagnostic excellence for older adults (Tran).
- Shift nursing education toward an adaptive, interprofessional, and competency-based format (Gleason).
Optimizing the Use of Diagnostic Tests and Technologies
- Teach learners to think critically about AI tools, how to identify the strengths and limitations of AI, and to recognize when it is not working or is not appropriate for the context (James, Lucey, Sun).
- Utilize AI in health professional education to improve assessment, feedback, and evaluation of learners’ diagnostic reasoning knowledge and skills (Burt).
- Provide educators with foundational knowledge of AI to strengthen their ability to integrate it into teaching (James, Lucey).
- Implement diagnostic stewardship strategies to promote high-value diagnostic testing by improving test selection, reducing repeat and low-value testing, and minimizing harm from false-positive results or overdiagnosis (Vaughn).
- Use electronic medical record nudges and reminders to optimize the process of diagnostic testing and reduce unnecessary treatments (Vaughn).
- Leverage simulation-based education to enhance diagnostic competency by providing a safe environment for learners to make and correct errors without harming patients and enabling timely and specific feedback from different groups such as peers, experts, and patients (Burt).
Emphasizing Patient-Centered Communication
- Address implicit bias by encouraging learners to slow down, engage in reflection, consider alternative diagnoses, use decision-support tools instead of relying on assumptions, and challenge their assumptions about patients (Lypson).
- Equip clinicians with a conversational framework that helps patients understand and manage diagnostic uncertainty by incorporating elements such as reassurance, being honest, validating and normalizing, and addressing unmet needs (McCarthy).
- Teach trainees to engage patients as co-designers and co-producers during the diagnostic process, particularly with older adults who face unique and multilayered diagnostic challenges (Tran).
- Encourage learners to actively listen to patients’ concerns to develop a holistic picture of the patient and to think broadly about potential diagnoses (Holmboe, Sun).
Advancing Interprofessional Education and Team-Based Care
- Provide opportunities for health professions learners across disciplines to practice and improve their approach to the diagnostic process, engage in peer consultation and reflection, and work as a collaborative team to deliver care (Gleason, Graber, Hallen, Olson, Osborne).
- Build opportunities for interprofessional training and practice throughout the health professions education and training trajectory (Graber, Hallen, Osborne).
- Provide meaningful interprofessional training that engages different health care professionals and blends their distinct perspectives and professional lenses to create a better diagnostic model that centers the patient and the patient's priorities (Osborne, Tran).
Identifying Research Priorities
- Strengthen research on the use and effectiveness of technologies and interventions to support clinicians in diagnostic decision making (Burt, Hallen, James, Vaughn).
- Improve research on the impacts of learning interventions on diagnostic reasoning (Burt, Parsons).
- Expand research on the benefits and limitations of integrating AI into the diagnostic team (James).
NOTE: This list is the rapporteurs’ summary of points made by the individual speakers identified, and the statements have not been endorsed or verified by the National Academies of Sciences, Engineering, and Medicine. They are not intended to reflect a consensus among workshop participants.
Gleason, K. T., R. Jones, C. Rhodes, P. Greenberg, G. Harkless, C. Goeschel, M. Cahill, and M. Graber. 2021. Evidence that nurses need to participate in diagnosis: Lessons from malpractice claims. Journal of Patient Safety 17(8):e959-e963.
Gonzalez, C. M., M. L. Lypson, and J. Sukhera. 2021. Twelve tips for teaching implicit bias recognition and management. Medical Teacher 43(12):1368–73.
Gonzalez, C. M., T. K. Ark, M. R. Fisher, P. R. Marantz, D. J. Burgess, F. Milan, M. T. Samuel, M. L. Lypson, C. J. Rodriguez, and A. L. Kalet. 2024. Racial implicit bias and communication among physicians in a simulated environment. JAMA Network Open 7(3):e242181.
Guerrasio, J., M. J. Garrity, and E. M. Aagaard. 2014. Learner deficits and academic outcomes of medical students, residents, fellows, and attending physicians referred to a remediation program, 2006–2012. Academic Medicine 89(2):352–8.
Hallen, S., T. Van der Kloot, C. McCormack, P. K. J. Han, F. L. Lucas, L. Wierda, D. Meyer, K. Varaklis, and R. Bing-You. 2020. Redesigning the clinical learning environment to improve interprofessional care and education: Multi-method program evaluation of the iPACE pilot unit. Journal of Graduate Medical Education 12(5):598–610.
Jabbour, S., D. Fouhey, S. Shepard, T. S. Valley, E. A. Kazerooni, N. Banovic, J. Wiens, and M. W. Sjoding. 2023. Measuring the impact of AI in the diagnosis of hospitalized patients: A randomized clinical vignette survey study. JAMA 330(23):2275–84.
James, C., K. Singh, T. S. Valley, and J. Wiens. 2023. Issue brief 13. Reimagining healthcare teams: Leveraging the patient–clinician–AI triad to improve diagnostic safety. Rockville, MD: Agency for Healthcare Research and Quality.
Kahneman, D., and S. Frederick. 2002. Representativeness revisited: Attribute substitution in intuitive judgment. In Heuristics and Biases: The Psychology of Intuitive Judgment, edited by T. Gilovich, D. Griffin, and D. Kahneman. New York: Cambridge University Press. Pp. 49–81.
Kavanagh, J., and P. Sharpnack. 2021. Crisis in competency: A defining moment in nursing education. OJIN: The Online Journal of Issues in Nursing 26(1).
Mastalerz, K. A., S. R. Jordan, and S. C. Connors. 2025. A qualitative study of patient and interprofessional healthcare team member experiences of bedside interdisciplinary rounds at a VA: Language, teamwork, and trust. Journal of General Internal Medicine 40(3):538–46.
Monash, B., N. Najafi, M. Mourad, A. Rajkomar, S. R. Ranji, M. C. Fang, M. Glass, D. Milev, Y. Ding, A. Shen, B. A. Sharpe, and J. D. Harrison. 2017. Standardized attending rounds to improve the patient experience: A pragmatic cluster randomized controlled trial. Journal of Hospital Medicine 12(3):143–9.
NASEM (National Academies of Sciences, Engineering, and Medicine). 2015. Improving Diagnosis in Health Care. Washington, DC: The National Academies Press. https://doi.org/10.17226/21794.
Obermeyer, Z., B. Powers, C. Vogeli, and S. Mullainathan. 2019. Dissecting racial bias in an algorithm used to manage the health of populations. Science 366(6464):447–53.
Parsons, A. S., J. J. Dreicer, J. R. Martindale, G. Young, and K. M. Warburton. 2024. A targeted clinical reasoning remediation program for residents and fellows in need. Journal of Graduate Medical Education 16(4):469–74.
Parsons, A. S., and K. M. Warburton. 2019. A novel clinical reasoning coaching program for the medicine learner in need. MedEdPublish 8:9. https://doi.org/10.15694/mep.2019.000009.1.
Tanner, C. A. 2006. Thinking like a nurse: A research-based model of clinical judgment in nursing. Journal of Nursing Education 45(6):204–11.
Think Local Act Personal and the National Co-Production Advisory Group. n.d. Ladder of Co-Production. https://www.thinklocalactpersonal.org.uk/ (accessed June 19, 2025).
Tran, A., L. Blackall, M. A. Hill, and W. Gallagher. 2025. Engaging older adults in diagnostic safety: Implementing a diagnostic communication note sheet in a primary care setting. Frontiers in Health Services 4:1474195.
Vaughn, V. M., A. Gupta, L. A. Petty, A. N. Malani, D. Osterholzer, P. K. Patel, M. Younas, S. J. Bernstein, S. Burdick, D. Ratz, J. E. Szymczak, E. McLaughlin, T. Czilok, T. Basu, J. K. Horowitz, S. A. Flanders, and T. N. Gandhi. 2023. A statewide quality initiative to reduce unnecessary antibiotic treatment of asymptomatic bacteriuria. JAMA Internal Medicine 183(9):933–41.
Warburton, K. M., E. Goren, and C. J. Dine. 2017. Comprehensive assessment of struggling learners referred to a graduate medical education remediation program. Journal of Graduate Medical Education 9(6):763–7.
Wen, L. S., J. A. Espinola, J. M. Mosowsky, and C. A. Camargo Jr. 2015. Do emergency department patients receive a pathological diagnosis? A nationally-representative sample. Western Journal of Emergency Medicine 16(1):50.
Wong, A., E. Otles, J. P. Donnelly, A. Krumm, J. McCullough, O. DeTroyer-Cooley, J. Pestrue, M. Phillips, J. Konye, C. Penoza, M. Ghous, and K. Singh. 2021. External validation of a widely implemented proprietary sepsis prediction model in hospitalized patients. JAMA Internal Medicine 181(8):1065–70.
Yu, F., A. Moehring, O. Banerjee, T. Salz, N. Agarwal, and P. Rajpurkar. 2024. Heterogeneity and predictors of the effects of AI assistance on radiologists. Nature Medicine 30(3):837–49.
Disclaimer: This Proceedings of a Workshop—in Brief was prepared by Jennifer Lalitha Flaubert, Adrienne Formentos, and Erin Hammers Forstag as a factual summary of what occurred at the workshop. The statements made are those of the rapporteurs or individual workshop participants and do not necessarily represent the views of all workshop participants; the planning committee; or the National Academies of Sciences, Engineering, and Medicine.
Planning Committee: Andrew Bindman (Chair), Kaiser Permanente; Emily Abdoler, University of Michigan Medical School; Cristina Gonzalez, New York University Grossman School of Medicine; Gene Harkless, University of New Hampshire; Barry Issenberg, University of Miami; Andrew Olson, University of Minnesota Medical School; Dimitrios Papanagnou, Thomas Jefferson University; Geeta Singhal, Baylor College of Medicine; Maria Soto-Greene, Rutgers New Jersey Medical School; Yolanda Wimberly, Grady Health System. The National Academies’ planning committees are solely responsible for organizing the workshop, identifying topics, and choosing speakers. Responsibility for the final content rests entirely with the rapporteurs and the National Academies.
Reviewers: To ensure that it meets institutional standards for quality and objectivity, this Proceedings of a Workshop—in Brief was reviewed by Joseph Rencic, Boston University Chobanian and Avedisian School of Medicine; Alberta Tran, MedStar Health Institute for Quality and Safety; Valerie Vaughn, University of Utah. Leslie Sim, National Academies of Sciences, Engineering, and Medicine, served as the review coordinator.
Sponsors: This workshop was supported by the American Association of Nurse Practitioners, American Board of Internal Medicine, American College of Radiology, Centers for Disease Control and Prevention, College of American Pathologists, Danaher Corporation, The Doctors Company, The Gordon and Betty Moore Foundation, The John A. Hartford Foundation, The Mont Fund, and Radiological Society of North America. Any opinions, findings, conclusions, or recommendations expressed in this publication do not necessarily reflect the views of any organization or agency that provided support for the project.
Staff: Jennifer Lalitha Flaubert, Adrienne Formentos, Adaeze Okoroajuzie, and Sharyl Nass. Board on Health Care Services, Health and Medicine Division.
Suggested citation: National Academies of Sciences, Engineering, and Medicine. 2025. Advancing Health Care Professional Education and Training in Diagnostic Excellence: Proceedings of a Workshop—in Brief. Washington, DC: National Academies Press. https://doi.org/10.17226/29203.
Copyright 2025 by the National Academy of Sciences. All rights reserved.