Kimberly Brown, MD, FACS, and Emil Petrusa, PhD
May 1, 2018
This article is intended for clerkship directors, leaders in Undergraduate Medical Education (UME), and residency program directors who are interested in the current state and evolving innovations in assessing the clinical skills of trainees during the transition between medical school and residency.
The goal of surgical education is a safe, competent surgeon who is ready to begin a defined scope of independent practice upon graduation. Ideally, Undergraduate Medical Education (UME) and Graduate Medical Education (GME) programs would achieve this goal through a coordinated framework of training and assessment. This ideal framework would start by defining the knowledge, skills, and behaviors expected of a surgery graduate and creating a progressive curriculum across UME and GME. The foundational general medical knowledge and skills expected of all UME graduates would be augmented with a defined set of skills required for a surgical intern to perform expected duties on day one. Consistent educational and assessment practices for bridging the surgery continuum across the country would ensure that all residency programs could expect the same level of knowledge and skills of interns, regardless of the medical school from which they come. This framework would also include a process for medical schools to identify students who do not demonstrate the expected knowledge, skills, or behaviors, so that remediation occurs prior to graduation. Thus, a key feature of this ideal system is a mechanism to provide accurate, actionable performance assessment data across the transition from UME to GME.
Many pieces of this ideal system are either in place or in development, starting with a framework to define the core essential activities that a graduating surgery resident should be able to perform, and the general attributes, or competencies, required to successfully perform these expected professional activities.1,2 The Core Entrustable Professional Activities (EPA) for Entering Residency document describes the expected skills of a UME graduate, and the American College of Surgeons has described in more detail the specific knowledge and skills a surgical intern should possess prior to starting residency.3,4 The EPA framework is now being developed for GME training in multiple specialties, creating a specialty-specific continuum of progressive, observable behaviors amenable to longitudinal assessment across UME and GME.5–8 Pediatrics is furthest along with this effort; a group of institutions is studying the outcome of making advancement decisions from UME to GME based on assessments of EPAs.9 Orthopaedic surgery, drawing on the Canadian experience in a competency-based GME training program, is also exploring different frameworks for competency-based advancement bridging UME and GME.10
Currently, the American Board of Surgery is leading a pilot project around five core EPAs in General Surgery (see Box 1). Using a common framework across UME and GME allows us to identify the gaps between medical school graduates’ performance and residency program expectations, and eventually to tailor focused training to address deficiencies in individuals or in institutional curricula.11
The choice of assessment tools depends on the construct being assessed and the purpose of the assessment. Using Miller’s pyramid (see Figure 1) as a conceptual framework, an EPA performed in the clinical environment represents the highest level of skill, and the assessment construct is the performance of that activity in the clinical context.12 Thus, while knowledge assessments such as multiple-choice exams and skills assessments such as Objective Structured Clinical Examinations play a key role in assessing components of entrustable behavior and in determining that a trainee is ready to perform effectively in the clinical environment, workplace-based assessments (WBAs) allow additional assessment in real contexts.
In workplace-based assessment, the desired performance for a given clinical skill is clearly described, along with a checklist or global rating scale with anchoring descriptors for the relevant aspects of the performance, which may also include the level of autonomy or supervision required; this creates a shared mental model between rater and trainee. The trainee is observed providing clinical care, and feedback is given comparing the observed performance to the desired performance. Examples include the mini-clinical evaluation exercise (mini-CEX), used for evaluation and management skills; the Objective Structured Assessment of Technical Skills (OSATS), created for specific procedures; and the System for Improving and Measuring Procedural Learning (SIMPL), used in resident operative cases.13–15
The ultimate purpose of assessment is to make an entrustment decision about what level of supervision a trainee requires while performing a given activity. However, this high-stakes decision requires multiple low-stakes assessments of knowledge, skills, and behaviors across different contexts, with feedback provided to the trainee about how the observed performance compares with the expected performance.16 In the UME setting, these low-stakes assessments occur across all clinical rotations, and the high-stakes assessments occur at the institutional level, either from structured assessment events, from the aggregated low-stakes data, or both. This arrangement may not provide a program director with helpful information regarding a graduate’s skills specific to the care of surgical patients; however, there is significant interest in leveraging opportunities in the fourth year of medical school to achieve specialty-specific EPAs.17
For example, students going into surgery can take a capstone course in the fourth year of medical school, such as the ACS/APDS/ASE Resident Prep Curriculum. Modules developed for this curriculum cover technical and non-technical skills, and most include an assessment tool, the results of which can be fed forward to program directors. These assessments are knowledge- and skills-based, corresponding to the “knows,” “knows how,” and “shows how” levels of Miller’s pyramid, and can help inform decisions around indirect supervision. Medical schools can also incorporate specialty-specific EPAs into sub-internships, including a process for tailored training or remediation during the fourth year so that students achieve all expected goals prior to graduation. Restructuring the Medical Student Performance Evaluation (MSPE) is one potential mechanism to feed performance information forward. However, in this process it is important to manage medical schools’ potential conflict of interest between providing accurate information and, at the same time, optimizing students’ chances of matching into a residency program. One approach is to provide a structured report of core and specialty-specific EPA assessments after a student matches.
On the other side of the UME-to-GME transition, assessment of interns’ skills can be accomplished through a “boot camp” style program held at the beginning of residency training. The content of this activity is driven by local needs and resources, and may be a “massed” experience prior to starting clinical duties or a more distributed program of training and assessment over the first months of internship.18,19 The ACS/APDS Surgery Resident Skills Curriculum, Phase 1: Core Skills, is a useful resource for instruction and assessment tools for incoming interns.20 This initial period of training and assessment can also incorporate workplace-based assessments to inform entrustment decisions and direct further tailored training. These intern training programs have arisen from concerns over the variability in medical school graduates’ readiness to perform expected duties as first-day interns. As EPA-like assessments at the medical school level become more robust and reliable, GME programs may be able to focus resources on more tailored training activities.
Kimberly M. Brown, MD, FACS, is an associate professor of surgery at the University of Texas at Austin Dell Medical School and serves as associate chair of education for the Department of Surgery and Perioperative Care.
Emil R. Petrusa, PhD, is an associate professor of surgery at Massachusetts General Hospital.