See One, Do One, Take an Assessment
- 1. Department of Anesthesiology and Perioperative Medicine and the Department of Radiology, Oregon Health and Science University, Portland, Oregon, USA
- 2. Associate Director, Regional Anesthesia and Acute Pain Fellowship, University of Miami Miller School of Medicine, Miami, Florida, USA
Citation
Woodworth G, Missair A (2013) See One, Do One, Take an Assessment. Int J Clin Anesthesiol 1: 1001.
INTRODUCTION
Medicine has a long history that is fundamentally rooted in an apprenticeship model of education. A common refrain heard during residency has been “see one, do one, teach one.” The new generation of physicians is likely to hear a much different tune. Medical education is rapidly evolving from the traditional experience-based model to a competency-based model with defined educational goals and competency requirements [1,2]. Recognizing the importance of competency assessment and certification for specialists, the American Board of Anesthesiology administered its first written examination in 1937 [3]. The exam covered pharmacology, pathology, anatomy, physics and chemistry, and examinees were required to complete 3 of 5 essay questions within the allotted time. This exam eventually evolved into a multiple-choice assessment of medical knowledge. Medicine and graduate medical education have a time-honored tradition of assessing medical knowledge via this model, one that has graduated many medical students who could fairly be described as test-taking machines.
Of course, physician competency requires more than medical knowledge. The last 15 years have seen dramatic changes in the definition and assessment of physician competency. In the United States this was reflected in the cooperation between the Accreditation Council for Graduate Medical Education (ACGME) and the American Board of Medical Specialties (ABMS) to define and assess competency [4]. In 1998 the ACGME, with input from the ABMS, began a 15-year “Outcome Project” to revise the process of accreditation for U.S. training programs. The goal of the project was to realign graduate medical education with academic achievement as well as patient outcomes. In the era of evidence-based medicine, the project sought to answer the question, “How can we demonstrate that educational programs are effective?” This goal requires an established definition of physician competence as well as methods to assess the attainment of competence during training. In 2002, Epstein published a landmark article in the Journal of the American Medical Association, proposing that physician competence requires knowledge, communication, reasoning, technical skills, values, and reflection [5]. At about the same time, the ACGME adopted six competency domains to assess medical residents: medical knowledge, patient care, professionalism, communication and interpersonal skills, practice-based learning and improvement, and systems-based practice [6]. In 2009, the ACGME updated the Outcome Project to include milestones. Each residency review committee (RRC) was tasked with developing specific competency-based milestones for its specialty. Implementation of milestones by anesthesiology residencies is scheduled for 2014. The ACGME is encouraging individual specialties and programs to innovate and develop their own tools to determine whether specific competency milestones have been met [7,8]. To assist in this effort, the ACGME provided a list of potential types of assessment tools to consider implementing, including Objective Structured Clinical Exams (OSCE), patient surveys, portfolios, record review, and simulation.
To truly assess competency, an assessment tool must be developed using psychometric principles and shown to be valid and reliable [9-13]. Few programs have the resources and expertise to develop valid assessments. Admittedly, assessments of residency milestones are not high-stakes examinations; however, it would be preferable to use validated assessments. Lurie reviewed the existing medical literature and found few published assessment tools to be reliable and valid [14]. In the United States, the National Board of Medical Examiners implemented simulation-based clinical skills assessment in 2005 [15]. Recently, attention has focused on how best to assess other important aspects of competency, including judgment, teamwork, communication, professionalism, and procedural skills [10,11]. Anesthesiology has made some initial progress in the development of assessment tools. Early forms of assessment in anesthesia focused on knowledge or the application of knowledge in the form of recall of facts or principles, in part because these could be readily assessed with multiple-choice examinations. Investigators are now testing a variety of assessment tool types in anesthesiology, including high- and low-fidelity simulation, OSCE, and the Objective Structured Assessment of Technical Skills (OSATS) [16-25]. One of the most interesting implementations has been the incorporation of simulation and OSCE-style assessments into the Israeli national board examination process [26]. The American Board of Anesthesiology has announced the restructuring of the Part 2 Oral Exam into an “Applied Exam” that will include OSCE-style assessments [27].
No matter what instrument is used for assessment of milestones, it is necessary to use appropriate techniques to determine passing scores [28]. This is critically important for high-stakes examinations, but is also important for milestone assessment. Few of the published assessment tools in anesthesia address this concept.
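To illustrate what such a technique can look like, consider a modified Angoff procedure, one common criterion-referenced standard-setting approach (offered here only as a hypothetical illustration, not necessarily the method described in reference [28] or the one we will adopt). Each of J expert judges estimates, for every one of the K items on the assessment, the probability p(j,k) that a minimally competent, or “borderline,” resident would perform the item correctly. The passing score is the mean of the judges’ summed estimates:

Passing score = (1/J) × Σ over judges j [ Σ over items k p(j,k) ]

For example, if three judges’ summed estimates across a 20-item assessment were 13.5, 14.0, and 14.5, the resulting cut score would be 14 of 20 items (70%). Whatever method is chosen, the key point is that the passing score is derived systematically from expert judgment or examinee performance data rather than set arbitrarily.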
The implementation of milestone assessments in a residency program must be practical. Given the large number of potential milestones and the variety of assessments that will be required, administration of the assessments could be problematic. In our institutions, we have looked at the regional anesthesia curriculum as a pilot for the implementation of milestone assessments. For example, one aspect of regional anesthesia that will require assessment is competency with procedures. Is it practical, or necessary, to assess resident competency for each and every type of peripheral nerve block? We have chosen to pilot an assessment approach that examines both procedural competency and competency in component skills. To assess competency with peripheral nerve blocks, we are pilot testing a modification of an existing validated OSCE/OSATS for interscalene blocks [20]. The assessment tool can be used with a patient simulator or while a resident is performing the block on an actual patient. The tool consists of a checklist and global rating scales covering everything from equipment set-up and adherence to sterile technique to probe handling and proper positioning of the nerve target on the ultrasound screen. Because it may not be practical to administer this type of assessment for every block type, we have chosen to focus on the component skills of ultrasound-guided nerve blockade. In addition to sterile technique, ergonomics, probe handling, and nerve target acquisition with ultrasound, the resident must be able to recognize and interpret ultrasound images and be skillful at guiding needles to targets under ultrasound. If a resident can pass the basic interscalene block OSCE/OSATS, we assume competence in sterile technique, ergonomics, and related component skills for other blocks. However, in addition to the interscalene block assessment, we will require residents to pass an assessment of needle guidance under ultrasound and an assessment of ultrasound interpretation skills to demonstrate competency in ultrasound-guided peripheral nerve blocks.

Despite the importance of ultrasound to regional anesthesia, a validated assessment of ultrasound interpretation skills has not been published. Using sound psychometric principles, we have developed an ultrasound interpretation skills assessment that requires learners to view ultrasound clips and answer questions about them. The clips are obtained from real patients, have varying degrees of image clarity as in real life, and cover a defined knowledge base (content validity). The questions have been designed to test a range of competency, with some questions that should be answered correctly by a junior resident and others that should be answered correctly by a graduating senior resident. The stem for each question includes an illustrated diagram and a textual description of how the clip was obtained. A question may ask the learner to identify a structure, to select how the probe should be moved to better visualize a structure, or to explain why the probe should be repositioned. The exam has been pilot tested and is currently undergoing a validation phase across multiple institutions that will include determination of passing scores.
In this new era of medicine, we can expect increased scrutiny of competency at every level: for licensure, during residency, for certification and maintenance of certification, for credentialing with hospitals and health plans, and for the adoption of new techniques or technologies that arrive on the scene after we have finished training. The role of the research community should be to provide practical and validated assessments to satisfy these demands. As academicians training future physicians, we must embrace the responsibility that comes with mentorship and the need to validate the models and benchmarks used to assess our future peers. In return, our specialties will safeguard the quality of our trainees and the safety of our patients.
REFERENCES
4. Accreditation Council for Graduate Medical Education (ACGME). The Outcome Project: Introduction. 2001.
5. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002; 287: 226-235.
9. Epstein RM. Assessment in medical education. N Engl J Med. 2007; 356: 387-396.
15. Papadakis MA. The Step 2 clinical-skills examination. N Engl J Med. 2004; 350: 1703-1705.
24. Tetzlaff JE. Assessment of competency in anesthesiology. Anesthesiology. 2007; 106: 812-825.
27. American Board of Anesthesiology. http://www.theaba.org/Home/TrainingPrograms. Accessed July 24, 2013.