Michael A. Barone, Jessica L. Bienstock, Elise Lovell, John R. Gimpel, Grant L. Lin, Jennifer Swails, George C. Mejicano

Journal of Graduate Medical Education: Volume 14, Issue 6, Pages 634-638


This article discusses recent recommendations from the Undergraduate Medical Education to Graduate Medical Education (UME-GME) Review Committee (UGRC) to address challenges in the UME-GME transition, including its complexity, negative impact on well-being, costs, and inequities.

Jennifer L. Swails, Steven Angus, Michael Barone, Jessica Bienstock, Jesse Burk-Rafel, Michelle Roett, Karen E. Hauer

Academic Medicine: Volume 98, Issue 2, Pages 180-187


This article describes how the Coalition for Physician Accountability's Undergraduate Medical Education to Graduate Medical Education Review Committee (UGRC) applied a quality improvement approach and systems thinking to explore the underlying causes of dysfunction in the undergraduate medical education (UME) to graduate medical education (GME) transition.

Mark Gierl, Kimberly Swygert, Donna Matovinovic, Allison Kulesher, Hollis Lai

Teaching and Learning in Medicine: Volume 33, Issue 4, Pages 366-381


The purpose of this analysis is to describe sources of evidence that can be used to evaluate the quality of generated test items. The article highlights the important role of medical expertise in the development and evaluation of generated items as a crucial requirement for producing validation evidence.

P. Harik, R.A. Feinberg, B.E. Clauser

Integrating Timing Considerations to Improve Testing Practices


This chapter addresses a different aspect of the use of timing data: it provides a framework for understanding how an examinee's use of time interacts with time limits to affect both test performance and the validity of inferences made from test scores. It focuses primarily on examinations administered as part of the physician licensure process.

M.R. Raymond, C. Stevens, S.D. Bucak

Advances in Health Sciences Education: Volume 24, Pages 141-150 (2019)


Research suggests that the three-option format is optimal for multiple choice questions (MCQs). This conclusion is supported by numerous studies showing that most distractors (i.e., incorrect answers) are selected by so few examinees that they are essentially nonfunctional. However, nearly all studies have defined a distractor as nonfunctional if it is selected by fewer than 5% of examinees.