Research Library Publications
Michael A. Barone, Jessica L. Bienstock, Elise Lovell, John R. Gimpel, Grant L. Lin, Jennifer Swails, George C. Mejicano

Journal of Graduate Medical Education: Volume 14, Issue 6, Pages 634-638


This article discusses recent recommendations from the UME-GME Review Committee (UGRC) to address challenges in the UME-GME transition—including complexity, negative impact on well-being, costs, and inequities.

Jennifer L. Swails, Steven Angus, Michael Barone, Jessica Bienstock, Jesse Burk-Rafel, Michelle Roett, Karen E. Hauer

Academic Medicine: Volume 98, Issue 2, Pages 180-187


This article describes the work of the Coalition for Physician Accountability’s Undergraduate Medical Education to Graduate Medical Education Review Committee (UGRC) to apply a quality improvement approach and systems thinking to explore the underlying causes of dysfunction in the undergraduate medical education (UME) to graduate medical education (GME) transition.

P. Harik, B. E. Clauser, I. Grabovsky, P. Baldwin, M. Margolis, D. Bucak, M. Jodoin, W. Walsh, S. Haist

Journal of Educational Measurement: Volume 55, Issue 2, Pages 308-327


The widespread move to computerized test delivery has led to new approaches for evaluating how examinees use testing time and to new metrics designed to provide evidence about the extent to which time limits affect performance. Much of the existing research is based on these observational metrics; relatively few studies use randomized experiments to evaluate the impact of time limits on scores. Of the studies that do report randomized experiments, none directly compares the experimental results with evidence from observational metrics to evaluate how sensitively those metrics identify conditions in which time constraints actually affect scores. The present study provides such evidence based on data from a medical licensing examination.