
RESEARCH LIBRARY
View the latest publications from members of the NBME research team
JMIR Medical Education: Volume 8, Issue 2, e30988
This article compares the reliability of two assessment groups (crowdsourced laypeople and patient advocates) in rating physician error disclosure communication skills using the Video-based Communication Assessment app.
JMIR Medical Education: Volume 8, Issue 4
The Video-based Communication Assessment (VCA) app is a novel tool for simulating communication scenarios for practice and obtaining crowdsourced assessments and feedback on physicians’ communication skills. This article aims to evaluate the efficacy of using VCA practice and feedback as a stand-alone intervention for the development of residents’ error disclosure skills.
Medical Teacher: Volume 40, Issue 8, pp. 838-841
Adaptive learning requires frequent and valid assessments for learners to track progress against their goals. This study determined if multiple-choice questions (MCQs) “crowdsourced” from medical learners could meet the standards of many large-scale testing programs.
Journal of Educational Measurement: Volume 55, Issue 2, pp. 308-327
The widespread move to computerized test delivery has led to the development of new approaches to evaluating how examinees use testing time and to new metrics designed to provide evidence about the extent to which time limits impact performance. Much of the existing research is based on these types of observational metrics; relatively few studies use randomized experiments to evaluate the impact of time limits on scores. Of those studies that do report on randomized experiments, none directly compare the experimental results to evidence from observational metrics to evaluate how sensitively these metrics identify conditions in which time constraints actually affect scores. The present study provides such evidence based on data from a medical licensing examination.