RIS ID

97789

Publication Details

O'Mara, D. A., Canny, B., Rothnie, I. P., Wilson, I. G., Barnard, J. & Davies, L. (2015). The Australian Medical Schools Assessment Collaboration: benchmarking the preclinical performance of medical students. Medical Journal of Australia, 202 (2), 95-98.

Abstract

Objectives: To report the level of participation of medical schools in the Australian Medical Schools Assessment Collaboration (AMSAC), and to measure differences in student performance related to medical school characteristics and implementation methods.

Design: Retrospective analysis of data using the Rasch statistical model to correct for missing data and for variability in item difficulty. Linear model analysis of variance was used to assess differences in student performance.

Setting and participants: 6401 preclinical students from 13 medical schools that participated in AMSAC from 2011 to 2013.

Main outcome measures: Rasch estimates of preclinical basic and clinical science knowledge.

Results: Representation of Australian medical schools and students in AMSAC more than doubled between 2009 and 2013; in 2013 it included 12 of 19 medical schools and 68% of medical students. Graduate-entry students scored higher than students entering straight from school, and students at large schools scored higher than students at small schools; although these differences were highly significant (P < 0.001), the main effect sizes were small (4.5% and 2.3%, respectively). The time allowed per multiple choice question was not significantly associated with student performance, and administering the test items across multiple assessments rather than in a single end-of-year examination had a negligible effect. The variables investigated explained only 12% of the total variation in student performance.

Conclusions: An increasing number of medical schools are participating in AMSAC to monitor student performance in the preclinical sciences against an external benchmark. Medical school characteristics account for only a small part of the overall variation in student performance, and performance was not affected by the different methods of administering the test items.
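
The Design section above describes two analysis steps: a Rasch model fitted to the item responses (to handle missing data and variation in item difficulty) and a linear model analysis of variance on the resulting performance estimates. The sketch below is not the authors' code; it uses simulated data and hypothetical variable names to illustrate, under those assumptions, how such an analysis could be set up in Python.

```python
"""Illustrative sketch only (not the authors' code): simulated data and
hypothetical variable names throughout."""
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated student-by-item MCQ responses with ~20% missing entries,
# mimicking items that were not administered at every school.
n_students, n_items = 300, 40
true_ability = rng.normal(0.0, 1.0, n_students)
true_difficulty = rng.normal(0.0, 1.0, n_items)
p_correct = 1.0 / (1.0 + np.exp(-(true_ability[:, None] - true_difficulty[None, :])))
responses = (rng.random((n_students, n_items)) < p_correct).astype(float)
responses[rng.random(responses.shape) < 0.2] = np.nan
observed = ~np.isnan(responses)

def neg_log_likelihood(params):
    """Joint maximum-likelihood Rasch model: P(correct) = logistic(theta_i - beta_j),
    summed over observed responses only, so missing data simply drop out."""
    theta = params[:n_students]          # person abilities
    beta = params[n_students:]           # item difficulties
    logit = theta[:, None] - beta[None, :]
    ll = responses * logit - np.logaddexp(0.0, logit)
    return -ll[observed].sum()

fit = minimize(neg_log_likelihood, np.zeros(n_students + n_items), method="L-BFGS-B")
ability = fit.x[:n_students] - fit.x[:n_students].mean()   # centre the ability scale

# Linear model / ANOVA relating estimated ability to (hypothetical) school factors.
students = pd.DataFrame({
    "ability": ability,
    "entry": rng.choice(["graduate", "school_leaver"], n_students),
    "school_size": rng.choice(["large", "small"], n_students),
})
model = smf.ols("ability ~ C(entry) + C(school_size)", data=students).fit()
print(model.summary())   # coefficients, F-tests, R-squared (variance explained)
```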

Link to publisher version (DOI)

http://dx.doi.org/10.5694/mja14.00772