http://dx.doi.org/10.17496/kmer.2019.21.1.51

How Do Medical Students Prepare for Examinations: Pre-assessment Cognitive and Meta-cognitive Activities  

Yune, So-Jung (Department of Medical Education, Pusan National University School of Medicine)
Lee, Sang-Yeoup (Department of Medical Education, Pusan National University School of Medicine)
Im, Sunju (Department of Medical Education, Pusan National University School of Medicine)
Publication Information
Korean Medical Education Review / v.21, no.1, 2019, pp. 51-58
Abstract
Although 'assessment for learning' rather than 'assessment of learning' has been emphasized recently, how students actually learn before examinations remains unclear. The purpose of this study was to investigate pre-assessment learning activities (PALA) and to identify the mechanism factors (MF) that influence those activities. Moreover, we compared the PALA and MF of written exams with those of the clinical performance examination/objective structured clinical examination (CPX/OSCE) in third-year (N=121) and fourth-year (N=108) medical students. Through literature review and discussion, questionnaires with a 5-point Likert scale were developed to measure PALA and MF. PALA comprised the constructs of cognitive and meta-cognitive activities, and MF had sub-components of personal, interpersonal, and environmental factors. Cronbach's α coefficient was used to assess survey reliability, while the Pearson correlation coefficient and multiple regression analysis were used to investigate the influence of MF on PALA. A paired t-test was applied to compare the PALA and MF of written exams with those of CPX/OSCE in third- and fourth-year students. The Pearson correlation coefficients between PALA and MF were 0.479 for written exams and 0.508 for CPX/OSCE. MF explained 24.1% of the variance in PALA for written exams and 25.9% for CPX/OSCE. Both PALA and MF differed significantly between written exams and CPX/OSCE in third-year students, whereas no differences were found in fourth-year students. Educators need to consider the MF that influence PALA in order to encourage 'assessment for learning'.
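The statistical workflow described in the abstract (Cronbach's α for scale reliability, Pearson correlation and multiple regression of PALA on MF, and paired t-tests comparing written exams with CPX/OSCE) can be reproduced with standard libraries. The sketch below is not the authors' analysis code; the file name pala_survey.csv and all column names are hypothetical placeholders for the survey data.

```python
# Minimal sketch of the analysis pipeline; data layout is assumed, not taken from the paper.
import pandas as pd
from scipy import stats
import statsmodels.api as sm

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a block of Likert items (rows = students, columns = items)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical survey file with one row per student.
df = pd.read_csv("pala_survey.csv")

# Scale reliability for the written-exam PALA items (hypothetical column names).
pala_items = df.filter(like="pala_written_item")
print("Cronbach's alpha:", cronbach_alpha(pala_items))

# Pearson correlation between the MF and PALA composite scores for written exams.
r, p = stats.pearsonr(df["mf_written"], df["pala_written"])
print(f"r = {r:.3f}, p = {p:.4f}")

# Multiple regression: personal, interpersonal, and environmental factors predicting PALA;
# the R-squared corresponds to the 'variance explained' figures reported in the abstract.
X = sm.add_constant(df[["mf_personal", "mf_interpersonal", "mf_environmental"]])
model = sm.OLS(df["pala_written"], X).fit()
print("R-squared:", model.rsquared)

# Paired t-test comparing each student's PALA score for written exams vs. CPX/OSCE.
t, p = stats.ttest_rel(df["pala_written"], df["pala_cpx_osce"])
print(f"t = {t:.2f}, p = {p:.4f}")
```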
Keywords
Educational assessment; Learning; Undergraduate medical education