http://dx.doi.org/10.17496/kmer.2018.20.1.20

Effectiveness of Medical Education Assessment Consortium Clinical Knowledge Mock Examination (2011-2016)  

Lee, Sang Yeoup (Department of Medical Education, Pusan National University School of Medicine)
Lee, Yeli (Medical Education Assessment Consortium)
Kim, Mi Kyung (Medical Education Assessment Consortium)
Publication Information
Korean Medical Education Review / v.20, no.1, 2018, pp. 20-31
Abstract
Good assessment is crucial for providing feedback on the curriculum and for motivating students to learn. This study performed item analysis on the Medical Education Assessment Consortium clinical knowledge mock examination (MEAC CKME) (2011-2016) and evaluated several measures to improve item quality using both classical test theory and item response theory. The estimated difficulty index (P) and discrimination index (D) were calculated for each course, item type, A (single best answer)/R (extended matching) type, and grade of item quality. The cut-off values used to evaluate P were: >0.8 (easy), 0.6-0.8 (moderate), and <0.6 (difficult). The cut-off value for D was 0.3. Appropriate items were defined as those with P between 0.25 and 0.75 and D ≥0.25. Cronbach's α was used to assess reliability and was compared with that of the Korean Medical Licensing Examination (KMLE). The results showed that the recent mean difficulty and discrimination indices were 0.62 and 0.20 for the first MEAC CKME and 0.71 and 0.19 for the second MEAC CKME, respectively. Higher-grade items, as evaluated by a self-checklist system, had better D values than lower-grade items, and the proportion of higher-grade items gradually increased. The preview and editing process by experts maintained P, decreased the number of recall items, increased the number of appropriate items with better D values, and yielded higher reliability. In conclusion, the MEAC CKME (2011-2016) is appropriate as an assessment for evaluating students' competence and preparing fourth-year medical students for the KMLE. In addition, the self-checklist system for writing good items was useful in improving item quality.
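The classical-test-theory indices used in the abstract can be sketched as follows. This is a minimal illustration with hypothetical 0/1 score data; the upper/lower 27% grouping used for D is one common convention and is an assumption here, not necessarily the authors' exact method.

```python
import numpy as np

def item_difficulty(responses):
    """Difficulty index P: proportion of examinees answering each item correctly."""
    return responses.mean(axis=0)

def item_discrimination(responses, frac=0.27):
    """Discrimination index D: P(upper group) - P(lower group), with groups
    formed from the top and bottom `frac` of examinees by total score."""
    totals = responses.sum(axis=1)
    order = np.argsort(totals)          # ascending by total score
    n = max(1, int(len(totals) * frac))
    lower, upper = responses[order[:n]], responses[order[-n:]]
    return upper.mean(axis=0) - lower.mean(axis=0)

def cronbach_alpha(responses):
    """Cronbach's alpha for internal-consistency reliability."""
    k = responses.shape[1]
    item_var = responses.var(axis=0, ddof=1).sum()
    total_var = responses.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical score matrix: rows = examinees, columns = items (1 = correct)
rng = np.random.default_rng(0)
scores = (rng.random((200, 10)) < 0.65).astype(int)

P = item_difficulty(scores)
D = item_discrimination(scores)
# "Appropriate" items per the study's definition: 0.25 <= P <= 0.75 and D >= 0.25
appropriate = (P >= 0.25) & (P <= 0.75) & (D >= 0.25)
```

The 0.25-0.75 and ≥0.25 thresholds in the last line mirror the study's definition of an appropriate item; the grouping fraction and the random data are illustrative only.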
Keywords
Medical education; Medical measurement; Medical student