Effectiveness of Medical Education Assessment Consortium Clinical Knowledge Mock Examination (2011-2016)

  • Lee, Sang Yeoup (Department of Medical Education, Pusan National University School of Medicine) ;
  • Lee, Yeli (Medical Education Assessment Consortium) ;
  • Kim, Mi Kyung (Medical Education Assessment Consortium)
  • Received : 2017.05.16
  • Accepted : 2017.12.12
  • Published : 2018.02.28

Abstract

Good assessment is crucial for providing feedback on the curriculum and for motivating students to learn. This study performed item analysis on the Medical Education Assessment Consortium clinical knowledge mock examination (MEAC CKME) (2011-2016) and evaluated several measures to improve item quality, using both classical test theory and item response theory. The estimated difficulty index (P) and discrimination index (D) were calculated for each course, item type, A (single best answer)/R (extended matching) type, and grade of item quality. The cut-off values used to evaluate P were: >0.8 (easy); 0.6-0.8 (moderate); and <0.6 (difficult). The cut-off value for D was 0.3. Appropriate items were defined as those with P between 0.25 and 0.75 and D ≥ 0.25. Cronbach's α was used to assess reliability, which was compared with that of the Korean Medical Licensing Examination (KMLE). The recent mean difficulty and discrimination indices were 0.62 and 0.20 for the first MEAC CKME and 0.71 and 0.19 for the second MEAC CKME, respectively. Higher-grade items, as evaluated by a self-checklist system, had better D values than lower-grade items, and the proportion of higher-grade items gradually increased. Expert preview and editing maintained P, decreased the number of recall items, increased the number of appropriate items with better D values, and improved reliability. In conclusion, the MEAC CKME (2011-2016) is an appropriate assessment for evaluating students' competence and preparing fourth-year medical students for the KMLE. In addition, the self-checklist system for writing good items was useful for improving item quality.
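The classical test theory quantities used in the abstract can be sketched in code. The following is a minimal illustration, not the authors' actual analysis: the response matrix is invented, and the upper/lower 27% split used to compute D is a common convention that the abstract does not specify. The cut-offs (P >0.8 easy, 0.6-0.8 moderate, <0.6 difficult; appropriate items with 0.25 ≤ P ≤ 0.75 and D ≥ 0.25) are taken directly from the abstract.

```python
# Sketch of classical test theory item analysis as described in the abstract.
# The binary response matrix below is illustrative (rows = examinees,
# columns = items); the upper/lower 27% grouping for D is an assumption.

from statistics import pvariance

# responses[i][j] = 1 if examinee i answered item j correctly
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
]
n_examinees = len(responses)
n_items = len(responses[0])

# Difficulty index P: proportion of examinees answering the item correctly.
P = [sum(row[j] for row in responses) / n_examinees for j in range(n_items)]

def classify_P(p):
    """Cut-offs from the abstract: >0.8 easy, 0.6-0.8 moderate, <0.6 difficult."""
    if p > 0.8:
        return "easy"
    if p >= 0.6:
        return "moderate"
    return "difficult"

# Discrimination index D: P in the top-scoring group minus P in the
# bottom-scoring group (27% split is a convention, not from the abstract).
totals = [sum(row) for row in responses]
order = sorted(range(n_examinees), key=lambda i: totals[i])
k = max(1, round(0.27 * n_examinees))
lower, upper = order[:k], order[-k:]
D = [
    sum(responses[i][j] for i in upper) / len(upper)
    - sum(responses[i][j] for i in lower) / len(lower)
    for j in range(n_items)
]

# An "appropriate" item per the abstract: 0.25 <= P <= 0.75 and D >= 0.25.
appropriate = [0.25 <= p <= 0.75 and d >= 0.25 for p, d in zip(P, D)]

# Cronbach's alpha: reliability from item variances vs. total-score variance.
item_vars = [pvariance([row[j] for row in responses]) for j in range(n_items)]
total_var = pvariance(totals)
alpha = (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)
```

With such a tiny contrived matrix the resulting α is low and not meaningful; in practice these statistics are computed over hundreds of examinees per administration.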
