http://dx.doi.org/10.5762/KAIS.2014.15.6.3609

A study on the item characteristics differences of response position, response length, and question types of multiple-choice aptitude tests  

Han, Young Seok (Department of Industrial-Organizational Psychology, Hoseo University)
Kim, Myoung So (Department of Industrial-Organizational Psychology, Hoseo University)
Publication Information
Journal of the Korea Academia-Industrial cooperation Society, v.15, no.6, 2014, pp. 3609-3615
Abstract
This study examined differences in item characteristics in multiple-choice aptitude tests, focusing on response position, response length, and question type. A university aptitude test consisting of 80 questions was used. The subjects were 3120 senior high school students from 80 schools nationwide (1650 in liberal arts, 1467 in natural sciences). The results suggest that item prediction was higher for options 2 and 3 (located in the middle) than for options 1 and 4. Item discrimination was higher for items that asked examinees to pick the 'wrong' option than for items that asked them to pick the 'right' option. In addition, examinees tended to prefer longer response options. Suggestions for future research are provided based on these findings.
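To make the comparison concrete, the sketch below shows one way such item characteristics could be computed from scored response data and then grouped by the position of the keyed answer: classical difficulty as the proportion correct and discrimination as the corrected item-total point-biserial correlation. This is only an illustrative sketch; the data layout, function names, and the use of classical indices (rather than the IRT-based estimates an aptitude-test analysis might employ) are assumptions, not taken from the paper.

```python
import numpy as np

def item_statistics(responses: np.ndarray, key: np.ndarray):
    """Classical item statistics for a multiple-choice test.

    responses : (n_examinees, n_items) array of chosen option numbers (e.g. 1-4)
    key       : (n_items,) array holding the keyed (correct) option for each item
    Returns per-item difficulty (proportion correct) and discrimination
    (corrected item-total point-biserial correlation).
    """
    scored = (responses == key).astype(float)      # 1 = correct, 0 = incorrect
    difficulty = scored.mean(axis=0)               # classical p-value per item
    total = scored.sum(axis=1)                     # each examinee's total score
    discrimination = np.empty(scored.shape[1])
    for j in range(scored.shape[1]):
        rest = total - scored[:, j]                # total score excluding item j
        discrimination[j] = np.corrcoef(scored[:, j], rest)[0, 1]
    return difficulty, discrimination

def by_position(stat: np.ndarray, key: np.ndarray):
    """Average an item statistic over items sharing the same keyed position."""
    return {int(pos): float(stat[key == pos].mean()) for pos in np.unique(key)}
```

The per-position averages for the middle positions (2, 3) versus the outer positions (1, 4) could then be compared directly, for example with an independent-samples t-test across items.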
Keywords
Response position; Response length; Question types; Aptitude tests;