http://dx.doi.org/10.7468/mathedu.2022.61.4.597

Exploring the change in achievement by the transition of the test mode from paper to computer: Focusing on the national assessment of educational achievement of high school mathematics  

Jung, Hye-Yun (Korea Institute for Curriculum and Evaluation)
Song, Chang-Geun (Seoul National University)
Kim, Young-Jun (Seoul National University)
Lee, Kyeong-Hwa (Seoul National University)
Publication Information
The Mathematical Education, v.61, no.4, 2022, pp. 597-612
Abstract
Large-scale mathematics assessments are increasingly shifting from traditional paper-based tests to computer-based tests, both nationally and internationally. This study explored the mode effect (the difference in student achievement attributable to the change of test mode) according to the type of test item, the technological functions reflected in the items, the characteristics of students' computer use, and the computer-based testing environment. To this end, we analyzed the results of the 2020 national assessment of educational achievement of high school mathematics, which was administered in both paper-based and computer-based modes. First, the mode effect induced by the mode transition was generally insignificant, but it was larger for the extended-response type than for other item types. Second, mode effects differed in the transition to the computer mode in which innovative items were added. Third, the difference in mode effects was statistically significant according to students' sense of efficacy in computer use. These results suggest that innovative items should be introduced deliberately, in line with the content and competencies targeted by the assessment, and that the assessment design and testing environment should be prepared carefully so that nonessential abilities other than students' mathematical ability, or incidental situational factors, do not distort the assessment results.
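As a rough illustration of what is meant by a mode effect and its statistical significance, the short Python sketch below is a hypothetical example only (not the authors' analysis; the scores, group sizes, and distributions are placeholder assumptions): it computes the mode effect as the difference in mean scores between a computer-based and a paper-based group, then checks it with Welch's t-test and a Cohen's d effect size.

# Minimal sketch of quantifying a "mode effect": the mean-score difference
# between a computer-based and a paper-based administration of the same test.
# All data below are randomly generated placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
paper_scores = rng.normal(loc=60.0, scale=12.0, size=300)     # paper-based group (hypothetical)
computer_scores = rng.normal(loc=58.5, scale=12.0, size=300)  # computer-based group (hypothetical)

# Mode effect as the raw difference in mean achievement
mode_effect = computer_scores.mean() - paper_scores.mean()

# Welch's t-test: is the mean difference statistically significant?
t_stat, p_value = stats.ttest_ind(computer_scores, paper_scores, equal_var=False)

# Standardized effect size (Cohen's d with pooled standard deviation)
pooled_sd = np.sqrt((paper_scores.var(ddof=1) + computer_scores.var(ddof=1)) / 2)
cohens_d = mode_effect / pooled_sd

print(f"mode effect (mean difference): {mode_effect:.2f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}, Cohen's d = {cohens_d:.2f}")

In a study such as the one described above, this core comparison would additionally be broken down by item type, the technological functions in the items, and students' computer-use characteristics; the sketch shows only the basic mean-difference logic.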
Keywords
computer-based test of mathematics; large-scale assessment; mode effects; innovative item
Citations & Related Records
Times Cited By KSCI: 1
1 Backes, B., & Cowan, J. (2019). Is the pen mightier than the keyboard? The effect of online testing on measured student achievement. Economics of Education Review, 68(1), 89-103. https://doi.org/10.1016/j.econedurev.2018.12.007
2 Bennett, R. E., Braswell, J., Oranje, A., Sandene, B., Kaplan, B., & Yan, F. (2008). Does it matter if I take my mathematics test on computer? A second empirical study of mode effects in NAEP. Journal of Technology, Learning, and Assessment, 6(9), 1-38.
3 Buerger, S., Kroehne, U., Koehler, C., & Goldhammer, F. (2019). What makes the difference? The impact of item properties on mode effects in reading assessments. Studies in Educational Evaluation, 62(3), 1-9. http://dx.doi.org/10.1016/j.stueduc.2019.04.005
4 Clements, D. H. (2000). From exercises and tasks to problems and projects: Unique contributions of computers to innovative mathematics education. The Journal of Mathematical Behavior, 19(1), 9-47. http://dx.doi.org/10.1016/S0732-3123(00)00036-5
5 Poggio, J., Glasnapp, D., Yang, X., & Poggio, A. (2005). A comparative evaluation of score results from computerized and paper & pencil mathematics testing in a large scale state assessment program. The Journal of Technology, Learning, and Assessment, 3(6).
6 Ebrahimi, M. R., Toroujeni, S. M., & Shahbazi, V. (2019). Score equivalence, gender difference, and testing mode preference in a comparative study between computer-based testing and paper-based testing. International Journal of Emerging Technologies in Learning, 14(7), 128-143. https://doi.org/10.3991/ijet.v14i07.10175
7 Fishbein, B., Martin, M. O., Mullis, I. V., & Foy, P. (2018). The TIMSS 2019 item equivalence study: Examining mode effects for computer-based assessment and implications for measuring trends. Large-scale Assessments in Education, 6(1), 1-23. https://doi.org/10.1186/s40536-018-0064-z
8 Geraniou, E., & Jankvist, U. T. (2019). Towards a definition of "mathematical digital competency". Educational Studies in Mathematics, 102(1), 29-45. https://doi.org/10.1007/s10649-019-09893-8
9 Heyd-Metzuyanim, E., Sharon, A. J., & Baram-Tsabari, A. (2021). Mathematical media literacy in the COVID-19 pandemic and its relation to school mathematics education. Educational Studies in Mathematics, 108(1), 201-225. https://doi.org/10.1007/s10649-021-10075-8
10 Hoogland, K., & Tout, D. (2018). Computer-based assessment of mathematics into the twenty-first century: Pressures and tensions. ZDM, 50(4), 675-686. https://doi.org/10.1007/S11858-018-0944-2
11 Hu, L., Chen, G., Li, P., & Huang, J. (2021). Multimedia effect in problem solving: A meta-analysis. Educational Psychology Review, 33(4), 1717-1747. https://psycnet.apa.org/doi/10.1007/s10648-021-09610-z
12 Jeong, H. (2014). A comparative study of scores on computer-based tests and paper-based tests. Behaviour & Information Technology, 33(4), 410-422. https://doi.org/10.1080/0144929X.2012.710647
13 Drasgow, F., & Mattern, K. (2006). New tests and new items: Opportunities and issues. In D. Bartram, & R. K. Hambleton (Eds.), Computer-based testing and the Internet: Issues and advances (pp. 59-76). John Wiley and Sons Ltd.
14 Jung, H. Y. (2022). Perceptions of students and teachers towards computer based test in National Assessment of Educational Achievement: Focused on high school mathematics test. School Mathematics, 24(1), 119-145. http://doi.org/10.29275/sm.2022.3.24.1.119
15 Martin, M. O., von Davier, M., & Mullis, I. V. S. (2020). Methods and procedures: TIMSS 2019 technical report. TIMSS & PIRLS International Study Center.
16 Karay, Y., Schauber, S., Stosch, C., & Schuttpelz-Brauns, K. (2015). Computer versus paper: Does it make any difference in test performance? Teaching and Learning in Medicine, 27(1), 57-62. https://doi.org/10.1080/10401334.2014.979175
17 Kingston, N. (2008). Comparability of computer- and paper-administered multiple-choice tests for K-12 populations: A synthesis. Applied Measurement in Education, 22(1), 22-37. https://doi.org/10.1080/08957340802558326
18 Lee, J. B., Kim, Y. H., Kim, J. S., Nam, M. W., Park, J. S., Park, J. H., Baek, J. H., Sung, K. H., Lee, S. R., Jang, E. S., & Jung, H. Y. (2021). Advancing the test tools for Electronic National Assessment of Educational Achievement (eNAEA). KICE.
19 Martin, R. (2008). New possibilities and challenges for assessment through the use of technology. In F. Scheuermann, & G. Pereira (Eds.), Towards a research agenda on computer-based assessment (pp. 6-9). Office for Official Publications of the European Communities.
20 McDonald, A. (2002). The impact of individual differences on the equivalence of computer-based and paper-and-pencil educational assessments. Computers and Education, 39(3), 299-312. https://doi.org/10.1016/S0360-1315(02)00032-5
21 Arslan, B., Jiang, Y., Keehner, M., Gong, T., Katz, I. R., & Yan, F. (2020). The effect of drag-and-drop item features on test-taker performance and response strategies. Educational Measurement: Issues and Practice, 39(2), 96-106. https://doi.org/10.1111/emip.12326
22 Noyes, J., & Garland, K. (2008). Computer- vs. paper-based tasks: Are they equivalent? Ergonomics, 51(9), 1352-1375. https://doi.org/10.1080/00140130802170387
23 Russell, M. (1999). Testing on computers: A follow-up study comparing performance on computer and on paper. Boston College.
24 Spires, H. A., Paul, C. M., & Kerkhoff, S. N. (2019). Digital literacy for the 21st century. In M. Khosrow-Pour (Ed.), Advanced methodologies and technologies in library science, information management, and scholarly inquiry (pp. 12-21). IGI Global.
25 Yoo, S. M. (2013). SPSS statistical analysis for writing a thesis. Slow & Steady.
26 Wang, S., Jiao, H., Young, M. J., Brooks, T., & Olson, J. (2007). A meta-analysis of testing mode effects in grade K-12 mathematics tests. Educational and Psychological Measurement, 67(2), 219-238. https://doi.org/10.1177/0013164406288166
27 Bridgeman, B. (1992). A comparison of quantitative questions in open-ended and multiple-choice formats. Journal of Educational Measurement, 29(3), 253-271.
28 Csapo, B., Ainley, J., Bennett, R. E., Latour, T., & Law, N. (2012). Technological issues for computer-based assessment. In P. Griffin, B. McGaw, & E. Care (Eds.). Assessment and teaching of 21st century skills. Springer.
29 Eid, G. K. (2005). An investigation into the effects and factors influencing computer-based online math problem-solving in primary schools. Journal of Educational Technology Systems, 33(3), 223-240. https://doi.org/10.2190/J3Q5-BAA5-2L62-AEY3
30 Pommerich, M. (2004). Developing computerized versions of paper-and-pencil tests: mode effects for passage-based tests. The Journal of Technology, Learning, and Assessment, 2(6), 1-44.
31 Rose, J., Low-Choy, S., Singh, P., & Vasco, D. (2020). NAPLAN discourses: A systematic review after the first decade. Discourse: Studies in the Cultural Politics of Education, 41(6), 871-886. https://doi.org/10.1080/01596306.2018.1557111
32 Russell, M., & Tao, W. (2004). The influence of computer-print on rater scores. Practical Assessment, Research, and Evaluation, 9(10), 1-13.
33 Zilles, C., West, M., Mussulman, D., & Bretl, T. (2018). Making testing less trying: Lessons learned from operating a Computer-Based Testing Facility. In 2018 IEEE Frontiers in Education (FIE) Conference, IEEE.
34 Scalise, K., & Gifford, B. (2006). Computer-based assessment in E-Learning: A framework for constructing "Internet Constraint" questions and tasks for technology platforms. The Journal of Technology, Learning, and Assessment, 4(6). Retrieved from https://www.learntechlib.org/p/103254/.
35 Shacham, M. (1998). Computer-based exams in undergraduate engineering courses. Computer Applications in Engineering Education, 6(3), 201-209.
36 Taylor, C., Jamieson, J., Eignor, D., & Kirsch, I. (1998). The relationship between computer familiarity and performance on computer-based TOEFL test tasks. Educational Testing Service.
37 Haladyna, T., Downing, S., & Rodriguez, M. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15(3), 309-333. https://doi.org/10.1207/S15324818AME1503_5
38 Schleppegrell, M. J. (2007). The linguistic challenges of mathematics teaching and learning: A research review. Reading & Writing Quarterly, 23(2), 139-159. https://doi.org/10.1080/10573560601158461
39 Schoenfeld, A. H. (2017). On learning and assessment. Assessment in Education: Principles, Policy & Practice, 24(3), 369-378. https://doi.org/10.1080/0969594X.2017.1336986
40 Sireci, S. G., & Zenisky, A. L. (2006). Innovative item formats in computer-based testing: In pursuit of improved construct representation. In S. M. Downing, & T. M. Haladyna (Eds.), Handbook of test development (pp. 329-347). Lawrence Erlbaum Associates Publishers.
41 Lee, S.-G., Ham, Y., Lee, J. H., & Park, K.-E. (2020). A case study on college mathematics education in untact DT era. Communications of Mathematical Education, 34(3), 201-214. https://doi.org/10.7468/JKSMEE.2020.34.3.201
42 Horkay, N., Bennett, R., Allen, N., Kaplan, B., & Yan, F. (2006). Does it matter if I take my writing test on computer? An empirical study of mode effects in NAEP. The Journal of Technology, Learning, and Assessment, 5(2). Retrieved from https://ejournals.bc.edu/index.php/jtla/article/view/1641
43 Jewsbury, P., Finnegan, R., Xi, N., Jia, Y., Rust, K., Burg, S., Donahue, P., Mazzeo, J., Cramer, E. B., & Lin, A. (2017). NAEP transition to digitally based assessments in mathematics and reading at grades 4 and 8: Mode evaluation study. National Center for Education Statistics.
44 Keng, L., McClarty, K., & Davis, L. (2008). Item-level comparative analysis of online and paper administrations of the Texas Assessment of Knowledge and Skills. Applied Measurement in Education, 21(3), 207-226. https://doi.org/10.1080/08957340802161774
45 McClelland, T., & Cuevas, J. (2020). A comparison of computer based testing and paper and pencil testing in mathematics assessment. The Online Journal of New Horizons in Education, 10(2), 78-89.
46 Sandene, B., Horkay, N., Bennett, R., Allen, N., Braswell, J., Kaplan, B., & Oranje, A. (2005). Online assessment in mathematics and writing: Reports from the NAEP technology-based assessment project, research and development series. National Center for Education Statistics.
47 Park, M. (2020). Applications and possibilities of artificial intelligence in mathematics education. Communications of Mathematical Education, 34(4), 545-561. https://doi.org/10.7468/JKSMEE.2020.34.4.545
48 Parshall, C. G., Harmes, J. C., Davey, T., & Pashley, P. J. (2009). Innovative items for computerized testing. In W. J. van der Linden (Ed.), Elements of adaptive testing (pp. 215-230). Springer. https://doi.org/10.1007/978-0-387-85461-8_11