http://dx.doi.org/10.14352/jkaie.2020.24.1.117

Assessment Process Design for Python Programming Learning  

Ko, Eunji (Dept. of Educational Technology, Ewha Womans University)
Lee, Jeongmin (Dept. of Educational Technology, Ewha Womans University)
Publication Information
Journal of The Korean Association of Information Education / v.24, no.1, 2020, pp. 117-129
Abstract
The purpose of this paper is to explore ways to assess computational thinking from a formative perspective and to design an assessment process for programming learning with Python. To this end, the study examined the domains of computational thinking and analyzed prior research on assessment design. It also identified the Python programming topics that beginners typically learn and the computational thinking abilities that Python learning can develop. On this basis, we designed an assessment method that provides feedback by analyzing the syntax constructs that correspond to each computational thinking ability. The design also supports self-assessment through reflective thinking, with flowcharts and pseudocode used to express ideas, and peer feedback through code sharing and communication in an online community.
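
The core of the proposed design is feedback generated by detecting which syntax constructs appear in a learner's code and mapping them to computational thinking abilities. The following is a minimal sketch of that idea in Python, using the standard ast module; the construct-to-ability mapping (CT_MAPPING) and the feedback wording are illustrative assumptions, not the rubric from the paper itself.

import ast

# Illustrative mapping from Python syntax constructs to computational
# thinking (CT) abilities; the paper's actual mapping may differ.
CT_MAPPING = {
    (ast.Assign,): "data representation (variables)",
    (ast.If,): "conditional logic",
    (ast.For, ast.While): "iteration (loops)",
    (ast.FunctionDef,): "abstraction (functions)",
}

def analyze_ct_constructs(source: str) -> dict:
    """Count CT-related syntax constructs in a learner's program."""
    tree = ast.parse(source)
    counts = {ability: 0 for ability in CT_MAPPING.values()}
    for node in ast.walk(tree):
        for node_types, ability in CT_MAPPING.items():
            if isinstance(node, node_types):
                counts[ability] += 1
    return counts

def formative_feedback(counts: dict) -> list:
    """Suggest the CT abilities that the submission never exercised."""
    return [f"Consider practicing {ability}." for ability, n in counts.items() if n == 0]

learner_code = "total = 0\nfor i in range(5):\n    total = total + i"
counts = analyze_ct_constructs(learner_code)
print(counts)                      # variables: 2, loops: 1, others: 0
print(formative_feedback(counts))  # prompts for conditionals and functions

Working on the abstract syntax tree rather than the raw text keeps the analysis insensitive to formatting and naming, which is in the spirit of the static-analysis approaches to programming assessment that the study reviews.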
Keywords
Python; Computational thinking; Text-based programming language; SW education; Programming assessment;