
Assessment Process Design for Python Programming Learning


  • Ko, Eunji (Dept. of Educational Technology, Ewha Womans University) ;
  • Lee, Jeongmin (Dept. of Educational Technology, Ewha Womans University)
  • Received : 2020.02.05
  • Accepted : 2020.02.20
  • Published : 2020.02.28

Abstract

The purpose of this paper is to analyze and build on existing research on computational thinking assessment, to explore ways of assessing computational thinking from a formative perspective, and to design an assessment process for programming learning with Python, a text-based programming language. To this end, the study examined the domains of computational thinking and reviewed research related to assessment design. It also identified the areas of Python programming that beginners learn and the computational thinking abilities that can be acquired through Python learning. Synthesizing these analyses, we designed an assessment method that provides feedback by analyzing the syntactic constructs corresponding to each computational thinking ability. The designed process further enables self-assessment through reflective thinking, by having learners express their ideas with flowcharts and pseudocode, and peer feedback through code sharing and communication in an online community.
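To make the syntax-analysis feedback step concrete, the sketch below is a minimal, hypothetical illustration (not the paper's actual implementation) of how a learner's submission might be scanned for constructs associated with computational thinking concepts, using Python's standard `ast` module. The mapping from AST node types to concepts is an assumption for illustration; the paper's rubric may differ.

```python
import ast

# Hypothetical mapping from AST node types to computational thinking
# concepts; the concrete rubric used in the paper may differ.
CT_CONCEPTS = {
    "loops": (ast.For, ast.While),
    "conditionals": (ast.If,),
    "abstraction (functions)": (ast.FunctionDef,),
    "data representation (variables)": (ast.Assign,),
}

def assess_source(source: str) -> dict:
    """Parse learner code and count constructs per CT concept."""
    tree = ast.parse(source)
    counts = {concept: 0 for concept in CT_CONCEPTS}
    for node in ast.walk(tree):
        for concept, node_types in CT_CONCEPTS.items():
            if isinstance(node, node_types):
                counts[concept] += 1
    return counts

def print_feedback(counts: dict) -> None:
    """Emit simple formative feedback: flag concepts not yet used."""
    for concept, n in counts.items():
        if n == 0:
            print(f"Not yet used: {concept}. Try adding it to your solution.")
        else:
            print(f"{concept}: used {n} time(s).")

if __name__ == "__main__":
    learner_code = """
total = 0
for i in range(10):
    if i % 2 == 0:
        total += i
print(total)
"""
    print_feedback(assess_source(learner_code))
```

Static analysis of this kind parallels tools such as Dr. Scratch for block-based projects; applying it to Python source lets formative feedback on computational thinking constructs be generated automatically from each submission.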

