Differences between Teachers' and Students' Perceptions of Teaching Skills

Differences between Teachers' and Students' Perceptions of Teachers' Teaching Expertise

  • Received : 2022.03.14
  • Accepted : 2022.04.18
  • Published : 2022.04.30


The purpose of this study is to examine differences between teachers' and students' perceptions of teaching skills. Data were collected with the ICALT (International Comparative Analysis of Learning and Teaching) classroom observation instrument and a student survey, the My Teacher Questionnaire (MTQ). The data from teachers and students can be compared because the two instruments share seven common domains (safe and stimulating learning climate; efficient organization; clear and structured instructions; intensive and activating teaching; adjusting instructions and learner processing to inter-learner differences; teaching learning strategies; learner engagement). In 2016, trained teachers observed 106 classes in Daejeon, Chungbuk, and Chungnam, and 2,866 students responded to the survey. The reliability and validity of the two instruments proved satisfactory for use in Korean schools. Students' perceptions of teaching were generally high, particularly among students in lower grades and in major subjects such as English, Korean, and mathematics. Male students reported higher perceptions in the higher-level teaching skill domains, while female students reported higher perceptions in the lower-level domains. To compare teachers' and students' perceptions, the predictive reliability of each teaching skill domain for student engagement was used. Teachers showed higher predictive reliability in the lower-level teaching skill domains, while students showed higher predictive reliability in the higher-level domains. A recommended direction for further study is to develop a professional development model that uses the classroom observation instrument and the My Teacher Questionnaire for pre-service and in-service teachers.

The purpose of this study is to analyze how teachers' and students' perceptions of teachers' teaching expertise differ. Classroom observation and student surveys are commonly used methods for understanding a teacher's classroom teaching. When student and teacher perceptions diverge, however, the observed teacher can be confused about how to interpret the results. This study examined how students' perceptions of teaching expertise vary by student background, and how teachers' and students' perceptions differ in their power to predict student engagement. The ICALT classroom observation instrument and the student questionnaire (MTQ) were used. Their results can be compared because both instruments share the same seven-domain structure: six domains of teaching expertise plus a student engagement domain. In 2016, teachers trained in classroom observation observed 106 lessons at middle schools in Daejeon, Chungbuk, and Chungnam, and 2,866 students who participated in these lessons responded to the survey. The results show that students' perceptions of teaching were generally high but varied by background variables. Students tended to perceive teaching expertise more highly when learning major subjects and when they were in lower grades. Male students perceived teaching expertise more highly in the domains of higher-difficulty teaching skills. Students' predictions of student engagement were stronger than teachers' in the higher-difficulty teaching skill domains. Further research is needed on how student survey data can be applied to training for pre-service and in-service teachers.



This paper was supported by the 2018 sabbatical-year research grant of Chungbuk National University.

