How to develop tiered tests: A developmental framework using statistical indexes and four tier types in secondary physics

  • Published : 2009.05.30

Abstract

In the era of outcome-based education, multiple-choice tests have been widely employed because they allow educators to evaluate large numbers of students efficiently and objectively. However, this prevalent format has not been reconsidered enough to overcome its apparent shortcomings: examiners must labor to develop plausible, fault-free distracters that withstand every falsification strategy, and students can score on key items by random guessing. To alleviate these defects, the tiered test has been suggested in science education as an experimental variant of the multiple-choice format. Since little research has accumulated on the implementation of tiered tests, our aim is to construct a framework of statistical indexes for rationally discerning the tiered units that make a tiered test effective. Graded both by our tiered scoring and by conventional partial scoring, a preliminary tiered test in secondary physics attests to improvement in its discrimination and its difficulty distribution. The findings reveal that two indexes discern effective tiered items: discrimination increase (Ct-p) and difficulty decrease (Dp-t). Based on this index information, four heterogeneous tier types are recommended for the content of secondary physics: directional manipulation, repeated calculation, diverse explanation, and plural variables.
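The abstract's index comparison rests on classical item-analysis statistics. As a minimal sketch, assuming the conventional definitions (item difficulty as the proportion of correct responses, and discrimination as the upper-group minus lower-group difficulty using 27% tails), the two indexes could be computed by differencing each statistic across the two scoring schemes; the function names and the exact 27% split are assumptions, not the paper's stated procedure:

```python
def difficulty(scores):
    """Item difficulty P: proportion of examinees answering correctly (0/1 scores)."""
    return sum(scores) / len(scores)

def discrimination(scores, totals, frac=0.27):
    """Upper-lower discrimination index D: P(upper 27%) - P(lower 27%),
    where groups are formed by ranking examinees on their total test score."""
    order = sorted(range(len(scores)), key=lambda i: totals[i])
    k = max(1, round(frac * len(scores)))
    lower = [scores[i] for i in order[:k]]   # lowest-scoring examinees
    upper = [scores[i] for i in order[-k:]]  # highest-scoring examinees
    return difficulty(upper) - difficulty(lower)

# Hypothetical data: item correctness (1/0) and total scores for 10 examinees.
item = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
totals = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

p = difficulty(item)              # 0.5
d = discrimination(item, totals)  # 1.0 (item perfectly separates the groups)

# Under this reading, the abstract's indexes would be differences across
# scoring schemes: Ct-p = D(tiered) - D(partial) for discrimination increase,
# and Dp-t = P(partial) - P(tiered) for difficulty decrease.
```

An item would then qualify as an effective tiered unit when both differences are positive, i.e. tiered scoring discriminates better and lowers the difficulty index relative to partial scoring.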
