http://dx.doi.org/10.21796/jse.2018.42.3.308

Developing and Applying the Questionnaire to Measure High School Students' Unskeptical Attitude in Science Inquiry  

Rachmatullah, Arif (North Carolina State University)
Ha, Minsu (Kangwon National University)
Publication Information
Journal of Science Education, v.42, no.3, 2018, pp. 308-321
Abstract
The purpose of this study was to develop a questionnaire that measures unskeptical attitudes in the context of scientific inquiry. Items were developed through a literature review, expert review, and statistical analyses of validity, and score differences by gender and academic track were examined. A total of 363 high school students participated in the study. To gather validity evidence for the items, Rasch analysis and internal-consistency reliability analysis were performed, and a two-way ANOVA compared unskeptical-attitude scores across gender and academic track. Twenty-three self-report Likert-scale items were developed to measure unskeptical attitudes across three sub-domains of scientific inquiry: 'questioning and hypothesis generating,' 'experiment designing,' and 'explaining and interpreting.' The validity and reliability of the instrument were examined rigorously: item validity was evaluated with a multidimensional partial credit model within the Rasch framework, all 23 items were found to fit the model, and multiple reliability indices were adequate. No significant differences in unskeptical-attitude scores were found by gender or academic track, with one exception. The developed questionnaire can be used to monitor unskeptical attitudes during scientific inquiry and to compare the effects of scientific inquiry classes.
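The abstract reports internal-consistency reliability for the 23-item Likert-scale instrument. As a minimal illustrative sketch (not taken from the paper), the following computes Cronbach's alpha, a standard internal-consistency index; the coefficient choice and the small response matrix below are assumptions for illustration, since the abstract does not name the specific index or publish the raw data.

```python
# Hedged sketch: Cronbach's alpha as an internal-consistency index.
# The response data are hypothetical; real use would load the 363
# students' responses to the 23 Likert-scale items.
from statistics import variance

def cronbach_alpha(responses):
    """responses: list of per-student item-score lists (rows = students)."""
    k = len(responses[0])                  # number of items
    items = list(zip(*responses))          # columns = item score vectors
    item_var_sum = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in responses])
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Hypothetical 5-point Likert responses: 5 students x 4 items
scores = [
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
]
print(round(cronbach_alpha(scores), 2))  # values near 1 indicate high consistency
```

Values of alpha near 1 indicate that the items vary together, which is the pattern the study's reliability evidence is meant to establish.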
Keywords
scientific skepticism; scientific inquiry; questionnaire development; science education;