http://dx.doi.org/10.14697/jkase.2019.39.3.363

Exploring the Factors Influencing on the Accuracy of Self-Reported Responses in Affective Assessment of Science  

Chung, Sue-Im (Eungye Middle School)
Shin, Donghee (Ewha Womans University)
Publication Information
Journal of The Korean Association For Science Education, v.39, no.3, 2019, pp. 363-377
Abstract
This study reveals science-specific aspects of subjectivity in test results when science-related affective characteristics are assessed through self-report items. A science-specific response was defined as a response that arises from a student's recognition of the nature or characteristics of science when the instrument attempts to measure his or her concepts of or perceptions about science. We looked for cases in which such responses interfere with the measurement objective or with accurate self-reporting. Errors attributable to science-specific factors were identified from quantitative data on 649 first- and second-year high school students and from qualitative interview data on 44 students. The views and characteristics of science that students internalize through everyday life and science learning experiences interact with the items that make up the test instrument. As a result, obstacles to accurate self-reporting were found in three areas: the characteristics of science, personal science experience, and science in the tool. In terms of the characteristics of science, students respond to items on the basis of their subjectively held views of science and its perceived characteristics, regardless of the construct being measured. The personal science experience factor, which represents the learner's side, comprises the student's science motivation, interaction with science experiences, and perception of science and life. Finally, from the instrumental point of view, science in the tool produces terminological confusion owing to the uncertainty of science concepts and ultimately distances responses from accurate self-reports. The implications of these results are as follows: reviewing whether science-specific factors should be included, taking care to clarify the construct being measured, checking for science-specific factors at the development stage, and working to cross the boundary between everyday science and school science.
Keywords
science education assessment; science-related affective characteristic; self-reporting test tool; response bias; error in test; science-specific error; improvement of test tool
Citations & Related Records
Times Cited by KSCI: 5