This article describes the development, validation and application of a Rasch-based instrument, the Elementary School Science Classroom Environment Scale (ESSCES), for measuring students’ perceptions of constructivist practices within the elementary science classroom. The instrument, designed to complement the Reformed Teaching Observation Protocol (RTOP), is conceptualised using the RTOP’s three construct domains: Lesson Design and Implementation; Content; and Classroom Culture. Data from 895 elementary students were used to develop the Rasch scale, which was assessed for item fit, invariance and dimensionality. Overall, the data conformed to the assumptions of the Rasch model. In addition, the structural relationships among the retained items of the Rasch model supported and validated the instrument for measuring the theoretical construct of the reformed science classroom environment. The application of the ESSCES in a research study involving fourth-grade students provides evidence that educators and researchers have a reliable instrument for understanding the elementary science classroom environment through the lens of the students.
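The Rasch scaling described above can be illustrated with a minimal sketch of the rating-scale (Andrich) form of the polytomous Rasch model commonly used for Likert-type classroom-environment items. This is an illustrative reimplementation, not the authors' analysis code (which used Winsteps); the function name and parameterisation are assumptions made here for clarity.

```python
import math

def rating_scale_probs(theta, delta, taus):
    """Category probabilities under the Andrich rating-scale model.

    theta: person measure (logits)
    delta: item difficulty (logits)
    taus:  list of m threshold parameters for categories 1..m

    Returns a list of m+1 probabilities, one per response category 0..m.
    """
    # Cumulative log-numerators: category 0 has log-numerator 0; each
    # step to category k adds (theta - delta - tau_k).
    log_numerators = [0.0]
    cumulative = 0.0
    for tau in taus:
        cumulative += theta - delta - tau
        log_numerators.append(cumulative)

    numerators = [math.exp(v) for v in log_numerators]
    total = sum(numerators)
    return [n / total for n in numerators]
```

For example, a person whose measure equals the item difficulty (theta == delta), with symmetric thresholds, is equally likely to pick the lowest and highest categories, and the probabilities always sum to one.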
Development and application of the Elementary School Science Classroom Environment Scale (ESSCES): measuring student perceptions of constructivism within the science classroom

Shelagh M. Peoples
Laura M. O’Dwyer
Jessica J. Brown
Camelia V. Rosca

Springer Netherlands