Developing and Testing Multi-Component Computer-Based Assessment for the Next Generation Science Standards

Collaborating Institutions: 
Stanford University, SERP Institute, San Francisco Unified School District (SFUSD)

Funded By: 
Institute of Education Sciences

Project Participants: 

This IES-funded research project investigates students' understanding of, and ability to argue from evidence within, two science domains: the structure of matter and ecology.

The Next Generation Science Standards (NGSS) require a re-conceptualization of science teaching around (1) a set of disciplinary core ideas; (2) engagement in one or more of eight distinct scientific practices; and (3) an understanding of a set of crosscutting concepts that frame the disciplines of science and engineering. Meeting the assessment challenges of this innovation requires (a) multi-component tasks that tap these dimensions within a single assessment task, (b) efficient methods for delivering, scoring, and interpreting the items in these tasks using a computer-based approach, and (c) an investigation of the consequential validity of this innovation for teachers' instructional practices.

Through a previous grant, we have developed an understanding of how students interpret science concepts, how they argue about them, and the relationship between students' knowledge of content and their facility with argumentation. These materials now need to be adapted to an electronic format, and their performance and scoring investigated thoroughly. We propose to develop science assessments in an online environment using the BEAR Assessment System Software (BASS) to investigate how well the complex practice of scientific argumentation can be assessed by computer-based methods. We plan to collect validity, reliability, and fairness evidence to assess the components of performance on content and practice. Finally, we propose to collect evidence on the consequential validity of the assessments for instruction.
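
As one illustrative sketch of how evidence from a multi-component task could be coordinated across content and practice (an assumption about a plausible modeling approach, not a specification from the project), each student j could be assigned a proficiency vector with one component per dimension,

  \theta_j = (\theta_j^{\text{content}}, \theta_j^{\text{arg}}) \sim \mathrm{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma}),

where each part of the task is scored against the dimension it targets and the off-diagonal of \Sigma carries the latent relationship between content knowledge and facility with argumentation, so that validity, reliability, and fairness evidence can be examined separately for each component of performance.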

Population and Setting: Data will be gathered from middle and high school students (grades 8-10) and science teachers in the San Francisco Unified School District (SFUSD), an ethnically, culturally, and linguistically diverse urban school district.

Assessments/measures: The researchers will work with teachers and others to transform the assessment materials into an electronic format. The assessments follow the BEAR Assessment System (BAS) approach and are designed to measure students' understanding and their ability to coordinate theory and evidence to construct domain-specific arguments about the structure of matter and about ecology.

Primary Research Method: We plan to use the BEAR Assessment System and its online companion, the BEAR Assessment System Software (BASS), to develop and refine the assessment materials. The BEAR Assessment System is an integrated approach to developing assessments that provides meaningful interpretations of student work relative to the cognitive and developmental goals of the domain. The system applies new theories and methodologies from the field of assessment to the practice of teacher-managed, classroom-based assessment of student performance. It is grounded in four building blocks (progress map, item design, outcome space, and measurement model) that guide assessment development. The learning progressions (Structure of Matter, Ecology, and Scientific Argumentation) are each composed of several constructs, and for each construct we will iterate through the four building blocks to gather high-quality empirical evidence for improving the assessments.
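
As a minimal worked example of the measurement-model building block (a sketch assuming a Rasch-family approach, which the project may or may not ultimately adopt), a partial credit model gives the probability that student j reaches score category k on item i of a construct as

  P(X_{ij} = k \mid \theta_j) = \frac{\exp\left[\sum_{m=0}^{k} (\theta_j - \delta_{im})\right]}{\sum_{l=0}^{K_i} \exp\left[\sum_{m=0}^{l} (\theta_j - \delta_{im})\right]}, \qquad \sum_{m=0}^{0} (\theta_j - \delta_{i0}) \equiv 0,

where \theta_j is the student's location on the construct, \delta_{im} are the item's step difficulties, and K_i is the item's maximum score. Fitting such a model to the outcome-space scores is one way to place students and item responses on the same progress map and to generate the empirical evidence used to revise the items.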

Selected References

Osborne, J., Henderson, J. B., MacPherson, A., Szu, E., Wild, A., & Yao, S.-Y. (2016). The development and validation of a learning progression for argumentation in science. Journal of Research in Science Teaching, 53(6), 821-846.

Yao, S.-Y., Wilson, M., Henderson, J. B., & Osborne, J. (2015). Investigating the function of content and argumentation items in a science test: A multidimensional approach. Journal of Applied Measurement.

Chiu, T. (2014). Exploring language and gender differences on a scientific argumentation test. Unpublished prequalifying paper, Graduate School of Education, University of California, Berkeley.

Black, P., Wilson, M., & Yao, S.-Y. (2011). Road maps for learning: A guide to the navigation of learning progressions. Measurement: Interdisciplinary Research and Perspectives, 9(2-3), 71-123.

Morell, L., Chiu, T., Black, P., & Wilson, M. (submitted). A construct-modeling approach to develop a learning progression of how students understand the structure of matter.