IOMW 2020 Spotlight Talk: A Model-Data-Fit-Informed Approach to Score Resolution in Rater-Mediated Assessments (Session 1/4) [Online]

Abstract

Many large-scale performance assessments include score resolution procedures for resolving discrepancies in rater judgments. The goal of score resolution is conceptually similar to that of person fit analyses: to identify students for whom observed scores may not accurately reflect their achievement. Previously, researchers have observed that rater agreement methods and person fit analyses lead to similar conclusions about which students’ achievement estimates warrant additional investigation, and that score resolution generally improves person fit. We consider the implications of using person fit analysis as an initial step to identify performances for score resolution, and of using fit indices to identify raters who can provide additional ratings. We simulated student responses to multiple-choice items and a writing task, introduced various types of person misfit in the writing task, and identified the persons who needed resolution using a model-data fit index. Results indicate larger improvements in person fit after resolution when the person fit approach was used than when a rater agreement approach was used. With the fit-informed approach, person fit improved for ≥ 98% of the misfitting students; with the rater agreement approach, person fit improved for around one third of these students. We conclude by considering the implications of our findings for mixed-format assessments.
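The abstract does not specify which model-data fit index was used to flag students. As a rough illustration of the general idea, the sketch below computes an unweighted (outfit) mean-square person fit statistic under a dichotomous Rasch model and flags persons exceeding an illustrative cutoff; the data, parameter values, cutoff, and function names are all hypothetical and simplified relative to the mixed-format, rater-mediated design described in the study.

```python
import numpy as np

def rasch_prob(theta, b):
    """Probability of a correct response under the dichotomous Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

def person_outfit_msq(responses, theta, b):
    """
    Unweighted (outfit) mean-square person fit statistic.

    responses : (n_persons, n_items) array of 0/1 scores
    theta     : (n_persons,) person ability estimates (logits)
    b         : (n_items,) item difficulty estimates (logits)
    """
    p = rasch_prob(theta, b)            # model-expected score for each response
    var = p * (1.0 - p)                 # binomial variance of each response
    z_sq = (responses - p) ** 2 / var   # squared standardized residuals
    return z_sq.mean(axis=1)            # average over items -> one value per person

# Hypothetical example: flag persons whose outfit exceeds a conventional cutoff
rng = np.random.default_rng(0)
theta = rng.normal(size=200)
b = rng.normal(size=30)
responses = (rng.random((200, 30)) < rasch_prob(theta, b)).astype(float)

outfit = person_outfit_msq(responses, theta, b)
flagged = np.where(outfit > 1.5)[0]     # 1.5 is an illustrative cutoff, not the study's
print(f"{flagged.size} persons flagged for possible score resolution")
```

In a fit-informed resolution workflow of the kind described above, the flagged persons would be the ones whose writing-task scores are routed to additional raters, rather than only those performances showing rater disagreement.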

Dr. Stefanie A. Wind is an Associate Professor of Educational Measurement at the University of Alabama. Her primary research interests include the exploration of methodological issues in the field of educational measurement, with emphases on methods related to rater-mediated assessments, rating scales, latent trait models (i.e., Rasch models and item response theory models), and nonparametric item response theory, as well as applications of these methods to substantive areas related to education.

Dr. Wind has published her research in a number of peer-reviewed journals, including methodological journals in the field of educational measurement (e.g., Journal of Educational Measurement, Educational Measurement: Issues and Practice, Educational and Psychological Measurement) as well as applied journals (e.g., Language Testing, Assessing Writing, Science Education, Studies in Educational Evaluation). In 2018, she co-authored the book Invariant Measurement with Raters and Rating Scales, which was published by Taylor & Francis. She has also received awards for her research, including the Alicia Cascallar Early Career Scholar Award from the National Council on Measurement in Education, the Exemplary Paper Award from the Classroom Observation SIG of AERA (2017), and the Georg William Rasch Early Career Scholar Award from the Rasch SIG of AERA (2015).

Adrienne Walker is the program manager for the Assessment Research and Development unit at the Georgia Department of Education. Her team supports the technical quality and defensibility of Georgia’s assessment program by monitoring operational psychometric processes and collecting validity evidence. Adrienne received her Ph.D. in Educational Studies from Emory University in 2016. Her research and professional interests include Rasch measurement theory and exploring the utility of person fit for score interpretation in large-scale assessment.

Note 1: This talk is part of a series of talks proposed for the IOMW 2020 Conference, offered as a warm-up to the conference, which will be held virtually.

Note 2: For any questions about the study, please contact Dr. Stefanie Wind (stefanie.wind@ua.edu).

Date: Tuesday, September 15, 2020 - 2:00pm
Building: Online session
Room: Zoom
Attachment: Presentation slides (PDF, 1.9 MB)