Measuring Diagnostic Competence: An Exploration of Error Analysis Ability and the Roles of Procedural and Conceptual Knowledge [Online]


Few measurement instruments exist for assessing teachers’ diagnostic competence, although it has been identified as a key component of successful teaching practice in mathematics. This study aims to confirm the hypothesized structure of a subdomain of diagnostic competence in mathematics — the construct of error analysis ability — through the development of a valid and reliable measurement instrument. This work also investigates the roles of procedural and conceptual knowledge within error analysis, which are hypothesized as qualitatively distinct but parallel levels within the construct. Using the BEAR Assessment System (Wilson, 2005), a measure of error analysis ability was developed and administered online to a sample of 198 respondents. Responses were scored according to item-specific outcome spaces and analyzed by calibrating a unidimensional partial-credit model (Masters, 1982).

Item and person results were examined, and a majority of items functioned as expected. Thurstonian thresholds for 20 items revealed that many were aligned with their respective construct levels. Items pertaining to procedural knowledge were located lower than most, but not all, items pertaining to conceptual knowledge. Additional reliability and validity evidence is presented in the results and discussion sections. Overall, the results suggest that the construct of error analysis ability does indeed follow the hypothesized structure. However, the evidence for the roles of conceptual and procedural knowledge indicates that the construct may be multidimensional, as most respondents found it easier to provide procedural (as opposed to conceptual) remediation when analyzing student errors. Future research in this context could provide further insight into the structure of the construct through model selection and the development of a greater number of items pertaining to procedural and conceptual error analysis.

Rebecca McNeil is an independent scholar. She studies quantitative and qualitative methods in educational and psychological research, with research interests including measurement, assessment development, program evaluation, and applications of data science in education. She completed her Master’s degree in Social Research Methodologies at the University of California, Berkeley in Fall 2020, and is currently a collaborator on research projects with researchers at the BEAR Center.

Tuesday, February 23, 2021 - 2:00pm
Online session
Presentation slides (PDF, 534.83 KB)