Quantifying cognitive bias in educational researchers

Title: Quantifying cognitive bias in educational researchers
Publication Type: Journal Article
Year of Publication: 2020
Authors: Bierema, A., Hoskinson, A.-M., Moscarella, R. A., Lyford, A., Haudek, K. C., Merrill, J. E., Urban-Lurain, M.
Journal: International Journal of Research & Method in Education
Date Published: 08/2020
Abstract: As we take advantage of new technologies that allow us to streamline the coding process for large qualitative datasets, we must consider whether human cognitive bias may introduce statistical bias in the process. Our research group analyzes large sets of student responses by developing computer models that are trained on human-coded responses using a suite of machine-learning techniques. Once a model is initially trained, it may be insufficiently accurate. Increasing the number of human-coded responses typically improves these models to an acceptable level of accuracy. Alternatively, instead of coding responses from scratch, humans can rapidly increase the number of coded responses by verifying the computer-predicted code for each response. However, having access to this information may bias human coders. We designed the present study to test for differences in the magnitude and direction of coders' agreement with computer-predicted codes during model calibration when information about those predicted codes is available. Our results indicate human coding bias, even though the coders were disciplinary experts aware that cognitive bias can create statistical bias, and show that the magnitude and direction of that bias vary across experts.
DOI: 10.1080/1743727X.2020.1804541
Refereed Designation: Refereed
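
The comparison described in the abstract boils down to measuring how often human coders agree with computer-predicted codes when the predictions are hidden versus when they are shown during verification, and in which direction any disagreements shift. Below is a minimal illustrative sketch in Python of that kind of comparison; it is not the authors' analysis code, and the made-up binary codes, the helper function, and the use of Cohen's kappa as an agreement statistic are all assumptions for the example.

"""Illustrative sketch (not the study's code): compare human agreement with
computer-predicted codes when predictions are hidden vs. shown.
All codes below are invented for demonstration."""
from sklearn.metrics import cohen_kappa_score

# Binary codes (1 = concept present, 0 = absent) for the same 12 student responses.
predicted       = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1]  # computer-model predictions
coded_blind     = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1]  # coded without seeing predictions
coded_verifying = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1]  # coded by verifying shown predictions

def agreement(human, machine):
    """Proportion of responses on which the human code matches the predicted code."""
    return sum(h == m for h, m in zip(human, machine)) / len(human)

for label, codes in [("blind", coded_blind), ("verifying", coded_verifying)]:
    print(f"{label:10s} agreement = {agreement(codes, predicted):.2f}, "
          f"kappa = {cohen_kappa_score(codes, predicted):.2f}")

# Direction of the bias: among responses the blind coder scored differently from
# the model, how many did the verifying coder flip to match the prediction?
flipped = sum(b != p and v == p for b, v, p in zip(coded_blind, coded_verifying, predicted))
disagreements = sum(b != p for b, p in zip(coded_blind, predicted))
print(f"{flipped} of {disagreements} blind disagreements moved toward the prediction")

In a study of this design, a systematic difference between the blind and prediction-visible conditions, together with the side of the predicted code on which disagreements fall, would be the evidence of coding bias and of its direction.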


This material is based upon work supported by the National Science Foundation (DUE grants 1438739, 1323162, 1347740, 0736952, and 1022653). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the NSF.