Exploring the Impact of Q-Matrix Specifications Through a DINA Model in a Large-Scale Mathematics Assessment

Wu H., Liang X., Yuerekli H., Becker B. J., Paek I., Binici S.

JOURNAL OF PSYCHOEDUCATIONAL ASSESSMENT, vol.38, no.5, pp.581-598, 2020 (SSCI)

  • Publication Type: Article
  • Volume: 38 Issue: 5
  • Publication Date: 2020
  • Doi Number: 10.1177/0734282919867535
  • Journal Indexes: Social Sciences Citation Index (SSCI), Scopus, CINAHL, EBSCO Education Source, Educational Research Abstracts (ERA), ERIC (Education Resources Information Center), PsycINFO
  • Page Numbers: pp.581-598
  • Keywords: DINA, Q-matrix, large-scale mathematics assessment, attribute, IRT, classification accuracy
  • Yıldız Technical University Affiliated: Yes


The demand for diagnostic feedback has triggered extensive research on cognitive diagnostic models (CDMs), such as the deterministic input, noisy "and" gate (DINA) model. This study explored two Q-matrix specifications with the DINA model in a statewide large-scale mathematics assessment. The first Q-matrix was developed from five predefined content reporting categories, and the second from the post hoc coding of 15 attributes by test-development experts. Total raw scores correlated strongly with the number of skills mastered under both Q-matrices. Correlations between the DINA-model item statistics and those from the item response theory analyses were moderate to strong, but were consistently lower for the 15-skill model. The results highlight the trade-off between finer-grained modeling and less precise model estimation.
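To make the DINA model and its Q-matrix concrete, the following is a minimal sketch, not the study's implementation. The Q-matrix entry q_jk flags whether item j requires attribute k; an examinee answers correctly with probability 1 − s_j (slip) when all required attributes are mastered, and with probability g_j (guess) otherwise. All numbers below are made up for illustration.

```python
import numpy as np

# Hypothetical toy setup: 3 items, 2 attributes.
# Q-matrix: q_jk = 1 if item j requires attribute k.
Q = np.array([[1, 0],
              [0, 1],
              [1, 1]])

slip = np.array([0.10, 0.20, 0.15])   # s_j: P(incorrect | all required skills mastered)
guess = np.array([0.20, 0.25, 0.10])  # g_j: P(correct | some required skill missing)

def dina_prob(alpha, Q, slip, guess):
    """P(correct) on each item under the DINA model for mastery profile alpha."""
    # eta_j = 1 iff the examinee masters every attribute item j requires
    eta = np.all(alpha >= Q, axis=1).astype(float)
    return ((1 - slip) ** eta) * (guess ** (1 - eta)) * (slip ** 0 if True else 1) if False else \
           (1 - slip) ** eta * guess ** (1 - eta) * (slip * 0 + 1) ** 0 * ((1 - guess) ** 0)

alpha = np.array([1, 0])  # masters attribute 1 only
print(dina_prob(alpha, Q, slip, guess))
```

A simpler and equivalent return line is `(1 - slip) ** eta * guess ** (1 - eta)`: when eta = 1 the probability is 1 − s_j, and when eta = 0 it is g_j. For the profile above, only item 1's requirement is met, so the probabilities are 0.9, 0.25, and 0.1.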