
Dr. Bolt joined the Department of Educational Psychology in the spring of 1999, coming from the Laboratory for Educational and Psychological Measurement at the University of Illinois. In addition to his own research, he collaborates on various projects related to the development and statistical analysis of educational and psychological tests. Dr. Bolt teaches courses in test theory, factor analysis, and hierarchical linear modeling.
Research Statement: "My interests are in the theory and application of psychometric methods in education and psychology. I am especially interested in the application of latent variable models for purposes of test validation, assessment of individual differences (such as response styles), and modeling student growth.
Most of my research is in item response theory (IRT), including its application to issues such as differential item functioning and test dimensionality assessment. I am also interested in the development of nonparametric IRT methods, which relax certain modeling assumptions and have the potential to increase the flexibility and efficiency of IRT in many testing applications."
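As an illustration of the kind of model central to IRT research (not code from Dr. Bolt's own work), the following sketch implements the standard two-parameter logistic (2PL) item response function, which gives the probability of a correct response as a function of examinee ability and the item's discrimination and difficulty parameters:

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that an
    examinee with ability theta answers an item correctly, given
    item discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An examinee whose ability equals the item's difficulty has a
# 50% chance of a correct response, regardless of discrimination:
p = irt_2pl(theta=0.0, a=1.2, b=0.0)
print(round(p, 2))  # 0.5
```

Differential item functioning, one of the issues named above, arises when this response probability differs across examinee groups even after matching on ability.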
Representative publications:
Bolt, D.M., Wollack, J.A., & Suh, Y. (2012). Application of a multidimensional nested logit model to multiple-choice test items. Psychometrika, 77(2), 263-287.
Bolt, D.M., & Newton, J.R. (2011). Multiscale measurement of extreme response style. Educational and Psychological Measurement, 71, 814-833.
Suh, Y., & Bolt, D.M. (2011). A nested logit approach to detecting differential distractor functioning. Journal of Educational Measurement, pp. 188-205.
Suh, Y., & Bolt, D.M. (2010). Nested logit models for multiple-choice item response data. Psychometrika, 75(3), 454-473.
Johnson, T.R., & Bolt, D.M. (2010). On the use of factor-analytic multinomial logit item response models to account for individual differences in response style. Journal of Educational and Behavioral Statistics, 35(1).
Bolt, D.M., & Johnson, T.R. (2009). Addressing score bias and DIF due to individual differences in response style. Applied Psychological Measurement, 33(5), 335-352.
Bolt, D.M., Piper, M.E., McCarthy, D.E., Japuntich, S.J., Fiore, M.C., Smith, S.S., & Baker, T.B. (2009). The Wisconsin Predicting Patients' Relapse questionnaire. Nicotine & Tobacco Research, 11(5), 481-492.
Park, C., & Bolt, D.M. (2008). Application of multilevel IRT to investigate cross-national skill profiles on TIMSS 2003. IERI Monograph Series: Issues and Methodologies in Large-Scale Assessments, 1, 71-96.
Kim, J., & Bolt, D.M. (2007). Estimating item response theory models using Markov Chain Monte Carlo. Educational Measurement: Issues and Practice, pp. 38-51.
Bolt, D.M., Cohen, A.S., & Wollack, J.A. (2001). A mixture item response model for multiple-choice data. Journal of Educational and Behavioral Statistics, 26, 381-409.
Contact Information
Email: dmbolt@facstaff.wisc.edu
Phone: (608) 262-4938
Office: 1082A Ed Sciences
Website: http://www.education.wisc.edu/edpsych/default.aspx?content=bolt.html
Completed Projects
Coordination, Consultation, and Evaluation Center for Implementing K-3 Behavior and Reading Intervention Models
Development of a Plan for a Study of Best Practices in After-School Programming
Do After-School Programs Affect Student Experience? An Enhancement Study to the 21st Century CLC Evaluation
An Integrated Qualitative and Quantitative Evaluation of the SAGE Program
National Center for Improving Student Learning and Achievement in Mathematics and Science (NCISLA)
National Center for Improving Student Learning and Achievement in Mathematics and Science--Design Collaborative--Elementary
Reading Excellence and Demonstration of Success (READS) Program
Study of Promising After-School Programs
Systemic Initiatives: Student Achievement Analysis Study
Using DIF Analyses to Examine the Effects of Testing Accommodations on Students' Responses to Test Items