Christopher Rhoads
Titles:
Associate Professor
Research Methods, Measurement and Evaluation
Academic Degrees:
Ph.D., Statistics, Northwestern University, 2008
M.S., Applied Mathematics and Statistics, Stony Brook University, 2003
B.A., Philosophy, Haverford College, 1996
Areas of Expertise:
Optimal Design of Quantitative Educational Research
Causal Inference
Regression Discontinuity Designs
Power Analysis and Sample Size Determination
Randomized Field Trials in Education
Biography:
Dr. Christopher Rhoads received his Ph.D. in Statistics from Northwestern University (NU). He entered his current position following a three-year postdoctoral fellowship at the Institute for Policy Research at NU. Dr. Rhoads’ research focuses on methodological and statistical approaches to improving causal inference in policy-relevant research, particularly the design and analysis of large field studies for the purposes of policy evaluation. He has published articles in outlets such as the Journal of Educational and Behavioral Statistics, the Journal of Research on Educational Effectiveness, and the British Journal of Mathematical and Statistical Psychology, and has been acknowledged as an outstanding peer reviewer by two scholarly journals.
Dr. Rhoads has been a member of research teams conducting evaluation and efficacy grants in the areas of educational technology and gifted education (Institute of Education Sciences), elementary mathematics teacher professional development (U.S. Department of Education), college persistence for non-traditional students (National Science Foundation), and housing and child welfare (Administration for Children and Families). He also serves on the advisory boards of several IES- and NSF-funded projects. He is a regular presenter at the IES-funded Summer Research Training Institute for Cluster Randomized Trials, where he lectures on longitudinal models.
Rhoads’ current methodological work involves:
- Techniques for generalizing to policy-relevant populations when there are multi-level selection processes.
- Using prior information about the correlation structure to improve power and precision in experiments with clustering.
- Determining optimal experimental designs for regression discontinuity studies.
- Exploring uses of implementation fidelity measures to improve learning from field experiments in education.
Current and past funded research projects (selected):
2012-2018. Co-Investigator: Evaluation Methodology and Statistics. Grant to study, develop and disseminate a model of Intensive Supportive Housing for Families. Children’s Bureau, Administration for Children, Youth and Families (ACF), U.S. Department of Health and Human Services ($5 million to CT Dept. of Children and Families, $1.1 million to UConn; PI: Anne Farrell).
2017-2019. Co-Principal Investigator. Factors Affecting Comprehension by Teens During Online Reading in Science. Institute of Education Sciences ($599,852, PI: Don Leu).
2017-2022. Co-Investigator. Project LIFT: Learning Informs Focused Teaching. U.S. Dept. of Education, Jacob Javits Gifted and Talented Program ($2.41 million, PI: Catherine Little).
2019-2024. Co-Principal Investigator (UConn PI). What happens when students can’t get into an online STEM course section? Online STEM course scarcity, student time poverty, and academic momentum. NSF ($2.5 million, PI: Claire Wladis).
2020-2025. Co-Principal Investigator. National Center for Research on Gifted Education. Institute of Education Sciences ($5 million, PI: Del Siegle).
2018-2022. Co-Principal Investigator (UConn PI). Foundations for Success: Developing Effective Mathematics Educators through Cognitively Guided Instruction. U.S. Department of Education SEED program ($9.7 million, PI: Rob Schoen).
2021-2025. Co-Investigator. Fostering Computational Thinking through Neural Engineering Activities in High School Biology Classes. NSF ($1.5 million, PI: Ido Davidesco).
Selected Recent Publications:
Li, Y. and Rhoads, C. (2020). Applying Inverse Propensity of Participation Weights to Multilevel Models for the Purpose of Generalizing Large-Scale Cluster Randomized Trials in Education. Testing, Psychometrics, Methodology in Applied Psychology, 27(3), 453-476.
Upshur, C., Wenz-Gross, M., Rhoads, C., Heyman, M., Yoo, Y. and Sawosik, G. (2019). A randomized efficacy trial of the Second Step Early Learning (SSEL) curriculum. Journal of Applied Developmental Psychology, 62, 145-159.
Rhoads, C. (2017). Coherent power analysis in multi-level studies using parameters from surveys. Journal of Educational and Behavioral Statistics, 42(2), 166-194.
Rhoads, C. (2016). The Implications of Contamination for Educational Experiments with Two Levels of Nesting. Journal of Research on Educational Effectiveness, 9(4), 531-555.
Louie, J., Rhoads, C., and Mark, J. (2016). Challenges when using the Regression Discontinuity Design in educational evaluations: Lessons from the Transition to Algebra study. American Journal of Evaluation, 37(3), 381-407.
Rhoads, C. and Dye, C. (2016). Optimal Design for Two Level Random Assignment and Regression Discontinuity Studies. Journal of Experimental Education, 84(3), 421-448.
Rhoads, C. (2014). Under what circumstances does external knowledge about the correlation structure improve power in cluster randomized designs? Journal of Research on Educational Effectiveness, 7(2), 205-224.
Hedges, L.V. and Rhoads, C. (2011). Correcting an Analysis of Variance for Clustering. British Journal of Mathematical and Statistical Psychology, 64(1), 20-37.
Rhoads, C. (2011). The Implications of Contamination for Experimental Design in Education Research. Journal of Educational and Behavioral Statistics, 36(1), 76-104.
Hedges, L. V. and Rhoads, C. (2009). Statistical Power Analysis in Education Research (NCSER 2010-3006). Washington, DC: National Center for Special Education Research, Institute of Education Sciences, U.S. Department of Education.
Email: christopher.rhoads@uconn.edu
Phone: 860 486 3321
CV: Rhoads-Chris-CV-current-profile-Nov-2022
Mailing Address: Unit 3064
Office Location: Gentry 337
Campus: Storrs