Methodological Tools for Research Design and Evaluation

Author:
Kush, Joseph, Education - School of Education and Human Development, University of Virginia
Konold, Timothy, Leadership, Foundations & Policy Studies, University of Virginia
Bradshaw, Catherine, Human Services, University of Virginia
Soland, Jim, Leadership, Foundations & Policy Studies, University of Virginia
Hull, Michael, Leadership, Foundations & Policy Studies, University of Virginia

In social and educational science research, methodologically sound research designs are essential for studying complex and multifaceted research problems. Understanding the conditions under which various research designs and statistical methods perform well or poorly is important for informing best practices. Monte Carlo simulation is well suited to investigating the performance of quantitative methodologies and has long been used to explore questions related to sample size and statistical power, model specification, and bias and variability in effect estimates, among other areas of interest.
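The abstract does not include code; as a minimal illustration of the Monte Carlo approach it describes, the sketch below estimates the statistical power of a two-sample Welch t-test by repeated simulation. All function names, effect sizes, and sample sizes here are hypothetical choices for illustration, not values from the dissertation, and a normal approximation is used for the p-value.

```python
import math
import random

def welch_t(x, y):
    """Welch t statistic for two independent samples."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    vx = sum((v - mx) ** 2 for v in x) / (len(x) - 1)
    vy = sum((v - my) ** 2 for v in y) / (len(y) - 1)
    return (mx - my) / math.sqrt(vx / len(x) + vy / len(y))

def norm_p(t):
    """Two-sided p-value via a normal approximation to the t distribution."""
    return math.erfc(abs(t) / math.sqrt(2))

def mc_power(n_per_group, effect, reps=2000, alpha=0.05, seed=1):
    """Estimate power: the fraction of simulated datasets in which the
    null hypothesis of no group difference is (correctly) rejected."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        control = [rng.gauss(0.0, 1.0) for _ in range(n_per_group)]
        treatment = [rng.gauss(effect, 1.0) for _ in range(n_per_group)]
        hits += norm_p(welch_t(treatment, control)) < alpha
    return hits / reps
```

For a standardized effect of 0.5 with 64 observations per group, the simulated power should land near the textbook analytic value, and rerunning with larger samples shows power rising toward 1, which is the kind of design question such simulations answer.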

This three-paper dissertation centers on methodological tools used in applied research design and evaluation. With the overall aim of informing research practice in social and educational settings, these studies examine methodological concerns and challenges that often arise in these and other applied contexts (e.g., schools or communities). The first paper addresses the use of propensity score matching methods in quasi-experimental designs, with a specific focus on intervention scale-up designs. The second paper examines statistical power in cluster randomized trials with clusters of varying size. The third paper investigates how the sampling ratio relates to bias and variability in the measurement and estimation of aggregated latent level-2 constructs.
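As a hedged sketch of the kind of question the second paper examines, the simulation below estimates power for a cluster randomized trial in which cluster sizes vary. The alternating treatment assignment, the intraclass correlation value, the cluster-mean analysis, and the normal p-value approximation are all illustrative assumptions made here, not the dissertation's actual models.

```python
import math
import random

def welch_t(x, y):
    """Welch t statistic for two independent samples."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    vx = sum((v - mx) ** 2 for v in x) / (len(x) - 1)
    vy = sum((v - my) ** 2 for v in y) / (len(y) - 1)
    return (mx - my) / math.sqrt(vx / len(x) + vy / len(y))

def crt_power(cluster_sizes, effect, icc=0.10, reps=1000, alpha=0.05, seed=2):
    """Estimate power for a cluster randomized trial analyzed on cluster means.

    Clusters alternate between treatment and control arms; total outcome
    variance is fixed at 1 and split between levels by the intraclass
    correlation (icc)."""
    rng = random.Random(seed)
    tau = math.sqrt(icc)        # between-cluster SD
    sigma = math.sqrt(1 - icc)  # within-cluster SD
    hits = 0
    for _ in range(reps):
        treated_means, control_means = [], []
        for j, n in enumerate(cluster_sizes):
            treated = j % 2 == 0
            mu = effect if treated else 0.0
            ybar = (mu + rng.gauss(0.0, tau)
                    + sum(rng.gauss(0.0, sigma) for _ in range(n)) / n)
            (treated_means if treated else control_means).append(ybar)
        t = welch_t(treated_means, control_means)
        p = math.erfc(abs(t) / math.sqrt(2))  # two-sided, normal approximation
        hits += p < alpha
    return hits / reps
```

Comparing, say, `crt_power([20] * 20, 0.4)` against `crt_power([5, 35] * 10, 0.4)` holds the total sample size fixed while varying cluster sizes, which illustrates the general finding that unequal cluster sizes tend to reduce power relative to equal ones.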

PhD (Doctor of Philosophy)
Propensity Score Matching, Statistical Power, Structural Equation Modeling, Quasi-experimental, Multilevel Modeling, Latent Variable, Sampling Ratio