Improving Impact Evaluations with Low-Cost, Automated, and Scalable Tools for Implementation Research
Anglin, Kylie, Education - School of Education and Human Development, University of Virginia
Wong, Vivian, Leadership, Foundations & Policy Studies, University of Virginia
This dissertation develops low-cost, automated, and scalable solutions to a key challenge in educational evaluations: monitoring and assessing program implementation. For program evaluations to be most useful to policymakers and practitioners, they need to answer questions not only about effectiveness but also about implementation. How was the program implemented? Was it implemented as its designers intended, or was it adapted to local needs and constraints? Unfortunately, evaluators often face substantial logistical, budgetary, and methodological challenges in monitoring intervention implementation in the field. This dissertation develops and applies computational tools to help researchers meet these challenges. In the first chapter, I develop an efficient method of collecting policy implementation data from school district websites using web-scraping and natural language processing. The second chapter illustrates the insights that can be gained from this method of data collection in the context of a massive deregulation effort in Texas under the District of Innovation statute. Finally, in the third chapter, I develop a scalable and low-cost method of assessing treatment fidelity using semantic similarity. Together, these chapters demonstrate how computational methods can be applied in education research to improve our understanding of program implementation.
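To illustrate the kind of semantic-similarity fidelity measure described in the third chapter, the sketch below scores hypothetical session transcripts against a scripted protocol using TF-IDF vectors and cosine similarity. This is a minimal sketch, not the dissertation's actual method; the protocol text, transcripts, and the TF-IDF representation are all assumptions chosen for illustration.

```python
# Minimal sketch: score transcripts against a scripted protocol with
# TF-IDF vectors and cosine similarity. All texts here are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

protocol = "Today we will practice identifying the main idea of a passage."
transcripts = [
    "Let's practice finding the main idea of this passage together.",
    "Open your workbooks to page twelve and copy the vocabulary list.",
]

# Fit on the protocol plus transcripts so all documents share one vocabulary.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([protocol] + transcripts)

# Cosine similarity of each transcript to the scripted protocol; higher
# scores suggest delivery that is textually closer to the intended script.
scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
for text, score in zip(transcripts, scores):
    print(f"{score:.2f}  {text}")
```

In this toy example, the first transcript scores well above the second, flagging the second session for possible adaptation or drift from the protocol; a production measure would likely use richer text representations than raw TF-IDF.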
PHD (Doctor of Philosophy)
program implementation, treatment fidelity, natural language processing, impact evaluation, data science, education policy, deregulation, web-scraping
Institute of Education Sciences; National Academy of Education/Spencer Dissertation Fellowship Program
English
2021/04/29