Mathematica supports the U.S. Department of Education’s Supporting Effective Educator Development (SEED) grantees in disseminating the lessons learned from their grants to the broader education field. We partner with grantees to communicate their insights through webinars, briefs, communities of practice,...
- Rapid-cycle evaluation
- Evidence-based decision making
- Technical assistance
- Evaluation methodology
- Teacher evaluation
- Value-added methodology
- Analytical technical assistance and development
- Effective Data Use
- Strengthening and Disseminating Research
- Teacher and Principal Effectiveness
- Labor: Strengthening and Disseminating Research
- Human Services
Alexandra Resch is an expert in program evaluation methodology, teacher evaluation, and value-added methods. Resch’s current work is focused on making research more accessible to end users and helping states and localities use complex methods to improve programs and policies.
Resch leads several projects that use rapid-cycle evaluation to help school districts make better decisions. She leads a team developing a toolkit that will allow school districts and other users to conduct rigorous, quick-turnaround evaluations of the education technologies they use in schools. She also leads teams working with Race to the Top-District grantees to conduct rapid-cycle evaluations of the personalized learning strategies implemented through their grants. In related work focused on reducing barriers to rigorous evaluation, she recently co-authored two guides for the U.S. Department of Education on opportunistic experiments, a way of embedding rigorous research in planned pilots or policy changes. She is also providing technical assistance to state Temporary Assistance for Needy Families agencies on conducting efficient, rigorous evaluations using existing administrative and program data.
Resch has played key roles in several teacher evaluation projects. She serves as project director for the development of a new teacher and principal evaluation system supporting the Charleston County (SC) School District's Teacher Incentive Fund (TIF) grant, an effort that includes developing value-added measures of teacher and principal effectiveness. She directed Mathematica's value-added work for the Washington, DC, public schools for use in high-stakes teacher assessment systems. For the evaluation of the federal TIF, she provided technical assistance to grantees on the design and implementation of performance-based teacher and principal compensation systems.
Resch writes for a wide range of audiences, publishing documents ranging from peer-reviewed journal articles to user-friendly briefs directed at school leaders. She has been published in the B.E. Journal of Economic Analysis & Policy, the Review of Economics and Statistics, and the National Tax Journal. Resch holds a Ph.D. in public policy and economics from the University of Michigan.
Improving Educator Effectiveness Through Partnerships and Collaboration
KIPP: Preparing Youth for College
Mathematica built on its initial study of KIPP middle schools with this five-year project, designed to address the question of whether KIPP can maintain its effectiveness as the network grows. The study included an impact analysis, an implementation analysis, and a correlational analysis.
Multi-Dimensional Educator Evaluation Framework and Educator Effectiveness Rating System
Mathematica is designing evaluation models for both teachers and school administrators, designing and producing value-added models, and providing technical and analytic assistance.
Value-Added Assessment System for DC Schools and Teachers
We designed value-added models to measure teacher and school effectiveness and have produced annual estimates of educators' contributions to achievement in grades 4-8 since the 2008-2009 school year. These estimates are combined with other measures of educator effectiveness in the DC Public Schools' (DCPS) IMPACT system.
Analytic and Technical Support for Education Researchers and Practitioners
Mathematica provides analytic and technical support to the Regional Educational Laboratories and the broader education community, systematically assessing needs and creating materials and training that advance education research and share best practices.
Rapid-Cycle Tech Evaluations Accelerate Decisionmaking
This project develops, field tests, and disseminates easy-to-use evaluation resources through a web-based, interactive toolkit, enabling low-cost, quick-turnaround evaluations that use rapid-cycle approaches.
Providing Timely and Reliable Evidence for Schools at No Cost
At this week’s Future of Education Technology Conference, Mathematica Policy Research and the Office of Educational Technology at the U.S. Department of Education launched a new web-based, interactive research tool for education administrators: the Ed Tech RCE Coach.
Understanding Evidence: New Guide Explains Four Key Types and How to Evaluate Them
A new guide from CIRE describes four key types of evidence (anecdotal, descriptive, correlational, and causal), explains how to tell which type provides strong support for claims about effectiveness, and orders them from weakest to strongest. It also gives examples of common sources for each type.
New Guide for Researchers on Experiments in Education
A new guide shows how "opportunistic experiments" can build evidence by incorporating rigorous research studies into the normal course of action. This approach to conducting randomized controlled trials takes advantage of planned interventions or policy actions, all with minimal cost and disruption.
Building the Knowledge Base on Teacher Preparation and Effectiveness
Mathematica designed and conducted three large-scale studies on the relationship between teacher preparation and effectiveness, using the most rigorous approach possible—random assignment of students to teachers from different kinds of programs—and compared student test scores to gauge teacher effectiveness.