New Approaches for Medicaid: The 1115 Demonstration Evaluation
Prepared for:
U.S. Department of Health and Human Services, Centers for Medicare & Medicaid Services, Center for Medicaid and CHIP Services
Key Features of the Evaluation Design Plan:
Medicaid section 1115 demonstration waivers offer states wide flexibility to test new approaches to administering Medicaid programs that depart from existing federal rules yet are consistent with the overall goals of the program. Section 1115 demonstrations offer many design choices for states to test, ranging from provider payment reforms, expanded coverage, cost sharing, and behavioral incentives to delivery system reforms such as managed care. Although state approaches to section 1115 demonstrations vary, many demonstrations share the common goals of controlling costs while improving access and quality. Evaluating the degree to which each demonstration achieves these and other goals, such as system transformation, and identifying the links between program characteristics and program results, is critical to the Centers for Medicare & Medicaid Services (CMS). The experiences and results will provide the federal government and states with evidence to inform policy at all levels and to improve future section 1115 demonstrations.
In September 2014, CMS contracted with Mathematica Policy Research and its partners, Truven Health Analytics and the Center for Health Care Strategies, to conduct a national, cross-state evaluation of four types of Medicaid section 1115 demonstration waivers: (1) delivery system reform incentive payment (DSRIP) programs, (2) premium assistance for Medicaid expansions, (3) beneficiary engagement/premium payments, and (4) managed long-term services and supports (MLTSS). This contract, which is projected to be ongoing through federal fiscal year 2019, will track the general performance of the demonstrations and evaluate demonstration impacts and outcomes. Results of the evaluation will be presented in periodic rapid-cycle reports, as well as in interim and final evaluation reports. The work also will include detailed assessments of data sources and of state-led monitoring, evaluation, and diffusion or replication activities. This report lays out the general design and approach of the evaluation of these demonstrations.