Section 1115 Medicaid demonstration waivers enable states to test potential innovations or alterations to Medicaid that would not otherwise be allowed under existing law. Demonstration waivers generally reflect evolving priorities at the federal and state levels and can vary in scope – covering the entirety of a state’s Medicaid program, or more narrowly providing otherwise ineligible adults access to specific benefits, such as family planning services.

In the last 10 years, there has been an increase in 1115 demonstration proposals and approvals. With this growth in activity, the Centers for Medicare & Medicaid Services (CMS) has become increasingly interested in improving the rigor of evaluations that examine demonstrations’ impacts on beneficiaries, providers, health plans, and states, including impacts on access, quality of care, and costs as proposed in a state’s plan. However, until recently, there was limited guidance and few examples of recommended evaluation methods, approaches, data resources, and related funding to support state efforts in responding to this demonstration project requirement.

AcademyHealth has worked with experts to discuss evaluation priorities and identify key issues associated with 1115 evaluations. These experts urged practitioners to keep demonstration designs relatively simple to ensure they produce robust results and usable knowledge. Beyond design, evaluations of demonstrations should produce evidence on both the outcome and how the design contributed to that outcome. Experts also emphasized the need for multi-state evaluations and the importance of understanding both short- and longer-term effects so policymakers have a fuller sense of a policy change’s impact.

Building on these experts’ insights, AcademyHealth is engaging with leading Medicaid demonstration evaluation researchers in the Medicaid Demonstration Evaluation Learning Collaborative to improve alignment of the policy questions being evaluated, the data sources being used, and the study designs and methodologies employed across state evaluation efforts. So far, the Learning Collaborative has identified evaluation design and implementation as the key focus of its work, noting the need to be clear about what is being tested and to identify the relevant Medicaid populations the evaluation will measure and affect. Participants also discussed the importance of community outreach and input; identifying sources and scope of funding; the ability to link data across sources to provide a robust data set; and challenges in collecting primary data.

When designing evaluations, Learning Collaborative participants highlighted the following six considerations:

  1. Engage stakeholders at the community level and ensure evaluation designs capture the data necessary to assess policy impact;
  2. Understand how data is disseminated and perceived throughout the community;
  3. Use appropriate, community-relevant evaluation language and terminology when presenting findings to the community (e.g., “work reporting” versus “work requirement”);
  4. Consider the limitations of enrollee data; measuring beliefs, preferences, understanding, and the like requires survey data, which provides only a narrow picture;
  5. Ensure access and capacity to link data sets within a state and/or across states to obtain complete beneficiary outcome data;
  6. Consider funding for evaluators and its potential impact on evaluation design. It could be helpful to identify a standardized approach for determining evaluation costs over time.

As important as these factors are, Learning Collaborative participants noted that it is paramount that the evaluation be implementable. As such, it is beneficial to involve an evaluator early in the process to provide input on the proposal’s evaluation design and to serve as a consistent voice throughout the demonstration’s submission, approval, and implementation.

As researchers develop and implement their own state demonstration evaluations, the Learning Collaborative will work to address identified design challenges and questions across demonstration types, ultimately creating a synthesis of these findings for future researchers.
