This is the contemporary, applied text on evaluation that your students need.
Evaluation for Health Policy and Health Care: A Contemporary Data-Driven Approach explores best practices and applications for producing, synthesizing, visualizing, using, and disseminating health care evaluation research and reports. This graduate-level text will appeal to those interested in cutting-edge health program and health policy evaluation in this era of health care innovation. Editors Steven Sheingold and Anupa Bir focus on quantitative, qualitative, and meta-analytic approaches to analysis, providing a guide both for those conducting evaluations and for those using the evidence to make policy decisions. The book is designed to provide real-world applications within health policy to make learning more accessible and relevant, and to highlight the remaining challenges of using evidence to develop policy.
Table of Contents
List of Figures and Tables
Preface
Acknowledgments
About the Editors
PART I. SETTING UP FOR EVALUATION
Chapter 1. Introduction
Background: Challenges and Opportunities
Evaluation and Health Care Delivery System Transformation
The Global Context for Considering Evaluation Methods and Evidence-Based Decision Making
Book’s Intent
Chapter 2. Setting the Stage
Typology for Program Evaluation
Planning an Evaluation: How Are the Changes Expected to Occur?
Developing Evaluations: Some Preliminary Methodological Thoughts
Prospectively Planned and Integrated Program Evaluation
Summary
Chapter 3. Measurement and Data
Guiding Principles
Measure Types
Measures of Structure
Measures of Process
Measures of Outcomes
Selecting Appropriate Measures
Data Sources
Looking Ahead
Summary
PART II. EVALUATION METHODS
Chapter 4. Causality and Real-World Evaluation
Evaluating Program/Policy Effectiveness: The Basics of Inferring Causality
Defining Causality
Assignment Mechanisms
Three Key Treatment Effects
Statistical and Real-World Considerations for Estimating Treatment Effects
Summary
Chapter 5. Randomized Designs
Randomized Controlled Trials
Stratified Randomization
Group Randomized Trials
Randomized Designs for Health Care
Summary
Chapter 6. Quasi-experimental Methods: Propensity Score Techniques
Dealing With Selection Bias
Comparison Group Formation and Propensity Scores
Regression and Regression on the Propensity Score to Estimate Treatment Effects
Summary
Chapter 7. Quasi-experimental Methods: Regression Modeling and Analysis
Interrupted Time Series Designs
Comparative Interrupted Time Series
Difference-in-Differences Designs
Confounded Designs
Instrumental Variables to Estimate Treatment Effects
Regression Discontinuity to Estimate Treatment Effects
Fuzzy Regression Discontinuity Design
Additional Considerations: Dealing With Nonindependent Data
Summary
Chapter 8. Treatment Effect Variations Among the Treatment Group
Context: Factors Internal to the Organization
Evaluation Approaches and Data Sources to Incorporate Contextual Factors
Context: External Factors That Affect the Delivery or Potential Effectiveness of the Treatment
Individual-Level Factors That May Cause Treatment Effect to Vary
Methods for Examining Individual-Level Heterogeneity of Treatment Effects
Multilevel Factors
Importance of Incorporating Contextual Factors Into an Evaluation
Summary
Chapter 9. The Impact of Organizational Context on Heterogeneity of Outcomes: Lessons for Implementation Science
Context for the Evaluation: Some Examples From the Center for Medicare and Medicaid Innovation
Evaluation for Complex Systems Change
Frameworks for Implementation Research
Organizational Assessment Tools
Analyzing Implementation Characteristics
Summary
PART III. MAKING EVALUATION MORE RELEVANT TO POLICY
Chapter 10. Evaluation Model Case Study: The Learning System at the Center for Medicare and Medicaid Innovation
Step 1: Establish Clear Aims
Step 2: Develop an Explicit Theory of Change
Step 3: Create the Context Necessary for a Test of the Model
Step 4: Develop the Change Strategy
Step 5: Test the Changes
Step 6: Measure Progress Toward Aim
Step 7: Plan for Spread
Summary
Chapter 11. Program Monitoring: Aligning Decision Making With Evaluation
Nature of Decisions
Cases: Examples of Decisions
Evidence Thresholds for Decision Making in Rapid-Cycle Evaluation
Summary
Chapter 12. Alternative Ways of Analyzing Data in Rapid-Cycle Evaluation
Statistical Process Control Methods
Regression Analysis for Rapid-Cycle Evaluation
A Bayesian Approach to Program Evaluation
Summary
Chapter 13. Synthesizing Evaluation Findings
Meta-analysis
Meta-evaluation Development for Health Care Demonstrations
Meta-regression Analysis
Bayesian Meta-analysis
Putting It Together
Summary
Chapter 14. Decision Making Using Evaluation Results
Research, Evaluation, and Policymaking
Program/Policy Decision Making Using Evidence: A Conceptual Model
Multiple Alternatives for Decisions
A Research Evidence/Policy Analysis Example: Socioeconomic Status and the Hospital Readmission Reduction Program
Other Policy Factors Considered
Advice for Researchers and Evaluators
Chapter 15. Communicating Research and Evaluation Results to Policymakers
Suggested Strategies for Addressing Communication Issues
Other Considerations for Tailoring and Presenting Results
Closing Thoughts on Communicating Research Results
Appendix A: The Primer Measure Set
Appendix B: Quasi-experimental Methods That Correct for Selection Bias: Further Comments and Mathematical Derivations
Propensity Score Methods
An Alternative to Propensity Score Methods
Assessing Unconfoundedness
Using Propensity Scores to Estimate Treatment Effects
Unconfounded Design When Assignment Is at the Group Level
Index
About the Author
Anupa Bir, ScD, MPH, is the Senior Director of the Center for Advanced Methods Development at RTI International. A health economist by training, she has focused much of her work on the well-being of vulnerable populations and on aligning incentives within various systems, including the welfare, child welfare, corrections, and health systems, to improve well-being. Dr. Bir currently leads several contracts to evaluate complex health and social policy interventions, including innovative workforce interventions to improve access to quality health care, interventions that offer financial incentives for asset development, and interventions that improve communication and family strength during stressful circumstances like incarceration and reentry. Within health care and health policy, she leads evaluations of State Innovation Models, efforts funded by the Center for Medicare and Medicaid Innovation to accelerate the transition to value-based payment models in 11 states. She also leads meta-evaluation work to understand the lessons from state Medicaid demonstrations aimed at improving service delivery for those with substance use disorders or serious mental illness. She holds an MPH from the Yale University School of Public Health and a doctoral degree in international health economics from the Harvard T.H. Chan School of Public Health.