Program Evaluation and Performance Measurement offers a conceptual and practical introduction to program evaluation and performance measurement for public and non-profit organizations. The authors cover the performance management cycle in organizations: strategic planning and resource allocation; program and policy design; implementation and management; and the assessment and reporting of results.
The Third Edition has been revised to highlight the current economic, political, and socio-demographic context within which evaluators are expected to work, and includes dynamic public policy exemplars such as the evaluation of body-worn police cameras.
"Finally, a text that successfully brings together quantitative and qualitative methods for program evaluation."
–Kerry Freedman, Northern Illinois University
Table of Contents
Chapter 1: Key Concepts and Issues in Program Evaluation and Performance Management
Introduction
Policies and Programs
Key Concepts in Program Evaluation
Example: Evaluating a Police Body-Worn Camera Program in Rialto, California
Ten Key Evaluation Questions
The Steps in Conducting a Program Evaluation
Summary
Discussion Questions
References
Chapter 2: Understanding and Applying Program Logic Models
A Basic Logic Modeling Approach
Working with Uncertainty
Program Objectives and Program Alignment with Government Goals
Program Theories and Program Logics
Logic Models that Categorize and Specify Intended Causal Linkages
Constructing a Logic Model for Program Evaluation
Logic Models for Performance Measurement
Strengths and Limitations of Logic Models
Summary
Discussion Questions
Appendices
References
Chapter 3: Research Designs for Program Evaluations
Introduction
Our Stance
What is Research Design?
Why Pay Attention to Experimental Design?
Using Experimental Designs to Evaluate Programs
Defining and Working with the Four Basic Kinds of Threats to Validity
Quasi-Experimental Designs: Navigating Threats to Internal Validity
Non-Experimental Designs
Testing the Causal Linkages in Program Logic Models
Research Designs and Performance Measurement
Summary
Discussion Questions
Appendices
References
Chapter 4: Measurement for Program Evaluation and Performance Monitoring
Introduction
Introducing Reliability and Validity of Measures
Units of Analysis and Levels of Measurement
Sources of Data in Program Evaluations and Performance Measurement Systems
Using Surveys to Estimate the Incremental Effects of Programs
Survey Designs Are Not Research Designs
Validity of Measures, and the Validity of Causes and Effects
Summary
Discussion Questions
References
Chapter 5: Applying Qualitative Evaluation Methods
Introduction
Comparing and Contrasting Different Approaches to Qualitative Evaluations
Qualitative Evaluation Designs: Some Basics
Designing and Conducting Qualitative Program Evaluations
Assessing the Credibility and Generalizability of Qualitative Findings
Connecting the Qualitative Evaluation Methods to Performance Measurement
The Power of Case Studies
Summary
Discussion Questions
References
Chapter 6: Needs Assessments for Program Development and Adjustment
Introduction
Steps in Conducting Needs Assessments
Needs Assessment Example: Community Health Needs Assessment in New Brunswick
Summary
Discussion Questions
Appendices
References
Chapter 7: Concepts and Issues in Economic Evaluation
Introduction
Three Types of Economic Evaluation
Economic Evaluation in the Performance Management Cycle
Historical Developments in Economic Evaluation
Cost-Benefit Analysis
Cost-Effectiveness Analysis
Cost-Utility Analysis
Cost-Benefit Analysis Example: The High/Scope Perry Preschool Program
Strengths and Limitations of Economic Evaluation
Summary
Discussion Questions
References
Chapter 8: Performance Measurement as an Approach to Evaluation
Introduction
The Current Imperative to Measure Performance
Performance Measurement for Accountability and Performance Improvement
Growth and Evolution of Performance Measurement
Metaphors that Support and Sustain Performance Measurement
Comparing Program Evaluation and Performance Measurement Systems
Summary
Discussion Questions
References
Chapter 9: Design and Implementation of Performance Measurement Systems
Introduction
The Technical/Rational View and the Political/Cultural View
Key Steps in Designing and Implementing a Performance Measurement System
Performance Measurement for Public Accountability
Summary
Discussion Questions
Appendix A: Organizational Logic Models
References
Chapter 10: Using Performance Measurement for Accountability and Performance Improvement
Introduction
Using Performance Measures
Rebalancing Accountability-Focused Performance Measurement Systems to Increase Performance Improvement Uses
When Performance Measurement Systems De-Emphasize Outputs and Outcomes: Performance Management Under Conditions of Chronic Fiscal Restraint
Summary
Discussion Questions
References
Chapter 11: Program Evaluation and Program Management: Joining Theory and Practice
Introduction
Internal Evaluation: Views from the Field
Building an Evaluative Culture in Organizations: An Expanded Role for Evaluators
Striving for Objectivity in Program Evaluations
Criteria for High-Quality Evaluations
Summary
Discussion Questions
References
Chapter 12: The Nature and Practice of Professional Judgement in Program Evaluation
Introduction
The Nature of the Evaluation Enterprise
Ethical Foundations of Evaluation Practice
Ethical Guidelines for Evaluation Practice
Understanding Professional Judgement
Improving Professional Judgement in Evaluation
The Prospects for an Evaluation Profession
Summary
Discussion Questions
Appendix
References
About the Author
Irene Huse holds a Master of Public Administration from the University of Victoria and is a PhD student in the School of Public Administration at the University of Victoria. She has worked as an evaluator and a researcher in universities and the private sector. Her research has appeared in the American Journal of Evaluation, the Canadian Journal of Program Evaluation, and New Directions for Evaluation.