Sunday, September 19, 2010

The Road to Results: Designing and Conducting Effective Development Evaluations

A very practical handbook, which also includes hands-on exercises, is the 2009 World Bank publication by Linda G. Morra Imas and Ray C. Rist, titled The Road to Results: Designing and Conducting Effective Development Evaluations. It is a 585-page book that covers a wide variety of topics on program evaluation. The nice thing is that you can read it online for FREE. The detailed table of contents is below. I have also added the link to the Documents section of this blog.

FOUNDATIONS
Chapter 1. Introducing Development Evaluation
Evaluation: What is it?
The Origins and History of the Evaluation Discipline
The Development Evaluation Context
Principles and Standards for Development Evaluation
Examples of Development Evaluations

Chapter 2. Understanding the Issues Driving Development Evaluation
Overview of Evaluation in Developed and Developing Countries 
Implications of Emerging Development Issues

PREPARING AND CONDUCTING EFFECTIVE DEVELOPMENT EVALUATIONS
Chapter 3. Building a Results-Based Monitoring and Evaluation System
Importance of Results-Based Monitoring and Evaluation
What is Results-Based Monitoring and Evaluation?
Traditional Versus Results-Based Monitoring and Evaluation
Ten Steps to Building a Results-Based Monitoring and Evaluation System

Chapter 4. Understanding the Evaluation Context and the Program Theory of Change
Front-End Analysis
Identifying the Main Client and Key Stakeholders
Understanding the Context
Tapping Existing Knowledge
Constructing, Using, and Assessing a Theory of Change

Chapter 5. Considering the Evaluation Approach
General Approaches to Evaluation

DESIGNING AND CONDUCTING
Chapter 6. Developing Evaluation Questions and Starting the Design Matrix
Sources of Questions
Types of Questions
Identifying and Selecting Questions
Developing Good Questions
Designing the Evaluation

Chapter 7. Selecting Designs for Cause-and-Effect, Descriptive, and Normative Evaluation Questions
Connecting Questions to Design
Designs for Cause-and-Effect Questions
Designs for Descriptive Questions
Designs for Normative Questions
The Need for More Rigorous Evaluation Designs

Chapter 8. Selecting and Constructing Data Collection Instruments
Data Collection Strategies
Characteristics of Good Measures
Quantitative and Qualitative Data
Tools for Collecting Data

Chapter 9. Choosing the Sampling Strategy
Introduction to Sampling
Types of Samples: Random and Nonrandom
Determining the Sample Size

Chapter 10. Planning for and Conducting Data Analysis
Data Analysis Strategy
Analyzing Qualitative Data
Analyzing Quantitative Data
Linking Qualitative Data and Quantitative Data

MEETING CHALLENGES
Chapter 11. Evaluating Complex Interventions
Big-Picture Views of Development Evaluation
Joint Evaluations
Country Program Evaluations
Sector Program Evaluations
Thematic Evaluations
Evaluation of Global and Regional Partnership Programs

LEADING
Chapter 12. Managing an Evaluation
Managing the Design Matrix
Contracting the Evaluation
Roles and Responsibilities of Different Players
Managing People, Tasks, and Budgets

Chapter 13. Presenting Results
Crafting a Communication Strategy
Writing an Evaluation Report
Displaying Information Visually
Making an Oral Presentation 

Chapter 14. Guiding the Evaluator: Evaluation Ethics, Politics, Standards, and Guiding Principles
Ethical Behavior
Politics and Evaluation
Evaluation Standards and Guiding Principles

Chapter 15. Looking to the Future
Past to Present
The Future