Thursday, April 8, 2010

Responding to Evaluation Findings

John Scott Bayley, an Evaluation Specialist at the Independent Evaluation Department of the Asian Development Bank, published an article in the Evaluation Journal of Australasia offering handy hints for program managers.

Though meant to be both lighthearted and serious, one of his handy hints for program managers is that, if they feel threatened by the results of an evaluation study, they should consider responding with one of the following strategies:
  1. Attack the evaluation’s methodology; 
  2. Attack the data's interpretation and resulting conclusions;
  3. Attack the evaluation’s assumptions; 
  4. Attack the recommendations;
  5. Substitute previously unstated goals for the official program goals;
  6. Attack the evaluators personally, claiming they are biased or unfamiliar with the program;
  7. Attack the evaluation’s key issues and research questions; 
  8. Do not participate in the evaluation, but argue that the findings lack an adequate contextual background;
  9. Rally together those who are threatened by the findings; 
  10. Indicate the findings are reasonable but cannot be implemented due to a lack of resources, political opposition, staff training needs, etc.;
  11. Complain about a lack of consultation;
  12. Argue that the evaluators did not appreciate the subtleties of the program; 
  13. Simply pretend that the evaluation never occurred, ignore it; 
  14. State that the program’s environment has changed, and the findings are no longer relevant; 
  15. Stall for time until the evaluation is forgotten about; 
  16. Argue that the union and staff will not accept the recommendations; 
  17. Argue that while the program has not achieved its goals, it does achieve other important things that are too subtle to be easily measured;
  18. Say that the evaluation leaves important questions unanswered, so its significance is questionable; 
  19. Argue that the data is open to alternative interpretations and that the evaluation’s conclusions have been questioned by others;
  20. Attack the steering committee;
  21. Claim that the results contradict common-sense experience and testimonials from clients;
  22. Claim that the findings are contradicted by other research conducted by various experts in the field; 
  23. Agree with the findings, indicate that you have known about this for some time, and claim that you started making changes months ago;
  24. Argue that the findings contradict the spirit and philosophy of the department/program;
  25. Make up quotations that support your case and attribute them to knowledgeable sources; and
  26. Argue about definitions and interpretations.
Some others I have heard that are not listed above:
  1. Argue that the project did not have a sufficient budget to monitor the results and thus cannot be held responsible for not achieving them.
  2. Argue that the results the project was trying to achieve are so unique that they are not measurable.
  3. Argue that the "real" results of the project will occur years after the evaluation.

1 comment:

  1. Very perceptive. A strong list, obviously based on long experience.

    I think word of this list has already spread, and the excuses are being applied. In reviewing one project final report recently, I saw at least 18 of these excuses used in response to a single evaluation. In addition:

    The evaluators did not follow the TORs

    The funding agency should have higher standards for evaluators and monitors

    The evaluators and monitors should get out of hotels and interview people in the field (they did)