Thursday, October 27, 2011

Making Evaluations Matter - A Practical Guide for Evaluators

Recently, the Centre for Development Innovation, Wageningen University & Research Centre, Wageningen, The Netherlands, published Making Evaluations Matter: A Practical Guide for Evaluators written by Cecile Kusters with Simone van Vugt, Seerp Wigboldus, Bob Williams and Jim Woodhill.

This guide emphasizes participatory evaluation and draws heavily upon the work of Michael Quinn Patton, especially from Utilization-Focused Evaluation.

I think this is a very handy guide NOT ONLY for evaluators, but also for country directors, project managers, and project directors to read PRIOR to implementing a project, as well as toward the end of a project when planning an evaluation.


The contents are the following:

1. Core Principles for guiding evaluations that matter.
2. Suggested steps for designing and facilitating evaluations that matter.
3. Getting stakeholders to contribute successfully.
4. Turning evaluation into a learning process.
5. Thinking through the possible influences and consequences of evaluation on change processes.
6. Conclusion.

Annex A: Examples of (Learning) Purposes, Assessment Questions, Users, and Uses of an Evaluation for a Food Security Initiative.

Annex B: Contrasts between traditional evaluation and complexity-sensitive developmental evaluation.

Wednesday, October 19, 2011

Self-administered questionnaires


In survey research, especially when questions on sensitive topics are being asked, there is debate over which form of questionnaire administration is best: a) interviewer administration or b) self-administration. More often than not, questionnaires are administered by a trained interviewer; however, there are times when some feel it's best that the respondent completes the questionnaire without the assistance of an interviewer (self-administered).

Currently, I'm dealing with survey data from a youth study that used a self-administered questionnaire, and the data contain many "missing" cases, nonsensical responses, and numerous Errors of Commission and Errors of Omission. An Error of Commission occurs when the respondent answers a question they should have skipped; an Error of Omission occurs when they fail to answer a question they should have answered.
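To make this concrete, below is a minimal sketch, in Python with pandas, of how errors of commission and omission can be flagged during data cleaning. The column names and the skip rule are hypothetical illustrations, not taken from the youth study itself.

```python
# A minimal sketch (not the actual youth study data) of flagging errors of
# commission and omission in self-administered survey data.
# Column names (q1_employed, q2_employer) and the skip rule are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "q1_employed":   ["yes", "no", "no", "yes"],    # filter question
    "q2_employer":   ["ACME", "ACME", None, None],  # to be answered only if q1 == "yes"
})

skipped = df["q1_employed"].eq("no")

# Error of commission: answered q2 although q1 said to skip it.
commission = skipped & df["q2_employer"].notna()

# Error of omission: q1 routed them to q2, but q2 was left blank.
omission = ~skipped & df["q2_employer"].isna()

print(df.loc[commission, "respondent_id"].tolist())  # e.g. [2]
print(df.loc[omission, "respondent_id"].tolist())    # e.g. [4]
```

In practice, each skip instruction in the questionnaire becomes one such rule, and the flagged cases can then be reviewed, corrected, or excluded.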

A questionnaire designed for an interviewer-administered survey cannot be used for a self-administered survey! Interviewers are trained to understand the questions and to navigate through the questionnaire; a questionnaire designed for someone who has never seen it before, and who must both understand the questions and navigate through the form on their own, requires special attention to many factors. Using page 6 of the 2008 National Survey of College Graduates, conducted by the US Census Bureau, to illustrate, some critical factors to consider for a self-administered questionnaire are:


  • Language - the instructions and questions need to be written in vocabulary pitched slightly below the lowest education level expected among respondents.
  • Section Headings - every section/topic needs a heading that is short, in bold font, and in a slightly different color from the rest of the questionnaire, such as Part B - Past Employment.
  • Question Numbering - question numbers should carry the section letter/number as well as the question number, and should be in a slightly larger, bold font than the question text, such as B1.
  • Verbal navigation - instructions next to certain responses that clearly tell the respondent where to go next. In the example above, if the respondent answers "No" to question B1, there is a verbal instruction, in bold font, telling them both 1) the page and 2) the question # to go to.
  • Symbol navigation - these are generally arrows showing a respondent where to go next if they choose a certain response. Above, if a respondent answers "Yes" to question B1, the arrow directs them to question B2.
  • Adequate spacing - all too often, to save printing costs, a questionnaire is cluttered; this is generally fine for a trained interviewer but not for self-administration. A self-administered questionnaire should have adequate spacing between questions to reduce eye fatigue and confusion.
  • Coloring - if possible, use slightly different grays or colors to highlight different sections and responses, as in the example above.

Saturday, October 15, 2011

Youth Conflict & Tolerance Survey

Conflict and tolerance are relevant issues for western (Republic of) Georgia, which experienced civil war in the 1990s and the recent 2008 war with Russia. Save the Children, with funding from the EU, conducted formative research among youth in regions on both sides of the de facto administrative border between the Samegrelo and Gali regions.

In cooperation with several colleagues, we used formative research (focus group discussions, in-depth interviews, and key informant interviews) to develop a Youth Conflict and Tolerance Survey (YCTS) tool for western Georgia. The YCTS tool focuses on conflict and tolerance issues youth confront in three settings: at home, at school, and in the community. It is a 185-item survey instrument divided into 6 sections: 1) Respondent characteristics, 2) Knowledge of conflict resolution skills and attitudes, 3) Types of conflict in the home, 4) Types of conflict at school/education facility, 5) Types of conflict in the community, and 6) General attitudes. The YCTS underwent a time-stability (reliability) test, which showed that the majority of the questions in each section are reasonably stable.
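For readers unfamiliar with time-stability testing: the same respondents answer the same items on two occasions, and the two sets of answers are compared item by item. The sketch below shows one common way of doing this (percent of identical answers per item) in Python; the data and column names are invented for illustration, and this is not necessarily the exact statistic used for the YCTS.

```python
# A minimal sketch of a test-retest (time-stability) check: the same
# respondents answer the same items twice, and we compute the percentage of
# identical answers per item. Data and column names are hypothetical.
import pandas as pd

time1 = pd.DataFrame({"id": [1, 2, 3], "b1": [4, 2, 5], "b2": [1, 1, 3]})
time2 = pd.DataFrame({"id": [1, 2, 3], "b1": [4, 3, 5], "b2": [1, 1, 3]})

merged = time1.merge(time2, on="id", suffixes=("_t1", "_t2"))

for item in ["b1", "b2"]:
    agreement = (merged[f"{item}_t1"] == merged[f"{item}_t2"]).mean() * 100
    print(f"{item}: {agreement:.0f}% identical answers across the two administrations")
```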

The YCTS tool is designed to be completed either via self-administration (where the youth reads and answers the questions on their own) or via oral administration (where a youth worker or teacher reads each question aloud and the youth answers each question on their own). The YCTS tool is designed for program development (general assessment) and program evaluation (change resulting from program interventions). Change over time is tracked by administering the YCTS tool on at least two separate occasions (Time One and Time Two) -- using the same survey methodology and the same instrument with the same learners.

In addition, a YCTS handbook was developed to provide a simple, step-by-step guide to effectively administering the Youth Conflict & Tolerance Survey (YCTS) Tool. Download the YCTS Handbook and please send me your comments, suggestions, and critiques.

Monday, October 10, 2011

NGO Network Analysis Handbook

Today, there is the UN NGO Network, NGO Global Network, International NGO Network, Voluntary Action Network and many other types of sectoral networks, such as the Child Rights Network, Human Rights Network, and Environmental Protection Network.

However, more often than not, the term "network" is used as a metaphor and rarely is there much effort to actually measure and demonstrate if there is truly a network; that is, an interconnected group or system.

Since January 2011, I have worked with Lilly Saganelidze and Tamuna Dagargulia to study, measure and map the network of youth-focused NGOs in western Georgia. The outcome of this study is the NGO Network Analysis Handbook: how to measure and map linkages between NGOs.

Please feel free to download and use this handbook, and please send us an email if you have suggestions on how to improve it.
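Purely as an illustration of what measuring and mapping NGO linkages can look like in practice, here is a minimal sketch using the networkx Python library; the organizations and ties below are hypothetical, and the handbook itself describes the full method.

```python
# A minimal sketch of measuring an inter-NGO network with networkx.
# The organizations and linkages below are hypothetical examples only.
import networkx as nx

ties = [
    ("NGO A", "NGO B"),  # e.g. the two organizations exchange information
    ("NGO A", "NGO C"),
    ("NGO B", "NGO C"),
    ("NGO D", "NGO A"),
]

g = nx.Graph()
g.add_edges_from(ties)

# Density: share of possible linkages that actually exist (1.0 = fully linked).
print("density:", nx.density(g))

# Degree centrality: which NGOs have the most direct linkages.
print("degree centrality:", nx.degree_centrality(g))

# Is it really one interconnected system, or several disconnected clusters?
print("connected components:", nx.number_connected_components(g))
```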

Wednesday, June 29, 2011

GPS and Google Earth/Maps

Two new handbooks are available for download, one in English and one in Russian, titled:

Community Based Disaster Risk Reduction Mapping Using Google Earth & GPS: A Handbook

This handbook, produced by Save the Children, aids humanitarian organizations, local governments and central governments in using mapping methodologies with Google Earth, a virtual globe, map and geographic information program, for disaster risk reduction. Using affordable GPS (global positioning system) units and the free Google Earth/Maps, local communities can produce highly accurate maps that can be easily understood by wider communities and that provide a useful picture for targeting disaster preparedness, prevention, and mitigation resources.
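As a small illustration of one step in this workflow, the sketch below writes a handful of GPS waypoints into a simple KML file that Google Earth can open; the coordinates and labels are hypothetical placeholders, and the handbook covers the full field methodology.

```python
# A minimal sketch: turn GPS waypoints (label, latitude, longitude) into a
# simple KML file that Google Earth/Maps can display. The points below are
# hypothetical placeholders, not real survey data.
waypoints = [
    ("Flood-prone bridge", 41.6500, 41.6400),
    ("Community shelter",  41.6532, 41.6377),
]

placemarks = "".join(
    f"""  <Placemark>
    <name>{name}</name>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>
"""
    for name, lat, lon in waypoints
)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
<Document>
{placemarks}</Document>
</kml>
"""

with open("risk_map.kml", "w", encoding="utf-8") as f:
    f.write(kml)  # open this file in Google Earth to see the points
```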

Not only can GPS units and Google Earth/Maps facilitate community disaster risk reduction, but in Jordan these are being used to map child labor locations as well as child labor services.

These handbooks are available in the DME Documents section to the left.

Wednesday, June 15, 2011

Catalog of Survey Questions

How often have you had to develop a questionnaire for a needs assessment, baseline or end-line survey, and not only for one project but for several that address different issues? At times like these it would be nice to have a catalog of questions to review, to see which ones may be appropriate and relevant as they are, or which could be appropriate and relevant with some adaptation.

The International Household Survey Network (IHSN) has now developed an easily accessible online Catalog of Survey Questions. It currently holds about 2,500 questions that have been used in about 1,300 different surveys. You can filter surveys and questions by country, year (1950-2011) and type. There are four types of surveys: 1) demographic and health survey, 2) income and expenditure, 3) multiple indicator cluster survey, and 4) population and housing census.

I think this is a good start; however, several of the links do not work. For example, since I'm involved in a child labor project in Amman, Jordan, I thought I would download the Child Labor Study questionnaire used in West Bank/Gaza, but the link does not work. I have sent an email to IHSN and I'll let you know if they respond. Nonetheless, many of the links to the questionnaires do work.

Sunday, June 12, 2011

Online Data and Statistics

DataMarket is a website that provides easy access to 13,000 data sets, 100 million time series, and 600 million facts! DataMarket is a type of "one stop shopping" for obtaining data on countries ranging from Afghanistan to Zimbabwe and on issues ranging from agriculture to youth.

Since I work in the Middle East, and with the recent "Arab Spring," I decided to graph internet users in Egypt, Jordan, Lebanon, Morocco, Tunisia, Palestine and Yemen. The graph below was produced in DataMarket, which provides various formats for exporting the data and graphs.

As the graph illustrates, since 2001 there has been a dramatic increase in the number of internet users in Egypt, followed by Morocco and Tunisia; much lower increases in Lebanon and Jordan; and no real increase in Libya or Yemen.
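If you export one of these series from DataMarket as a CSV, a few lines of Python can reproduce a chart like the one above; the file name and column layout below are assumptions about what the export looks like, so adjust them to the file you actually download.

```python
# A minimal sketch for charting a time series exported from DataMarket as CSV.
# The file name and column names are assumptions; adjust to your actual export.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("internet_users.csv")  # e.g. columns: Year, Egypt, Jordan, ...

df.plot(x="Year")                       # one line per country column
plt.ylabel("Internet users")
plt.title("Internet users by country")
plt.savefig("internet_users.png")
```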




So, give DataMarket a try and see if it can help you quickly get the data you need and illustrate it.

Thursday, June 9, 2011

Monitoring & Evaluating a Project Related Website

Increasingly, websites are being developed as one part of a project's or program's outreach to its beneficiaries, especially in urban-based youth projects. These project websites provide information about the project and allow beneficiaries such as youth to download information, for example on reproductive health, as well as post comments and questions. However, I rarely see projects monitoring these websites and regularly reporting any website metrics, which means that these websites are not evaluated at the end of the project.

Google Analytics has made monitoring and evaluating a project website very easy. It provides very detailed information to help you monitor who is viewing the project website, how much they are using it, which pages are viewed most, how long visitors stay, and where they are located.

All that is required is a Google (Gmail) account, and Google Analytics is free. The process involves copying and pasting a script provided by Google, which contains a "tracking code," into the code of your website. That's it. Usually within 24 hours, you can start viewing a Google Analytics report about your website.

To illustrate how to monitor and evaluate a project website, I will report my own Design, Monitoring & Evaluation website usage.

I began writing this blog regularly in March 2010. Google Analytics lets me put in any dates I want to analyze, so I will report usage since then. So, since March 2010:

Visitors:
  • there have been 13,538 visits to this blog
  • there have been 10,740 unique visitors
  • there have been 21,954 page views
  • on average, visitors view 1.6 pages per visit
  • on average, visitors stay for 2.42 minutes
Location:
  • 22.6% of visitors are from the US
  • the majority of US visitors are from the District of Columbia, California and New York state
  • the majority of visitors from California are from Los Angeles and San Francisco
  • the majority of visitors from New York state are from New York City and Brooklyn
  • the next largest percentage of visitors are from the UK, of which most are from London and Manchester
  • since I live in Georgia (Republic of), I know that 98% of the visitors from Georgia are from Tbilisi (the capital) and the remaining 2% are from Batumi (on the Black Sea)
  • the 8 countries I cover in my work, and which I started this website to assist, together represent only about 1% of all visitors
Content:
  • other than the opening page, the most viewed pages are Using Excel to Create a Gantt Chart, Essential Program Evaluation, and Quotes Related to Evaluation
Traffic:
  • the vast majority (71%) of visitors find this blog using a Search Engine (60% of these visitors use Google), 15% arrive via a Referring Site, and 14% come directly (they have bookmarked this blog).
Mobile Devices:
  • the two mobile devices most used to read this blog are the iPhone and iPad.

In summary, these data help me monitor my website (I know who is visiting, how often, what the favorite pages are, and where visitors are from).

Also, I can evaluate this blog as not accomplishing its intended goal, which was to serve Save the Children offices in the 8 countries I cover, since very few visitors are from these countries or from the locations where Save the Children offices are located. However, I can assess that this blog is serving the general interest of a small group of people around the world.

So, if your project has a website, or is considering developing one, consider using Google Analytics to monitor and evaluate the website.

Sunday, June 5, 2011

Training Videos on Program Evaluation

For those interested in random-assignment program evaluations, the Abdul Latif Jameel Poverty Action Lab (located at the Massachusetts Institute of Technology) has put videos of its training sessions on evaluating social programs online.

The videos are titled:
  1. What is Evaluation
  2. Why Randomize
  3. How to Randomize 1
  4. How to Randomize 2
  5. Measurement and Outcome
  6. Sample Size and Power Calculations
  7. Managing Threats to Evaluation and Data Analysis
  8. Analyzing Data
These videos are located here.

Friday, June 3, 2011

My M&E

My M&E is an interactive WEB 2.0 platform to share knowledge on country-led M&E systems worldwide. In addition to being a learning resource, My M&E facilitates the strengthening of a global community, while identifying good practices and lessons learned about program monitoring and evaluation in general, and on country-led M&E systems in particular.


http://www.mymande.org/?q=wikimehome


While My M&E was founded by IOCE, UNICEF and DevInfo, it is managed by a consortium of partner organizations including IDEAS, IPDET, WHO/PAHO, UNIFEM, ReLAC, Preval, Agencia brasileira de Avaliacao, SLEvA and IPEN. If your organization wishes to join the consortium as a partner, please send an email to Marco Segone, UNICEF Evaluation Office, at msegone@unicef.org.

My M&E is a collaborative website whose content can be modified continuously by users. To develop and strengthen a global community on country-led M&E systems, registered users have the facility to complete their own social profile and exchange experiences and knowledge through blogs, discussion forums, documents, webinars and videos.


This is a great resource! Enjoy.

Systemic Approaches in Evaluation

On January 25 and 26, 2011, the Evaluation and Audit Division of the Federal Ministry of Economic Cooperation and Development (BMZ) and the Evaluation Unit of GIZ offered a forum to discuss systemic approaches to evaluation at an international conference.

More than 200 participants from academia, consulting firms and NGOs discussed, among other topics, the following questions:
  • What are systemic approaches in evaluation?
  • For which kind of evaluations are systemic approaches (not) useful? Can they be used to enhance accountability, for example?
  • Are rigorous impact studies and systemic evaluations antipodes or can we combine elements of both approaches?
  • Which concrete methods and tools can be used in systemic evaluation?

You can find videos and download presentations from this website.

Friday, February 11, 2011

Online Training in Monitoring and Evaluation

The Global Health eLearning Center provides online training in various topics, some of which include monitoring and evaluation (M&E). Some of the M&E courses include:

  • Data Quality
  • Data Use for Program Managers
  • Economic Evaluation
  • M&E Frameworks for HIV/AIDS Programs
  • M&E Fundamentals
In order to take these online courses you must register, but registration is free. Also, certificates can be earned for these courses.

So, go to Global Health eLearning Center and take an online M&E course!