In the past few years, evaluators around the world have gained greater and easier access to potential sources of evaluative evidence (evaluation reports, research papers, surveys) on a myriad of topics and contexts.
Prospective users of such information, however, have become less able to take full advantage of this development, in part because they lack the time and capacity to absorb it all. Decision-makers in particular expect researchers and evaluators to provide products that facilitate the uptake of such knowledge. Evaluation synthesis can fill this gap by building on findings from different sources to develop a better understanding of the effectiveness of a programme or policy.
A synthesis is the integration of existing knowledge and findings relevant to a topic. Its main objective is to increase the applicability of evaluation findings and to develop new knowledge through the integration process. It is promoted as an approach that addresses the challenge of "information overload", delivering products that distil relevant evidence for decision-making.
In September, I raised this topic with the Community and asked what experiences members had had in using evaluation synthesis to improve programme and project effectiveness. Several members shared examples, as well as lessons and ideas for increasing the use of evaluation synthesis, including the following:
- Synthesis work needs to be focused and pragmatic in order to attain its goal.
- Evaluation documents need to be easily available, and researchers need to interact with key stakeholders throughout the preparation process.
- Meta-analyses of evaluations allow the identification of structural themes that affect performance, which is of interest to donors and programme managers.
- A more holistic approach is required to enhance the engagement and outreach of the learning, e.g. by organizing dedicated workshops to discuss the findings of the evaluation synthesis.
- Meta-analysis can provide very valuable insights into the performance of a programme or approach, and can guide primary data collection. It can, however, be very time-consuming and thus not always feasible to undertake within the timeframe of an evaluation.
- Meta-analysis can be an important tool for decision-makers but is not well known among evaluators. More training on its usage could help broaden awareness and application of the approach.
EvalForward will be holding a webinar on evaluation synthesis in late October for those interested in learning more about this tool and its application in development evaluation. Participation will be open to members of the Community as well as to evaluators from across the UN system. The exact date and agenda of the learning event will be announced soon. Stay tuned!
Participants in the discussion: Emmanuel Kojo, Eltighani Mirghani Elamin, Nasser Samih Qadous, Chitra Achyut Deshpande, Lal Manavado, Md Moshfaqur Rahman, Malika Bounfour and Olivier Nkurunziza.