Ibtissem Jouini

Regional Evaluations Specialist
FAO
Egypt

My contributions

  • What type of evaluator are you?

    Discussion
  • A common dilemma is striking a balance between team members who understand the technical subject matter and those with expertise in evaluation methodologies and processes. In most evaluations, resources are too scarce to hire large teams that cover every field, and finding people who are experts in at least one discipline, have the right language skills and are skilled evaluators is challenging.

    Here are some principles for creating a good working relationship between experts and evaluators in development evaluations, based on my own experience as an evaluator and on informal conversations with colleagues, notably at the last European Evaluation

    • Bringing together evidence from evaluations that covered, in part or exclusively, the Quality of Science (QoS) can be a powerful way to inform changes to the development agenda. Evidence synthesis is a trustworthy method, but it is certainly not simple to implement given the variety of evidence available and the differing evaluation criteria, approaches, focus areas and contexts.

      The QoS theme was one of the major topics covered by the analysis under the framework of the Synthesis of Learning from a Decade of CGIAR Research Programs (2021).[i] One challenge the synthesis team faced was finding the analytical framework that best reflected the variety of evidence from the two phases of CRP implementation (2011-2016 and 2017-2019) while still meeting the synthesis objectives.

      In practice, this meant deciding how information should be categorized so that it could serve as a reference indicating the focus, scales, concepts, and related terms and definitions, based on the original objectives of the synthesis and on a mapping of the analyses forming the core basis of the 43-document corpus. The QoS-related levels of inquiry were converted into two main questions and four subthemes.

      The two main questions for the Quality of Science (QoS) and Quality of Research for Development (QoR4D) theme are:

      1. How has QoS evolved between the two CGIAR Research Programs (CRP) phases along three dimensions: inputs, outputs, and processes?

      2. To what extent has QoS evolved along two of the four QoR4D elements: legitimacy and credibility?

      The results were structured around four subthemes: (1) QoS: research inputs; (2) QoS: quality of research outputs; (3) QoS: research management/process; and (4) QoR4D elements: legitimacy and credibility. Across these topics, a set of cross-cutting themes was covered: gender, climate change/environment, capacity building, external partnerships and youth.

      Four key issues were addressed in the analysis of findings: (1) patterns and trends between the two phases of CRPs related to the quality of science (QoS) and research for development, achievement of sustainable development outcomes, and management and governance; (2) systemwide issues affecting CRP achievements; (3) recommendations for the future orientation of CGIAR research and innovation; and (4) key evidence gaps and needs for future evaluations.

      A narrative synthesis approach was used to summarize and analyze learning from a decade of CGIAR Research Programs, drawing on secondary source data from 47 existing evaluations and reviews. External evaluations were systematically coded and analyzed using a standardized analytical framework. A bibliometric trend analysis was carried out for the QoS theme, and findings were triangulated against earlier syntheses and validated by members of the Independent Science for Development Council (ISDC), CRP leaders and expert peer reviewers. A minimal sketch of how such coded findings can be organized and tallied is given below.
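
      To make the coding step more concrete, the following is a minimal, hypothetical Python sketch of how findings coded against the standardized framework (phase, subtheme, optional cross-cutting theme) could be recorded and tallied to surface patterns between the two CRP phases. The record structure, field names and sample entries are assumptions for illustration only, not the synthesis team's actual coding tool.

      from collections import Counter
      from dataclasses import dataclass
      from typing import Optional

      # Subthemes of the QoS/QoR4D analytical framework described above.
      SUBTHEMES = [
          "QoS: research inputs",
          "QoS: quality of research outputs",
          "QoS: research management/process",
          "QoR4D: legitimacy and credibility",
      ]

      @dataclass
      class CodedFinding:
          """One finding extracted from an evaluation and coded against the framework."""
          document: str                        # source evaluation or review in the corpus
          phase: str                           # "2011-2016" or "2017-2019"
          subtheme: str                        # one of SUBTHEMES
          cross_cutting: Optional[str] = None  # e.g. "gender", "climate change/environment"

      # Hypothetical coded findings, purely for illustration.
      findings = [
          CodedFinding("CRP evaluation A", "2011-2016", SUBTHEMES[0], "gender"),
          CodedFinding("CRP evaluation A", "2017-2019", SUBTHEMES[1]),
          CodedFinding("CRP review B", "2017-2019", SUBTHEMES[3], "youth"),
      ]

      # Tally coded findings per (phase, subtheme) to compare the two CRP phases.
      tally = Counter((f.phase, f.subtheme) for f in findings)
      for (phase, subtheme), count in sorted(tally.items()):
          print(f"{phase} | {subtheme}: {count}")

      In a real synthesis the corpus would of course be far larger, and the tallies would feed into the qualitative narrative rather than replace it; the point of the sketch is only to show how a standardized coding frame makes cross-phase comparison straightforward.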

       

      [i] Report: CAS Secretariat (CGIAR Advisory Services Shared Secretariat). (2021). Synthesis of Learning from a Decade of CGIAR Research Programs. Rome: CAS Secretariat Evaluation Function. https://cas.cgiar.org/evaluation/publications/2021-Synthesis