- Do you think the Guidelines respond to the challenges of evaluating quality of science and research in process and performance evaluations?
Having been involved in evaluating CGIAR program and project proposals, as well as program performance, over the past decade, I have used an evolving range of frameworks and guidelines. For the 2015 Phase I CRP evaluations, we used a modified version of the OECD-DAC framework with the criteria relevance/coherence, effectiveness, impact, and sustainability. Quality of science, which the OECD-DAC framework lacks as a criterion, was added but evaluated without designated elements or dimensions. Partnerships were evaluated as a cross-cutting issue, and the evaluation of governance and management was not directly linked to the evaluation of quality of science.
For the 2020 Phase II CRP evaluative reviews, we used the QoR4D Frame of Reference with the elements relevance, credibility, legitimacy, and effectiveness, together with three dimensions: inputs, processes, and outputs. Quality of science was firmly anchored in the elements credibility and legitimacy, and all three dimensions had well-defined indicators. During the 2020 review process, the lack of a design dimension was highlighted as important for evaluating coherence, methodological integrity and fitness, and the comparative advantage of CGIAR in addressing both global and regional problems.
The beta version of the Evaluation Guidelines encapsulates these valuable lessons learnt from a decade of evaluations and, in this respect, responds to the challenges of evaluating quality of science and research in process and performance evaluations. During their development, other evaluation frameworks and guidelines were also consulted to gain a better understanding of how both research and development activities are evaluated. As a result, the Guidelines are flexible and adaptable, and thus useful and usable by research-for-development organizations, research institutes, and development agencies.
Recently, the Evaluation Guidelines were applied retrospectively to the evaluative reviews of the 2020 Phase II CRPs, with a greater understanding of qualitative indicators across the four dimensions. Applying the Guidelines clarified the findings and enhanced the ability to synthesize important issues across the entire CRP portfolio.
- Are four dimensions clear and useful to break down during evaluative inquiry (Research Design, Inputs, Processes, and Outputs)? (see section 3.1)
The four dimensions are clear and useful, especially when accompanied by designated criteria with well-defined indicators. They are amenable to a mixed-methods evaluation approach using both quantitative and qualitative indicators. In addition, they provide the flexibility to apply the Guidelines at different stages of the research cycle: at the proposal stage, design, inputs, and planned processes would be evaluated; at mid-term and project completion, outputs would become more important.
- Would a designated quality of science (QoS) evaluation criterion capture the essence of research and development (section 3.1)?
In my own use of the quality of science criterion, with its intrinsic elements of credibility (robust research findings and sound sources of knowledge) and legitimacy (fair and ethical research processes and recognition of partners), it captures the essence of research and of research for development. Whether it captures the essence of development alone will depend on the importance of science to the development context.