I would like to share the experience of Cirad, the French agricultural research and international cooperation organization working for the sustainable development of tropical and Mediterranean regions, and the efforts it has made in recent years to strengthen an impact-oriented culture by fostering evaluative thinking. I think this work resonates with the challenge and willingness of using evaluation to better steer science and technology development and to contribute to innovation processes.
Cirad has already made a first effort to reconstruct what it calls the "building a culture of impact" process, and you may access more detailed information here: https://doi.org/10.1093/reseval/rvy033
I joined Cirad one year ago and, coming from the CGIAR, I was not surprised to see that a publicly accessible repository was in place and that a set of standardised bibliometric indicators was regularly monitored: https://indicateurs-publication.cirad.lodex.fr/
I learned that, as a public institution, Cirad undergoes a regular evaluation (every four years) coordinated by an independent public authority, the Haut Conseil de l'évaluation de la recherche et de l'enseignement supérieur (Hcéres). Three points caught my attention when I first looked at the Hcéres evaluation framework and method: https://www.hceres.fr/fr/referentiels-devaluation
The first point is that, before an external panel of high-level national and international experts performs the evaluation, Cirad itself carries out an internal self-evaluation process.
The second point is that the evaluation framework clearly identifies, as key criteria, the organisation's capacity to orient and implement its strategy based on societal challenges and demands, the centrality of partnerships, and the quality of science.
The third point is the granularity of the evaluation, which is performed both at the institutional level and for each research unit.
Alongside this externally mandated evaluation, regular evaluations are also performed to assess the main regional research and development networks (https://www.cirad.fr/en/worldwide/platforms-in-partnership).
Here you may find the latest Hcéres evaluation report (2021) in French: https://www.cirad.fr/les-actualites-du-cirad/actualites/2021/evaluation-hceres-du-cirad-une-culture-de-l-impact-et-un-positionnement-strategique-salues
The institutional mechanism I briefly described was enriched, starting about ten years ago, by the scientific and methodological efforts of a team of researchers and experts working under the umbrella of the Impact of Research in the South (ImpresS) initiative: https://impress-impact-recherche.cirad.fr/
ImpresS has defined a set of methodological principles (i.e. reflexive learning, a systemic perspective, case study analysis, contribution analysis, an actor-centred and participatory approach, outcome orientation, and a focus on capacity development and policy support processes) for developing and implementing evaluative approaches and tools. These are currently used to assess long-term innovation trajectories (ImpresS ex post: https://www.cirad.fr/nos-activites-notre-impact/notre-impact) and to design new interventions (ImpresS ex ante: https://www.cirad.fr/les-actualites-du-cirad/actualites/2021/impress-contribution-de-la-recherche-aux-impacts-societaux).

In recent years, the portfolio of research-for-development projects coordinated by Cirad has steadily grown. A mechanism has recently been put in place to foster and improve the systematic use of outcome evaluations for adaptive management and learning, and to build a consistent body of knowledge on the contribution of research to societal and environmental change. This mechanism will provide funding and methodological support to teams implementing flagship and strategic interventions.
Here you may find the methodological documents and other publications related to the ImpresS work: https://impress-impact-recherche.cirad.fr/resources/impress-publications
A diversity of perspectives and approaches appears to me to be a key factor for assessing the important dimensions of the performance and contributions of a research-for-development organisation, and for disentangling, as far as possible, the complexity of interactions and feedback loops that characterise most such interventions in development contexts. Nonetheless, even the most comprehensive evaluation framework would not be enough if the governance and management bodies at different organisational levels, and the organisation as a whole, were not willing and able to learn and adapt based on the use of evaluation findings.