We interviewed Julia Betts, independent evaluation consultant, who is currently supporting WFP evaluation activities related to the response to COVID-19.
She shares her views on what to consider when designing an evaluation, the challenges of producing real-time information and innovative ways of feeding back findings to affected people.
Evaluations of COVID-19 responses are now happening in many countries and organisations. What factors need to be taken into account when embarking on a COVID-19 evaluation?
There are so many things to consider. Firstly, agencies are already overburdened just trying to implement their response, so you have to think carefully about how to reduce demands on staff. Secondly, the design needs to consider how you can contribute to learning as you go – a lot of large evaluations take several months or a year to conduct, which is too late when organizational responses are moving quickly. It helps to build in useful evidence outputs as the study proceeds. Thirdly, you need to identify what can be done remotely and what really needs field study, if field study is even possible. And finally, be very conscious of the speed of the response – in effect, you are evaluating a moving picture. Methodologically, you need to build in a way to capture this dynamism.
The challenges to independent and credible evaluations arising from COVID-19 are well documented, but are there also opportunities for change? How do you see the practice of evaluation evolving?
The field of evaluation is evolving very quickly. There’s a big emphasis at the moment, linked to UN reform, on coordinated evaluative work – for example, there is an international coalition on evaluation, housed by the OECD DAC. There is also now a big demand for ‘real-time’ information to help with accountability and learning, and a great deal of work is being done on how to conduct evaluations remotely, using remote monitoring data or surveys.
I hope we can retain some of the good practices that have come out of this enforced situation – such as the coordinated work and the learning on remote technologies – and build them into practice going forward.
In terms of “closing the learning loop”, what do you consider to be the best ways to involve affected populations and feed findings back to them?
Well, there are lots of different ways of doing this – videos, discussion groups, and so on. It can be challenging to make policy evaluation findings, for example, directly relevant to a focus group of beneficiaries you interviewed back in Homs, when their assistance was delivered through a co-operating partner. Often they don’t directly know the international agency being evaluated.
I think it’s about considering which groups would best benefit from feedback on which evaluation findings, and then developing tailored products. Perhaps local co-operating partners would like to hear about the findings on partnership, while beneficiaries might want to know how timely the assistance was, and whether it met their needs. The key is adapting to the needs of the audience.
What is the biggest challenge you have faced as an evaluator and how did you manage it?
The one that comes to mind is the Paris Declaration Evaluation Phase 2 back in 2011. There were so many challenges – it was the largest aid evaluation ever conducted, with over 50 studies in more than 20 countries, 11 donor reviews, special studies and so on. The International Reference Group alone had 52 members.
I was new to international synthesis then, and it was quite a journey. I was very lucky in that the evaluation had an extraordinary team leader and evaluation manager, and this helped a lot throughout the process.
What do you consider to be the most practical skills an evaluator should possess in carrying out their work?
Honestly, technical skills aside, I think it’s the ability to get on with people. You have to remember that the response you are assessing is ultimately people’s work, which is often done under very difficult conditions. Treating both it and them respectfully does make for a much better, and ultimately more useful, evaluation process, in my experience.
Having been involved in numerous evaluations across the globe, what aspects of your work do you find most rewarding? What drives you on?
I suppose for me it’s the same commitment as for most humanitarian staff. At the end of the day, whether evaluators or agency personnel, we’re all trying to do the same thing – to make a contribution, to make what difference we can, in our own way, for those who need humanitarian support.
I think you know when you’ve found your niche. Each study is so different, and each has its own fascination. I feel I have the most interesting and rewarding professional life I could ever have imagined.