Nicola Theunissen

Communications consultant
World Food Programme
South Africa

My contributions

    • Dear Harriet,

      What a great discussion, and so many interesting reflections! I like Kombate’s point on visual tools’ ability to increase impact. Coming from a media background, I concur that the use of visual tools during the evaluation process enables a more compelling story to be told after the evaluation has been completed (and published) – or while it is being conducted, for that matter.

      On the point of learning, the WFP Office of Evaluation has conducted an evaluation stakeholder survey on communication products and evidence use. Some findings below:

      • On the question of how evidence can be better channelled, packaged and presented to be useful: 31% of respondents indicated shorter reports with simpler language; 28% highlighted scope, with more relevant questions and findings; 17% highlighted processes; 10% accessibility and channelling; 9% variety, with more tailored products; and 5% timeliness
      • Overall, shorter evaluation products (such as summary evaluation reports) had a much higher usefulness rating than full evaluation reports – 81% compared to 58%
      • Products like briefs and infographics had a higher usefulness rating among directors and senior management
      • Although it’s clear from the findings that there is a need for shorter text, when asked about the preferred format for receiving evaluation information, the majority of the audience still indicated reading (74%), followed by oral (65%), video (56%) and audio (39%)

      Translating evaluation information into a visual language requires niche skills that can be context-specific and culturally sensitive. I’m a big advocate of using visualization in evaluation processes and products; however, it’s important to consider that some audiences may still require text-based information (with visualization as an aid to the message), while others may prefer to receive the entire message in a visual format. Understanding audiences’ specific information needs is therefore critical to achieving the evaluation’s purpose and use.

  • The old refrain that there are not enough skilled evaluators in Africa has passed its sell-by date. Realizing the need to offer solutions, the Centres for Learning on Evaluation and Results – Anglophone Africa (CLEAR-AA), the South African Monitoring and Evaluation Association (SAMEA) and the World Food Programme (WFP) recently joined forces to develop a tailored Emerging Evaluator Programme.

    The work immersion programme was launched in June 2021 during gLOCAL Evaluation Week, bringing six emerging evaluators on board for a year. The programme is taking them on a “deep dive” into evaluation work from different perspectives. For example, with WFP’s

  • Peru’s flagship social protection programme, Qali Warma, which means ‘vigorous child’ in the indigenous Quechua language, provides nutritious food to more than 4 million children each year.

    Building on the need for better evidence, the M&E unit of Peru’s Ministry of Development and Social Inclusion led an impact evaluation of the programme to assess the extent to which it improved the cognitive processes, nutritional status, calorie intake, and school attendance of primary school students.

    UNDP commissioned Pacific University to conduct the evaluation in 2018/2019, while WFP provided technical advice, playing the role of an enabler, knowledge broker and facilitator of