REAGAN RONALD OJOK

SENIOR MONITORING AND EVALUATION OFFICER
UGANDA DEVELOPMENT BANK
Uganda

Ojok Reagan Ronald is the Senior Monitoring and Evaluation Officer at Uganda Development Bank, a national Development Finance Institution. Reagan is experienced in setting up monitoring and evaluation systems that track the performance of complex projects in social sectors, networks and coalitions, humanitarian emergencies, policy influence and advocacy. He has over 9 years of experience in economic research and analysis, in conducting and commissioning evaluations, and in applying economic principles to development programmes. He has a diverse professional background, having worked in the private and public sectors in Uganda and Kenya.

Ojok is a graduate in Development Economics and holds an MA in Economic Policy and Management from Makerere University, as well as postgraduate training in Project Monitoring and Evaluation. He is trained in Outcome Mapping & Harvesting (University of Bologna, Italy) and in Gender Equality for Development Effectiveness (International Training Centre of the ILO, Turin, Italy). He is a member of the European Evaluation Society, the African Evaluation Association (AfrEA) and the Uganda Evaluation Association (UEA).

My contributions

    • Dear Natalia,

      I would like to thank you for raising this issue. No doubt it’s of very high importance.

      The Devil’s Advocate:

      I have not seen anyone responding as a commissioner, so allow me to attempt to fit into their shoes in this scenario. I begin by making a simple assumption: that most evaluation handbooks are developed by competent consultants who are proud to call themselves international and who have very voluminous CVs.

      Secondly, let us face the reality here. Everyone criticizes institutional evaluation handbooks as, most of the time, "poorly developed" and full of gaps. Who develops those handbooks? Is it not us, the consultants? Let us own our mistakes as evaluation consultants: sometimes we end up setting traps for our future colleagues by producing work of low quality.

      Similar challenges due to differences in terminologies

      This is a very common challenge. In my opinion, a definition is not cast in stone, and I subscribe to the school of thought that is flexible enough to modify it here and there because of complexities in other sectors. My approach has always been to bring the matter to the attention of the evaluation management team. Based on what is emerging, I would suggest possible working definitions so that the depth and breadth of the evaluation are appropriate to accommodate whatever is emerging. I think that in your case you stood a good chance of providing findings that might influence a review of the handbook if it has become very limiting in its definition and approach.

      I hope my one-cent contribution gets a soft landing in the ears and hearts of fellow evaluators.

      Thanks

    • Dear Abubakar,

      I think Dr. Emile and Zahid have nicely driven the point home, and I agree with both of them. Other factors constant, the M&E effort, coupled with good project managers and a conducive environment, can have a significant impact on the project, the target community and the policy framework in a particular country.

      The contribution of a well-designed M&E system that tracks progress and proposes possible realignment (where need be) is reflected in policy change, practice change and behavioural change, depending on the level of change.

      In your problem statement (if I may call it that), I hear a bit of complexity and uncertainty about the different levels of impact (with your example of agriculture) and about tracking how they occur (in time and place).

      I would think that if you adopt Outcome Harvesting (mixed with another approach such as contribution tracing), you would be able to trace the impact of your M&E system. I chose Outcome Harvesting because it does not measure progress towards a predetermined objective; rather, it collects evidence of what has changed and then works backward to establish a plausible relationship between the change and an intervention's contribution to that change. So if the government proposes to set up an agricultural bank, this can be traced back to recommendations from some M&E effort in a particular project, district, ministry or agency.

      Secondly, if the government proposes an agricultural insurance policy, this policy change can be traced to (i) the project intervention; (ii) project M&E efforts; and (iii) the environment and what it means for agriculture, etc.

      Cheers.

    • Greetings Mr. Abubakar,

      If I hear you correctly, the challenge of measuring monitoring and evaluation work is just a symptom of a bigger problem with the national M&E system. Your ministry (the Office of the Prime Minister) has the mandate to oversee M&E functions in the country. The Office of the President also carries out some parallel M&E work, as do other government ministries (MoFPED-BMAU) charged with the responsibility of carrying out budget M&E.

      In such a setting, how do we make all these efforts complement each other? How do we measure M&E work as a government? The answers to these questions might not come at the end of my submission, but I am glad we are discussing them now. It is very important!

      While all of the above is happening, there are also pockets of small-scale, project-based M&E work in other government agencies and ministries. You rightly stated the possibility of coming up with some indicators of sorts; that is a good proposal. However, locating the national M&E system within the broader planning strategy for the country (NDP 2) and linking the system to these different efforts would help in aggregating all M&E work. This way, I think you might be able to measure the impact of M&E, perhaps by producing an annual series of reports on the impact of M&E for the Government of Uganda, "The Annual M&E Outlook".

      All the best and feel free to get in touch if need be.

      Thanks,

      Reagan Ronald Ojok
      Uganda Development Bank