Joseph Toindepi

International Development Consultant
JT Development Consulting
United Kingdom

Joseph Toindepi is a specialist in international development programmes with a BSc, a Master's degree, and a PhD in development studies. He has over 22 years of experience as an international DMEL professional, including many years leading complex evaluations as evaluation team leader for regional and national programme evaluations across Africa, managing teams, and providing thought leadership. He also has eight years of global DMEL leadership experience, developing MEL strategy and delivery systems, leading DMEL capability development, providing global team leadership, management, coaching and mentoring for country teams, and leading the design and implementation of global research frameworks and the management of partnerships with academic institutions.

My contributions

    • Dear Evaluators and Colleagues

      I have followed the conversations and contributions on this topic with much interest. I have been a mixed methods practitioner for several years, both as a consultant and as part of my day job in the international development sector.

      My experience is that applying mixed methods in evaluations is easier said than done. The main challenge is misaligned expectations and/or understanding of mixed methods between commissioners of evaluations and those assigned to conduct them. As a consultant, I regularly develop technical proposals and expressions of interest for evaluation tenders. This process requires me to review several evaluation ToRs per day, and I find that at least four in five ToRs specifically suggest or require a mixed methods approach. However, in the majority of cases, there is not sufficient time or budget allocated to meet the minimum requirements for carrying out a decent, logical mixed methods approach.

      Good evaluations require a good balance of complementary and/or supplementary quantitative and qualitative evidence. This means that, regardless of whether the evaluation commissioner or the project has specifically asked for mixed methods, it is difficult in most cases to sufficiently address the standard evaluation questions without considering both quantitative and qualitative data. As an evaluator, you therefore have only two choices: (1) operate within the time and budget limitations, which might compromise the quality of the evaluation results and affect your professional integrity, or (2) absorb the price difference and go over and above the allocated time and budget in order to deliver quality.

      Finally, I would say that the demand for, and expectation of, mixed methods in programme evaluations is here to stay, but the development sector is still a long way from recognising the need to align those expectations with the necessary enablers, particularly pricing and timeframes.

      JT

    • In my experience, the Theory of Change remains a mythical concept in the minds of many development practitioners at field level, first because it is perceived as a compliance-driven tool imposed by donors and funders, and second because it is usually developed by consultants or technocrats with little involvement of implementing staff.

      Implementing staff usually take on the role of understanding what is required to follow the ToC assumptions, and you will rarely see the ToC in operation beyond sitting in the project proposal document. Implementing teams rarely refer to the ToC, perhaps because of difficulties incorporating ToC concepts into day-to-day operations, or because it is too complex for field staff to engage with.

      I therefore see a disconnect between the intended purpose of the ToC in guiding programming and impact and the realisation of that purpose in practice. The one-size-fits-all approach to ToC presentation is, I believe, another challenge. On the one hand, we want the ToC to fit on one page, highly simplified and easy to conceptualise, almost too simplistic for real-world realities. That is what makes it easier to digest and make sense of, which is great for policy makers and high-level audiences. For implementers, however, detail matters a great deal, yet we often leave the ToC at that high-level, glossy presentation and expect implementers to work some form of magic to translate it into the logical delivery of interventions according to the conceptual assumptions, without the necessary detailed exploration and unpacking of the ToC. Donors do not request that unpacking; it is needed by implementers, so it is often left undone, and implementation of the project goes on with little to no reference to the ToC.

      Perhaps the only time the ToC question is revisited is when the project evaluation seeks to test those assumptions, on the further assumption that implementation was guided by them, which we know is not always the case.