By Rachel Eager

CDI started the new year with a bang, launching our inaugural ‘Designing effective ways to evaluate impact’ short course at IDS, led by CDI director Chris Barnett and IDS visiting fellow Rob D. van den Berg. The course aimed to explore a range of approaches to understanding causality, together with their associated designs and methods, and brought together participants keen to push their thinking and knowledge of impact evaluation beyond a reliance on counterfactual logic. Three highlights stood out:

  • There is a multitude of definitions of, and approaches to, what we mean by impact in the development sector. We discussed the limitations of experimental and quasi-experimental approaches and reflected on different ways of thinking about causality. Configurational, generative and participatory approaches each establish a causal link in fundamentally different ways; each calls for different evaluation designs, tools and methods, and each has its own strengths and weaknesses. Learning about these different approaches and their associated designs provided the framework for the rest of the week. 
  • There is a wide range of methodologies we can use to evaluate impact. Itad’s Kathi Welle introduced our participants to qualitative comparative analysis. IDS' Dolf te Lintelo gave an interesting overview of process tracing, drawing on his work on understanding the impact of HANCI, and challenged the group to develop rival hypotheses to those he used! We also heard from Inka Barnett on realist synthesis, Peter O’Flynn on his recent social network analysis work and Rob van den Berg on systems approaches. It was also great to welcome Laura Camfield, from the University of East Anglia, who helped the group to consider the level of participation in their evaluations and provided some useful examples of participatory methodologies.
  • Evaluators worldwide, from varying organisations and sectors, are all grappling with the same issues in designing credible impact evaluations. As we learnt about the broad range of impact evaluation designs and methods, it became clear that to develop a credible, relevant impact evaluation we need to consider the implications of the evaluation questions, the operating context and potential utility. We then need a flexible and adaptive approach to select the most appropriate design, or indeed mix of designs.

We had an interesting week discussing impact evaluation design challenges and considering the usefulness of a broad range of design approaches and methods with a fantastic group of participants. It was a great start to our CDI training programme and we look forward to running future courses!

Here’s what some of the course participants had to say:

“It has been one of the most useful training courses I’ve been on in my professional life”, Mr Abdulrahman Alzuebi (Country Programme Officer, Dubai Cares)

“Many methods explained here were new to me and so it was very useful to me to have a big and comprehensive picture of the methods and how to apply them in what conditions”, Dr Soleiman Pakseresht (Associate Professor, Bu-Ali Sina University)

“An excellent introductory course” and “I learnt a lot from the combined experience of everyone I met”, Ms Sonali Oshin Chopra (Programme Analyst, Dubai Cares)

[the most useful aspect of the course was] “opening up the toolbox of evaluation approaches and focusing on appropriateness...rather than picking out one particular approach for alleged superiority”, Nico Herforth (Evaluator, German Institute for Development Evaluation (DEval))

Partner(s): Institute of Development Studies