Regular readers will recognize this piece as part of a series of blogs that discuss the challenges evaluation must meet, and the changes it must make, in the near future if it is to avoid becoming redundant. For those joining the series now, please look back at our first two Rethinking Evaluation posts - Have we had enough of R/E/E/I/S? and Is Relevance Still Relevant? - and join the debate by commenting below. We are looking for your ideas, feedback, and debate.

Development practitioners have, for some time, argued that they are held accountable to objectives set several years earlier, in a context that may have changed dramatically since. We evaluators typically offer at least two arguments in return. The problem may arise from poorly defined objectives at the outset that did not allow the flexibility to adjust tactics while continuing to pursue a higher (and still valid) objective. Or, in the absence of redefined objectives, it is not clear when, or what kind of, course-corrections were actually introduced that would provide a new basis for evaluation. Rigid bureaucratic systems often create disincentives to revising objectives, or misunderstandings persist about how changes to objectives are reflected in evaluations.

But, even if we resolved these problems, the pantheon of evaluation criteria – relevance, effectiveness, efficiency, impact, and sustainability – does not address the question of whether timely and responsive course-corrections were made when needed. In today’s world – with a “new normal” of rapidly changing contexts, be it due to political economy, instability and involuntary migration, or climate change – this might seem surprising. But, 15 years ago development contexts seemed more stable, and the pace at which they changed was (or appeared to be) much slower than today. Hence, the leaders in evaluation did not think, at the time, about the need for assessing agility and responsiveness.

This gap has been a larger issue in the humanitarian world. Rapidly evolving emergencies demand timely responses and challenge responders to remain agile as conditions change. In these situations, stakeholders – from managers who must make quick decisions to donors who need to prioritize scarce resources – would benefit greatly from evaluative evidence that answers questions about the timeliness and appropriateness of course-corrections.

This demand, however, is poorly recognized and therefore rarely satisfied. Evaluators could address it by adapting the questions and tools of the craft. Questions that could enter the evaluator’s repertoire include:

  1. Was the need for change anticipated at project design? Clearly, this is not the case for sudden-onset disasters like earthquakes. But in other cases, an evaluation should be able to determine whether the potential need for future changes was recognized and built into adaptive management and corresponding monitoring systems.
  2. What drove the adaptation process? Here, an evaluation should seek to understand whether development partners proactively monitored relevant indicators and situational information and how that information was used in deciding on course-corrections.
  3. Was adaptation timely? Establishing timelines of events and tracing when course-corrections were undertaken will be essential to determining whether solutions were sought proactively or were forced by circumstances.
  4. And what would have happened if…? This is the classic question of establishing a counterfactual, but in this case one that needs to determine whether outcomes were better or worse because course-corrections were made, or because they were not.

These are tough challenges to grapple with in evaluation, particularly because many of the details, processes, and conversations that lead to course-corrections are not documented.

Nonetheless, as agility and responsiveness are important determinants of success or failure, evaluation needs to adopt a specific focus on them to provide feedback, be it by giving credit for responsiveness and agility when it is due or by identifying opportunities to improve when needed. This alone, I believe, will incentivize debates and actions within institutions to anticipate the need for timely and responsive adaptation.

Will that be enough to overcome inertia where it exists? Maybe not, but it is a contribution that evaluation can make.

Comments

Submitted by Max Merit on Tue, 02/07/2017 - 19:11

Incorporating the new criterion into evaluations will add complexity to the art and science of evaluation. For example, while part of the team may see responsiveness and agility as more of an imperative to achieving success than not, key stakeholders may disagree on the what, how, and when. Also, it would seem natural that a leader be empowered to change course on policies and programs in a timely manner, though there are different incentives and numerous approaches to leadership that will also need to be considered. This implies a need for additional training for evaluators, as well as new requirements for self-evaluation to capture some of the context and thinking while still in the moment.

Entirely right, it would take new skills and tools to do this, and to do it in ways that identify where and when systems limit the capacity of leaders to make necessary course-corrections, as well as to understand when wrong choices were made so as to avoid repeating the same mistakes.

Submitted by Anna Guerraggio on Wed, 02/08/2017 - 09:25

Dear Caroline, agility and responsiveness are indeed very important, especially in humanitarian and peacekeeping contexts. I find the argument that development practitioners often advance about ‘objectives no longer valid’ a bit prosaic, as it very much depends on the level at which objectives are defined. Expected accomplishments and longer-term goals might remain the same even when the environment changes, while immediate outcomes could vary significantly. That being said, I think it is very important, as you put it very well, that evaluators maintain an equal level of flexibility in assessing the relevance, efficiency, and effectiveness of programs, and acknowledge managers' capacity to adapt to change when it occurred. Equally important, I think there is room for evaluators to further assess any resistance or 'immunity to change', not only at the systemic and institutional level but also at the team and individual level. This is where evaluators could join efforts with organizational psychologists to better understand internal dynamics and inform more sustainable patterns of change.

Submitted by Ting Yang on Wed, 02/08/2017 - 10:56

Thanks Caroline for raising this very important but challenging aspect. It resonates with the previous topic about relevance (in the evaluation criteria chain): being responsive and agile in order to remain relevant. This needs an enabling evaluative culture and environment, at both the institutional and individual level, as well as methodological diversity and flexibility. However, the current donor environment shows an apparent preference for, and requirement of, rigid tools such as logframes. A balance is needed on a manageable degree of responsiveness and agility, which is perhaps the most challenging part and requires some underlying transformation. Challenging as it is, we still need to face it, discuss it, and improve our responses.

Thank you Ting for your comments. I agree that the discussion is needed beyond the evaluation community, but am focused on the professional group where I have the greatest responsibility and maybe some influence. Wouldn't it be great if the discussion among evaluators influenced practitioners, including donors, to adopt new ways of working?

Thanks Caroline for your prompt reply. Yes, true. People tend to stay in their familiar comfort zone, so it is great to have such thought-provoking discussions initiated and led by IEG, which certainly benefits not only the evaluation community but also a wider range of development practitioners.

Submitted by Zubair Faisal Abbasi on Wed, 02/08/2017 - 13:22

I have recently worked on a number of assignments evaluating humanitarian programmes. After reading this blog post, I feel that nowhere are the indicators of agility and responsiveness more relevant than in humanitarian programmes. Although we try to cover these issues under the rubric of relevance, the programmes are actually either responsive to the needs or they waste resources. The same holds for development interventions. Responsiveness could possibly be measured by exploring "the most significant changes" and by unpacking the logic model. Thanks for the blog and the ideas. I shall try to experiment with it in my next assignment.

Submitted by Ehtisham ul Ha… on Tue, 02/14/2017 - 10:41

Dear Caroline, many thanks for sharing another perspective on the evidence-building agenda. I will test including these additional questions in upcoming evaluation studies. They are very useful for learning about programme management and how often programme teams used evidence to inform decision-making. Along similar lines, we have developed process indicators to ensure the utilisation of evaluation studies; we will gather more evidence on the utility of this system. Let's see how these things work in future.
I must say you have made a great contribution by bringing a fresh and interesting perspective to the development sector. I really find your blogs excellent.

Thanks for your support and contribution.
