Results of IEG's client survey – an exercise in evaluating the evaluator.

As evaluators, we have to "walk the walk" and hold ourselves to the same standards we expect of others. In my blog When Rating Performance Start with Yourself, I wrote about the importance of being heard and knowing it, that is, the importance of feedback. One part of the evidence trail that tells us what difference we make is feedback from the users of our evaluations.

Late in 2014, we asked our clients at the World Bank Group (including the Board, management, and staff) and outside stakeholders (including academics, representatives of governments, international organizations and NGOs) to tell us what they think of our work.

Thanks to the many of you who responded! We had much greater engagement and higher response rates than in any previous survey of this kind. And here's what you had to say:

How relevant is IEG to the mission of the World Bank Group?

  • Board members, who showed a high level of appreciation in the previous survey, increased their assessment of our relevance in the top response category from 26% to 40%. Taking all ratings "above the line", we are hitting 100%.
     
  • Colleagues at the World Bank Group are not rating us as highly, but their assessment of our relevance has stayed stable at around 88%, in the range of highly to somewhat relevant. 
     
  • Clients outside the World Bank Group give a high rating (95%) for relevance. Respondents based in Africa, who showed a strong level of participation in the survey, were also the group of external clients who thought our work was most relevant.

Are we getting the balance right between accountability and learning?

  • Board members and external stakeholders are more satisfied than staff with the balance we achieve between accountability and learning in our work. Staff, particularly senior staff and those most familiar with IEG's work, believe we are overly focused on the accountability remit at the expense of learning.

How good are our evaluations?

This is an area of high importance, one in which we have already invested, and will continue to invest, a lot of effort:

  • Feedback on the quality of our work shows that on a scale of 1-6, with 6 being the highest, we have come out with an average of 4.2. Yes, well above the mid-point of 3, but not good enough. The client survey provided us with feedback on the overall quality, clarity, coherence, feasibility, and cost-effectiveness of recommendations. We are looking into these details to understand how we can do better.
  • The survey also tells us that we need to pay more attention to the timeliness of our work. Satisfaction levels among Board and WBG staff respondents have decreased on this important criterion, and it is the one on which we are rated lowest when it comes to the evaluation process.

Aiming for the highest quality has been the main reason for falling behind on timeliness, so we are looking to address both of these points together to find ways to improve quality and efficiency.

To what extent are our products being used?

  • We have some way to go in this area too. Among World Bank Group staff responding to our survey, 65% said they are using our evaluations for various purposes. While usage has gone up across all categories - whether advising clients or designing results frameworks, projects, or policies - we want that percentage to be much higher.
  • Among respondents from the Board, the percentage reaches above 90% for the use of our work in assessing country strategies, and 75% or above for reviewing projects, policies and procedures, and sector strategies, for giving inputs to the work of others, and for proposing a course of action.
  • A question that remains for us: if our authorizing environment, the Board, is such an avid user of insights from independent evaluation, how do we get others to take those lessons on board earlier and more readily in the process?

Are we making a difference?

Here the news is really positive. All categories of respondent have rated our impact and influence on the World Bank Group and the broader development community much higher than in the previous survey.

  • 94% of respondents from the Board find that IEG's work influences the World Bank Group's development effectiveness at least to some extent. For external audiences, this rate is 90%, and 75% of colleagues from the World Bank Group think so too.
     
  • We see a similar pattern, though at a slightly lower level, for IEG's influence on the broader development community: 88% of Board members, 83% of external clients, and 63% of World Bank Group staff see IEG as having at least some influence, often more than that.

How well do we communicate with you?

  • Email announcements, our website, and events and presentations provide the main access routes to our products, particularly for external stakeholders, who also appreciate our growing social media outreach efforts.

And where do we go from here?

The client survey is an important part of a range of structured feedback that we are about to receive regarding what works at and for IEG. We are currently discussing the best ways to respond to your feedback so that, next year, we'll be even better.

You can access the full set of results here.

 

Comments

Submitted by Steven Mayer on Tue, 03/10/2015 - 02:23

Nicely done!

Submitted by Vinod Nambiar on Wed, 03/11/2015 - 00:03

Very nice and inspiring to read about the involvement of the third sector of stakeholders (external).

Submitted by Kathy Steinmeier on Fri, 03/13/2015 - 06:35

Too much complacency and self-congratulation for my taste. The real decision-makers in the World Bank, i.e. the VPs and directors, will tell you that evaluations do not really influence decision-making. Pushing money out of the door is still the prime objective of operations staff.

Submitted by Ms Shashika Wi… on Sat, 03/14/2015 - 23:00

Thank You. SWijesinha Sri Lanka

Submitted by Begnadehi Clau… on Mon, 03/16/2015 - 04:07

IEG is doing a great job in terms of program evaluation. Thank you for sharing this report that came up with relevant findings. I do believe that this report should be disseminated all over the world using different channels. Best regards. Claude

Submitted by Josefine D on Mon, 03/16/2015 - 07:24

If you have a scale of 1-6, 3 is not the mid-point. The mid-point is 3.5, which means your score of 4.2 is arguably not as "well above" the mid-point as indicated.

Submitted by Caroline Heider on Mon, 03/16/2015 - 23:29

In reply to Josefine D

Point taken. Your observation reinforces my message that we need to reach higher ratings. We also recognize that client feedback for evaluation is a tricky business: not all clients are dispassionate. To corroborate feedback, we also look at other data sources, such as external meta-evaluations.

Submitted by Caroline Heider on Tue, 03/17/2015 - 05:24

Many thanks for the positive feedback from many of you. Really appreciated. And, Kathy, you are right in that evaluation is not -- and actually should not be -- the sole driver of decision-making: it is just one aspect, one source of information that needs to be taken into account when designing and implementing policies and projects.

Submitted by John O Brien on Tue, 03/24/2015 - 21:29

Sorry to strike a potentially negative note, but "Globescan"? Really? I suspect that Kathy Steinmeier and I may have a few things in common. I know how vital and ignored evaluation really is. This flag-flying exercise, no matter how colourful, is not the way to go if the objective is to prove and enhance the value and potential contribution of our field. This exercise was certainly not an "evaluation" in the sense that some might assume from the blurb. It was a highly controlled and very limited exercise, primarily designed to elicit quantitative data for activity management, PR and marketing activities. This data is certainly useful for some, but for our ultimate end-users? I think we know who the expected readers are, and that this says a lot about the real state of play in our field, if there really is one. This simplistic mechanical reliance on response-limiting tools will always excel at producing graphs based on pre-set "closed" data responses. However, such tools are incapable of adequately eliciting the qualitative, and certainly more relevant, if inconveniently presented, information available from the direct experience of practitioners. However, I am not naive enough to think that getting to the real facts, the available "learning", or even questioning, much less exposing, the waste and contradictions of the development "machines" was ever really on the agenda. More grey hairs on heads occupying eval seats, more real field experience for eval managers, and more practitioner-expert analysis of personal field and office case studies instead of graph-friendly whitewash from friendly, helpful, but ultimately super-manipulable computer/statistician types? (BTW, I am retired, so I am not touting for work for myself. I am just so sad at the way dubious eval is used for whitewashes and covering up cracks, when we could actually, sometimes, be a small but vital part of the solution.)

Submitted by Caroline Heider on Thu, 04/02/2015 - 08:38

In reply to John O Brien

John, many thanks for your comments. Having worked in evaluation for the last 25+ years, I agree with you that a client survey is not enough, and it is not an evaluation. It is just one of the many feedback mechanisms that we use. And yes, it is limited to a simplistic online survey instrument, as we reach out to the many readers of our evaluations. But, in addition (as mentioned in my blog of May 7th, 2014): IEG is being reviewed by an independent panel appointed by the World Bank’s Board and we eagerly await the results of that evaluation; and, we have also commissioned an external meta-evaluation to review the quality of a couple of our major evaluations. This latter exercise will be used to sharpen our quality assurance processes and as a model to help set up something like this on a regular basis. There are many other things we have done to improve our practice, as regular readers of this blog can tell. I hope you keep in touch and continue to share your experience and views.
