As we all know, Columbus crossed the Atlantic expecting to reach the East Indies, but America got in the way. So, what would happen if a World Bank team were to travel back in time to evaluate his voyage? How would they rate the results, and what lessons would they learn?

Applying our conventional approach to project evaluation: the intended result was reaching the East Indies; the lesson learned was that when you venture into the unknown you should prepare for the unexpected; and one of the (many) unexpected results of Columbus’s venture was the colonization of America by Europe.

If you apply the World Bank’s objectives-based approach to evaluation, Columbus failed because he didn’t meet the target he had set himself. From a European perspective, however, the outcome of the voyage could be viewed as highly satisfactory; it made Europe richer and may have jump-started its industrial revolution.

Does this mean we should abandon objectives-based evaluation? No, but it does mean that the Bank should be vigilant about, and constantly monitor, both intended and unintended benefits and costs, and be willing to change its approach – to learn – in the light of experience. A static approach in a dynamic world would more than likely fail.

In April, IEG’s Director General, Caroline Heider, wrote in one of her regular blog posts that what we all want is “a simple measure of success in a complex world.” Drawing on the recent work of Ben Ramalingam, author of Aid on the Edge of Chaos, the post suggested that the push to simplify may lead to misuse of tools like the logical framework, ignoring rather than recognizing complexity.

IEG’s program of evaluations on Learning and Results in World Bank Operations faces exactly the challenge that Caroline described: there is no linear relationship between learning and results. Learning is always a good thing – because it helps us adapt to complexity – but it doesn’t always lead to results, or at least not the results we expected.

It is equally important to recognize that the perception of benefits and costs can vary across people and groups: we shouldn’t expect the Inca who has just contracted measles to rate the outcome of Pizarro’s venture the same way as the Spanish conquistador who has infected him. One of the duties of the evaluator is to listen to all the parties to a given outcome and accurately report the differences in their perspectives.

This is how IEG is approaching its second evaluation of how the Bank Group learns through investment lending.

First, we are examining the results chain of a series of linked operations (for example, a series of social protection projects in different countries linked by a common task team leader). We identify parties to the intervention who are knowledgeable about the Bank-supported projects—Bank staff, government officials, donors, nongovernmental organizations, and others—and we present them with evidence of results produced by the monitoring and evaluation (M&E) systems of the project series. We ask them whether the evidence is credible and points to the achievement of the expected results. Then we ask them whether there were other results that were not anticipated.

Second, we present the parties to the intervention with a hypothesized “from-learning-to-results chain.” We identify a particular event that may have led to learning (for example, a design change following peer reviewer comments on a project concept document) and we trace plausible steps linking this event to the observed outcome. Then we ask them to comment on the plausibility of the hypothesis: Did learning result from the identified event, and did it influence the outcome? Were there other events that were more important sources of learning, with bigger results?

In this way, our evaluation takes an adaptive approach to the logical framework, allowing expected outcomes to be altered in the light of learning over a series of operations. It recognizes that the definition of “learning events that lead to results” will vary from one observer to the next.

The Bank needs to promote learning that enhances the intended results and minimizes any costs. How would you ensure that you are learning in a manner that accomplishes this?

Comments

Submitted by Allen F. Shapi… on Thu, 07/10/2014 - 05:42

Good marks for this pragmatic approach to WBG project evaluations. Indeed, we must "expect the unexpected" and not be fearful of sharing and explaining the good and bad results of such events in order to expedite the learning curve. My two suggestions to ensure that WBG staff are benefiting from this approach are defined simply as follows: (i) Keep it all transparent, and (ii) Synthesize and share the feedback in a very timely manner.

Submitted by John Heath on Thu, 08/07/2014 - 07:24

In reply to Allen F. Shapi…

Dear Allen, Apologies for the belated response (I was on leave). I couldn't agree more with you about the need for the WBG to be transparent about its evaluation methods. IEG has taken steps to explain methods applied to ICR Reviews (through regional workshops). We still need to engage with Operations on the scope for refining the WBG's approach to objectives-based evaluation. Best wishes, John
