The 1992 Wapenhans report – formally titled “Effective Implementation: Key to Development Impact” – was arguably the most influential evaluation in the World Bank’s history. Many of the institution’s core quality assurance instruments and processes were created in response to its findings. Although many staff today have never heard of it, the report was part of the induction of new Bank staff for many years.

The context of Wapenhans

The report was written by the Portfolio Management Task Force, a team formed for this very purpose and led by Willi A. Wapenhans, a Vice President of the World Bank. The Operations Evaluation Department (OED) – the precursor to today’s Independent Evaluation Group (IEG) – contributed to the work but did not author the report.

The report’s inside history, documented in the archives, makes for fascinating reading 30 years later as it captures how the report came to be so influential.

Lewis Preston, then newly appointed as President of the World Bank, wanted an unvarnished view of the quality of the portfolio of projects under implementation. At the time, the Bank did not have a good measure of its development impact. In the 1970s, World Bank President Robert McNamara had led the transformation of the Bank into a development institution and created the Operations Evaluation function to begin assessing the institution’s development effectiveness, but the Bank still lacked a full view of how well its projects led to sustainable development in client countries.

President Lewis T. Preston speaks at the World Bank/IMF Annual Meetings in Bangkok, Thailand. Photo credit: World Bank.

Wapenhans identifies a steady decline in portfolio performance

The share of projects with major problems nearly doubled, from 11% in fiscal year (FY) 1981 to 20% in FY91. Some 30% of projects in their fourth or fifth year of implementation were reported as having major problems.

The Wapenhans report placed much of the responsibility for weak portfolio quality on overly optimistic project approval by the Bank, driven by an “approval culture” and inadequate attention to risk and implementation planning. Wapenhans also identified a bias toward complex projects with many components and co-financiers as a contributor to weak quality at entry.

Further, Wapenhans advised the Bank to clarify how it should support project implementation. The division of labor with borrowers was unclear at the time, and there were major problems with procurement. Country assistance strategies did not factor in portfolio performance, and Bank projects did not systematically address country- and sector-specific obstacles to implementation.

Lastly, the report concluded that the Bank’s evaluation system did too little to assess project outcomes and their sustainability once projects had closed, weakening the Bank’s ability to learn about what works and to ensure accountability for outcomes.

The findings resonate across the institution

Although some of the responses from Bank management were defensive, the extensive consultations that Wapenhans had carried out across the Bank and with its borrowers had built strong momentum behind the report’s findings. Many operational staff and managers agreed that there were deep problems in project quality and implementation.

The responses to the report convey a palpable sense that the process of producing and consulting on it opened the floodgates for staff to share concerns that had long been known but not openly acknowledged. Wapenhans used OED project ratings and other data on operational quality to make a compelling case that the portfolio was not in good shape.

The takeaways are still relevant for the World Bank Group today. The Wapenhans report was a watershed moment in how the Bank manages its portfolio. In response to the report, the Bank and OED created metrics of quality at entry, quality of supervision, and quality of monitoring and evaluation (M&E) that are still in use today. The Bank established the Board Committee on Development Effectiveness (CODE), Development Effectiveness units in the Regions, and eventually the Quality Assurance Group (since disbanded). The Bank also strengthened its country assistance strategy process and linked it to assessments of portfolio quality.

Evaluation as a catalyst for change

All organizations face internal challenges from time to time. An organization like the World Bank, with its smart, reflective, mission-driven people, has the capacity to identify and correct such challenges from within. Even so, improvement processes need a catalyst. In this case, the catalyst was Wapenhans’ comprehensive evaluation, backed by Preston’s firm support.

Today, it is IEG’s role to produce evaluations of the World Bank Group’s development effectiveness. We see a reflection of our own experience in the story of the Wapenhans report: evaluation based on solid data helps organizations by giving internal discussions about results and effectiveness a firmer diagnostic underpinning than anecdote alone can provide. Evaluations supply a solid ground of evidence to support organizational change and strategy development.

In conclusion, the Wapenhans report led to sustained changes in how the World Bank Group manages for quality and results. This was possible because Wapenhans got many things right: the report was timely, grounded in credible data and solid analysis, informed by extensive engagement, and followed up by changes in management practices. These have all become essential elements of the good evaluation practice that IEG adheres to.


First photo: Interview with Mr. Willi A. Wapenhans, left, and Mr. P. Karieti, right, in 1983. Photo credit: World Bank.
