The Three Pillars of a Working Evaluation Function: IEG's Experience
The second in a 3-part retrospective series by Director General Caroline Heider.
As promised in my last blog in this series, here are more details on what’s important under each of our three strategic pillars.
1. Making Strategic Choices.
Every year, IEG undertakes a select few thematic, sector, and corporate evaluations. They span larger subsets of the lending portfolio and cover the strategy, policy, and convening role of the Bank Group. These major evaluations are selected through a process of analysis and consultation that surfaces topics that matter to the institution's strategic directions and development impact. Our selection process involves 1) consultations with stakeholders in the World Bank Group—the Board, senior leadership, mid-level management—and in academia and civil society, 2) gap analyses to understand whether IEG's work has left material gaps that should be covered, or has over-evaluated certain areas, and 3) reviews of other sources that discuss important future trends or directions, such as the strategic directions of the World Bank Group and external sources like the World Economic Forum.
During the 2018 informal discussion of IEG’s work program with the Committee on Development Effectiveness (CODE), which is the Board committee that discusses our evaluations, the CODE chair and other Executive Directors made a point to congratulate IEG on its efforts to strategically align its work program with World Bank Group strategic priorities.
2. Enhancing Quality and Credibility.
Influence depends on both the actual and the perceived quality of the evidence and the evaluation process. If stakeholders do not trust the findings, why should they act on them? Methodology, therefore, is key to quality and credibility.
In this regard, the World Bank Group is a demanding customer because staff are highly qualified and engaged in analytical and research work. The first major evaluation submitted under my leadership received 64 pages of comments, many of them questioning the methodology along with the findings. At that time, CODE had demanded to review and discuss our evaluation approach papers. Too many evaluations were being challenged on grounds of faulty methods only at the end of the process, when the committee discussed the final report. As a result, the committee's deliberations could not focus on substantive findings because they were drawn into arbitration between IEG and WBG management.
Within my first three months as IEG's director general, we adopted our first standards for approach papers and updated the review process for them. The new requirements shifted our mindset from shorter approach papers with cursory mention of methodologies to a more thorough analysis of the evaluand, elaboration of theories of change, and selection of methods that correspond to the evaluation questions. With the creation of the Methods Advisory Function in 2016, we saw a big shift in the design of our evaluations. Today, approach papers elaborate a mix of methods that has enhanced the credibility of our work, improved the basis on which our evaluations are conducted, and raised the quality of our reports. As a result, comments on our reports and discussions at the Board rightly focus on findings and implications rather than the approach we took.
3. Reaching Out Strategically and Systematically
The job of evaluation is not done with the completion of a report. Reaching the right people with the right information at the right time in the right format is essential to ensure lessons can be absorbed and applied.
While IEG has been fortunate to have had a department for knowledge, learning, and communication for a long time, I have seen an incredible transformation of our work in this area. The team has developed a nested strategy for their work—nested in that it takes IEG's goal and strategy and applies them systematically to everything they do. This vision has led to clear priorities and a cohesive set of activities designed to advance our agenda to influence change.
The key to our ability to influence change is to engage with audiences that could (and should) learn from our evaluations. At IEG, we have employed multiple avenues to reach our audiences and share evaluation insights in constructive and useful ways. In doing so, we have been mindful of the different needs of diverse audiences. For instance, the G20's interest is in high-level strategic issues drawn from a number of evaluations and packaged in a short report. Other audiences, like operational teams in the World Bank Group, seek opportunities to review and unpack data, and to debate what it means for them and how to apply lessons. And IEG's "WhatWorks" blog speaks largely to the global evaluation community, which is interested in evaluation-focused topics rather than sector-specific findings.
Our stakeholder outreach is complemented by our knowledge and learning agenda. IEG has long offered training courses as an integral part of the World Bank Group's learning offerings and through our support for evaluation capacity development in client countries. We have expanded this kind of training and added new formats, such as learning engagements: demand-driven, tailored engagements that use existing IEG data and employ user-centered techniques to support the learning, absorption, and application of evaluation findings. The underlying philosophy: by training staff, we can have multiplier effects far beyond what we could achieve if we tried to fix project designs at the outset through hands-on advice.
Read Part 1 of this series, Looking back at 7 Years at the Helm of IEG.
Read Part 3 of this series, In Conclusion...