Much as for the rest of the World Bank Group, the past year has required an unprecedented degree of adaptation and agility from all staff at the Independent Evaluation Group (IEG). For many, fiscal year (FY)21 may feel like a bridge between the old life and the new. At the beginning of FY21, we were just growing accustomed to the full-time remote work required by a worsening global coronavirus (COVID-19) pandemic and realizing that the changes were semipermanent. At IEG, we settled in for the long haul, quickly adjusting our ways of sharing information and methods of evaluation to overcome these new challenges. By the end of the fiscal year, we had built confidence in our abilities to collect data, interview distant stakeholders, and devise new remote mission strategies.
Evaluation too acts like a bridge, connecting hindsight and foresight through the objective analysis of past programs to find evidence that supports and informs positive change. Our job as evaluators is to share the insights and lessons derived from this evidence. In FY21, IEG focused on responding with agility to changing circumstances and on innovating in how we collected data and delivered our findings to those who needed them, when they needed them. We adapted our work program to align with the Bank Group’s COVID-19 pandemic response while continuing to build a pipeline of relevant, timely, and robust evaluations focused on long-term development challenges.
Where relevant, ongoing evaluations were adjusted to include an analysis of the impact of the pandemic on the specific theme and to identify evidence to inform the recovery. Early evaluations of the effectiveness of the Bank Group response to the pandemic were also launched. We used a variety of instruments—and in some cases innovated with new ones—to respond to the demand for evaluative insights. An early evaluation of the International Development Association (IDA) Private Sector Window, conducted as part of an ongoing evaluation of the International Finance Corporation (IFC) and the Multilateral Investment Guarantee Agency engagement in situations of fragility and conflict, was published in time to inform early discussions about the 20th Replenishment of IDA. To make findings from past evaluations more accessible, we stocked a digital library with lessons from Bank Group interventions in past global crises.
Throughout FY21, there was also an increased focus on outcome orientation: the process of gathering feedback, learning, and adapting that is central to improving development outcomes, which took on added significance during the pandemic. IEG supported the Bank Group’s efforts to increase its outcome orientation with the publication of an evaluation of the Bank Group’s outcome orientation at the country level and with ongoing input to the related reforms launched in the previous fiscal year. At IEG, we also made an effort to increase our own outcome orientation. IEG updated its results framework to follow a Monitoring, Evaluation, and Learning plan. The monitoring aspect focuses on outcomes achieved in FY21, while the evaluation and learning parts provide explicit instructions on how IEG evaluates itself to prioritize learning. The plan should support IEG’s drive to continuously improve itself.
Improving IEG means increasing our value to the institution and seeing our findings lead to concrete change. Two stories of the influence of IEG’s evaluative evidence in FY21 stand out. The first is how the new World Bank Group Strategic Framework for Knowledge Management draws heavily on IEG evaluation findings on knowledge, learning, knowledge flow, data for development, and knowledge management. The second is the impact of IEG’s focus on gender: data and lessons from IEG’s evaluations informed the Social Protection and Jobs Global Practice core curriculum and a World Bank session on empowering women in community-driven development projects.
We published 11 evaluations, a new annual validation report on the Management Action Record, and the Results and Performance of the World Bank Group 2020. Along with these larger evaluations, IEG produced hundreds of project-level evaluations and validations, a core set of microproducts that anchor learning in very specific cases. These microproducts underwent a review in FY21, which revealed that, although their current format promotes accountability, more can be done to enhance their learning value and uptake throughout the institution. Based on these findings, we created an action plan to maximize the value of these products by making them a source of information that stakeholders actively seek out to inform project design and policy dialogue. The new approach to microproducts will see IEG producing more synthesis reports, reducing the number of project-focused reports, increasing engagement during the rating validation process, and better targeting areas of interest for IFC. Our hope is that these improvements will build stronger relationships between evaluators and stakeholders and increase the learning value of IEG output.
In FY21, IEG also launched an initiative to increase knowledge and awareness of innovative data collection methods and tools for evaluators. We hosted three seminars with IEG evaluators, members of other multilateral development banks, Bank Group data teams, and external data specialists on topics such as using Twitter data and data-sensing tools and conducting remote interviews. Along with building IEG’s COVID-19 Lesson Library, we produced more bite-size products, such as blogs, videos, and infographics, to increase our ability to share knowledge and let stakeholders know about our activities in real time. These shorter pieces increased traffic to IEG’s website and engagement with social media, bringing more attention to IEG’s findings.
IEG’s mandate also includes increasing evaluation capacity. In FY21, we partnered with the Independent Evaluation Office of the United Nations Development Programme, among others, to launch the Global Evaluation Initiative (GEI). GEI is an exciting new step in global cooperation to build the practice of monitoring and evaluation (M&E). Governments, citizens, and experts come together to share knowledge and build programs that educate individuals and strengthen capacity for M&E. I encourage you to read more about the initiative in chapter 4 of this report.
We will carry the many lessons learned in FY21, about IEG’s functions, the topics and timing of evaluations, and how we communicate our knowledge, into FY22. For example, a gap analysis of whether significant portions of Bank Group lending and investments had been missed in previous evaluations found no major gaps in coverage. However, areas with less emphasis in the portfolio—public sector management and public administration, education, social development, and information and communication technology—will be partially addressed in the FY22 work program. The FY22 agenda also includes refining our approach to online content by introducing new formats such as visual summaries and a podcast.
The experiences of FY21 show that IEG can be agile and adaptable. FY22 provides us with an opportunity to build on the unique opportunities that have arisen from recent innovations in evaluation. I look forward to seeing the microproduct enhancements in action and continuing to present meaningful lessons through evaluation to inform the multiple aspects of green, resilient, and inclusive development.
Alison Evans
Director-General, Evaluation