Early-Stage Evaluation of the Multiphase Programmatic Approach
Chapter 2 | Scope, Evaluation Questions, and Methodology
Scope
The evaluation portfolio is limited to the 40 approved nonemergency MPAs as of December 31, 2023, as assessing emergency MPAs would require a distinct evaluation framework. It therefore excludes the COVID-19 Strategic Preparedness and Response Plan MPA and the Emergency Locust Response Program, the first of which has been covered by a separate evaluation (World Bank 2022b).
The evaluation assesses the performance of the MPA against the expectations outlined in the 2017 Board paper. The scope of this evaluation is largely determined by the youth of the MPA portfolio. All 40 nonemergency MPAs are under implementation, and 17 of them were approved in 2023. No ex post assessment of outcomes is therefore possible. The evaluation instead asks whether MPA design is fulfilling the expectations outlined in the 2017 Board paper, whether the specific features of MPAs are functioning as expected, and whether there is an enabling environment for MPAs within the World Bank and on the client side. The evaluation also assesses the claims made in the Operations Policy and Country Services (OPCS) technical briefing to the Board on the processing times for MPAs relative to investment project financing (IPF; see appendix D).1
The evaluation does not look at the achievement of long-term outcomes, assess the uptake of MPAs or whether there have been “missed opportunities” to apply an MPA, ask if the policy scope of MPAs or the delegation of authority for MPAs should be expanded, or assess the improved efficiency of management of the Bank Group’s financial resources. These issues are outside the scope of this evaluation and, given the youthful nature of the portfolio, not evaluable.
Evaluation Framework
The overarching objective of the evaluation is to assess whether the MPA is meeting Board expectations on design and functioning so far. This evaluation assumes that Board expectations were set by the 2017 Board paper. The evaluative framework for addressing this objective is underpinned by the theory of action for the evaluation (figure 2.1). In figure 2.1, column 1 contains the design features of the MPA—namely, the long-term horizon, the flexibility in the content and timing of phases, the learning requirements, and the processing efficiency—that are expected to support the MPA objectives listed in column 2: coherence, continuity, learning, and adaptation.
Source: Independent Evaluation Group.
Note: The Independent Evaluation Group developed this theory of action based on discussions with the Operations Policy and Country Services unit and a review of relevant World Bank documents. MPA = multiphase programmatic approach.
*All MPA features feed to some extent into all key MPA objectives.
These design objectives are not unique to the MPA; all projects are expected to pursue them. The approach, however, was expected to enhance their delivery: the MPA’s design features were meant to strengthen its ability to support a learning-based, adaptive, stable, and coherent program and thereby better support development effectiveness in the face of recurring and complex development challenges. The design objectives are briefly described as follows:
- Coherence. A coherent operation fits within the broader program at the country, sector, and institutional levels. The MPA is expected to be more coherent than its alternatives because it was intended to leverage external partnerships and internal collaboration more effectively.
- Continuity. This refers to the MPA’s ability to provide stable, long-term support, mainly for vertical MPAs. The vertical MPA is expected to support continuity better than its alternatives because of its programmatic structure and the provision for overlapping phases.
- Learning. Although all operations should embed knowledge, the MPA requires an explicit learning plan, with specificity on implementation arrangements and how the knowledge is to be used.
- Adaptation. This refers to the ability of the MPA to adjust the content and timing of the phases in response to new information, evolving priorities, and changing context to better support the program development objective (PrDO). An MPA would also have a larger number of preset points at which a stocktake could be done (for example, the Mid-Term Review of each phase) and would be better positioned to use learning to inform the design and implementation of subsequent phases. Adaptation therefore takes place both through restructuring and through learning-informed program design.
Three questions are addressed in this evaluation:
1. To what extent has the design of MPAs followed Board expectations and management guidance?
   a. To what extent have the objectives of MPA operations been oriented toward high-level impacts, including climate-related objectives and private capital mobilization?
   b. To what extent have MPAs been designed to support institutional development and learning?
   c. To what extent do MPAs conform to either the horizontal or vertical models outlined in the 2017 Board paper?
2. To what extent have the design features embedded in the MPA worked as expected to achieve design objectives?
   a. To what extent have the design features improved the coherence of interventions?
   b. To what extent have the design features supported programmatic continuity?
   c. To what extent have the design features facilitated and supported monitoring of learning within or across phases?
   d. To what extent have the design features supported adaptation to changing circumstances and priorities?
3. Under what circumstances or enabling conditions has the MPA worked as intended?
   a. To what extent have client-side conditions enabled or prevented the MPA from working as intended?
   b. To what extent have conditions within the Bank Group enabled or prevented the MPA from working as intended?
Since the analysis is largely ex ante, the evaluation relies on hypothesized mechanisms associated with the development effectiveness of the MPA. As the portfolio is still under implementation and nearly half the programs were approved in fiscal year 2023, it is not possible to evaluate outcomes or impact. The evaluation therefore assesses the extent to which the design objectives anchoring the theory of action are being achieved through specific hypothesized mechanisms, as described in table 2.1. The observable implications vary by type of MPA and are proposed based on technical discussions with OPCS and a review of project documents. More details on how achievement of the design objectives was evaluated are given in appendix A.
Table 2.1. Hypothesized Multiphase Programmatic Approach Mechanisms Associated with Development Effectiveness

| Design Objective | Mechanism | Observable Implications: Vertical | Observable Implications: Horizontal |
| --- | --- | --- | --- |
| Coherence (World Bank 2017) | Agreement on long-term objectives and constraints across Global Practices and with development partners; management of risks to program’s ability to stay on track toward meeting program development objectives | Articulation of long-term objectives in country strategies that strengthens consensus around them within the World Bank team; greater cross-sector collaboration on the World Bank side and the client side; greater collaboration with external partners; more private capital mobilization | Same as for vertical; evidence of additionality from regional approach |
| Continuity | Greater likelihood of long-term financing without interruption in engagement | Overlapping phases; management of risks to continuity | Management of risks to continuity |
| Learning | Requirement of learning plan in PAD backed by monitoring, implementation arrangements, and capture of lessons learned | World Bank supervision more oriented toward learning than compliance; more self-evaluation by vertical MPA clients than in a single large operation | More parallel learning across World Bank teams and clients than in a set of independent operations |
| Adaptation | Multiple points for reflection (Mid-Term Review and the end of each phase) that enable restructuring or cancellation of activities | Earlier cancellation or restructuring in response to changed circumstances and lessons learned than in a single large operation; more evidence of restructuring anchored in learning; more frequent restructuring | Same as for vertical |
Source: Independent Evaluation Group.
Note: The mechanisms will be refined and expanded during the evaluation. MPA = multiphase programmatic approach; PAD = Project Appraisal Document.
Methods
The evaluation relies on a two-pronged analytical strategy, applying (i) data analysis and (ii) key informant interviews across all evaluation questions. It draws on a portfolio review of the 40 approved nonemergency MPAs and a set of comparators (selected non-MPA operations); a desk-based document review; and structured and semistructured interviews with key informants.
First, the evaluation relies on analysis of data from all nonemergency MPAs to assess the MPA design characteristics and mechanisms through in-depth content analysis of project documents. Then, it tests the extent to which MPAs follow a business-as-usual model by comparing the set of MPAs with a matched non-MPA comparator group comprising approximately 60 non-MPA operations. We extracted and coded several outcomes for both the MPA and comparator groups (see appendix B for a list of extracted outcomes and coding criteria). We focused on testing the observable implications of the MPA expected to materialize earlier in the project life cycle.
To construct the comparator group, two groups of comparator projects (henceforth referred to as “comparators”) were selected to maximize intervention similarity while minimizing the influence of key confounders. The groups comprise (i) the most similar operations in the same country, for vertical MPAs, or in the same Region, for horizontal MPAs, and (ii) the most similar operations in a similar context, as measured by the public administration Country Policy and Institutional Assessments for fiscal years 2018–22 (as a proxy for institutional capacity). Project similarity is calculated as the cosine distance of mean text embeddings from the project description section of Project Appraisal Documents from that of reference MPA projects.
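The similarity calculation described above can be illustrated with a minimal sketch. The snippet below assumes that sentence-level text embeddings of each Project Appraisal Document's project description are already available as NumPy arrays; the function names and project identifiers are illustrative, not the evaluation team's actual code.

```python
import numpy as np

def mean_embedding(sentence_vectors):
    """Average sentence-level embeddings into a single document vector."""
    return np.asarray(sentence_vectors, dtype=float).mean(axis=0)

def cosine_distance(a, b):
    """1 minus cosine similarity: 0 for identical direction, up to 2 for opposite."""
    return 1.0 - float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_comparators(mpa_vector, candidate_vectors):
    """Rank candidate projects from most to least similar to the reference MPA.

    candidate_vectors maps a (hypothetical) project ID to its mean embedding.
    """
    scored = [(pid, cosine_distance(mpa_vector, vec))
              for pid, vec in candidate_vectors.items()]
    return sorted(scored, key=lambda pair: pair[1])
```

In practice, the candidate pool would first be restricted as described above (same country or Region, or a similar Country Policy and Institutional Assessment score) before the nearest projects by cosine distance are retained as comparators.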
Second, the evaluation leverages structured and semistructured interviews with key informants, covering approximately 75 percent of MPAs, to validate findings, bridge gaps in evidence, understand how MPAs operate in the field, and triangulate the perspectives of various stakeholders. We conducted interviews with respondents within the World Bank (task team leaders [TTLs] of horizontal and vertical MPAs, practice managers, country directors, regional directors, directors of strategy and operations, and vice presidents) and from client countries, using a combination of purposive sampling strategies and stratification. We followed structured and semistructured interview protocols—asking all respondents within the same respondent category a set of identical questions—and then extracted and coded several items mapped to the evaluation subquestions using manual processing and NVivo (see appendix C). We mitigated potential biases inherent in this type of data (for example, selection, social desirability, confirmation) via proper selection of interviewees, projects, and interview questions (see appendix A, table A.1, and appendix C for detailed discussions of how we ensured the robustness of our analyses).
In the next chapter, we triangulate evidence from these sources to determine if MPAs align with expectations and add value through improved programmatic coherence, continuity, learning, and adaptability relative to comparators.
1. The Operations Policy and Country Services unit also noted that the multiphase programmatic approach would enable clients to achieve higher-level results faster than a set of stand-alone operations. We view this claim as unverifiable given the youth of the multiphase programmatic approach portfolio.