Using results frameworks to improve planning and to better identify potential dynamic synergies at country level.

Over the course of two days last week, Results Measurement and Evidence Stream (RMES) specialists from across the Bank Group met to share and showcase experiences and "trade secrets", and to discuss a wide range of topics, including some on which we have previously blogged, such as our recent piece on beneficiary participation in evaluation. Next week's blog will reflect on the event as a whole. This week, we want to concentrate on a central component of the Bank Group's planning process at country level that was the subject of debate at a number of RMES sessions - the Results Framework.

Results Frameworks were first introduced as a formal component of Bank Group Country Assistance Strategies (CAS) in 2005. They were intended to be a key tool that would contribute to improving the quality of country strategies as planning documents, to maximizing development effectiveness, and to demonstrating measurable results in fostering growth and reducing poverty as a result of WBG advice, investment, and lending.

Today, every CAS and every Country Partnership Framework (CPF) - the new country strategy document that replaced the CAS from January 2015 - has a results framework in which outcome indicators and milestones for tracking progress are defined, fulfilling important accountability as well as learning objectives. Overall, the results-based approach has helped bring about improvements, for example, a sharper focus on results in country strategies, as well as better alignment between WBG country engagement and national priorities.

On the other hand, a recent learning review of our validation of CAS completion reports (replaced by Completion and Learning Reviews) identified weaknesses in many of the results frameworks reviewed over time, including: a focus on inputs and outputs rather than outcomes; poor M&E systems, including indicators that do not "fit" associated objectives; and, perhaps most importantly, weak results chains.

Designing an effective operational results chain is possibly the most critical and challenging task in developing the results framework for a country strategy. The results chain is intended to present a logical statement of how planned WBG interventions will lead to the realization of objectives, beginning with inputs, moving through activities and outputs, culminating in outcomes, impacts, and feedback. A well designed results chain identifies risks and makes explicit any underlying assumptions about client (government) or other stakeholder (e.g., firms, CSOs, communities) actions. A clearly constructed, logical results chain is critical to accountability, mid-course correction, and learning, and is also integral to exercising selectivity.
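The sequence described above - inputs, activities, outputs, outcomes, impacts - can be thought of as a chain whose weakest link breaks the logic. As a purely illustrative sketch (the level names and the draft example below are hypothetical, not an official WBG schema), a simple completeness check can flag the kind of gap the learning review found, such as a chain that jumps from outputs straight to impacts:

```python
# Illustrative model of a results chain as an ordered series of levels,
# with a check for missing links. Names and data are hypothetical.

RESULTS_CHAIN_LEVELS = ["inputs", "activities", "outputs", "outcomes", "impacts"]

def find_chain_gaps(chain: dict) -> list:
    """Return the levels that are missing or empty in a draft results chain."""
    return [level for level in RESULTS_CHAIN_LEVELS if not chain.get(level)]

# A draft chain that skips the outcome level - a common weakness,
# since nothing then connects project outputs to higher-level impacts.
draft = {
    "inputs": ["credit line"],
    "activities": ["train loan officers"],
    "outputs": ["SMEs receive loans"],
    "impacts": ["reduced poverty"],
}

print(find_chain_gaps(draft))  # → ['outcomes']
```

The point of the sketch is only that each level should be explicit: a gap at any level makes it impossible to say why the interventions should lead to the stated objectives.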

In our experience, the operational results chain is often the weakest element in results frameworks. A common weakness is in the reporting of outcomes, where there is often very little linkage between project-level and higher-level outcomes. In the absence of a clear results chain to explain why the Bank Group is pursuing any given objective, the results matrix (metrics) is less meaningful, making it more difficult to evaluate success or to draw lessons for the future.

It is also important for the results chain to recognize synergies generated by the combination of instruments and interventions at play - their joint impact may be greater than the sum of their parts. This is critical in identifying country-level or transformational impacts: ideally, the results chain should specify the potential catalytic effects of planned interventions designed to help realize higher-level goals. Our recent learning review of results frameworks identified effects that can be reported to articulate this dynamic catalytic or synergistic dimension of the results chain - for example, potential synergies associated with Bank Group collaboration, with scaling up pilot projects, or with enhancing the capacity of public institutions.

A strong, robust results framework that encompasses a well-designed logical results chain and a well-articulated results matrix is an evaluator's dream! However, given increased complexity and heterogeneity in the Bank Group's working environment as well as its ambitious goals, there is increasing awareness of the need for strong results frameworks not only to keep the evaluators happy, but to better manage and target scarce resources and, critically, better identify and articulate achievement associated with Bank Group efforts in association with clients.

This much was confirmed by participants from across the Bank Group who attended the RMES session that specifically addressed results frameworks and other factors relevant to the assessment of country programs. They were asked to comment on a case study based on an actual country completion report, and very quickly identified many of the issues referenced above including gaps in results chains and a lack of hard evidence in the documentation to support stated results. We welcome the opportunity for this type of open dialogue about the evaluation process and believe it will result in overall improvement in the planning process that is ultimately subject to evaluation. We look forward to continued discussion with colleagues through RMES and other fora about what works - stay tuned for more about RMES next week.


Submitted by YK on Tue, 03/10/2015 - 01:38

Thanks for the really insightful blog. I agree that the absence of a clear results chain is problematic, as it won't be able to show how the intervention has contributed to higher goals. But at the country level another big challenge is the government's lack of M&E capacity and the lack of, or poorly managed, data, which is fundamental to answering the results chain. A practical challenge facing staff in the country office often comes from this gap: the high-level results chain might not be trackable in reality. I'd like to know how the WB works to improve client countries' M&E capacity and engages client countries in developing a results framework.

Submitted by Shoghik Hovhannisyan on Mon, 03/16/2015 - 08:56

Dear YK, thanks for your comment. Our evaluation work confirms the importance of M&E systems and the challenges faced at country and project level in ensuring sound systems in that regard. In recognition of that fact as well as taking broader capacity development concerns into account, IEG supports and hosts the secretariat for Centers for Learning on Evaluation and Results (CLEAR). Established in 2010, CLEAR is also supported by other development partners, and operates as a collaborative, global partnership that works to strengthen partner countries’ capacities and systems for M&E and performance management (PM), to guide evidence-based development decisions. IEG is also a founder and current sponsor of the International Program for Development Evaluation Training (IPDET). Established in 2001, IPDET is an executive training program that aims to provide managers and practitioners the generic tools required to evaluate development policies, programs, and projects at the local, national, regional, and global levels. Within the World Bank Group we are, of course, deeply committed to the Results Measurement and Evidence Stream and, more generally, to good practices in M&E.

Submitted by Susan Stout on Wed, 05/06/2015 - 08:02

Completely agree with YK that working on "M&E capacity" is a key issue -- it is perhaps the key strategic missing element of the Bank's approach to building country systems, where we focus too narrowly and in silo-like fashion on procurement, FM, and safeguards systems. I hope the Bank can significantly scale up in this area (and it's got to be willingness as well as capacity to use results and evidence in decision making). It is so overdue to recognize and address this constraint -- maybe we can do something in the context of the so-called data revolution to leapfrog the long absence of work on these systems at the country level.

Submitted by Daniel Ticehurst on Fri, 07/14/2017 - 12:48


Fantastic point: confounding and confusing the type and nature of support (e.g., Outputs) that development programmes make available with the assumed responses (e.g., changes in the behaviours, relationships, and decisions) among those intended to benefit from such support (i.e., Outcomes) is all too common. But, as noted by the authors, this seemingly simple distinction is flung down and danced upon by far too many. And these are often the same people/institutions who make fatuous claims about how there has been "a focus on inputs and outputs rather than outcomes". A study by the then OED in 1994 on M&E in the Bank argued to the contrary. And so we turn full circle.
