Measuring Up: The Importance of Quality Results Frameworks in Country Strategies
Using results frameworks to improve planning and to better identify potential dynamic synergies at country level.
Over the course of two days last week, Results Measurement and Evidence Stream (RMES) specialists from across the Bank Group met to share and showcase experiences and "trade secrets", and to discuss a wide range of topics, including some on which we have previously blogged, such as our recent piece on beneficiary participation in evaluation. Next week's blog will reflect on the event as a whole. This week, we want to concentrate on a central component of the Bank Group's planning process at country level that was the subject of debate at a number of RMES sessions - the Results Framework.
Results Frameworks were first introduced as a formal component of Bank Group Country Assistance Strategies (CAS) in 2005. They were intended to be a key tool for improving the quality of country strategies as planning documents, maximizing development effectiveness, and demonstrating measurable results in fostering growth and reducing poverty as a result of WBG advice, investment, and lending.
Today, every CAS and every Country Partnership Framework (CPF) - the new country strategy document that replaced the CAS in January 2015 - has a results framework in which outcome indicators and milestones for tracking progress are defined, fulfilling important accountability as well as learning objectives. Overall, the results-based approach has helped bring about improvements, such as a sharper focus on results in country strategies and better alignment between WBG country engagement and national priorities.
On the other hand, a recent learning review of our validation of CAS completion reports (since replaced by Completion and Learning Reviews) identified weaknesses in many of the results frameworks reviewed over time, including: a focus on inputs and outputs rather than outcomes; poor M&E systems, including indicators that do not "fit" associated objectives; and, perhaps most importantly, weak results chains.
Designing an effective operational results chain is possibly the most critical and challenging task in developing the results framework for a country strategy. The results chain is intended to present a logical statement of how planned WBG interventions will lead to the realization of objectives, beginning with inputs, moving through activities and outputs, and culminating in outcomes, impacts, and feedback. A well-designed results chain identifies risks and makes explicit any underlying assumptions about client (government) or other stakeholder (e.g., firms, CSOs, communities) actions. A clearly constructed, logical results chain is critical to accountability, mid-course correction, and learning, and is also integral to exercising selectivity.
In our experience, the operational results chain is often the weakest element in a results framework. A common weakness is in the reporting of outcomes, where there is often little linkage between project-level and higher-level outcomes. In the absence of a clear results chain to explain why the Bank Group is pursuing a given objective, the results matrix (metrics) is less meaningful, making it more difficult to evaluate success or to draw lessons for the future.
It is also important for the results chain to recognize synergies that can be generated by the combination of instruments and interventions at play - their joint impact may be greater than the sum of their parts. This approach may be critical in identifying country-level or transformational impacts: ideally, the results chain should specify the potential catalytic effects of planned interventions designed to help realize higher-level goals. Our recent learning review of results frameworks identified possible effects that, when reported, can help articulate this dynamic catalytic or synergistic dimension of the results chain - for example, potential synergies associated with Bank Group collaboration, the scaling up of pilot projects, or the strengthening of public institutions' capacity.
A strong, robust results framework that encompasses a well-designed, logical results chain and a well-articulated results matrix is an evaluator's dream! However, given the increased complexity and heterogeneity of the Bank Group's working environment, as well as its ambitious goals, there is growing awareness that strong results frameworks are needed not only to keep evaluators happy, but also to better manage and target scarce resources and, critically, to better identify and articulate what the Bank Group achieves in partnership with its clients.
This much was confirmed by participants from across the Bank Group who attended the RMES session that specifically addressed results frameworks and other factors relevant to the assessment of country programs. Asked to comment on a case study based on an actual country completion report, they very quickly identified many of the issues referenced above, including gaps in results chains and a lack of hard evidence in the documentation to support stated results. We welcome the opportunity for this type of open dialogue about the evaluation process and believe it will lead to overall improvement in the planning process that is ultimately subject to evaluation. We look forward to continued discussion with colleagues through RMES and other fora about what works - stay tuned for more about RMES next week.
Comments
Fantastic point: confounding and confusing the type and nature of support (e.g., Outputs) that development programmes make available with the assumed responses (e.g., changes in the behaviours, relationships and decisions) among those intended to benefit from such support (i.e., Outcomes) is all too common. But, as noted by the authors, this seemingly simple definition is flung down and danced upon by far too many. And these are often the same people/institutions who make fatuous claims about how there has been "a focus on inputs and outputs rather than outcomes". A study by the then OED in 1994 on M&E in the Bank argues to the contrary. And so we turn full circle.