Increasing Value-for-Money of Evaluation from Start to Finish

Value can be generated or lost, and costs incurred or saved, throughout the lifecycle of an evaluation: deciding what to evaluate and when; how to evaluate; and with whom and how to share evaluation results, insights, and knowledge.

A few weeks ago, as part of this series on getting value for money from evaluation, I wrote about the importance of making strategic choices about what to evaluate and when. Today, I want to focus on opportunities to increase value-for-money during the evaluation process.

A number of readers might ask: why focus on this? Isn't this part of evaluation practice well established by now? Yes, it is, but choices made in designing and conducting evaluations can, and often do, influence the overall cost and value of evaluation.

Ultimately, the value of an evaluation lies in its being used. For that to happen, the evaluation and its underlying design and process have to be credible, timely, and useful.

Credibility is necessary! Why else would anyone take note of and act on findings or recommendations? A large part of credibility derives from evaluators' professional standing: they are known for their expertise, and their word counts. But evaluation is more than an expert opinion.

Making the right choices about scope and methods also has an important part to play. For instance, if sampling methods introduce bias, or if methods rely on too few sources of information for meaningful triangulation or assessment of results, the validity of findings and the credibility of the evaluation suffer. Does that mean sampling 100% of a program, or asking every conceivable question in a survey, is the right answer to credibility? Certainly not: wastefulness in an evaluation undermines credibility just as much, especially when the evaluation critiques the efficiency of the very program it is evaluating.

Credibility is also gained when stakeholders understand how the evaluation is conducted. Transparency around the processes, methods, and yardsticks used to form an evaluative judgment can reassure stakeholders. It gives them greater opportunity to share information or question analyses. Engagement during the evaluation process, if constructive, might also lead to early learning.  

Timeliness is essential for an evaluation to be used. When a decision needs to be taken, for example, to renew a program, change it, or stop it altogether, evaluation results need to come in well ahead of that decision. Nothing is more disappointing, or costly, than missing such an opportunity to influence decisions. This was the reason for my previous blog: making strategic choices about what to evaluate, and when, should take into account key milestones in decision-making processes, so that an evaluation can start early enough. Another difficulty arises when evaluators face trade-offs between scope and timeliness, especially when looking at large, complex programs. There are no simple recipes for that situation, but the choices have to be weighed at the outset of an evaluation and managed throughout.

Usefulness. Michael Quinn Patton is the grand master of utilization-focused evaluation and has provided insightful discussions, guides, and checklists to ensure evaluations are focused on use. I couldn't agree more on the need to identify specific primary users. But I would add that, at least in my experience in multilateral organizations, we very often do not have a single primary user: we need to satisfy different demands with the same evaluation. And even if our evaluations were focused primarily on the needs of the Executive Board, to whom IEG reports, they would still generate important insights for management and operational staff, who need to take up lessons and recommendations in their own actions.

So What? The Importance of Recommendations. A critical part of the value-for-money of evaluation derives from its recommendations. Recommendations define the issues that the evaluation prioritizes and suggests will make the biggest difference to the program's performance and results if addressed. They derive from evidence, findings, and conclusions, and often focus on correcting observed shortcomings. But, especially when assessing large, complex programs, many avenues for improvement or for scaling up success might be possible. Recommendations are sometimes tested for whether they are actionable, but less often against the questions: what difference will they make, and, once implemented, what value added will they generate? To foster ownership of recommendations, it is important for evaluators to engage with program managers and seek the answer to the "so what?" question. The larger the value added from implementing a recommendation, the higher the likelihood of ownership and follow-up action.


More in the #WhatWorks Value-for-Money Series:
The first step to a great evaluation? Make the right choices
Institutionalizing Evaluation: What is the Theory of Change?
Influencing Change through Evaluation: What is the Theory of Change?

Comments

Submitted by Jane Davidson on Tue, 05/24/2016 - 22:55

Great topic! Another great source to add to the conversation: want to really cut to the chase on the Value for Investment question? Check out Julian King's ebook on this topic (free, but he's updating it! I have the first version and it's great): http://www.julianking.co.nz/downloads/

Submitted by Caroline Heider on Wed, 05/25/2016 - 00:28

In reply to by Jane Davidson

Thanks, Jane, for the great contribution!

Submitted by Julian King on Thu, 05/26/2016 - 07:44

Thanks, Jane, for the mention! My e-book was on the back burner for a while but you motivated me to finish it. Voila! http://www.julianking.co.nz/downloads/

Submitted by Koenraad Van Brabant on Wed, 06/01/2016 - 22:18

'Fostering ownership of recommendations' is a key issue. What I've come across repeatedly is that those who commission the evaluation 'outsource' the thinking far too much. They are too disengaged from the whole evaluation process, and want to look at the recommendations without taking the time to actually go through the findings and what they suggest. That may be justifiable when what you evaluate is 'complicated', but becomes problematic when its nature is more one of 'complexity' (to refer to Dave Snowden's Cynefin framework). My preference is for a different process in which the evaluators present the findings, then consider together with the commissioning entity what those findings seem to 'tell us', and develop recommendations together. There are also instances, particularly when the exercise is more of a strategic review, where it reveals that the action being evaluated or reviewed has come to a strategic crossroads and the agency may have to make some important choices. In such cases it is often not appropriate for the evaluators to recommend what the agency should do: ultimately the evaluators have only limited information, cannot weigh various considerations against each other the way the agency must do, and will not have to live with the consequences of the choices made. I find it better then to describe the situation and what the possible choices seem to be, and to invite the 'client' to reflect on where they want to go. And I limit my 'recommendations' to those elements where changes or improvements make fairly straightforward common sense.

Submitted by Caroline Heider on Tue, 06/14/2016 - 21:42

In reply to by Koenraad Van Brabant

Koenraad, thanks for your thoughtful contribution. Your points seem to be made from the perspective of an external evaluator, is that correct? Here at the World Bank Group, I see the Board, through its Committee on Development Effectiveness, as the commissioner of evaluations. They do not get involved in writing recommendations. Instead, we are piloting engaging management and operational staff of the World Bank Group in shaping recommendations. Your second example is more akin to what we aim to do with our learning engagements.

Submitted by Shakeel Mahmood on Sun, 06/05/2016 - 08:14

Enjoyed reading your piece and gathered a good grasp of knowledge. Thanks. Shakeel

Submitted by Harsley Wesisi on Thu, 06/09/2016 - 00:28

Awesome and interesting read, lots of knowledge gathered. Thank you.
