The Mystery of Mixing Methods
Despite significant progress on mixed methods approaches, their application continues to be (partly) shrouded in mystery, and the concept itself can be subject to misuse.
By: Jos Vaessen
For some time now, mixed methods approaches have been part and parcel of mainstream debates and practices in evaluation (and other branches of the applied social sciences). Going back to at least the 1950s, the mixed methods tradition picked up momentum in the 2000s. A number of textbooks (e.g., Tashakkori and Teddlie, 2003, 2010) and a dedicated journal (the Journal of Mixed Methods Research) are clear signs of mixed methods research becoming a research paradigm in its own right.
Despite significant progress on mixed methods approaches, for many evaluation stakeholders their application continues to be (partly) shrouded in mystery. To use a simple analogy, the concept of flying is intuitively understood and appreciated by most, yet when asked to explain the underlying principles, many would fall short. This is not to say that evaluators (and researchers in general, for that matter) are lost in the dark. Far from it: evaluators tend to have very sensible ideas about combining qualitative and quantitative methods.
Yet, the lack of an explicit (and comprehensive) understanding of the principles underlying mixed methods inquiry has led to some confusion and even misuses of the concept in the international evaluation community. Let me briefly highlight three manifestations of this:
Notwithstanding a continued lack of understanding, as well as misuse of the term by some, the imperative for using mixed methods designs to strengthen the validity of evaluation findings is strong. IEG is continuously looking for ways to strengthen its mixed methods evaluation designs. The overall purpose of a mixed methods approach is to offset the biases and limitations of one method with the strengths of another. Greene et al. (1989), in their seminal article on the principles of mixed methods in evaluation, present five more precise purposes:
- Triangulation: seeking convergence and corroboration of results from different methods studying the same phenomenon.
- Complementarity: using the results from one method to elaborate, enhance, illustrate, or clarify the results from another.
- Development: using the results from one method to inform the development (e.g., sampling, instrumentation) of another.
- Initiation: surfacing paradoxes and contradictions that lead to a reframing of the evaluation questions or results.
- Expansion: extending the breadth and range of inquiry by using different methods for different inquiry components.
The art of developing mixed methods designs goes beyond an understanding of these principles. It has become widely understood in the field of international development evaluation that different methods have particular comparative advantages (NONIE, 2009; Bamberger et al., 2010). The ongoing challenge of bringing such comparative advantages to bear in the design and sequencing of methods in multi-level and multi-site evaluations constrained by time, budget, and data considerations will continue to require context-specific and creative solutions. In that sense, some of the mystery will always remain to challenge our thinking.
Related blogs:
What is (good) program theory in international development? (by Jos Vaessen)
Using ‘Theories of Change’ in international development (by Jos Vaessen)
Institutionalizing Evaluation: What is the Theory of Change? (by Caroline Heider)
Influencing Change through Evaluation: What is the Theory of Change? (by Caroline Heider)
Comments
Thanks for sharing this post. I do not agree with the characterization of mixed methods research as "mysterious". It could be that researchers trained in a single tradition may find it difficult to fully understand mixed methods research, and therefore, tend to be quite suspicious of it. However, this has less to do with mixed methods being mysterious, and more to do with the researcher in question being trained in only a single school of thought. One should remember that good/bad practices exist in every strand of research -- be it qualitative, quantitative or mixed methods. One of the best parts of being a mixed methods researcher is exposing yourself to criticism by qualitative folks and quantitative folks alike. When it's done well, the benefits of mixed methods research outweigh the costs, by a large margin.
Dear Emcet, thanks for your comment and your interest in the blog post. I agree with your main point. My interpretation and use of the word ‘mystery’ with regard to mixed methods approaches does not come so much from a perceived lack of clarity in the literature or lack of guidance on the topic as such, but from the fact that many evaluators (and policy-oriented researchers) have been trained in a particular discipline and methodological tradition which tends to frame their thinking and practice. Methodological specialization is an important skill which is needed in practice. At the same time, we need sufficient evaluators (and policy-oriented researchers) who have a broad understanding of different designs and methods and the capacity to use and integrate them in logical and efficient ways.
An excellent, clarifying article Jos, thank you. I sometimes observe that all the evaluations we do are 'mixed methods' to some degree, in that they capture a number of the purposes you outline. However, evaluations often simply mention that mixed methods were used and stop there, without explicitly mentioning the inter-linkages between the different methods adopted and how they are used in combination to resolve a particular evaluation question.
Dear Garrett. I fully agree. Thanks for your interest in the topic.
Great piece! Many points raised ring true. But sweeping, unevidenced claims and false equivalences do not change the reality that all of these charges apply equally to the quantitative and the qualitative traditions, putting all three on an equal footing when it comes to "flying high shrouded in mystery". Just a 'pari passu' thing.
Dear Romeo, that is a valid point. A key challenge in mixed methods designs (one could argue that most methodological designs are to some extent mixed methods designs) is about how we select, integrate and sequence methods. This challenge is further compounded by methodological challenges that relate to specific (qualitative or quantitative) methods. Thanks for your comment.
The mixed methods paradigm is in a cul-de-sac because of one tiny problem: it is usually assumed that it needs to produce scalar results with a uniform message for complex questions, like the answer 42 in The Hitchhiker's Guide. I think the challenge is different: not only are the methods mixed, but the results are mixed as well, so the methodological challenge is upgraded to how to link mixed results in a consistent interpretation. I think the bridge between the production of mixed results with a mixed approach in science, and their understanding, is evaluation, which makes sense of contradictory meanings.
Interesting point, Boran. Thanks. If I understand you correctly, you highlight the role of evaluation as a sense-making exercise of different types of evidence of different quality (and the relation with the different types of mixed methods approaches that generate the evidence). I would certainly agree with that.
Hi Jos, more on this is available in two papers (on evaluation synthesis and on territorial cohesion), which are available for comment on request sent to bradej@gmail.com.
I'm not sure that I have seen widespread misuse of methodologies purporting to mix methods, but I do see limited use of the possibilities that doing so opens up to researchers. I like Gerald Midgley's term 'creative mix' of methods, alluding to mixed methods going beyond a simple qual/quant distinction. In work with my colleagues Ellen Lewis (Uni of Hull, UK) and Shravanti Reddy (UN Women), we encourage the use of transdisciplinary mixed methods as a way to bring to the methodological design a genuine and meaningful engagement with participants and stakeholders. Such efforts can enhance the rigor and credibility of evaluation and research. It is risky for those schooled in positivist methods. It may also be more expensive to conduct, but this is traded against obtaining outcomes, even impacts, that are longer-lasting and inspire greater confidence in the study. (Forthcoming guidance: Inclusive Systemic Evaluation for Gender, Environments and Voices from the Margins, #ISE4GEMs)
Thank you, Anne, for sharing an interesting example of a mixed methods approach in evaluation. There are, as you probably know, numerous interesting examples out there, often involving academics. Many of those could in a way be characterized as ‘boutique’ evaluations rather than being representative of large strands of institutionalized evaluation work. My points largely relate to the latter.
I came across this blog while googling for mixed methods with a focus on gender. The blog as well as the participants' comments were informative and made me wonder if we should have an additional advisory expert on our panel of co-authors. We (an authors' consortium) are writing a review, 'The effects of transport infrastructure and logistics services interventions for increasing women participation in formal labor markets in low and middle-income countries'. In our review methodology, we plan to use a mixed methods/triangulation approach. I am wondering if you or anyone from your team would be interested in assisting us in an advisory capacity.
Thanks Manisha for your interest in this blog post. Unfortunately, we do not have the time to take on the role you suggested, but I am happy to have a conversation with you on using mixed methods in a (systematic) review study.
Thank you, Jos Vaessen, for responding to my post. I understand the time constraint and appreciate your offer to have a conversation on using mixed methods in our systematic review (SR). This is my third SR; however, I am using mixed methods in an SR for the first time. I have knowledge of mixed methods in qualitative research, and two co-authors have expertise in qualitative methods (thematic synthesis). Nevertheless, as the lead co-author I feel the need to have at least one senior expert who could guide and critique our approach in the protocol. That said, I am not sure if you would be open to discussing our approach in the conversation. If so, please let me know.
Excellent clarifying article. Even for those who are trained in Mixed Methods, it is good to be reminded of the core principles and purposes. Thanks for sharing.
Thank you for your interest in this blog post, Courtney.
If you would like to read more about mixed methods, Jennifer Greene has written a wonderful book on the topic.
https://www.amazon.com/Methods-Social-Inquiry-Jennifer-Greene/dp/078798…
Dear Denise, I agree that this is a helpful and insightful textbook. Thanks for your interest in this blog post.
Thank you very much, Jos, for your thoughtful piece. It's quite true that "mixed methods" can all too easily become an empty mantra, at any stage of an evaluation but particularly in design. I have found that many colleagues, including clients as well as fellow evaluators, are increasingly skeptical of such recitations, and do look beyond the first-round language in search of just how complementarity or expansion (for example) might work in a particular evaluation design. I've found that once I dig into the options and the steps for getting to "something more" it is not necessarily so simple or straightforward (the data don't always behave as we would like them to). It can be a real challenge but doing "deep mixed methods" can reap real payoffs if the effort is invested wisely. Thanks again for your contribution.
Dear Jim, I couldn’t agree more. Thanks for sharing.
Your first list of three reasons scrapes the top of the lack-of-capacity iceberg. You, me, and those who read this are in something of an ivory tower: outside is the vast mass of development sector workers. They typically administer small projects, often for INGOs, with an impact evaluation budget of maybe $20,000. They draft the TORs from previous templates. They copy out the OECD DAC criteria, and then add 40+ more questions. Colleagues might add more questions. They say do "mixed methods", specifying KIIs, FGDs, and a quantitative survey. They do not know or understand any other methods. They do not understand control groups, counterfactuals, etc. My company, Mekong Economics Ltd., does 10+ of these meaningless evaluations every year. Help. We have to write proposals that respond to TORs, or we do not win jobs. Can the Bank pop out of the academic ivory tower and think through this dull vocational problem? Training is an obvious solution, but also why not encourage such organisations to put out a TOR that just says: "We have $20,000 and these five key evaluation questions - please propose your approach"?
Dear Adam, thanks for your remarks. As advocates for evaluation (and quality evaluation), I think we should always strive to engage in conversations with colleagues across the globe on how to design meaningful evaluations. Training is important, and we are doing this through multiple channels. At the same time, we need to engage in conversations with governmental, non-governmental and private sector organizations on sensible evaluation strategies. Not everything needs to be evaluated all the time. Strategic selectivity and sensible scoping, in line with the needs for evaluative evidence and available resources, are very important in this regard.