Volume 16, No. 3, Art. 18 – September 2015



Review:

Felix Anderl

Andrew Bennett & Jeffrey T. Checkel (Eds.) (2015). Process Tracing: From Metaphor to Analytic Tool. Cambridge: Cambridge University Press; 329 pages; ISBN: 978-1107044524; €80.89

Abstract: In this review, I argue that this textbook edited by BENNETT and CHECKEL is exceptionally valuable in at least four respects. First, with regard to form, the editors provide a paragon of how an edited volume should look: well-connected articles "speak to" and build on each other. The contributors refer to and grapple with the editors' theoretical framework, while the editors, in turn, give heed to the contributors' conclusions. Second, the book is packed with examples from research practice. These are not merely named but thoroughly discussed and evaluated for their methodological potential in all chapters. Third, the book aims at improving and popularizing process tracing, but does not shy away from systematically considering the approach's potential weaknesses. Fourth, the book combines and bridges various approaches to (mostly) qualitative methods and still manages to provide abstract and easily accessible standards for conducting "good" process tracing. As such, it is a must-read for scholars working with qualitative methods. However, BENNETT and CHECKEL struggle to fulfill their promise of bridging positivist and interpretive approaches: while they do indeed take the latter into account, their general research framework remains largely unchanged by these considerations. On these grounds, I argue that, especially for scholars in the positivist camp, the book can function as a "how-to" guide for designing and implementing research. Although this may not apply equally to interpretive researchers, the book is still a treasure chest for them, providing countless conceptual clarifications and warnings about potential pitfalls of process tracing practice.

Key words: process tracing; research design; causal mechanisms; textbook; international relations; positivist approaches; interpretive approaches

Table of Contents

1. From Correlation to Mechanism in Social Inquiry

1.1 A Mechanism-based research program for a discipline addicted to causation

1.2 What is "good" process tracing? The editors' offer

2. The Book's Contributions

3. Takeaways, Limits, and Remaining Questions

References

Author

Citation

 

1. From Correlation to Mechanism in Social Inquiry

Empirical political science has in the last few decades been dominated by a hunt for correlation and causation. Despite all the critique, mono-causal explanations of empirical "facts out there" have remained the gold standard of good inquiry. This narrow focus is crumbling, however. This is not due to a general insight that would challenge the concept of causality. Rather, scholars from various camps have been noting that in many constellations, the causal claim itself (A causes B) is not enough to understand an empirical puzzle. Often, it is much more interesting to know how A causes B—that is, on the basis of which mechanisms. This insight underlies a number of studies that have shifted their attention to this mode of analysis, among which the most prominent are probably DEITELHOFF (2006), GUZZINI (2012), MORAVCSIK (1998), SCHIMMELFENNIG (2003), and TANNENWALD (2007). Furthermore, a growing number of social scientists have been putting time into conceptualizing causal mechanisms in social inquiry (see for instance ELSTER, 1998; FRIEDRICHS & NONNENMACHER, 2014; MAYNTZ, 2004). Finally, methodologists have recently been trying to come to terms with this "new" way of approaching the social by formalizing approaches to process tracing (BEACH & PEDERSEN, 2013; ROHLFING, 2013). BENNETT and CHECKEL provide an outstanding textbook that delivers on the task of unifying these attempts, providing yardsticks with which the scholarly community is now in a better position to evaluate a process tracing study. As such, it can be read as a continuation of KING, KEOHANE and VERBA's (1994) "Designing Social Inquiry" and VAN EVERA's (1997) "Guide to Methods for Students of Political Science." [1]

1.1 A Mechanism-based research program for a discipline addicted to causation

Naturally, historians have been focusing on causal processes for a long time (see ELMAN & ELMAN, 2001; TILLY, 2001). The "new" focus on mechanisms is a progressive development within a largely deductive and scientific realist environment. It is mainly directed at students of International Relations (IR) and comparative politics, where process-oriented explanations have only recently been fully established. The book thus mainstreams a mechanism-based research program within a discipline addicted to causation. [2]

Even in this field, however, mixing methods and bridging research schools is no longer revolutionary. As BURZAN (2015, §2) has argued in this journal, combining seemingly opposed research practices is possible and indeed has longstanding historical roots, such as in "mixed methods" or in the concept of triangulation. Process tracing is probably the most promising approach for such an undertaking, for it provides a research framework open enough for various methodologies. However, as BENNETT and CHECKEL rightly argue, this framework has long remained a "metaphor" rather than becoming an analytic tool. The next section elucidates their take on "good" process tracing. [3]

1.2 What is "good" process tracing? The editors' offer

While BEACH and PEDERSEN (2013) in their competing textbook work with three forms of process tracing (theory testing, theory generating, and outcome explaining), BENNETT and CHECKEL offer two approaches to process tracing: inductive and deductive. However, in contrast to BEACH and PEDERSEN, they do not illustrate differing research routes and logics. Instead, one definition of process tracing is offered, under which both approaches should be categorized and whose standards both should live up to. They define process tracing as "the analysis of evidence on processes, sequences, and conjunctures of events within a case for the purposes of either developing or testing hypotheses about causal mechanisms that might causally explain the case" (BENNETT & CHECKEL, p.7). [4]

They then go on to define their understanding of "cases" and "within-case" evidence and connect it with a discussion of the philosophy of science. Generalizability is set up as the aim: "The development of cumulable social science theory and the theoretical explanation of individual cases are—or, rather, should be—the central goals of process tracing" (p.260). In a footnote, they "urge recognition that traditional positivism is inadequate for dealing with concepts such as mechanisms and techniques like process tracing" (p.21). Efforts are successfully made to accommodate various research schools, subscribing to an agenda that one could term reflective positivism. It is therefore not surprising that their process tracing is especially relevant to conventional constructivists. Although the editors make genuine attempts to build bridges, these only highlight how much more difficult it is to integrate interpretive and post-structuralist approaches. This becomes even more apparent when the discussion of interpretivism is followed by a summary of VAN EVERA's (1997) four tests, which can clearly be attributed to the positivist camp. That being said, the quality standards that BENNETT and CHECKEL propose are applicable to many streams of research, and, as I argue in the following, every scholar could benefit from implementing, or at least considering, the "best practices" they propose. [5]

The book's magic word is "equifinality." It is discussed in numerous contexts, and rightly so, as failing to address it properly has probably been the biggest weakness of prior process tracing studies. Reconstructing a process may quickly lead to a "storytelling" mode that does not consider alternative paths that may have led to the particular outcome—a problem that unquestionably haunts small-n researchers. BENNETT and CHECKEL, as well as all the contributors, make sure to advocate a sensible approach in this regard. Moreover, they highlight the importance of context, in order not to lose track of what made the hypothesized causal processes possible in the first place.

    • Cast the net widely for alternative explanations

    • Be equally tough on the alternative explanations

    • Consider the potential biases of evidentiary sources

    • Take into account whether the case is most or least likely for alternative explanations

    • Make a justifiable decision on when to start

    • Be relentless in gathering diverse and relevant evidence, but make a justifiable decision on when to stop

    • Combine process tracing with case comparisons when useful for the research goal and feasible

    • Be open to inductive insights

    • Use deduction to ask: "if my explanation is true, what will be the specific process leading to the outcome?"

    • Remember that conclusive process tracing is good, but not all good process tracing is conclusive.

    Table 1: Process tracing best practices (BENNETT & CHECKEL, p.21) [6]

    Their ten good practices (see Table 1) are organized around these thoughts. This becomes apparent in their first and second suggestions: to "cast the net widely for alternative explanations" (p.23) and to be equally tough on them. The third suggestion, to consider source bias, is a commonplace for every historian. For students of political science, however, it is a major risk to take the opinion of a source as equivalent to an explanation. The fourth suggestion, to consider whether the case is most or least likely for alternative explanations, is a direct follow-up of classical comparative study designs. Together with suggestion seven, combining process tracing with case comparisons, it highlights that process tracing can be combined with other methods and should be used accordingly. That is to say, the editors propose a case logic here, suggesting that the reconstruction of a particular process should be seen as one in a class of cases, rather than a process standing by itself. It is only by applying these standards that researchers can live up to the cumulable social science that the editors long for. Suggestions five and six concern time: what moment can be set as the starting point of a certain process, and how much data do we have to gather in order to legitimately derive inferences from it? These questions should not be underestimated, and considering them is of critical importance. The editors are, however, not in a position to offer a general framework for choosing a timeframe. When a case starts and stops will always be a matter of debate, and the best advice for researchers is to reflect properly on these issues. [7]

    Suggestions eight and nine are related to inductive and deductive approaches. The standard procedure that BENNETT and CHECKEL have in mind is of a deductive kind, testing a specific explanation with process tracing. The advancement in comparison to traditional testing designs comes with a specification of the process ("possible facts and sequences within a case that should be true," p.30). Process tracing thus allows for much more fine-grained testing of observable implications that should be specified in advance and tested successively. This step entails the biggest promise of process tracing: not only to determine whether an explanation is true/untrue, but to test how it is true/became true. The authors make it clear, however, that one should not wear blinkers within this testing logic. On the contrary, they advocate being "open to inductive insights" (p.29) and reporting them transparently. The last suggestion holds that "conclusive process tracing is good, but not all good process tracing is conclusive" (p.30). Again, this should be read as a call for transparency rather than as a methodological step in its own right. If confidence in a hypothesis is low, we as scholars should point that out. [8]

    BENNETT and CHECKEL finally advocate four goals: first, they ask for a systematization of process tracing practices which, as they say, "only asks researchers to do systematically and explicitly what they had to be doing implicitly if their process tracing could legitimately claim to have reached justifiable inferences" (p.267). Second, they aspire to collectively agreed community standards. Third, they hope that inductive "soaking and poking" approaches will not be denigrated in the course of this process. However (and this may be the core message of the book), induction does not mean that "anything goes": "storytelling" is at various points held up as the negative example of process tracing. Fourth, they ask researchers not to end up with "dry, hard-to-read empirics" (p.268), but instead to guide the reader through well-written and systematic case studies. [9]

    It is a strength of the editors' good practices to be open to many streams of research while at the same time excluding some practices specifically. The latter applies mainly to studies that do not take the problem of equifinality into account, do not discuss the external validity of their study, have no good selection criteria, do not specify observable implications, and refuse to take up alternative explanations or non-hypothesized insights. Although some reflectivist scholars may quarrel with the case logic of these good practices, almost every "good" piece of research should be held accountable to these standards. This is also what the volume is able to show throughout the following contributions. [10]

    2. The Book's Contributions

    In this section, I briefly discuss various chapters of the book that illustrate the breadth of the volume. It is a special strength of the book that all of the chapters take up the best practices outlined above, while the editors discuss in their conclusion how successful this was. All contributors make extensive use of prior research, either their own or others', and discuss the applicability of the process tracing good practices. This provides the reader with an excellent overview of the diverse process tracing approaches. Due to the transparent way in which the contributions are organized, it also becomes apparent where the good practices can simply be adopted and for which fields of study one would need more specialized standards. [11]

    The first contribution of the book is the article "Process-tracing the effects of ideas" by Alan M. JACOBS. He focuses on situations in which ideas have effects both on a micro scale, by influencing individual cognition and decisions, and on macro-level patterns of behavior and interaction. He defines an ideational theory as a "causal theory in which the content of a cognitive structure influences actors' responses to a choice situation" (p.43), where this cognitive structure must not be fully endogenous to material features. He argues that ideational process tracing will necessitate an expansive empirical scope, for a "narrow focus on critical choice points will rarely be sufficient for distinguishing ideational from alternative explanations" (p.41). In this regard, he strongly differs from the contribution by Frank SCHIMMELFENNIG, whose take on "efficient process tracing" "maximizes analytical leverage in relation to the invested resources" (p.100). With this procedure, SCHIMMELFENNIG wants to "avoid three major problems of the method: the potential waste of resources, the temptation of storytelling, and the lack of generalizability" (p.101). The "infinite regress" about which he warns process tracers results from scholars never quite knowing "whether we have soaked and poked enough" (p.102). While JACOBS largely endorses the ten best practices, SCHIMMELFENNIG (p.108) offers extensions for the first eight. The eighth point—"be open to inductive insights"—is supplemented by a "Yes, if theoretically specified causal mechanisms fail to explain the case" (ibid.). SCHIMMELFENNIG hence distances himself from the editors' choice to put inductive and deductive approaches on a level playing field, or even to understand them as two necessary parts of a single research undertaking. For SCHIMMELFENNIG, induction is in order only if deduction has failed. [12]

    David WALDNER also subscribes to a deductive framework. His article mainly deals with a "completeness standard" that he seeks to develop by suggesting four conditions for proving causality with process tracing. He aims at solving the exogeneity problem, which he correctly sees as a major obstacle for process tracing. Among other things, WALDNER holds that a complete process tracing needs "a causal graph whose individual nodes are connected in such a way that they are jointly sufficient for the outcome" (p.128). Every step in this causal graph must be predicted. With extensive discussions of prior process tracings, he outlines his conception and also gives a convincing example of what a causal argument should not look like, namely if it does not "establish reasons to order the nodes as relations of causal dependence" (p.149). This may also apply if the descriptive evidence appears convincing. [13]

    With this approach, he breaks with the editors' best practices. So does Vincent POULIOT, although for completely different reasons. Of all the approaches in the volume, his call for "practice tracing" has the strongest interpretive character, combining his well-known practice approach with the process tracing standards developed in the volume. He endorses the editors' proposal to bridge or cross-fertilize process tracing and the interpretivist literature due to their "many substantive commonalities" (p.237). However, as he is quick to point out, for the latter the "singularity of causal accounts" (ibid.) is important. His attempt to build common ground rests on the conviction that "no social relationships and practices are so unique as to foreclose the possibility of theorization and categorization" (ibid.). His concept therefore upholds singular causality and analytical generality simultaneously. Despite these worthwhile attempts to build bridges between the paradigms, POULIOT cannot accept all of the ten "good practices." As opposed to BENNETT and CHECKEL, he argues for local causality instead of generalizations inferred from deductive models. For this reason, his contribution is probably the most interesting, as it is highly productive in grappling with the editors' framework. In effect, POULIOT fully accepts criteria three, five, six and eight, while "reinterpreting" the remaining six. As with all the other contributions, he discusses a range of existing literature and develops his criteria while assessing the quality of these works. For readers, this is a pleasure, because they can learn by reconstructing meaningful empirical studies and make up their own minds about the authors' conclusions, instead of simply being told rules to follow, as in many other methodological books. [14]

    The editors' conclusion structures the individual contributions and considers their caveats and evaluations of the original framework. Contentious points are deliberately taken up, as well as ones that may show the weaknesses of process tracing. The individual contributions' value could only be adumbrated in this review. It is their rich content that makes the volume such a valuable resource: they shed light on research practices from numerous fields of study, from civil war (Jason LYALL) to international institutions (Jeffrey T. CHECKEL), and explain historical cases such as the Cold War (Matthew EVANGELISTA). Methodological problems that otherwise seem very dry are vividly explained and illustrated, such as that of over-determination in CHECKEL's article, or the ideological roots of decisions which, according to EVANGELISTA, should also be traced in order to really understand a process. [15]

    3. Takeaways, Limits, and Remaining Questions

    BENNETT and CHECKEL and the contributors provide an outstanding textbook with this volume. It is suggested reading for all students of political science, especially those working with reconstructive or other small-n methods. The systematization of process tracing studies is paramount, and this book is a step in that direction. Comparing earlier studies, such as the famous one by MORAVCSIK (1998), with the current state of the art, progress is indeed observable: a scholar nowadays has to deliver much more distinct justifications for why one causal node led to another. Such a justification must always consider alternative explanations and address the problem of equifinality. Furthermore, the book suggests that techniques of generalization should be applied. This is also true for the area of mid-range theory, which is probably the level of theorizing where process tracing is most frequently drawn upon. The danger of storytelling can be circumvented with various methods. Depending on the issue area as well as the stance on the philosophy of science, the contributions of the book offer a wealth of possible approaches. That is to say, the plurality of the discipline does not necessarily suffer from a conversation on common quality indicators. [16]

    The editors make pronounced efforts to be open towards constructivist studies in their book and go to great pains to integrate these approaches into a broad framework that allows for materialist and ideational studies, as well as deductive and inductive methodologies. This works well with conventional constructivism (see RISSE, ROPP & SIKKINK, 1999 for such an approach). More radical interpretive approaches are, as they admit themselves, "more challenging to reconcile" with their guidelines (p.15). JACOBS's contribution stands for the first pillar, and it is by now beyond question that ideas and norms are important and can be integrated into a research framework that sanctions causality. However, although POULIOT makes an effort in his chapter to reconcile the approach with a stronger reflectivist stance, he has to depart significantly from the editors' "best practices" in order to achieve his goal of "practice tracing." For scholars in a post-structuralist tradition, the best practices may prove rather unhelpful, especially their underlying praise of deduction and the "case" logic which they apply to empirical constellations. Nevertheless, even for scholars in this tradition, the book may work as a guide for systematizing their research. The danger of being accused of "storytelling" is especially high when reconstructing a single case that stands for itself rather than as a representation of something bigger, and it is with some justification that the editors warn of such a procedure. The tools offered here can thus benefit every researcher, irrespective of whether her question is "How does A cause B?" or "Why did constellation A receive traits B and C when facing discourse D?" [17]

    Beyond the actual information that the individual contributions convey, the format of the edited volume is also more readable and practical than that of BEACH and PEDERSEN (2013). The latter is a rather technical reference book, and the main distinction of BENNETT and CHECKEL's volume is that it stays very close to research practice and offers insights from the contributors' experiences of applying process tracing as an analytic tool. Some of the advice given in the book is, in fact, much more general than one would expect from a book about process tracing. It seems that, regardless of the method one applies, "quality" comes with the upholding of very basic scholarly values, as illustrated by the insistence on transparent procedures in Thad DUNNING's contribution. At times this makes for lengthy reading, but it also means that the volume can be read as a general introduction to reconstructive social science practice. To summarize, this is simultaneously the virtue and the problem of process tracing as an approach: it encompasses so many different paradigms and methods that it can spur a conversation on what good research practice means in general, but it is also so broad a category that one struggles to formulate rules for adequate procedure. The editors' attempt to explicate ten "good practices," followed by powerful discussions of these, is therefore an important intervention in a field in which process tracing is becoming ever more popular. [18]

    References

    Beach, Derek & Pedersen, Rasmus B. (2013). Process-tracing methods: Foundations and guidelines. Ann Arbor, MI: University of Michigan Press.

    Burzan, Nicole (2015). Rezension: Udo Kuckartz (2014). Mixed Methods. Methodologie, Forschungsdesigns und Analyseverfahren. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 16(1), Art. 16. http://nbn-resolving.de/urn:nbn:de:0114-fqs1501160 [Date of access: April 5, 2015].

    Deitelhoff, Nicole (2006). Überzeugung in der Politik. Grundzüge einer Diskurstheorie internationalen Regierens. Frankfurt/M.: Suhrkamp.

    Elman, Colin & Elman, Miriam F. (2001). Bridges and boundaries. Historians, political scientists and the study of international relations. Cambridge, MA: MIT Press.

    Elster, Jon (1998). A plea for mechanisms. In Peter Hedström & Richard Swedberg (Eds.), Social mechanisms. An analytical approach to social theory (pp.45-73). Cambridge: Cambridge University Press.

    Friedrichs, Jürgen & Nonnenmacher, Alexandra (Eds.) (2014). Soziale Kontexte und Soziale Mechanismen. Wiesbaden: Springer.

    Guzzini, Stefano (Ed.) (2012). The return of geopolitics in Europe? Social mechanisms and foreign policy identity crises. Cambridge: Cambridge University Press.

    King, Gary; Keohane, Robert O. & Verba, Sidney (1994). Designing social inquiry: Scientific inference in qualitative research. Princeton, NJ: Princeton University Press.

    Mayntz, Renate (2004). Mechanisms in the analysis of macro-social phenomena. Philosophy of the Social Sciences, 34(2), 237-295.

    Moravcsik, Andrew (1998). The choice for Europe. Social purpose and state power from Messina to Maastricht. Ithaca, NY: Cornell University Press.

    Risse, Thomas; Ropp, Stephen C. & Sikkink, Kathryn (Eds.) (1999). The power of human rights. International norms and domestic change. Cambridge: Cambridge University Press.

    Rohlfing, Ingo (2013). Comparative hypothesis testing via process tracing. Sociological Methods and Research, 43(4), 606-642.

    Schimmelfennig, Frank (2003). The EU, NATO and the integration of Europe. Rules and rhetoric. Cambridge: Cambridge University Press.

    Tannenwald, Nina (2007). The nuclear taboo: The United States and the non-use of nuclear weapons since 1945. Cambridge: Cambridge University Press.

    Tilly, Charles (2001). Mechanisms in political processes. Annual Review of Political Science, 4, 21-41.

    Van Evera, Stephen (1997). Guide to methods for students of political science. Ithaca, NY: Cornell University Press.

    Author

    Felix ANDERL is a research associate at the chair of International Relations and Theories of Global Orders, Goethe University Frankfurt. He works in a DFG-funded project on the global justice movement and currently writes his PhD on the World Bank's reactions to protest.

    Contact:

    Felix Anderl

    Goethe Universität Frankfurt
    FB 03: Institut für Politikwissenschaft
    Exzellenzcluster "Die Herausbildung Normativer Ordnungen"
    Max-Horkheimer-Straße 2
    60629 Frankfurt am Main, Germany

    Tel.: +49 (0)69 798-36527

    E-mail: Anderl@soz.uni-frankfurt.de
    URL: http://www.fb03.uni-frankfurt.de/51220314/Anderl

    Citation

    Anderl, Felix (2015). Review: Andrew Bennett & Jeffrey T. Checkel (Eds.) (2015). Process Tracing: From Metaphor to Analytic Tool [18 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 16(3), Art.18,
    http://nbn-resolving.de/urn:nbn:de:0114-fqs1503187.

    Forum Qualitative Sozialforschung / Forum: Qualitative Social Research (FQS)

    ISSN 1438-5627


    Creative Commons Attribution 4.0 International License