Volume 7, No. 2, Art. 30 – March 2006

Review:

Matthias Catón

Henry E. Brady & David Collier (Eds.) (2004). Rethinking Social Inquiry: Diverse Tools, Shared Standards. Lanham, Md.: Rowman & Littlefield, 362 pages, ISBN 0-7425-1126-X, USD 27.95

Abstract: The book Rethinking Social Inquiry, edited by Henry E. BRADY and David COLLIER, is a response to a book by KING, KEOHANE and VERBA (1994) that aimed to introduce quantitative standards to qualitative research. The authors of the book reviewed here criticize many of the suggestions made in that earlier book, arguing that qualitative research requires different tools. Nevertheless, they agree that the foundations of research design are similar. The book comprises a comprehensive critique of mainstream quantitative techniques, describes a set of qualitative tools for research, and addresses how to combine qualitative and quantitative approaches to maximize analytical leverage. It is an excellent contribution to the methodological debate in the social sciences.

Key words: methods, methodology, quantitative versus qualitative, inference, research design

Table of Contents

1. The Quest for Methodology in Political Science

2. Rethinking Social Inquiry—Structure and Scope of the Book

3. Contributions to the Methodological Debate

References

Author

Citation

 

1. The Quest for Methodology in Political Science

Political science has often been accused of lacking a methodology of its own and merely borrowing from neighboring disciplines. In a way this may be true, given the wide variety of approaches used: at one extreme there are quantitative researchers applying cutting-edge statistical tools such as those used in biostatistics or econometric models. At the other end of the scale are qualitative researchers using methods from history or anthropology. Sociology has always had a major impact on political science, and while some scholars see political science as a peripheral area of law, others are inspired by psychology. [1]

Methodological debates between quantitative and qualitative researchers are common in other disciplines, too (see for example ROST 2003 or ONWUEGBUZIE & LEECH 2005 on this topic), but it may be precisely because they perceive their discipline as lacking a methodology of its own that scholars from different methodological currents are particularly intransigent in political science. The book reviewed here is part of this debate, but what makes it special is that it tries to bridge this divide. In fact, it is not an independent work but an answer to another book, Designing Social Inquiry by Gary KING, Robert KEOHANE and Sidney VERBA, published in 1994 (hereafter referred to as DSI). In that book, which received considerable attention, KING et al. try to adapt principles of quantitative research for qualitative approaches. Writing from a quantitative perspective, the authors suggest how qualitative researchers could improve their findings by adopting certain principles derived from statistical analysis. [2]

Alongside approval for its concise and clear language and its important insights, DSI also attracted considerable criticism, because many qualitative researchers felt that the peculiarities of their approaches were not fully understood by the authors and that the advantages qualitative work can have over quantitative work were not sufficiently taken into account. [3]

2. Rethinking Social Inquiry—Structure and Scope of the Book

BRADY and COLLIER’s book was explicitly written as an answer to KING et al. and makes frequent references to DSI throughout its chapters. The editors start from the premise that qualitative and quantitative research methods each have their own merits and their own problems, and that neither is therefore simply better. Each school, they argue, could learn from the other. [4]

The book is organized as follows: an introductory part with two chapters is followed by a part with several chapters addressing specific points of critique of DSI's proposals. Part III describes and assesses qualitative tools and approaches. Parts IV and V deal with bridging qualitative and quantitative research methods, in accordance with the book's subtitle Diverse Tools, Shared Standards. The book also contains a chapter on the 2000 US presidential election, which serves as an example of how to apply the different approaches described throughout. A comprehensive glossary of key methodological terms concludes the volume. [5]

After an overview in chapter 1, COLLIER, SEAWRIGHT and MUNCK provide a detailed description of DSI's main assertions on how to conduct research. They cover theory formulation, case selection, descriptive and causal inference, and the drawing of valid conclusions. At the end of the chapter, they distill DSI's recommendations into 35 rules or guidelines. For example, researchers should limit the number of variables but increase the number of cases; they should avoid the bias introduced by selecting on the dependent variable; they should not choose sets of cases in which either the dependent or the independent variable is constant; and they should estimate measurement error. The chapter is valuable for readers who have not read DSI, giving them an idea of that book's positions, but it is also an excellent synopsis of the fundamentals of quantitative research. [6]

The next four chapters address specific points of DSI. BRADY, himself a quantitative researcher, criticizes deficiencies in DSI's treatment of concept formation and measurement, two areas he rightly considers crucial in any research endeavor and which often pose the biggest problems in political science, especially when dealing with abstract concepts such as democracy: how do we define it, and how do we measure it? BARTELS detects "unfulfilled promises of quantitative imperialism," meaning that much of the advice given by quantitative researchers is inappropriate for qualitative research and may also be problematic in quantitative research itself. ROGOWSKI deals with the problem of biased case selection, frequently seen as paramount by quantitative researchers. He claims that the analysis of abnormal or deviant cases can shed light on underlying causal mechanisms and can help both to establish new theories and to refute existing ones. COLLIER, MAHONEY and SEAWRIGHT also deal with selection bias. They argue that while selecting cases with extreme values on the dependent variable does indeed seriously distort regression analysis, it is much less of a problem in qualitative research, especially in within-case analysis. [7]
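
To see why quantitative researchers worry about this, consider a minimal simulation sketch (not from the book, added here purely for illustration and assuming Python with NumPy): when cases are retained only if the dependent variable exceeds a threshold, the estimated regression slope is pulled toward zero even though the true relationship is unchanged.

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate a simple linear relationship: y = 2x + noise.
    x = rng.normal(size=5000)
    y = 2 * x + rng.normal(size=5000)

    # OLS slope on the full sample (close to the true value of 2).
    slope_full = np.polyfit(x, y, 1)[0]

    # "Selecting on the dependent variable": keep only high-y cases.
    mask = y > 1.0
    slope_truncated = np.polyfit(x[mask], y[mask], 1)[0]

    print(f"slope, all cases:         {slope_full:.2f}")
    print(f"slope, only high-y cases: {slope_truncated:.2f}")

The truncated sample yields a noticeably smaller slope, which is the distortion the authors have in mind; their point is that within-case qualitative analysis does not rely on this kind of cross-case covariation and is therefore less vulnerable to it.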

The next four chapters describe how qualitative researchers normally proceed in their work. MUNCK addresses how to define the universe of cases, how to select appropriate cases from this universe, how to carry out measurement, and how to draw causal conclusions from the data. Finally, he discusses a distinctive feature of qualitative research that is often unjustly criticized by quantitative researchers: the refinement of initial hypotheses during the analysis through iteration, meaning that the theory is adapted after looking at the data and then tested against the same data. [8]

In his chapter, Charles RAGIN "turns the tables," as he puts it: instead of dealing with perceived deficiencies of qualitative methods from the quantitative point of view, he starts from the perspective of a qualitative researcher and points to situations that quantitative analysis cannot handle properly. Qualitative research can analyze complex causality in which multiple factors work in combination rather than additively, and it can deal with critical junctures, an aspect TARROW also addresses (see below). Qualitative analysis can also address nonconforming cases more easily and explain why they fall outside the general causal pattern. RAGIN stresses the point made by MUNCK about going back and forth between theory and data: "The reciprocal clarification of empirical categories is one of the central concerns of qualitative research" (p.126). According to RAGIN, it is not always possible to follow DSI's suggestion to test refined theories against new data: when countries are used as cases, as is often done in political science, their number is limited, so the advice may simply not be feasible. Qualitative research can also deal with a fact that is frequently ignored by quantitative researchers: there is often more than one causal pathway to the same outcome, a situation RAGIN calls multiple conjunctural causation. When this happens, the condition of causal homogeneity, which is central to statistical analysis, is not met. [9]
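
A toy sketch (again not from the book; the rule below is invented for illustration) may make the idea of multiple conjunctural causation concrete: the outcome occurs through two distinct combinations of conditions, so the "effect" of any single condition depends on which pathway a case belongs to, and a single additive model misrepresents it.

    from itertools import product

    def outcome(a, b, c):
        # Hypothetical rule with two causal pathways, each a conjunction:
        # the outcome occurs if A and B are both present, OR if C is
        # present while A is absent.
        return (a and b) or (c and not a)

    print("A B C -> outcome")
    for a, b, c in product([0, 1], repeat=3):
        print(a, b, c, "->", int(outcome(a, b, c)))

In the resulting table, condition A helps produce the outcome on one pathway and blocks it on the other, which is exactly the kind of causal heterogeneity that a single regression coefficient cannot represent.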

The value of case studies and the logic behind them are treated by McKEOWN in his chapter. He starts with a discussion of the logic of inference, mainly the positivist POPPERian (1969) approach used by most quantitative researchers, and contrasts it with the reality of many research endeavors: trying to extend existing knowledge of a certain phenomenon rather than moving from total ignorance to perfect knowledge. McKEOWN calls this "folk Bayesian," that is, the judgment of cases based on our prior knowledge. [10]

In the fourth part of the book, two chapters deal with the question of how to bridge qualitative and quantitative methods in order to increase analytical leverage. TARROW describes two situations in which qualitative analysis is clearly superior to quantitative analysis, namely process tracing and the analysis of tipping points. Process tracing looks at chains of events that lead to certain outcomes; tipping points are moments in history that turn out to be crucial for subsequent developments. TARROW then describes three ways to combine qualitative and quantitative designs: qualitative analysis can serve a heuristic function, establishing knowledge of the phenomenon at stake, which is subsequently tested statistically against more cases to check its representativeness. Quantitative analysis can serve as a starting point, giving a first overview of possible causal relationships which are then scrutinized in qualitative analyses. Finally, triangulation is useful to increase inferential leverage when the available information is incomplete: defects can be overcome by looking at the same data through different methodological lenses. [11]

KING, KEOHANE and VERBA focus on the importance of research design. They comment on some of the objections and remarks made in the other chapters of the book, such as the allegation that they concentrate too much on how to evaluate a theory and not enough on how to formulate it. Concerning the critique that they undervalue case studies, they respond that they do value them, because a case study can be seen as a single observation within a larger set of studies by different scholars. In essence, they argue that a single case study might not suffice, but that many case studies in the joint endeavor of science do. They also clarify their standpoint on how to select cases and how to avoid biased selection. Finally, they assess some of the examples of scholarly work given by the other authors of the book. [12]

Two final chapters, both written by COLLIER, BRADY and SEAWRIGHT, wrap up the debate. In addition to dealing with aspects already mentioned, they discuss the difference between probabilistic and deterministic causation and the different meanings these concepts have in qualitative and quantitative work. They address the fundamental question of trade-offs between different research goals and deal with some key aspects that differentiate qualitative and quantitative research. [13]

3. Contributions to the Methodological Debate

The authors deal with many aspects of methodology. The main areas in which quantitative and qualitative researchers still disagree seem to be: (a) how many cases are necessary and how they should be selected; (b) how theories are established and how they are tested and refined; and (c) whether researchers should use a predefined set of variables or rely on comprehensive knowledge of cases and contexts. [14]

The book serves a threefold purpose: it gives an excellent brief overview of the fundamentals of quantitative research, including a critique; it describes in detail tools for qualitative research; and it shows how to use both in combination to maximize research results. It would be highly desirable for the methodological debate initiated by KING et al. and carried forward in the book reviewed here to continue. It should eventually be possible to agree on a set of standards while at the same time preserving the advantages of the different approaches. [15]

The aim should be to get the most out of analysis, whether through methodological triangulation or through nested analysis as described by LIEBERMAN (2005). This could be a unique methodological contribution of political science that might transcend the boundaries of the discipline. Perhaps having its own small piece of methodology would render much of the fierce fighting between the different currents unnecessary. It would certainly help the ultimate goal of research in our discipline: explaining social phenomena and processes. After all, methods are only a means and not an end in themselves. [16]

References

King, Gary; Keohane, Robert O. & Verba, Sidney (1994). Designing Social Inquiry. Scientific Inference in Qualitative Research. Princeton: Princeton University Press.

Lieberman, Evan S. (2005). Nested Analysis as a Mixed-Method Strategy for Comparative Research. American Political Science Review, 99(3), 435-452.

Onwuegbuzie, Anthony J. & Leech, Nancy L. (2005). Taking the "Q" Out of Research. Teaching Research Methodology Courses Without the Divide Between Quantitative and Qualitative Paradigms. Quality and Quantity, 39(3), 267-295.

Popper, Karl R. (1969). Die Logik der Sozialwissenschaften. In Theodor W. Adorno; Hans Albert & Ralf Dahrendorf (Eds.), Der Positivismusstreit in der deutschen Soziologie (pp.103-121). Neuwied; Berlin: Luchterhand.

Rost, Jürgen (2003). Zeitgeist und Moden empirischer Analysemethoden [45 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research [On-line Journal], 4(2), Art. 5. Available at: http://www.qualitative-research.net/fqs-texte/2-03/2-03rost-d.htm [Date of access: February 1, 2006]

Author

Matthias CATÓN is a research associate and lecturer in political science at the University of Heidelberg. His main areas of research are electoral systems, party systems and democracy. He is also particularly interested in research methodology.

Contact:

Matthias Catón

University of Heidelberg
Institute of Political Science
Marstallstr. 6
69117 Heidelberg
Germany

E-mail: matthias@caton.de
URL: http://www.caton.de

Citation

Catón, Matthias (2006). Review: Henry E. Brady & David Collier (Eds.) (2004). Rethinking Social Inquiry: Diverse Tools, Shared Standards [16 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 7(2), Art. 30, http://nbn-resolving.de/urn:nbn:de:0114-fqs0602309.

Forum Qualitative Sozialforschung / Forum: Qualitative Social Research (FQS)

ISSN 1438-5627


Creative Commons Attribution 4.0 International License