Volume 2, No. 1, Art. 4 – February 2001

Introduction: On the Compatibility between Qualitative and Quantitative Research Methods

Nigel Fielding & Margrit Schreier

Table of Contents

1. Overview

2. The Contributions in this Volume

3. Approaches to Method Combination: The Triangulation Paradigm

Note

References

Authors

Citation

 

1. Overview

If you happen to be an avid reader of FQS, you may have noticed that the title of this volume has changed since it was first announced as "Qualitative and quantitative methods: How the two research traditions see each other". This first title reflects the original orientation behind the volume: to take a look at qualitative researchers' views of quantitative methods and at quantitative researchers' views of qualitative methods. Is there anything that they value about the "other" tradition, and in what way do they believe that their own methodological orientation might profit from integrating such elements? [1]

As it has turned out, this was obviously an overly optimistic idea, presupposing the existence of researchers on both sides of the methodological divide willing to take an unbiased look both at what they themselves do and what the "others" do in their research practice. Presumably it will not be much of a surprise, either, that qualitative researchers were often quite willing to go along with this idea—whereas it proved to be much more difficult (although not completely impossible: cf. ALISCH) to find quantitative researchers willing to consider that there might be a point to doing things the qualitative way at least some of the time. This situation quite accurately reflects the methodological situation in the social sciences. In many disciplines, the quantitative paradigm is still the dominant one (although there is some within-discipline variation from one country's social and behavioural science community to another). As a consequence, qualitative researchers usually cannot get by without some (sometimes even quite substantial) knowledge of quantitative methods and methodological standards, whereas in several disciplines there is no immediate need for quantitative researchers to "bother" much with qualitative methods. Thus the orientation (and with it the title) of the volume have changed, reflecting the concern of qualitative researchers in particular with the combination of qualitative and quantitative methods. [2]

We have grouped the resulting contributions into three sections. The first and in a sense most "abstract" section comprises papers that are concerned with the logic underlying qualitative and quantitative approaches and the consequences for the inter-relation of method ("The logic of relating qualitative and quantitative method"). It is in this section that topics such as the conceptualisation of triangulation, abductive logic, or questions concerning the reconciliation of positivist and constructivist epistemologies are dealt with. In Section two, papers presenting methodological approaches for inter-relating qualitative and quantitative methods have been assembled ("Different approaches for inter-relating qualitative and quantitative method"). In some cases the proposed methodologies extend over the entire research process; other suggestions for method integration concentrate on one phase of the research process in particular, such as the "initial telephone contact" in survey studies. In Section three, the focus is on the application of the most prominent among such integrative methodological approaches—i.e. triangulation—to actual research practice in different disciplines such as economics, media studies, and sociology ("Innovative applications of methodological inter-relation"). In the following, we will first give an overview of the papers (2.) and then go on to outline the major types of the inter-relation of qualitative and quantitative method exemplified by the contributions (3.). The most notable of these is triangulation, which will therefore be dealt with in some detail. [3]

2. The Contributions in this Volume

Section one on the logic of relating qualitative and quantitative method begins with a contribution that takes us into the very methodological center of the volume. Combining qualitative and quantitative methods is almost by definition an issue of across-method triangulation. Since it was introduced into the social sciences by DENZIN (1970), the term triangulation has become something of a catchphrase. "Triangulation" is now ubiquitous in the methodological literature of the social sciences—and, as is often the case with such ubiquitous terms, its precise meaning has become lost over time. In his contribution, Udo KELLE proceeds to identify the various meanings in which "triangulation"—which he regards as a metaphor rather than a precise concept—has come to be used and to determine which of these meanings is most appropriate for conceptualising the combination of qualitative and quantitative methods. [4]

He distinguishes three meanings or models of triangulation: (1) triangulation as the mutual validation of results obtained on the basis of different methods (the validity model), (2) triangulation as a means toward obtaining a larger, more complete picture of the phenomenon under study (the complementarity model), and (3) triangulation in its original trigonometrical sense, indicating that a combination of methods is necessary in order to gain any (not necessarily a fuller) picture of the relevant phenomenon at all (the trigonometry model). These three models are in turn brought to bear upon the potential relationships between the results yielded by qualitative and quantitative methods employed in the same study. [5]

In order to determine the applicability of these models to the combination of qualitative and quantitative methods, KELLE goes on to examine the results of three mixed-method studies from life course research, thus tying his methodological considerations back to the actual research process. Judging the applicability of different understandings of triangulation, however, is something which from KELLE's point of view should involve not only methodological and epistemological considerations, but also theoretical considerations. KELLE's conclusion that it is the trigonometric model of triangulation which holds the greatest promise for conceptualising the combination of qualitative and quantitative methods is thus a qualified conclusion, holding especially for sociological studies, with their distinction between micro- and macro-level descriptions. KELLE thus clarifies the discourse surrounding triangulation by presenting us with a number of models of triangulation to choose from, and he adds to the grounds on which to make such a choice by drawing our attention to the relevance of theoretical issues—implicitly raising, of course, the question of what such a choice would look like in other disciplines. [6]

Philipp MAYRING starts out from the observation that the call for the combination of qualitative and quantitative methods has become almost a commonplace in methodology textbooks in the social sciences. This call, reasonable as it may be, MAYRING argues, is nevertheless a long way from actual research practice and does little to tell the researcher how exactly such a combination is to be achieved. By suggesting five levels at which qualitative and quantitative methods can be related—ranging from data to the entire research process—MAYRING alerts us to the details which lie behind the global call for the combination of the two paradigms. [7]

In closing, MAYRING turns to a premise of the inter-relation of the two paradigms which is more often than not left implicit: what are the advantages of such an inter-relation? It is especially in the context of his outline for an integrative documentation of the (qualitative or quantitative) research process that MAYRING shows what the two paradigms stand to gain by no longer ignoring each other. In the case of the quantitative paradigm, this is in particular the greater proximity to the research subject, while the qualitative paradigm will profit most by making the various stages of the research process more transparent and systematic, thus increasing the generalisability of the results. [8]

The ontological position of constructivist realism which is at the heart of Gerald CUPCHIK's contribution may strike one—at first sight—as something of a paradox. "Realism" with its implications of a world out there which can be apprehended and known by scientists, is a position which has gone out of fashion in our postmodern times. "Constructivism", on the other hand, carries associations of precisely such a postmodern discourse, suggesting that "the world" is real only to the extent that we make it so, that there are as many worlds, as many "realities" as there are minds to construe them. [9]

In his explication of constructivist realism, CUPCHIK cuts across such dichotomies. His starting point is the assumption that in everyday life, we usually have very little doubt about the reality of events that befall us, our actions and our interactions with others. To the extent that it is precisely these personally and socially relevant realities which constitute the subject matter of the social sciences, the social sciences deal with phenomena which are real—hence "constructivist realism". Yet their reality is not a given, but it is constructed by imbuing the phenomenon in question with meaning—hence "constructivist realism". If this meaning is socially shared, the process of meaning construction will hardly be noticeable; the more discrepant the social realities of two persons, however, the less they will be able to agree upon the reality of a phenomenon. In stressing the importance of the social constitution of meaning, CUPCHIK's position is thus akin to that of social constructionism (cf. for instance GERGEN 1985). [10]

If one starts out from this ontological position, CUPCHIK argues, the competition between qualitative and quantitative research is resolved into complementarity. While researchers from the two paradigms tend to stress either the realist (quantitative) or the constructivist (qualitative) end point, they are in the same position: they both deal with real phenomena in the above sense, with social processes, and they both have to ascribe meaning to their data. Rather than sequencing qualitative and quantitative research in some way, CUPCHIK sees both approaches as essentially inter-related, with quantitative research contributing towards the precise identification of relevant processes, and qualitative research providing the basis for their "thick description". [11]

While most contributors to this volume advocate the inter-relation of qualitative and quantitative methods, Harald WITT cautions us against the indiscriminate combination of methods from the two paradigms. He points out that a major difference between quantitative and qualitative research is to be seen in their research strategies, which he describes as linear and circular respectively. Both research strategies, he argues, are cut out for different research goals; they accommodate different kinds of data and different sample types. WITT goes on to show how combining qualitative and quantitative methods does not necessarily result in getting the best from both worlds. Rather, certain types of method inter-relation may be cumbersome at best; at worst, the results achieved by such an "unhappy" combination will fall far short of what could have been achieved by remaining exclusively within one of the two paradigms. This applies in particular to the use of a qualitative method for data collection in the context of a linear research strategy. WITT thus draws attention to what is easily forgotten in the enthusiasm over transcending the boundaries between the qualitative and the quantitative paradigm: Combining qualitative and quantitative methods is not a good thing at all times, but only provided that such a combination is in line with the overall research goals. WITT also shows that the willingness to combine methods is not enough to make such an informed choice of method or method combination. The researcher who wants to combine methods had better know them all, qualitative and quantitative—no mean feat considering the proliferation of methods both in the quantitative and the qualitative area. [12]

For Gary SHANK, qualitative research is the systematic empirical inquiry into meaning. If, at the broadest level, triangulation is about adopting a sceptical attitude that represents "genuine doubt", and maintaining a receptiveness to any method of inquiry which offers to assuage that doubt, then SHANK is opening up a perspective which prompts researchers to inspect, and keep constantly in mind, the logical foundations of their inquiries. Logic informs all reasoning, whether it is the hypothetico-deductive reasoning we associate with much quantitative inquiry or the inductive efforts often associated with qualitative work. For many researchers, awareness of logic and how it informs the epistemology which supports their empirical work does not get far beyond this distinction. But SHANK alerts us to the several kinds of induction, and to the critical (but widely unremarked) role of abduction in inquiries motivated by an analytic interest in meaning. [13]

SHANK's illustration of the six forms of abduction, or "reasoning to the best explanation", has an affinity with the other, relatively rare, meta-commentaries on the formal properties of qualitative analysis (such as DIESING's 1971 "pattern model of understanding" and the school of critical realism associated with BHASKAR 1975, HARRE 1970 and BOURDIEU 1996). Its central ground is the logic of inference. When SHANK confronts the relation of quantitative and qualitative methods he first asserts that it is not the method but the question which is important. But, beyond this, he emphasises the effect on the questions we can ask, and answer, if quantitative work were to define research methods. In particular, the valuable fruits of abductive reasoning would be lost. [14]

Drawing on a background in the analysis of political discourse and in the study of Artificial Intelligence, Francisco GUTIERREZ takes our discussion into the issue of how we know what we think we know. Using the case of the famous Turing Test (which challenges experimenters to see whether they can establish the difference between statements generated by a computer and those generated by a human), GUTIERREZ explores the logic by which we establish identity (or any kind of categorical knowledge) more generally. His empirical application is the possibility of discriminating between political actors holding different ideological orientations using only criteria internal to their discourse. Here we are in a central realm of qualitative work, the application and validation of classificatory systems such as typologies (GUTIERREZ focuses primarily on the dichotomous classification). The Artificial Intelligence community, as a result of its efforts to model human reasoning, has developed a close interest in classificatory practices, and the more formal methods of quantitative work can contribute to a better understanding of how people make fundamental distinctions in the course of everyday practical reasoning. Closing his paper, GUTIERREZ offers an overview of the several points of connection at meta-level between supposedly competing schools of thought, using the notion of paradigm shift to suggest the artificiality of a bipolar contrast between hermeneutic and formal reasoning, "hard" and "soft" methods, and "subjectivity" and "objectivity". [15]

The concluding part of GUTIERREZ's contribution considered inter alia the implications of paradigm shifts in intellectual disciplines, and this serves as the starting point for Dietmar JANETZKO's contribution. He notes that the debate over the status of qualitative and quantitative methods has been subject to its own changes of perspective over time. This fact serves as the basis on which to address a method for analysing conceptual changes (or changes of representation). These changes are amenable to qualitative or quantitative analysis. JANETZKO unfolds an approach to analysing changes of representation on the basis of symbolic, sequential data, which he calls "knowledge tracking", and which allows researchers to investigate both qualitative and quantitative aspects of changes. The approach is embedded in the network representation of cognition and requires the re-casting of the sociologist's or psychologist's theory of the data into the formal terms of a relational structure. JANETZKO argues that, while methods usually are either qualitative or quantitative, knowledge tracking is either or both: used quantitatively, knowledge tracking conducts a data-driven selection between competing theories, while used qualitatively it carries out a data-driven reduction of one theory. In this approach, then, the relationship of quantitative and qualitative can be calibrated to the requirements of the analytic work in hand. [16]

We mentioned that JANETZKO's approach involves a transformation of data and its associated theory into formal terms. Discussions of triangulation have highlighted the benefits of iterative research designs which develop a programme of research through, for example, a sequence of quantitative inquiry followed by qualitative inquiry and then further quantitative inquiry. Such designs can be offered as a means of more rigorously and systematically pursuing the object of inquiry (cf. below).1) It seems to us that the formalising project represents an alternative to the iterative research design in which a series of studies alternates quantitative and qualitative approaches. In that the benefit of the latter is a more rigorous understanding of the relationships that have emerged from qualitative and quantitative work separately, JANETZKO's formal manipulation offers the prospect of a similar benefit but with an intrinsic gain in efficiency, in that it may reduce the need for further empirical work. [17]

As we outlined above, in our plan for this issue of FQS a central interest was that we should not confine our attention to the established constructions of the relationship between quantitative and qualitative methods, such as the position which sees them as competing and irreconcilable modes of inquiry, or the position which seeks a systematic relationship based on the concept of triangulation. We wanted to stand back a little from these concerns and attend to the contemporary qualitative researcher's perspective on quantitative research, and the contemporary quantitative researcher's perspective on qualitative research. Another point we wanted to pick up was based on the perception that the diffusion of methodological approaches, understandings and practices is never uniform, even within a national community of scholars. When we take a global perspective we see fascinating local developments which feed a distinctive approach into the global social science community (for example, the "participatory research" variant of action research which is strong in Latin America) and we also see backwaters where the penetration of contemporary approaches has been impeded, sometimes reflecting the obstructive influence of political structures or the dominance of approaches associated with perspectives which have in the meantime become outmoded in their culture of origin. [18]

This was one reason for our particular interest in gaining contributions to this issue of FQS from scholars outside the dominant North American/western European social and behavioural science circuit. Jean SALUDADEZ and Primo GARCIA offer us an illuminating glimpse of the perspectives dividing—and relating—the researchers in an applied research institute in the Philippines which speaks directly to our interest in the contemporary relationship of qualitative and quantitative researchers. They profile the quantitative researcher's construction of qualitative research, and the quantitative researcher's understanding of the qualitative researcher's construction of quantitative research. While the study testifies to the continuing reign of quantitative work in such a setting, it also reveals an awareness of the merits and demerits of these modes of inquiry which is a good deal more subtle than a bipolar distinction would permit, thus setting up a framework for viewing the different approaches for interrelating qualitative and quantitative method presented in Section Two. [19]

For these authors, the relationship shows scope to evolve to a more complementary and less conflictual form than has prevailed in the past. Lest it be thought that we view this as a pretty example of the "maturing" of social science in a post-colonial setting, we might observe that significant developments, such as the withdrawal of the requirement that all doctoral research proposals be assessed and accepted by a quantitative social scientist or a statistician, are by no means universal in the universities of countries such as the United States of America. [20]

Nicole WESTMARLAND addresses the relationship between quantitative and qualitative method from a feminist perspective. She profiles the debate within feminist research over the value of qualitative and quantitative work, where central issues have been the conduct of the research process, the extent to which the two approaches to research adequately capture the reality of women's experiences, and thus the validity of the data upon which quantitative or qualitative researchers base their analyses. This last point means that, of course, the critiques raised by feminists interested in methodology are significant outside the confines of feminist research itself, and are of interest to the wider methodological community. It remains broadly the case that feminist research is drawn largely to qualitative methods, and WESTMARLAND explores the affinities which make this so. However, there is a substantial and important stream in feminist thought on methodology which sees a place for quantitative work in feminist inquiry. WESTMARLAND helps us to see the several ways in which quantitative work is valuable, even necessary, in those approaches to feminist work which prioritise the transformation of the place of women in society. She does so methodologically by contrasting and comparing the role of the survey questionnaire and the semi-structured interview, and empirically by tracing the role of qualitative and quantitative methods in her own researches into the situation of female taxi drivers. For WESTMARLAND, different feminist concerns speak to different research methods, and a dichotomy of qualitative and quantitative based on the claimed superiority of one over the other is a diversion from identifying the best tools for the job. [21]

Annette SCHMITT, Ulrich MEES and Uwe LAUCKEN present and illustrate an approach designed for analysing the structure and the rules underlying everyday social knowledge as it becomes manifest in texts: logographic analysis. Logographic analysis was developed precisely for this purpose; it is informed by the research goal, not by the affinity to either qualitative or quantitative methods (even though the authors are somewhat more at home in the qualitative paradigm), nor was it designed with a view towards combining the two paradigms. The approach thus constitutes a perfect example of the dictum of the priority of the research question over the method (as stated, for instance, by WESTMARLAND). Considering the research goal and the textual as well as the social character of the data, SCHMITT et al. localise logographic analysis predominantly within the qualitative paradigm. At the same time, however, the analysis also comprises quantitative aspects. In some cases, these are steps which are interleaved with the qualitative ones, such as hypothesis testing and comparative frequency analyses. In other cases, as in assessing the reliability and validity of the initial coding of the texts, qualitative and quantitative aspects of the procedure are so closely linked that it makes little sense to separate the two. Logographic analysis thus evades description in terms of qualitative and quantitative "parts". It is not a combination of qualitative and quantitative methods, but integrates the two and thus transcends the traditional dichotomy. [22]

Numerically aided phenomenology, the approach presented by Don KUIKEN and David MIALL, constitutes another method which, in integrating qualitative and quantitative aspects, succeeds in going beyond the dichotomy between "qualitative" and "quantitative" research. The criterion by which to evaluate research, KUIKEN and MIALL argue, is above all precision in the sense of distinctiveness, coherence, and richness. Where the description of lived experience is concerned, quantitative research is frequently lacking in precision to the extent that it underestimates the complexity of categories of experience, resulting in the reduction of the phenomenon to a few conventional meanings. Qualitative research, on the other hand, lacks precision in that it fails to distinguish between different extents to which a category of experience may be present. [23]

Numerically aided phenomenology, a method for the description of categories of lived experience, is aimed at increasing the precision of qualitative phenomenological research by instituting a quantitative algorithm at the very centre of qualitative data collection. In this, KUIKEN and MIALL regard categories of experience as "polythetic classes" (BAILEY 1994, pp.4ff.), i.e. as arrays of attributes where some attributes will characterise some instances of the category, but presumably no attribute will characterise all instances. Following the identification of relevant attributes by means of an in-depth analysis of the phenomenon, the specification of categories is achieved by drawing upon HUSSERL's concept of "imaginative variation", a kind of thought experiment in which the presence of individual attributes is varied systematically. Within numerically aided phenomenology, this imaginative variation is achieved by means of the quantitative step of cluster analysis. This is again followed by a qualitative procedure, the close inspection of the data on the basis of those clusters, which may in turn result in an expansion and further differentiation of the classes or types. Numerically aided phenomenology may thus be regarded as an approach toward the construction of a typology which combines qualitative and quantitative procedures (while most approaches existing to date are restricted to either of the two paradigms; cf. the overview in KLUGE 1999). [24]
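
To make the notion of a polythetic class a little more concrete, the following sketch (in Python, with invented attributes, data, and variable names) illustrates the general kind of quantitative step described here: binary attribute profiles derived from qualitative analysis are grouped by cluster analysis, and each resulting class is then characterised by attributes that are typical of many, but not necessarily all, of its members. It is an illustration under our own assumptions, not a reproduction of KUIKEN and MIALL's actual procedure.

```python
# Illustrative sketch only: clustering binary attribute profiles of
# experiential accounts into "polythetic classes". Attributes and data
# are invented for illustration.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

attributes = ["vivid imagery", "bodily feeling", "self-reference", "temporal shift"]
# One row per account; 1 = attribute present according to the qualitative analysis.
profiles = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
])

# Quantitative step: group similar profiles (here Ward clustering, two classes).
classes = fcluster(linkage(profiles, method="ward"), t=2, criterion="maxclust")

# Qualitative follow-up: inspect each class; attributes are typical of a class
# without any single attribute necessarily being shared by all of its members.
for c in sorted(set(classes)):
    members = profiles[classes == c]
    shares = members.mean(axis=0)
    typical = [a for a, s in zip(attributes, shares) if s >= 0.5]
    print(f"class {c}: {len(members)} accounts, typical attributes: {typical}")
```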

Most contributors to this volume who are in favour of relating qualitative and quantitative methods suggest ways in which such an inter-relation may be actively realised, as in the above numerically aided phenomenology, in triangulation, etc. Gerhard KLEINING and Harald WITT take a somewhat different approach. The apparent incompatibility of the qualitative and the quantitative paradigm, they argue, is basically just a byproduct of the almost exclusively interpretive orientation of qualitative research in the social sciences. This orientation, however, is thought to imply a number of drawbacks culminating in a "crisis of qualitative research". KLEINING's and WITT's major concern is thus not with the combination of qualitative and quantitative research as such, but with overcoming the interpretive bias of qualitative research. In order to do this, they suggest the reinstatement of heuristic, exploratory methods which are aimed at discovery rather than interpretation. They present their own approach developed along these lines, the Hamburg qualitative heuristic methodology, which combines classic heuristic elements with systematic rules for their application. This methodological orientation toward discovery, the authors argue, can act as a kind of common roof for both qualitative and quantitative designs, thus overcoming the divide between the two paradigms; they go on to demonstrate this by presenting one qualitative and one quantitative example from their own research. [25]

Obviously, an orientation towards discovery as it is advocated by KLEINING and WITT inevitably carries certain implications of a realist ontology and epistemology—which is very much at odds with the ontological stance found, for instance, even in modified versions of postmodernism (one representative in this volume would be BOWKER). By making their ontological assumptions explicit, these authors thus draw attention to the way in which methodological issues relate to the philosophy of science in general, raising the question of the compatibility between our way of combining qualitative and quantitative methods on the one hand and our ontological premises on the other. [26]

Giampietro GOBO argues in accord with the famous dictum that "the devil is in the detail". It is a commonplace that quantitative and qualitative approaches should be integrated, but the prevailing examples of their integration very often revolve around the macro-level comparison of findings from independently-conducted, discrete applications of each type of method (as do the preceding contributions in this section). Instead, GOBO argues that our notion of integration needs to become more specific, and he illustrates this by a discussion of how a qualitative research practice can be applied as an integral part of a quantitative inquiry. Survey non-response is a widely-remarked and increasing problem, as GOBO's useful summary of recent methodological research on the matter indicates. This methodological research shows that an important element in non-response to survey interviews is the tactics and persuasive techniques that are used in the first moments of contact between researcher and potential respondent. In the context of telephone interviewing, GOBO demonstrates how an understanding of the communicative process, drawing on discourse analysis, conversation analysis and narrative analysis, can enable researchers to identify analytically the effects of different interviewer tactics, indicate best practice, and thus improve survey techniques. [27]

Section three on innovative applications of methodological inter-relation begins with an example from economics, a discipline we normally do not associate either with an acknowledgement of subjectivity or with the use of qualitative or fieldwork-based methods. There are illuminating exceptions (see the special issue of Administrative Science Quarterly edited by John VAN MAANEN in 1979 for several examples), but the fact remains that the exceptions are occasional and isolated from the mainstream of the discipline. It follows that triangulation or, more broadly, work which integrates quantitative and qualitative method, is a rarity in the economics discipline. At the same time, though, economists are well aware of the critiques the other social sciences bring to its central convention, the homo economicus or "rational" economic actor, all of whose actions can be modelled, understood and made the basis for prediction precisely because these actors' decisions are wholly and reliably captured by the calculation of costs versus benefits. Since economic models have been known to fail (!), it is plain that there are deficiencies in the cost/benefit heuristic as an exclusive way of understanding those (preponderant) aspects of the social world which relate in one way or another to the allocation of resources (be they material, intellectual or spiritual). This spurs some economists, at least, to pursue additional means of capturing human decision-making. Stefan MANN provides us with an example, drawing on his research on factors influencing the decision whether to invest in a new agri-business development in a rural region of Germany. It is an example of the classic form of triangulation, with the following twist: MANN argues that the qualitative element of the research exposed only the factors that participants were willing to explicitly articulate (while not being obvious to the researcher), whereas the quantitative element enabled the identification of factors that were effective but not consciously articulated during the research process. [28]

There are different postures towards the integration of quantitative and qualitative methods, only one of which is triangulation. But the established literature might be characterised as being dominated either by approaches which argue that triangulation is possible because, if methods are systematically understood and rigorously used, points of connection can be identified such that both types of methods are addressing the same phenomenon, or by approaches that argue that because the types of method are founded on contrasting epistemologies, their differences are irreconcilable and so triangulation is impossible. During their emergent phase, postmodernist schools of thought overwhelmingly fell into the latter category. But as postmodernism has begun to establish its own approach to understanding empirical phenomena (rather than being preoccupied with the need to carve out space for its approach by a backwards-looking critique of what has gone before, in similar fashion to other emergent schools of thought, as can be seen in the stages of the relationship between ethnomethodology and sociology), a more interesting and sophisticated position has emerged. This position naturally acknowledges and even valorises relativism, as a sign of the inevitably multi-perspectival nature of knowledge of the social, but also sees no reason to refrain from quantitative work simply because notions of objectivity have been discarded. Such work must be done alongside, and in articulation with, qualitative work, so as to increase awareness of multiple perspectives and the contingency associated with situated knowledge (rather than to draw quantitative and qualitative findings together into a monolithic framework as in conventional triangulation). Natilene BOWKER offers us a significant example of this approach in a report of her programme of research into online behaviour in Internet Chat Rooms, concluding that multiple methods enable researchers to integrate their, and the research participants', situated knowledge. [29]

In Alexander JAKOB's contribution, the concept of triangulation is applied to the sociological reconstruction of employment biographies of officers in the German army who are about to become civilians. The focus of the study is on the extent to which and the way in which these officers experience uncertainty in facing this substantial change in their life course. JAKOB approaches his topic by means of an across-method triangulation: in a first quantitative phase, a large representative sample is drawn; data are collected by questionnaire and subjected to probabilistic cluster analysis. In a subsequent qualitative phase, a smaller subsample is selected on theoretical grounds and interviewed in depth. The study thus constitutes an example of realising different research phases where the quantitative is followed by a more detailed qualitative step of data collection and analysis (which corresponds to one of the designs for combining qualitative and quantitative research as suggested by MAYRING). Yet JAKOB's study differs from the standard design of this type in that the two phases do not consist of separate studies, but are in fact interdependent and thus complementary. Quantitative data analysis, for instance, is informed by the results of the qualitative phase and in turn allows conclusions to be drawn concerning the frequency of each of the types in the population. In describing the characteristics of the types, JAKOB also draws upon his analysis of the interviews, aiming for a "thick description" (GEERTZ 1973). In applying the concept of triangulation, JAKOB is thus not concerned with a mutual validation of "qualitative" and "quantitative" results, but with their complementarity, which he employs towards realising a description of the phenomenon under research which is at once more precise and has greater depth than any description he could have obtained by restricting himself to one method only. [30]

3. Approaches to Method Combination: The Triangulation Paradigm

As these applications in Section three demonstrate, triangulation is clearly a core issue in any approach to methodological combination. The contributions in this volume also show, however, that triangulation is not the only way in which qualitative and quantitative methods can be combined. Besides triangulation, two other approaches to method combination can be distinguished: sequencing and what we will term "hybrids" (cf. SCHREIER in press). [31]

In the case of sequencing, qualitative and quantitative methods are employed within one and the same study, although in different phases of the research process. The most common example would be a qualitative phase of data collection which is followed by a quantitative phase of data analysis, as in the case of interviews which are coded and for which coding frequencies are determined; alternatively, data analysis might involve the construction of types by means of cluster analysis, the reduction of categories to a smaller number of dimensions by means of multiple correspondence analysis, etc. (for additional examples of sequencing cf. MAYRING in this volume). Sequencing in this sense can be employed within otherwise "quantitative" studies which aim at hypothesis testing (cf., however, WITT in this volume who warns against indiscriminately using qualitative data as part of a linear-quantitative research strategy). To the extent that qualitative research wishes to go beyond individual cases and to say something about the sample at large, maybe even the population, sequencing can also be part of a qualitative research strategy, taking place whenever a generalisation of qualitative findings occurs on an aggregate level. Looked at from this perspective, sequencing may even be said to constitute an inherent characteristic of many typically "qualitative" approaches, such as grounded theory, objective hermeneutics, comparative casuistics, and so on. [32]
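
As a minimal illustration of the most common form of sequencing mentioned above, the following sketch (in Python; the codes and interview data are invented, and the variable names are our own) shows a qualitative coding step, assumed already completed, followed by a simple quantitative step in which coding frequencies are determined per case and for the sample as a whole. It is a sketch of the general idea, not of any particular study discussed in this volume.

```python
# Illustrative sketch of "sequencing": qualitative codes assigned to
# interview segments (assumed already done) are followed by a quantitative
# step that determines coding frequencies. All data are invented.
from collections import Counter

coded_interviews = {
    "interview_01": ["uncertainty", "family", "uncertainty", "career plans"],
    "interview_02": ["career plans", "career plans", "income"],
    "interview_03": ["uncertainty", "family", "family", "income"],
}

# Coding frequencies per case ...
per_case = {case: Counter(codes) for case, codes in coded_interviews.items()}
# ... and aggregated over the whole sample.
overall = Counter(code for codes in coded_interviews.values() for code in codes)

for case, counts in per_case.items():
    print(case, dict(counts))
print("sample total:", dict(overall.most_common()))
```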

By "hybrids" we mean approaches which do in themselves constitute a combination of qualitative and quantitative elements. These elements may be so closely "packed" as to be practically indistinguishable—systematic content analysis which combines the (qualitative) coding of texts with the (quantitative) calculation of coefficients of interrater agreement would be a case in point (RUSTEMEYER 1992; GROEBEN & RUSTEMEYER 1994). More often, hybrid approaches comprise a number of phases, some of which are qualitative, others quantitative; all, however, are equally necessary for achieving the objective of the approach. There are some examples in this volume, such as logographic analysis (SCHMITT, MEES & LAUCKEN), numerically aided phenomenology (KUIKEN & MIALL), or the qualitative experiment (KLEINING & WITT); others, such as the research program subjective theories (GROEBEN & SCHEELE 2000) or KUCKARTZ' approach toward case-oriented quantification (e.g., KUCKARTZ 1995) are not covered here. To the extent that these latter approaches combine qualitative and quantitative research phases, these "hybrids" are very similar to the strategy involved in sequencing. Hybrids and sequencing differ, however, in the sense that hybrids require precisely one and only one specific combination of qualitative and quantitative phases, whereas in sequencing any kind of combination is possible. [33]

There do, of course, exist other issues concerning the relation between qualitative and quantitative methods which might have been raised by the contributions to the volume—such as strategies for the analysis of qualitative data on an aggregate level or questions concerning the methodological standards for evaluating the results of qualitative research (cf. the discussion in FQS: REICHERTZ 2000; BREUER 2000). That this was not the case is probably to some extent due to the original orientation of the volume which did not really invite such strictly methodological papers. Instead, as we said above, it is triangulation which drew the greatest amount of discussion. [34]

The usual emphasis in triangulation is on combining methods, e.g., survey questionnaires with non-standardised interviews, although examples are also common of studies where triangulation is claimed on the basis of using a number of data sources (self, informants, other commentators), a number of accounts of events, or a number of different researchers (see FIELDING & FIELDING 1986). The broad idea in the conventional approach to triangulation is that if diverse kinds of data support the same conclusion, confidence in the conclusions is increased. It is implicit here that this is only to the extent that different methods or different kinds of data have different types of error. Further implied is that these sources of error can be anticipated in advance and that their effects and magnitude can be traced when analysis is carried out. It is in this sense that LEVINS' (1966, p.423) declaration that "our truth is the intersection of independent lies" is so apt. [35]

The classical approach represented by CAMPBELL's work, and still widely encountered in psychology, is one seeking convergence or confirmation of results across different methods. In effect, this amounts to conducting two studies with the hope of arriving at the same conclusions, thus demonstrating that the conclusions are not artifacts of method and, in particular, not associated with sources of invalidity characteristic of a given method. A key example is DENZIN (1970), whose original conceptualisation of triangulation is explicitly related to the work of WEBB, CAMPBELL, SCHWARTZ and SECHREST (1966) on "unobtrusive measures". However, the term "triangulation" has acquired so many meanings and usages that it is now safer to use the terms "convergence" or "confirmation" when seeking cross-validation between methods. [36]

In fact this classic goal of seeking convergence across methods has always been relatively rare and is increasingly so as a motive for combining quantitative and qualitative methods. This is so particularly in social science research and even more so in applied social research. One reason for this is the obstacle one encounters when results fail to converge. But the rarity of classical triangulation as a reason for combining methods is also a response to the amount of effort that it takes to pursue the goal of producing convergent findings. As MORGAN (1998) notes, researchers in applied fields often cannot afford to put so much effort into finding the same thing twice. On the other hand, applied problems such as the factors influencing health are so various and complex that applied researchers are readily driven to appreciate the different strengths that different methods offer. This makes for a more flexible approach to methodological combination than is found in classic triangulation, but nevertheless represents an important motivation for combining methods. [37]

It must be apparent from the different constructions of triangulation mentioned above that there are degrees of rigour and/or formality in the operationalisation of the broad idea of triangulation. We might, for example, regard the idea that validity will be enhanced simply by drawing on data collected by different researchers using the same method as a relatively weak form of triangulation, while an approach based on the combination of different methods might be regarded as somewhat more rigorous. Even so, we have already begged a significant question—what is to count as "valid"? As virtually all readers of this journal will be aware, validity (or the idea of a "conclusion" about which we can be "confident") is a highly contested idea. [38]

While epistemological debate continues, with the virtual certainty that it will never conclude, we can nevertheless safely proceed with our concept of triangulation provided that, in each case where it is claimed, the researchers make clear what criteria of adequacy and/or validity they intend to apply. But this is really only an extension of the idea that, for triangulation to be credibly claimed and demonstrated, it is necessary to identify in advance the characteristic weaknesses or types of error associated with given methods so that, when data from the different methods is combined, the possibility can be discounted that the methods might be susceptible to the same kinds of validity-threat. Where they are susceptible to the same weaknesses, combining them will, of course, do no more than multiply error. [39]

Thus, a great deal depends on the logic by which researchers draw on and mesh together data from the different methods:

What is involved in triangulation is not the combination of different kinds of data per se, but rather an attempt to relate different sorts of data in such a way as to counteract various possible threats to the validity of (their) analysis (HAMMERSLEY & ATKINSON 1983, p.199). [40]

While the social science application of triangulation is widely regarded as having originated in psychology, there is an established argument to the effect that qualitative research, and especially ethnography, is particularly well-suited to triangulation. Many have followed DENZIN's (1970) argument that triangulation should not only involve multiple sources of data ("data triangulation") but multiple investigators ("investigator triangulation") and multiple methodological and theoretical frameworks ("theoretical and methodological triangulation"). Each of the main types has a set of sub-types in DENZIN's formulation. Data triangulation may include time triangulation, exploring temporal influences by longitudinal and cross-sectional designs; space triangulation, taking the form of comparative research; and person triangulation, variously at the individual level, the interactive level among groups, and the collective level. In investigator triangulation, more than one person examines the same situation. In theory triangulation, situations are examined from the perspective of different theories. Methodological triangulation has two variants, "within-method", where the same method is used on different occasions (without which, one might suggest, one could hardly refer to "method" at all), and "between-method", where different methods are applied to the same subject in explicit relation to each other. [41]

Ethnography nearly always involves collecting different kinds of data (fieldnotes, interview transcripts, documents) from different sources (members; the researchers, e.g. through fieldwork diaries; and independent commentators on the setting, e.g. those from another discipline or journalists). BURGESS (1984, p.5) adds to this an important elaboration, that the distinctive thing about ethnography in the context of triangulation is that it involves developing "relationships between the researcher and those researched". Such relationships make available a range of techniques for checking interpretations which arise from the more intimate and sustained nature of this form of fieldwork. [42]

It may be thought that all of this is to disregard the powerful critique of social and behavioural science epistemology brought to bear by postmodernism in recent years. However, one need not subscribe to the notion of absolute standards, objectivity and "truth" to see that triangulation has an important place in the research process. As BREWER (2000, p.76) puts it, "even in this type of (postmodern) ethnography, practitioners recognise that all methods impose perspectives on reality by the type of data that they collect, and each tends to reveal something slightly different about the same 'symbolic' reality". This means that data triangulation is necessary even in the type of ethnography where the applicable criterion is not the achievement of the objective knowledge of the social world, "not as a form of validity ... but as an alternative to validation" (l.c.). [43]

Even for those not in accord with postmodern perspectives, and who are oriented to notions of validity and reliability, triangulation in itself is no guarantee of internal and external validity. For example, let us consider KELLE's (this volume) third empirical case, where a qualitative enquiry took place into the operation of the job placement scheme in former-socialist East Germany, which had been endorsed as effective by (official) statistical analyses. The qualitative study suggested (to some, revealed) that what was in fact happening was that the job placement system was being manipulated by potential employees, who were merely finding their own work using informal channels, then colluding with employers to report a "vacancy" to the job placement scheme, which was then quickly "filled" by the collusive employee, yielding an apparent success for the job placement system. Let us assume that there is no doubt at all of an internal methodological kind about the rigour with which both the statistical analysis of the job placement system and the qualitative study of employees apparently placed by it were conducted. Does this example represent a successful case of triangulation or does it actually mean that we always need qualitative methods, since the quantitative findings do not seem to have contributed anything? [44]

Our answer would be that it is a case justifying the value of triangulation—because, without the quantitative data providing one version of social reality, we would not know how to value or assess those reports from the qualitative study about the workers manipulating the system. In order to recognise, in our analytic work with the qualitative data, that the reports of manipulation of the system raised a point worthy of enquiry, we had to have the quantitative data suggesting that the official system was operating rather well. Even so, doubt remains. We might, for example, worry that, due to the almost-intrinsically limited scope of qualitative work, our research had simply managed to uncover those few renegade workers who had manipulated the system. One way we could address that—within the confines of qualitative method—would be to inspect the data for talk in which workers reported satisfaction with the state job placement system. Perhaps this balanced the accounts where manipulation was reported? But another way we could address such doubt (and these procedures have their mirror image in studies where the quantitative data repudiate the qualitative data) would be to extend our programme of research to a third step, where, in light of the findings of the qualitative work, we constitute a further quantitative enquiry, but this time instead of using official employment data, we carry out an independent survey which specifically asks questions about the respondents' experience of the job placement process, for example, precisely how they learned of the vacancy which they then filled (i.e., did they hear about it from a friend or see it posted on a job card in the state job placement bureau). In this approach, initial quantitative data gives an official version of reality, this is called into question by qualitative work, and we seek a resolution of the conflicting versions by highlighting the process suggested by the qualitative work and seeking to establish whether it is more widely applicable. [45]

Thus, we might take the more modest view that the real value of triangulation is not that it guarantees conclusions about which we can be confident but rather that it provokes in researchers a more critical, even sceptical, stance towards their data. All too often in qualitative research (and examples exist in quantitative work, too), researchers are drawn to facile conclusions, of the sort which frequently lead outsiders to complain that the main product of social and behavioural research is the confirmation of what everyone knew by commonsense in the first place. Further, when analyses are challenged, qualitative researchers are prone to claim "ethnographic authority" (HAMMERSLEY & ATKINSON 1983), that is, they defend their interpretation not by adherence to systematic, established, externally-validated analytic procedures but by the (usually unassailable) fact that "they were there". They did the fieldwork, they collected the data, therefore they have the "best sense" of what the data may mean. [46]

Such a criterion for warranting inferences is deeply unsatisfactory. Among its several defects is the way it contrasts with the grounds on which the warrant for inferences from quantitative data can be established. Here use is made of statistical procedures whose steps are standardised, so that adherence to each stage can be checked by critics, and whose criteria for drawing a particular conclusion are not only explicit but precisely define the conditions under which a given conclusion can be assumed to hold or to break down. Triangulation offers a means for qualitative researchers to be more discriminating and discerning about their data, to take on the stance so often characteristic of the quantitative researcher, for whom conclusions are always "on test", hold only under specified conditions, and whose relationship to the data is not uncritical "immersion" but measured detachment. [47]

We are not arguing that qualitative researchers need to transform their approach to resemble that of the statistician, but we do argue that when we look at triangulation its value lies more in its effects on "quality control" than in its guarantee of "validity". A further benefit is that this approach promotes more complex research designs and that these oblige researchers to be more clear about what it is they are setting out to study. There will always be value in the relatively diffusely-focussed exploratory study, but as qualitative research gains legitimacy (and there is little doubt that it has done so in recent years in western Europe and in North America; FIELDING & LEE 2000), it increasingly tackles more precisely-specified topics and becomes more prominent in applied spheres such as policy-related research in fields like health behaviour and crime, where relevant research audiences (including research subjects and researchers themselves) want to feel "confidence" in the "conclusions". Indeed, it seems perverse even in purely exploratory work for researchers to be indifferent to the accuracy of their analyses. One might even argue that it is incumbent on researchers exploring hitherto obscure corners of the social world to employ research designs which accurately depict what has previously been unknown and which has thus far proved resistant to study by more conventional means. [48]

In that triangulation is much about the comparison and integration of data from different methods it is worth reminding ourselves of SIEBER's (1979) seminal argument on what qualitative work can do for quantitative work and what quantitative work can do for qualitative work. Bearing in mind that SIEBER's approach is grounded in a firmly positivist perspective, and beginning with data collection issues, qualitative work can assist quantitative work in providing a theoretical framework, validating survey data, interpreting statistical relationships and deciphering puzzling responses, selecting survey items to construct indices, and offering case study illustrations. In some cases the theoretical structure itself is a product of field experience. For SIEBER, quantitative data can be used to identify individuals for qualitative study and to delineate representative and unrepresentative cases. Turning to data analysis, SIEBER maintains that quantitative data can correct the "holistic fallacy" that all aspects of a situation are congruent, and can demonstrate the generality of single observations. Field methods sometimes suffer "elite bias", an over-concentration on certain respondents due to their articulacy, strategic placing in terms of access, and because researchers like to share their high status. Quantitative data can deal with this fault by indicating the full range that should be sampled. Among the things that SIEBER suggests qualitative data can contribute to quantitative research are depth, an idea of the range of core concepts, and the ability to solve puzzles that the more superficial quantitative data cannot address. [49]

It is worth making explicit that accepting the case for interrelating data from different sources is to accept a relativistic epistemology, one that justifies the value of knowledge from many sources, rather than to elevate one source of knowledge (or more accurately, perhaps, to regard one knowledge source as less imperfect than the rest). Those taking an approach favourable to triangulation in conventional terms are more likely to work from a perception of the continuity of all data-gathering and data-analysing efforts (e.g., as several of our contributors hold, to perceive that all data analysis involves "interpretation"). They are more likely to regard all methods as both privileged and constrained: the qualities that allow one kind of information to be collected and understood close off other kinds of information. [50]

It is important, then, not to be led by an enthusiasm for multiplying sources of information into forgetting to monitor the biases to which each method is susceptible. The conventional logic of triangulation is that by using several sources of information we can diversify biases in order to transcend them. We use a variety of independent methods, each with predictable and different characteristic kinds of error, so that we can look for things which are invariant or identical in the data produced from different knowledge sources. But triangulation is not just the search for points of coincidence or agreement. In this conventional approach we have further to identify the scope of the processes across which findings are invariant, the conditions under which the invariance occurs. We also need to explain failures of invariance, why given limits or conditions apply. It follows that the differences between findings from different knowledge sources can be as analytically illuminating as their points of coherence (as in, for example, the third empirical study in KELLE's contribution to this volume). [51]

Two main sources of bias are apparent in qualitative work: the tendency to select field data to fit a preconception of the phenomenon and how it should be analysed, and the tendency to select for analysis field data which are conspicuous because they are exotic, at the expense of less dramatic but possibly more indicative data. While the rigidity of positivist methods helps researchers resist these faults, such work is not free of them either. What makes it easier for quantitative researchers to trace such faults is that the character of the data, and the necessity to state hypotheses, make the researcher's assumptions more explicit and available for inspection by third parties. However, systematic observation can have some of the advantages of the survey, as in HUMPHREYS' (1970) study of impersonal sex in public toilets. He completed "fact-sheet" descriptions for each observation, later augmenting these with conventional fieldnotes, and claimed that this strategy gave "objective validity" to his data. It would be more accurate to say that a quality control mechanism was built into the data by incorporating physical descriptors that could be checked. The point is that the introduction of a systematic element to field observation facilitated attention to replication and comparison in a way similar to that normally associated with survey work. [52]

The advantages of combining methods should not lead researchers to lose sight of the fact that different approaches are supported by different epistemologies and logical assumptions, which require handling in different terminologies. Results founded on different methods may, then, be combined, but for a different purpose than that associated with the established approach to triangulation. Theoretical triangulation does not necessarily reduce bias, nor does methodological triangulation necessarily increase validity. Competing theories are generally the product of different traditions, so when combined they may offer a fuller picture but not a more "objective" one. Likewise, different methods draw on different (and often competing) epistemologies, and while combining them can add range and depth it does not necessarily add accuracy. In this approach, when we combine theories and methods we do so to add breadth or depth to our analysis, not to pursue an "objective" truth. [53]

Rejecting absolute versions of truth, and the feasibility of absolute objectivity, is not the same as rejecting the standard of truth or the attempt to be objective. In things social and behavioural, our knowledge is always partial and intrinsically incomplete. We accept the abstraction or conclusion-with-identifiable-and-defined limits as invitational, suggesting implicitly the "constant and unevadable necessity for interpretation and change of aspect" (NEEDHAM 1983, p.32). This is, ultimately, the warrant for the triangulation paradigm. [54]

Note

1) Formalisation is also at the core of the contribution by Lutz-Michael Alisch which will be added to this volume at a later time. <back>

References

Bailey, Kenneth D. (1994). Typologies and taxonomies. An introduction to classification techniques. Thousand Oaks etc.: Sage.

Bhaskar, Roy (1975). A realist theory of science. Leeds: Leeds Books.

Bourdieu, Pierre (1996). Toward a reflexive sociology: a workshop with Pierre Bourdieu. In Stephen P. Turner (Ed.), Social Theory and Sociology: the classics and beyond (pp. 213-228). Oxford: Blackwell.

Breuer, Franz (2000, September). Über das In-die-Knie-Gehen vor der Logik der Einwerbung ökonomischen Kapitals-wider bessere wissenssoziologische Einsicht. Eine Erregung. Zu Jo Reichertz: Zur Gültigkeit von qualitativer Sozialforschung [18 paragraphs]. Forum Qualitative Sozialforschung / Forum Qualitative Social Research [On line Journal], 1(3). Accessible via: http://www.qualitative-research.net/fqs-texte/3-00/3-00breuer-d.htm [Access: 14.12.2000].

Burgess, Robert (1984). In the field. London: Routledge.

Denzin, Norman (1970). The research act. Chicago: Aldine.

Diesing, Philip (1971). Patterns of discovery in the social sciences. Chicago: Aldine.

Fielding, Nigel & Fielding, Jane (1986). Linking data: the articulation of qualitative and quantitative methods in social research. Beverly Hills, London: Sage.

Fielding, Nigel & Lee, Ray (2000). Patterns and potentials in the adoption of qualitative software: the implications of user experiences and software training. In Social science methodology in the new millennium: Proceedings of the Fifth International Conference on Logic and Methodology [CD-ROM]. Köln: Zentralarchiv für Empirische Sozialforschung.

Geertz, Clifford (1973). The interpretation of cultures: selected essays. New York: Basic Books.

Gergen, Kenneth (1985). The social constructionist movement in modern psychology. American Psychologist, 40(3), 266-275.

Groeben, Norbert & Rustemeyer, Ruth (1994). On the integration of quantitative and qualitative methodological paradigms (based on the example of content analysis). In Ingwer Borg & Peter Mohler (Eds.), Trends and perspectives in empirical social research (pp.308-325). Berlin/New York: de Gruyter.

Groeben, Norbert & Scheele, Brigitte (2000). Dialogue-hermeneutic method and the "research program subjective theories" [18 paragraphs]. Forum Qualitative Sozialforschung / Forum Qualitative Social Research [On line Journal], 1(2). Accessible via: http://www.qualitative-research.net/fqs-texte/2-00/2-00groebenscheele-e.htm [Access: 30.01.2001].

Hammersley, Martyn & Atkinson, Paul (1983). Ethnography: principles in practice. London: Tavistock.

Harré, Rom (1970). The principles of scientific thinking. London: Macmillan.

Humphreys, Laud (1970). Tearoom Trade. London: Duckworth.

Kluge, Susann (1999). Empirisch begründete Typenbildung. Zur Konstruktion von Typen und Typologien in der empirischen Sozialforschung. Opladen: Leske & Budrich.

Kuckartz, Udo (1995). Case-oriented quantification. In Udo Kelle (Ed.), Computer aided qualitative data analysis. Theory, method, and practice (pp.158-166). London etc.: Sage.

Levins, Richard (1966). The strategy of model building in population biology. American Scientist, 54, 420-440.

Morgan, David (1998). Practical strategies for combining qualitative and quantitative methods: applications to health research. Qualitative Health Research, 8(3), 362-76.

Needham, Rodney (1983). Against the tranquillity of axioms. Los Angeles: University of California Press.

Reichertz, Jo (2000). Zur Gültigkeit von qualitativer Sozialforschung [76 paragraphs]. Forum Qualitative Sozialforschung / Forum Qualitative Social Research [On line Journal], 1(2). Accessible via: http://www.qualitative-research.net/fqs-texte/2-00/2-00reichertz-d.htm [Access: 14.12.2000].

Rustemeyer, Ruth (1992). Praktisch-methodische Schritte der Inhaltsanalyse. Münster: Aschendorff.

Schreier, Margrit (in press). Qualitative methods in studying text reception. In Dick Schram & Gerard Steen (Eds.), The psychology and sociology of literature. In honor of Elrud Ibsch (pp.35-56). Amsterdam: Benjamins.

Sieber, Sam (1973). The integration of fieldwork and survey methods. American Journal of Sociology, 78(6), 1335-1359.

Van Maanen, John (1979). Special issue on non-quantitative methods in management research. Administrative Science Quarterly.

Webb, Eugene; Campbell, Donald; Schwartz, Richard & Sechrest, Lee (1966). Unobtrusive measures: non-reactive research in the social sciences. Chicago: Rand McNally.

Authors

Nigel FIELDING is Professor of Sociology and co-Director of the Institute of Social Research at the University of Surrey. He has taught field methods and criminology at Surrey since 1978. His research interests are in qualitative methods, new research technologies, and criminal justice. He was Editor of the Howard Journal of Criminal Justice from 1985 to 1998, and is co-editor of the series "New technologies for social research" (Sage). He has published twelve books, four of them on aspects of methodology, and is currently working on the second edition of Computer programs for qualitative data analysis (with E. WEITZMAN and R. LEE) and a four volume set on Interviewing, both for Sage.

Contact:

Nigel Fielding

Institute of Social Research
Department of Sociology
University of Surrey
Guildford GU2 7XH, England

E-mail: n.fielding@surrey.ac.uk

 

Margrit SCHREIER, Dr. phil., is research assistant at the Department of General and Cultural Psychology, University of Cologne. Her current research interests include: methodology of the social sciences, combination of qualitative and quantitative methods, research program subjective theories, media psychology and media studies, psycholinguistics, attribution of responsibility, and the empirical study of literature.

Contact:

Dr. phil. Margrit Schreier

Universität zu Köln
Psychologisches Institut
Lehrstuhl f. Allgemeine Psychologie und Kulturpsychologie
Herbert-Lewin-Str. 2
D - 50931 Köln, Germany

E-mail: m.schreier@uni-koeln.de

Citation

Fielding, Nigel & Schreier, Margrit (2001). Introduction: On the Compatibility between Qualitative and Quantitative Research Methods [54 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 2(1), Art. 4, http://nbn-resolving.de/urn:nbn:de:0114-fqs010146.

Revised: 7/2011

Forum Qualitative Sozialforschung / Forum: Qualitative Social Research (FQS)

ISSN 1438-5627

Creative Commons Attribution 4.0 International License