
Volume 24, No. 1, Art. 13 – January 2023

Mixed Methods and Their Pragmatic Approach: Is There a Risk of Being Entangled in a Positivist Epistemology and Methodology? Limits, Pitfalls and Consequences of a Bricolage Methodology

Giampietro Gobo

Abstract: Since the early 2000s, the pragmatic approach has been proposed as a philosophical program for social research, regardless of whether qualitative, quantitative or mixed methods are used. In addition, current mixed methods have been presented as a third way between positivism and constructivism. However, can mixed methods be fully considered a third way? For instance, in their inquiries, will scholars oriented to pragmatism actually employ the traditional and standardized questionnaire, with forced choices and closed questions, which strongly limits any interpretative and interactional perspective? Hence, several theoretical and methodological difficulties of the pragmatist proposal emerge precisely (and paradoxically) at the level of research practice. The pragmatic approach is presented by its proponents as a model designed to dissolve differences and neutralize epistemological barriers; however, without problematizing and removing the positivist features of their methods, researchers oriented to pragmatism actually risk ending up reproducing positivism in disguise. Hence, despite their claims to innovation, proponents of pragmatism are often overly traditionalist in their use of methods.

Key words: mixed methods; measurement; bricolage methodology; pragmatic approach; third paradigm; merged methods

Table of Contents

1. Introduction

2. Can Mixed Methods be Considered a Third Approach?

3. The Quantitative Imprinting in Mixed Methods

4. Measurement: An Example of the Permanence of a Positivistic Component in Mixed Methods Research

4.1 Objects

4.2 Conditions

4.3 Measuring versus counting

4.4 Further restriction to measurability: Cooperation

4.5 The lack of a measurement unit in social sciences

4.6 Social measurement and its scales

4.7 Scaling is not measurement

5. Sampling and Generalization

6. The Multiple Meanings of Qualitative and Their Uptake in Mixed Methods Research

6.1 The great misapprehension

6.2 A muddle

7. The Pragmatic Approach

8. Concluding: Abandoning a Bricolage Methodology

Notes

References

Author

Citation

 

1. Introduction1)

The self-labeled mixed methods approach dates back to the late 1980s, emerging as a counteraction to claims of incompatibility between quantitative and qualitative methods (BRANNEN, 1992; HOWE, 1988). Later, it was presented as a third way, an alternative to quantitative and qualitative methods, with the potential to usher in a new era in the field of behavioral and social sciences. KARASZ and SINGELIS (2009), MORGAN (2007) and others gave mixed methods research the status of a third paradigm,2) between positivism and constructivism (BIESTA, 2010; HALL, 2013; JOHNSON & ONWUEGBUZIE, 2004; MAXCY, 2003; MORGAN, 2007, 2014; PEARCE, 2012; TASHAKKORI & TEDDLIE, 2010). They followed the position put forward by TASHAKKORI and TEDDLIE (1998), who considered mixed methods an independent methodology, a methodological orientation that they reinforced later, stating: "mixed methods research has evolved to the point where it is a separate methodological orientation with its own worldview, vocabulary, and techniques" (TEDDLIE & TASHAKKORI, 2003, p.10). Also, CRESWELL and PLANO CLARK (2017) asserted that mixed methods are an approach with specific philosophical assumptions as well as methods of inquiry. [1]

I will start by asking the question whether mixed methods can actually be considered a third approach in methodology, as an alternative to qualitative and quantitative approaches (Section 2), and briefly explore its historical and epistemological roots (Section 3). I will then look at how proponents of this approach deal with some essential methodological problems, specifically those of measurement in the social sciences (Section 4), sampling, and generalization (Section 5). I will conclude that they have failed to provide a real alternative (Section 6) and remain tied to quantitative methodological thinking (Section 7). Only when investigators undertake a thorough review of their research practices is there a potential for mixed methods to become a true alternative (Section 8). [2]

2. Can Mixed Methods be Considered a Third Approach?

Several decades after those challenging statements were made, we might doubt that mixed methods researchers have really fulfilled the promise of a third way, because a tacit imbalance towards a quantitative methods mindset still prevails. In other words, several mixed methods investigators are still influenced by a quantitative imprinting, and the corresponding post-positivist epistemology remains subtly dominant in the field. This position is shared by several authors, among them FLICK (2017), GIDDINGS (2006), HESSE-BIBER (2015), and HOWE (2004). [3]

In addition, mixed methods researchers' language sometimes seems much less innovative than they would like it to be. Their vocabulary contains few new terms and, when such terms are used, they are typically applied to already existing concepts. For instance, the apparently new research designs termed "convergent (parallel) design," "explanatory sequential design," "exploratory (sequential) design," "embedded design," etc. (CRESWELL & PLANO CLARK, 2017, p.69) were already used (without being named as such) by the first generation of mixed methods scholars (GOBO & MAUCERI, 2014): Charles BOOTH, William E.B. Du BOIS, Paul F. LAZARSFELD, Marie JAHODA and Hans ZEISEL, Frédéric LE PLAY, Robert STAUGHTON LYND and Helen MERRELL LYND, Seebohm B. ROWNTREE, Eilert SUNDT, Beatrice WEBB, Max WEBER, and many others "blended qualitative and quantitative data as they studied their communities" (JOHNSON, ONWUEGBUZIE & TURNER, 2007, p.113). Finally, no new techniques appear to have been invented by mixed methods methodologists. Data integration techniques (presented in a sophisticated way in BAZELEY, 2018), for instance those making use of CAQDAS (computer-assisted qualitative data analysis software) or network analysis, were already identified and developed outside the mixed methods community. [4]

In order to make it truly a third way, mixed methods investigators should increase their efforts to integrate the quantitative and qualitative methodologies into a new, distinctive and genuinely mixed approach. They should retain and combine the valuable contributions developed by researchers from both orientations and avoid a simple juxtaposition of different methods,3) as happens too often today. The essential aspects of my critique are developed in the sections that follow.

3. The Quantitative Imprinting in Mixed Methods

After BLUMER's (1956) disruptive criticism of variable analysis, the roots of contemporary mixed methods can be traced back to the initial openness of some quantitative researchers towards the rise of qualitative methods (which proved to be irresistible): Open-minded sociologists, psychologists and methodologists (CAMPBELL & FISKE, 1959; CRONBACH, 1975; DIESING, 1971; SIEBER, 1973; SMITH, 1975; VIDICH & SHAPIRO, 1955), their assistants (WEBB, CAMPBELL, SCHWARTZ & SECHREST, 1966) and young graduates (JICK, 1979), all with a quantitative background and training, proposed the concept of triangulation. At that time, qualitative researchers, unable to offer the formal procedures and validity claimed by quantitative academics (GLASER & STRAUSS, 1967, developed their grounded theory methodology as a response to this threat), were in danger of fast becoming extinct; the move by these open-minded quantitative scientists was, however, somewhat premonitory. Several years later, in the 1980s, a series of turns (cognitive, linguistic, pragmatic, interpretative, interactional, narrative, post-modern and so on) would soon undermine the traditional survey. Consequently, researchers advocating quantitative methods began to feel epistemologically outdated and methodologically inadequate.4) For this reason, triangulation (and now multimethod research and mixed methods) was a providential lifesaver for quantitative methodologists: a means of recognizing the limits of the quantitative approach while averting the risk of becoming marginal and losing intellectual power.5) [6]

It is no coincidence that a major impetus for the use of mixed methods came from scientists of the universities of Michigan (Ann Arbor) and Nebraska-Lincoln, centers of excellence in quantitative methods, and from scholars with a quantitative training and background (in psychology, education, and so on).6) As a matter of fact, mixed methods were not initially proposed by qualitative researchers, who felt neither the need nor the urgency for them. On the contrary, some of them expressed worries about how proponents of mixed methods treated qualitative methods (DENZIN & LINCOLN, 2005; GIDDINGS, 2006; HOWE, 2004; MORSE, 2005; SCHREIER, 2017). At most, qualitative methodologists proposed an interplay between qualitative techniques only. Consequently, GIDDINGS's severe judgment was no surprise: "the thinking in mixed-methods research rarely reflects a constructionist or subjectivist view of the world. The majority of studies use the analytic and prescriptive style of positivism, albeit with a postpositivist flavor" (2006, p.200). According to her criticism "the 'thinking' of positivism continues in the 'thinking' of mixed methods, its postpositivist pragmatic underpinnings assumed" (p.202). A similar position was held by SMYTHE (2005).7) [7]

4. Measurement: An Example of the Permanence of a Positivistic Component in Mixed Methods Research

An example of the persistence of a positivist imprinting in mixed methods research can be detected in the widespread diffusion of the terms measures and measurement in the mixed methods literature, i.e., in research papers and theoretical essays (e.g., HOWE, 1988), especially those related to research designs (CRESWELL & PLANO CLARK, 2017; DAIGNEAULT & JACOB, 2014; HOWELL SMITH et al., 2020; LUYT, 2012; SEDOGLAVICH, AKOORIE & PAVLOVICH, 2015; URBAN, BURGERMASTER, ARCHIBALD & BYRNE, 2015; WHEELDON, 2010). This is also a consequence of the bricolage methodology,8) which is embedded in the pragmatist approach (see below) so dominant in mixed methods, and of the claim of researchers affiliating with this approach that anything can be measured. The imprinting is present to such an extent that many authors speak without reservation of ordinal measurement, and even of nominal measurement. [8]

However, as HAMMERSLEY pointed out, "adopting a pragmatic approach does not mean treating whatever we find we can do as good enough, as if what is possible determines what is necessary" (2010, p.425). Consequently, if we take this (subtly positivistic) approach, we confuse measurement with counting, and the mere use of numbers with measurement, so that the term itself comes to indicate extremely different processes (MARRADI, 1981). Hence, bricolage methodologists neglect other important epistemological, methodological and technical problems related to measurement. [9]

In the social sciences, the term measurement has undergone a progressive "semantic stretching of the original meaning coming from physical sciences, where we speak of classifying, counting, ordering, and not only, obsessively, of measuring" (p.597)9). In fact, it is only in the social sciences that we speak of ordinal and nominal measurement. (Positivist) social scientists' aspirations to acquire a more scientific status certainly lie at the root of the semantic dilution of the term. They pursued this aim by emulating the physical and natural sciences10) without questioning the actual applicability of this concept in the social sciences (KAPLAN, 1964). In the physical sciences, the concept of measurement is quite clear: Measuring means establishing, by comparison, how many times the measurement unit is contained in the quantity to be measured (MARRADI, 1981). However, adapting this to the social sciences is highly problematic. [10]

4.1 Objects

From a measurement perspective, two kinds of objects (although referring to concepts would be more appropriate epistemologically) exist in the world. These objects have two different kinds of properties (or attributes): continuous vs. discrete. A continuous property has an infinite number of statuses (of the property itself) that are only subtly different from each other; hence, there is no outright jump between one status and another, and differences between them are difficult to discern. For instance, the object income belongs to this category: If one person earns £1,264 and another £1,265, there is no obvious jump between the two statuses. These two different numbers can be positioned along a continuum, represented as a straight line. Time, space, height, age, income, authoritarianism, etc. are objects which belong to this category. In addition, these numbers are real numbers because they are further divisible into smaller quantities (e.g., £1,264.13). [11]

In contrast, a discrete property has a finite number of distinct statuses (of the property itself): Education, votes, religion, gender, publications, children, etc. fall into this category. In fact, they cannot be subdivided into quantities smaller than one. Hence, a key problem here concerns whether the object properties to be measured have the character demanded by metric measurement, in other words, whether they have a quantitative structure. Mathematics and the social sciences differ irreconcilably on this point. In mathematics, if discrete properties are cardinal (e.g., the number of pears in a box) you can determine their average, which can be a rational number (i.e., with decimals). This makes sense, because pears can be divided: a half pear or a quarter pear can be eaten and exists in reality. From a mathematical perspective, you can also fractionate objects that do not exist as fractions in reality, e.g., traffic accidents, smartphones and computers per capita, beds in a hospital, deaths, children, etc. Thus we often read in scientific reports statements, nonsensical from a sociological standpoint, such as that in most OECD countries the total fertility rate is somewhere between 1.4 and 1.9 children per woman. This is because, from a mathematical perspective, the fertility rate is a metric cardinal number, since it is the ratio between two discrete cardinals (number of children and number of women of fertile age). It is a real number, with infinite possible values between 0 and infinity. All this is flawless from a mathematical point of view, but problematic from a sociological one. In other words, in mathematics we compute means, whereas in sociology we should (stricto sensu) calculate only medians and modes. Hence, the purely mathematical measurement procedure applied to the social sciences is somewhat misleading. [12]
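To make the arithmetical point concrete, consider a minimal sketch (with invented counts, not data from any study cited here): the mean of a discrete property such as the number of children per family yields a fraction that exists only mathematically, whereas the median and the mode remain actually observable statuses of the property.

```python
# A minimal sketch (hypothetical counts): the arithmetic mean of a discrete,
# countable property produces a value no real family can have, while the median
# and mode stay within the observable statuses of the property.
from statistics import mean, median, mode

children_per_family = [0, 1, 1, 2, 2, 2, 3, 4]  # invented counts

print(mean(children_per_family))    # 1.875 -> "1.875 children" exists only mathematically
print(median(children_per_family))  # 2.0   -> an actually observable status
print(mode(children_per_family))    # 2     -> the most frequent observed status
```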

4.2 Conditions

According to MARRADI (1981, 1985), there are two mandatory requirements to be able to measure something: 1. the presence of an object with a continuous property (therefore, objects with discrete properties are not measurable) and 2. the existence of a unit of measurement (a convention) that can be recorded with an instrument. For example, time has a continuous property and there is a standard unit for measuring it (e.g., the second). Therefore, time meets the two mandatory requirements and an instrument has been developed to measure it (the chronometer). The same is true for space: Length is its simplest measurement, whose unit is the meter, and the instrument to measure space is the tape measure. [13]

4.3 Measuring versus counting

An important distinction, which social scientists often forget, exists between measuring and counting. These two socio-cognitive processes use two different units for calculation: a measurement unit and a count unit, respectively (MARRADI, 1985). However, while counting is somehow natural (a researcher needs no special training or particular instrument to enumerate children), measuring is conventional: a researcher needs an operational convention and a special instrument to measure with. The practical implication is that time, space, height, age, income, etc. can be measured, whereas the number of children, educational qualifications, hospital beds, religion, gender, etc. cannot be measured, only counted (MARRADI, 1981, 1985). [14]

This distinction only appears pedantic, because treating objects with discrete properties as if they had continuous properties leads to neglecting the social dimensions of phenomena. In other words, owning £4 is like having two times £2. However, having four children is not like having two times two children, because here social conditions are essential: Did the family have the children close together or far apart? These are two different situations, with different impacts on family management, financial resources, work-family reconciliation policies, and so on. Hence, from a socio-psychological perspective, the number of children can be ascribed a cardinal property only apparently; once society (the context) is brought into the calculation, statistics will always arrive too late. [15]

4.4 Further restriction to measurability: Cooperation

Having an object with a continuous property is a necessary but not sufficient condition for measuring it. Another (final) feature is required: An object (with continuous properties) is measurable only if its statuses can be recorded without its (active) cooperation. For example, to measure the waiting time in a hospital emergency room, the researcher does not need the cooperation of any of the participants. Only in this way can we guarantee intersubjectivity (of the measurement outcome) among different observers. This is what happens in physics and the natural sciences, where researchers do not need the cooperation of subatomic particles or biological cells. In contrast, to register the statuses of other objects with continuous properties (such as opinions, attitudes, beliefs), the active collaboration of interviewees is needed: Social scientists have to ask questions, talk with the participants and obtain their consent. Hence, opinions, attitudes and beliefs are not measurable. [16]

4.5 The lack of a measurement unit in social sciences

Most of the properties of interest to social scientists, such as psychological ones expressed in opinions, attitudes or values (authoritarianism, social cohesion), have often been conceived as continuous, like measurable continuous properties. Unlike the latter, however, they have no measurement unit. For this reason, CICOUREL (1964) stressed that it is very difficult to talk of measurement in the social sciences because their concepts, unlike those of the physical sciences, do not have corresponding (lexical and operational) definitions that command a general consensus in the scientific community. Whilst a tape measure, a set of scales or a chronometer are (consensually considered) necessary for the operational definition of quantities of length, weight, or time, there are no equally (consensually accepted) instruments with which to obtain operational definitions of concepts like democracy, rationalization, authority or political participation. [17]

4.6 Social measurement and its scales

To measure opinions, attitudes, and behaviors, scholars cannot employ the equivalent of a chronometer (for time) or a tape measure (for length). However, they still need an instrument, which in this case is the scale. It was invented by LIKERT (1932), and subsequently systematized by STEVENS (1946) into four scales: nominal, ordinal, interval and ratio. In doing this, STEVENS crafted a new definition of measurement, introducing a nominal level of measurement that is distinguished from ranking and from the two forms of metric measurement (interval and ratio scales). He thus compensated for the absence of a measurement unit (and therefore the epistemological impossibility of measuring) through scaling techniques. [18]

However, STEVENS's error was precisely to collapse significantly different operations into the one term measurement, thus semantically stretching the term. In particular, on the one hand STEVENS did not distinguish between continuous and discrete properties (and therefore between the count unit and the measurement unit); on the other, he invented two monsters, or Frankenstein measures: the nominal and ordinal scales. By using the concept of the nominal scale, he expanded the term measurement to include the activity of assigning objects to classes; that is, he confused measurement with classification. But a classification is the opposite of a measurement (SARTORI, 1970). As TORGERSON pointed out, "numeric labels can be used to name classes [...] and this commonly happens. However, the fact that the number 8105 is assigned to a book in a library does not mean that the librarian measured the book" (1958, p.9) and further: "Otherwise, the classification, and even the naming of individual cases, becomes a form of measurement" (p.14). Simply by using the expression ordinal scale, STEVENS (1946) accomplished another terminological stretch, thereby rendering his typology of nominal, ordinal, interval and ratio scales even more misleading, as MARRADI pointed out:

"if the property consists of a series of ordered categories, not only is there not any measurement, but there is also no comparison between objects […] Therefore, the label 'ordinal measurement' seems completely inappropriate, and should be replaced by the label 'allocation to ordered categories'" (1981, p.601; see also VELLEMAN & WILKINSON, 1993). [19]

4.7 Scaling is not measurement

In scaling, the measurement unit is established by the researcher, who divides (arbitrarily) the hypothetical continuum into discrete statuses, e.g., the response alternatives (such as strongly agree, fairly agree, fairly disagree and strongly disagree), before administering them (as a scale) to the research participants. Hence, on the one hand, the property of the object is considered continuous; on the other hand, the statuses of the property are conceived as discrete—an evident contradiction. In fact, the distance between categories cannot be quantified in this way. The only possible operation is to establish whether one status is greater or smaller than another. So, the variables produced are only ordered categorically, i.e., on an ordinal rather than a metric scale (necessary in the case of measurement). In fact, researchers engaged in decades of methodological research on social, pragmatic and cognitive aspects of surveys have documented that this approach has led to several biases (overview in GOBO & MAUCERI, 2014). Hence, it turns out that this (supposed) measurement unit is just arbitrary, not intersubjective and not replicable due to the polysemy of meanings attributed by interviewees to the response alternatives (the discrete statuses on the scale crafted by the researchers). [20]
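To see why this supposed unit is arbitrary, consider a minimal sketch (my own invented responses, not data from the studies cited): two equally legitimate numeric codings of the same ordered response alternatives produce different means, whereas the median category, which relies only on order, stays the same.

```python
# A toy demonstration (invented responses): the numbers assigned to Likert response
# alternatives are a convention, not a measurement unit. An order-preserving recoding
# changes the mean but not order-based summaries such as the median category.
from statistics import mean

responses = ["strongly agree", "fairly agree", "fairly agree", "fairly disagree",
             "strongly agree", "fairly disagree", "strongly disagree"]

coding_a = {"strongly disagree": 1, "fairly disagree": 2, "fairly agree": 3, "strongly agree": 4}
coding_b = {"strongly disagree": 1, "fairly disagree": 4, "fairly agree": 7, "strongly agree": 20}

print(mean(coding_a[r] for r in responses))   # ~2.71
print(mean(coding_b[r] for r in responses))   # 9 -> the "mean attitude" depends on the arbitrary coding
ordered = sorted(responses, key=coding_a.get)
print(ordered[len(ordered) // 2])             # "fairly agree": the median category survives any order-preserving recoding
```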

In conclusion, in the social sciences we have metric scales (ratio scales and interval scales, with a measurement unit), absolute scales (the product of a count), scaling and classifications (MARRADI, 1981). We rarely measure (measurement unit or how much), we sometimes count (count unit or how many), we often scale (scale unit or what is the most important ..., the degree of ...) and we more often classify (what and how). Therefore, measurement should be pushed back into its original riverbed, confined in its use only to certain and specific cognitive operations, processes and practices. The same should apply to the other three terms. This would remove some romanticism from qualitative research and some scientism from quantitative research. [21]

5. Sampling and Generalization

Furthermore, with regard to the two key issues of sampling and generalization, prominent mixed methods scholars (COLLINS, ONWUEGBUZIE & JIAO, 2007; ONWUEGBUZIE & COLLINS, 2007) and those referring to the wave of pragmatism (JOHNSON & ONWUEGBUZIE, 2004) have accepted the (positivistic) received view, without problematizing its (highly questionable) basic concepts. For example, the "basic MM sampling strategies, sequential MM sampling, concurrent MM sampling, and multilevel MM sampling" proposed by TEDDLIE and YU (2007) are just a refinement and juxtaposition of both qualitative and quantitative received views, without any reflection on the epistemological, methodological and technical failings embodied in these two views. In the same way, COLLINS et al., discussing the appropriate minimum sample size for each research method, offered a conventional view:

"statistical generalizability refers to representativeness, whereas analytic generalizability and case-to-case transfer relate to conceptual power (Miles & Huberman, 1994). Sampling designs play a pivotal role in determining the type of generalization that is justifiable. In particular, whereas large and random samples tend to allow statistical generalizations, small and purposive samples tend to facilitate analytical generalizations and case-to-case transfers. As such, quantitative researchers tend to make statistical generalizations, whereas qualitative researchers tend to make either analytic generalizations or case-to-case transfers" (2007, p.267). [22]

No mention is made of at least three issues (GOBO, 2008):

The main benefit to be gained from adopting these criteria in mixed methods research is that researchers thereby also take on a sociologically-oriented sampling theory (instead of a statistically-oriented one, which has its roots in natural sciences). This means that (among other things) they acknowledge the following implications: 1. abandoning the (statistical) principle of probability, 2. recovering the (statistical) principle of variance and 3. paying renewed attention to the units of sampling and analysis (beyond the conventional habit of sampling persons) such as beliefs, attitudes, stereotypes, opinions, emotions, motivations, behaviors, social relations, meetings, interactions, ceremonies, rituals, networks, rules and social conventions, situations and so on. Hence, it is possible to achieve representativeness without probability and to pursue a new path towards legitimizing generalizations in research where probability samples are not used (details in GOBO, 2008). [24]

However, in mixed methods research, few have followed DENZIN and LINCOLN's (2005) request for a serious rethinking of the concepts of validity, generalizability and reliability. Although the reconceptualization of these terms—re-theorized in postpositivist, constructivist-naturalistic, feminist, interpretative, post-structural, and critical approaches (LINCOLN & GUBA, 1985)—has resulted in significant replacements for positivist concepts (e.g., credibility for internal validity; transferability for external validity; dependability for reliability; confirmability for objectivity), such conceptual changes still have too little effect on mixed methods research practice and methodology. [25]

Yet, for some time now, theoretical and practical proposals have existed (GOMM, HAMMERSLEY & FOSTER 2000; PAYNE & WILLIAMS, 2005), whose proponents suggested that we need to sample attitudes and behaviors, types of actions or events, instead of just individuals: "not, then, men and their moments. Rather moments and their men" (GOFFMAN, 1967, p.3); "not only people but moments of lived life" (CONVERSE & SCHUMAN, 1974, p.1); "incidents and not persons per se" (STRAUSS & CORBIN, 1990, p.177); sequences instead of instances (SILVERMAN, 2005). Social scientists taking these positions provide an alternative to the common practices of sampling persons and seeking information from them about behaviors and events that are never observed directly by the researcher (CICOUREL, 1996), and could thereby open the way to a possible, real third approach. [26]

6. The Multiple Meanings of Qualitative and Their Uptake in Mixed Methods Research

The quantitative imprinting in mixed methods is traceable in the qualitative-quantitative debate, particularly in the meanings assigned to the term qualitative. It is not easy to determine when this term became part of the methodological literature.11) This simultaneously misleading, fortunate, and ruinous term probably emerged in the 1950s. It is misleading, because through its ambiguity it became an abused buzz-word, "a 'catch-all' for non-positivist inquiry" (GIDDINGS, 2006, p.199); fortunate, because it became (and still is) a fashionable term. Ruinous, because through frequently using it, methodologists contributed to the rise of the paradigm wars and prevented a more subtle and sharper exploration of the diverse epistemological and methodological problems of doing research. For this reason, many scholars have pointed out the questionable use of qualitative and quantitative descriptors (and their false dichotomy), stating that this binary distinction does not hold in practice and that by perpetuating it researchers encourage an unacceptable polarization and thus ultimately minimize diversity in methods (GIDDINGS, 2006; HAMMERSLEY, 2018; SANDELOWSKI, 2014; SANDELOWSKI, VOILS & KNAFL, 2009; VOGT, 2008). [27]

6.1 The great misapprehension

Instead of highlighting differences and diversity in methods (which is epistemologically useful for understanding which methods can be combined and which cannot), a considerable number of the scholars active in the 1990s mixed methods movement promoted a vision in which differences and diversity were leveled, stating that the quantitative/qualitative distinction was overdrawn, or even meaningless. The mantra was: There is quality in quantity, and quantity in quality; they are two sides of the same phenomenon; they are complementary, like yin and yang. MORGAN (2018) called this the indistinguishability thesis and rightly challenged this position. After all, it is also true that an accepted list of necessary and sufficient conditions constituting the quantitative and the qualitative does not exist, which makes their boundaries fuzzy. [28]

However, contributors to this debate have not wholly captured the deeper underlying nature of the problem, as can be explicated with the following ethnographic note: There are six candidates in the waiting room of the recruiting office. Some are middle-aged, others are young. The duration of the job interviews varies from twenty to thirty minutes. Interviewed after the job interview, some are satisfied with the conversation with the recruiter; others confide that they are disappointed with how they were treated. [29]

In this description, the researcher counts (the candidates), measures (the time of the job interview), classifies (the candidates by their age), and scales (their satisfaction/disappointment). Hence, measuring, counting, scaling and classifying social situations and sequences are just different-but-complementary ways of collecting, assembling and analyzing data, even in qualitative research (BECKER, 1970). The misapprehension was to use this shared truth about the lack of rigid distinction between what is qualitative and what is quantitative as a way to radically level the (still existing) multiple and ineradicable differences between quantitative and qualitative methods. This misapprehension was especially appealing to quantitative or post-positivistic researchers. As a matter of fact, since the 1990s the term qualitative has been stretched and expanded into a multitude of concepts, diluting the meaning it had when it first emerged in the late 1950s (depicted by DENZIN and LINCOLN as an interpretive and naturalistic approach, where "qualitative researchers study things in their natural settings, attempting to make sense of, or to interpret, phenomena in terms of the meanings people bring to them" [2000, p.3]), to such an extent that it now encompasses almost everything, even research with questionnaires. In mixed methods literature, qualitative is now synonymous with words (GREENE, CARACELLI & GRAHAM, 1989), interpretation, classification, theory and judgements (GORARD, 2010), texts (FAKIS, HILLIAM, STONELEY & TOWNEND, 2014), understanding of the world, exploration, comparison, bibliographic research, review of the literature, maps and visual formats, definition of the research topic and levels of the analysis, archival research to identify variables (AMATURO & PUNZIANO, 2016). [30]
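As a minimal sketch (with invented values that merely mirror the ethnographic note above), the same small field observation can be recorded so that each of the four operations, counting, measuring, classifying and scaling, is kept distinct rather than being lumped together as "measurement".

```python
# A toy rendering of the ethnographic note above (values invented for illustration):
# the researcher counts, measures, classifies and scales within the same observation.
candidates = [
    {"age": 52, "interview_minutes": 25, "verdict": "satisfied"},
    {"age": 24, "interview_minutes": 30, "verdict": "disappointed"},
    {"age": 47, "interview_minutes": 20, "verdict": "satisfied"},
    {"age": 29, "interview_minutes": 22, "verdict": "disappointed"},
    {"age": 55, "interview_minutes": 28, "verdict": "satisfied"},
    {"age": 26, "interview_minutes": 27, "verdict": "satisfied"},
]

n_candidates = len(candidates)                                        # counting (count unit: one candidate)
durations = [c["interview_minutes"] for c in candidates]              # measuring (measurement unit: the minute)
age_classes = ["young" if c["age"] < 40 else "middle-aged" for c in candidates]  # classifying
verdicts = [c["verdict"] for c in candidates]                         # scaling (valenced categories)

print(n_candidates, min(durations), max(durations),
      age_classes.count("young"), verdicts.count("satisfied"))
```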

This misapprehension continues. For instance, statisticians invented the qualitative variable (dichotomous, dummy or categorical variables applied in regression and factor analysis). For them, qualitative activity also includes interpreting statistical results, interpreting rotated factors, or identifying and naming multivariate clusters and dimensions. Even new mathematics (or non-linear mathematics), which is about relationships and models and underlies dynamic systems (e.g., chaos theory, complexity theory, fractal geometry), has been called qualitative. What in the past was labeled quantitative has now been relabeled qualitative. [31]

Even ordinary features of commonsense reasoning (making inferences, deduction and abduction) and everyday life activities (observing, watching, describing, listening, talking, asking questions, and so on) have been framed as qualitative. However, classifying (for instance) observation as qualitative is misleading because, following this argument, laboratory experiments (also based on observations) become mixed methods research. Consequently, everything seems to be mixed methods research, lending it a featureless identity. However, there must be a difference between fieldwork and a vacation, an ethnographer and a tourist, even though they both observe, describe and ask questions of locals. In the same way, labeling induction and subjectivity as qualitative (and deduction and objectivity as quantitative), as MORGAN (2007) did, is a traditional and unhelpful view. In the end, stretching the term qualitative has left most of the traditional epistemological and methodological problems unsolved. In addition, many mixed methods researchers have effectively frozen quantitative methodology rather than encouraging a more flexible exploration of its methods. As GIDDINGS (2006) and SMYTHE (2005) have argued, the traditional positivist presumptions are still reproduced and are kept active in most mixed methods research, along with the ambiguity (and extended semantics) of the term qualitative.12) [32]

6.2 A muddle

If the above discussion seems too abstract, let us consider a couple of examples that give an idea of the consequences of the various uses and definitions of qualitative circulating in the mixed methods community. First, a conversation analyst has very little in common with someone who conducts thematic analyses of discursive interviews (SILVERMAN, 2017). Their research practices are very different and irreconcilable. Yet, in mixed methods literature, both would be classified as qualitative researchers. Second, qualitative comparative analysis (QCA) is a data analysis technique, originally developed by Charles RAGIN (1987), to determine which logical conclusions can be supported by a data set. The first step in the analysis consists of listing and counting all the combinations of variables observed in the data set, followed by applying the rules of logical inference (or Boolean algebra) and truth tables to determine which descriptive inferences or implications can be supported. In this technique, social actors' intentions, meanings, motives, accounts and beliefs (the basics of any so-called qualitative research) are treated narrowly. An ethnographer would be uncomfortable with this technique. Yet, in mixed methods literature, both would be classified as qualitative. These two examples (among many others) show that it is necessary to re-think the proposal that the qualitative and the quantitative are compatible. When engaging in this re-thinking, methodologists would have to start at the level of each specific method, looking at its intrinsic nature, limits, and potentials, and would have to abandon the simple and naïve combination of methods and techniques with which this unhelpful dichotomy is reproduced, resulting in a sort of epistemological bricolage. [33]
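The first, counting step of QCA described above can be sketched as follows (a minimal illustration with invented, dichotomized toy cases; the condition names and the omitted Boolean minimization step are assumptions of the example, not taken from RAGIN):

```python
# A toy truth table for QCA's first step (invented cases): list and count every
# observed combination of dichotomized conditions together with the outcome.
# Boolean minimization, the step that follows in QCA, is omitted here.
from collections import Counter
from itertools import product

cases = [
    {"strong_union": 1, "left_government": 1, "welfare_expansion": 1},
    {"strong_union": 1, "left_government": 0, "welfare_expansion": 1},
    {"strong_union": 0, "left_government": 1, "welfare_expansion": 0},
    {"strong_union": 0, "left_government": 0, "welfare_expansion": 0},
    {"strong_union": 1, "left_government": 1, "welfare_expansion": 1},
]

conditions = ["strong_union", "left_government"]
rows = Counter(
    (tuple(case[c] for c in conditions), case["welfare_expansion"]) for case in cases
)

print("configuration -> outcome : number of cases")
for config in product((1, 0), repeat=len(conditions)):
    for outcome in (1, 0):
        n = rows[(config, outcome)]
        if n:
            print(dict(zip(conditions, config)), "->", outcome, ":", n)
```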

7. The Pragmatic Approach

Another tendency that can lead scholars to underestimate, and then replicate rather than overcome, a subtle and indirect positivism in existing mixed methods research practice is the reliance on pragmatism as a methodological foundation. Pragmatism has thus been proposed as "the primary philosophy of mixed research" (JOHNSON et al., 2007, p.113); the "philosophical program for social research, regardless of whether that research uses qualitative, quantitative or mixed methods" (MORGAN, 2014, p.1045). Hence, the mixed methods researcher should be pragmatic (ONWUEGBUZIE & LEECH, 2005). This vision is inspired by the tradition of American pragmatic philosophy and is based "on concepts such as 'lines of action' (from William James and George Herbert Mead) and 'warranted assertions' (from John Dewey), along with a general emphasis on 'workability' (from both James and Dewey)" (MORGAN, 2007, p.66). Proponents of this approach intended to do away with all the "metaphysical concerns [...] related to the nature of reality and truth" (p.49), because these are too abstract and "tell us little about more substantive decisions such as what to study and how to do so" (p.53); in addition, they claimed that, when working with non-pragmatic approaches, researchers gained too few insights for translating such metaphysical issues into practical guidance on how to make decisions about actual research. Hence, in their view, such approaches produced a "belief system [which] remains disconnected from practical decisions about the actual conduct of research" (p.64). [34]

In contrast, it was assumed that when adopting the three pragmatic concepts cited by MORGAN above, researchers following different approaches are able to come together (on a practical level) by using mixed methods, in order to build shared meanings and joint actions based on common communication and reciprocal persuasion, focusing on methodology and "research questions rather than metaphysical assumptions" (p.67); because "methods are not automatically 'appropriate.' Instead, it is we ourselves who make the choice about what is important and what is appropriate, and those choices inevitably involve aspects of our personal history, social background, and cultural assumptions" (p.69). [35]

Most of the assertions outlined above are wise and fully acceptable. However, when we move to the (really) practical side and act (really) as pragmatists, we realize that the metaphysical concerns (mocked by MORGAN, 2007) are not actually very metaphysical. For instance, in their inquiries, will pragmatic researchers actually use the traditional, standardized and positivistic questionnaire, with forced choices and closed questions that strongly limit any interpretative and interactional perspective? Would they do so if they recalled that these conventional fixed formats are considered to be the basis of several well-known response errors and biases such as social desirability effects, the yea-saying and response set phenomena, the influence of the response alternatives on the formation of the answer, the misunderstanding of the response alternatives by the interviewees, the multiple semantic meanings of response alternatives and the invented opinions (or lies) phenomenon (GOBO & MAUCERI, 2014)? Even though it is fair to consider the conflict between the two traditional approaches outdated, nonetheless, mixing the behaviorist standardized survey interview with the discursive interview and using them as a simple tool in line with a post-positivistic attitude (i.e., without taking into account the nature, potential and intrinsic limits of these two methods) is quite problematic and is also at the basis of some inconsistencies in mixed methods research findings (BRYMAN, 2007; HESSE-BIBER, 2015; O'CATHAIN, MURPHY & NICHOLL, 2007). In other words, most mixed methods investigators have simply swept under the carpet important epistemological and methodological issues that were contentious at the start of the qualitative-quantitative paradigm wars. [36]

Obviously, it is not a question of rejecting the survey (as qualitative researchers do), but only its standard version or approach (GOBO & MAUCERI, 2014). In fact, some methodological proposals for overcoming it already exist: flexible interviewing (SCHOBER & CONRAD, 1997), the event history calendar (BELLI, STAFFORD & ALWIN, 2009) or the intervey (GOBO, 2011). If adopted, these techniques can be used as concrete alternatives to the positivistic survey in mixed methods. However, they are still only marginally established in the mixed methods community. Hence, the methodological and epistemological weaknesses of the pragmatist approach become visible precisely at the level of research practice. These shortcomings are traceable in statements such as:

"one paradigm (like pragmatism) serves as an adequate foundation for concurrent or parallel types of designs, while paradigms may shift during a sequential design in which one starts from a postpositivist perspective [...] and then moves to a constructivist (qualitative) worldview" (CRESWELL, 2009, p.102). [37]

This vague reference to the broad field of epistemological and methodological pragmatism may not be overly helpful when a researcher is faced with actual practical decisions. Moreover, the pitfalls of a pragmatic approach are also revealed when MORGAN (2007) affirmed that mixed methods research is characterized by abduction (while qualitative researchers would proceed by induction and quantitative researchers by deduction), intersubjectivity (while the former would be focused on the study of subjective processes and the latter on objective ones) and transferability (while qualitative researchers remain anchored to the context and quantitative researchers to generality). However, MORGAN forgot that abduction is a human cognitive process, therefore applied (certainly unconsciously) by both qualitative and quantitative researchers; that intersubjectivity is continuously studied by both qualitative (symbolic interactionism and ethnomethodology) and quantitative researchers and that many qualitative studies have been generalized, becoming pillars of sociological theory.13) [38]

Although the pragmatic approach is presented as suitable for dissolving differences and neutralizing epistemological barriers, in practice, without problematizing and removing the positivist features (the traditional concepts and uses of measurement, the questionnaire, sampling, generalization, and so on) of current research methods when they are used alone, researchers end up reproducing positivism in disguise. Mixed methods research would benefit from researchers reverting to the original tenets of qualitative methods (the interpretive and naturalistic approach, as I mentioned above quoting DENZIN & LINCOLN, 2000, p.3) along with a non-positivistic vision and application of quantitative methods (as proposed by several scholars, who cannot all be named due to lack of space: survey methodologists such as, in historical order, Paul F. LAZARSFELD, Johan GALTUNG, William J. GOODE and Paul K. HATT, Robert L. KAHN and Charles F. CANNELL, Robert J. MOORE, Jean MORTON-WILLIAMS, Howard SCHUMAN, Stanley PRESSER, Alberto MARRADI, Ray PAWSON, Elliot G. MISHLER, Johannes van der ZOUWEN; cognitivists such as, in alphabetical order, Paul BEATTY, Norman M. BRADBURN, Frederick CONRAD, Robert M. GROVES, Hans-J. HIPPLER, Nora Cate SCHAEFFER, Michael F. SCHOBER, Norbert SCHWARZ, Seymour SUDMAN, Judith M. TANUR; ethnomethodologists, sociolinguists and conversation analysts such as Charles L. BRIGGS, Aaron CICOUREL, Douglas MAYNARD, Hugh MEHAN, Hanneke HOUTKOOP-STEENSTRA). [39]

8. Concluding: Abandoning a Bricolage Methodology

According to what I have set out above, mixed methods methodologists have been unable to establish a third alternative because their methodology has never been completely emancipated (epistemologically, methodologically and technically) from the two approaches that were supposed to be mixed. In fact, to truly represent a third way, three features should be present: an epistemology, a methodology and a bag of methods that are alternatives to the previous qualitative and quantitative approaches.14) In contrast, mixed methods researchers from time to time draw on the toolbox of one or the other (which is positive anyway, of course) without proposing a (really) new independent view. Hence, most mixed methods investigators are still entangled in a positivist epistemology and methodology, where qualitative methods are used as tools only, without importing the associated and crucial constructivist epistemology and interpretative and naturalistic stances. This weakness seems to be obscured by the adoption of pragmatism, indicated as the way to resolve issues that proponents of the other two approaches have treated in a metaphysical, ideological, and impractical way. To this end, a bricolage methodology was invoked (KRONTOFT, FUGLSANG & KRONBORG, 2018; SHARP, 2019; YARDLEY, 2008, 2014). While I am well aware that bricolage is essential for creatively solving practical research problems rarely addressed in textbooks, this methodology does not always seem adequate for solving other practical problems we encounter in the field: e.g., which kind of questionnaire shall we use? A standardized one (FOWLER & MANGIONE, 1990) or conversational/flexible interviewing (GOBO, 2006, 2011; HOUTKOOP-STEENSTRA, 2000; RIESMAN, 1958; SCHOBER & CONRAD, 1997; SCHOBER, CONRAD & FRICKER, 2004)? Should we limit ourselves to sampling persons or include attitudes, actions, and behaviors? Shall we first establish, with methodological awareness, which objects can be measured, which counted and which simply classified? Shall we be on the side of STEVENS (1946) and the positivists, who stretch the concept of measurement to include many cognitive operations that have nothing to do with measurement, or on the side of interpretative approaches? We could continue our questions, retracing many of the still unresolved issues raised by interpretative methodologists (and hidden under the carpet by mixed methods researchers). [40]

In order to bridge the two different philosophical approaches (represented by the shorthand of quantitative or qualitative) and construct a real third one (an independent methodology), it is certainly important (as already stated by several authors) to overcome the conventional binary classification of qualitative and quantitative methods. However, this overcoming must take place in a complete, organic and systematic way; it should not be simply limited to the joint use of those methods within the same research study. In other words, it is also necessary to overcome the languages, concepts and mental models of the two approaches. To use an example from gender studies, the complexity and articulation of gender identity cannot be understood if the dichotomy between male and female remains the point of reference. In the same way, a third approach can be constituted if the methodological (and not only epistemological) foundations of the quantitative and qualitative approaches are also questioned and replaced by a different methodological language, constructed by melding and fusing the previous two, retaining what is good in them and eliminating the rest, i.e., by developing a genuinely new framework. [41]

A new epistemic culture (KNORR CETINA, 1999) would be one in which issues such as how to consider evidence, how to test a hypothesis, how to build a sample, how to construct and administer a questionnaire, how to generalize, and so on, are re-framed from an interpretative perspective. Some proposals in this direction already exist (e.g., GOBO, FIELDING, LA ROCCA & VAN DER VAART, 2022) and they represent a different approach to empirical research in the social sciences. They are based on a reinterpretation of some of the classics of methodological literature, along with an integration of recent relevant attempts at reshaping old issues in a new frame (e.g., HAMMERSLEY, 1987 on validity and reliability; HAMMERSLEY, 2008 and KELLE, 2001 on triangulation; HAMMERSLEY, 2010 on measurement; MAXWELL, 2012 on causality), and the use of truly hybrid and fully integrated research techniques (a list can be found in GOBO et al., 2022). Mixed methods researchers have done too little so far to achieve this full epistemological, methodological, and technical integration, not least because many mixed methods scholars have accepted the received view without questioning its constellation of terms and (positivistic) concepts. [42]

Notes

1) The preparation of this paper was supported by the Department of Philosophy "Piero Martinetti" of the University of Milan within the project "Departments of Excellence 2018-2022," awarded by the Ministry of Education, University and Research (MIUR). <back>

2) It might be useful to note that, in this case, the term paradigm appears quite improper. First, its proponent (Thomas KUHN) "never used it to mean a set of philosophical beliefs that served as a foundation for research" (MAXWELL, 2018, p.323). In addition, he believed that in the social sciences (which in his opinion were not strictly scientific disciplines, since a multitude of different and sometimes irreconcilable methods, concepts, theories, and research questions have always coexisted) there could never be a period of normal science. The latter exists when most of the scientists in a discipline adhere to the same paradigm: they agree on the same basic principles and share the same interests, practices, tools, etc. Precisely this (paradigmatic) consensus is the criterion that elevates a discipline to the rank of mature science.

The use of the term paradigm was "basically initiated by Lincoln and Guba (1985), in their justification of qualitative research as based on a naturalistic or constructivist paradigm that was incommensurable with the positivist paradigm assumed by quantitative research" (MAXWELL, 2018, p.323). This use of paradigm was largely confined to the qualitative researchers they influenced, until it was picked up by some mixed methods researchers. <back>

3) However, it is imperative to acknowledge that there are other third way approaches in mixed methods, specifically those where rigorous method integration is emphasized (e.g., POTH, 2018). In addition, there is a view amongst some mixed methods researchers that investigators who adopt MMR as a metaparadigm are less concerned with proposing new methods than with integrating existing ones. <back>

4) Qualitative methods were sometimes introduced to compensate for the limits of quantitative methods in natural sciences as well (astronomy, geology, medicine, epidemiology, ethology with the exception of the new archeology), although without the paradigm wars (MAXWELL, 2016). <back>

5) However, it was not only a sort of desperate attempt. In fact, some quantitative researchers who engaged with mixed methods (e.g., Jennifer GREENE, 2007 or, later, Cheryl POTH, 2018) considered them as a way to broaden their perspectives, an open-minded move towards something new. <back>

6) I absolutely do not intend to maintain that having such a background is a fault. Indeed, it is an important cultural resource and skill. The point is that these authors accepted and applied the received view (e.g., regarding STEVENS's [1946] scales, fixed response alternative formats, the concept of measurement, etc.), without criticizing it internally as other authors did (who are mentioned in the current text and share the same background). <back>

7) Obviously, there are different shades and applications of constructivism, as well as diversity and (sometimes even constructivist) reflexivity (GOBO, 1993) of the approaches they categorize as positivist. <back>

8) According to YEE and BREMNER, "the term 'bricolage' originated in French and is a modern equivalent to the English phrase of 'making-do'. In a general sense, a bricoleur (someone who employs the bricolage method) is described as a resourceful and creative 'fiddler or tinkerer', and one who out of necessity uses available materials to create new objects from existing ones" (2011, p.183). Observing their PhD students, the authors noted that while selecting and applying the most appropriate methods they preferred a pick-and-mix approach to an established method or methodology. The French anthropologist and ethnologist Claude LÉVI-STRAUSS (1962) defined the term as a spontaneous creative act in which a person used whatever was available to reach a desired outcome. In the context of cultural studies, NELSON, TREICHLER and GROSSBERG (1992) outlined bricolage as a pragmatic, strategic and self-reflexive method. In contrast, KINCHELOE (2001) employed the term to describe multi-perspectival research methods, in the sense that a researcher could compare and contrast multiple points of view by just using methods from different disciplines. Thus, a suitable indeterminate state derives from the relationship between inquiry and method, a state where not-knowing represents a constructive loop exploitable by the bricoleur. <back>

9) All translations from non-English texts are mine. <back>

10) This attitude has been highly prevalent in the social, anthropological and psychological sciences since their origins, especially among some (not all, of course) of their founders or influential scholars: The most important work of Adolphe QUÉTELET, known for introducing statistical methods to social sciences and the concept of l'homme moyen [the average man], is "Essai de la physique sociale" [Essay on Social Physics] (1835); Herbert SPENCER, in his "First Principles of a New System of Philosophy" (1862), proposed a synthesis between human and natural sciences, basing it on physical principles; at the same time the psychologists Robert HAMILTON (under the pseudonym of Leland A. WEBSTER) and James MCKEEN CATTELL made similar proposals, the latter affirming that psychology cannot achieve the certainty and accuracy of physical sciences if it is not based on experiments and measurement (1890); Émile DURKHEIM, in "Les règles de la méthode sociologique" [The Rules of Sociological Method] (1895), suggested that a sociologist should adopt the same attitude as physicists, chemists and physiologists towards the phenomena of their scientific fields; later, George A. LUNDBERG, recognized leader of the operationist movement in sociology and (in 1943) President of the American Sociological Association, stated that if we measure social phenomena, researchers following the path of social sciences will be led onto the same difficult terrain on which researchers in physics and other sciences have progressed to their current sensational triumphs (1938), and a decade later he declared that sociologists must strive to attain the same status as physicists (1947); similar positions were taken by the philosopher of science Hans REICHENBACH (1930), an illustrious exponent of the neo-positivist circle of Berlin, and by the psychologist Burrhus Frederic SKINNER, who declared that "the methods of physical sciences have obtained astonishing success wherever they have been used. Let's apply them to human affairs" (1953, p.5). The anthropologist Siegfried Frederick NADEL (1951) asserted that an anthropologist takes natural sciences as a model, trying to trace particular facts or events to general laws, and that there is only one scientific method: the one with which physics and chemistry have achieved their successes; the anthropologist George Peter MURDOCK (1949) maintained that cultural and social data can be treated like physical and biological facts, since they conform to natural laws with a little less accuracy than that characterizing the combination of atoms in chemistry and germs in biology. This phenomenon of psychologists, sociologists and anthropologists emulating the physical sciences has been denounced by BOURDIEU, CHAMBOREDON and PASSERON (1968), CINI (1994), CLINARD (1966), COATS (1989), DUNCAN (1984), GARDNER (1983), LEWIN (1935), LIEBERSON (1985), MACKENZIE and MACKENZIE (1974), McKINNEY (1966), MEEHAN (1968), MOKRZYCKI (1983), PARISI and CASTELFRANCHI (1978), PETERS (1958), RADNITZKY (1968), RAVINDRA (1975-1976), RUNCIMAN (1963), SHEPARD (1966), SOROKIN (1956), TAYLOR (1964), TORGERSON (1958), and many others. <back>

11) When the Chicago School was in fashion (one century ago), this distinction was not on the lips of its affiliates. They referred to field methods (even though survey researchers also referred to fieldwork and fielding a survey). The qualitative-quantitative divide was not present in the first 80 years of sociology as a named discipline. The belief that direct exposure to the field was essential to understanding social phenomena was also absent—most of the Chicago School scholars had their fieldwork done by graduate students or even people without any academic training at all, and some important qualitative studies were based completely on case notes written by third parties (e.g., social workers) for purposes other than sociological research. <back>

12) I suspect that the term quantitative has also undergone a similar though much less extensive dilution. <back>

13) MORENO (1951) and then LAZARSFELD (1958), for example, invented sociometric inquiry, which links the individuals being studied to their friends, relatives, neighbors, co-workers and social networks more generally. Their rationale was that individuals' behavior and attitudes must be related to the social contexts in which they live and work, reflecting their theoretical awareness of the socially situated nature of human actions. <back>

14) For an alternative proposal, see MAXWELL, CHMIEL and ROGERS (2015). <back>

References

Amaturo, Enrica & Punziano, Gabriella (2016). I mixed methods nella ricerca sociale [Mixed methods in social research]. Rome: Carocci.

Bazeley, Patricia (2018). Integrating analyses in mixed methods research. London: Sage.

Becker, Howard S. (1970). Sociological work: Method and substance. New Brunswick, NJ: Transaction Books.

Belli, Robert F.; Stafford, Frank & Alwin, Duane F. (2009). Calendar and time diary methods in life course research. Thousand Oaks, CA: Sage.

Biesta, Gert (2010). Pragmatism and the philosophical foundations of mixed methods research. In Abbas Tashakkori & Charles Teddlie (Eds.), Sage handbook of mixed methods in social & behavioral research (pp.95-118). Thousand Oaks, CA: Sage.

Blumer, Herbert (1956). Sociological analysis and the "variable". American Sociological Review, 21(6), 683-690.

Bourdieu, Pierre; Chamboredon, Jean-Claude & Passeron, Jean-Claude (1968). Le métier de sociologue. Préalables épistémologiques [The craft of sociology. Epistemological preliminaries]. Paris: Mouton.

Brannen, Julia (1992). Mixing methods: Qualitative and quantitative research. London: Gower.

Bryman, Alan (2007). Barriers to integrating quantitative and qualitative research. Journal of Mixed Methods Research, 1(1), 8-22.

Campbell, Donald T. & Fiske, Donald W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56(2), 81-105.

Cicourel, Aaron V. (1964). Method and measurement in sociology. New York, NY: Free Press.

Cicourel, Aaron V. (1996). Ecological validity and White room effects. Pragmatics and Cognition, 4(2), 221-264.

Cini, Marcello (1994). Un paradiso perduto. Dall'universo delle leggi naturali al mondo dei processi evolutivi [Paradise lost. From the universe of natural laws to the world of evolutionary processes]. Milan: Feltrinelli.

Clinard, Marshall B. (1966). The sociologist's quest for respectability. The Sociological Quarterly, 7, 399-412.

Coats, Alfred W. (1989). Explanations in history and economics. Social Research, 56(2), 331-360.

Collins, Kathleen M.T.; Onwuegbuzie, Anthony J. & Jiao, Qun G. (2007). A mixed methods investigation of mixed methods sampling designs in social and health science research. Journal of Mixed Methods Research, 1(3), 267-294.

Collins, Randall (1988). Theoretical sociology. San Diego, CA: Harcourt Brace Jovanovich.

Converse, Jean M. & Schuman, Howard (1974). Conversations at random: Survey research as interviewers see it. New York, NY: Wiley.

Creswell, John W. (2009). Research design. Qualitative, quantitative and mixed methods approaches. Thousand Oaks, CA: Sage.

Creswell, John W. & Plano Clark, Vicki L. (2017). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.

Cronbach, Lee J. (1975). Beyond the two disciplines of scientific psychology. American Psychologist, 30(2), 116-127.

Daigneault, Pier-Marc & Jacob, Steve (2014). Unexpected but most welcome: Mixed methods for the validation and revision of the participatory evaluation measurement instrument. Journal of Mixed Methods Research, 8(1), 6-24.

Denzin, Norman K. & Lincoln, Yvonna S. (2000). The discipline and practice of qualitative research. In Norman K. Denzin & Yvonna S. Lincoln (Eds.), The Sage handbook of qualitative research (pp.1-32). Thousand Oaks, CA: Sage.

Denzin, Norman K. & Lincoln, Yvonna S. (2005). Introduction: The discipline and practice of qualitative research. In Norman K. Denzin & Yvonna S. Lincoln (Eds.), The Sage handbook of qualitative research (pp.1-32). Thousand Oaks, CA: Sage.

Diesing, Paul R. (1971). Patterns of discovery in the social sciences. Chicago, IL: Aldine-Atherton.

Duncan, Otis D. (1984). Notes on social measurement. Historical and critical. New York, NY: Russell Sage Foundation.

Durkheim, Émile (1895). Les règles de la méthode sociologique [The rules of sociological method]. Paris: Payot.

Fakis, Apostolos; Hilliam, Rachel; Stoneley, Helen & Townend, Michael (2014). Quantitative analysis of qualitative information from interviews: A systematic literature review. Journal of Mixed Methods Research, 8(2), 139-161.

Flick, Uwe (2017). Mantras and myths: The disenchantment of mixed-methods research and revisiting triangulation as a perspective. Qualitative Inquiry, 23(1), 46-57.

Fowler, Floyd J. & Mangione, Thomas W. (1990). Standardized survey interviewing. Minimizing interviewer-related error. London: Sage.

Gardner, Howard (1983). Frames of mind. The theory of multiple intelligences. London: Paladin Books.

Giddings, Lynne S. (2006). Mixed-methods research: Positivism dressed in drag. Journal of Research in Nursing, 11(3), 195-203.

Glaser, Barney G. & Strauss, Anselm L. (1967). The discovery of grounded theory. Chicago, IL: Aldine.

Gobo, Giampietro (1993). Le forme della riflessività. Da costrutto epistemologico a practical issue [The forms of reflexivity. From epistemological construct to practical issue]. Studi di Sociologia, 31(3), 299-317.

Gobo, Giampietro (2006). Set them free. Improving data quality by broadening interviewer's task. International Journal of Social Research Methodology: Theory & Practice, 9(4), 279-301, https://www.tandfonline.com/doi/full/10.1080/13645570600916064 [Accessed: January 16, 2023].

Gobo, Giampietro (2008). Re-conceptualizing generalization. Old issues in a new frame. In Pertti Alasuutari, Julia Brannen & Leonard Bickman (Eds.), The Sage handbook of social research methods (pp.193-213). London: Sage.

Gobo, Giampietro (2011). Back to Likert. Towards a conversational survey. In Malcolm Williams & Paul Vogt (Eds.), The Sage handbook of innovation in social research methods (pp.228-248). London: Sage.

Gobo, Giampietro & Mauceri, Sergio (2014). Constructing survey data. An interactional approach. London: Sage.

Gobo, Giampietro; Fielding, Nigel G.; La Rocca, Gevisa & van der Vaart, Wander (2022). Merged methods: A rationale for full integration. London: Sage.

Goffman, Erving (1967). Interaction ritual: Essays in face-to-face behavior. Chicago, IL: Aldine.

Gomm, Roger; Hammersley, Martyn & Foster, Peter (Eds.) (2000). Case study method. London: Sage.

Gorard, Stephen (2010). Research design, as independent of methods. In Abbas Tashakkori & Charles Teddlie (Eds.), Sage handbook of mixed methods in social & behavioral research (pp.237-251). Thousand Oaks, CA: Sage.

Greene, Jennifer C. (2007). Mixed methods in social inquiry. San Francisco, CA: Jossey-Bass.

Greene, Jennifer C.; Caracelli, Valerie J. & Graham, Wendy F. (1989). Toward a conceptual framework for mixed method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255-274.

Hall, Jori N. (2013). Pragmatism, evidence, and mixed methods evaluation (Special Issue: Mixed methods and credibility of evidence in evaluation). New Directions for Evaluation, 138(2), 15-26.

Hammersley, Martyn (1987). Some notes on the terms "validity" and "reliability". British Educational Research Journal, 13(1), 73-81.

Hammersley, Martyn (2008). Troubles with triangulation. In Manfred Max Bergman (Ed.), Advances in mixed methods research (pp.22-36). London: Sage.

Hammersley, Martyn (2010). Is social measurement possible, and is it necessary? In Geoffrey Walford, Eric Tucker & Madhu Viswanathan (Eds.), The Sage handbook of measurement (pp.409-426). London: Sage.

Hammersley, Martyn (2018). Commentary—on the "indistinguishability thesis": A response to Morgan. Journal of Mixed Methods Research, 12(3), 256-261.

Hesse-Biber, Sharlene (2015). Mixed methods research: The "thing-ness" problem. Qualitative Health Research, 25(6), 775-788.

Houtkoop-Steenstra, Hanneke (2000). Interaction and the standardized survey interview. The living questionnaire. Cambridge: Cambridge University Press.

Howe, Kenneth R. (1988). Against the quantitative-qualitative incompatibility thesis or dogmas die hard. Educational Researcher, 17, 10-16.

Howe, Kenneth R. (2004). A critique of experimentalism. Qualitative Inquiry, 10(4), 42-61.

Howell Smith, Michelle C.; Babchuk, Wayne A.; Stevens, Jared; Garrett, Amanda L.; Wang, Sherry C. & Guetterman, Timothy C. (2020). Modeling the use of mixed methods-grounded theory: Developing scales for a new measurement model. Journal of Mixed Methods Research, 14(2), 184-206, https://doi.org/10.1177/1558689819872599 [Accessed: January 16, 2023].

Jick, Todd D. (1979). Mixing qualitative and quantitative methods: Triangulation in action. Administrative Science Quarterly, 24(4), 602-611.

Johnson, Burke R. & Onwuegbuzie, Anthony J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26.

Johnson, Burke R.; Onwuegbuzie, Anthony J. & Turner, Lisa A. (2007). Toward a definition of mixed methods research. Journal of Mixed Methods Research, 1(2), 112-133.

Kaplan, Abraham (1964). The conduct of inquiry: Methodology for behavioral science. San Francisco, CA: Chandler.

Karasz, Alison & Singelis, Theodore M. (2009). Qualitative and mixed methods research in cross-cultural psychology. Journal of Cross-Cultural Psychology, 40(6), 909-916.

Kelle, Udo (2001). Sociological explanations between micro and macro and the integration of qualitative and quantitative methods. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 2(1), Art. 5, https://doi.org/10.17169/fqs-2.1.966 [Accessed: August 3, 2022].

Kincheloe, Joe L. (2001). Describing the bricolage: Conceptualizing a new rigor in qualitative research. Qualitative Inquiry, 7(6), 679-692, https://link.springer.com/chapter/10.1007/978-94-6091-397-6_15 [Accessed: January 16, 2023].

Knorr Cetina, Karin (1999). Epistemic cultures: How the sciences make knowledge. Cambridge, MA: Harvard University Press.

Krontoft, Anna; Fuglsang, Lars & Kronborg, Hanne (2018). Innovation activity among nurses: The translation and preliminary validation of the Bricolage Measure—a mixed-method study. Nordic Journal of Nursing Research, 38(3), 151-159, https://journals.sagepub.com/doi/full/10.1177/2057158517733931 [Accessed: January 16, 2023].

Lazarsfeld, Paul Felix (1958). Evidence and inference in social research. Daedalus, 87(4), 99-130.

Lévi-Strauss, Claude (1962). La Pensée sauvage [The savage mind]. Paris: Plon.

Lewin, Kurt (1935). A dynamic theory of personality. New York, NY: McGraw-Hill.

Lieberson, Stanley (1985). Making it count. Berkeley, CA: University of California Press.

Likert, Rensis (1932). A technique for the measurement of attitudes. Archives of Psychology, 22(140), 1-55.

Lincoln, Yvonna S. & Guba, Egon G. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage.

Lundberg, George A. (1938). The concept of law in the social sciences. Philosophy of Science, 5(2), 189-203.

Lundberg, George A. (1947). Can science save us? New York, NY: Longmans, Green and Co.

Luyt, Russell (2012). A framework for mixing methods in quantitative measurement development, validation, and revision: A case study. Journal of Mixed Methods Research, 6(4), 294-316.

Mackenzie, Brian D. & Mackenzie, S. Lynne (1974). The case for a revised systematic approach to the history of psychology. Journal of the History of the Behavioral Sciences, 10(3), 324-347.

Marradi, Alberto (1981). Misurazione e scale: Qualche riflessione e una proposta [Measurement and scales: Some reflections and a proposal]. Quaderni di Sociologia, 29(4), 595-639.

Marradi, Alberto (1985). Unità di misura e unità di conto [Units of measurement and units of counting]. Rassegna Italiana di Sociologia, 24(2), 229-238.

Maxcy, Spencer (2003). Pragmatic threads in mixed methods research in the social sciences: The search for multiple modes of inquiry and the end of the philosophy of formalism. In Abbas Tashakkori & Charles Teddlie (Eds.), Handbook of mixed methods in social & behavioral research (pp.51-89). Thousand Oaks, CA: Sage.

Maxwell, Joseph A. (2012). The importance of qualitative research for causal explanation in education. Qualitative Inquiry, 18(8), 655-661.

Maxwell, Joseph A. (2016). Expanding the history and range of mixed methods research. Journal of Mixed Methods Research, 10(1), 12-27.

Maxwell, Joseph A. (2018). The "silo problem" in mixed methods research. International Journal of Multiple Research Approaches, 10(1), 317-327.

Maxwell, Joseph A.; Chmiel, Margaret & Rogers, Sylvia (2015). Designing integration in mixed method and multi-method research. In Sharlene Nagy Hesse-Biber & Burke R. Johnson (Eds.), Oxford handbook of multimethod and mixed methods research inquiry (pp.223-239). New York, NY: Oxford University Press.

McKinney, John C. (1966). Constructive typology and social theory. New York, NY: Appleton-Century-Crofts.

Meehan, Eugene J. (1968). Explanation in social science. A system paradigm. Homewood, IL: Dorsey Press.

Mokrzycki, Edmund (1983). Philosophy of science and sociology. From the methodological doctrine to research practice. London: Routledge & Kegan Paul.

Moreno, Jacob L. (1951). Sociometry, experimental method and the science of society. Ambler: Beacon House.

Morgan, David L. (2007). Paradigms lost and pragmatism regained. Methodological implications of combining qualitative and quantitative methods. Journal of Mixed Methods Research, 1(1), 48-76.

Morgan, David L. (2014). Pragmatism as a paradigm for social research. Qualitative Inquiry, 20(8), 1045-1053.

Morgan, David L. (2018). Living within blurry boundaries: The value of distinguishing between qualitative and quantitative research. Journal of Mixed Methods Research, 12(3), 268-279.

Morse, Janice M. (2005). Evolving trends in qualitative research: Advances in mixed methods designs. Qualitative Health Research, 15(5), 583-585, https://doi.org/10.1177/1049732305275169 [Accessed: January 16, 2023].

Murdock, George P. (1949). Social structure. New York, NY: Macmillan.

Nadel, Siegfried Frederick (1951). The foundations of social anthropology. London: Cohen & West.

Nelson, Cary; Treichler, Paula & Grossberg, Lawrence (1992). Cultural studies: An introduction. In Lawrence Grossberg, Cary Nelson & Paula Treichler (Eds.), Cultural studies (pp.1-16). New York, NY: Routledge.

O'Cathain, Alicia; Murphy, Elizabeth & Nicholl, Jon (2007). Integration and publications as indicators of "yield" from mixed methods studies. Journal of Mixed Methods Research, 1(2), 147-153.

Onwuegbuzie, Anthony J. & Leech, Nancy L. (2005). On becoming a pragmatic researcher: The importance of combining quantitative and qualitative research methodologies. International Journal of Social Research Methodology, 8(5), 375-387.

Onwuegbuzie, Anthony J. & Collins, Kathleen M.T. (2007). A typology of mixed methods sampling designs in social science research. The Qualitative Report, 12(2), 281-316, https://doi.org/10.46743/2160-3715/2007.1638 [Accessed: January 16, 2023].

Parisi, Domenico & Castelfranchi, Cristiano (1978). Una definizione della psicologia cognitivista [A definition of cognitive psychology]. In Gaetano Kanizsa & Paolo Legrenzi (Eds.), Psicologia della gestalt e psicologia cognitivista [Gestalt psychology and cognitive psychology] (pp.63-84). Bologna: Il Mulino.

Payne, Geoff & Williams, Malcolm (2005). Generalization in qualitative research. Sociology, 39(2), 295-314.

Pearce, Lisa D. (2012). Mixed methods inquiry in sociology. American Behavioral Scientist, 56(6), 829-848.

Peters, Richard S. (1958). The concept of motivation. London: Routledge & Kegan Paul.

Poth, Cheryl N. (2018). Innovation in mixed methods research. London: Sage.

Quételet, Adolphe (1835). Sur l'homme et le développement de ses facultés, ou Essai de la physique sociale [A treatise on man and the development of his faculties, or Essay on social physics]. Paris: Bachelier.

Radnitzky, Gerard (1968). Contemporary schools of metascience. Goeteborg: Akademiforlaget.

Ragin, Charles (1987). The comparative method: Moving beyond qualitative and quantitative strategies. Berkeley, CA: University of California Press.

Ravindra, Raj (1975-1976). Experiment and experience: A critique of modern scientific knowing. Dalhousie Review, 55(4), 655-674.

Reichenbach, Hans (1930). Die philosophische Bedeutung der modernen Physik [The philosophical significance of modern physics]. Erkenntnis, 1(1), 49-71.

Riesman, David (1958). Some observations on the interviewing in the teacher apprehension study. In Paul F. Lazarsfeld & Wagner Thielens (Eds.), The academic mind (pp.266-370). Glencoe, IL: Free Press.

Runciman, Walter G. (1963). Social science and political theory. Cambridge: Cambridge University Press.

Sandelowski, Margarete (2014). Unmixing mixed methods. Research in Nursing and Health, 37(1), 3-8, https://onlinelibrary.wiley.com/doi/full/10.1002/nur.21570 [Accessed: January 16, 2023].

Sandelowski, Margarete; Voils, Corrine I. & Knafl, George (2009). On quantitizing. Journal of Mixed Methods Research, 3(3), 208-222.

Sartori, Giovanni (1970). Concept misformation in comparative politics. American Political Science Review, 64(4), 1033-1053.

Schober, Michael F. & Conrad, Frederick G. (1997). Does conversational interviewing reduce survey measurement errors? Public Opinion Quarterly, 61(4), 576-602.

Schober, Michael F.; Conrad, Frederick G. & Fricker, Scott S. (2004). Misunderstanding standardized language in research interviews. Applied Cognitive Psychology, 18, 169-188, https://onlinelibrary.wiley.com/doi/abs/10.1002/acp.955 [Accessed: January 16, 2023].

Schreier, Margrit (2017). Contexts of qualitative research: Arts-based research, mixed methods, and emergent methods. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 18(2), Art. 6, https://doi.org/10.17169/fqs-18.2.2815 [Accessed: August 3, 2022].

Sedoglavich, Vesna; Akoorie, Michèle E. M. & Pavlovich, Kathryn (2015). Measuring absorptive capacity in high-tech companies: Mixing qualitative and quantitative methods. Journal of Mixed Methods Research, 9(3), 252-272.

Sharp, Heather (2019). Bricolage research in history education as a scholarly mixed-methods design. History Education Research Journal, 16(1), 50-62.

Shepard, Roger N. (1966). Metric structures in ordinal data. Journal of Mathematical Psychology, 3(2), 287-315.

Sieber, Sam D. (1973). The integration of fieldwork and survey methods. American Journal of Sociology, 78(6), 1335-1359.

Silverman, David (2005). Instances or sequences? Improving the state of the art of qualitative research. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 6(3), Art. 30, https://doi.org/10.17169/fqs-6.3.6 [Accessed: August 3, 2022].

Silverman, David (2017). How was it for you? The interview society and the irresistible rise of the (poorly analysed) interview. Qualitative Research, 17(2), 144-158.

Skinner, Burrhus F. (1953). Science and human behavior. New York, NY: Free Press.

Smith, Herman W. (1975). Strategies of social research: The methodological imagination. Englewood Cliffs, NJ: Prentice Hall.

Smythe, Elizabeth (2005). The thinking of research. In Pamela Ironside (Ed.), Beyond method: Philosophical conversations in healthcare research and scholarship (pp.223-258). Madison, WI: The University of Wisconsin Press.

Sorokin, Pitirim A. (1956). Fads and foibles in modern sociology and related sciences. Chicago, IL: Henry Regnery.

Spencer, Herbert (1862). First principles of a new system of philosophy. New York, NY: Appleton.

Stevens, Stanley S. (1946). On the theory of scales of measurement. Science, 103(2684), 677-680.

Strauss, Anselm & Corbin, Juliet (1990). Basics of qualitative research. London: Sage.

Tashakkori, Abbas & Teddlie, Charles (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.

Tashakkori, Abbas & Teddlie, Charles (2010). Overview of contemporary issues in mixed methods research. In Abbas Tashakkori & Charles Teddlie (Eds.), Sage handbook of mixed methods in social & behavioral research (2nd ed., pp.1-44). Thousand Oaks, CA: Sage.

Taylor, Charles (1964). The explanation of behaviour. London: Routledge & Kegan Paul.

Teddlie, Charles & Tashakkori, Abbas (2003). Major issues and controversies in the use of mixed methods in social and behavioral sciences. In Abbas Tashakkori & Charles Teddlie (Eds.), Handbook of mixed methods in social & behavioral research (pp.3-50). Thousand Oaks, CA: Sage.

Teddlie, Charles & Yu, Fen (2007). Mixed methods sampling. A typology with examples. Journal of Mixed Methods Research, 1(1), 77-100.

Torgerson, Warren S. (1958). Theory and methods of scaling. New York, NY: Wiley.

Urban, Jennifer B.; Burgermaster, Marissa; Archibald, Thomas & Byrne, Alyssa (2015). Relationships between quantitative measures of evaluation plan and program model quality and a qualitative measure of participant perceptions of an evaluation capacity building approach. Journal of Mixed Methods Research, 9(2), 154-177.

Velleman, Paul F. & Wilkinson, Leland (1993). Nominal, ordinal, interval, and ratio typologies are misleading. The American Statistician, 47(1), 65-72.

Vidich, Arthur J. & Shapiro, Gilbert (1955). A comparison of participant observation and survey data. American Sociological Review, 20(1), 28-33.

Vogt, Paul W. (2008). Quantitative versus qualitative is a distraction: Variations on a theme by Brewer & Hunter (2006). Methodological Innovations Online, 3(1), 1-10, https://journals.sagepub.com/doi/pdf/10.4256/mio.2008.0007 [Accessed: January 16, 2023].

Webb, Eugene J.; Campbell, Donald Thomas; Schwartz, Richard D. & Sechrest, Lee (1966). Unobtrusive measures. Nonreactive research in the social sciences. Chicago, IL: Rand McNally.

Wheeldon, Johannes (2010). Mapping mixed methods research: Methods, measures, and meaning. Journal of Mixed Methods Research, 4(2), 87-102.

Yardley, Ainslie (2008). Piecing together—a methodological bricolage. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 9(2), Art. 31, https://doi.org/10.17169/fqs-9.2.416 [Accessed: August 3, 2022].

Yardley, Ainslie (2014). Children describing the world: Mixed-method research by child practitioners developing an intergenerational dialogue. Educational & Child Psychology, 31(1), 48-62.

Yee, Joyce S.R. & Bremner, Craig (2011). Methodological bricolage: What does it tell us about design? Conference paper, Doctoral Education in Design Conference, May 23-25, Hong Kong, http://nrl.northumbria.ac.uk/8822/ [Accessed: August 3, 2022].

Author

Giampietro GOBO, professor of methodology of social research and sociology of science at the University of Milan (Italy), was one of the founders of the Qualitative Methods Research Network of the European Sociological Association. He is interested in scientific controversies on health issues and coordination studies. He is currently conducting projects on immunization and Covid-19 policies, and ethnographic experiments in the area of cooperation in small teams. His books include "Doing Ethnography" (Sage, 2008), "Qualitative Research Practice" (co-edited with C. SEALE, J.F. GUBRIUM & D. SILVERMAN; Sage, 2004), "Constructing Survey Data: An Interactional Approach" (with S. MAUCERI, Sage, 2014), "Merged Methods: A Rationale for Full Integration" (with N. FIELDING, G. La ROCCA & W. van der VAART; Sage, 2022) and "Science, Technology and Society: An Introduction" (with V. MARCHESELLI; Palgrave Macmillan, 2022).

Contact:

Giampietro Gobo

Dept. of Philosophy, University of Milan
Via Festa del Perdono 7
20122 Milan, Italy

E-mail: Giampietro.Gobo@unimi.it
URL: https://www.unimi.it/en/ugov/person/giampietro-gobo

Citation

Gobo, Giampietro (2023). Mixed methods and their pragmatic approach: Is there a risk of being entangled in a positivist epistemology and methodology? Limits, pitfalls and consequences of a bricolage methodology [42 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 24(1), Art. 13, http://dx.doi.org/10.17169/fqs-24.1.4005.

Forum Qualitative Sozialforschung / Forum: Qualitative Social Research (FQS)

ISSN 1438-5627


Creative Commons Attribution 4.0 International License