Volume 9, No. 2, Art. 12 – May 2008

Making Thinking Visible with Atlas.ti: Computer Assisted Qualitative Analysis as Textual Practices1)

Zdeněk Konopásek

Abstract: How is a new quality of reading, which we call "sociological understanding", created during the process of qualitative analysis? A methodological (conventional) answer to this question usually speaks of mental processes and conceptual work. This paper suggests a different view—sociological rather than methodological; or, more precisely, a view inspired by contemporary sociology of science. It describes qualitative analysis as a set of material practices. Taking grounded theory methodology and work with the computer programme Atlas.ti as an example, it is argued that thinking is inseparable from doing even in this domain. By adopting the suggested perspective, we might become better able to speak of the otherwise hardly graspable processes of qualitative analysis in more accountable and instructable ways. Further, software packages would be better understood not only as "mere tools" for coding and retrieving, but also as complex virtual environments for embodied and practice-based knowledge making. Finally, grounded theory methodology might appear in a somewhat different light: when described not in terms of methodological or theoretical concepts but rather in terms of what we practically do with the analysed data, it becomes perfectly compatible with the radical constructivist, textualist, or even post-structuralist paradigms of interpretation (from which it has allegedly departed by a long way).

Key words: CAQDAS, analytic practices, grounded theory methodology, thinking, visualisation, textuality, reading and writing, humans and machines

Table of Contents

1. Introduction

2. The Inspiration from Science Studies

3. Why Choose GTM as an Example?

4. Reality, Virtuality and Practices

5. The Creation and Basic Operation of a Textual Laboratory

5.1 Assigning primary documents

5.2 Defining quotations: Cutting PDs into pieces

5.3 Codes and coding: Reintegrating the pieces

6. Making the Textual Laboratory Really Useful: Beyond Code-and-retrieve

6.1 Writing comments

6.2 How to see relevance?

6.3 Reading data in a new way

7. Conclusions

Notes

References

Author

Citation

 

1. Introduction

Some contemporaries of the previous version of Atlas.ti (version 4), a software tool for qualitative data analysis, may remember a quirk in it. As a newcomer, playing with the options and menus of the programme, you might have been tempted to try a very promising option offered in the menu for work with textual documents: Relevant text search. Here it is, you thought, definitely the key function in computerised qualitative analysis, let us click on it! After choosing it, however, a small info window popped up with an ironic reply to your command: "Do you believe in magic?" And, if you were lucky enough to have your PC equipped with a sound card, you could also hear a telling throat-clearing sound, indicating that you had just done something really foolish. [1]

Software packages such as Atlas.ti simply cannot do the mental work for you. It is always you, the analyst, who has to do the real analysis, because only human researchers can think. The software only provides more or less useful assistance and support to the thinking subject.2) It extends the researcher's mental capabilities to organise, to remember, and to be systematic. But while doing so it essentially remains a stupid instrument, which cannot do things such as determining the relevance of a text passage. Humans, not machines, do the crucial work of coding and retrieving—i.e., they decide which passages of data should be marked by which terms so that they can be searched and browsed later on. The hope that the programme would do more and be able to replace the analytic mind is foolish. Only human researchers can make sense and analytic use of the otherwise meaningless operations of the computer—such was the unforgettable lesson given by this nasty little quirk, incorporated into the design of the programme. [2]

This was an important and much needed lesson, of course, designed to prevent a typical misunderstanding about CAQDAS. Yet I am convinced that the argument was (and still is) somewhat misleading. Indeed, in this paper I would like to suggest that framing CAQDAS as a substantively irrelevant, purely instrumental technical extension and support of mental processes was a disservice, something of a poke in the eye for all qualitative research. I argue that the very idea that software essentially represents what occurs in the analyst's head strengthened a classical "methodological" view of qualitative analysis, emphasising the role of a researcher who is superior to his or her research subjects by virtue of the special qualities of his or her thinking. Accordingly, this way of thinking suppressed a non-exclusive, say "ethnomethodological" position, which highlights taken-for-granted material practices and the instructability of knowledge production. [3]

Such a mentalistic approach, either implicit or explicit, has had two unhappy consequences. First, CAQDAS has developed problematic relationships with those theoretical-methodological positions in qualitative research, increasingly influential among members of the community, that departed from objectivist methodology. Computer assisted qualitative data analysis is seen as not easily compatible with radical constructivism or post-structuralist understandings of language. Some scholars have even argued that under the disguise of the innovation called qualitative computing, a conservative ("modernist") approach reaffirmed its position (COFFEY, HOLBROOK & ATKINSON, 1996). [4]

Second, a unique opportunity for better understanding qualitative analysis as a set of mediations and embodied practices has been missed. This is really unfortunate, since such an understanding is invaluable for our ability to defend, explain and teach qualitative research. In texts on qualitative research, there is usually an abundance of descriptions of various paradigms, approaches and theoretical frameworks; or of data collection procedures, fieldwork practices or research ethics. But when it comes to the practices by means of which a new quality of reading (which we call sociological understanding) emerges, descriptions often become somewhat vague and poor.3) Analysis and interpretation of qualitative data are often seen as performances of "pure reason" to such an extent that it is very difficult to provide a clear and practice-oriented account of them. There seem to be no intermediaries here, just the lucid mind of the researcher contemplating the data. And it is the mind that is responsible for deduction, induction, generalisation, conceptualisation, comparison—as basically mental operations … [5]

Such accounts do reveal important things about qualitative analysis. But they are of limited help. This is especially true when one has to explain to an outsider or to a student in what sense qualitative analysis consists of anything more than a careful reading of data, spiced with providential insights and observations (if any ever come). As a consequence, it is claimed that qualitative research is in fact an art, hardly graspable and transferable (DENZIN, 1994, p.512 and others). It is emphasised, in response to inquiring questions about "how it is done", that there is no single qualitative method and that the analysis of data can hardly be separated from other research-related activities (which can subsequently be described at length). Qualitative research is presented as a complex and context-dependent activity that resists a cook-book style of instructions. [6]

Similar responses are surely not wrong. Not per se. But they avoid the main point. Even worse: by avoiding the point they make it even more urgent—how does qualitative analysis actually generate new knowledge, in a distinctive and recognisable way? Conventionally, as we have seen, people are told that it is not by pressing a button in the interface of a computer programme. This would not help, it is believed, because everything important happens in our minds, in a way that is difficult to explain. My paper takes a different road though. I will try to talk about material practices and inter-actions, rather than about the mental operations of an individual. The ambition here cannot be to explain the logic of (grounded theory) qualitative analysis better and deeper than, e.g., Anselm STRAUSS in his marvellous Qualitative Analysis for Social Scientists (1987). Rather, I would only like to take STRAUSS more seriously at the moment when he notes that research work consists of "sets of tasks, both physical and conceptual" (STRAUSS, 1987, p.1; italics added by ZK). And because the conceptual usually seems to be overrepresented in writings on qualitative methodology, including STRAUSS's book, I will focus here on the physical. [7]

"Thinking" will be bracketed out—not because it is unimportant, of course; but because its presence cannot account for differences between ordinary knowledge practices (e.g., of research subjects) and qualitative analysis worth of the name (on the side of the researcher). Of course, we analysts do think. No question about that. But so do all the others, including our research subjects. Therefore it does not make much sense to ground the superiority of sociological knowledge almost exclusively in our mental qualities and in the very act of … thinking. Rather we should focus, as science and technology studies do, on practical manipulations with visible, hearable and palpable pieces of reality that have the power of making the final sentence stronger and more durable than any other competing statement (LATOUR 1987, one for all). [8]

In the next section I am going to briefly explain this particular inspiration taken from science and technology studies. Then I will discuss the place of grounded theory methodology (GTM) and Atlas.ti in my overall argument. I will also clarify in what sense it is possible to keep the focus on material practices in the virtual environment of a computer programme. The main part then follows: an attempt to describe analytical work with Atlas.ti in terms of the creation and operation of a "textual laboratory". The most ordinary analytic procedures, such as data segmenting and coding, linking or memoing, will be presented as practical manipulations of objects visible on the screen. Precisely these manipulations endow the knowledge arising from qualitative analysis with qualities that make it distinct from ordinary members' knowledge. Furthermore, they enable us to speak of qualitative analysis in an instructable, practical way. The conclusion will discuss some broader theoretical consequences of such a reframing of our thinking about qualitative analysis. [9]

2. The Inspiration from Science Studies

Bruno LATOUR (1995) in his article on a research expedition to the Amazon forests gives an illustrative example of how contemporary science studies understand the operation of scientific work. The question is how it is possible that scientific texts speak of reality; what constitutes their reference to the things under study. This question of the relation between the word and the world is an old one. But the perspective of science studies comes out with a novel answer to it. As a sociologist of science, LATOUR avoids theoretical concepts of epistemology and offers an ethnographic account (accompanied by a set of photographs) of various practices by which members of the research expedition "translate" the border between savannah and forest somewhere in Amazonia (i.e., the phenomenon under study) into the text of a scientific report. He emphasises that the empirical evidence he presents contains no traces of a mysterious jump from the world to a word; rather, we can follow numerous small practical operations by means of which reality is more and more loaded with meaning and progressively de-materialised so that it becomes increasingly "textual". There is no direct bridge between the world and the word, only chains of translations—i.e., practical manipulations and interventions by which a piece of natural landscape is turned into a field laboratory with exact parameters and coordinates; by which lumps of soil become sufficiently representative samples; and by which qualities of these lumps can be substituted by written codes and comments so that the studied boundary between savannah and forest can successively be inscribed into something else, and therefore inhabit/constitute the paper realm of texts.4) [10]

Like LATOUR, or many of his colleagues, we could follow the series of translations made by qualitative researchers on the move from the field to the realm of textual data. For instance, something (which has happened) is narrated by an interviewee; the narration is recorded; the recording is transcribed; the transcript is incorporated into a set of data … each such step meaning that something is lost and something is gained. In general, it is materiality that is lost—e.g., material specificities of the interview act, such as the totality of voice modulation, smells, gestures and the surrounding environment. What is gained? Meaning is gained, simply put. This is possible because the gradual loss of materiality brings about new possibilities. Once reality is narrated, recorded, and transcribed we can better manipulate it—store, transport, compress, mark, juxtapose with other realities, cut into pieces, recompose, reorder, etc. Only thanks to these manipulations can we see (and show) differences and similarities, emerging patterns, new contexts. [11]

Since we proceed in such a way that it is always possible to go back along the chain of transformations (i.e., from a quotation in our paper to the transcription, to the recording, and—with the help of fieldnotes and labels on tapes—to the situation of the interview or even, to some extent, to the "original" event), we can speak of reference. Hence LATOUR's argument, which he so nicely illustrated with the case of the research expedition to the Amazon forest: scientific texts speak of reality not because of a mysterious bond between things and words (something philosophers are so busy with), but rather thanks to well-tied chains of small transformations, during which something is preserved while other qualities are lost.5) [12]

However, I am not so much interested here in reference as a bond between the world and the word which we strive to maintain during the move from the field to analytic work on data. Rather, my task is to apply the imagination of science studies to a "next step", i.e., to qualitative analysis itself, more precisely to the work with the programme Atlas.ti. I would like to show that what is often seen as an achievement of the mind can perhaps be better described in terms of practical manipulations of bodies of texts. [13]

3. Why Choose GTM as an Example?

But what kind of qualitative analysis am I going to discuss? There exist different traditions and approaches to qualitative analysis6) and my account will in no case be "methodologically neutral". In general, I am going to take as an example the kind of qualitative research that is close to what is known as grounded theory methodology (GTM, see GLASER & STRAUSS, 1967). [14]

I should stress right from the beginning that it is not "grounded theory methodology" as a label for a self-contained epistemology that really matters. Rather, by GTM I refer to a loosely defined set of analytic practices, the use of which is very common among sociologists, ethnographers, psychologists or even historians. Howard BECKER (1993, p.228) says that "… general statements of what must be done to be scientifically adequate rely, usually without acknowledgement, on practical matters and, in this, they follow rather than lead everyday practice." This is very close to the position of science studies, in which the perspective of methodology is suppressed in favour of a sociological study of "science in action".7) Therefore, to put it in a rather non-methodological way (i.e., without reference to the established notions of theoretical saturation, axial coding or constant comparison), I am going to talk about the kind of qualitative research projects which make use of large amounts of data, analyse them in systematic and rigorous ways, and aspire to provide knowledge different from (and in a way superior to) what is usually known by the studied members or participants. [15]

There are several good reasons for choosing GTM as an example for my argument. First, the choice is not surprising given the credit the authors of Atlas.ti themselves give to this particular approach (MUHR & FRIESE, 2004). Further, whether one likes it or not, GTM enjoys persistent popularity, especially among students and teachers, and aspires to be taken as an overall strategy for non-deductive research projects. Occasionally, if taken as a generic approach for generating theory out of qualitative data, it is even perceived as a synonym for qualitative research.8) The current CAQDAS epidemic even strengthens this hegemony. Last, and perhaps most important (and in close relation to the above), GTM is nowadays a challenged and often misunderstood qualitative paradigm. Some regard it as somewhat obsolete and associated with a modernist adherence to scientific rigour and objectivity, improper for interpretive social research (LINCOLN & DENZIN, 1994). Further, software packages organised around the procedures of coding and retrieving have contributed to a more or less implicit conviction that grounded theory methodology is nothing but an application of the code-and-retrieve principle. This is an unfortunate misapprehension (STRAUSS & CORBIN, 1994), which is difficult to combat. [16]

On the other hand, however, the "ecumenical" focus on something-like-grounded-theory-methodology is relatively arbitrary. In fact, we could equally well follow other analytic practices—e.g., in conversation analysis or narrative analysis—perhaps with the risk of being less widely understood, since these practices are familiar to fewer social scientists. [17]

It should also be stressed that I am not going to come up with new and specific analytical procedures. No new analytical techniques and no new features of Atlas.ti will be proposed. Instead, I suggest merely an alternative "theory", a practice-oriented account, of the very ordinary and basic procedures we all usually perform as analysts. [18]

4. Reality, Virtuality and Practices

A question might arise: if we are to understand the material practices of qualitative analysis, why not look at a pre-CAQDAS researcher working with real things such as sheets and pieces of paper, printers, colour pencils, scissors, glue and card files? Such a focus would definitely be possible. And at some moments it could be pretty illuminating. [19]

In comparison, when an analyst works with a specialised computer programme, the only thing he or she seems to manipulate is pure information—bits and bytes that are thought to represent ideas in the researcher's mind. Indeed, if we consider a computer to be a direct extension of human thinking, we could hardly talk about material practices at all.9) But computers can be viewed differently. They have keyboards, mice, speakers and monitors. And on the screens of monitors we can create, see and manipulate various objects. These objects can be of different sizes and shapes; they can be hidden, moved, split, coloured, grouped and regrouped, forgotten and rediscovered on unexpected occasions. In short, computers provide a virtualised environment in which we can not only do all the operations available to the pre-CAQDAS researcher equipped with paper, scissors and pencils, but much more. Virtual objects on the screen are even more shapeable by, and embedded in, practices than real ones. [20]

5. The Creation and Basic Operation of a Textual Laboratory

What do researchers practically do with Atlas.ti when analysing their data? Let me pick up just a few key moments of the process. I will proceed from what is typical for the beginning of the project to what usually takes place at later stages. [21]

5.1 Assigning primary documents

In Atlas.ti a research project is defined by a set of "primary documents" (PD). These are our data. And data, so the common belief goes, are what we gather in the field. True, but this is only half of the story, because data are also everything that we strive to put in one place, on one table. Or, more exactly, into a single textual laboratory—which has the power to shrink the distances in time and space between observable phenomena so that everything important is present and under control.10) [22]

We can better understand the point when we imagine what happens when primary documents are assigned to a project (to a "hermeneutic unit", as it is called in Atlas.ti). Adding new documents has important practical consequences: the next time we open the hermeneutic unit we immediately have all the materials at hand. These materials can have various formats—they can be texts, photographs, scanned documents, audio or video recordings. They can even be physically located on various media—on hard disks, optical discs, a local network or the Internet. But most importantly, these documents can have their origin in a variety of times and places. They refer to different sites and moments. [23]

Interviews, recorded and transcribed, could have been made, for instance, during the last two years in dozens of households and offices in several middle-sized cities. But the interview transcripts, or other data "from the field", are not the only documents that may belong to the primary documents of our project. Other primary documents, depending on circumstances, could be: excerpts from the literature on the topic, written down over our entire professional career, either at home or during study trips abroad; scholarly articles downloaded from online databases and covering several decades of relevant research; selected newspaper articles on the problem, published in the last decades; related official documents obtained from the Internet or coaxed from a range of involved authorities; the project proposal of our research, written almost three years ago; e-mail exchanges with colleagues at home and abroad that took place when the project proposal was being prepared. And so on. [24]

So now we have all this in sight and within arm's reach. Or rather, we have all this available for scrutiny with the help of a few clicks of the computer mouse. While browsing the primary documents of the project, we travel in time and space. It is unbelievably easy and fast: click, click. An interview with Mr. Miller from the city of Plzen, May 2005: we talked about how new civil organisations in the Czech Republic had been established at the beginning of the 1990s. Click, click. A resolution of the governmental council for NGOs approved one month ago: it suggests a redefinition of the legal status of some non-profit organisations. Click, click. My own excerpts from a book on environmental movements, published in 1984: I made the excerpts roughly five years ago in Paris when I was writing a short note on new social movements. Now, in the context of these excerpts, what exactly did Mr. Miller say? Click, click, and here we are.11) [25]

When I was reading the book on environmental movements, I did not know about Mr. Miller's civil association. I was not even interested in it. I had no idea that I would engage, several years later, in a research project on expertise and democracy, for which I would also need interviews with local activists. And at the moment when I was doing my interview with Mr. Miller I only vaguely recollected what the authors of the book had said. The two events were too distant from each other. Both temporally and geographically. And also in terms of their nature, since the former concerns "theory" (and broader sociological contexts) while the latter is about the production of "empirical evidence" (and my own data). But at the present moment they are juxtaposed, next to each other, right at hand: Paris and Plzen, 1993 and 2005 (referring to the early 1980s and early 1990s)—here and now. The distance between the two pieces of reality is very small at the present moment, measurable by a few clicks of the computer mouse. They can be carefully compared and confronted. As primary documents, they have standardised headers (e.g., in comments attached to each PD) that enable us to keep the reference to the original distant times and places. [26]
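To make the notion of a textual laboratory a little more concrete for readers who think in terms of data structures, the following minimal sketch (in Python, purely illustrative; all names are invented for this example and it is in no way Atlas.ti's actual implementation) models a hermeneutic unit as nothing more than a collection of primary documents whose standardised headers keep their distant origins in view:

```python
from dataclasses import dataclass, field

@dataclass
class PrimaryDocument:
    """One PD: a piece of data plus a standardised header keeping its origin in view."""
    doc_id: str
    kind: str          # e.g. "interview transcript", "book excerpts", "official resolution"
    origin_place: str  # where the material comes from
    origin_date: str   # when it was produced or collected
    text: str

@dataclass
class HermeneuticUnit:
    """The project: everything we strive to put on one (virtual) table."""
    name: str
    documents: list = field(default_factory=list)

    def assign(self, doc: PrimaryDocument) -> None:
        # Assigning a PD shrinks time and space: from now on the document is
        # always a click away, next to materials from other times and places.
        self.documents.append(doc)

# Plzen 2005 and Paris 1993 end up side by side, a few clicks apart.
hu = HermeneuticUnit("Expertise and democracy")
hu.assign(PrimaryDocument("P1", "interview transcript", "Plzen", "2005-05", "..."))
hu.assign(PrimaryDocument("P7", "book excerpts", "Paris", "1993", "..."))
```

The point of the sketch is only this: once assigned, materials of very different origins sit side by side in one list, each still carrying the reference to its original time and place.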

5.2 Defining quotations: Cutting PDs into pieces

But it is difficult to juxtapose entire PDs. They are too large. There is usually no practical way to squeeze two full transcripts into a single unifying view. We can see more than one PD at once only as a list of items or a set of icons in a window, arranged in various ways. What is more, it is hard to see—at one moment—even a single entire PD. Both our visual field and the size of the screen are limited. We can only ever see a couple of paragraphs at a time.12) [27]

We need a different kind of object to be able to study our data closely. Something smaller. That is why we mark some paragraphs or sentences of particular interest as "quotations". At first sight this looks like marking relevant passages in the margin of a book. But the virtual environment allows more: in fact, by marking a piece of data, we not only modify and extend the original PD, but also create a new analytical object—a piece of data separated and freed from its original context. The separation is never complete though. We can always trace the quotations back to their original location. [28]

What is the advantage of having the marked quotations at hand as self-contained objects? We now have our data in a form that better fits the screen and, in their variety and multiplicity, our field of view.13) The references to the original PDs are preserved: this is what Mr. Miller said, that is what the governmental resolution stated, and here we have a sociological observation from the literature. But now we can work with all these textual pieces together, since the data are transformed in a double way. First, they are reduced in number so that we can focus only on what we have found relevant so far.14) Second, they are reduced in size so that they become graspable pieces of data. Only now can we arrange, on the screen of a monitor, unprecedented rendezvous that occur under our direct visual control: a piece of a legal document (a particular paragraph) meets a piece of an interview or a passage from an older research report.15) Do they support each other? Do they contradict each other? In what sense? Now we are in a good position to start arguing about all that. [29]
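Continuing the illustrative sketch above (again, hypothetical names rather than Atlas.ti's real data model), a quotation can be pictured as a small object that carries little more than a pointer back to its primary document and the span it was cut from, which is exactly why the separation from the original context is never complete:

```python
from dataclasses import dataclass

@dataclass
class Quotation:
    """A marked segment: cut free from its PD, but never losing the way back."""
    quot_id: str
    pd_id: str         # which primary document it was cut from
    start: int         # character offsets within that document's text
    end: int
    comment: str = ""  # why this passage struck us as interesting

    def text(self, hu) -> str:
        # hu is the HermeneuticUnit from the previous sketch: the quotation
        # can always be traced back to, and re-read in, its original location.
        doc = next(d for d in hu.documents if d.doc_id == self.pd_id)
        return doc.text[self.start:self.end]
```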

Indeed, quotations are elementary units of analysis not only because their meanings are reasonably contained and therefore accessible to our minds and mental processing; they are also of reasonable physical size to be grasped and processed in a material way—by eyes, hands, lists and boxes, computer screens. Hence the general point of this paper, i.e., to show that analytical work is in an important sense a material praxis (and vice versa). [30]

There is a big "but" in this though. The more quotations we have, the more distant from each other they again become. They are so numerous that one easily gets drowned in data. It takes a long and painful journey to find a way, or even the way, the connection, from one piece to another. Two relevant passages are often separated from each other by hours of careful reading and browsing. [31]

5.3 Codes and coding: Reintegrating the pieces

Pieces of data, quotations, need to be somehow ordered to become manageable even in large quantities. This is where the procedure known as coding comes in as a useful strategy.16) By coding we link certain quotations together and form thematic groups of data-pieces. Codes are names for such groups, indicating what kind of quotations can be found in each particular bundle. Here the gathered documents, interviews, excerpts, scholarly papers, project proposals, and media articles speak, for instance, about "money", here about "legislation" and here about "negotiation". With the help of codes (and the virtual Atlas.ti environment), we can see the thematic contours of each group of quotations17) as well as the size of the groups. [32]

But codes are not just names, conceptual labels. They are also useful handles with which we can grasp and manipulate the respective groups of data-pieces.18) Codes can be selected, commented, ordered, filtered, moved, renamed, split, and linked to each other. They can be viewed in lists, hierarchies, network views or as particular occurrences (instances) when browsing through our data. Any time we perform an operation on a code (e.g., when we link it to another code or just select it) we also do some indirect work on all the associated quotations. [33]

Now, instead of having to dig again and again through an unsorted heap of quotations, we can proceed more effectively. By means of coding, quotations gain relevance and meaningfulness. Some groups of quotations become closer than others. Coding selectively shrinks the analytic distances between pieces of data, making these elementary units more manageable. In short, it allows for a kind of more efficient, thematically or semantically organised reading.19) [34]
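In the same illustrative spirit, coding can be pictured as maintaining an index from code names to groups of quotation identifiers; the handle-like character of codes then amounts to simple set operations (a hypothetical sketch under the same assumptions as above, not a description of how Atlas.ti is actually built):

```python
from collections import defaultdict

class CodeIndex:
    """Codes as handles: names attached to thematic bundles of quotations."""

    def __init__(self) -> None:
        self._index = defaultdict(set)   # code name -> set of quotation ids

    def code(self, code_name: str, quot_id: str) -> None:
        self._index[code_name].add(quot_id)

    def retrieve(self, code_name: str) -> set:
        # Grasping the handle: all quotations gathered in this bundle.
        return set(self._index.get(code_name, set()))

    def retrieve_all(self, *code_names: str) -> set:
        # A simple query: quotations that speak about all the given themes at
        # once, e.g. both "negotiation" and "legislation" (cf. note 19).
        sets = [self.retrieve(name) for name in code_names]
        return set.intersection(*sets) if sets else set()

    def codes_of(self, quot_id: str) -> set:
        # The inverse view: to which thematic bundles does a quotation belong?
        return {name for name, ids in self._index.items() if quot_id in ids}
```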

6. Making the Textual Laboratory Really Useful: Beyond Code-and-retrieve

Four principles, on which Atlas.ti operates, are introduced in the user's manual (MUHR & FRIESE, 2004, pp.3-4): visualisation, integration, serendipity, and exploration. Exploration is a very general term and can be applied to almost anything we do in qualitative analysis.20) The remaining three principles are more specific, and thus more interesting. [35]

It can be said that the previous section of this paper, focused on the basic logic of coding-and-retrieving, dealt primarily with what MUHR and FRIESE called integration—i.e., with how the analyst practically accomplishes that heterogeneous pieces of data are held within reach, kept under control and made manipulable. Indeed, coding-and-retrieving refers to little more than the mere possibility of organised and efficient reading. Although so many people cannot imagine qualitative analysis consisting of anything more than precisely this procedure, rigorous analytic knowledge originates in something other than the coding-and-retrieving activity alone. And this something else has to do with the other two principles, visualisation and serendipity.21) [36]

6.1 Writing comments

Each of the analytic objects we create in Atlas.ti—PDs, quotations, codes, links, and network views—can be accompanied by a comment. There are also "free" comments, called memos, which can be attached to several objects (of various kinds) at once, or to none at all. The ways in which comments are used may differ, depending on the kind of commented object and the chosen strategy. For instance, comments on individual PDs may contain detailed information about the source of the data. Code comments would typically, but not necessarily, be descriptions or explanations of the names given to less obvious or less descriptive codes. In the case of quotations or links, comments might provide explanations of why we have created these objects—i.e., what was so interesting about them. [37]

Memos are a special case. Their importance and analytic use typically grow as the analysis progresses. In memos we integrate partial observations. This integration is not just an abstract mental operation. It corresponds with the ability of memos to be attached to several codes, quotations and other memos at once. We can therefore imagine memos as embryonic paragraphs or pages of a future research report, already well-founded in empirical data and embedded in a broader argument (in the structure of other memos). Ideally, the report should be at least half-written within Atlas.ti: much of the writing of the report in a text processor (outside of Atlas.ti) would then consist of editing, associating and completing pieces of text contained in memos and associated analytical objects, especially quotations and various other comments. However, such a dense and empirically grounded network of Atlas.ti analytical objects does not appear out of nothing. It is the result of long-term work which goes through and beyond the code-and-retrieve operations described above. What kind of work? [38]

It is generally thought that the main purpose of commenting on analytical objects is to aid one's memory. The best way to prevent the ideas that emerge from our reading of the data from being lost is to write them down. Again, this is a conventional view, in which the use of software promotes and extends our mental capabilities. But there are other benefits of commenting. [39]

First of all, it is important to note that commenting is one of the key moves that constitute the interpretation of data. By writing comments the researcher inscribes him- or herself into the studied material so that it comes more and more under control. In the beginning, almost everything we have "on the table" is what others say; as time goes by, the others' accounts are extended by our own textual interventions and additions. Brackets that mark quotations emerge in the margin of the main text; code names are attached to some of the quotations; and, above all, we add our comments here and there. After some time, we are studying not exactly the original data, but a much richer mixture of voices, our own voice being increasingly pervasive among them. This is how sociological text is produced out of the text of data. No sudden switch from the empirical to the sociological is possible, only a slow growing of the latter into the former. [40]

Comments should therefore not be seen only as tools for preservation of ideas, but also (and perhaps rather more importantly, since the aim of analysis is not to just preserve ideas!) as a space in which sociological text is gradually born. As such they should be made whenever possible. [41]

Our ability to add a comment to a prospective new free quotation or link could even serve as a test of whether the creation of certain new objects is legitimate. Beginners typically produce new Atlas.ti analytical objects in a rather free-and-easy way. Seduced by the effortlessness and speed with which new quotations or links can be made, they soon have thousands of coded or free quotations and hardly any item unlinked to anything else, without having an idea of what to do with these huge quantities of connected objects. Careful consideration is in order, especially when non-trivial "strong links" are at stake.22) But what could be a feasible criterion for deciding whether or not to link two quotations or codes? Some would suggest various kinds of rational criteria, but I recommend a pragmatic (and almost mechanical) one: is there anything worth putting down about this particular text passage or connection? If yes, then let us create the link with confidence and make the respective comment. But if we are unable to write a comment on the considered link at the time, and only have an uncertain "feeling" or "sensation", then we should hesitate. If theory is to be grounded in empirical data, then practical details, such as links grounded in arguments (not mentally, but virtually, in the form of written link comments), are observable procedural elements of it. [42]

The creation of quotations is somewhat different in this respect. The most common reason for creating a new quotation is the need to code a piece of data. Often the creation of a new quotation and its coding can be considered a single operation.23) Nonetheless, free quotations (not linked—at the moment of creation—to a code) can be a very useful tool. We can imagine a procedure technically analogous to the creation of free codes, which would consist of marking out only free quotations during an initial reading of the data, without thinking of any codes (for the time being). Strategically, the procedure might be understood as an alternative/complement to what STRAUSS and CORBIN (1990, p.62) call "open coding". And precisely for the production of free quotations we might use a similar rule as for links: commented free quotations are fully legitimate, uncommented ones only as exceptions. [43]
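The pragmatic criterion proposed above can even be stated as a small guard. The following sketch (illustrative only, with invented names, written against the hypothetical objects of the earlier sketches) simply refuses to create a strong link for which no argument has been written down:

```python
def create_link(links: list, source_id: str, target_id: str,
                relation: str, comment: str) -> None:
    """Create a 'strong link' only when we can say why: the comment is the test."""
    if not comment.strip():
        # Only a vague "feeling" or "sensation"? Then hesitate and postpone.
        raise ValueError("No argument written down yet; do not create this link.")
    links.append({"from": source_id, "to": target_id,
                  "relation": relation, "comment": comment})
```

The same guard could, of course, be applied to the creation of free quotations.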

6.2 How to see relevance?

Let us assume that our data are segmented and coded carefully and with circumspection. Segments and codes are linked to each other by various kinds of relations where appropriate. Comments are attached to the created objects and links (which are, in fact, analytical objects too), which—as I have just argued—enhances the quality and argumentative groundedness of our work. In short, a large number of partial and limited analytic considerations have been materialised (or rather virtualised) in the form of observable and manipulable objects—codes, quotations, comments and links … So far so good. But this surely cannot be the end of the analysis; rather, it is the beginning. [44]

What next then? What to start with? There are so many potential points of interest, so many possible questions. We now need to become focused. And we also need to reduce our empirical material and work further only with some parts of it, the most relevant ones. [45]

But how can we recognise a relevant piece of text? How do we identify the most relevant codes or memos? Some would suggest a really, really deep think: it is time, they would say, to step out of the somewhat mechanistic world of computer processing and finally start doing true intellectual work … But I do not think so. On the contrary, this is the moment when we should stick to the computer and ask Atlas.ti for an answer. No, I do not believe in magic (as might be implied by the quirk mentioned in the introduction). I only believe in relevance as an emergent and recognisable property of my entire work up to now. [46]

Indeed, a glance at the monitor and a few clicks of the mouse are enough in Atlas.ti to see which quotations are most relevant and thus most promising for further analytical scrutiny. Provided we have proceeded as described above, we can easily have a look at everything we have thought about our data. What exactly is worthy of notice? Simply put, an especially important piece of our data is a quotation for which we have a comment; and/or which is connected to several codes; and/or which has been linked to (an)other quotation(s), preferably with commented (argued) links; and/or which has appeared in noteworthy network views … But wait, which network views—among all the saved ones—are noteworthy? Again, it is the same principle: those with comments, those containing relevant quotations and important codes. Important codes? Yes, those codes that are associated with higher numbers of quotations; that occupy a specific position in the scheme of codes; that are used for the classification of quotations in key PDs (such as a project proposal); and/or that are linked to relevant memos. Relevant memos? Yes, again, those memos that are linked to interesting quotations and codes (and are therefore conceptually and empirically saturated); and those that are also linked to other memos so that they participate in the structure of an overall argument.24) [47]

All these qualities are well and easily visible in Atlas.ti. The density and nature of links, especially, can be seen almost immediately. When you look at the respective lists of objects, you become oriented in a few seconds. Recent versions of the programme even offer nice summarising previews of co-occurrences of codes in the data set. The possibilities of various synoptic views are overwhelming. Of course, you cannot start your analytic work with Atlas.ti by pressing the magic button marked "Relevant text search"; but after you have fruitfully spent some time on your data, many Atlas.ti buttons become truly magical: just click on the button that opens a small quotation manager window and then make one more click to sort your quotations by the number of links to other objects—and voilà, here at the top we have candidates for the position of most relevant pieces of the data. In the same manager we immediately see which quotations are commented, and we can even filter out the uncommented ones. The list of candidates becomes narrower and more solid. There are several ways to find out how many codes (and which codes) are associated with the candidate quotations. Are these important codes? If yes, the respective quotation should be elevated in the ranking of candidates. And so on.25) [48]
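What those few clicks in the quotation manager accomplish can be approximated by an equally mechanical scoring rule over the objects of the earlier sketches (the weights below are arbitrary placeholders for illustration, not anything prescribed by Atlas.ti):

```python
def relevance(quot, codes, links) -> int:
    # quot is a Quotation, codes a CodeIndex and links a list of link dicts
    # from the earlier sketches; the more of our previous work has settled
    # around a quotation, the higher it floats in the ranking.
    n_codes = len(codes.codes_of(quot.quot_id))
    n_links = sum(quot.quot_id in (link["from"], link["to"]) for link in links)
    has_comment = 1 if quot.comment.strip() else 0
    return 3 * has_comment + 2 * n_links + n_codes

# Candidates for the most relevant pieces of data, most promising first:
# ranked = sorted(quotations, key=lambda q: relevance(q, codes, links), reverse=True)
```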

You can see all this quickly and easily, without serious or deep considerations involved. Well, not really. But the important acts of thinking have already happened, in countless moments of our previous coding, segmenting, commenting, linking …; and now it is sufficient to only take a brief look and make use of these numerous small acts materialised and visualised in a powerful sum. If you trust your judgement, as it has been applied during the long-term and detailed work with individual PDs, quotations and other objects, you can comfortably rely upon the criteria outlined above. They help to crown your entire effort. [49]

6.3 Reading data in a new way

From the suggested point of view, the quality and relevance of concepts and their empirical content are the results of the ongoing analytical work, not its precondition. Relevance is made. And it is made not exactly by our thinking alone. Rather, as something that can easily be seen, it is produced by material practices, in which the virtual environment of the computer plays a crucial role of mediator. Atlas.ti provides an interface in which and through which we do thinking. [50]

We could similarly describe the practical counterparts of some other mental operations. Let us take, for instance, the situation when we need to look away temporarily from the theoretical concepts used up to that moment and look at our data "with new eyes". This is a difficult task for one's mind, requiring a lot of self-discipline and renunciation. But it has a very practical dimension. We can arrange our working environment, our virtual scene, so that the software takes on (at least partly) the burden of the above-mentioned intellectual challenge. It is possible, with a few clicks of the mouse, to simply filter out all the respective codes—i.e., the codes that embody the above-mentioned theoretical concepts. As a result, they completely disappear from the virtual desk. They can be found neither in the code manager nor in the object explorer. These codes are removed even from the margin area. Simply put, they temporarily cease to exist. And this is how, practically, the studied documents come to be read (as much as possible) "anew", without the conceptual burden of the previous analysis. Out of sight, out of mind. [51]

Making things temporarily invisible, or rather something we could call selective visualisation, is an important aspect of the visualisation principle. It occurs, in fact, all the time. Imagine the most ordinary situation when we browse quotations ascribed to a code or several codes. Such a procedure substantially transforms our reading of the data. We do not read individual documents as usual anymore, i.e., one after another. Instead of studying the interview with Mr. Miller, then the legal document, then a sociological article, then another interview, and so on, we proceed transversely. By listing and viewing all the quotations coded, e.g., by the code "money", we construct—out of the original data and in addition to them—a new composite and multi-vocal text on financial matters. This composite text is another embodiment of our progressive moving from original contexts and meanings to a sociological argument. As a new element, a newly created object, it belongs a little less to our respondents and a little more to us, analysts. [52]

When I speak of the construction of a new text I do not mean it as a metaphor. What we have here is a quite real sequence of sentences and paragraphs, which can be read on the monitor from beginning to end and which can be saved as a new document or even printed on paper. We can even assign such a newly created document as another PD to our project (hermeneutic unit) and treat it as material to be further analysed.26) … Why should we? Because once the pieces of data are cut off from their original contexts and put into other (thematically defined) relationships, they tell a story unheard so far. What seemed to be important at first may suddenly appear as a minor issue; conversely, what we originally considered marginal may gain importance, since, for instance, it becomes clear how often different people mention it. A space for new insights and ideas opens up, which brings about new textual additions (comments, links, codings), and thus also new relevance … the serendipity principle in action. [53]
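The construction of such a composite, multi-vocal text is, again, an almost mechanical operation over the sketched objects (an illustration under the same assumptions as the earlier sketches, not a feature description of Atlas.ti):

```python
def composite_text(code_name: str, codes, quotations: dict, hu) -> str:
    """Read transversely: stitch all passages coded, e.g., "money" into one new,
    multi-vocal document; codes, quotations and hu come from the earlier sketches."""
    parts = []
    for quot_id in sorted(codes.retrieve(code_name)):
        q = quotations[quot_id]                    # quotations: quot_id -> Quotation
        parts.append(f"[{q.pd_id}] {q.text(hu)}")  # keep the reference to the source PD
    return "\n\n".join(parts)

# The result is a quite real document: it can be read, printed, saved, or even
# assigned back to the project as another primary document.
```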

What then constitutes the new quality of sociological reading of data? How is a new understanding of reality born? Initially it seems that the interpretation of qualitative data involves a range of manipulations of textual units—manipulations that stem from repeated reading of one and the same set of collected data. A closer look, however, reveals something else. The researcher in fact manipulates the texts of data so that new texts are progressively created (written) out of the old ones and alongside them. It is not a linear process, but a tangled and intermittent procedure. As a result, a number of new accounts emerge, in which the voices of the studied actors are still present, but so, more and more, is the voice of the researcher. These new accounts offer and provoke new perspectives and insights. Such a textual practice, based as much on writing as on reading, is the primary vehicle of the production of a new understanding. [54]

Once we start considering "natural" activities of consciousness such as thinking or seeing27) as embodied material practices, we realise better on what grounds sociological interpretation truly separates itself from ordinary social interpretations. Sociologists score primarily not with bright and refined minds or sharp eyes, but rather with everything they practically perform with their data. What appears as reading one and the same data in a new way, which can be taken as a desirable outcome of analysis, is in fact an effect of a procedure in which we artfully produce ever new (versions of) texts and read them with basically one and the same set of eyes and mind. [55]

7. Conclusions

It can be said that CAQDAS has brought about an extraordinary ease, speed and reliability with which we can move back and forth through extensive data sets and with which we are able to remember, recollect and think. But programmes such as Atlas.ti offer much more than that. They enable us to see from various perspectives what (we believe) happens in our minds. The sophisticated interface of these software tools is important not only because it allows intuitive and comfortable operation, but also because it brings a range of mutually related devices of visualisation. [56]

Atlas.ti therefore enables researchers to think in a visible way. Visualised thoughts or mental operations can easily be stored, recollected, classified, linked, filtered out in great numbers … and made meaningful in sum. Visualisation implies, for instance, that codes are not only mental entities or concepts, but also named elements of various sizes and colours that can be manipulated by hand and controlled by vision. Thus, thinking made visible is by the same token thinking made more accountable and instructable.28) [57]

Thinking is inseparable from doing. This is an important, but neglected lesson for qualitative analysis. It is paradoxical that so many texts on qualitative methodology ignore the lesson, given the fact that it was introduced and elaborated within several related intellectual traditions that constitute the theoretical background of key qualitative approaches.29) The advent of CAQDAS has even deepened the paradox. Software packages for qualitative analysis are often presented as tools that can extend and support the capabilities of the researcher's mind, but that cannot "really think". As such, these presentations reaffirm the mentalistic, essentially methodological conception of knowledge. [58]

Inspired by contemporary science and technology studies, I have attempted to show CAQDAS and qualitative analysis in a different light. Instead of describing ordinary moments of qualitative analysis and interpretation in terms of specific mental operations (represented in the software's interface), I have emphasised material practices and manipulations. The analytic work with Atlas.ti is especially suitable for such a reframing. Indeed, it might be argued that qualitative computing is misunderstood insofar as software packages are not seen as virtual environments or media for embodied and practice-based knowledge making. The inseparability of thinking and doing in qualitative analysis can hardly be observed better anywhere else. [59]

Grounded theory methodology (broadly defined), this more or less explicit alter ego of CAQDAS, has been reframed too. When described not in terms of methodological or theoretical concepts but rather in terms of what we practically do with the analysed data, grounded theory becomes perfectly compatible with the textualist, post-structuralist paradigm (from which it has allegedly departed considerably). As Zygmunt BAUMAN summarises (1992, pp.130f.):

"One of the most important boundaries that cannot be drawn clearly and that generate ambiguity in the very process of being compulsively drawn is that between the text and its interpretation. The central message of Derrida is that interpretation is but an extension of the text, that it "grows into" the text from which it wants to set itself apart, and thus the text expands while being interpreted which precludes the possibility of the text ever being exhausted in interpretation." [60]

And this is precisely what we have seen. The way analysts manipulate, transform and extend PDs in Atlas.ti (or with scissors, glue, and colour pencils) might be taken as an empirical demonstration of this post-structuralist argument. To put it differently, GTM looks desperately "modern", scientistic, and far away from what was brought about by the textual turn in the social sciences only insofar as its procedures are interpreted "immaterially", i.e., as basically conceptual work on data. Once we take seriously STRAUSS's statement, quoted at the beginning, that qualitative analysis should be understood as sets of tasks both physical and conceptual, GTM becomes open to all post-structuralist and radical constructivist sensitivities. [61]

Such an understanding of GTM, however, does not imply a loss of its normativity and instructability. The contrary is true. GTM has always been popular among teachers and students for its relative ability to be formulated as practical and understandable guidelines for action, and the proposed reframing would only enhance this virtue. [62]

Notes

1) Pieces of this contribution have already appeared in Czech as parts of my reviews of Atlas.ti (KONOPÁSEK, 1998, 2005a) and in a conference paper (KONOPÁSEK 2005b). This paper has been written within the work on the framework research programme "Theoretical Research on Complex Phenomena in Physics, Biology and Social Sciences", MSM 0021620845. It was originally published in the HSR-Supplement 19: "Grounded Theory Reader" (MEY & MRUCK 2007) and has been revised for FQS. <back>

2) Hence the notion of CAQDAS, computer assisted qualitative data analysis software, used for this family of qualitative computing. It should be noted, however, that there are other programmes useful for qualitative analysts, but constructed quite differently, namely on the principles of co-occurrence analysis. These programmes are explicitly intended for generation and attribution of meaning on the basis of computerised analysis (with practically no direct intervention of a clever human mind) of co-word networks in huge bodies of data (TEIL & LATOUR, 1995). Attempts at "intelligent" computer processing of qualitative data are explored even within the family of classical CAQDAS tools. Software called Qualrus recently introduced the concept of "intelligent coding". The programme attempts to propose suitable codes for selected quotations on the basis of an analysis—running as a background process on the computer—of all coding operations made so far (assuming that quotations containing similar words would be coded similarly) … But let us leave these interesting developments aside for now. <back>

3) This tendency that writings on qualitative research "are long on their discussions of data collection and research experiences and short on analysis" has been noted also by STRAUSS (1987, p.xi). <back>

4) The role of texts and textualisations (inscriptions) in scientific work is summarised by LAW (1986). <back>

5) For instance, a recording is well-tied with its translation into text by means of accurate and faithful transcription. ASHMORE and REED (2000), among others, show that it is not an easy task. <back>

6) Identification and comparison of different paradigms in qualitative research has become a popular topic in books and articles (CRESWELL, 1997; GUBA & LINCOLN, 1994; GUBRIUM & HOLSTEIN, 1997). <back>

7) A similar emphasis on research practices and a reserve toward theories of qualitative research can be found in SEALE (1999). <back>

8) After all, the title of the already quoted book on GTM by Anselm STRAUSS (1987) is "Qualitative Analysis for Social Scientists" (without further qualification). <back>

9) Of course, even working with cards, scissors and colour pencils within the "old" paper-and-pencil model can be (and usually is) viewed as a direct extension of mental processes. And most people would probably think that picking computers, with their virtual, "non-material" environment, as an example is not the best way to overcome this mentalistic or representationist approach. But I believe the opposite is true: CAQDAS is an opportunity to grasp an alternative view of qualitative analysis as a set of practical manipulations of data. <back>

10) Laboratory has been a prominent topic within contemporary sociology of science both in the literal sense (i.e., in a number of laboratory studies such as KNORR-CETINA & MULKAY, 1983; LATOUR & WOOLGAR, 1986; LYNCH, 1985) and more widely in the sense of laboratory as a basic instrument for (scientific) control and visualisation (GIERYN, 2006; MILLER & O'LEARY, 1994). <back>

11) For the sake of simplicity, the examples and names are not authentic, however they are inspired by real work. The same is true for the rest of the examples in this paper. <back>

12) There are some differences if we consider audio recordings or images instead of textual PDs, but basically the argument would be similar. <back>

13) A number of practical recommendations regarding formatting of PDs and font settings, both aimed at good arrangement of visualisation, could be given here. From the perspective presented here, these would be directly relevant for analytic procedures. <back>

14) The list of quotations is not fixed, of course; it grows as we process the data, and some quotations may be deleted. Also, we often go back to the original documents and look for other relevant passages. But in any case, by creating quotations we create a selection of data that allows us to look at them in greater detail. <back>

15) Let us also note: the particular comment of Mr. Miller was highlighted as important last Tuesday, while the paragraph of the resolution had been marked as relevant ten months ago, before the fieldwork started; and the argument from literature has just been noted. (We immediately see these procedural details when we look at the quotations—the date of their creation or modification is an automatically generated part of their headers.) Thus, not only various documents meet in front of our eyes at this moment, but also various moments of our own previous analytical work. <back>

16) Coding is precisely the moment when an objection may easily arise: semantic relevance cannot be assessed by a computer programme such as Atlas.ti; the crucial analytical assessments and decisions necessary for the coding process have to be made by a thinking subject. But again, I do not deny that qualitative analysts have to think. I only say that the practical instructive value of an appeal such as "Think! Think more and better!" is rather low. Furthermore, just taking notice of a semantic relationship does not bring, in itself, any analytic utility. Such an observation becomes effective only together with its inscription into an analytical object ("link") that allows for its further use. <back>

17) That is why it is so important to choose appropriate names for codes. If we choose badly, we do not see the content of quotation aggregates clearly enough. <back>

18) Since we can use several such handles at once, we should keep codes simple, each referring to a single thing—we can always combine them freely later. <back>

19) The possibilities of organised reading are further enhanced by the ability of Atlas.ti to make complex queries: we can, for instance, view all the quotations that speak both about "negotiation" and "legislation" (and study exactly how, in all their specificity). <back>
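
To make the logic of such a code-based query tangible, the following minimal sketch (in Python, purely for illustration) represents quotations simply as records carrying a set of code names and retrieves those tagged with every code in a query. It is an assumption-laden schematic, not Atlas.ti's actual query tool.

quotations = [
    {"id": "Q1", "codes": {"negotiation", "legislation"}},
    {"id": "Q2", "codes": {"negotiation"}},
    {"id": "Q3", "codes": {"legislation", "expertise"}},
]

def query_all(items, *required):
    """Return the quotations coded with every one of the required codes."""
    return [q for q in items if set(required) <= q["codes"]]

# All quotations speaking both about "negotiation" and "legislation":
print([q["id"] for q in query_all(quotations, "negotiation", "legislation")])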

20) The authors jokingly admit that "exploration" was included among the principles mainly to obtain a nicer acronym (VISE). <back>

21) Visualisation and visual representation have been extensively debated in contemporary science studies (LYNCH & WOOLGAR, 1990; LATOUR, 1986; SNYDER, 1998). By means of visualisation we create conditions for controlling, manipulating, and accumulating small pieces of knowledge, often meaningless in themselves, and for integrating them into more elaborate and complex statements. Useful strategies of visualisation in qualitative research can be found in MILES and HUBERMAN (1994). <back>

22) Atlas.ti allows the creation of qualifiable links (links specifying different kinds of relation) either between individual quotations or between codes—these are called "strong links" (MUHR & FRIESE, 2004, p.212). <back>
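
Schematically, such a qualified link can be thought of as a typed edge between two analytical objects. The following sketch is again only an illustration under that assumption; the identifiers and relation names are invented and are not Atlas.ti's own.

from typing import NamedTuple

class Link(NamedTuple):
    source: str    # id of a quotation, or the name of a code
    relation: str  # the kind of relation, e.g. "contradicts" or "supports"
    target: str    # id of a quotation, or the name of a code

links = [
    Link("Q12", "contradicts", "Q47"),
    Link("negotiation", "is part of", "policy process"),
]

# Everything that quotation Q12 contradicts:
print([l.target for l in links if l.source == "Q12" and l.relation == "contradicts"])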

23) Some software packages, such as Ethnograph, originally did not even allow the creation of free quotations. <back>

24) The criteria should be applied sensitively. A quotation may be considered relevant and suitable for further attention even if it is coded by a single code or has no comment attached—provided, for instance, that it is coded by an especially important code. Simply put, the criteria are not strict, but they still provide good orientation. <back>
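
Expressed as a rough heuristic, this kind of soft filtering might look like the following sketch; the field names and the set of "especially important" codes are assumptions made for illustration, not features of the programme.

IMPORTANT_CODES = {"negotiation"}  # assumed example of "especially important" codes

def looks_relevant(quotation):
    """Flag a quotation as worth revisiting if any of the soft criteria holds."""
    return (
        len(quotation["codes"]) > 1                    # coded by more than one code
        or bool(quotation.get("comment"))              # has a comment attached
        or bool(quotation["codes"] & IMPORTANT_CODES)  # carries an especially important code
    )

example = {"id": "Q7", "codes": {"negotiation"}, "comment": ""}
print(looks_relevant(example))  # True: it carries an especially important code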

25) It should be noted, for those who are not familiar with programmes such as Atlas.ti, that one can easily jump from lists and overviews to the quotations themselves and their original locations. In other words, we not only see whether a quotation is commented on and how many links it has to other objects; we can immediately read it in full. <back>

26) This might be considered as a "system closure" (RICHARDS & RICHARDS, 1994). <back>

27) Seeing as instructed and material practice has been nicely demonstrated, from an ethnomethodological perspective, in a recent conference paper by LAURIER and BROWN (2005). <back>

28) Instructability was recently highlighted by GARFINKEL (2002) as a key concept for the ethnomethodological understanding of practical action, which can never fully depend on rule-following but which is still understandable and accountable. <back>

29) I refer here, above all, to phenomenology (MERLEAU-PONTY, 2002 [1945]), ethnomethodology (GARFINKEL, 1967; HOLSTEIN & GUBRIUM, 1994), post-structuralism (DERRIDA, 1976; DENZIN, 1994, 1995), and constructivism (BERGER & LUCKMANN, 1967; SCHWANDT, 1994). <back>

References

Ashmore, Malcolm & Reed, Darren (2000). Innocence and nostalgia in conversation analysis: The dynamic relations of tape and transcript [45 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 1(3), Art.3, http://qualitative-research.net/fqs-texte/3-00/3-00ashmorereed-e.htm [Date of access: February 17, 2008].

Bauman, Zygmunt (1991). Intimations of postmodernity. London: Routledge.

Becker, Howard S. (1993). Theory: The necessary evil. In David J. Flinders & Geoffrey E. Mills (Eds.), Theory and concepts in qualitative research: Perspectives from the field (pp.218-229). New York: Teachers College Press.

Berger, Peter L. & Luckmann, Thomas (1967). The social construction of reality: A treatise in the sociology of knowledge. Harmondsworth: Penguin Books.

Coffey, Amanda; Holbrook, Beverley & Atkinson, Paul (1996). Qualitative data analysis: Technologies and representations. Sociological Research Online, 1(1), http://www.socresonline.org.uk/1/1/4.html [Date of access: February 17, 2008].

Creswell, John W. (1997). Qualitative inquiry and research design: Choosing among five traditions. London: Sage.

Denzin, Norman K. (1994). The art and politics of interpretation. In Norman K. Denzin & Yvonna S. Lincoln (Eds.), Handbook of qualitative research (pp.500-515). London: Sage.

Denzin, Norman K. (1995). The poststructural crisis in the social sciences: Learning from James Joyce. In Richard Harvey Brown (Ed.), Postmodern representations: Truth, power, and mimesis in the human sciences and public culture (pp.38-59). Urbana & Chicago: University of Illinois Press.

Derrida, Jacques (1976). Of grammatology. Baltimore, MD: Johns Hopkins Press.

Garfinkel, Harold (1967). Studies in ethnomethodology. Englewood Cliffs: Prentice Hall.

Garfinkel, Harold (2002). Ethnomethodology's Program: Working out Durkheim's aphorism. Oxford: Rowman & Littlefield Publishers, Inc.

Gieryn, Thomas F. (2006). City as truth-spot: Laboratories and field-sites in urban studies. Social Studies of Science, 36(1), 5-38.

Glaser, Barney & Strauss, Anselm L. (1967). The discovery of grounded theory: Strategies for qualitative research. New York: Aldine de Gruyter.

Guba, Egon G. & Lincoln, Yvonna S. (1994). Competing paradigms in qualitative research. In Norman K. Denzin & Yvonna S. Lincoln (Eds.), Handbook of qualitative research (pp.105-117). London: Sage.

Gubrium, Jaber F. & Holstein, James A. (1997). The new language of qualitative method. New York: Oxford University Press.

Holstein, James A. & Gubrium, Jaber F. (1994). Phenomenology, ethnomethodology, and interpretive practice. In Norman K. Denzin & Yvonna S. Lincoln (Eds.), Handbook of qualitative research (pp.262-272). London: Sage.

Knorr-Cetina, Karin & Mulkay, Michael (Eds.) (1983). Science observed: Perspectives on the social studies of science. London: Sage.

Konopásek, Zdeněk (1997). Co si počít s počítačem v kvalitativním výzkumu: program ATLAS/ti v akci [What is computer-assisted qualitative data analysis good for? Atlas.ti in action]. Biograf, 12, 71-110.

Konopásek, Zdeněk (2005a). Aby myšlení bylo dobře vidět: Nad novou verzí programu Atlas.ti [Making our thinking visible: A review of the new version of Atlas.ti]. Biograf, 37, 89-109.

Konopásek, Zdeněk (2005b). Co znamená interpretovat text? [What does it mean to interpret a text of qualitative data?] In Michal Miovský, Ivo Čermák & Vladimír Chrz (Eds.), Kvalitativní přístup a metody ve vědách o člověku—IV: Vybrané aspekty teorie a praxe [Qualitative approach and methods in the human sciences—IV: Toward some aspects of theory and practice] (pp.85-95). Olomouc: FFUP.

Latour, Bruno (1986). Visualisation and cognition: Thinking with eyes and hands. Knowledge and Society: Studies in the Sociology of Culture Past and Present, 6, 1-40.

Latour, Bruno (1987). Science in action: How to follow scientists and engineers through society. Milton Keynes: Open University Press.

Latour, Bruno (1995). The "pédofil" of Boa Vista: A photo-philosophical montage. Common Knowledge, 4(1), 144-187.

Latour, Bruno & Woolgar, Steve (1986). Laboratory life: The construction of scientific facts (2nd edition). Princeton: Princeton University Press.

Laurier, Eric & Brown, Barry (2005). Method and phenomena: Learning to see fish and flying objects. Paper prepared for the International Institute for Ethnomethodology and Conversation Analysis conference "International Perspectives", Bentley College, Waltham, MA, USA.

Law, John (1986). The heterogeneity of texts. In Michel Callon, John Law & Arie Rip (Eds.), Mapping the dynamics of science and technology: Sociology of science in the real world (pp.67-83). London: Macmillan.

Lincoln, Yvonna S. & Denzin, Norman K. (1994). The fifth moment. In Norman K. Denzin & Yvonna S. Lincoln (Eds.), Handbook of qualitative research (pp.575-586). London: Sage.

Lynch, Michael (1985). Art and artifact in laboratory science: A study of shop work and shop talk in a research laboratory. London: Routledge & Kegan Paul.

Lynch, Michael & Woolgar, Steve (Eds.) (1990). Representation in scientific practice. Cambridge: MIT Press.

Merleau-Ponty, Maurice (2002 [1945]). Phenomenology of perception. London: Routledge.

Mey, Günter & Mruck, Katja (Eds.) (2007). Grounded theory reader. HSR Supplement, 19.

Miles, Matthew B. & Huberman, A. Michael (1994). Qualitative data analysis: An expanded sourcebook (2nd edition). Thousand Oaks: Sage.

Miller, Peter & O'Leary, Ted (1994). The factory as laboratory. Science in Context, 7(3), 469-496.

Muhr, Thomas & Friese, Susanne (2004). User's manual for Atlas.ti 5.0 (2nd edition). Berlin: Scientific Software Development.

Richards, Tom & Richards, Lyn (1994). Using computers in qualitative research. In Norman K. Denzin & Yvonna S. Lincoln (Eds.), Handbook of qualitative research (pp.445-462). London: Sage.

Schwandt, Thomas A. (1994). Constructivist, interpretivist approaches to human inquiry. In Norman K. Denzin & Yvonna S. Lincoln (Eds.), Handbook of qualitative research (pp.118-137). London: Sage.

Seale, Clive (1999). The quality of qualitative research. London: Sage.

Snyder, Joel (1998). Visualization and visibility. In Caroline A. Jones & Peter Galison (Eds.), Picturing science—Producing art (pp.379-397). London: Routledge.

Strauss, Anselm L. (1987). Qualitative analysis for social scientists. Cambridge: Cambridge University Press.

Strauss, Anselm L. & Corbin, Juliet (1994). Grounded theory methodology: An overview. In Norman K. Denzin & Yvonna S. Lincoln (Eds.), Handbook of qualitative research (pp.273-285). London: Sage.

Teil, Genevieve & Latour, Bruno (1995). The Hume machine: Can association networks do more than formal rules? Stanford Humanities Review, 4(2), 47-65.

Author

Zdeněk KONOPÁSEK is a sociologist at the Center for Theoretical Study, the Institute for Advanced Studies of Charles University in Prague and the Academy of Sciences of the Czech Republic. He is also a member of the sociology department at the Faculty of Social Studies, Masaryk University in Brno. His main areas of interest are currently STS (science and technology studies), especially the relationship between science and politics in socio-technical controversies, and qualitative research methods. He edited Our Lives as Database: Doing a Sociology of Ourselves—Czech Social Transitions in Autobiographical Research Dialogues (Charles University Press, 2000). Since 1994, he has been editor-in-chief of Biograf, a Czech and Slovak peer-reviewed journal for qualitative social research.

Contact:

Zdeněk Konopásek

Center for Theoretical Study
(The Institute for Advanced Studies at Charles University and the Academy of Sciences of the Czech Republic)
Jilska 1
110 00 Praha 1, Czech Republic

E-mail: zdenek@konopasek.net
URL: http://zdenek.konopasek.net/

Citation

Konopásek, Zdeněk (2008). Making Thinking Visible with Atlas.ti: Computer Assisted Qualitative Analysis as Textual Practices [62 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 9(2), Art. 12, http://nbn-resolving.de/urn:nbn:de:0114-fqs0802124.

Forum Qualitative Sozialforschung / Forum: Qualitative Social Research (FQS)

ISSN 1438-5627

Creative Commons License

Creative Commons Attribution 4.0 International License