4.1 Comparison of the four methods modelled
In the previous section we modelled and discussed what is required to usefully report individual qualitative analysis methods within interdisciplinary studies. In this sub-section we step back and comparatively discuss the methods we modelled.
Only content analysis identified the variables of interest (i.e. specified codes) before starting coding and analysis. The other three approaches identified variables inductively through ‘bottom-up’ coding, either adopting the terms used by interviewees (e.g. the ‘folk terms’ in the domain analysis and the flowery language in the metaphor analysis) or identifying rules of inclusion and exclusion in the membership categorization analysis. In those three methods, once variables were identified, they were converted into an analytic framework that was deductively applied to the remaining texts and checked in an iterative cycle against the texts already coded.
Content analysis, as conducted here, consisted of counting the number of times the terms determined to be relevant appeared in the text. Although not presented here, the analysis may be continued through the use of theoretically motivated descriptive and correlational statistics, which would then be reported according to the norms governing the reporting of quantitative analysis. We were able to report this analysis method transparently because its operation relied on deductive application of a clearly declared coding scheme. In our example, the content analysis gave information on household composition, what households shared, the ICT tools used, and the frequency and duration of communication with those tools.
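The deductive character of this operation can be illustrated with a minimal sketch. The coding scheme, code names, and terms below are hypothetical stand-ins, not the scheme used in the study; the point is only that the scheme is declared in full before any text is examined, and the analysis then reduces to counting.

```python
import re
from collections import Counter

# Hypothetical coding scheme: each code maps to the exact terms
# declared as relevant before coding began (deductive application).
CODING_SCHEME = {
    "household_composition": ["wife", "husband", "children"],
    "ict_tools": ["phone", "whatsapp", "email"],
}

def content_analysis(transcript: str) -> dict:
    """Count occurrences of each pre-declared term, grouped by code."""
    tokens = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(tokens)
    return {
        code: {term: counts[term] for term in terms}
        for code, terms in CODING_SCHEME.items()
    }

example = "My husband and children share one phone; we use WhatsApp daily."
print(content_analysis(example))
```

Because the scheme is fixed in advance and the counting rule is mechanical, the entire analysis can be reported transparently: another researcher applying the same scheme to the same transcripts obtains the same counts.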
Domain analysis, as used in this example, permitted us to work from manifest features of the transcripts to identify cognitive structures used by respondents in the interview. Domain analysis may be used on a substantial set of interviews, with special attention to the presence or absence of subgroups’ use of folk terms. For the purposes of simplicity, in this example we did not examine the interaction between domain and metaphor analysis. In practice, the folk terms identified as key within a domain analysis may themselves be metaphors, or may be closely associated with them. In these cases, the additional layers of meaning that metaphor analysis associates with terms must be carried forward through the domain analysis, as the often-normative shadings that come with metaphors may be analytically relevant.
Membership Categorization Analysis (MCA) added to the domain analysis by identifying who was a member of the household and who was not; in the domain analysis the emphasis was on the significance of the household, without taking into account who was part of it and what made them so. MCA, like domain analysis, may be used in the analysis of substantially sized data sets, although both seem less suitable for large data sets than content analysis.
We were not able to produce an adequate metaphor analysis. Like domain analysis and MCA, metaphor analysis requires repeated close reading of transcripts. Because the metaphors studied are found precisely at the intersection of language and culture, which collided here in an interview setting foreign to both, we could not apply thematic codes with the sort of confidence possible with content analysis. While we were able to identify that there was a metaphor, we could not produce an unambiguous description of the range of possible associated meanings, nor could we reliably describe the rules that govern the association of these meanings with the flowery language we identified. Were we reporting a metaphor analysis in a context where no part of the research aspired to be a valid account, as is appropriate in some exploratory studies or in those where the assumptions required for valid descriptions are not met, we might have chosen to proceed by associating our own meanings with the identified metaphors. In the context of a project that is both inter-disciplinary and mixed-methods, however, it is not appropriate for researchers to silently include speculation as data. When there is reason to believe that the words used by respondents have connotations that are analytically relevant, it is certainly appropriate to recognize those connotations. Identification of these connotations, however, would require the explicit design of a transparently reported, distinct research effort to develop a formal ruleset for the identification and interpretation of the analytically relevant metaphors found in the narratives examined.
In our experience, and as suggested in the discussion of domain analysis given above, it is rarely possible to answer a socially relevant research question through use of a single method of qualitative analysis. Contrary to what we have observed to be common practice, it is not appropriate to report, for example, only that a ‘frame analysis’ was undertaken where that term describes interactive application of several constitutive methods. Each of these constitutive methods, and the means by which the data arising therefrom are combined, should be described separately. The level of detail required to support this sort of description may very well not fit either the norms or the space afforded in current publication fora. In those cases, the reporting of qualitative analysis useful for interdisciplinary teams will require publication of supplemental material.
4.2 Coding and analysis
Coding requires segmentation of a narrative into units of meaning that are, ideally, compatible with the conceptual framework within which the research questions were formulated and appropriate for the sort of analysis required to answer those questions. When reporting qualitative analysis of narrative data for interdisciplinary teams, this segmentation, and then the association of these segments with codes, should not be presented as analysis: these two steps most closely approximate the work done by a respondent when she provides a value in response to a structured survey item, or by a researcher when recording the value displayed on an instrument. With this in mind, a transparent discussion of coding is not, by itself, an adequate report of qualitative analysis. Once a narrative is segmented in a manner that fits the researcher’s conceptual framework, the texts so coded are data appropriate for analysis. How the narrative fragments are interpreted once coded is determined by the nature of the data analysis method chosen. For example, within MCA, text coded as ‘category bound activity’ is interpreted and used quite differently than text coded as ‘path metaphor’ within a metaphor analysis. In the absence of a well-established shared lexicon, the mechanisms and content of the interpretations made through analysis should be reported in detail. To be interpretable by inter-disciplinary teams, it may be better to report coding as ‘data processing’ and the manipulation and interpretation of the coded narrative fragments as ‘data analysis.’
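The two-step separation argued for above can be sketched schematically. The segments, code labels, and grouping step below are illustrative inventions, not material from the study; the sketch shows only that assigning codes to segments (‘data processing’) is a distinct operation from whatever the chosen method subsequently does with the coded fragments (‘data analysis’).

```python
from dataclasses import dataclass

@dataclass
class Segment:
    text: str   # a unit of meaning cut from the narrative
    code: str   # label assigned to that unit during coding

# Step 1, 'data processing': segmentation and code assignment.
# (Hypothetical segments and codes, for illustration only.)
segments = [
    Segment("we always eat together on Sundays", "category_bound_activity"),
    Segment("my brother moved away years ago", "membership_change"),
]

# Step 2, 'data analysis': interpretation of the coded fragments.
# What this step does depends entirely on the method chosen; here it
# is reduced to the simplest possible operation, grouping by code.
def group_by_code(segments: list[Segment]) -> dict[str, list[str]]:
    grouped: dict[str, list[str]] = {}
    for seg in segments:
        grouped.setdefault(seg.code, []).append(seg.text)
    return grouped

print(group_by_code(segments))
```

Reporting the first step as data processing and only the second as analysis makes the division of labour legible to team members accustomed to the survey-item or instrument-reading analogy invoked above.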
The results of content analysis, domain analysis and MCA may usefully be presented in the form of a table or graph, and in this article we showed examples of both. The graphs were produced within a qualitative data analysis program, in this case Atlas.ti. Presenting results in a way that does not rely solely on ‘typical’ quotes is recommended. When quotes are used, the justification for their selection, as considered in discussions around annotation for transparent inquiry, must be reported (https://qdr.syr.edu/ati).
4.3 Compatibility with contributions from the natural and life sciences
Qualitative analysis of subject response data within interdisciplinary studies is, appropriately, reductive. Some authors, for instance St. Pierre and Jackson (2014), argue that coding ought to be avoided entirely. They state that lecturers “teach analysis as coding because it is teachable” (St. Pierre & Jackson, 2014, p. 715) and reject the many textbooks and university research courses which, in their view, support a positivist, quasi-statistical analytic practice that reduces words to numbers. We agree that coding is a reductive exercise and that coding and analysis can be distinguished. We think, however, that this critique is not relevant here, as it makes epistemic assumptions that are not appropriate for inter-disciplinary mixed-methods research. Research on environmental challenges, for example, is funded to inform practice, and the measure of this research, ultimately, is predictive validity. For this, researchers must assume that the world described is somewhat stable, that descriptions thereof will converge, that the data they gather represent something more than instrument effects, and that it is possible to reduce the complexity of the world sufficiently to render a useful representation. If the purpose of qualitative inquiry within interdisciplinary efforts is to complement and extend quantitative findings, it is appropriate to adopt a compatible stance. The assumptions necessary to support such reductive analysis, as has long been discussed (e.g. Bergdahl 2019; Shankman et al. 1984), may not hold in some circumstances, and naïve combination of fundamentally different data does gross disservice to both.