Braun & Clarke 2022 - Chapter 9


Getting your own house in order: Understanding what makes good reflexive thematic analysis to ensure quality.


1

What is the key to quality in TA?

(a) Understanding good practice, and

(b) understanding problems or poor practice, so that you can recognize and avoid them (through a).

2

What are strengths or opportunities of reflexive TA?

  • Flexible with regard to theory, research question, data collection method, dataset size and generation strategy, and analytic orientation (inductive-deductive, semantic-latent, experiential-critical) and purpose (descriptive-interpretative, in-depth examination of selected data domains versus rich description or interrogation of meaning across the entire dataset). This means it has potential for wide-ranging application.

  • Status as a method, rather than a theoretically informed and delimited ‘off-the-shelf’ methodology. This means researchers must actively engage with questions of underlying theory and philosophy; knowing and reflexive use of TA is crucial for quality.

  • An accessible ‘starter method’ for those new to qualitative research.

  • Can highlight similarities and differences across the dataset.

  • Can generate unanticipated insights.

  • Allows for social as well as psychological interpretations of data.

  • Useful for experienced qualitative researchers seeking to produce complex, nuanced, sophisticated and conceptual analyses.

  • When used within an experiential framework, results are accessible to an educated general public.

  • Easy to incorporate into ethnography and participatory designs; theoretical flexibility avoids theoretical tensions and contradictions.

  • Flexibility and accessibility make it a useful method for community research designs, where participants are co-researchers and contribute to data analysis; also useful for designs where participants are invited to reflect on the resulting analyses or these are returned to participants.

  • Can be used to produce analyses with actionable outcomes and that can inform policy development.

3

What are limitations or challenges of reflexive TA?

  • Flexibility and wide range of potential applications can lead to ‘analytic paralysis’, especially for those new to qualitative research.

  • The researcher must engage with theory before data analysis or risk theoretical assumptions and concepts being unknowingly and unreflexively imported into the analysis.

  • Flexibility with regard to theory and analytic orientation and purpose means it is difficult to formulate precise guidance for higher-level (more interpretative) analysis. As with many other qualitative approaches, doing TA involves ‘craft skills’ that are difficult to distil into recipe-like guidance.

  • Limited interpretative power if not used in combination with a particular theory or concepts.

  • Cross-case orientation means the complexities and contradictions in the accounts of individual participants can be difficult to retain/capture in the analysis (especially in research with larger samples).

  • Can’t be used for a fine-grained analysis of language practice (see also N. King & Brooks, 2018).

4

With regard to analysis that is not fit for purpose, what are some problems in TA?

  • Analysis fails to address, or only partially addresses, the stated research question (e.g. the focus of the analysis has shifted from the original intent but the research questions or aims have not been revised to reflect this).

  • The analysis does not cohere with the claimed theoretical and philosophical assumptions; there is a disconnect between the claimed assumptions and the enactment and reporting of the analysis.

  • The theoretical assumptions guiding the analysis are not explicated; the analysis is treated as atheoretical. Theoretical assumptions and concepts are imported unacknowledged and unknowingly into the analysis.

5

With regard to analysis that is not fit for purpose, what is some good practice in TA?

  • Research question revisited in light of the developed analysis. The analysis clearly and fully addresses the research question.

  • The research exhibits good conceptual ‘fit’ and ‘methodological integrity’ (Levitt et al., 2017). The TA approach coheres with the theoretical assumptions of the study. The analysis as reported matches the theoretical positions declared.

  • The researcher clearly explicates the philosophical and theoretical assumptions underpinning their use of TA, and the analysis enacts and reflects these. The use of TA is theoretically knowing and reflexive.

6

With regard to weak or underdeveloped themes (evidence of premature analytic closure), what are some problems in TA?

  • Use of data collection questions as ‘themes’; simply summarizing what participants said in relation to each question (topic summaries as ‘themes’).

  • Use of topic summaries as themes.

  • Only summarizing what participants said; little or no analytic (interpretative) work undertaken beyond summarizing the data content.

  • Themes are poorly named (e.g. one-word theme names). The theme names convey little information about the essence or central concept of the theme.

  • Using existing disciplinary concepts as theme names. Data are simply summarized within an existing concept or framework (this is particularly problematic in analyses described as inductive).

  • Too many themes: Themes are thin and scrappy, containing few or even only one analytic observation; confusion between codes (single facet) and themes (multiple facets). Discussion of themes lacks depth and detail.

  • Too few themes: Themes are long and overly complex; lack coherence, focus and boundaries.

  • Analysis is overly fragmented - many different levels of themes reported.

  • Analysis is thin - themes are underdeveloped.

  • Too much overlap between themes.

  • Too little relationship between themes. Themes appear completely unrelated; themes do not tell an ‘overall story’.

  • Themes are not internally consistent or coherent, appearing to lack a central underpinning concept.

7

With regard to weak or underdeveloped themes (evidence of premature analytic closure), what is some good practice in TA?

  • Themes are not limited to data collection questions and evidence thoughtful, reflective analytic work that develops and interprets patterns - sometimes beyond semantic content.

  • Themes cohere around a shared central organizing concept. An individual theme does not report ‘diverse’ meaning in relation to a topic, unless contradiction is the focus of the theme or diverse meaning at a semantic level is underpinned by a unifying latent concept.

  • Analysis goes beyond data summary (data ‘reduction’) to interpret, and to explain the significance of the data, in relation to the research question.

  • The theme name (usually a brief phrase) captures something of the theme essence or central concept, orienting the reader to what is to follow. Data quotations might be used for (some) theme names, perhaps slightly paraphrased and with an explanatory sub-title.

  • Existing theoretical concepts are employed knowingly and reflexively, as tools to enrich the analysis, not as delimiting boundaries for it. What the analysis contributes to the existing literature, how it extends and develops this, are clearly articulated.

  • The number of themes is appropriate and each theme is presented in depth and detail; the boundaries between themes are clear. Six or fewer themes reported in an 8,000-word report. Themes are coherent and focused and capture significant patterns of meaning. Although themes are unified by a central concept, they are not limited to one analytic observation (single facet) but capture a cluster of related observations (multiple facets).

  • There is a maximum of three theme levels - overarching themes, themes and subthemes - and judicious use of the latter to highlight a facet of the central concept.

  • Analysis is thick and tells a rich interpretative story that goes beyond simple description.

  • Each theme is distinct, but the themes work together to tell an overall story about the data in relation to the research question. The relationship between the themes is clearly evident or explained.

  • The purpose or focus of each theme is clear. Themes are underpinned by a central organizing concept.

8

With regard to weaknesses of interpretation (in the presented analysis), what are some problems in TA?

  • Data are interpreted in a contextual vacuum; failure to situate data within relevant (social, political, policy, etc.) contexts.

  • Themes consist of headings, one or two sentences of framing analytic narrative and a long string of data extracts. Analytic narrative is largely absent or underdeveloped.

  • Analysis evidences ‘arguing with the data’ or ‘taking sides’ - disagreeing with some participants by citing evidence to show their beliefs are mistaken; being overtly judgmental or critical of participants’ perspectives in experiential TA.

  • Paraphrasing rather than interpreting data. The analytic narrative simply repeats what the participant said in slightly different words.

  • No analysis is done. The data are left to ‘speak for themselves’, sometimes framed as the research not intervening in participants’ accounts. The researcher assumes the meaning and significance of the data are obvious.

9

With regard to weaknesses of interpretation (in the presented analysis), what is some good practice in TA?

  • The data are contextualized and located within relevant contexts, including, where appropriate, the wider social context.

  • Themes consist of a fully developed and rich analytic narrative, with data extracts embedded. The analytic narrative is (ideally) thoughtful, insightful, compelling, nuanced and multi-faceted.

  • The researcher’s orientation to the data is that of curiosity and making sense of meanings, rather than judgement. The researcher explains what is interesting or important about the data, in relation to the research question and the relevant contexts.

  • The analytic narrative explains the relevance of data content.

  • The researcher explains to the reader what meaning they make of the data, and the relevance and significance of this.

10

With regard to the relationship between data and analytic narrative, what are some problems in TA?

  • Connection between data extracts and analytic claims is unclear or absent. Data extracts do not convincingly or compellingly illustrate what is claimed.

  • Too many or too few data extracts used. At the extremes, no data extracts to illustrate analytic claims/the theme consists of a sentence or two of analytic commentary, then a string of data extracts.

  • Several data extracts are presented to illustrate minor analytic observations. One or no data extracts are presented to illustrate major analytic observations.

  • Failure to consider other obvious interpretations of the data in a way that undermines the convincingness of the analysis.

  • Insufficient evidence of patterning of themes across the dataset. Evidence for themes is undermined by over-quoting a small number of data items and failing to quote extracts from across the dataset.

11

With regard to the relationship between data and analytic narrative, what is some good practice in TA?

  • Good ‘fit’ between data extracts and analytic claims. The selected extracts are vivid and compelling.

  • Good balance between data extracts and analytic narrative - the precise proportion depends on the type of analysis undertaken, but generally the analysis will consist of at least 50% analytic narrative.

  • Major (complex) analytic claims are well illustrated with relevant data extracts. Minor analytic points are merely noted or illustrated with one extract or a few very short extracts.

  • If there are other (fairly obvious) interpretations of the data, these are considered. The researcher makes a case for why their interpretation is compelling, perhaps drawing on evidence from elsewhere in the dataset or existing literature.

  • The researcher carefully selects data extracts from across the dataset, including a variety of participants/data items, to demonstrate patterning across the dataset. Multiple extracts from one (particularly articulate and expressive) participant or data item are balanced by extracts from across the other data items.

12

What is ‘premature closure of analysis’?

Premature closure of analysis happens when the researcher stops analyzing their data when they have produced only superficial results, capturing the most obvious meanings in the data (Connelly & Peltzer, 2016).

13

How does the ‘premature closure of the analysis’ play out?

In TA, this plays out in a number of ways:

1.) The way themes are conceptualized;

2.) Lack of interpretative engagement;

3.) Uncritical use of theoretical concepts.

14

How do confusing topics and themes cause ‘premature closure of analysis’?

Confusing topics and themes: Connelly and Peltzer (2016) argued that premature closure is often related to a confusion between categories (or data topics) and themes. How do you know if you’re doing this? Connelly and Peltzer suggested that a topic can usually be labelled with one word (e.g. stigma; gender), whereas a theme generally requires a longer and more nuanced label to capture its essence (see also Sandelowski & Leeman, 2012). Premature closure related to topic summary ‘themes’ can also happen when data collection questions (e.g. interview or qualitative survey questions) are used as ‘themes’, with each theme effectively consisting of summaries of participant responses to interview questions (e.g. Lorch et al., 2015). It might seem reasonable to know the diversity of views around a topic, and there might indeed be some use in this. But some have argued that “knowing the difference between a theme and a topic is foundational to the crafting of accessible research findings” (Sandelowski & Leeman, 2012, p. 1407), and to actionable or usable research results in applied research (see also Connelly & Peltzer, 2016). Braun and Clarke would like to add that it is also vital to high quality (reflexive) TA practice. Hence, topic summaries are often the result of researchers treating their data collection questions as themes and summarizing participant responses to each question.

15

How does lack of interpretative engagement with the data cause ‘premature closure of analysis’?

Lack of interpretative engagement with the data: Superficial analyses can also reflect a lack of interpretative engagement with the data. If the explicit goal of analysis is to produce a surface reporting of what participants said, then that is one thing. Occasionally this might be the intentional analytic goal, especially when the data are more concrete. But more typically, this seems to reflect underdeveloped analysis. The data themselves are not analysis. Analysis for reflexive TA is the result of a (subjectivity-inflected) interpretation of data by the researcher. Reporting a large number of themes is also suggestive of premature closure and a superficial engagement with data. DeSantis and Ugarriza (2000) argued that reporting too many themes is equivalent to giving the reader unanalyzed data; it precludes any meaningful interpretation of the analysis, and dilutes the unifying function of a theme. Interpretative engagement with data is key for high quality reflexive TA. Hence, reporting a large number of themes, and subthemes, is a ‘red flag’ for premature closure.

16

How does uncritically using existing theory and concepts cause ‘premature closure of analysis’?

Uncritically using existing theory and concepts: Premature closure can also occur when pre-existing (disciplinary) concepts and terminology - ‘small’ theory - are used to provide a structure or framework for the analysis. Such concepts and terminology must be defined and used knowingly and reflexively; without this, the use of such concepts can result in a superficial analysis, one that risks simply recycling existing knowledge instead of developing new understandings related to the dataset and context of the research (Connelly & Peltzer, 2016). Such analyses might not be understood or positioned as ‘deductive’, but they effectively are that, if reading and interpreting the data is guided and constrained by (the unknowing use of) such concepts. One important question too rarely addressed is whether the researcher’s engagement with the data was limited rather than expanded by their - explicitly acknowledged or not - favored theory. This is an important question for all analysis; what we bring inevitably shapes the scope of our possible sense-making and our interpretive lens. At worst, the use of pre-existing theory results in an analysis that ‘fits into’ the theoretical framework. Theory merely provides a framework for presenting the data. To avoid analytic foreclosure, reflexive awareness and critical self-questioning around theory are essential. There is a real risk that working deductively and using pre-existing (small) theory can result in an impoverished analysis - merely fitting the data into the existing theoretical framework. Reflexivity, and interrogating how you’re engaging with the data and the theory in the analytic process, are key to avoiding this problem. Ask yourself: (how) are implicit ideas or theories - often these are disciplinary-embedded and even invisible ones - shaping and limiting my engagement with the data?

17

What is ‘methodological integrity’?

What’s key in choosing - and using - an analytic approach is conceptual coherence and ‘fit’ (see Braun & Clarke, 2021b, and Design Interlude). Does this method, and this version of it, suit your purpose? Braun and Clarke find Levitt et al.’s (2017) concept of methodological integrity useful for thinking about quality. Methodological integrity captures alignment and coherence in research design and procedures, research questions and theoretical assumptions, so that a research project produces a trustworthy and useful outcome. Methodological integrity requires a thoughtful, reflexively aware researcher, something Braun and Clarke have argued is vital for quality TA practice. Braun and Clarke’s approach to TA is not like a baking recipe that must be followed exactly for the outcome to be successful. Doing good quality TA is far more about sensibility than strictly following procedure or technique. This is not to say that procedures are redundant; rather, following procedures ‘to the letter’ is no guarantee of quality.

18

What is ‘theoretical knowingness’?

Remembering that Braun and Clarke conceptualize TA as an adventure, not a recipe (Willig, 2001), our recommendations for ensuring quality in reflexive TA center on ways to foster depth of engagement, researcher reflexivity and theoretical knowingness. The concept of theoretical knowingness captures the practice of engaging with and deploying theory deliberatively and reflexively in our research.

19

What does ‘high-quality [qualitative] research’ depend on?

For Braun and Clarke, quality depends not on notions of consensus, accuracy or reliability, but on immersion, creativity, thoughtfulness and insight. It depends on moving beyond the obvious or superficial meanings in the data - unless your purpose is expressly and knowingly to focus on these. As “it takes much effort, time and reflection to develop the craft of TA” (Trainor & Bundon, 2020, p. 20), Braun and Clarke emphasize time as a key resource for reflexive TA research. British social psychologist Brendan Gough and New Zealand health psychologist Antonia Lyons referred to “the (slow) craft of doing high quality [qualitative] research” (2016, p. 239). Braun and Clarke have similarly emphasized the importance of the “slow wheel of interpretation” to high quality qualitative research (Braun & Clarke, 2021d). For Gough and Lyons, “creative thinking, theorizing, imagination, patience are all essential to high quality research and thus to the production of new and different knowledge” (2016, p. 239). Braun and Clarke heartily agree!

20

What series of strategies do Braun and Clarke list that can help researchers maintain a curious and open stance, to keep their adventurous spirit alive, and encourage fresh perspectives in their data?

  • Reflexive journaling;

  • Allowing plenty of time for your analysis;

  • Gaining insights from others (e.g. peers, supervisors, co-researchers);

  • Naming themes carefully;

  • Drawing inspiration from good published examples;

  • Demonstrating quality through an ‘audit trail’.

21

What is Braun and Clarke’s 15-Point Checklist for good reflexive TA - version 2022?

Transcription

1. The data have been transcribed to an appropriate level of detail; all transcripts have been checked against the original recordings for ‘accuracy’.

Coding and theme development

2. Each data item has been given thorough and repeated attention in the coding process.

3. The coding process has been thorough, inclusive and comprehensive; themes have not been developed from a few vivid examples (an anecdotal approach).

4. All relevant extracts for each theme have been collated.

5. Candidate themes have been checked against coded data and back to the original dataset.

6. Themes are internally coherent, consistent and distinctive; each theme contains a well-defined central organizing concept; any subthemes share the central organizing concept of the theme.

Analysis and interpretation - in the written report

7. Data have been analyzed - interpreted, made sense of - rather than just summarized, described or paraphrased.

8. Analysis and data match each other - the extracts evidence the analytic claims.

9. Analysis tells a convincing and well-organized story about the data and topic; the analysis addresses the research question.

10. An appropriate balance between analytic narrative and data extracts is provided.

Overall

11. Enough time has been allocated to complete all phases of the analysis adequately, without rushing a phase or giving it a once-over-lightly (including returning to earlier phases or redoing the analysis if need be).

Written report

12. The specific approach to thematic analysis, and the particulars of the approach, including theoretical positions and assumptions, are clearly explicated.

13. There is a good fit between what was claimed and what was done - i.e. the described method and reported analysis are consistent.

14. The language and concepts used in the report are consistent with the ontological and epistemological positions of the analysis.

15. The researcher is positioned as active in the research process; themes do not just ‘emerge’.

22

What is ‘reflexive journaling’ or a ‘reflective journal?’

Canadian nursing scholar Lorelli Nowell and colleagues describe a reflexive journal as a “self-critical account of the research process” (2017, p. 3), documenting the researcher’s own thoughts about the developing analysis and conversations about it with others. Reflexive journals invite and encourage an ongoing, embedded process of reflection about your research practices and assumptions throughout the research process (Nadin & Cassell, 2006), rather than a sporadic and compartmentalized engagement with reflexivity. Use the journaling process to reflect on how your assumptions and responses might delimit your engagement with the data, and to open up new and alternative interpretative possibilities. In journaling for quality TA, Braun and Clarke encourage you to reflect on the prior knowledge and assumptions you bring to the research and how these might shape your interpretation of your data. This is an important tool for avoiding ‘positivism creep’ - the inadvertent (re)appearance in Big Q TA of positivist assumptions that those of us with a background in a positivist-dominated discipline are particularly vulnerable to. Try to get distance from how you are making sense of your data and reflect also on your emotional responses to the data (and to any participants). To function as a reflexive journal, it needs to be a space where you question and push yourself, rather than a space to simply record your thoughts (see Cunliffe, 2004, 2016).

23

How do you avoid ‘positivism creep’ by developing a qualitative sensibility?

Something very common in both student and published TA research in many disciplines and research fields, particularly those in which (post) positivism dominates, is ‘positivism creep’. By this Braun and Clarke mean positivist assumptions slinking into the research, unacknowledged by the authors (Braun & Clarke, 2021c). Braun and Clarke come back to reflexivity - and specifically reflexivity around the disciplinary values, assumptions and norms you are embedded within (Wilkinson, 1988) - as a vital tool here, ensuring you undertake (reflexive) TA in a knowing way. Similarly, developing a thorough-going qualitative sensibility is key to avoiding positivism creep. A qualitative sensibility is a qualitative ‘head-space’ or orientation, a way of thinking about research underpinned by a deep-seated and almost ‘intuitive’ sense of the ethos of Big Q qualitative inquiry, which connects to bigger questions of ontology and epistemology.

24

What can talking about your data and analysis with others do?

Talking with others about your data and/or your developing analysis can be useful for clarifying your analytic insights and deepening your engagement with your data (it can also be a tool for reflexivity; Nadin & Cassell, 2006). Braun and Clarke believe that engagement with others always adds something to our analysis - even if it’s only validation that we’re noticing something useful. More typically, the questions that others have, or their different ‘takes’, can help us to deepen our interpretation. Alongside formal mentorship and supervision, talking and presenting to peers and others can be effective tools for developing the richness and quality of your analysis.

25

What do Braun and Clarke mean by the ‘slow wheel of interpretation’?

If there is one rule for reflexive TA, it’s that it always takes longer than you anticipate. Furthermore, it’s crucial to think of time not just in terms of hours spent analyzing the data, but also as a broader expanse of time to think, to reflect, to let ideas percolate; time for moments of clarity, sudden insight or inspiration; time to put things down and walk away for a while. A paper by Hong Kong authors effectively documents how the first author ‘dwelled with’ the data, reflecting, asking questions, seeking out theories that might spark insights into particular aspects of the data, interrogating his own assumptions and eventually developing an account of the implicit meanings in the data. This is what Braun and Clarke mean by the slow wheel of interpretation (Braun & Clarke, 2021d).

26

What can working with a more experienced qualitative researcher as a supervisor, mentor, or co-researcher help you with?

Working with a more experienced qualitative researcher as a supervisor, mentor, or co-researcher can help you avoid problems like premature closure of the analysis and to manage and contain anxieties about getting analysis ‘right’. Braun and Clarke encourage those new to TA to review and reflect with their supervisor, mentor or co-researcher at all stages of the research process.

27

What can the stages of the research process consist of that those new to TA may be reviewing and reflecting with their supervisor, mentor or co-researcher about?

  • Reviewing data for quality (such as early interviews or focus groups to ensure depth and richness);

  • Sharing initial analytic observations and insights;

  • Reviewing initial coding, thematic maps and theme definitions;

  • Reviewing a first attempt at a theme write-up;

  • And of course the first full draft of your analysis, if the person is your supervisor.

28

What is the aim of ‘review’ and ‘reflection’ from your mentor?

From a quality point of view, the aim of such review and reflection is not to determine whether you have got it ‘right’ or your mentor or supervisor ‘agrees’ with your analysis. Rather, it’s an opportunity to: explain and clarify your thinking; be questioned; explore alternative ways of making sense of and interpreting the dataset, coding and patternings, that you haven’t considered; and reflect on whether your particular standpoints or experiences are (problematically) limiting and constraining how you engage with the data. It’s a place where you can develop material for your reflexive journaling; a place where your assumptions can be ‘revealed’ and interrogated.

29

How do researchers make sure themes are themes, and name them carefully?

Besides the obvious point that for quality reflexive TA, themes need to be based on shared patterns of meaning, cohering around a central organizing concept, theme names matter. Names should convey the ‘essence’ and ‘intent’ of a theme. Theme names are like mini-mini ‘abstracts’ for themes. Individually each provides the reader with the headline to the story of that theme; together they headline the overall story of the analysis. Writing about an interview project with informants/participants who are nurses, Connelly and Peltzer noted: “when a researcher designates a 1-word theme, such as ‘collaboration,’ what does that mean in relationship to the experiences of the informants as interpreted by the researcher? Using only 1 word as a theme, there is no way of knowing, for example, if the experiences with collaboration were positive or negative, or whether collaboration is important to the nurses. One-word themes do not convey what the researcher found out about collaboration” (2016, p. 55). Poorly named themes are, unfortunately, something we often encounter in both student and published research. In scanning an abstract, we’re often struck by what might seem like underdeveloped or topic summary themes - but sometimes this is simply because the themes have not been appropriately named (Braun & Clarke, 2021c). The researcher’s job is to do the interpretative work of drawing out what is meaningful in the data and telling the reader about it, not leaving the reader to do the interpretive work themselves. Names are part of this.

30

What is ‘triangulation’?

Triangulation involves using multiple data sources, participant groups or researchers to gain a richer or multi-faceted account of the phenomena under study. Tracy (2010) suggested the metaphor of ‘crystallization’ as a qualitative alternative to triangulation. Instead of the realist overtones and suggestion of a singular reality and truth that triangulation embeds, crystallization - Tracy argued - evokes a goal of capturing a more complex, multi-faceted, but still thoroughly partial, understanding of a phenomenon. Braun and Clarke realize only now how well this fits with their conceptualization of themes as multi-faceted gems!
