Qualitative Data Analysis Notes

Introduction

  • Qualitative data analysis is a multifaceted, complex, and systematic process.
  • The approach to data analysis depends on the study's purpose, conceptual framework, and methodology.
  • Data sources include:
    • Transcripts of interviews
    • Narrative documents
    • Photographs
    • Newspapers
    • Field notes

Qualitative Data Characteristics

  • Qualitative research often results in more data than required due to its open nature.
  • Glesne (2011) referred to large data volumes as “fat data.”
  • Researchers must be methodical and organized to manage large data volumes successfully.
  • Data organization can be done manually or using computer software.
  • Computer software can store, sort, and retrieve data, but it cannot interpret the data; analysis remains the researcher's task.

Data Management

  • Interviews are a common source of qualitative data, typically audio-recorded and transcribed.
  • Verbatim transcription allows researchers to review and engage with the data fully.
  • Original words from interviews are often included in published studies for verification.
  • Ensuring privacy and confidentiality is crucial during data analysis.

Data Analysis Purpose

  • The purpose of data analysis is to answer the research question.
  • The general process involves:
    • Preparing and organizing the data
    • Reading and re-reading the database to reduce the data into codes
    • Reducing codes into themes that represent the findings
  • Each study is unique and relies on the researcher's creativity, intellect, style, and experience.

Data Analysis Goal

  • The overall goal is to look for “insight, meaning, understanding, and larger patterns of knowledge, intent, and action” in the data (Averill, 2015, p. 1).

Integrated Approach to Data Analysis

  • Many researchers integrate data collection and analysis.
  • This approach must be indicated in the research design.
  • Preliminary analysis starts as data accumulates to determine additional data needs and initial emerging codes.
  • Qualitative research is often inductive, examining specific details to generate a larger understanding of the phenomena.

Steps for Qualitative Data Analysis (Glesne, 2011)

  • Write notes during data collection to add observations and thoughts about the data.
    • Helps sort data into general categories, mark useful quotes, and note relevant questions.
  • Reflective journaling can help track and sort data.

Development of Coding Schemes

  • Coding is the process of progressive marking, sorting, resorting, defining, and redefining collected data.
  • Organize data into meaningful clusters by grouping similar or related data through multiple readings.
  • Data reduction is the process of selecting, focusing, simplifying, abstracting, and transforming data from transcriptions, field notes, and observations.

Strategies for Analyzing Qualitative Data (Miles, 2014)

  • Note patterns and repetitive themes.
  • Cluster data that share common characteristics.
  • Make metaphors (describe something by creating an image).
    • Example: "It all went pear-shaped" (a metaphor meaning things went wrong)
  • Count the number of instances an event or characteristic is mentioned.
  • Make contrasts and comparisons.
  • Partitioning variables – breaking down codes into sub-codes or themes into subthemes.
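The counting and clustering tactics above are the kind of sorting that software can assist with. The sketch below tallies how often codes appear across coded interview excerpts; the participant IDs and code labels are invented for illustration, and the counting only supports, never replaces, the researcher's interpretation.

```python
from collections import Counter

# Hypothetical coded excerpts: each excerpt has been tagged by the
# researcher with one or more codes (labels are illustrative only).
coded_excerpts = [
    {"participant": "P01", "codes": ["staff support", "stigma"]},
    {"participant": "P02", "codes": ["staff support"]},
    {"participant": "P03", "codes": ["access barriers", "staff support"]},
    {"participant": "P01", "codes": ["stigma"]},
]

# Count the number of instances each code is mentioned across excerpts.
code_counts = Counter(
    code for excerpt in coded_excerpts for code in excerpt["codes"]
)

print(code_counts.most_common())
# e.g. [('staff support', 3), ('stigma', 2), ('access barriers', 1)]
```

Sorting the tally from most to least frequent makes repetitive themes easy to spot, which is all the software contributes; deciding what "staff support" means in context is still analytic work.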

Example

  • Participant quote about methadone clinic staff: "They are there for you personally… They'd actually bring it to you or they find a means to get it to you… They didn't just think of me as somebody with a disease they see me as a person."

Types of Data Analysis

  • Subsuming the particulars of the study into the general – using a level of abstraction.
    • Abstraction is generalizing complex events to underlying concepts, removing complexities.
    • Example (chess): the concrete advice "control the centre of the board" abstracts to the general concepts of creating a positional advantage and keeping pieces active.
  • Factoring – generating words to express common findings.
  • Note relationships between variables.

Additional Data Analysis Techniques

  • Find intervening variables – variables that link concepts together.
  • Build a logical chain of evidence – validating each of the relationships identified.
  • Make conceptual or theoretical coherence – linking the overarching “how and why” of the phenomena under study.
  • Example of an intervening variable: income may link level of education to health status.

Drawing Conclusions and Verification

  • The researcher must stay open to new ideas, themes, and concepts during data collection and analysis.
  • Verification occurs as the data are collected.

Coding Types (Richards and Morse, 2007)

  • Descriptive – factual information/knowledge, e.g., gender, age, ethnicity.
  • Topic – labels a particular subject; short phrases or groupings of words.
  • Analytical – identifying patterns within codes, which create themes.
  • Codes can evolve during the analysis.
  • Codebooks are kept to organize codes into lists.
  • Analysis can go on for extended periods (weeks to months).
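A codebook can be kept in software as well as on paper. The sketch below shows one hypothetical way to organize codes into lists grouped under candidate themes; all theme names, code labels, and working definitions here are illustrative assumptions, not from the source.

```python
# A minimal codebook sketch: codes grouped under candidate themes,
# each with a working definition (entries are illustrative only).
codebook = {
    "Support": {
        "staff support": "References to helpful clinic staff behaviour",
        "peer support": "References to help from other participants",
    },
    "Barriers": {
        "access barriers": "Difficulty reaching or using services",
        "stigma": "Feeling judged or labelled by others",
    },
}

# Flatten the codebook into the simple list of codes described above.
code_list = [code for codes in codebook.values() for code in codes]
print(sorted(code_list))
```

Because codes evolve during analysis, a structure like this is easy to revise: a code can be renamed, given a sharper definition, or moved under a different theme as patterns become clearer.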

Coding and Themes

  • Coding goes through several stages before clear patterns appear, which become themes.
  • Once finalized, the researcher analyzes themes to draw interpretations and implications.

Thematic Analysis (Braun and Clarke, 2006)

  • Semantic – little analysis beyond what the participant stated, with some interpretation of significance.
    • Example: Themes from a focus group about donut preferences: (1) Blueberry fritter, (2) Boston cream, (3) Maple glazed, and (4) Plain.
  • Latent – identify underlying ideas, assumptions, and conceptualizations.
    • Example: Themes from a study about student nurses’ experiences: (1) initial clinical anxiety, (2) theory-practice gap, (3) clinical supervision, and (4) professional role.

Data Analysis Methods

  • Phenomenology
    • Immersion in the data—read and reread.
    • Extract significant statements.
    • Determine the relationship among themes.
    • Describe phenomena and themes.
    • Synthesize themes into a consistent description of the phenomenon.
  • Ethnography
    • Immerse in the data.
    • Identify patterns and themes.
    • Take cultural inventory.
    • Interpret findings.
    • Compare findings to the literature.
  • Grounded Theory
    • Divide data into discrete parts.
    • Compare data for similarities/differences.
    • Compare data with other data collected, continuously—constant comparative method.
    • Cluster into categories.
    • Develop categories.
    • Determine relationships among categories.
  • Case Study
    • Identify the unit of analysis.
    • Code continuously as data are collected.
    • Find commonalities and themes.
    • Analyze field notes.
    • Review and identify patterns and connections.

Conclusion

  • Explore the relationship between codes and themes.
  • Drawing conclusions involves examining and describing the relationship between the themes.

Data Display

  • An organized, compressed assembly of information that permits conclusion drawing and action.
  • Helps researchers and consumers understand the data.
  • Visual displays include graphs, flow charts, matrices, and models.
  • Vignettes of participants’ experiences are shared in narrative text.

Data Display: Results

  • The results and discussion sections provide findings and implications for practice.
  • The Results Section:
    • Findings are bound to the data and often presented in a table.
    • Qualitative data – can use descriptive statistics to describe the sample and text to reflect coding and themes.
    • Quantitative data – description and inferential statistics are presented.
    • All data must be presented; this sets the stage for the discussion section.

Data Display: Discussion

  • Discussion of the Results
    • Interpretation of the results involving all aspects of the study.
    • The researcher makes the data come alive.
    • The researcher interprets the numbers in quantitative data and the meaning of concepts in qualitative data.
    • The researcher interprets the data in light of the theoretical framework and the literature review.
    • Transferability is discussed.

Additional Discussion Points

  • Discussion of how the data may suggest additional or previously unrealized relationships.
  • Study limitations are discussed in relation to the steps of the research process.
  • Potential threats to the validity are discussed.
    • Internal and external.
  • Generalizability or inferences from the data are discussed.
  • Recommendations for practice or future research are suggested.

Trustworthiness

  • Is the data analysis/interpretation a fair representation?
  • Questions to consider:
    • What do you notice?
    • Why do you notice what you notice?
    • How can you interpret what you notice?
    • How do you know your analysis is “right”?

Criteria for Judging Scientific Rigour

  • Credibility - Truth of findings as judged by participants and others within the discipline.
    • Examines internal validity; member checks, prolonged data collection, triangulation; visibility of the researcher's worldview; description of how researcher interaction with participants affected the findings; negative and disconfirming cases; comparison to published literature.
  • Auditability - It is judged by the adequacy of the information that leads the reader through the research process, from the research question to the findings and conclusion.
    • Establishes trustworthiness, leaves an audit trail that others can replicate and clearly understand.
  • Fittingness - The degree to which study findings are applicable outside the study situation and how meaningful the results are to individuals not involved in the research.
    • Analogous to generalizability; how well the study "fits" other contexts and settings; findings "ring true."
  • Confirmability - Findings reflect the implementation of credibility, auditability, and fittingness standards.
    • Reviewing the audit trail together with the interpretation of findings can help determine that researchers not involved in the present study would arrive at similar conclusions as the study’s researchers.

Data Analysis Critique Criteria

  • Is the method of analysis clear?
  • Is it appropriate for the study?
  • Can you follow the analysis step by step?
  • Is there evidence that the interpretation accurately reflected what was said?
  • Are credibility, auditability, fittingness, and trustworthiness accounted for?