Critical Appraisal of Qualitative Studies

Nature of Qualitative Research

  • Qualitative research collects and analyzes non-numerical or narrative data (text, audio, video).
  • Aims to understand participants' experiences, opinions, and attitudes or to explore a concept.
  • Provides in-depth insight and may generate new research questions.
  • Helps in understanding complex contextual factors (social, legal, resource constraints) to inform best practice.

Qualitative Study Designs

  • Grounded Theory:
    • Constructs theory from observations of lived experiences.
    • Involves inductive reasoning, identifying themes from data.
    • Applies codes based on ideas arising from the data.
    • Typical questions:
      • What's going on?
      • What are people doing/saying?
      • What assumptions are made?
      • How do structure and context influence actions and statements?
  • Discourse Analysis:
    • Analyzes language beyond the sentence level.
    • Considers written, spoken, and non-verbal interactions.
    • Emphasizes the social aspects of communication.
    • Examines how language is used to build trust, create doubt, evoke emotions, or manage conflict.
    • Context is crucial for understanding meaning.
  • Phenomenology:
    • Studies experience to uncover the meaning of lived experiences.
    • Answers the question: What was it like to experience something?
    • Experiences shaped by previous experiences, beliefs, values, morals, culture, and religion.
  • Ethnography:
    • Studies cultures and subcultures.
    • Uncovers and describes the meaning of rituals, symbols, and customs.
    • Research question example: What does a diagnosis of diabetes mean to a person within a particular culture or community?
  • Case Study:
    • In-depth study of a person, family, group, community, institution, intervention, or program.
    • Aims to understand a complex issue or object.
    • Considers political, social, historical, and personal issues.
    • Data collection through observation and interviews.
    • Answers how or why questions.
    • Uses multiple sources of evidence (surveys, interviews, documents, etc.).

Inductive vs. Deductive Approaches

  • Quantitative research: deductive, aims to test existing theory.
  • Qualitative research: often develops theory inductively, from specific observations to broader generalizations.
  • Deductive Approach in Qualitative Research:
    • Researchers have an a priori expectation model.
    • Use an organizing framework of themes for coding.
  • Inductive Approach in Qualitative Research:
    • Works exclusively from the data.
    • Involves detailed re-readings of raw data to derive concepts and themes.
    • Moves recursively between data analysis and the existing literature.
    • Findings arise directly from the analysis of raw data, not prior expectations.

Critical Appraisal of Qualitative Research

  • Critical appraisal is necessary to assess trustworthiness before findings are implemented in practice.
  • Appraisal considers methods (data collection, analysis) and research design.
  • Quantitative research, by contrast, is assessed for reliability, validity, and generalizability.

Assessing Rigor and Trustworthiness in Qualitative Research

  • Focus on transferability, credibility, reflexivity, and transparency.
  • Transferability: the extent to which the study allows readers to connect the findings to settings and communities beyond the study.
  • Credibility: the believability and appropriateness of the research account.
  • Reflexivity: researchers' account of their engagement and influence on the study.
  • Transparency: making the entire research process explicit, including the rationale for decisions.

Frameworks and Checklists

  • Frameworks focus on overarching concepts like transferability, credibility, reflexivity, and transparency.
  • Checklists such as the SRQR (Standards for Reporting Qualitative Research) provide standards for reporting qualitative studies; the SRQR has also been adapted into a critical appraisal tool.
  • CASP (Critical Appraisal Skills Programme) Checklists:
    • Developed for critical appraisal of scientific studies.
    • Checklists for various study types (RCTs, systematic reviews, cohort studies, etc.).
    • CASP checklist available for qualitative studies.

Using the CASP Qualitative Checklist

  • A set of questions to help make sense of a study, each answered "yes", "no", or "can't tell".
  • No scoring system.
  • Two screening questions:
    1. Was there a clear statement of the aims of the research?
      • Goal, importance, and relevance of the research.
    2. Is qualitative methodology appropriate?
      • Does the research seek to interpret actions or subjective experiences?
  • If both screening questions are "yes," proceed with appraisal.

Section A: Are the Results Valid?

  • Was the research design appropriate to address the aims?
    • Justification for the chosen research design.
  • Was the recruitment strategy appropriate?
    • Explanation of participant selection.
    • Why selected participants are most appropriate.
    • Discussion around recruitment (e.g., why some chose not to participate).
    • Sampling:
      • Purposeful, not probabilistic.
      • Aims to maximize information-rich data.
      • Techniques:
        • Maximum Variation Sampling: deliberately recruits a wide range of characteristics (e.g., age, ethnicity).
        • Convenience Sampling: recruits the most readily available participants.
        • Snowball Sampling: participants invite others.
        • Stratified Sampling: uses above-average, average, and below-average cases for a particular variable of interest.
        • Homogeneous Sampling: focuses on a particular subgroup in a population who are hard to reach or retain.
        • Typical Case Sampling: helps describe and illustrate a programme to those who aren't familiar with it.
  • Data Collection:
    • Were data collected in a way that addressed the research issue?
    • Justification of setting for data collection and explanation of how data were collected.
    • Qualitative Data Sources:
      • Interviews (structured, semi-structured, face-to-face, telephone, video platform such as Zoom).
      • Written responses to open-ended survey questions.
      • Group interviews/focus groups.
      • Audio-taped/recorded transcriptions.
      • Field notes: record social phenomena, interactions, and behaviors.
        • Direct observation (researcher recording notes).
        • Indirect (audio/video recording).
    • Data Saturation: the point at which further data collection yields little or no new information, codes, or themes, signalling that participant recruitment and data gathering are complete.
    • Iterative Data Collection: cycles of data collection and analysis to test frameworks.
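
The idea of saturation can be made concrete with a small sketch: track how many new codes each successive interview contributes, and treat the data as saturated once recent interviews stop adding anything. The coded interviews, code names, and stopping window below are all hypothetical, not from any real study.

```python
# Hypothetical sketch: judging data saturation by counting the new codes
# contributed by each successive interview transcript.

def new_codes_per_interview(coded_interviews):
    """For each interview (a set of codes), count codes not seen before."""
    seen = set()
    counts = []
    for codes in coded_interviews:
        counts.append(len(codes - seen))  # codes this interview adds
        seen |= codes
    return counts

def saturated(counts, window=3):
    """Treat data as saturated once `window` consecutive interviews add no new codes."""
    return len(counts) >= window and all(c == 0 for c in counts[-window:])

# Hypothetical codes assigned to six interviews during analysis.
interviews = [
    {"access", "cost", "stigma"},
    {"cost", "family support"},
    {"stigma", "transport"},
    {"cost", "stigma"},
    {"access", "transport"},
    {"family support", "cost"},
]

counts = new_codes_per_interview(interviews)
print(counts)             # new codes per interview
print(saturated(counts))  # no new codes in the last three interviews
```

In practice saturation is a judgement call rather than a formula, but the sketch captures the logic: recruitment stops when the yield of new codes flattens out.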

Section B: What Are the Results?

  • Ethical Considerations:
    • Ethics committee approval.
    • Details of how research was explained to participants.
    • Discussion of issues (informed consent, confidentiality, effects on participants).
  • Data Analysis Rigor:
    • In-depth description of analysis process.
    • Clear derivation of categories and themes if thematic analysis is used.
    • Explanation of how data were selected to demonstrate analysis.
    • Presentation of sufficient data to support findings.
    • Consideration of contradictory data.
  • Qualitative Analysis Spectrum:
    • Quasi-Statistical: uses software to identify frequently occurring words and synonyms.
    • Immersion Crystallization: researchers immerse themselves in the data to identify patterns and themes.
  • Researcher's Role:
    • Reflection on the researcher's role and potential bias.
    • Bracketing: setting aside the researchers' own experiences and preconceptions during the study.
  • Clear Statement of Findings:
    • Explicitly presented findings.
    • Discussion of evidence for and against researchers' arguments.
    • Findings discussed in relation to the research question.
    • Discussion of credibility (triangulation, respondent validation, multiple coders).
  • Methods for quality control:
    • Content analysis: comparing statements made by participants on a particular topic to check for consistency.
    • Respondent validation/member checking: returning the researcher's interpretation to participants so they can confirm it authentically reflects their views.
    • Inter-rater agreement: two or more researchers independently apply the codes to the data to confirm they assign the same meaning to the themes.
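
The quasi-statistical end of the analysis spectrum above can be illustrated with a minimal word-frequency count. Dedicated software also handles synonyms, stemming, and phrases; this sketch only counts exact words, and the transcript text and stop-word list are hypothetical.

```python
# Minimal quasi-statistical sketch: count frequently occurring words in a
# transcript, ignoring common stop words.
from collections import Counter
import re

# Hypothetical, deliberately tiny stop-word list.
STOP_WORDS = {"the", "a", "i", "to", "and", "it", "was", "of", "my"}

def word_frequencies(text, top_n=5):
    """Return the top_n most frequent non-stop-words in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return counts.most_common(top_n)

# Hypothetical interview excerpt.
transcript = (
    "The cost of the medication worried me. I kept thinking about cost, "
    "and the cost meant I skipped doses. My family helped with the cost."
)
print(word_frequencies(transcript))
```

A recurring word such as "cost" flags a candidate topic, but frequency alone says nothing about meaning, which is why this sits at the mechanical end of the spectrum.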

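Inter-rater agreement is often quantified with Cohen's kappa, which corrects raw percentage agreement for the agreement expected by chance. A minimal sketch for two coders follows; the code labels and the two coders' assignments are hypothetical.

```python
# Cohen's kappa for two researchers independently coding the same excerpts:
# kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
# p_e is the agreement expected by chance.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of excerpts coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over codes of the product of marginal proportions.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes two researchers assigned to the same 10 excerpts.
a = ["cost", "stigma", "cost", "access", "cost",
     "stigma", "access", "cost", "stigma", "cost"]
b = ["cost", "stigma", "cost", "access", "stigma",
     "stigma", "access", "cost", "cost", "cost"]
print(round(cohens_kappa(a, b), 2))
```

Here the coders agree on 8 of 10 excerpts, but kappa is lower than 0.8 because some of that agreement would occur by chance alone.
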
Section C: Will These Results Help Locally?

  • How valuable is the research?
    • Contribution to existing knowledge.
    • Consideration of findings in relation to practice, policy, or research literature.
    • Identification of new research areas.
    • Discussion of transferability to other populations.
  • Key questions:
    • Does this study help me understand the context of my practice?
    • Does the study help me understand my relationships (e.g., with patients and families)?