
Pages 165-179 | Chapter 5: Introduction to Research Methodology (Cambridge A Level Psychology Notes)

5.0 Introducing Research Methodology

  • Psychology as a scientific discipline

    • Relies on empirical evidence to study cognition, emotions, and behavior.
    • Research methods test theories, establish cause-and-effect relationships, and ensure findings are reliable and valid.
    • Emphasizes the cyclical nature of research: formulate questions, design studies, collect data, analyse results, refine theories.
  • Key vocabulary and definitions

    • Research Method: A technique used to collect and analyse data to answer psychological questions (e.g., experiments, observations).
    • Empirical Evidence: Data gathered through direct observation or experimentation, rather than theory alone.
    • Causality: The relationship where one variable (cause) directly influences another (effect).
    • Reliability: The consistency of a research method or measure; results should be repeatable under the same conditions (a short numerical sketch follows this list).
    • Validity: The extent to which a study measures what it intends to and can be generalized to real-world settings.
    • Ecological Validity: How well findings apply to everyday life (high in natural settings, low in artificial labs).
    • Internal Validity: The degree to which a study accurately establishes causality without confounding factors.
    • External Validity: The degree to which results can be generalized beyond the study sample or setting.
  • Significant information on research practice

    • Psychology uses a range of methods to balance control (for causality) and realism (for applicability).
    • Example: lab-based studies offer high control but may lack mundane realism (everyday relevance).
    • Ethical considerations are introduced early (e.g., informed consent and avoiding harm) as per guidelines from bodies like the British Psychological Society (BPS).
    • The research process is cyclical: start with an aim, form hypotheses, select methods, analyse data, and evaluate.
    • Falsification principle: Hypotheses must be stated precisely enough that evidence could disprove them; a claim that cannot be falsified is not scientific.
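
To make the reliability definition above concrete, here is a minimal Python sketch (not from the source; all scores are invented) showing one common check, test-retest reliability, computed as a Pearson correlation between two administrations of the same measure.

```python
# Hypothetical illustration: test-retest reliability as a correlation.
# Each participant takes the same 20-word recall test twice; consistent
# scores across the two administrations indicate a reliable measure.
from scipy.stats import pearsonr

test_scores   = [12, 15, 9, 14, 11, 16, 10, 13]   # first administration (invented)
retest_scores = [13, 14, 9, 15, 12, 16, 9, 12]    # same participants, later (invented)

r, p_value = pearsonr(test_scores, retest_scores)
print(f"Test-retest reliability: r = {r:.2f}")    # values near +1 suggest high consistency
```

The same idea underlies inter-rater reliability, with two observers' ratings in place of two test administrations.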

5.1 Research Methods

  • This section focuses on specific research methods, beginning with experiments as a core approach for establishing causality.
  • It covers:
    • Variables
    • Hypotheses
    • Types of experiments
    • Examples from psychological studies
  • Self-reports may be introduced toward the end of this page range, but methodological concepts (e.g., sampling, ethics) begin later (around page 197 in the source).

Key Concepts: Variables, Hypotheses, and Aims

  • Aim: A broad statement of what the study intends to investigate.
    • Example: "To investigate the effect of sleep deprivation on memory recall."
  • Hypothesis: A precise, testable prediction about the relationship between variables.
    • Directional (One-Tailed) Hypothesis: Predicts the direction of the effect.
      • Example: "Participants exposed to noise will recall fewer words than those in silence."
      • Notation example: in terms of population means, \mu_{noise} < \mu_{silence}.
    • Non-Directional (Two-Tailed) Hypothesis: Predicts a difference without specifying direction.
      • Example: "Noise levels will affect word recall."
      • Notation example: \mu_{noise} \neq \mu_{silence}.
    • Null Hypothesis (H0): Assumes no effect or relationship.
      • Example: "Noise levels will have no effect on word recall; any difference is due to chance."
      • Notation: H_0: \mu_{noise} = \mu_{silence}.
    • Alternative Hypothesis (H1): Assumes an effect exists, opposing the null.
  • Operationalization: Defining variables in measurable terms for replication.
    • Example: Memory recall operationalized as the number of words remembered from a 20-word list.
    • Notation example: \text{Memory Recall} = \#\,\text{words recalled from a 20-word list}.
  • Independent Variable (IV): The factor manipulated by the researcher.
    • Example: Presence/absence of noise.
    • Notation: IV \in \{\text{noise present}, \text{noise absent}\}.
  • Dependent Variable (DV): The factor measured to assess the IV's effect.
    • Example: Number of words recalled.
    • Notation: DV = \#\,\text{words recalled}.
  • Extraneous Variable: Any unintended factor that could influence the DV.
    • If it systematically affects one group, it becomes a confounding variable.
  • Significant information: Variables must be operationalized for objectivity; aims provide context, while hypotheses make predictions that are testable and open to falsification. A worked sketch of testing the noise/silence hypothesis follows this list.
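
Putting aims, operationalization, and hypotheses together, below is a minimal Python sketch (not from the source; the recall scores are invented) of how the directional noise/silence hypothesis could be tested with an independent-samples t-test. The one-tailed alternative matches \mu_{noise} < \mu_{silence}.

```python
# Hypothetical illustration: testing the directional hypothesis that noise
# reduces recall. Each value is the operationalized DV (words recalled from
# a 20-word list) for one participant; the IV is noise present vs absent.
from scipy.stats import ttest_ind

recall_noise   = [8, 10, 7, 9, 11, 6, 9, 8]        # IV level: noise present (invented)
recall_silence = [12, 14, 11, 13, 10, 15, 12, 13]  # IV level: noise absent (invented)

# One-tailed test of H1: mu_noise < mu_silence against H0: mu_noise = mu_silence.
t_stat, p_value = ttest_ind(recall_noise, recall_silence, alternative="less")
print(f"t = {t_stat:.2f}, one-tailed p = {p_value:.3f}")
# A small p-value (conventionally < 0.05) would lead us to reject H0
# in favour of the directional alternative.
```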

Types of Experiments

  • Experiments manipulate the IV to observe changes in the DV with the aim of establishing causality.
  • Types vary by setting and level of control (e.g., laboratory, field, and natural experiments).

Significance and Practical Details of Experiments

  • Controls: Methods to reduce unwanted variation and bias.
    • Standardization: All participants receive the same instructions and procedures.
    • Random Allocation: Random assignment to groups to reduce selection bias.
    • Counterbalancing: Alternate the order of conditions to reduce order effects.
  • Order Effects: Changes in participants' performance due to the sequence of conditions.
    • Practice effect: Improvement from repetition.
    • Fatigue effect: Decline due to tiredness.
  • Demand Characteristics: Cues that reveal the study's aim, leading to unnatural behavior.
  • Participant Variables: Individual differences (e.g., IQ, motivation) that could confound results.
  • Overall, experiments are well suited to establishing causality, but greater control typically comes at the cost of realism.
  • Lab experiments often have high experimental realism (engaging tasks) but low mundane realism (everyday tasks).
  • Counterbalancing example: An ABBA sequence minimizes order effects in a repeated-measures design.
  • Random allocation example: Reduces selection bias in an independent-groups design (a short sketch of both controls follows this list).
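
As a concrete illustration of the controls listed above, here is a minimal Python sketch (not from the source; participant IDs are placeholders) of random allocation to independent groups and a simple counterbalancing scheme.

```python
# Hypothetical illustration of two experimental controls.
import random

participants = [f"P{i}" for i in range(1, 9)]   # placeholder participant IDs

# 1) Random allocation (independent-groups design): shuffling before splitting
#    spreads participant variables across conditions by chance, not by choice.
random.shuffle(participants)
group_noise, group_silence = participants[:4], participants[4:]
print("Noise group:  ", group_noise)
print("Silence group:", group_silence)

# 2) Counterbalancing (repeated-measures design): half the participants take
#    the conditions in the order A then B, the other half B then A; a single
#    participant can also follow an ABBA sequence (A, B, B, A) so that practice
#    and fatigue effects fall equally on both conditions.
orders = {p: ("AB" if i % 2 == 0 else "BA") for i, p in enumerate(participants)}
print("Condition orders:", orders)
```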

Introduction to Self-Reports (Likely starting around pages 178-179 in the source)

  • Definition: Methods where participants report their own thoughts, feelings, or behaviors.
    • Examples: Questionnaires, interviews.
  • Types:
    • Closed Questions: Fixed responses (e.g., yes/no, Likert scales) for quantitative data (a scoring sketch follows this list).
    • Open Questions: Free responses for qualitative data.
  • Vocabulary to know:
    • Counterbalancing: A technique used in experimental research to control for order effects in a repeated-measures design, where the same participants experience all levels of the IV.
    • Social Desirability Bias: Participants answer to appear favorable.
    • Filler Questions: Irrelevant items included to disguise the aim.
  • Strengths: Easy to administer; access to private thoughts.
  • Weaknesses: Bias from lying or misinterpretation; low response rates.
  • Significant information: Self-reports are subjective but useful for exploring attitudes and mental states (e.g., Baron-Cohen et al.'s 'Reading the Mind in the Eyes' test of theory of mind).
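
To show how closed questions produce quantitative data, here is a minimal Python sketch (not from the source; the items, the 1-5 scale, and the reverse-scored item are invented) that scores one participant's Likert-scale responses.

```python
# Hypothetical illustration: scoring closed (Likert) questions.
# Responses use a 1-5 scale (1 = strongly disagree, 5 = strongly agree).
# Q3 is worded negatively, so it is reverse-scored before summing.
responses = {"Q1": 4, "Q2": 5, "Q3": 2, "Q4": 3}   # one participant's answers (invented)
reverse_scored = {"Q3"}

def total_score(items, reverse, scale_max=5):
    total = 0
    for item, value in items.items():
        # Reverse-scoring flips the scale: a 2 on a 1-5 scale becomes a 4.
        total += (scale_max + 1 - value) if item in reverse else value
    return total

print("Total attitude score:", total_score(responses, reverse_scored))  # prints 16
```

Open questions, by contrast, would be analysed qualitatively (e.g., by grouping answers into themes), which this sketch does not cover.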

Connections, Ethics, and Real-World Relevance

  • Methodological trade-offs are central to study design: control vs realism, laboratory control vs ecological validity.
  • Ethical considerations underpin all methods: informed consent, minimizing harm, confidentiality, and voluntary participation consistent with guidelines such as the BPS.
  • Real-world relevance emerges through ecological validity and the application of findings to everyday settings.
  • The cycle of science emphasizes continual refinement of questions, methods, and theories based on results and replication.