
Introduction to Psychological Research Methods (OpenStax Psychology 2e – Chapter 2)

Why is Research Important?

  • Research validates claims with objective, tangible evidence.
  • Scientific research is empirical, relying on observable evidence.
  • Research tests ideas through systematic study; evidence supports or refutes claims rather than proving them absolutely.
  • Psychology, as a science, requires research for investigation, verification, and support of findings.

Use of Research Information

  • Advertising campaigns often misuse "scientific evidence."
  • Critical thinking about claims involves assessing:
    • Expertise of the claimant.
    • Potential gains from the claim.
    • Justification of the claim based on evidence.
    • Opinions of other researchers.

The Process of Scientific Research: Inductive vs Deductive Reasoning

  • Psychological research uses inductive and deductive reasoning.
  • Deductive reasoning: predicting results based on a general premise.
    • Example: "All living things require energy (premise), ducks are living things, therefore ducks require energy."
  • Inductive reasoning: drawing conclusions from observations.
    • Example: "Seeing many fruits on trees leads to the assumption that all fruits grow on trees."
  • Process:
    1. Hypotheses are generated deductively from theories (general premises).
    2. Hypotheses are tested against empirical observations.
    3. Conclusions drawn inductively from those observations build, refine, or broaden theories, which in turn generate new hypotheses.

The Scientific Method

  • The scientific method includes proposing hypotheses, conducting research, and creating/modifying theories.
  • Scientists use inductive reasoning to form theories, which generate hypotheses.
  • Theory: A well-developed set of ideas explaining observed phenomena.
  • Hypothesis: A tentative, testable statement (prediction) about the relationship between variables.
    • Predicts behavior if a theory is correct.
    • Often an "if-then" statement.
    • Must be falsifiable (capable of being proven incorrect).
  • Freud’s theories, like the division of the mind into id, ego, and superego, have lost favor due to being unfalsifiable.

Approaches to Research

  • Clinical or case studies
  • Naturalistic observation
  • Surveys
  • Archival research
  • Longitudinal and cross-sectional research

Clinical or Case Studies

  • Focus on one individual, typically in an extreme or unique psychological circumstance.
  • Provide extensive insight but are difficult to generalize to a larger population.
  • Example: Study of Genie, who suffered severe abuse and social isolation, to understand the effect on development.

Naturalistic Observation

  • Observation of behavior in its natural setting.
  • People and animals may alter their behavior when they know they are being observed, so observers try to remain unobtrusive.
  • Captures genuine, realistic behavior in everyday settings rather than behavior performed for the researcher.
  • Observer bias: Skewed observations aligning with observer expectations.
    • Establishing clear observation criteria in advance helps reduce observer bias.
  • Example: Jane Goodall’s naturalistic observations of chimpanzee behavior.

Surveys

  • Use a list of questions delivered on paper, electronically, or verbally.
  • Gather data from a sample (subset) of a larger population.

Archival Research

  • Uses past records or data sets to answer research questions and find patterns.

Longitudinal and Cross-Sectional Research

  • Cross-Sectional Research: Compares multiple segments of a population at a single time (e.g., different age groups).
  • Longitudinal Research: Studies the same group repeatedly over an extended period.
    • Researchers anticipate participant attrition (reduction in numbers).
    • Example: The CPS-3 study helps understand the association between smoking and cancer.
  • Attrition: Reduction in research participants due to dropouts over time.

Correlational Research

  • Correlation: Relationship between two or more variables.
  • Correlation Coefficient (r): A number from -1 to +1 indicating the strength and direction of the relationship (see the sketch after this list).
  • Positive Correlation: Variables change in the same direction.
  • Negative Correlation: Variables change in opposite directions; not the same as no correlation.
  • Scatterplots provide a graphical view of correlation strength and direction.
  • Stronger correlation = data points closer to a straight line.
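
A minimal illustration of the correlation coefficient described above (not from the textbook): the data and variable names below are invented, and Pearson's r is computed directly from its definition.

    import statistics

    # Hypothetical data: hours of sleep and self-reported mood (1-10 scale)
    hours_slept = [5, 6, 6, 7, 8, 8, 9]
    mood_rating = [3, 4, 5, 5, 7, 8, 8]

    def pearson_r(x, y):
        # Pearson's r: covariance of x and y divided by the product of their spreads
        mx, my = statistics.mean(x), statistics.mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    # r near +1 = strong positive, near -1 = strong negative, near 0 = weak or no relationship
    print(f"r = {pearson_r(hours_slept, mood_rating):+.2f}")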

Correlation Does Not Indicate Causation

  • Cause-and-effect relationship: Changes in one variable cause changes in another; can be determined only through experimental design.
  • Confounding variable: An outside factor affecting both variables of interest, falsely suggesting causation.
    • Example: Ice cream sales and crime rates increase with temperature (confounding variable).
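
A minimal simulation of the ice cream/crime example (all numbers are invented): because temperature drives both variables, they end up correlated even though neither causes the other.

    import random

    random.seed(0)
    # Temperature is the confounding variable; it influences both outcomes.
    temperature = [random.uniform(0, 35) for _ in range(200)]
    ice_cream_sales = [5 + 2.0 * t + random.gauss(0, 5) for t in temperature]
    crime_rate = [10 + 1.5 * t + random.gauss(0, 5) for t in temperature]

    def pearson_r(x, y):
        mx, my = sum(x) / len(x), sum(y) / len(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    # A strong positive correlation appears, yet there is no causal link between the two.
    print(f"r(ice cream sales, crime rate) = {pearson_r(ice_cream_sales, crime_rate):+.2f}")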

Illusory Correlations

  • Illusory Correlations: Perceiving a relationship between two things when no such relationship actually exists.
  • Confirmation bias: The tendency to focus on evidence that supports existing beliefs and to ignore evidence that contradicts them.
  • Illusory correlations can contribute to prejudicial attitudes and discriminatory behavior.
  • Example: The belief that a full moon affects behavior, which research does not support.

Causality: Conducting Experiments & Using the Data

  • Experiments are the only way to establish cause-and-effect relationships.
  • Experiments require precise design and implementation.

The Experimental Hypothesis

  • Hypotheses can be formulated through observation or review of previous research.

Designing an Experiment

  • Experimental group: Participants experiencing the manipulated variable.
  • Control group: Participants not experiencing the manipulated variable; serves as a comparison basis.
  • Experimental manipulation should be the ONLY difference between groups.

Defining Variables and Measurement

  • Operational definition: Description of actions used to measure dependent variables and manipulate independent variables.

Avoiding Bias and the Placebo Effect

  • Experimenter bias: Researcher expectations skew results.
  • Participant bias: Participant expectations skew results.
  • Single-blind study: Participants are unaware of group assignments, but researchers know.
  • Double-blind study: Both researchers and participants are unaware of group assignments.
  • Placebo effect: Expectations influence experience.
    • Control groups receive a placebo treatment to differentiate between actual effects and expectancy.

Variables

  • Independent Variable: Controlled/manipulated by the experimenter. It should be the only important difference between groups.
  • Dependent Variable: Measured by the researcher to determine the impact of the independent variable.

Selecting Participants

  • Participants: Subjects of psychological research.
  • Population: The overall group of interest.
  • Sample: A subset selected from the population.
  • Random Sample: Everyone in the population has an equal chance of selection.
    • Preferred for representativeness (sex, ethnicity, socioeconomic status, etc.).
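
A minimal sketch of drawing a simple random sample (the participant IDs and sizes are hypothetical): every member of the population has an equal chance of being selected.

    import random

    # Hypothetical population of 1,000 people, identified by ID
    population = [f"P{i:04d}" for i in range(1, 1001)]

    # random.sample gives every ID an equal chance of selection, without replacement
    sample = random.sample(population, k=100)
    print(len(sample), sample[:5])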

Assigning Participants to Groups: Experimental or Control

  • Random Assignment: Participants have an equal chance of being assigned to either group.
    • Achieved through statistical software or coin flipping.
    • Prevents systematic differences between groups.
    • Necessary for determining true cause-and-effect relationships.
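
A minimal sketch of random assignment (participant IDs are hypothetical): after shuffling, each participant is equally likely to end up in the experimental or control group.

    import random

    participants = [f"P{i:03d}" for i in range(1, 41)]  # hypothetical sample of 40

    random.shuffle(participants)            # randomize the order
    experimental_group = participants[:20]  # first half -> experimental
    control_group = participants[20:]       # second half -> control
    print("Experimental:", experimental_group[:3], "... Control:", control_group[:3], "...")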

Issues to Consider When Manipulating Variables

  • Random assignment is essential for stating causation.
  • Quasi-experimental designs: Used when independent variables cannot be manipulated (e.g., sex).
    • Cause-and-effect cannot be determined.

Ethics

  • Some questions cannot be answered with experiments because the required manipulation would be unethical (e.g., deliberately exposing children to abuse to study its effect on self-esteem).
    • Requires other approaches like case studies or surveys.

Interpreting Experimental Findings

  • Statistical analysis: Determines the likelihood that differences between groups occurred by chance.
  • Results are considered statistically significant if the probability that they arose by chance is 5% or less (see the sketch after this list).
  • A well-designed true experiment makes it less likely that observed group differences are due to chance.
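
A minimal sketch of testing whether a group difference is statistically significant, assuming hypothetical dependent-variable scores and using an independent-samples t-test from SciPy.

    from scipy import stats

    # Hypothetical dependent-variable scores for each group
    experimental_scores = [78, 82, 85, 90, 74, 88, 81, 79, 86, 84]
    control_scores = [70, 75, 72, 68, 77, 74, 71, 73, 69, 76]

    t_stat, p_value = stats.ttest_ind(experimental_scores, control_scores)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

    # Conventional cutoff: significant if the chance probability is 5% or less
    if p_value <= 0.05:
        print("Group difference is statistically significant.")
    else:
        print("Group difference is not statistically significant.")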

Reporting Findings

  • Research is reported in peer-reviewed scientific journals to professionals/scholars.
  • Peer-reviewed journal article: Reviewed by experts who provide feedback before publication.
    • Weeds out poorly designed studies.
    • Improves articles and ensures clarity.
  • Replication: Determines the reliability of original research; can expand on original findings or cast doubt.

Bad Science & Retraction: The Vaccine-Autism Myth

  • The publication claiming that vaccines cause autism was retracted after its data were found to be falsified and financially motivated.
  • Subsequent large-scale research has found no link between vaccines and autism.

Reliability and Validity

  • Reliability: Consistency and reproducibility of results.
  • Inter-rater reliability: The degree of agreement among observers in how they record and classify events (see the sketch after this list).
  • Validity: The degree to which a tool actually measures what it is designed to measure.
    • A valid measure is always reliable, but a reliable measure is not always valid.
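
A minimal sketch of inter-rater reliability as simple percent agreement (the behavior codes below are invented); higher agreement between observers indicates a more reliable coding scheme.

    # Two observers independently code the same six observed behaviors (hypothetical data)
    rater_a = ["aggressive", "neutral", "neutral", "aggressive", "prosocial", "neutral"]
    rater_b = ["aggressive", "neutral", "prosocial", "aggressive", "prosocial", "neutral"]

    agreements = sum(a == b for a, b in zip(rater_a, rater_b))
    percent_agreement = 100 * agreements / len(rater_a)
    print(f"Inter-rater agreement: {percent_agreement:.0f}%")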

Ethics: Research Involving Human Participants

  • The Institutional Review Board (IRB) reviews research proposals involving human subjects.
  • Informed consent: Informing participants about expectations, risks, implications, and the right to withdraw, while ensuring data confidentiality.

Deception

  • Deception: Purposely misleading participants to maintain experiment integrity, provided there is no harm.
  • Debriefing: Providing complete information about the experiment at its conclusion.
  • Example of unethical research: The Tuskegee Syphilis Study, where participants were not informed or treated for syphilis.

Ethics: Research Involving Animal Subjects

  • The Institutional Animal Care and Use Committee (IACUC) reviews research proposals involving non-human animals.
  • About 90% of animal subjects are rodents and birds; animals are studied because many of their basic processes are similar enough to humans to allow generalization.
  • Animals are used when research would be unethical with human participants.
  • Researchers minimize pain and distress in animal subjects.