Psychological Research Methods and Ethics

Critically Evaluating a Study

  • Identifying limitations/gaps in research allows curious researchers to address them.

Characteristics of Scientific Psychological Research

  • Psychologists aim to determine causes of mental events and behaviors (e.g., impact of divorce on children).

  • Approaches:

    • Quantitative: Statistical analysis of data from experiments/surveys.

    • Qualitative: In-depth analysis of data from interviews/observations/case studies.

    • Mixed Methods: Integrates both.

Attributes of Quality Research

  • Theoretical Framework: Grounded in theory.

    • Theories: Systematic ways of organizing/explaining observations.

    • Provides framework for hypothesis.

  • Hypothesis: Proposed relations between variables (cause-effect).

  • Variables: Phenomena that can take on more than one value.

    • Continuous Variables: Continuum of possible values (e.g., reaction time in seconds ranging from 0.5 to 5 seconds).

    • Categorical Variables: Fixed set of values (e.g., car make, biological sex).

Generalizability

  • Whether research results can be applied to the entire population of interest.

  • Population: Entire group a researcher is interested in (e.g., all people diagnosed with anorexia).

  • Samples: Smaller subset of the population studied to make inferences.

  • Representative Sampling: Similar enough to the population for conclusions to be true for the rest of the population.

  • Sampling Bias: When the sample is not representative.

Validity

  • A quality study requires both internal and external validity.

  • Internal Validity: Procedures of the study are sound, not flawed.

    • Compromised by unrepresentative samples or unstandardized design.

  • External Validity: Experimental conditions resemble real-world situations.

    • Strengthened by replicating results using different data collection procedures.

  • Balance: Must balance internal and external validity; tightly controlled experiments may not resemble real life.

Objectivity

  • Researcher is impartial.

  • Variables can be measured objectively.

Measurement

  • Measure: Concrete means to determine the value of a variable.

    • Illness: Number of clinic visits.

    • Hunger: Time without food.

    • Aggression: Amount of electric shock delivered.

  • Reliability and Validity: Variable measure must be reliable and valid.

    • Reliable Measures: Produce consistent measurements (e.g., not a rubber ruler).

    • Valid Measures: Actually measure the variable of interest (e.g., intelligence ≠ knitting speed).

Assessing Reliability
  • Test-Retest Reliability: Test yields similar scores for the same individual over time.

  • Internal Consistency: Different ways of asking the same question yield similar results.

  • Inter-rater Reliability: Two different testers rate the same person similarly.
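Internal consistency is commonly quantified with Cronbach's alpha (not named in these notes, but a standard statistic for this purpose). A minimal sketch with hypothetical questionnaire data:

```python
from statistics import pvariance

# Hypothetical data: rows are participants, columns are three
# questionnaire items intended to measure the same trait.
scores = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [1, 2, 1],
    [3, 3, 3],
]

k = len(scores[0])                     # number of items
items = list(zip(*scores))             # scores grouped per item
totals = [sum(row) for row in scores]  # each participant's total

# Cronbach's alpha: values near 1 mean the items "hang together",
# i.e., different ways of asking yield similar results.
item_var = sum(pvariance(i) for i in items)
alpha = (k / (k - 1)) * (1 - item_var / pvariance(totals))
print(f"alpha = {alpha:.2f}")
```

Here the three items rise and fall together across participants, so alpha comes out high; unrelated items would push it toward zero.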

Validity of Psychological Measures
  • Measure's ability to assess the variable it is supposed to assess.

  • Validation Research: Relates the measure to an objective criterion or measures with demonstrated validity.

    • Example: Depression measure predicting suicide risk.

  • Predictive Validity: A valid measure can predict other theoretically related variables.

Test Bias
  • Tests are biased if the mean scores of different groups differ systematically and the test scores make incorrect predictions in real life (e.g., Westernized IQ tests applied to Indigenous people).

  • Mitigation: Using multiple measures.

Revision Quiz

  • Question 1: A systematic way of organizing and explaining observations.

    • Answer: A theory.

  • Question 2: A sample is defined as a subgroup…

    • Answer: of the population that is likely to be representative of the population as a whole.

  • Question 3: A measure is internally consistent if…

    • Answer: Several ways of asking the same question yield similar results.

Summary of Good Psychological Research

  • Grounded in theory.

  • Representative of the population.

  • Generalizable.

  • Reliable.

  • Valid.

Research Methods

  • Experimental, Descriptive, Correlational, and Qualitative Research.

  • Psychology uses a scientific approach.

  • Goals: Description, prediction, and understanding.

    • Description: Summarizes relationships between variables.

    • Prediction: Anticipates future events.

    • Understanding: Identifies causes of phenomena.

Research Method Types

  • Descriptive: Describes behavior.

  • Correlational: Predicts behavior.

  • Experimental: Establishes causes of behavior.

Experimental Research

  • Asks whether systematic variation in one variable causes variation in another.

  • Aims to assess cause-and-effect relationships.

  • Independent Variable (IV): Manipulated by the experimenter.

  • Dependent Variable (DV): Response of the participants.

    • Example: THC exposure and appetite.

      • IV: THC dose (0 or 2.5 mg).

      • DV: Amount of food consumed in 30 minutes.

Control Groups
  • Similar to the experimental group but not exposed to the treatment (e.g., placebo).

Steps in Conducting Experiments
  1. Framing the Hypothesis: Predicts the relationship between variables (derived from a theory).

  2. Operationalizing the Variables: Converting abstract concepts to testable forms.

  3. Developing a Standardized Procedure: Setting up experimental/control conditions, attending to demand characteristics and researcher bias.

    • Demand Characteristics: Participants altering their responses to match what they think the experimenter wants.

    • Blind Studies: Participants (and sometimes researchers) are unaware of important aspects to prevent bias.

    • Double-Blind Studies: Minimize expectancy effects from participants and experimenters; neither knows the treatment.

  4. Selecting and Assigning Participants:

    • Random Assignment: Minimizes systematic differences between groups (age, gender).

  5. Applying Statistical Techniques to the Data:

    • Descriptive Statistics: Describing the data.

    • Inferential Statistics: Determining whether observed differences reflect a real effect or chance.

  6. Drawing Conclusions:

    • Evaluating whether hypotheses are supported (IV and DV are related as predicted).

    • Interpreting findings in the broader theoretical framework.

    • Assessing generalizability.
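Steps 5 and 6 can be illustrated with made-up data for the THC example above: descriptive statistics summarize each group, and a simple permutation test (one of many inferential techniques) estimates how often a difference this large would arise by chance.

```python
import random
from statistics import mean

random.seed(0)

# Made-up data for the THC example: grams of food consumed
# in 30 minutes by each participant.
treatment = [210, 240, 225, 260, 235, 250]  # 2.5 mg THC
control = [180, 200, 195, 170, 190, 185]    # placebo (0 mg)

# Step 5: descriptive statistics summarize the groups.
obs_diff = mean(treatment) - mean(control)

# Step 5: inferential statistics. A permutation test reshuffles
# participants into arbitrary groups to see how often a difference
# as large as the observed one arises by chance alone.
pooled = treatment + control
extreme = 0
n_perms = 10_000
for _ in range(n_perms):
    random.shuffle(pooled)
    diff = mean(pooled[:6]) - mean(pooled[6:])
    if abs(diff) >= abs(obs_diff):
        extreme += 1
p_value = extreme / n_perms

# Step 6: a small p-value supports the hypothesis that the IV
# (THC dose) and DV (food consumed) are related as predicted.
print(f"mean difference = {obs_diff:.1f} g, p = {p_value:.4f}")
```

The numbers are hypothetical; the point is the division of labor between describing the data and deciding whether the pattern is more than chance.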

Advantages of Experiments

  • Cleanest findings; can determine cause and effect.

  • Replicable.

Limitations of Experiments

  • Complex real-world issues are hard to study in the lab.

  • IV cannot always be manipulated (e.g., divorce).

  • Quasi-Experimental Designs: Do not allow random assignment; common in psychology.

    • Example: Comparing intellectual ability of children of divorced vs. together parents.

  • External Validity: Uncertainty about how well lab findings parallel real life.

Descriptive Approach

  • Seeks to describe phenomena rather than manipulate variables.

  • Does not demonstrate cause and effect.

  • Methods: Case studies, naturalistic observations, survey research.

Case Studies
  • In-depth study of behavior in one person or a small group.

  • Used when large numbers are unavailable; common in clinical research.

  • Drawbacks:

    • Small sample size (limits generalizability).

    • Susceptible to researcher/observer bias.

  • Usefulness: Generating hypotheses before quantitative studies and exploring findings in depth after them.

Naturalistic Observation
  • In-depth study of phenomena in their natural setting.

  • Examples: Primate behavior in the wild.

  • Advantage: Good generalizability.

  • Disadvantages:

    • Observation can alter behavior.

    • The technique cannot establish the cause of behavior.

Survey Research
  • Asks questions of large numbers to gain information on attitudes and behavior.

  • Used to describe behavior and make arguments about influencing variables.

  • Does not demonstrate cause and effect.

  • Two approaches: Interviews and questionnaires.

  • Limitations:

    • People may respond inaccurately.

    • Sampling issues.

      • Random Sample: Selected from general population without systematic bias.

      • Stratified Random Sample: Specifies the percentage from each population category.
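A sketch of the distinction, using hypothetical population records; the stratified version draws the same fraction from each age category, so the sample mirrors the population's composition:

```python
import random

random.seed(1)

# Hypothetical population: (respondent_id, age_group) records.
population = (
    [(i, "18-34") for i in range(600)]
    + [(i, "35-54") for i in range(600, 900)]
    + [(i, "55+") for i in range(900, 1000)]
)

def stratified_sample(records, key, fraction):
    """Draw the same fraction at random from each category."""
    strata = {}
    for rec in records:
        strata.setdefault(key(rec), []).append(rec)
    sample = []
    for members in strata.values():
        sample.extend(random.sample(members, round(len(members) * fraction)))
    return sample

# A 10% stratified sample keeps the population's age proportions:
# 60 from "18-34", 30 from "35-54", 10 from "55+".
sample = stratified_sample(population, key=lambda r: r[1], fraction=0.1)
print(len(sample))  # 100
```

A plain random sample of 100 would only approximate these proportions; stratifying guarantees them.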

Correlational Approach

  • Aims to determine the degree to which variables are related.

  • Assesses the extent to which being high/low on one measure predicts being high/low on the other (e.g., weight and height).

  • Can determine mathematical association between data from experiments, case studies, or surveys.

  • Does not establish causality.

    • Correlation does not equal causation.

  • Correlation Coefficient: Measures the extent to which two variables are related (positive or negative).

    • Range: -1 to +1.

    • Strong correlation: Close to +1 or -1.

  • Scatterplot Graphs: Show the scores of every participant along two dimensions.
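A minimal sketch of the correlation coefficient (Pearson's r), using hypothetical height and weight measurements for the example above:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Correlation coefficient: ranges from -1 to +1."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical height (cm) and weight (kg) measurements.
height = [150, 160, 165, 172, 180, 188]
weight = [52, 58, 63, 70, 77, 84]

r = pearson_r(height, weight)
print(f"r = {r:.2f}")  # close to +1: taller predicts heavier
```

A strong r lets one variable predict the other, but it says nothing about which (if either) causes which.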

Qualitative Research

  • Studying phenomena in natural settings, drawing meaning from people's words, language, and action.

  • Involves in-depth analysis of relatively few participants (e.g., 30 participants), yielding richer data and deeper understanding.

  • Multiple orientations; no single way to approach/analyze qualitative research.

  • The role of the researcher is acknowledged.

Key Theoretical Perspectives
  • Positivism: Universal truth can be discovered through systematic observation and objective research/measurement.

  • Interpretivism: Social worlds are subjective and constructed by individuals.

  • Critical Theory: Focuses on examining the role of power and how it's expressed in our society.

Epistemological Approaches
  • Objectivism: Phenomena exist independent of our beliefs and consciousness.

  • Constructivism: Meaning of phenomena is the result of social and intellectual constructions.

  • Subjectivism: Meaning is assigned to phenomena by the observer.

Revision Quiz

  • Question 1: In experimental research, the dependent variable is the…

    • Answer: variable that is measured (the participants' response).

  • Question 2: Case studies are…

    • Answer: subject to more researcher bias than other approaches.

  • Question 3: A stratified random sample reflects…

    • Answer: the proportion drawn from each population category.

Modern Technology and Psychological Research

Advances

  • EEG (Electroencephalogram)

  • CAT Scan (Computerized Axial Tomography)

  • MRI (Magnetic Resonance Imaging)

  • PET (Positron Emission Tomography)

  • fMRI (Functional Magnetic Resonance Imaging)

    • Example: fMRI used to study brain activity during mental imagery.

Internet Research

Positive opportunities
  • Easy access to research and data.

  • Participants can be recruited easily, including from specific demographic strata.

  • Cost and efficiency benefits over paper surveys.

  • Access, automation, and easy data processing.

  • Participants respond in their own environment, unaffected by the researcher's presence.

Challenges
  • Accuracy of information must be scrutinized carefully.

  • Connectivity issues may exclude participants, leading to sampling issues.

  • Uncontrolled data collection may make it hard to verify meaningful completion.

  • Ensuring participant welfare is difficult when responses are anonymous.

Ethics and conduct
  • Confidentiality

  • Debriefing procedures are harder to carry out online.

  • Digital data collection is more efficient for researchers, but it also makes data easier to share.

  • Maintaining participants' anonymity and keeping results confidential.

Ethical Guidelines (Australian Psychological Society)

  • Respect rights and dignity of participants.

  • Act with professionalism and integrity.

  • Informed Consent: Participants understand the purpose and expectations.

  • Voluntary Participation: No excessive rewards or forced participation.

  • Confidentiality: Information not made public without consent.

  • Deception: Only when necessary; must inform participants afterward.

  • Ethical Approval: Required when conducting research with people.

  • Animal Research: Benefits must outweigh costs.

Critical Evaluation of a Study

Questions:
  1. Does the theoretical framework make sense?

  2. Does the hypothesis flow logically from the theory?

  3. Is the sample adequate and appropriate?

  4. Are the measures and procedures adequate?

  5. Are the data conclusive?

  6. Are the broader conclusions warranted?

  7. Does this study say anything meaningful?

  8. Is the study ethical?

Critical Thinking

Principles:
  • Skepticism: Questioning results.

  • Objectivity: Being impartial; putting aside biases.

  • Open-mindedness: Considering all sides.

Fallacies in Arguments
  1. Straw Man: Misrepresenting the opposing argument and attacking the distorted version to strengthen your own (without evidence).

  2. Appeals to Popularity: Claiming a widespread argument must be true (without evidence).

  3. Appeals to Authority: Claiming an argument is true because of the authority of the person making it (without evidence).

  4. Arguments Directed to the Person: Attacking the authors of the opposing argument based on their supposed failings (instead of evidence).

Revision Quiz

  • Question 1: Which of the following are challenges faced by internet research?

    • Answer: All of the above

  • Question 2: Informed consent involves…

    • Answer: the participant's informed agreement to take part in the study, having understood its purpose and expectations.

  • Question 3: The three principles underlying critical thinking are…

    • Answer: Skepticism, objectivity and open-mindedness: skepticism involves questioning and analyzing results; objectivity refers to an impartial approach; open-mindedness means considering all sides of an issue.