Field Methods in Psychology – Chapter 1 Essentials

Why Study Research Methods

  • News and social media constantly cite “scientific” claims; research methods help you judge their credibility.

  • Key questions to ask: Who were the participants? What method? Peer-reviewed? Generalizable?

Ways of Knowing

  • Intuition: relying on gut feelings; prone to cognitive biases (e.g., illusory correlations).

  • Authority: accepting claims from experts or influencers; can be wrong when unsupported by evidence.

  • Empiricism: knowledge via systematic observation & measurement.

  • Scientific Skepticism: requires testable, replicable evidence before acceptance.

Scientific Approach (Process)

  • Ask a focused question.

  • Collect data with clear variables & ethical procedures.

  • Analyze via statistics or thematic coding.

  • Draw conclusions, note limits, consider alternatives.

  • Share for peer critique & replication.

Features of Scientific Thinking

  • Data-driven: claims need observable evidence.

  • Replicable: methods detailed so others can repeat.

  • Adversarial: ideas compete; only strongest evidence survives.

  • Peer-reviewed: experts vet work before publication.

Spotting Pseudoscience

  • Claims not testable / unverifiable.

  • Relies on testimonials, vague language.

  • Ignores method details & peer review (e.g., astrology, graphology, facilitated communication).

Goals of Scientific Research

  • Description: what is happening? (e.g., how many hours students spend on TikTok).

  • Prediction: identify correlations; X → Y likely (not causal).

  • Causation: show X causes Y via:

    1. Temporal precedence

    2. Covariation

    3. No alternative explanations

  • Explanation: uncover underlying mechanisms (the “why”).

Basic vs Applied Research

  • Basic: tests theories, seeks general knowledge (e.g., fMRI on memory encoding).

  • Applied: solves real problems, evaluates interventions (e.g., school mental-health program).

  • Interconnected: basic provides foundations; applied refines theories in context.

Eight Key Questions When Evaluating a Study

  1. Research goal: description, prediction, causation, explanation?

  2. Method: survey, experiment, observation, etc.?

  3. Measures: how were variables operationalized?

  4. Participants: who & how selected? Generalizable?

  5. Findings: what do data show?

  6. Replication: confirmed by others?

  7. Limitations: biases, sample size, confounds?

  8. Ethics: safe & respectful procedures?

Quick Reminder

  • Critical thinking in research = asking for evidence, methods, and peer validation before believing any claim.