Field Methods in Psychology – Quick Reference Notes
Why Study Research Methods
- You encounter research claims daily; critical thinking helps separate science from hype or poor studies.
Ways of Knowing
- Intuition: gut feelings; can be biased and lead to illusory correlations.
- Authority: accepting claims because an expert or official source says so; beware experts without relevant background; peer review helps validate.
- Empiricism: knowledge from observation and measurement; the basis of science.
- Scientific Skepticism: ideas must be testable, falsifiable, and replicable; evidence matters.
The Scientific Approach
- Science follows a systematic path: Ask a question, Collect data, Analyze it, Draw conclusions, Share results for critique.
Systematic Approach: Key Steps
- Ask a Question: identify a phenomenon or issue to explore.
- Collect Data: use systematic methods; define variables clearly; ensure ethical data collection.
- Analyze It: organize data; look for patterns, correlations, themes; use statistics or thematic coding.
- Draw Conclusions: relate findings to the original question; acknowledge limitations and alternative explanations.
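The "Analyze It" step above can be sketched in code. This is a minimal illustration using invented numbers (the students, hours, and scores are hypothetical, not from a real study): summarize each variable, then quantify a pattern with a Pearson correlation, computed by hand from the standard formula.

```python
from statistics import mean, stdev

# Hypothetical data: self-reported study hours and exam scores for 8 students.
# (Illustrative numbers only, not drawn from a real study.)
study_hours = [2, 4, 1, 6, 3, 5, 7, 2]
exam_scores = [55, 68, 50, 80, 62, 74, 78, 58]

def pearson_r(x, y):
    """Pearson correlation: covariation of x and y, scaled to [-1, 1]."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# "Analyze It": describe each variable, then look for a relationship.
print(f"mean hours = {mean(study_hours):.2f}")
print(f"mean score = {mean(exam_scores):.2f}")
print(f"r = {pearson_r(study_hours, exam_scores):.2f}")
```

A high r here would support a predictive claim (hours forecast scores), but by itself it says nothing about causation; that distinction is developed below.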
Data-Driven, Replicable, Adversarial, Peer-Reviewed
- Data-Driven: conclusions rely on observable data.
- Replicable: others can repeat the study and obtain similar results.
- Adversarial: ideas compete; strongest evidence prevails.
- Peer-Reviewed: experts critique before publication; quality is scrutinized.
What is a Peer-Reviewed Journal Article?
- Abstract, introduction, methods, results, limitations, conclusions (format varies by discipline).
- Look for: author credentials, publisher, references, clear methodology, discipline-specific language.
- Goal: ensure rigor before dissemination.
How to Find & Read Peer-Reviewed Articles
- Check author credentials and affiliations.
- Verify the publisher (scholarly societies, university presses, major publishers).
- Review references to trace sources.
- Assess article structure and clarity.
- Language is discipline-specific; not always accessible to non-experts.
Watch Out for Pseudoscience
- Warning signs: cannot be tested or verified; vague/emotional language; relies on testimonials; lacks method.
- Examples: astrology, graphology, untestable claims.
- Real-world dangers: misleading claims can lead to harmful decisions.
More Pseudoscience Examples
- Facilitated communication: initial claims suggested autism communication via facilitator, but proper testing showed the messages came from the facilitator, not the child; led to serious real-world harms.
Goals of Scientific Research
- Descriptive (Description): describe patterns and contexts of behavior.
- Predictive (Prediction): identify patterns/correlations to forecast outcomes.
- Causal (Determination of Causality): determine whether X causes Y.
- Explanatory (Understanding/Explain): explain why phenomena occur; link to theory.
Descriptive Example
- Describing how many hours teens spend on TikTok, without explaining why the behavior occurs.
Prediction vs Causation
- Prediction: correlations help identify at-risk groups for interventions.
- Causation: beware that correlation does not equal causation; require evidence of causality.
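The "correlation does not equal causation" point can be demonstrated with a small simulation. In this hedged sketch, a hypothetical confounder z (say, stress) independently drives both x ("screen time") and y ("low mood"); x never causes y, yet the two end up strongly correlated:

```python
import random
from statistics import mean, stdev

random.seed(42)  # fixed seed so the simulation is reproducible

def pearson_r(x, y):
    """Pearson correlation of two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

n = 1000
z = [random.gauss(0, 1) for _ in range(n)]    # confounder (e.g., stress)
x = [zi + random.gauss(0, 0.5) for zi in z]   # "screen time": driven by z, not by y
y = [zi + random.gauss(0, 0.5) for zi in z]   # "low mood": driven by z, not by x

# Strong correlation appears even though x has zero causal effect on y.
print(f"r(x, y) = {pearson_r(x, y):.2f}")
```

The correlation is real and useful for prediction (identifying at-risk groups), but intervening on x would do nothing to y, which is exactly why causal claims need more than covariation.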
Determining Causality: The Gold Standard
- To prove causation, three criteria must be met:
  - Temporal Precedence: the cause comes before the effect.
  - Covariation: X and Y vary together.
  - No Alternative Explanations: nothing else explains the relationship.
- Example: the claim that violent video games cause aggression holds only if all three criteria are met and alternative explanations are ruled out.
Determining Causality in the Field
- In real-world settings, hard to control all variables; use quasi-experimental designs or longitudinal studies to infer causality.
- Example: compare communities with/without a peer-led reproductive health program and track results over time.
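One common way to analyze a comparison like the one above is a difference-in-differences estimate: subtract the control community's change over time (capturing shared trends) from the program community's change. This is a minimal sketch with invented outcome rates; the communities and numbers are hypothetical.

```python
# Hypothetical outcome rates (e.g., % reporting a target health behavior)
# before and after the program period. All numbers are invented.
program = {"pre": 40.0, "post": 55.0}   # community with the peer-led program
control = {"pre": 41.0, "post": 46.0}   # comparable community without it

# Difference-in-differences: the program community's change minus the
# control community's change, which nets out trends shared by both.
did = (program["post"] - program["pre"]) - (control["post"] - control["pre"])
print(f"Estimated program effect: {did:.1f} percentage points")
```

The estimate is only as good as the comparability of the communities; unlike a randomized experiment, it cannot rule out every alternative explanation, which is why such designs support weaker causal inferences.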
Why Explain Behavior?
- Explanation links observations to theory; investigates mechanisms (e.g., poverty affecting academic performance via resources, stress, or support).
- Consider multiple mechanisms (stigma, past trauma, cultural values) to understand why a behavior occurs.
Basic vs Applied Research
- Basic Research: seek foundational knowledge; aim to understand mechanisms and test theories.
- Applied Research: solve real-world problems; implement interventions and evaluate effectiveness.
Basic vs Applied: Examples and Purpose
- Basic: memory encoding mechanisms using fMRI; understanding attention, development, or emotion regulation.
- Applied: evaluating a school mental health program to reduce anxiety; testing a resilience-building intervention.
Interconnection of Basic and Applied
- Basic research provides theory; applied research tests and refines theories in real-world settings.
- Theoretical advances from basic research inform practical applications; successful applications raise new questions for basic research.
Quick Practice: Evaluation Questions (Eight Key Questions)
- What was the goal? (Description, Prediction, Causation, or Explanation)
- What method was used? (Survey, Experiment, Observation)
- What was measured? How were variables defined?
- Who were the participants? Can results be generalized?
- What were the findings? What do the data actually show?
- Have others found similar results? Was there replication?
- What are the limitations? Sample size, biases, ethical concerns?
- Was the study safe and respectful?
Final Reminders
- Always distinguish descriptive, predictive, causal, and explanatory aims.
- Consider the strength of evidence: data-driven, replication, peer-review, and potential biases.
- Use systematic questions to evaluate any research quickly during review or exam prep.