PSY 100: 9/4/25
Chapter 2 Notes: The Scientific Method in Psychology
Context and purpose
- Psychology framed itself as a science in the 19th century, after earlier philosophical disagreements (nature vs. nurture, determinism vs. free will, the mind–brain problem).
- Goal: develop a rigorous, testable science of psychology with shared methods, definitions, thresholds, and categories.
- Emphasis: psychology is not just about ideas; it requires evidence, testable methods, and replicable results.
- The instructor notes that the course surveys general psychology (only scratching the surface) and that exams emphasize scenario-based questions in addition to factual definitions.
Key historical and methodological points
- Early psychology faced consensus problems; turning psychology into a science required independent methodological standards (not just borrowing from physics or chemistry).
- Intersection with other sciences (biology, neuroscience) helps strengthen methods via measurable brain processes and biological substrates (neurotransmitters, EEG, MRI, fMRI).
- Advances in technology enable more objective data collection and the possibility to link mental processes to brain activity.
- Distinction between subjective and objective data is central to scientific rigor.
Subjective vs. Objective data
- Subjective data: based on opinions, self-reports, interpretations.
- Objective data: observable, measurable, and can be collected via instrumentation (e.g., heart rate variability, skin temperature).
- Examples discussed: asking participants about stress (subjective) vs. measuring physiological indices (objective).
- When data converge (self-report aligns with physiological measures), confidence in a finding increases.
Foundational elements of the scientific method in psychology
- Evidence-based practice: psychology seeks to build knowledge through repeatable observations and experiments.
- The need for rigorous, replicable methods across different environments and populations.
- The role of technology and standardized procedures in reducing confounds (e.g., keeping seating and room conditions constant, standardizing equipment).
- The concept of cross-disciplinary integration (neuroscience, biology) to strengthen explanatory power.
What makes psychology a science (summary of key ideas)
- Psychology studies human thoughts, emotions, and behavior, which are dynamic and context-sensitive.
- Researchers strive to create standardized methods so that findings can be reproduced in other labs and settings.
- Replicability is essential to establish robust, generalizable knowledge.
- Researchers aim to integrate findings with broader scientific knowledge (e.g., neural mechanisms).
Data collection and the nature of evidence
- Objective data examples: physiological measures (e.g., heart rate variability), biometric indicators, neuroimaging signals.
- Subjective data examples: self-reports, questionnaires, interviews.
- The goal is to combine objective measures with well-constructed subjective measures to strengthen inferences.
- When data from different sources align, conclusions are more credible; discrepancies require scrutiny of methods or theory.
The "Become a Psychologist" assignment and research realism
- Students may run a small data-collection project ("Become a Psychologist") to practice collecting data, analyzing results, and reporting findings.
- The assignment mirrors what full researchers do: literature review, hypothesis formation, methods design, data collection, results, interpretation, and discussion.
Important terminology and concepts (definitions and implications)
- Science (general): knowledge generation through careful observation and testing.
- Hypothesis: a clear, testable predictive statement about the relationship between variables.
- The IF-THEN structure: helps specify the relation between an independent variable and the expected dependent outcome.
- Evidence and proof: science seeks evidence that supports or refutes hypotheses; not absolute certainty.
- Variables:
- Independent variable (IV): the manipulated factor.
- Dependent variable (DV): the measured outcome.
- Confounds/controls: variables kept constant or accounted for to prevent alternative explanations.
- Literature review: situates new work within what is already known; helps refine hypotheses and methods.
- Replication: repeating a study to verify results; critical for establishing reliability.
- Replication crisis in psychology and the push for transparency and preregistration.
- Meta-analysis: statistical synthesis of many studies to estimate overall effect size when individual studies vary.
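The meta-analysis idea above can be sketched numerically. A common fixed-effect approach pools study effect sizes using inverse-variance weights; the sketch below is a simplified illustration, and the study numbers in it are made up:

```python
# Fixed-effect meta-analysis sketch: pool effect sizes with
# inverse-variance weights (more precise studies count more).

def pooled_effect(effects, variances):
    """Inverse-variance weighted mean of study effect sizes."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_variance = 1.0 / sum(weights)
    return pooled, pooled_variance

# Three hypothetical studies of the same effect (illustrative numbers):
effects = [0.30, 0.55, 0.45]      # standardized mean differences
variances = [0.04, 0.02, 0.08]    # sampling variance of each estimate

estimate, var = pooled_effect(effects, variances)
print(f"pooled effect = {estimate:.3f}, SE = {var ** 0.5:.3f}")
# → pooled effect = 0.464, SE = 0.107
```

Note how the second study, with the smallest variance, pulls the pooled estimate toward its value of 0.55.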
Hypothesis development and design considerations (core student activities)
- Hypotheses should be:
- Clear and predictive: state a concrete relationship between variables.
- Testable and measurable: specify how to test and what to measure.
- Often written with an if–then structure for explicit relationships between IV and DV.
- Specific about what is manipulated (e.g., duration, material type) and what is measured (e.g., test scores, response time).
- Example discussion prompts from the class:
- Hypothesis example:
"If listening to music for $t$ minutes while studying, then test scores will be higher than baseline."
- The need to specify the context (e.g., type of material, subject matter, test format) to avoid vague generalizations.
- Variables in design:
- IV could be music exposure (presence/absence, duration, type).
- DV could be performance on a test, memory recall, or another outcome.
- Additional variables to consider: age, gender, prior sleep, study environment, etc.
- Example study described in the lecture:
- A short nap-and-memory test design (12 female participants, aged 17–21): learn a list of words, take an immediate test, nap, then take a delayed test.
- Predicted outcome: better memory performance on the delayed test relative to the immediate test.
- Before running a study, researchers are urged to conduct a power analysis to determine the needed sample size to detect the expected effect with adequate power (often 0.8).
- Random assignment and control groups:
- Randomly assign participants to conditions (e.g., music vs no music) to balance individual differences.
- Use a control group to compare against the experimental manipulation.
- Blinding considerations (briefly touched): reduce bias by preventing participants or experimenters from knowing group assignments when possible.
- Potential issues in design:
- Small sample sizes can limit generalizability and statistical power.
- Demographic limitations (e.g., only female participants, narrow age range) limit broad applicability.
- Confounds (hormonal status, sleep quality, time of day) can influence results and need control or acknowledgement.
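The planning steps above (power analysis, then random assignment) can be sketched with standard formulas. This is a minimal illustration using the normal-approximation sample-size formula for a two-group mean comparison, not the class's actual procedure; the effect size of 0.5 is an assumed "medium" value:

```python
import math
import random
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.80):
    """Normal-approximation sample size for a two-group mean comparison:
    n per group ~= 2 * ((z_{1-alpha/2} + z_{power}) / d) ** 2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

def randomly_assign(participants, seed=None):
    """Shuffle participants and split them into two equal-sized conditions."""
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)
    half = len(pool) // 2
    return pool[:half], pool[half:]  # e.g., (music group, no-music group)

n = sample_size_per_group(effect_size=0.5)  # assumed medium effect
print(f"need about {n} participants per group")
music, no_music = randomly_assign(range(2 * n), seed=42)
print(len(music), len(no_music))
```

For a medium effect (0.5) at $\alpha = 0.05$ and power 0.8, this yields roughly 63 participants per group, which underscores why a 12-person study is underpowered for anything but large effects.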
Qualitative vs. quantitative methods (how data are collected and analyzed)
- Qualitative:
- Data: interviews, case studies, narrative descriptions.
- Analysis: descriptive, thematic; results are descriptive rather than numerical.
- Quantitative:
- Data: numerical measurements (e.g., test scores, reaction times).
- Analysis: statistical tests; outcomes include means, variances, p-values, effect sizes.
- Some studies may use a mix of both approaches, depending on the hypothesis and data availability.
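As a concrete instance of the quantitative side, Cohen's d (a standard effect-size measure mentioned with the statistical outcomes above) can be computed from two groups' scores. The score lists here are invented purely for illustration:

```python
from statistics import mean, stdev

def cohens_d(group1, group2):
    """Standardized mean difference using the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = stdev(group1), stdev(group2)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(group1) - mean(group2)) / pooled_sd

# Hypothetical test scores (made-up numbers, not real data):
music_scores = [78, 85, 90, 74, 88]
silence_scores = [70, 72, 81, 68, 79]

print(f"Cohen's d = {cohens_d(music_scores, silence_scores):.2f}")
```

An effect size complements the p-value: it says how large the group difference is in standard-deviation units, which is what meta-analyses later pool.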
How results are interpreted and reported
- After collecting data, researchers determine whether results support or refute the hypothesis.
- Replication is essential: results should hold across labs, populations, and contexts or be explained by context-specific factors.
- If findings are not replicable, this challenges the robustness of the claim and may indicate confounds, sampling biases, or random chance.
- Transparent reporting includes detailing the environment, equipment, participant characteristics, and procedural steps to enable replication.
- Reporting sections in psychology papers typically include: Introduction (literature review and hypotheses), Method, Results, Discussion.
- The discussion interprets findings, addresses limitations, compares with prior work, and suggests future directions.
Real-world critique and ethical/publication considerations
- The field recognizes publication bias toward positive findings; journals increasingly publish null results to improve transparency.
- The replication crisis has driven calls for preregistration, data sharing, and methodological rigor.
- Practical implications: findings should be considered within the context of sample, materials, environment, and population.
- Ethical considerations in data collection, consent, and reporting require careful attention to maintain integrity and trust.
A compact sample workflow from hypothesis to conclusion
- Step 1: Literature review to identify a gap and form a theoretical basis.
- Step 2: Develop a clear, testable hypothesis with an explicit IF–THEN structure.
- Step 3: Design the method with a plan for data collection, measurement, and analysis; choose between qualitative, quantitative, or mixed methods.
- Step 4: Collect data using controlled procedures; random assignment and control groups when possible.
- Step 5: Analyze data to determine whether results support the hypothesis; report effect sizes and statistical significance as applicable.
- Step 6: Interpret results in the discussion; consider limitations and alternative explanations; discuss replication and generalizability.
- Step 7: Consider broader implications and potential future research; share data and methods to support replication.
Mini-experiment demonstration and critical thinking exercise (class activity)
- An informal exercise explored posture and perception via a TED Talk reference on power posing and cortisol (Amy Cuddy).
- Takeaway: body posture can influence psychological states, which in turn can affect performance and perception; however, interpret with caution and rely on rigorous research when drawing conclusions.
Practical exam-oriented tips discussed in class
- Expect scenario-based questions requiring you to design or critique a study rather than only recalling definitions.
- Be prepared to specify hypotheses, variables, controls, sample size considerations, and analysis plans.
- Understand how to translate a general question (e.g., does listening to music help studying?) into a concrete, testable study design.
Notable numerical references and illustrative examples (with LaTeX formatting)
- Example 1: Demographic and design details from a nap-memory study:
- Participants: 12 female participants, aged 17 to 21.
- Design: learn list of words, immediate test, then nap, followed by delayed test.
- Prediction: delayed test performance would exceed immediate test performance.
- Example 2: Power analysis and sample size discussion:
- Concept: to detect a true effect with adequate power, researchers estimate the required sample size via power analysis (often targeting power $1 - \beta = 0.8$), given an anticipated effect size and significance level $\alpha$.
- Example 3: Hypothesis formatting and specificity:
- Initial form: If $X$, then $Y$.
- Refined form: specify the time, materials, population, and outcome; e.g.,
"If participants study with 20 minutes of music in the morning using material type A, then test scores on a standardized test will be higher than those without music, as measured by a multiple-choice test with discrete scoring."
- Example 4: Numerical and statistical planning language:
- E.g., “power analysis,” “sample size of $n$ per group,” “random assignment,” and “control vs. experimental groups,” all of which are essential for replicability and validity.
Connections to broader themes and real-world relevance
- The evolution from philosophy to science in psychology highlights the tension between intuitive explanations and evidence-based conclusions.
- The interplay between subjective experience and objective measurement remains a core methodological issue.
- With neuroimaging, physiological measures, and computational models, psychology increasingly integrates across disciplines to understand complex human behavior.
- Transparency, preregistration, and data sharing are growing practices intended to improve reproducibility and trust in psychological science.
Summary takeaways
- Psychology as a science rests on careful observation, testable hypotheses, controlled methods, and replicable results.
- Clear hypotheses (often IF–THEN) and explicit variables are essential for testability and replication.
- Researchers must balance qualitative and quantitative methods, depending on questions and data.
- Replication and meta-analytic approaches are critical for establishing robust, generalizable knowledge.
- Ethical, methodological, and contextual factors influence study design, interpretation, and the applicability of findings.
Quick references to exam-ready ideas
- Definition recap: science = knowledge through careful observation and testing; psychology uses evidence-based methods to study thoughts, emotions, and behavior.
- Distinguish IV (manipulated) vs. DV (measured); control confounds; rely on random assignment when possible.
- Understand why replication and transparency matter; be prepared to discuss how environmental or demographic factors can influence results.
- Be able to articulate a clear, testable hypothesis with an IF–THEN structure and specify the exact conditions, measures, and population.
Additional classroom notes
- SI (Supplemental Instruction) opportunities exist to reinforce material with small-group review (max 25 people).
- Paper 1: assigned to review instructions and format; expectation to build a short research report with hypotheses, methods, results, and discussion.
- The instructor emphasizes minimizing distractions during class (silence, no phones) for full engagement.
Ethical and practical implications highlighted in the discussion
- The field should be transparent about methods, environment, and participant characteristics to enable replication and fair interpretation.
- Negative or null results are valuable and should be published to avoid biased conclusions.
- Real-world applications require caution when generalizing from small or homogeneous samples; consider age, gender, and cultural factors.
Acknowledged limitations and caveats
- Small samples (e.g., 12 participants) limit generalizability.
- Demographic restrictions (e.g., only female participants, narrow age range) constrain applicability to broader populations.
- Hormonal status, sleep quality, and time of day can act as confounds and must be considered in interpretation and reporting.
Final word on learning approach
- Think like a scientist: define variables, formulate testable hypotheses, design rigorous methods, preempt confounds, and pursue replication and transparency as core practices of psychology.