Experimental Study
Manipulates independent variable to determine causation; Example: Assigning therapy vs. control to test depression outcomes
Observational Study
No manipulation; examines associations; Example: Studying relationship between trauma and anxiety
Case-Control Design
Compares groups with vs. without an outcome; Example: Comparing childhood trauma rates between a PTSD group and a non-PTSD group
Cohort Design
Follows one or more groups over time; Example: Tracking adolescents exposed to violence
Cross-Sectional Design
Measures all variables at one time; Example: Measuring stress and sleep today
Retrospective Design
Looks back at past variables; Example: Adults recalling childhood adversity
Case-Control Strength
Efficient for studying rare disorders; Example: Schizophrenia risk factors
Case-Control Weakness
Recall bias and no causation; Example: Misremembered childhood trauma
Cohort Strength
Establishes temporal order; Example: Trauma occurring before depression
Cohort Weakness
Attrition over time; Example: Participants dropping out of longitudinal study
Single-Group Cohort
One group followed over time; Example: Trauma survivors tracked longitudinally
Multi-Group Cohort
Compares exposed vs. non-exposed groups; Example: Violence-exposed vs. non-exposed youth
Accelerated Cohort Design
Multiple age groups studied simultaneously; Example: Ages 10, 15, 20 followed together
Birth Cohort
Group born at same time followed over lifespan; Example: 2000 birth cohort study
Construct Specification Issue
Difficulty defining variables clearly; Example: Defining “abuse”
Group Selection Issue
Groups differ on confounding variables; Example: SES differences
Causality in Observational Studies
Cannot establish causation, only associations; Example: Trauma linked to PTSD but not causal
Construct Validity
Measure reflects intended construct; Example: Depression scale measures depression not anxiety
Reliability
Consistency of measurement; Example: Same score over time
Validity
Accuracy of measurement; Example: Test measures what it claims
Content Validity
Measure covers full construct; Example: Depression scale includes mood and sleep
Criterion Validity
Measure relates to outcome; Example: Test predicts diagnosis
Concurrent Validity
Correlates with current measure; Example: New scale matches existing one
Predictive Validity
Predicts future outcome; Example: SAT predicting GPA
Test-Retest Reliability
Stability over time; Example: Same score after two weeks
Inter-Rater Reliability
Agreement between raters; Example: Two clinicians give same rating
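For categorical ratings, inter-rater agreement is often reported as Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch with invented ratings from two hypothetical clinicians:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters on nominal categories, beyond chance."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of each category's marginal proportions
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Two clinicians rating ten cases (hypothetical data)
a = ["mild", "mild", "severe", "mild", "severe",
     "mild", "mild", "severe", "mild", "severe"]
b = ["mild", "severe", "severe", "mild", "severe",
     "mild", "mild", "mild", "mild", "severe"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

Here raw agreement is 80%, but kappa is lower because both raters use "mild" often, so some agreement would occur by chance.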
Internal Consistency
Items correlate within test; Example: Cronbach’s alpha
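Cronbach's alpha summarizes how strongly a test's items covary: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with hypothetical Likert responses:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of scores per item, each of length n_respondents."""
    k = len(items)
    item_vars = sum(pvariance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per person
    total_var = pvariance(totals)
    return k / (k - 1) * (1 - item_vars / total_var)

# Three items answered by four respondents (hypothetical data)
items = [
    [2, 4, 3, 5],
    [3, 4, 3, 5],
    [2, 5, 4, 4],
]
print(f"alpha = {cronbach_alpha(items):.2f}")
```

Values of roughly .70 and above are conventionally taken as acceptable internal consistency.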
Sensitivity
Detects small changes; Example: Tracking therapy improvement
Standardized Measure (Pro)
Validated and comparable across studies; Example: Beck Depression Inventory
Standardized Measure (Con)
May not fit all populations; Example: Cultural mismatch
Modify Existing Measure
Adapt measure for new population; Example: Adult scale modified for adolescents
Develop New Measure
Create when none exists; Example: New IPV beliefs scale
Global Rating
Overall judgment of functioning; Example: Clinician rates severity
Self-Report Inventory
Participant answers questions; Example: Depression questionnaire
Projective Technique
Interprets ambiguous stimuli; Example: Rorschach test
Direct Observation
Observes behavior directly; Example: Parent-child interaction
Psychobiological Measure
Biological indicators; Example: Cortisol levels
Computerized Assessment
Digital testing methods; Example: Online surveys
Obtrusive Measure
Participants aware of being measured; Example: Acting differently when observed
Reactivity Problem
Behavior changes due to observation; Example: Participant acts nicer
Solution to Reactivity
Use unobtrusive measures or disguise purpose; Example: Hidden observation
Unobtrusive Measure
Participants unaware of measurement; Example: Archival records
Simple Observation (Pro)
Behavior is natural; Example: Watching playground behavior
Simple Observation (Con)
Little control over variables
Contrived Observation (Pro)
High control; Example: Lab interaction task
Contrived Observation (Con)
Artificial environment
Archival Records (Pro)
Easy access to existing data; Example: Medical records
Archival Records (Con)
Data may be incomplete or biased
Physical Traces
Indirect evidence of behavior; Example: Wear patterns on floor
Converging Evidence
Multiple measures agree; Example: Self-report and observation both show improvement
Incremental Validity
New measure adds unique information
Practicality
Time and cost considerations
Inconsistent Results Reason
Measures assess different constructs or error; Example: Anxiety vs. depression measures differ
Manipulation Check
Tests whether IV worked; Example: Mood induction increases sadness
IV Works + DV Changes
Strong support for hypothesis
IV Works + No DV Change
Theory not supported
IV Fails + DV Changes
Confound present
IV Fails + No DV Change
Inconclusive results
Avoid Sensitization
Use subtle or indirect checks
Exclude Participants
Usually not recommended due to validity concerns
Pilot Testing
Test manipulation effectiveness before study
Clinical Significance
Real-world meaningful change; Example: No longer meets diagnosis
Compare to Norms
Compare to healthy population averages
Dysfunctional Comparison (Pro)
Clear benchmark for improvement
Dysfunctional Comparison (Con)
Limited generalizability
Diagnostic Change (Pro)
Clinically meaningful outcome
Diagnostic Change (Con)
May be too strict
Social Impact
Effect on daily functioning; Example: Teacher reports improvement
Scope/Breadth
Change across settings; Example: Home and school
Disseminability
Ease of spreading treatment to others
Cost-Benefit Analysis
Weighs effectiveness vs. cost
Acceptability
Whether treatment is acceptable to clients
Traditional Case Study
Generates hypotheses; Example: Rare disorder case
Single-Case Design Requirements
Repeated measures, baseline, and comparison
ABAB Design
Treatment introduced and withdrawn to test effects
Multiple Baseline Design
Treatment staggered across subjects or behaviors
Changing Criterion Design
Gradual stepwise behavior change; Example: Reducing smoking
Changing Criterion Problem
External factors may influence behavior
Evaluation of Data
Visual inspection of trend, level, variability
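Visual inspection of single-case data can be supported by simple phase summaries: comparing the level (mean) and variability (range) of each phase. A minimal sketch for a hypothetical ABAB dataset (counts invented for illustration):

```python
from statistics import mean

# Hypothetical daily problem-behavior counts across ABAB phases
phases = {
    "A1 (baseline)":   [8, 9, 7, 8],
    "B1 (treatment)":  [4, 3, 3, 2],
    "A2 (withdrawal)": [7, 8, 8, 9],
    "B2 (treatment)":  [3, 2, 2, 1],
}
for name, data in phases.items():
    print(f"{name}: level = {mean(data):.1f}, range = {max(data) - min(data)}")
```

A drop in level during each B phase and a return toward baseline during A2 is the reversal pattern that supports a treatment effect.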
Qualitative Research Goal
Understand lived experiences
Ethnography
Study of culture; Example: Community norms
Phenomenology
Study of lived experience; Example: Trauma experience
Grounded Theory
Develop theory from data
Qualitative Characteristics
Flexible, narrative, small samples
Qualitative vs Anecdotal
Systematic and rigorous vs informal
Qual vs Quantitative
In-depth narrative understanding vs. numerical data and statistical generalization
Qualitative Pros
Rich, detailed data
Qualitative Cons
Subjective interpretation
Descriptive Validity
Accuracy of observations
Interpretive Validity
Accuracy of meaning
Theoretical Validity
Fit with theory
Internal Validity (Qual)
Coherence of findings
External Validity (Qual)
Generalizability
Triangulation
Use of multiple data sources
Credibility
Believability of findings
Confirmability
Objectivity of findings
Transferability
Applicability to other contexts
Mixed Methods Research
Combines qualitative and quantitative approaches
Deception
Misleading participants about study purpose; Example: Hidden hypothesis
Debriefing
Explaining study after participation