Chapter 1:
People watching: why we do what we do
Curiosity about human behavior
Research is used in lots of different fields:
Ask yourself: what's the evidence??
Use research to come up with clinically supported treatments, so we know that treatments work and can help
Three approaches by which we learn about the world:
Intuition: “gut feeling” to draw conclusions about the world
Main problem: biased (everyone is, but it's still an issue :)
Authority/prestige: when someone we respect/trust tells us something, we perceive it as true
Could be biased; we don't know that it's true
Faith: gain knowledge and experience through spiritual intuition, using faith to create scientific inquiry
Sometimes accurate, sometimes not
Scientific approach: skeptical: train your mind to be skeptical
Based on logic
Grew out of the tradition of empiricism
Empiricism: accurate information can only be obtained through direct and verifiable observation
If you can't observe it, you can't verify that it's true
Both a process and a worldview
Characteristics of science (what makes it a more accurate way to observe human behavior):
Skeptical: looking for evidence before accepting it
Empirical: based in direct observation, needs to be testable or falsifiable
Collaborative: when research scientists find something, they share their findings and allow other scientists to collaborate and build on them
Adversarial: theories compete, multiple theories of the same thing, all fighting to be accurate
Example: theories of obesity: the calories-in/calories-out theory (more calories in than out) vs. the hormonal theory (the calories don't simply balance out; hormonal effects make people obese over time)
Humble: scientists are humble: when theories are refuted, scientists must shift gears and try another way
Peer-reviewed: before research finding is shared and published, it is evaluated by a group of your peers.
To make sure high quality research is out in the world
Pseudoscience: the use of seemingly scientific terms for claims that have no real evidence based on empirical observation
Where do we often see it/hear it?
TikTok, YouTube
Signs an assertion is not scientific:
Not testable/ falsifiable: if not testable, they can't be refuted
Vague, biased or extreme language: “always” or “for everyone”
Based on anecdote / testimony: worked for one person, works for everyone
Claims made by “experts” who:
Vague credentials
Outside of the scope of expertise
Conflicts of interest: someone who's making a claim about a product they can make money from
5. Ignores conflicting evidence: only talking about evidence that supports what they have to say, not about what negates it
6. Cannot be independently verified: a one-off finding that no one else can replicate
Be skeptical of everything you hear/ read!!
Being a skilled consumer of research:
Trust depends on the type and quality of the design
Be wary when conclusions are not supportable by the design
Type of design: different types of designs allow you to draw different types of conclusions
Ex: people try to draw causal conclusions from correlational evidence
Quality of design:
e.g., a really small sample size, or weak measures used to quantify the data
C. Questions to ask when reading and evaluating a study:
What was the primary goal of the study?
The goal of the study determines the research design
What was the research method, and did the method used allow the researcher to meet their goal?
If a researcher is trying to find causation, they need to do a true experiment
What was measured and how was it being measured?
Identify key dependent variables
To what/ whom can we generalize the findings?
Can't generalize the findings farther than the people who were studied
What were the results? Do the results support or refute the goal of the study?
Have other researchers found similar results?
What are the limitations of the study?
Discussion part of the study → talks about limitations
Did the study have any ethical issues? If so, how were they addressed by the researcher?
D. Be aware of the possibility of bias: people may see what they want to see
Even scientists are biased
The research process can be geared toward confirming the hypothesis
Censorship by journal editors
The socio-political environment influences what research questions are asked and what research findings are published.
Overgeneralizing from a sample (e.g., conclusions that only apply to men)
Failure to rule out alternative explanations
Inaccurately inferring causation
The logic of the scientific method:
Inductive approach
Bottom-up: start with data, then we move up to theory
Data driven approach that starts with data and leads to theory building
Data → theory
B. Deductive approach
Top-down
Start with existing theory → work your way down to collecting data
Theory → data
The goals of science: correspond to different research techniques
Description: you're trying to describe a phenomenon
Where, when, and how much it occurs
Purpose
Directly observable vs. latent variables
Directly observable: Reaction time, frequency of sound
Latent variables: hunger, depression: you know they are there, but they are hypothetical/not directly observable
Ex: Nagata et al.: studied social media use among adolescents during the COVID pandemic
Asked over 5,000 adolescents to report and describe the amount of non-schoolwork screen time they used during the day
Broke down the data by race and ethnicity
Research methods used for description
Naturalistic observation: go into a situation (the wild) and observe people
Surveys: Questionnaires/ interview can be done with pen and paper or online
Case studies: an in-depth look at an individual, family, or company, examining all factors and influences on that one case
Correlation and prediction:
Purpose #1: Determine if two variables are related, are they correlated?
Purpose #2: If variables are related, we can use our knowledge of one variable to make predictions about another variable
Research methods used for correlation and prediction:
Correlational study: measures two or more variables, explore the relationship between the variables to see if they are related
Only dependent variables
Regression analysis: create a prediction equation
When two variables are correlated, we can use a statistical equation to predict a score on one variable from a score on the other (see the sketch below)
Ex: guessing weight just by knowing height, or using SAT scores to predict what your college GPA would be
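A minimal sketch of what a prediction equation looks like in practice, in Python. The SAT scores and GPAs below are made up purely for illustration (not from any study), and the fitted line is just ordinary least squares:

```python
# Minimal sketch of a regression "prediction equation" (hypothetical data).
import numpy as np

sat = np.array([900, 1050, 1100, 1200, 1300, 1400, 1500])  # predictor (X), made-up scores
gpa = np.array([2.4, 2.8, 2.9, 3.1, 3.3, 3.6, 3.8])        # criterion (Y), made-up GPAs

# Fit a straight line: predicted GPA = slope * SAT + intercept
slope, intercept = np.polyfit(sat, gpa, deg=1)

def predict_gpa(sat_score: float) -> float:
    """Use the prediction equation to estimate a GPA from an SAT score."""
    return slope * sat_score + intercept

print(f"Prediction equation: GPA = {slope:.4f} * SAT + {intercept:.2f}")
print(f"Predicted GPA for an SAT of 1250: {predict_gpa(1250):.2f}")
```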
Quasi-experiments (e.g., judging defendants based on attractiveness): relationships between participant characteristics and people's behavior
You don't have as much control as you do with a true experiment
You are using a characteristic of your participants to determine the groups instead of a treatment
Determine groups by characteristic of the participants: coming preassigned into groups
Ex: gender, age, socio-economic status
Correlation doesn't imply causation!!!
Causation:
Purpose
Experiment is the only research method that can establish causation
Criteria for establishing causation
Covariation: as one variable changes, so does the other one
The treatment was playing violent video games, and the behavior studied was aggression: violent video game play covaries with aggressive acts
Temporal precedence: the change in one variable precedes the change in the other; you need this to see which variable is the cause and which is the effect
Playing the violent video games precedes the aggressive behavior, meaning the aggressive behavior has to be the effect
Ruling out alternative explanations: have to control for all variables that also might impact behavior, manipulate only the one that we think is the true cause
Other variables that influence aggression to rule out besides video games: gender, amount of parental supervision
Explanation: involves explaining why something occurs
Ex: children of alcoholics, some become alcoholics while others don’t: what's the difference?
Very difficult to truly explain human behavior, thoughts, and emotions
Behavior has multiple causes
No human behavior has just one cause!!
What might be a cause in one person, would not be a cause in another person!!
Types of research: Basic vs. Applied:
Basic Research
Science for science’s sake
Ask and answer questions about behavior
Understanding theoretical issues
Ex: Marian & Neisser (2000) tested the recall of personal memories in bilingual people (who spoke Russian and English)
They found that people remembered more about their lives in Russian when they spoke Russian and were interviewed in Russian (same for English)
Demonstrates language-dependent recall: how language cues memory retrieval
More examples of basic research in the textbook
B. Applied research
Science for finding solutions to practical problems
Ex: Wells & Perrine (2001) studied the effects of bringing pets to work
They found that people who brought pets to work were less stressed
They found a solution to lower the stress levels at work
Program evaluation: a specific type of applied research that evaluates the effectiveness of programs: any and all programs
1998: Broski and Terney: evaluated the effectiveness of the Big Brothers/Big Sisters program among 10-16 year olds: it helped disadvantaged kids by giving them a friend or someone older to rely on
The younger children who participated were less likely to skip school, do drugs, act violently, etc. than children not in the program
Their evaluation revealed the program was effective
C. Comparing Basic and Applied Research :
Neither is more important than the other
They have reciprocal relationship, with one often serving as a springboard for the other.
Both are necessary and required; they can be intertwined with one another
Chapter 2:
Research Questions, Hypotheses & Predictions:
Choosing a research question
Specific enough to be testable
Ex: What causes depression? (not specific and not testable)
Ex: Is depression related to a diet high in ultra-processed foods? (specific and testable)
B. a hypothesis: a tentative statement (answer) concerning the nature of the relationship between two or more variables.
Ex: There's a relationship between consumption of highly processed foods and depression
Hypothesis → study design
C. A prediction: grows out of the hypothesis and is testable
A testable statement of what you expect to find in the research
Operational definition:
Ex: there will be a direct correlation between scores on the Beck Depression Inventory and scores on a short questionnaire about highly processed food consumption
The role of evidence
When to keep the hypothesis
When to reject the hypothesis or do more research
A hypothesis never is PROVEN!! We have evidence that supports, suggests or demonstrates!
More on Hypotheses and Predictions:
Terminology when testing hypothesis
Null Hypothesis:
Nothing is going on
The one we actually test
Research Hypothesis:
a.k.a. the alternative hypothesis
Something IS going on
The one we actually believe
We test the null hypothesis to decide what to do with the research hypothesis (see the sketch below):
If the null is rejected, then the research hypothesis is supported
If the null is retained, then the research hypothesis is not supported
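A minimal sketch of this decision rule, assuming a simple two-group study analyzed with an independent-samples t-test; the scores, group labels, and the .05 alpha level are illustrative assumptions, not part of any particular study:

```python
# Minimal sketch: test the null hypothesis, then decide about the research hypothesis.
from scipy import stats

group_a = [12, 15, 14, 10, 13, 16, 11]  # hypothetical treatment-group scores
group_b = [9, 8, 11, 7, 10, 9, 8]       # hypothetical control-group scores

t_stat, p_value = stats.ttest_ind(group_a, group_b)

alpha = 0.05  # conventional significance level
if p_value < alpha:
    print(f"p = {p_value:.3f} < {alpha}: reject the null -> research hypothesis supported")
else:
    print(f"p = {p_value:.3f} >= {alpha}: retain the null -> research hypothesis not supported")
```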
Variables → determine your hypothesis → determine your research design
^ And how we operationalize the variables also determines the research design
B. Types of Hypothesis/ Predictions
Causal Hypotheses: “Taking drug A causes symptoms of depression to abate”
Only possible during a true experiment
Anything that implies causation
Descriptive Hypotheses: "Southerners are likely to espouse conservative political views"
Surveys, natural observation, case studies
Descriptive research methods
Correlational Hypothesis: "involvement in school-related activities is related to a freshman's degree of adjustment in their first semester"
Also called hypotheses of association
Correlational research and quasi experiments
Correlational: dependent variables only!! We measure them
C. Characteristics of a good prediction:
What makes a prediction a good one?
Testable
Uses operational definitions
Specific, not vague: they need to be testable
Ex: looking at the relationship between gender and empathy
Can't say: "there is a relationship between gender and empathy"
Can say: "women are more likely to score higher on an empathy test than men"
Has a sound rationale
Based in research or theory
Relevant (not Trivial)
Applied research is relevant
Not all basic research is relevant but a lot of it is
Helps describe the world
Practical and ethical to test
Do you have the resources?
Are there any ethical concerns?
Sources of Research Ideas:
Common sense
Are cultural truisms really true?
Cultural truisms: expressions we think are true: "absence makes the heart grow fonder"
Ex: Zajonc (1968) tested "familiarity breeds contempt"
The idea that the more you know someone, the less interested you are in them (he expected this to be true but found the opposite)
^ the more familiar you are with people, the more you like them!
Familiarity breeds content!!
Practical Problems: when you have practical problems you need to solve
Applied research
Ex 1: placement of brake lights on cars
Ex 2: vending Machines in schools
Observing the World
Both personal and public events/ situations provoke curiosity
Ex: controversy over the effects of music lyrics on behavior of kids
Especially concerned about rap music.
Fried (1999) "Bad Man's Blunder" lyrics study (rap vs. country)
Got two groups of participants and had ALL of them read the same song lyrics, "Bad Man's Blunder"
The lyrics have a violent theme
Half of the participants were told it was a rap song
Half of the participants were told it was a country song
Asked the participants to rate their reaction to the song
People who thought it was a rap song rated the song more negatively than people who thought it was a country song
Theories
Theory: a systematic, coherent, and logical set of ideas about a particular topic or phenomenon that serves to organize and explain data and generate new knowledge.
Good theories explain a large body of data and facilitate the creation of new hypotheses.
Ex: Buss (1989) used socio-biological theory to explain gender differences in sexual behavior (parental investment theory): if the goal is to have offspring...
Women invest A LOT in having a child: the woman's strategy is to look for a good provider
Men: their entire investment can be a couple of minutes: their strategy is to have as many sexual partners and children as they can
Are there differences in the characteristics each gender uses to "select mates" across cultures?
Men: looking for signs of "reproductive capacity": youth and attractiveness
Women: “resource acquisition”: resources and status
Other explanations? Eagly & Wood (1999)
Different socio-cultural factors explain why men and women choose the mates they do
Past Research: science is a cumulative endeavor
A very convenient source of research ideas
Look in the discussion section of any published study!!
Evaluate the methodology of a study, and try to improve it or even refute it
Choose to study a different sample (ex: instead of college student sample maybe do older adults)
Operationally define the variables in different ways
Library Research
Why do Literature Searches?
Learn about your topic
Make sure your topic is original
Give credit to people for previous work
Insights on research design
About Professional Journals
Conduct research, then submit to a journal for publication
Editor receives it and decides if it will be reviewed
Most papers are rejected
If accepted, then it is assigned to at least 3 experts to review
Few are accepted pending revision, and even fewer are accepted as is.
Can take up to 2 years!
Types of Journal Articles
Empirical Research Articles:
Scholarly articles that report on a research study that answers a specific question
Ex: Seifi et al. (2024): looked at the correlation between hyperinsulinemia (chronically high insulin levels) and mental health
A lifestyle and diet that keep insulin levels chronically high are associated with increased levels of anxiety and depression
What we are doing in class!!
B. Literature Reviews:
Scholarly articles that summarize and evaluate empirical articles on a topic
Mentioned in the article title (ex: “a review of the literature” or “ a systematic review”)
Read lit reviews of topics that are new to you
Psychological Bulletin, Psychological Review & Annual Review of Psychology
C. Meta-Analyses:
Scholarly articles that:
Collectively analyze data from several similar empirical studies using statistical procedures
Allow researchers to draw conclusions based on objective statistical procedures
D. Theoretical Articles
Scholarly articles that summarize existing (empirical) information on a topic and propose a theory to understand behavior
Ex: Bandura (1977): “Self-efficacy: Toward a unifying theory of behavioral change”
Proposes a comprehensive theory that our beliefs about our own self-efficacy predict our outcomes when it comes to behavioral change
Beliefs about self-efficacy predict behavioral outcomes
People with high perceived self-efficacy are more likely to succeed.
Closer Look at Anatomy of an Empirical Research Article
Abstract:
A short, concise summary of the research, sort of like a sneak preview of the article
Information included:
Starts with a general statement about the problem/hypothesis, then gives a brief description of the study
Introduction: a logical introduction of the current study’s research question as it relates to previous research
Information included: starts by discussing the problem under investigation
Introduce the variables and explain how they are defined
Go on to build an argument for the predictions: literature review
Method: A description of what you did in sufficient details to allow replication
Allows people to:
Evaluate the efficacy of your design
Replicate your study if they want to
Participants or subjects: referring to humans and non-human animals: describe sample size, age, gender and other key demographics of the sample
Apparatus/ materials: you describe any equipment, devices, questionnaires, surveys, videos used in the study
Procedure: describe what happens to your subjects from start to finish in detail.
Results:
A summary of the outcome(s) of the study
Information included:
Describe the scores being analyzed and what those scores mean, tell the reader what high and low scores mean.
Go into mean, median and mode: describe them
Go into the statistics → used to test your hypothesis
Use tables and figures to show your findings in an interesting and ethical way.
Discussion:
The concluding part of the paper, in which the researcher gets to speculate about their findings
Information included:
Restating results and relating them back to the argument made in the introduction
Draw conclusions
Ok to speculate: in the discussion you can speculate about why you found what you found
Offer ideas for future research for the topic
Chapter 3:
Milgram’s experiment: Studied Obedience
Design of the study:
Participants: men from all walks of life
Task: "teach" another person (a confederate) sets of words; if the learner made mistakes, the participant had to shock them
Findings: many people obeyed the “scientist”
B. importance of the study and Real world implications:
Would it be approved today?
Do the benefits outweigh the harm?
Do the findings apply to real-world situations?
Historical Context of Current ethical standards:
Nuremberg code:
Nuremberg trials after WW2
10 rules laying out human rights and ethical requirements for studies
Had no enforcement
The Nuremberg Code
B. Declaration of Helsinki:
Created in 1964 to have "teeth": it had an enforcement mechanism
Published research must follow the Helsinki guidelines
C. Belmont Report
Public demand for action to protect people from mistreatment in research
The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research (1979)
Belmont Report
The three Basic ethical Principles of the Belmont Report:
Beneficence (and non-maleficence)
Respect for Persons (aka Autonomy)
Justice
Federal Regulations Regarding Research Ethics:
Institutional Review Boards and Animal Use Committees:
Institutional Review Boards (IRBs)
What they do
Why they were created
Who are the members?
B. Animal care and use committees (ACUC’s)
What they do
Scope of oversight
C. publishing or presenting research
D. Federal regulation also requires specific educational training
Required at most colleges and universities
Ex: CITI (collaborative IRB training initiative)
Determining Type of IRB Review Required
Two factors that define “research”
Is the project systematic?
Is the project seeking new, generalizable knowledge?
Handout 3b IRB Decision Tree
B. Question 1: Is the project even “Research?”
If the project ISN'T “research”....
If the project IS “research”, then go to next question
C. Question 2: does the study involve minimal risk or greater than minimal risk?
If the study involves greater than minimal risk…
If the study involves minimal risk, then there are three possible choices for type of IRB review:
Exempt Review:
Expedited Review
Limited Review
Professional standards for Research ethics:
APA ethical principles of psychologists and code of conduct (In eCampus)
The APA ethics Code:
A set of principles and standards that guide the behavior of psychologists in all their roles
The Five Ethical Principles:
Principle A: Beneficence/Non-Maleficence
Principle B: Fidelity and Responsibility
Principle C: Integrity
Principle D: Justice
Principle E: Respect for people’s rights and dignity
Principle A: Beneficence and Non-maleficence:
Definition: Researchers should maximize the benefits of research and the well-being of participants while minimizing harm or distress
Risk-Benefit Analysis:
Potential benefits of Research:
a. Education
b. Skills
c. Treatment
d. Money/gifts
e. Satisfaction
2. Potential risks:
Physical Injury
Psychological Injury
Social Injury
Minimal Risk: research procedures are similar to activities that the participants engage in every day.
Dealing with Risk:
Be honest with participants
Make sure there is help available if needed after the study
Collect data anonymously
Or ensure confidentiality
What Do You Think? Do handout 3a, Part 1
Principle B: Fidelity and Responsibility:
Definition: Psychologists should establish relationships of trust with people they work with, and act in accordance with the professional and scientific responsibilities of the people they serve.
Make sure to hold up your end of the implicit contract with participants.
Principle C: Integrity:
Definition: Psychologists should seek to promote accuracy, honesty and truthfulness in the science, teaching and practice of psychology.
Dont cheat, steal, lie, engage in fraud or misrepresent facts.
Share your findings
Don't fabricate or falsify data
Why people might falsify data
When to be suspicious of fraud
Ex: Stephen Breuning
D. researchers should never plagiarize.
Definition: misrepresenting someone else's work as your own
Make sure you cite the original authors!
Plagiarism also includes using AI and passing the work off as your own!!
Principle D: justice:
Definition: Psychologists should ensure fairness and equity for everyone in accessing the benefits of the contributions of psychology (research benefits, access to clinical care).
Ensure fairness in receiving the benefits and bearing the potential risks
Involves equity in participant selection for research
Use scientific reasoning to select participants
Ex: excluding women from medical research
C. justice also involves protecting the vulnerable/ powerless
Before ethical regulations, marginalized groups were sometimes taken advantage of.
Ex: Tuskegee Syphilis Study (1932-1972).
Principle E: Respect for People’s Rights and Dignity (AKA Autonomy)
Definition: Researchers should respect that participants are independent, self-directed agents who are able to make their own informed decisions about whether or not to participate in research
Safeguards for vulnerable populations
Requires informed consent
Impacted by coercion
Impacted by the use of deception
More on Informed Consent Forms:
Social contract b/t researcher and participant
Information included in a consent form:
Purpose of the research
Procedures
Risks and benefits
Compensation (if applicable)
Assurance of confidentiality
Statement of willingness
Contact information for questions
C. ALWAYS get consent when there is anything beyond minimal risk.
D. Special cases:
Minors (assent)
People with cognitive impairments
People who may feel coerced
E. Research situations when you can dispense with informed consent:
Anonymous surveys with minimal risk
Archival research
Naturalistic observation where privacy is NOT expected
More on Deception:
It is not deception to withhold your research hypothesis!
Types of Deception:
Passive deception
Active deception:
Misleading people about the nature of the research
Staging events
Giving participants misinformation about themselves
C. Deception May be necessary:
Certain research questions could not be explored without deception
Knowing the hypothesis influences behavior
D. Considerations:
Is the knowledge obtained worth it?
Is there another way without deception?
E.Debriefing= the researcher's opportunity to tell participants about the study, to clear up misconceptions & to deal with any potential negative effects of participating.
Allows participants to recover
Promotes positive feelings about the research
Can tell participants how the research benefits society.
Does debriefing even work?
Chapter 4:
Variables:
Introductory concepts:
Variable = any event, situation, behavior or individual characteristic that can change or can take on more than one value.
Levels: different values of a variable
Gender (2 levels)
Test score (infinite levels)
Using operational definitions:
Latent (hypothetical) constructs:
Ex: depression, anxiety, PTSD,
What we need to do to test latent variables scientifically
Operational definition (OD): defining a latent variable w/ the procedure or methods used to measure or manipulate it.
If you can't operationally define a variable, you can't study it scientifically
OD’s allow scientists to communicate:
Ex: conceptual definition of romantic love
Short Love Scale (SLS-12): Sternberg says you need all of these things to have love (his definition of romantic love):
Passion
Intimacy
Commitment
Operational Definition: scores on the short love scale
With this definition, scientists are able to talk, communicate and debate the constructs. Need a common language.
No single operational definition is perfect
Importance of Establishing Construct Validity
Must establish the accuracy/ validity of an operational definition
Validity refers to accuracy in measurement
Scientists are constantly evaluating validity
Construct validity tells us how accurate our operational definition is in quantifying our latent variable
Defining Relationships between Variables
Defining relationships between variables:
Purpose of most behavioral research
Ex: height and weight; exercise and relative physical health, TV watched & fear of crime
2. Linear Relationships (straight-line relationship)
Positive linear relationship (direct)
Variables change in the same direction
Ex: as study time ^, the exam grade ^
B. negative linear relationship: (inverse)
Variables change in the opposite direction
Ex: as stress ^, white blood cell count v
3. Curvilinear Relationships:
Relationship between variables goes from negative to positive or from positive to negative (see the sketch after this list)
Ex: negative → positive: U-shaped function: e.g., age and # of car accidents
Ex: positive → negative: upside-down U shape: Yerkes-Dodson Law: performance tends to increase with stress, but with too much stress performance tends to drop
4. No Relationship:
No predictable relationship between the variables
No discernible pattern on a graph
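A minimal sketch of why the shape of a relationship matters: Pearson's r is a linear index, so it picks up a straight-line relationship but can come out near zero for a U-shaped one, looking like "no relationship". The data are fabricated for illustration:

```python
# Minimal sketch: linear vs. curvilinear relationships and Pearson's r (made-up data).
import numpy as np

x = np.linspace(-5, 5, 101)
rng = np.random.default_rng(0)

linear_y = 2 * x + rng.normal(0, 1, x.size)  # positive linear relationship
curvilinear_y = x ** 2                       # U-shaped relationship

r_linear = np.corrcoef(x, linear_y)[0, 1]
r_curvi = np.corrcoef(x, curvilinear_y)[0, 1]

print(f"r for the linear relationship:   {r_linear:.2f}")  # close to +1
print(f"r for the U-shaped relationship: {r_curvi:.2f}")   # close to 0, yet a relationship exists
```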
5. Reduction of Uncertainty:
Ultimate Goal of Science: to determine universal and predictable rules about how the world works
Easy for physical sciences (gravity)
Not so easy for behavioral science: so many different factors for behavior, they affect everyone differently
B. What is Uncertainty?
Uncertainty= “randomness in events”: (aka: random variability, random error, or error variance)
Behavioral research seeks to reduce uncertainty by identifying predictable relationships b/t variables
Example from text: Instagram Users:
You ask 200 adults in your town whether or not they use Instagram
Your best guess before you collect any data
Your best guess assumes that instagram use is random (ex: 50/50 chance of yes/ no): known as Random variability
Based on random variability, you’d be accurate in guessing preference 50% of the time
Let's say in your next study you add age to the mix and collect data from 100 adults aged 18-49 and 100 adults aged 50+
How accurate would you be at guessing Instagram use now?
By adding the variable of age, we have reduced uncertainty (see the sketch below)
Some uncertainty remains
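A minimal sketch of the guessing arithmetic behind "reducing uncertainty". The 80%/20% usage rates for the two age groups are hypothetical numbers chosen only to mirror the textbook example:

```python
# Minimal sketch: how adding a predictor (age) improves guessing accuracy (hypothetical rates).
young_use, older_use = 0.80, 0.20                 # assumed Instagram-use rates for 18-49 and 50+
overall_use = 0.5 * young_use + 0.5 * older_use   # equal-sized age groups -> 0.50 overall

# Without age: Instagram use looks 50/50, so any guess is right about half the time.
accuracy_no_age = max(overall_use, 1 - overall_use)

# With age: guess "yes" for younger adults and "no" for older adults.
accuracy_with_age = 0.5 * max(young_use, 1 - young_use) + 0.5 * max(older_use, 1 - older_use)

print(f"Guessing accuracy without age: {accuracy_no_age:.0%}")   # 50%
print(f"Guessing accuracy with age:    {accuracy_with_age:.0%}")  # 80%
```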
Non-Experimental vs. Experimental research:
Non-experimental Methods:
Definition= study of the relationship b/t variables by observing and measuring the variables of interest.
Ways to observe and Measure:
Ask people to describe their behavior (survey)
Observe people’s behavior yourself
Take physiological measures (HR, BP, EEG, MRI)
Look at public records/ archives (SAT and GPA study)
Ex: The relationship between exercise and anxiety:
Operationally define both variables:
Collect data: see if the variables are related
Find an inverse relationship between the variables: one goes up while the other goes down: they co-vary
C. Establish Covariation: the two variables change together in a predictable way
If you find a relationship between two variables, then you know that they co-vary
As one changes, the other changes in a predictable way.
D. limitations of the Non-Experimental Method:
Cannot determine the direction of cause and effect (can't establish temporal precedence)
We don't know which variable is the cause and which is the effect (what came first)
The third variable Problem:
Confounding variables (extraneous variables)
Ex: Relationship between ice cream sales and shark attacks ( third variable being warmer weather)
2. Experimental Methods
Definition = the study of the relationship between variables by manipulating one variable and measuring its effects on another variable, while holding all other variables constant. Can establish causation!!
Example: studying the relationship between exercise and anxiety:
Manipulate exercise to see impact on anxiety:
Create two groups: exercise vs. no exercise:
Allow us to establish covariation
Allows us to determine temporal precedence
Allows us to control for EV’s in 2 ways:
Experimental control: holding all variables constant (except for the variable your manipulating)
Randomization (random assignment into groups)
Chance becomes the great equalizer
Three criteria to establish causation: true experiment
Covariation
Temporal precedence
Ruling out all extraneous variables
Identifying variables:
Experimental methods: two (or three) variables of interest:
Independent variable: the presumed cause, the one we manipulate
Dependent variable: presumed effect, the one you measure
B. non-experimental methods:
Descriptive research: at least 1 Dependent variable
Correlational research: at least 2 dependent variables: looking at relationship
Quasi-experiments:
Quasi Independent Variable: instead of manipulating, use a characteristic of a participant:
Ex: gender, race, age, socio economic
Dependent variables: the one we measure
Flaws/ Limitations in Experimental Research: the more ways you study a phenomenon, the more you understand it
Artificiality: makes the research situation unnatural, which changes the responses people will make.
Would participants act the same way in a lab as they would in real life? Probably not.
Try a field experiment to prevent this!!
Still manipulate an independent variable, but in a natural setting
Don't have the same amount of control though
Some variables cannot be experimented on, for either practical or ethical reasons:
Cannot do a true experiment on the effects of child abuse on cognitive development
Why? Cause that's unethical silly!
Alternative: use a quasi experiment: participants are pre-assigned due to personal experiences
Sometimes research questions don't need experiments:
Some science just wants to describe phenomena: just use descriptive research!
Piaget's Cognitive development theory: developed his entire theory by observing
Freud's theory of personality development: developed the theory from case studies
Developed his theory by observing his clients
4. Criteria for Establishing Causation:
Covariation:
Temporal precedence:
Ruling out Alternative explanations:
Choosing a research method:
Experimental control and artificiality
Cost and benefit of strict experimental control
Limits the questions you can ask
Changes the responses that people would make
B. Non-Experimental Alternative: field Experiment:
Manipulate an IV in a natural setting
Less control than a lab experiment
Ex: Latané compared helping behavior in men and women
2. Ethical and practical limitations of experiments
Some variables can’t be studied experimentally
Non-experimental alternative: Quasi-Experiments (Ex Post facto Designs)
Ex: child abuse and reading comprehension; alcoholism and depression
Participant variables are important to study!
3. Some questions are not answerable by experiments:
Sometimes science simply seeks to describe phenomena
Non-experimental alternatives meet this goal:
Piaget and cognitive development (observation)
Freud & personality development (case study)
Bem and Sex roles in society (survey)
C. Sometimes science seeks to make predictions about behavior
D. non-experimental alternatives meet this goal:
Ex: family Hx and breast cancer
Ex: SAT and college GPA
5. Using multiple methods
No research method is perfect
All the methods have their advantages and disadvantages.
Ask a question using multiple methods to increase confidence in your conclusions
Evaluating the accuracy of research: Validity:
Validity: the extent to which, given everything that is known, a conclusion that we draw from a research study is reasonably accurate.
Types of validity:
Research methods can be described and evaluated in terms of various types of validity.
The 4 that are most relevant to this chapter:
1 Construct
2. External
3. Internal
4. Conclusion
3. Construct validity: Defining hypothetical constructs:
Refers to how accurately the operational definitions you choose represent the hypothetical variables that you're trying to study
There are multiple possible operational definition for every latent variable
We'll discuss specific ways to establish construct validity for operational definitions more in the next chapter. They include:
Face validity
Content validity
Criterion-related validity
4. External validity: Generalizing results
Refers to the extent to which a study’s findings can be generalized to other populations and/or setting
Generalizing from sample to population
Discrepancy in participants’ behavior/ responses
Ex: Diffusion of responsibility (Kitty Genovese)
5. Internal validity: Establishing causation:
Refers to the degree to which our study design allows us to determine cause and effect
Only an issue with true experiments
Experimental control, internal validity and external validity.
Which is more important?
Depends on your research goals
6. Conclusion Validity: Making accurate Decisions
Refers to degree to which our conclusions are credible or believable based on our research design
Based partly on accuracy of our statistical inferences
When drawing statistical inferences from data we can make correct decisions or errors:
1. Correct decisions:
Say there is a relationship and there really is one!
Say there is no relationship and there really isn't one!
2. Errors:
Type 1 Error: Say there's a relationship and there isn't :(
Type 2 Error: Say there's no relationship and there is
Implications:
Type 1 Error can lead to false conclusions and misguided actions based on incorrect assumptions.
Type 2 Error may result in missed opportunities and the failure to recognize significant patterns or connections.
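A minimal sketch of these two error types, simulating many hypothetical two-group studies with t-tests. The sample size, effect size, number of trials, and alpha are illustrative assumptions, not values from the text:

```python
# Minimal sketch: estimating Type 1 and Type 2 error rates by simulation (made-up parameters).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
alpha, n, trials = 0.05, 30, 2000

type1 = 0  # rejected the null when there is truly NO relationship
type2 = 0  # retained the null when there truly IS a relationship

for _ in range(trials):
    # Null actually true: both groups come from the same population.
    a, b = rng.normal(0, 1, n), rng.normal(0, 1, n)
    if stats.ttest_ind(a, b).pvalue < alpha:
        type1 += 1

    # Null actually false: the groups differ by a small true effect.
    c, d = rng.normal(0, 1, n), rng.normal(0.4, 1, n)
    if stats.ttest_ind(c, d).pvalue >= alpha:
        type2 += 1

print(f"Type 1 error rate: {type1 / trials:.1%} (should land near alpha = 5%)")
print(f"Type 2 error rate: {type2 / trials:.1%} (real effects that the test missed)")
```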