Empiricism
using evidence from the senses or from instruments that assist the senses as the basis for conclusions
basic research
research done to enhance the general body of knowledge, e.g., measuring motivation in depressed vs. non-depressed individuals
translational research
uses findings from basic research to develop and test applications, e.g., developing a therapy for depression that affects motivation
applied research
research done with a practical problem in mind, e.g., developing a new therapy for depression
theory
a set of statements that describes general principles about how variables relate to one another (e.g., a theory about the link between violent video games and aggression)
hypothesis
a way of stating the specific outcome the researcher expects to observe if the theory is accurate (a prediction), e.g., playing a violent video game will increase subsequent aggression
3 features of a good theory
supported by data
falsifiable
parsimonious
Parsimony (Occam's Razor)
all else being equal, the simplest explanation is the best
Falsifiability
Can the claim be disproved?
theories are often unfalsifiable
Why is proof impossible? If we cannot prove, then what do we do in science?
we never actually 'prove' anything; our theories can only be consistent with the data (or 'not yet falsified')
A scientific view of the world simply represents the weight of converging evidence: given the data, what is most likely to be correct
Scientific method
A series of steps followed to solve problems including collecting data, formulating a hypothesis, testing the hypothesis, and stating conclusions.
Theory-Data Cycle
theory → research questions → research design → hypotheses → data (which feed back into the theory)
Personal experience
relying on personal experience can lead us to erroneous conclusions
lack of comparison groups
outcomes are probabilistic (e.g., "working out clears my head" may hold only on average)
confounds: other factors which may have caused a change in your outcome of interest
intuition
We tend to reinforce our previously held ideas
authority
Authority figures are subject to the same biases produced by personal experience and intuition that we are
e.g., learning styles
How do confounds and comparison groups affect these sources of information?
What is meant by real-world events being probabilistic?
findings will not explain all individuals all of the time, only on average
Good-story heuristic
the tendency to believe a claim because it sounds plausible (makes a good story)
availability heuristic
we judge the likelihood and frequency of an event by the ease with which relevant examples come to mind
confirmation bias
we seek out and attend to evidence that confirms our beliefs, but ignore evidence that contradicts them
confirmatory hypothesis testing
the tendency to ask only the questions that will lead to the expected answer
empirical scientific thinking
based on actual data, not just intuition or personal experience
objective scientific thinking
utilizes clearly defined methods that allow others to collect and evaluate the same data
systematic scientific thinking
observations are structured so that they can speak directly to the issue being evaluated
variable levels
nominal, ordinal, interval, ratio
measured variable
a variable that is being observed/recorded
manipulated variable
a variable that is directly under the control of the researcher
independent variable
variable that is manipulated
dependent variable
The outcome factor; the variable that may change in response to manipulations of the independent variable.
operational definition
defines a concept by stating precisely how the concept is measured and/or manipulated in a particular study
frequency claim
describes a particular rate or level of a single variable
e.g., 10% of Americans regularly exercise
association claim
states that one variable is associated with another variable, e.g., aggression is higher during warmer months
causal claim
states that one variable is responsible for (causes) changes in another variable
e.g., alcohol consumption impairs motor skills
Positive correlation
two variables change in the same direction, both becoming either larger or smaller
negative correlation
as one variable increases, the other decreases
no association
when the points in a scatter plot do not have any pattern
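A minimal Python sketch (made-up numbers, hypothetical variable names) showing how Pearson's r reflects each pattern of association:

```python
# Hypothetical data illustrating positive, negative, and no association.
import numpy as np

hours_exercised = np.array([1, 2, 3, 4, 5, 6])
mood_score      = np.array([3, 4, 5, 6, 7, 8])    # rises with exercise
stress_score    = np.array([9, 8, 6, 5, 4, 2])    # falls as exercise rises
shoe_size       = np.array([8, 11, 7, 10, 7, 9])  # unrelated to exercise

print(np.corrcoef(hours_exercised, mood_score)[0, 1])    # near +1: positive correlation
print(np.corrcoef(hours_exercised, stress_score)[0, 1])  # near -1: negative correlation
print(np.corrcoef(hours_exercised, shoe_size)[0, 1])     # near 0: no association
```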
construct validity
an umbrella term for whether the underlying, unobservable psychological construct is actually being measured
4 big validities
construct, external, statistical, internal
Type 1 error
false positive: concluding there is an effect when in fact there is none
type 2 error
false negative: failing to detect an effect that actually exists
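A minimal simulation sketch (hypothetical setup, using NumPy and SciPy) of what the Type 1 error rate means: when there is no true difference between groups and alpha = .05, roughly 5% of tests come out "significant" by chance:

```python
# Simulate many two-group studies where the null hypothesis is actually true.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
false_positives = 0
for _ in range(1000):
    group_a = rng.normal(0, 1, 30)   # both groups drawn from the same population
    group_b = rng.normal(0, 1, 30)
    _, p = stats.ttest_ind(group_a, group_b)
    if p < .05:                      # a "significant" result here is a Type 1 error
        false_positives += 1
print(false_positives / 1000)        # roughly 0.05
```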
3 criteria for causality
covariance, temporal precedence, internal validity
ethical issues of the Milgram Obedience Study
extremely stressful for the "teacher" participants
some experienced long-term guilt even after being debriefed
raised the question of how to balance potential risk to participants against the value of the knowledge gained
ethical issues of the Tuskegee Syphilis experiment
human subjects were not treated respectfully, were harmed, and were members of a disadvantaged group
Belmont Report
Its purpose was to discuss the ethical principles researchers should follow; it was commissioned partly in response to the Tuskegee study. It provided three main guiding principles for research: (1) respect for persons, (2) beneficence, (3) justice
additional ethical principles by the APA
IRB (Institutional Review Board)
informed consent
deception and debriefing
data fabrication and falsification
Anonymous vs confidential data
Anonymous: researchers do not collect any information that could be used to identify a participant (e.g., name, birthdate)
Confidential: researchers collect some information that could be used to identify a participant (e.g., name, birthdate), but they prevent its disclosure
informed consent
The researcher must explain the study to participants in everyday language and give them a chance to decide whether to participate
deception
Researchers withhold some detail(s) of the study (omission) or actively mislead participants (commission)
Debriefing
describes the nature of the deception and why it was necessary; always provided in studies that used deception
data fabrication
The researcher invents/creates data that fit the hypotheses
data falsification
The researcher influences study results by selectively deleting data or by influencing participants to behave in the hypothesized way
observational measure
Recording observable behaviors or physical traces of behaviors
physiological measures
Recording biological data such as brain activity, hormone levels, or heart rate
self-report measures
Recording responses to rating scales, questionnaires, and/or interviews; multiple response formats are possible for the same question
scales of measurement
nominal (names, e.g., eye color, religion), ordinal, interval, ratio
categorical variable
levels are categories
quantitative/continuous variable
levels are coded with meaningful numbers along a continuum.
3 types of quantitative variables
ordinal: order (e.g., 1st, 2nd; level of education)
interval: equal intervals between values (e.g., temperature, IQ)
ratio: zero means something (e.g., height, weight)
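A small illustrative sketch (made-up values and names) of how data on each scale of measurement might be coded:

```python
# Hypothetical codings for each scale of measurement.
eye_color   = ["brown", "blue", "green"]  # nominal: labels only, no order
education   = [1, 2, 3]                   # ordinal: 1 = HS, 2 = BA, 3 = grad; ordered, but gaps are not equal
temperature = [30.0, 40.0, 50.0]          # interval: equal gaps, but 0 does not mean "no temperature"
weight_kg   = [0.0, 60.0, 120.0]          # ratio: true zero, so 120 kg is twice 60 kg
```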
Reliability
how consistent the results of a measure are.
Validity
whether the operationalization of a construct is measuring what it is supposed to measure
relationship between reliability and validity
it is possible to be reliable and not valid
it is not possible to be valid and not reliable
test-retest reliability
simply involves measuring the same individuals twice and correlating their scores
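A minimal sketch (hypothetical scores) of a test-retest check: correlate the same people's scores from two administrations of the measure:

```python
# Hypothetical scores for 5 people measured at Time 1 and again at Time 2.
import numpy as np

time1 = np.array([10, 14, 9, 20, 16])
time2 = np.array([11, 13, 10, 19, 17])
print(np.corrcoef(time1, time2)[0, 1])  # close to 1 suggests good test-retest reliability
```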
interrater reliability
focuses on whether two observers (raters) use the operational definition the same way
internal reliability
focuses on whether people give consistent responses across the items of a test; assessed with Cronbach's alpha
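A minimal Cronbach's alpha sketch (made-up responses to a hypothetical 4-item scale; rows are respondents, columns are items):

```python
import numpy as np

items = np.array([
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])
k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)       # variance of each item
total_variance = items.sum(axis=1).var(ddof=1)   # variance of summed scale scores
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(alpha)  # values closer to 1 indicate more internally consistent responses
```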
5 types of construct validity
1. Face validity 2. Content validity 3. Criterion validity 4. Convergent validity 5. Discriminant validity
Which of the types of construct validity are subjective and which are empirical?
Subjective: face and content. Empirical: criterion, convergent, and discriminant.
convergent validity
asks whether a measure is correlated with measures of similar constructs, e.g., does a measure of intelligence correlate with other aptitude measures (e.g., Stanford-Binet vs. Wonderlic)?
discriminant validity
asks whether a measure is uncorrelated (or only weakly correlated) with measures of dissimilar constructs
external validity
the extent to which we can generalize findings to other people and to real-world settings