Research methods Ch 1-2
critical thinking
- systematically evaluating information to reach conclusions based on the evidence presented
* steps in critical thinking:
* (1) what is the claim
* (2) what is the evidence (if any)
* personal experience
* research
* (3) is the evidence flawed or biased
* alternative explanation
* (4) given the evidence, should the claim be rejected or accepted
* (5) if accepted, what degree of confidence can we have
Benefits of knowing research methods
- in life:
- careers:
* mental health
* business
* politics
“This I believe”
- personal experience (self or others)
* informal observations (anecdotal evidence) vs. systematic observations (research)
- authority
* qualifications
* evidence
limits of personal experience
- faulty perception
- faulty memory
- faulty thinking
* prone to logical fallacies (handout)
* confirmation bias
other problems with personal experience
- (1) lacks adequate comparison conditions
* present-present bias (example of confirmation bias)
* focus on the presence of an event and the presence of an outcome
* fail to look for the lack of evidence
* noticing the hits / ignoring the misses
- (2) it is confounded
* other plausible alternative explanations
science as a way of knowing
empirical research is designed to minimize these biases
- empirical
* based on systematic observations
* “systems” minimize bias; rule out alternative explanations
- theory-data cycle
* theory
* explanation of how things generally relate (general principles)
* not about a specific person or situation
* hypothesis
* a testable prediction about how specific things should relate in a specific context, given some theory
* (1) needs to be replicated
* (2) science cannot “prove” theories
* why? because we cannot test every possibility
* data
* observations
science as a way of thinking
- qualities of a good theory
* supported by a large quantity and variety of evidence
* falsifiable
* testable claims, thus rejectable
* the more testable /falsifiable the better
* parsimonious
* the simpler (fewer assumptions, etc.) the better
- kinds of research questions
* basic
* designed to increase general knowledge
* translational
* designed to develop an application
* applied
* designed to solve problems in the real world
- results made public
* to other scientists → conferences, journals
* peer review process
* peer input continues after publication → science is self-correcting
* allows for replication
* to the public → magazine, newspaper, TV, etc.
* be cautious
* why?
* maybe based on weak/flawed evidence
* journalists may misreport research or modify conclusions
- really good theories, with no valid alternatives, get treated as “facts”
* Note: science does not provide absolute “truth”
* even “facts” are always tentative
- reject theories (or claims) when multiple lines of evidence are disconfirming
* claim → older adults are more depressed than younger adults
Example: theory of evolution
- supported by a large quantity and variety of evidence
consulting scientific sources
- scientific journals
class notes 1/13/23
types of scientific articles
- empirical journal article → actual study
* contains the following sections.
* introduction
* methods
* results
* discussion
- review journal articles
* (a) qualitative → summary of empirical studies
* (b) meta-analysis
* quantitative → takes raw data from published studies.
* new statistical analysis to estimate effect size.
journals
- be careful.
* not all journals are legit.
* “predatory”
* if unsure
* (a) check that it is peer-reviewed
* (b) check the “impact factor” (> 1.0)
* indicates how often its articles get cited
* how?
* journals homepage or internet search
how to find scientific sources
- database via library
* e.g. PsycINFO
* searches only psy sources
* indicates peer review
* many useful tools
- Google Scholar
* searches all journals
* limitations
non-journal, scientific sources
- edited books (handbooks)
* less rigorous peer review
* always reviews of research
- academic books
- dissertations
* new empirical studies or reviews done by Ph.D. students
* not peer-reviewed
* should not cite
- trade books and magazine articles
* how to evaluate
* how much do they cite other research
descriptive statistics
- ways to organize and summarize data
* frequency histogram
- two main things to represent:
* (1) measure of central tendency
* what do scores “center” on
* (2) measure of variability
* how spread out are the scores
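A frequency histogram just counts how often each score occurs. A minimal Python sketch using only the standard library (the scores are made up for illustration):

```python
# Text frequency histogram for a small set of scores (made-up data).
from collections import Counter

scores = [3, 4, 4, 5, 5, 5, 6, 6, 7]
counts = Counter(scores)  # frequency of each distinct score

for score in sorted(counts):
    # one * per occurrence, e.g. "5: ***"
    print(f"{score}: {'*' * counts[score]}")
```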
central tendency
- which to use
* usually, we use the mean
* but… problem of “outliers” → very unusual data point(s)
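A quick sketch of the outlier problem: one very unusual data point pulls the mean a lot but barely moves the median (the data are made up for illustration):

```python
# Why outliers make the mean misleading: compare mean vs. median.
from statistics import mean, median

scores = [10, 11, 12, 13, 14]
with_outlier = scores + [100]  # one very unusual data point

print(mean(scores), median(scores))              # 12 and 12
print(mean(with_outlier), median(with_outlier))  # ~26.67 and 12.5
```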
variability
- range
- standard deviation (SD)
* average difference of scores from the mean
* what if we just add up the deviation scores? they sum to 0
* solution = square each deviation score
* sum of the squared deviations divided by sample size is called “variance”
* standard deviation = square root of the variance
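The steps above can be checked directly in Python: the deviations sum to 0, so we square them; dividing the sum of squared deviations by sample size gives the variance; the square root gives the SD. This sketch uses the population formula (divide by N; sample formulas divide by N − 1) and made-up data:

```python
# Variance and standard deviation from deviation scores (made-up data).
from math import sqrt

scores = [2, 4, 4, 4, 5, 5, 7, 9]
n = len(scores)
m = sum(scores) / n  # mean = 5

deviations = [x - m for x in scores]
print(sum(deviations))  # 0.0 -- raw deviations cancel out

variance = sum(d ** 2 for d in deviations) / n  # population formula
sd = sqrt(variance)
print(variance, sd)  # 4.0 and 2.0
```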
class notes 1/18/23
relative standing of a score
- often we want to express how a single score differs from the other scores
- 2 ways
* percentile rank
* the percentage of scores that fall below the score
* 10th percentile = 10% of scores fall below that score
* 95th percentile = 95% of scores fall below that score
- z-score
* how far the score falls above or below the mean, expressed in units of SD
* z-score = (score − mean) / SD
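Both measures of relative standing can be sketched in a few lines of Python. The scores below are made up; percentile rank is computed here simply as the percentage of scores falling below the target score, and the SD uses the population formula:

```python
# Percentile rank and z-score for one score within a set (made-up data).
from math import sqrt

scores = [55, 60, 65, 70, 75, 80, 85, 90, 95, 100]
x = 85

# percentile rank: % of scores that fall below x
percentile_rank = 100 * sum(s < x for s in scores) / len(scores)

# z-score: (score - mean) / SD
mean = sum(scores) / len(scores)
sd = sqrt(sum((s - mean) ** 2 for s in scores) / len(scores))
z = (x - mean) / sd

print(percentile_rank, round(z, 2))  # 60.0 and 0.52
```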
normal distribution
why normal distribution is important
- symmetrical
- how the area under the curve represents the probability of getting a particular score or higher
- when z is negative, use the area beyond z (the tail)
- IQ score distribution: mean = 100, SD = 15
- 85 or higher
* what is the z-score of 85?
* z = (85 − 100) / 15 = −1
sampling error
- different samples of the same population are not the same; their results differ
- amount of sampling error depends on sample size
- standard error
- important for conducting statistical analyses (assessing probabilities)
- usually, if the probability of getting a score that extreme is less than 5%, we say it is statistically significant
* alpha level
- this is an example of inferential statistics
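The IQ example (mean = 100, SD = 15) and the standard-error idea can both be sketched with Python's `statistics.NormalDist`, which gives the area under the normal curve (the sample sizes below are made up for illustration):

```python
# Normal-curve area for the IQ example, plus standard error of the mean.
from math import sqrt
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)

z = (85 - 100) / 15                 # z-score of an IQ of 85
p_at_least_85 = 1 - iq.cdf(85)      # area at or above a score of 85
print(z, round(p_at_least_85, 4))   # -1.0 and about 0.8413

# Sampling error shrinks as sample size grows: SE = SD / sqrt(n)
for n in (25, 100):
    print(n, 15 / sqrt(n))          # 3.0 for n=25, 1.5 for n=100
```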
ch 3
- three claims, four validities
- interrogation tools for consumers of research
variables
- variable → something that varies between people and conditions
* at least two levels or values in a study
- depending on the study, some potential variables are held constant
* if so, it is not considered a variable in that study
measured vs. manipulated variables
- measured → only documented
- manipulated → something the researcher varies at the start of a study
* assign participants to different levels of a variable.
* some cannot be manipulated at all
* age/sex/ethnicity/height
* some cannot be manipulated for ethical reasons
* abuse vs. no abuse, poor vs. healthy diet, schooling vs. no schooling, social isolation