
Untitled Flashcards Set

Producer of research- know about research methods and how to measure accurately 

consumer of research- be able to read about research with curiosity and a critical eye

empiricism- answering psychological questions with direct, formal observations 

theory- set of statements that describe the general principles about how variables relate to one another

hypothesis- a specific prediction about what researchers expect to observe if the theory is accurate

a theory leads to specific hypotheses about the answers to research questions

scientific norms 

- universalism- scientific claims are evaluated according to their merit, and the same preestablished criteria apply to all scientists and all research 

- communality- scientific knowledge is created by a community and its findings belong to the community 

- disinterestedness- scientists try to discover the truth, whatever it is, not swayed by conviction, idealism, politics, or profit

- organized skepticism- questioning everything, including their own theories, widely accepted ideas, and "ancient wisdom"

Basic research- enhances the general body of knowledge rather than addressing a specific, practical problem

applied research- done with a practical problem in mind; the researchers conduct their work in a local, real-world context

translational research- use of lessons from basic research to develop and test applications to health care, psychotherapy, or other forms of treatment and intervention 

peer review cycle- the journal editor sends the paper to three or four experts on the subject, who evaluate it and recommend whether it should be published, revised, or rejected

comparison groups- enable us to compare what would happen both with and without the thing we are interested in  

probabilistic research- its findings do not explain all cases all of the time 

availability heuristic- things that pop up easily in our mind tend to guide our thinking 

present/present bias- reflects our failure to consider appropriate comparison groups 

confirmation bias- only looking at information that agrees with what we want to believe 

bias blind spot- being biased about being biased 

empirical journal articles- report, for the first time, the results of an empirical research study 

review journal articles- summarize and integrate all the published studies that have been done in one research area 

meta-analysis- combines the results of many studies and gives a number that summarizes the magnitude, or effect size, of a relationship
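To make "combines the results of many studies" concrete, here is a minimal sketch of inverse-variance weighting, one common way a meta-analysis averages effect sizes into a single summary number; the effect sizes and standard errors below are invented for illustration:

```python
# Minimal sketch of a fixed-effect meta-analysis (inverse-variance weighting).
# The effect sizes and standard errors are invented for illustration only.

effects = [0.30, 0.45, 0.20]      # effect size reported by each study
std_errors = [0.10, 0.15, 0.08]   # standard error of each effect size

# Weight each study by the inverse of its variance, so more precise studies count more.
weights = [1 / se ** 2 for se in std_errors]

# The summary effect size is the weighted average of the individual effect sizes.
summary = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
print(round(summary, 3))  # a single number summarizing the magnitude of the relationship
```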

variable- something that varies 

- must have at least two levels, or values

constant- something that could potentially vary but has only one level in the study 

measured variable- one whose levels are simply observed and recorded 

manipulated variable- a variable a researcher controls, usually by assigning study participants to the different levels of that variable 

claim- an argument someone is trying to make 

frequency claims- describe a particular rate or degree of a single variable 

association claims- argue that one level of a variable is likely to be associated with a particular level of another variable

correlate- when one variable changes, the other variable tends to change too

validity- the appropriateness of a conclusion or decision; in general, a valid claim is reasonable, accurate, and justifiable

construct validity-  how well a conceptual variable is operationalized 

generalizability- how did the researchers choose the study's participants, and how well do those participants represent the intended population

external validity- how well the results of a study generalize to, or represent, people or contexts besides those in the original study

statistical validity- also called statistical conclusion validity

- the extent to which a study's statistical conclusions are precise, reasonable, and replicable

point estimate- a single value, often a percentage, that estimates the true value in the population

confidence interval- a range designed to include the true population value a high proportion of the time 

margin of error- calculated so that we know how close the estimate is likely to be to the true population frequency
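To see how a point estimate, margin of error, and confidence interval fit together, here is a minimal sketch for a survey percentage, assuming a simple random sample and a 95% confidence level; the survey numbers are invented for illustration:

```python
import math

# Invented poll: 620 of 1,000 respondents answered "yes".
n = 1000
p_hat = 620 / n  # point estimate of the true frequency (62%)

# 95% margin of error for a proportion: 1.96 * sqrt(p(1 - p) / n).
margin = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)

# Confidence interval: a range designed to capture the true population value
# a high proportion of the time.
low, high = p_hat - margin, p_hat + margin
print(f"{p_hat:.1%} +/- {margin:.1%} -> [{low:.1%}, {high:.1%}]")
```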

correlational studies support association claims; they measure two variables and assess the relationship between them

internal validity- a study's ability to eliminate alternative explanations for the association 

criteria for establishing causation 

- temporal precedence- the cause must occur before the effect 

- covariance- the cause and the effect must be correlated with each other

- the ability to rule out alternative explanations for the observed relationship 

historical examples of unethical research 

- Tuskegee syphilis study- no informed consent, withholding effective treatment for syphilis, targeted a vulnerable population of African American men

- Milgram obedience studies- deception, participants not fully informed, lack of informed consent, psychological harm, pressure to continue, limited right to withdraw

Belmont Report principles of ethics

- principle of respect for persons- individuals potentially involved in research should be free to make up their own minds about whether they wish to participate, and they must give informed consent. People who have less autonomy are entitled to special protection when it comes to informed consent

- principle of beneficence- do no harm 

- principle of justice- a fair balance between the kinds of people who participate in research and the kinds of people who benefit from it; researchers consider the extent to which the participants involved in a study are representative of the kinds of people who would also benefit from its results

APA's five general principles- respect for persons, justice, beneficence, fidelity and responsibility, and integrity 

fidelity and responsibility- establish relationships of trust; accept responsibility for professional behavior

integrity- strive to be accurate, truthful, and honest in one's role as researcher, teacher, or practitioner 

deception 

- through omission- withholding some details

- through commission- actively lying to participants 

debriefing- when researchers use deception, they must spend time after the study talking with each participant in a structured conversation, describing the nature of the deception and explaining why it was necessary

data fabrication- instead of recording what really happened in a study, researchers invent data that fit their hypotheses

data falsification- when researchers influence a study's results, perhaps by selectively deleting observations from a data set or by influencing their research subjects to act in the hypothesized way

questionable research practices 

- scientists should report their own data objectively and make their data public (communality), even when the results do not support their hypotheses (disinterestedness)

IRB (institutional review board)- protects human participants in research studies

IACUC (institutional animal care and use committee)- protects animal subjects in research studies

self report measure- operationalizes a variable by recording people's answers to questions about themselves in a questionnaire or interview 

observational measure- operationalizes a variable by recording observable behaviors or physical traces of behaviors 

physiological measure- operationalizes a variable by recording biological data, such as brain activity, hormone levels, or heart rate 

ordinal scale- applies when the numerals of a quantitative variable represent a ranked order 

interval scale- applies to the numerals of a quantitative variable that meet two conditions- equal intervals between levels and no true zero

ratio scale- applies when the numerals of a quantitative variable have equal intervals and when the value of 0 truly means "none" of the variable being measured

reliability- how consistent the results of a measure are 

validity- whether the operationalization is measuring what it is supposed to measure 

test-retest reliability- a study participant will get pretty much the same score each time they are measured with the same instrument

interrater reliability- consistent scores are obtained no matter who measures the variable 

internal reliability- a study participant gives a consistent pattern of answers, no matter how the researchers phrase the question 

correlation coefficient- indicates how close the dots, or points, on a scatterplot are to a line drawn through them

average inter-item correlation (AIC)- the average of the correlations between all pairs of items on a self-report scale

Cronbach's alpha- mathematically combines the AIC and the number of items in the scale; the closer it is to 1.0, the better the scale's reliability
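To connect these last three cards, here is a minimal sketch that correlates each pair of items, averages those correlations into an AIC, and plugs the AIC into the standardized form of Cronbach's alpha; the item responses are invented for illustration:

```python
import numpy as np

# Invented data: 5 participants answering 3 items on the same self-report scale.
items = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [1, 2, 1],
    [3, 3, 4],
])

# Correlation coefficients between every pair of items (columns).
corr = np.corrcoef(items, rowvar=False)

# Average inter-item correlation: the mean of the off-diagonal correlations.
k = items.shape[1]
aic = (corr.sum() - k) / (k * (k - 1))

# Standardized Cronbach's alpha combines the AIC with the number of items;
# values closer to 1.0 indicate better internal reliability.
alpha = (k * aic) / (1 + (k - 1) * aic)
print(round(aic, 2), round(alpha, 2))
```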

face validity- subjectively considered to be a plausible operationalization of the conceptual variable in question 

content validity- requires knowledge of the conceptual definition; a measure must capture all parts of a defined construct

criterion validity- evaluates whether the measure under consideration is associated with a concrete behavioral outcome that it should be associated with, according to the conceptual definition

physical trace measurements- a research method where researchers collect data by observing and analyzing physical evidence left behind by people's past behavior
