Producer of research- know about research methods and how to measure accurately
consumer of research- be able to read about research with curiosity and a critical eye
empiricism- answering psychological questions with direct, formal observations
theory- set of statements that describe the general principles about how variables relate to one another
hypothesis- a specific prediction about what will be observed
a theory leads to specific hypotheses about the answers
scientific norms
- universalism- scientific claims are evaluated according to their merit, and the same preestablished criteria apply to all scientists and all research
- communality- scientific knowledge is created by a community and its findings belong to the community
- disinterestedness- scientists try to discover the truth, whatever it is, and are not swayed by conviction, idealism, politics, or profit
- organized skepticism- questioning everything, including their own theories, widely accepted ideas, and "ancient wisdom"
Basic research- enhances the general body of knowledge rather than addressing a specific, practical problem
applied research- done with a practical problem in mind; the researchers conduct their work in a local, real-world context
translational research- the use of lessons from basic research to develop and test applications to health care, psychotherapy, or other forms of treatment and intervention
peer review cycle- a journal editor sends the paper to three or four experts on the subject
comparison groups- enable us to compare what would happen both with and without the thing we are interested in
probabilistic research- its findings do not explain all cases all of the time
availability heuristic- things that pop up easily in our mind tend to guide our thinking
present/present bias- reflects our failure to consider appropriate comparison groups
confirmation bias- only looking at information that agrees with what we want to believe
bias blind spot- being biased about being biased
empirical journal articles- report, for the first time, the results of an empirical research study
review journal articles- summarize and integrate all the published studies that have been done in one research area
meta-analysis- combines the results of many studies and gives a number that summarizes the magnitude, or effect size, of a relationship
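To make the combining step concrete, here is a minimal Python sketch of one common approach, fixed-effect inverse-variance weighting; the effect sizes and variances are invented illustration values, not from any real studies.

```python
# Sketch of a meta-analysis summary: combine several studies' effect
# sizes into one weighted average. All numbers below are hypothetical.
effect_sizes = [0.20, 0.50, 0.35]  # Cohen's d from three made-up studies
variances = [0.04, 0.02, 0.03]     # sampling variance of each d

# Inverse-variance weights: more precise studies count more.
weights = [1 / v for v in variances]
summary_d = sum(w * d for w, d in zip(weights, effect_sizes)) / sum(weights)
print(f"summary effect size: {summary_d:.3f}")
```

The summary always lands between the smallest and largest individual effect sizes, pulled toward the more precise studies.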
variable- something that varies
- must have at least two levels
constant- something that could potentially vary but has only one level in the study
measured variable- one whose levels are simply observed and recorded
manipulated variable- a variable a researcher controls, usually by assigning study participants to the different levels of that variable
claim- an argument someone is trying to make
frequency claims- describe a particular rate or degree of a single variable
association claims- argue that one level of a variable is likely to be associated with a particular level of another variable
correlate- when one variable changes, the other variable tends to change too
validity- the appropriateness of a conclusion or decision; in general, a valid claim is reasonable, accurate, and justifiable
construct validity- how well a conceptual variable is operationalized
generalizability- how did the researchers choose the study's participants, and how well do those participants represent the intended population
external validity- how well the results of a study generalize to, or represent, people or contexts besides those in the original study
statistical validity- statistical conclusion validity
- the extent to which a study's statistical conclusions are precise, reasonable, and replicable
point estimate- a single estimate from the sample, usually a percentage
confidence interval- a range designed to include the true population value a high proportion of the time
margin of error- calculated so that we know how close the estimate is likely to be to the true population frequency
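These three ideas fit together in one calculation; a minimal Python sketch with made-up poll numbers (a 40% point estimate from n = 1000 respondents), using the standard 95% formula for a proportion:

```python
import math

# Hypothetical frequency claim: 40% of 1000 respondents agreed.
p_hat = 0.40  # point estimate (sample proportion)
n = 1000      # sample size

# Standard error of a proportion; 1.96 is the z value for 95% confidence.
se = math.sqrt(p_hat * (1 - p_hat) / n)
margin_of_error = 1.96 * se

# The confidence interval is the point estimate plus/minus the margin.
ci_low, ci_high = p_hat - margin_of_error, p_hat + margin_of_error
print(f"point estimate: {p_hat:.2f}")
print(f"margin of error: +/- {margin_of_error:.3f}")
print(f"95% CI: ({ci_low:.3f}, {ci_high:.3f})")
```

With these numbers the margin of error is about plus or minus 3 percentage points, which is why large national polls usually report roughly that figure.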
correlational studies support association claims by measuring two variables
internal validity- a study's ability to eliminate alternative explanations for the association
criteria for establishing causation
- temporal precedence- the cause must occur before the effect
- covariance- the cause and the effect must be correlated
- internal validity- the ability to rule out alternative explanations for the observed relationship
historical examples of unethical research
- Tuskegee syphilis study- no informed consent, withholding effective treatment for syphilis, targeted a vulnerable population of African American men
- Milgram obedience studies- deception, participants not fully informed, lack of informed consent, psychological harm, pressure to continue, limited right to withdraw
belmont report principles of ethics
- principle of respect for persons- individuals potentially involved in research should be free to make up their own minds about whether they wish to participate in a research study, and must give informed consent. People who have less autonomy are entitled to special protection when it comes to informed consent
- principle of beneficence- do no harm; protect participants from harm and ensure their well-being
- principle of justice- a fair balance between the kinds of people who participate in research and the kinds of people who benefit from it; researchers consider the extent to which the participants involved in a study are representative of the kinds of people who would also benefit from its results
APA's five general principles- respect for persons, justice, beneficence, fidelity and responsibility, and integrity
fidelity and responsibility- establish relationships of trust; accept responsibility for professional behavior
integrity- strive to be accurate, truthful, and honest in one's role as researcher, teacher, or practitioner
deception
- through omission- withholding some details from participants
- through commission- actively lying to participants
debriefing- when researchers use deception, they must spend time after the study talking with each participant in a structured conversation, describing the nature of the deception and explaining why it was necessary
data fabrication- instead of recording what really happened in a study, researchers invent data that fit their hypotheses
data falsification- when researchers influence a study's results, perhaps by selectively deleting observations from a data set or by influencing their research subjects to act in the hypothesized way
questionable research practices
- scientists should report their own data objectively and make their data public (communality), even when the results do not support their hypotheses (disinterestedness)
IRB (Institutional Review Board)- protects human participants in research studies
IACUC (Institutional Animal Care and Use Committee)- protects animal subjects in research studies
self-report measure- operationalizes a variable by recording people's answers to questions about themselves in a questionnaire or interview
observational measure- operationalizes a variable by recording observable behaviors or physical traces of behaviors
physiological measure- operationalizes a variable by recording biological data, such as brain activity, hormone levels, or heart rate
ordinal scale- applies when the numerals of a quantitative variable represent a ranked order
interval scale- applies to the numerals of a quantitative variable that meet two conditions- equal intervals between levels, and no true zero
ratio scale- applies when the numerals of a quantitative variable have equal intervals and the value of 0 truly means "none" of the variable being measured
reliability- how consistent the results of a measure are
validity- whether the operationalization is measuring what it is supposed to measure
test-retest reliability- a study participant will get pretty much the same score each time they are measured with the instrument
interrater reliability- consistent scores are obtained no matter who measures the variable
internal reliability- a study participant gives a consistent pattern of answers, no matter how the researchers phrase the questions
correlation coefficient- indicates how close the dots, or points, on a scatterplot are to a line drawn through them
average inter-item correlation (AIC)- the average of the correlations between every pair of items on a scale
cronbach's alpha- mathematically combines the AIC and the number of items in the scale; the closer to 1.0, the better the reliability
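The AIC and the standardized form of Cronbach's alpha can be computed directly from item scores; a minimal Python sketch, using invented ratings from six hypothetical participants on a three-item scale:

```python
import math
import statistics

# Hypothetical responses from six participants on a three-item scale.
item1 = [5, 4, 4, 2, 3, 1]
item2 = [4, 5, 3, 2, 3, 2]
item3 = [5, 4, 4, 1, 2, 2]

def pearson_r(x, y):
    """Pearson correlation coefficient between two lists of scores."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Average inter-item correlation: mean r over every pair of items.
pairs = [(item1, item2), (item1, item3), (item2, item3)]
aic = statistics.mean(pearson_r(x, y) for x, y in pairs)

# Standardized Cronbach's alpha combines the AIC with the number of items k.
k = 3
alpha = (k * aic) / (1 + (k - 1) * aic)
print(f"average inter-item correlation: {aic:.3f}")
print(f"Cronbach's alpha: {alpha:.3f}")
```

Note how alpha rises both when the items correlate more strongly with one another and when the scale has more items.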
face validity- the measure is subjectively considered to be a plausible operationalization of the conceptual variable in question
content validity- requires knowledge of the conceptual definition- a measure must capture all parts of a defined construct
criterion validity- evaluates whether the measure under consideration is associated with a concrete behavioral outcome that it should be associated with, according to the conceptual definition
physical trace measurements- a research method where researchers collect data by observing and analyzing physical evidence left behind by people's past behavior