Psychology & Statistics 221 Ultimate Review

1
New cards

Describe the five methods of knowing and their benefits and problems.

  1. Intuition (quick but potentially misleading decisions)

  2. Authority (allows us to learn from authority figures, but authority figures can be wrong/biased)

  3. Rationalism (using logic can help us come to a conclusion or invalidate our findings)

  4. Empiricism (our experiences have limitations, but the scientific method relies on observation)

  5. Scientific method (systematically combines the other methods while compensating for their weaknesses)

2
New cards

What is the definition of science?

The general approach to understanding the natural world

3
New cards

What are the fundamental features of science?

  1. Systematic empiricism

  2. Empirical questions

  3. Public knowledge (“standing on the shoulders of past scientists”)

4
New cards

What are the three goals of science?

  1. To Describe

    • Achieved via careful observation and examination of relationships

  2. To Predict

    • When two events reliably co-occur, knowledge of one can be used to predict the other

  3. To Explain

    • To determine the causes

5
New cards

What is the difference between basic and applied research?

Basic research refers to research aimed at achieving a fundamental understanding of human behavior

  • “A basic understanding”

Applied research refers to research that addresses practical problems

  • “You can apply it to real life”

6
New cards

Why can’t we rely on our common sense? Examples?

Humans readily cling to their own beliefs and are influenced by authority figures, so we cannot always trust our own judgment. Examples:

  1. Confirmation bias

  2. Hindsight bias

7
New cards

A person’s tendency to seek out and believe information that confirms their existing beliefs can be referred to as _____________ bias.

Confirmation bias

8
New cards

The tendency, after learning an outcome, to feel that we “knew it all along” can be referred to as _______ bias?

Hindsight bias

9
New cards

What importance did the Belmont Report of 1979 have in moral principles?

Established the importance of respect for persons (autonomy), beneficence (maximizing benefits, minimizing harm), and justice (equal treatment)

10
New cards

According to the ethical codes, what are threats to autonomy?

Coercion, undue inducements (excessive benefits), deception, and including participants who are incapable of making informed decisions

11
New cards

What were examples of historical written codes used to help provide guidance on ethical issues?

  1. Nuremberg Code - a set of 10 ethical principles written after the Nazi mistreatment of human subjects

  2. Declaration of Helsinki - research with human participants must be based on a written protocol

  3. Belmont Report - importance of justice, beneficence, and autonomy

  4. Federal Policy for the Protection of Human Subjects

  5. IRB (Institutional Review Board) - decides whether the benefits outweigh the risks

12
New cards

What are the characteristics of the APA Ethics Code?

  1. Research must do NO HARM

  2. Institutional Approval

  3. Privacy & Confidentiality

  4. Competence

  5. Record Keeping

  6. Informed consent

  7. Avoid Inducements

  8. Deception

  9. Debriefing

13
New cards

In what circumstances are researchers able to dispense with informed consent?

If no harm occurs to participants (e.g., naturalistic observation), or for anonymous questionnaires

14
New cards

When is deception justified for research use?

If no alternatives are available and there is no harm to participants; at the end of the study, the researcher must debrief participants on the purpose of the study

15
New cards

How do we come up with research questions?

Common sense, past research, theories, observations, and practical problems

16
New cards

When searching for a research topic, what should a person search for?

a. Sources that help refine the research question

b. Identify possible methods

c. Add context for the study - has there been previous literature on the topic?

d. Focus on recent studies

e. Find 50 reliable sources

17
New cards

True or False: A research question should be empirically testable, should reflect an interest, a practical implication, or a gap in knowledge, and should be feasible (replicable)

True!

18
New cards

True or False: A theory refers to a specific prediction about a new phenomenon

False! A hypothesis refers to a specific prediction about a new phenomenon, while a theory refers to a systematic body of ideas about a particular topic that helps to generate new ideas

19
New cards

A good hypothesis includes the following characteristics: _______

  1. Testable

  2. Falsifiable

  3. Logical

  4. Positive

20
New cards

Contrast Inductive and Deductive reasoning?

Inductive reasoning refers to reasoning from specific observations to general conclusions

Deductive reasoning refers to reasoning from previous knowledge or general premises to specific conclusions

21
New cards

A quantity or quality that varies across people and situations refers to a ______

Variable

22
New cards

A variable that takes on a finite number of categories refers to a __________ variable. The nominal and _____ scale of measurements are included.

Categorical ; Ordinal

23
New cards

A variable that can take an infinite number of values between any two values refers to a _______ variable. The scales of measurement include ratio and ______ scales.

Continuous ; Interval

24
New cards

True or False: A definition of the variable in terms of precisely how it is measured refers to a psychological definition.

False! A psychological definition refers to the conceptual definition of a variable in theoretical terms (e.g., anger or happiness). The answer would be an operational definition!

25
New cards

When designing a research study, what should a researcher consider in planning the study?

  1. Identifying and defining the variables and providing an operational definition for each

  2. Sampling & measurements

  3. Experimental or non-experimental research design

  4. Determining the location of the study (lab vs. field)

26
New cards

What makes an experimental design different from a non-experimental design?

An experimental design involves manipulation of variables, controlling for confounds, and examining an outcome, while non-experimental research solely measures variables

27
New cards

What type of validity should a researcher consider when performing laboratory research?

Consider the study's internal validity - can we infer a causal relationship between variables, or are there third (confounding) variables explaining the relationship?

In lab settings: Internal validity tends to increase

28
New cards

What type of validity should a researcher consider when performing field research?

External validity - Are they able to generalize their findings to other settings?

In field settings: External validity tends to increase

29
New cards

In order to analyze collection of data the following statistics are used _______ and ________

Descriptive and inferential statistics

30
New cards

What is a measurement?

Assignment of scores to individuals so that the scores represent some characteristic of the individuals

31
New cards

What are the four levels of measurements?

Nominal, Ordinal, Interval, and Ratio

32
New cards

A measure’s ability to yield consistent results each time it is applied refers to ________

Reliability

33
New cards

Errors occur in all experiments; what are the three types of errors that could occur?

  1. Measurement error - the error naturally present in any measurement

  2. Random error - variation around the true score due to chance or unforeseen circumstances

  3. Systematic error - error that consistently pushes scores higher or lower (random vs. systematic error are contrasted in the sketch below)
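
As a quick illustration of the last two error types, here is a minimal simulation sketch (entirely hypothetical values, not from the course materials): random error averages out across many measurements, while systematic error shifts every score in the same direction.

```python
# Hypothetical simulation contrasting random and systematic error.
import numpy as np

rng = np.random.default_rng(0)
true_scores = np.full(1000, 50.0)            # everyone's true score is 50

random_error = rng.normal(0, 5, size=1000)   # mean-zero noise: cancels out on average
systematic_error = 3.0                       # constant upward bias

observed_random = true_scores + random_error
observed_systematic = true_scores + systematic_error

print(observed_random.mean())       # ~50: random error washes out in the mean
print(observed_systematic.mean())   # 53: systematic error shifts every score
```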

34
New cards

Sally entered a room filled with many children her age, where they would be asked questions about the bullying occurring in their schools. Sally was faced with the researcher asking her the questions and did not want to appear as a snitch to her classmates, so she answered the questions as if she had never been aware of the situation. What type of bias is presented?

Participant bias

35
New cards

Sally was given a survey to complete regarding her personality that would be shared back to her teacher. Sally wanted to appear likable and smart in front of her teacher, so she decided to pick answers based on those characteristics. What type of bias would result in this survey having systematic error?

Social desirability bias

36
New cards

What are ways of measuring the reliability in methods?

Test-retest (over time), alternative forms, split-half, internal consistency (across items), and inter-rater (across different raters)

37
New cards

If the construct is assumed to be stable over time, then the scores collected over time should be consistent; we refer to this as _______ reliability.

Test-retest reliability

38
New cards

If a researcher is assessing multiple items intended to measure the same construct but the scores vary across the items, what type of reliability should we be concerned about?

Internal Consistency

39
New cards

What is the definition of interrater reliability?

The agreement (consistency) there is between raters of the same behavior

40
New cards

What statistics can be used to measure interrater reliability?

Quantitative ratings: Cronbach's alpha

Categorical (qualitative) ratings: Cohen's kappa
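
As a hedged illustration (hypothetical ratings, and assuming scikit-learn is available), Cohen's kappa can be computed like this; it corrects the raw percentage of agreement for the agreement expected by chance.

```python
# Hypothetical example: two raters code the same 10 observed behaviors.
from sklearn.metrics import cohen_kappa_score

rater_a = ["aggressive", "prosocial", "neutral", "aggressive", "neutral",
           "prosocial", "prosocial", "neutral", "aggressive", "neutral"]
rater_b = ["aggressive", "prosocial", "neutral", "neutral", "neutral",
           "prosocial", "aggressive", "neutral", "aggressive", "neutral"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(round(kappa, 2))   # ~0.69 here; 1.0 = perfect agreement, 0 = chance-level agreement
```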

41
New cards

What is the definition of Cronbach's alpha?

A commonly used statistic for evaluating the internal consistency of a measure; it reflects how well the item scores hang together as measures of the same underlying construct
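
For a concrete (made-up) example, Cronbach's alpha can be computed directly from its standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores):

```python
# Hypothetical item scores: rows = respondents, columns = items on the same scale.
import numpy as np

scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])

k = scores.shape[1]
item_variances = scores.var(axis=0, ddof=1)       # variance of each item
total_variance = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores

alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(round(alpha, 2))   # ~0.94 here; about .80 or higher is usually taken as good internal consistency
```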

42
New cards

What are four ways researchers are able to increase the reliability of their study?

  1. Blind or double blind studies

  2. Standardized procedures and measures

  3. Increase the number of items in a survey or questionnaire

  4. Make it more difficult for participants to guess your hypothesis

43
New cards

The extent an empirical measure accurately reflects the true meaning of the content under consideration refers to ______

Validity

44
New cards

The degree to which a measure is measuring the construct that it claims to measure refers to ______ validity?

Construct validity

45
New cards

The extent to which scores on the measure in question are related to scores on other measures of the same or similar constructs refers to ______ validity.

Convergent validity

46
New cards

What is the definition of divergent validity?

The extent to which a measure is unrelated to measures of constructs it should not be related to

47
New cards

The extent to which a method appears “on its face” to measure the construct of interest refers to _____ validity?

Face validity

48
New cards

What is the definition of content validity?

The extent to which a method includes items to cover all aspects of the construct

49
New cards

The extent to which people's scores on a measure are correlated with other variables that one would expect them to be correlated with refers to ____ validity?

Criterion validity

50
New cards

The extent to which a score on a test predicts scores on some criterion measure refers to _____ validity?

Predictive validity

51
New cards

To conceptually define a construct, one must ______, while to operationally define a construct, we use an _______.

Depend on literature review ; existing measure

52
New cards

How can we implement good measurement techniques?

  1. Keep study anonymous

  2. Minimize socially desirable responding

  3. Standardize procedure

  4. Avoid group processes

53
New cards

True or False: Non-experimental research can infer causal relationships

False! It cannot establish causal relationships because there is no manipulation of variables

54
New cards

When is non-experimental research used?

When the nature of the research question, feasibility constraints, or ethical concerns rule out manipulation

55
New cards

Correlational and observational studies are part of _______ research?

Non-experimental

56
New cards

Non-experimental research that measures two (or more) variables and the statistical relationship between them, with little to no effort to control extraneous variables, refers to _______ research.

Correlational research

57
New cards

Why are researchers unable to imply causation in correlational studies?

Due to the third-variable problem, the directionality problem (we aren't sure whether X causes Y or vice versa), and lack of temporal precedence (which variable came first)

58
New cards

What are spurious correlations?

A correlation between two variables that does not reflect a real relationship between them, arising instead from coincidence or a third variable (like a false surprise)

59
New cards

How are correlations measured?

The use of scatterplots helps to determine the relationship, direction, and strength (Pearson's r) of a correlation

60
New cards

How can we interpret the relationship of a correlation using Pearson's r?

Analyze the spread and the r-value (see the Pearson's r sketch below)

  • Spread allows us to determine how far values fall from the line of best fit

  • The r-value gives us information on the strength of the relationship and the direction of the data (-/+)
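
As a minimal sketch with hypothetical data (assuming NumPy and SciPy are available), Pearson's r can be computed like this; the sign gives the direction and the magnitude gives the strength of the linear relationship.

```python
# Hypothetical data: hours studied vs. exam score.
import numpy as np
from scipy.stats import pearsonr

hours_studied = np.array([1, 2, 3, 4, 5, 6, 7, 8])
exam_score = np.array([55, 60, 58, 65, 70, 72, 78, 85])

r, p_value = pearsonr(hours_studied, exam_score)
print(round(r, 2))   # close to +1: a strong positive linear relationship
```

Plotting the same two variables in a scatterplot would show the spread of the points around the underlying trend.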

61
New cards

What are features of the correlation coefficient (r) that we must consider in interpreting correlations?

  1. The correlation coefficient (r) describes ONLY linear relationships

  2. The value of r will be influenced by outliers skewing the data (demonstrated in the sketch below)

  3. Always calculate r for each cluster of data (Simpson's paradox)

  4. The correlation coefficient will be affected by a restricted range of scores
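
A small demonstration of point 2 above, using made-up data: a single extreme outlier can substantially change the value of r.

```python
# Hypothetical data that follow a nearly perfect linear trend.
import numpy as np

x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
y = np.array([2.5, 3.7, 6.2, 7.6, 10.1, 12.3, 13.8, 16.4, 17.9, 20.2])
print(round(np.corrcoef(x, y)[0, 1], 2))   # ~1.0: nearly perfect positive correlation

# Add one point far below the trend and r drops sharply.
x_out = np.append(x, 11.0)
y_out = np.append(y, 2.0)
print(round(np.corrcoef(x_out, y_out)[0, 1], 2))   # ~0.5
```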

62
New cards

What is the purpose of a linear regression?

Allows us to determine how one variable (predictor) predicts the criterion variable

63
New cards

What is the line of best fit?

The line that minimizes the residuals, i.e., the error between each data point and the value the line predicts
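
As a minimal sketch with hypothetical data, the least-squares line of best fit and its residuals can be computed like this (np.polyfit with degree 1 returns the slope and intercept that minimize the squared residuals):

```python
# Hypothetical data: predictor (e.g., hours studied) and criterion (e.g., exam score).
import numpy as np

predictor = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
criterion = np.array([55, 60, 58, 65, 70, 72, 78, 85], dtype=float)

slope, intercept = np.polyfit(predictor, criterion, 1)   # least-squares fit
predicted = slope * predictor + intercept
residuals = criterion - predicted                        # errors the best-fit line minimizes

print(round(slope, 2), round(intercept, 2))
print(round((residuals ** 2).sum(), 2))                  # sum of squared residuals
```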

64
New cards

What is qualitative research?

Research that begins with a less focused research question, collects a large amount of relatively unfiltered data from a small number of people, and is less concerned with drawing firm conclusions because the data are raw and unstructured

65
New cards

How do we collect data for qualitative research?

Interviews and focus groups

66
New cards

How do we provide a data analysis for qualitative research?

Grounded theory, theoretical narrative, and themes

67
New cards

What is qualitative-quantitative (mixed-methods) research?

A combination of qualitative and quantitative research within a study, used to determine whether the different methods reach the same result (triangulation)

68
New cards

What is observational research?

Systematically observe and record behavior

69
New cards

What is naturalistic observation?

Observing behavior in the natural environment in which it typically occurs

70
New cards

What is participant observation?

When the researcher becomes an active participant of the group they are observing

71
New cards

What is structured observation?

Observation that uses a coding system to record specific, predefined behaviors of interest

72
New cards

What is a case study?

An in-depth examination of a single individual, group, or event

73
New cards

What is archival research? How can it be used to do content analysis?

Research that uses previously compiled data or records to answer a research question

Content analysis can be used to code archival material (e.g., texts or records) into quantifiable data

74
New cards

What is secondary data?

Previously collected data that are re-analyzed to address new research questions

75
New cards

What type of research measures variables of interest using self-report, conducted either in person, by mail, or over the internet?

Survey Research

76
New cards

What should we consider when planning a survey research?

State the research question and hypothesis in advance (to avoid p-hacking), ensure the questions have good construct validity, and obtain an unbiased sample

77
New cards

What is the cognitive model in which individuals engage with when taking a survey?

The model describes the process by which respondents 1) interpret the question, 2) retrieve the necessary information from memory, 3) form a judgment, 4) transform the judgment into a response, and 5) edit their response before reporting it

78
New cards

What is the definition of context effects?

Unintended influences on respondents' answers that are unrelated to the content of the item but arise from the context in which the item appears

79
New cards

What's an example of a context effect?

Item-order effect, where the order in which the items are presented affects people's responses

80
New cards

What types of questions can be found in a survey, and what are their advantages and disadvantages?

  1. Open-ended questions

    • AD: Participants can answer freely in their own words

    • DIS: Take time to answer (so they are likely to be skipped) and to analyze

  2. Closed-ended questions

    • AD: Quick to answer and easier to analyze and quantify

    • DIS: Difficult to write (e.g., need a limited set of options that does not bias participants)

81
New cards

When writing survey items how can we minimize unintended context effect and maximize the reliability and validity of participants responses?

BRUSO

B - “Brief” item is straight to the point

R- “Relevant” items are relevant to RQ

U - “Unambiguous” items can be interpreted in only one way

S - “Specific” items are clear and understandable

O - “Objective” items don’t reveal researchers own opinions or lead participant in a particular matter

82
New cards

How should a survey be properly formatted?

A survey should include an introduction inviting the person to participate (stating the purpose, information about the institution, and the importance of the study), followed by a consent form, instructions, the questions, and a debriefing (if deception was used).

83
New cards

What is probability sampling?

The researcher can specify the probability that each member of the population will be selected for the sample

84
New cards

What is non-probability sampling?

The researcher cannot specify the probability that each member of the population will be selected for the sample

85
New cards

Examples of non-probability sampling?

Convenience sampling and snowball sampling

86
New cards

What is convenience sampling?

A sample consisting of individuals who are easy to reach and willing to provide data

87
New cards

What is snowball sampling?

Sampling in which existing research participants help recruit additional participants for the study

88
New cards

What is quota sampling?

Sampling in which subgroups in the sample are recruited to be proportional to those subgroups in the population

89
New cards

What is self-selection sampling?

Sampling in which individuals willingly take part in the research of their own accord, without the researcher recruiting them directly

90
New cards

What is purposive sampling?

Sampling based on criteria set by the researcher

91
New cards

What are examples of probability sampling?

Simple random sample, stratified random sampling, proportionate stratified random sampling, disproportionate stratified random sampling, cluster sampling

92
New cards

What is simple random sampling?

Sampling in which each individual in the population has an equal probability of being selected for the sample

93
New cards

What is stratified random sampling?

Sampling in which the population is divided into subgroups (strata) and then a random sample is taken from each strata
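
A minimal sketch contrasting the last two cards, using a hypothetical population of 1,000 people split into three strata:

```python
# Hypothetical population and strata (year in school).
import random

population = [f"person_{i}" for i in range(1000)]
strata = {
    "first_year": population[:400],
    "second_year": population[400:700],
    "third_year": population[700:],
}

random.seed(1)

# Simple random sample: every individual has an equal chance of selection.
simple_sample = random.sample(population, 50)

# Stratified random sample: draw a separate random sample from each stratum.
stratified_sample = []
for name, stratum in strata.items():
    stratified_sample.extend(random.sample(stratum, 10))

# For a proportionate stratified sample of 50, the per-stratum sizes would instead
# match each stratum's share of the population (20, 15, and 15 here).
print(len(simple_sample), len(stratified_sample))   # 50 and 30
```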

94
New cards

What is proportionate stratified random sampling?

Sample in which the proportion of respondents in each of various subgroups matches the proportion in the population

95
New cards

What is disproportionate stratified random sampling?

Used to sample extra respondents from specific small subgroups

96
New cards

What is cluster sampling?

Sampling in which naturally occurring clusters in the population are selected and then individuals within the selected clusters are randomly sampled

97
New cards

How large should a survey sample be?

The larger the sample, the closer the sample statistic will tend to be to the true population value
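
A small simulation sketch of this idea, with assumed population values (mean 100, SD 15): the typical distance between a sample mean and the true population mean shrinks as the sample size grows.

```python
# Hypothetical population; repeatedly draw samples of different sizes.
import numpy as np

rng = np.random.default_rng(42)
population = rng.normal(loc=100, scale=15, size=100_000)   # true mean = 100

for n in (10, 100, 1000):
    sample_means = [rng.choice(population, size=n, replace=False).mean()
                    for _ in range(200)]
    # Spread of the sample means around 100 (the standard error) shrinks as n grows.
    print(n, round(np.std(sample_means), 2))
```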

98
New cards

What is sampling bias?

Occurs when a sample is selected in such a way that it is not representative of the entire population and therefore produces inaccurate results

99
New cards

What is non-response bias?

Occurs when there is a systematic difference between those who respond to a survey and those who do not

100
New cards

How can we reduce non-response bias?

Attempt a different technique to acquire more responses: in-person interviews > telephone surveys > mail/internet surveys