Chapter 2: Types of Data

46 Terms

1. Mail Surveys

Advantages:

  • Very inexpensive; only paying for postage

  • Gives respondents time to look up information before answering

Disadvantages:

  • People may throw away the survey; low response rate

2. Telephone Surveys

Advantages

  • Cheaper: only paying for the interviewer's time

  • Quick Responses  

  • Easier to monitor  

Disadvantages

  • Shorter interviews

  • Low response rate (around 5%); easy for people to say they're not interested

  • Not everyone has a telephone

3. Face-to-Face Surveys

Advantages

  • Longer interviews; respondents can't easily end the interview

  • Higher response rate (60%)  

  • Better rapport  

  • Nonverbal information; can learn more about the person based on what's in their home, body language  

  • Props: can show things like pictures and maps  

Disadvantages

  • Expensive

  • Time-consuming

4. Group-Administered Surveys

Surveys given to a group at the same time.

Advantages

  • Can get a lot of responses at once  

  • Cost-Effective

Disadvantages  

  • May limit who you can ask

  • Group influence may affect responses

Example: A professor hands out a questionnaire to all students in a lecture hall to evaluate the course.

5. CATI (Computer-Assisted Telephone Interviewing)

Telephone surveys where the interviewer follows a computer-guided script

  • Advantages 

    • Enter data as you receive it

    • Organization flows well because of the use of systems/computers  

    • 90% of telephone surveys are done this way  

    • Can have complex surveys  

    • Skip patterns: the next question depends on the response to the previous one

      • e.g., "Are you married?" If the study is about married couples, a "no" answer skips the remaining marriage questions

  • Disadvantages  

    • Costs money

      • Cost of software and computers
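The skip-pattern logic described above can be sketched as a small branching function. This is a hypothetical illustration, not tied to any real CATI software:

```python
# Hypothetical skip pattern for a study of married couples:
# a "no" answer skips past the follow-up questions about marriage.
def next_question(is_married: bool) -> str:
    if not is_married:
        return "end_of_survey"   # skip: marriage questions are irrelevant
    return "years_married"       # follow-up shown only to married respondents

print(next_question(True))   # years_married
print(next_question(False))  # end_of_survey
```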

6. CAPI (Computer-Assisted Personal Interviewing)

Face-to-face interviews where the interviewer enters responses into a computer.

  • Advantages  

    • Respondents can enter data themselves

    • People are much more likely to tell the truth in these surveys

  • Disadvantages

    • Expensive

Example: A census worker uses a tablet to record household information during door-to-door interviews

7. Exit Polls

A survey given to voters just after they vote.

Advantages

  • Done around elections; used to project who is likely to win 

  • Gives a breakdown of the voter demographics  

  • Large data sets  

Disadvantages

  • Sample bias (not all voters participate)

8. Web Surveys

Surveys conducted online through websites, emails, or social media.

Advantages

  • Fast data collection

  • Low cost

  • Easy to analyze

Disadvantages

  • Excludes those who don’t use the internet

  • Potential for low usable response rates

Example: A beauty brand emails customers a link to a digital survey asking for feedback on a new product

9. Phraseology

The way a question is worded can influence responses.

  • Has a big impact on the results/answers  

  • Can introduce bias or push people toward the one answer you want them to choose

10. Question Order

The sequence of questions affects how respondents answer.

  • Typically, sensitive and controversial questions are placed at the end

    • Asking easier questions first helps build rapport with the respondent

Example: Asking about political affiliation before asking about policy opinions may influence responses.

11. Filter Questions

Questions used to determine if a respondent should answer a following question.

Example: "Have you ever purchased skincare products? (Yes/No)" → If "Yes," proceed to questions about brand preferences

12. Open-Ended Questions

Respondents answer in their own words, with no preset options.

  • Leads to many different responses

    • e.g., "What's the biggest problem in this country today?"

  • Responses have to be coded

  • You don't know in advance how people will respond; allows them to give an opinion you may not have expected

  • Allows for more depth, but harder to analyze

13. Closed-Ended Questions

Respondents select from predefined options.

  • Respondents have little choice in how they answer

Example: Likert Scale - How satisfied are you? (Very Satisfied, Satisfied, Neutral, Dissatisfied, Very Dissatisfied)

  • Easier to analyze but is more restrictive

14. Non-Response Rates

The percentage of people who do not respond to a survey

  • Many people no longer trust surveyors enough to answer calls or do face-to-face interviews

  • It takes more work to achieve good response rates, particularly for face-to-face surveys

15. Race of Interviewer

Respondents may change their answers based on the race of the interviewer.

16. Effect of Reference Point

A respondent's answers may be influenced by prior information or context provided in the survey.

  • Responses are likely to shift depending on the reference point the question provides

  • Example: "How are public schools doing in the nation?" vs. "How are your public schools doing?"

    • Can lead to biased responses based on prior questions or external comparisons.

17. Coding

The process of categorizing topics for analysis

18. Occupation Coding

Difficult due to broad job categories and classification issues.

19. Race Coding

Racial Categories

  • Example: The U.S. Census has changed racial categories since 1790, influenced by politics 

  • Challenges:

    • Changing racial identities, especially with genetic testing leading to multiracial claims.

    • Who assigns race can affect how individuals are classified at birth and death

20. Event Data

Recording the number of times a specified event has taken place  

Example: A political scientist analyzing global protests by using news reports and government records.

21. Textual Data

Information derived from written or spoken text, such as news articles, interviews, or social media posts

  • Speeches, written documents, seating arrangements  

Example: A researcher studying public opinion by analyzing tweets about a presidential debate

22. Content Analysis

The process of interpreting and coding textual material to identify patterns and themes.

  • Evaluate texts like documents or speeches

  • Example: Analyzing political speeches to count how often certain words, like "freedom" or "justice," are used.

23. Experimental Data

Data collected from controlled experiments where variables are manipulated.

24. Experimental Group

Exposed to the effect being tested

25. Control Group

Not exposed to the effect being tested 

26. Placebo Effect

Psychological effect where a person experiences changes simply because they believe they are receiving treatment

Example: A patient given a sugar pill (placebo) reports pain relief, even though the pill has no active ingredients.

  • To avoid this, studies should be blind

27. Paired Testing

A method where two individuals match on every characteristic except one key difference.

  • Used to test for discrimination or bias.

  • Example: Two people try to buy a house, and the only difference between them is race.

28. Non-Obtrusive Measures

Ways of collecting data without directly interacting with subjects, reducing bias

Example: Analyzing foot traffic in a store using security camera footage instead of customer surveys

29. Focus Groups

A small-group discussion led by a researcher.  

  • People are asked questions in a group, which allows for in-depth discussion and lets the researcher observe how subjects interact.

Example: First used in a political campaign with the Willie Horton ad.

30. Multi-mode Research (Triangulation)

Using more than one research strategy/approach  

Example: A researcher studying climate change using satellite images, historical temperature data, and interviews with scientists

31. Split-Ballots

A method where different versions of a survey question are given to different groups to test wording effects.

Why are they used:

  • Identify Bias: See if wording influences responses (e.g., "welfare" vs. "government assistance").

  • Improve Accuracy: Select the best phrasing to reduce misinterpretation.

  • Compare Responses: Test different question structures to find the most reliable way to measure public opinion.

32. Pre-Tests

A trial run of a survey

Why are they used:

  • Conducted before the actual study to identify and fix issues.

  • Example: Researchers distribute a draft questionnaire to a small group to check for unclear wording or confusion before launching a nationwide survey

33. Sampling

A smaller group selected from the population to represent it in a study.

34. Population

The entire group that a study aims to analyze.
🔹 Example: All registered voters in the U.S. during an election poll

35. Random Sample

A sample where every individual in the population has an equal chance of being selected.
🔹 Example: Drawing names from a hat to select participants for a research study.
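The same idea can be sketched with Python's standard-library `random.sample`, which draws without replacement so every member has an equal chance; the 100-voter population is invented for illustration:

```python
import random

population = list(range(1, 101))        # e.g., 100 voters, numbered 1-100
sample = random.sample(population, 10)  # each voter has an equal chance

print(len(sample))                      # 10
print(len(set(sample)) == 10)           # True: no one is drawn twice
```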

36. Stratified Random Sample

Population is divided into subgroups (strata), and a random sample is taken from each one.

Example: A researcher surveys 100 students, ensuring equal representation of freshmen, sophomores, juniors, and seniors
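The class-year example above can be sketched in Python; the strata sizes and the 25-per-stratum quota are made-up numbers:

```python
import random

# Hypothetical student body, divided into strata by class year
strata = {
    "freshman":  [f"F{i}" for i in range(200)],
    "sophomore": [f"S{i}" for i in range(180)],
    "junior":    [f"J{i}" for i in range(160)],
    "senior":    [f"R{i}" for i in range(150)],
}

# Draw a random sample of 25 from each stratum (25 x 4 = 100 students)
sample = {year: random.sample(students, 25)
          for year, students in strata.items()}

print(sum(len(picks) for picks in sample.values()))  # 100
```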

37. Cluster Sample

The population is divided into groups (clusters), and entire clusters are randomly selected.

  • Often used with face-to-face surveys

  • Used to save money

  • Example: Conducting 1,500 surveys is expensive  

    • Instead going to 300 random locations (clusters)  

      • And selecting 5 households in each cluster  

        • 300 x 5 = 1,500  
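The 300 x 5 arithmetic above can be sketched in Python; the number of candidate locations and households per location are invented for illustration:

```python
import random

# Hypothetical sampling frame: 2,000 possible locations, 40 households each
locations = [f"loc{i}" for i in range(2000)]
clusters = random.sample(locations, 300)  # randomly select 300 clusters

surveyed = [
    f"{loc}-hh{j}"
    for loc in clusters
    for j in random.sample(range(40), 5)  # 5 random households per cluster
]

print(len(surveyed))  # 1500 = 300 x 5
```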

38. Snowball Sampling

Participants recruit other participants, often used for hard-to-reach populations.

39. Disproportionate Sampling

Some groups in the population are intentionally oversampled/undersampled to ensure enough representation

Example: A survey on racial attitudes might oversample Black respondents to ensure enough data for analysis.

40. Quota Sample

A non-probability sample in which interviewers fill set quotas for subgroups (e.g., a fixed number of men and women) rather than selecting respondents at random.

41. Convenience Sample

A sample selected based on ease of access rather than random selection.

Example: A professor surveys students in their own class instead of the entire university.

  • may not represent the larger population, leading to bias

42. Non-probability Sample

A sampling method where not everyone in the population has an equal chance of being selected

  • Examples: quota and convenience samples

43. Weighting Data

Adjusting survey results to make the sample more representative of the actual population.

Example: If a poll has too few young voters, researchers give their responses more weight to match real-world demographics.
🔹 Why It Matters: Helps correct imbalances in survey samples and improve accuracy.
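The young-voter example can be made concrete with a small sketch; all the sample sizes, population shares, and support figures below are invented:

```python
# Hypothetical poll: young voters are 30% of the electorate
# but only 15% of the sample, so their answers are up-weighted.
sample_n  = {"young": 15, "older": 85}       # respondents per group
support   = {"young": 0.60, "older": 0.40}   # share backing the candidate
pop_share = {"young": 0.30, "older": 0.70}   # true population shares

n = sum(sample_n.values())
# weight = population share / sample share
weights = {g: pop_share[g] / (sample_n[g] / n) for g in sample_n}

weighted = sum(sample_n[g] * weights[g] * support[g] for g in sample_n)
weighted /= sum(sample_n[g] * weights[g] for g in sample_n)

print(round(weighted, 2))  # 0.46 (the unweighted figure would be 0.43)
```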

44. Sampling Error

Error that occurs even with a pure random sample

  • Error due to the luck of the draw

  • The sample may differ from the population purely by chance

Example: A poll predicts a candidate has 52% support, but the true percentage is 50% due to a ±2% margin of error

45. Non-Sampling Error

Errors caused by survey design, question wording, or respondent behavior, not by random sampling.

  • bias in the way the sample was drawn 

  • Sample may be systematically different from the population

  • Example: A survey asking about income may have biased responses because people underreport their earnings.

46. Margin of Error

A statistical estimate of how much survey results might differ from the true population value.
🔹 Issue: A bogus concept that can give too much certainty to an uncertain process.
🔹 Example: A poll shows a candidate at 52% with a ±3% margin of error, but real-world inaccuracies (e.g., biased sampling, question wording) can make the results less reliable than they seem.
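The card doesn't spell out the formula, but the standard 95% margin of error for a proportion is MOE = 1.96 * sqrt(p(1-p)/n), which shows where the familiar ±3 points comes from:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a sample proportion p with n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of about 1,067 respondents yields roughly +/-3 points at p = 0.5
print(round(margin_of_error(0.5, 1067) * 100, 1))  # 3.0
```

Note that this only quantifies sampling error; the non-sampling problems the card mentions (biased sampling, question wording) are not captured by the formula.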