Mail Surveys
Advantages:
Very inexpensive; only paying for postage
Allows respondents time to do research before answering the survey
Disadvantages:
People may throw away the survey; low response rate
Telephone Surveys
Advantages
Cheaper: only paying for time with the interviewer
Quick Responses
Easier to monitor
Disadvantages
Shorter
High nonresponse (response rates around 5%); easy to say you're not interested
Not everyone has a telephone
Face-to-Face Surveys
Advantages
Longer surveys possible; respondents can't easily cut the interview short
Higher response rate (60%)
Better rapport
Nonverbal information; can learn more about the person based on what's in their home, body language
Props: can show things like pictures and maps
Disadvantages
expensive
time-consuming
Group-Administered Surveys
Surveys given to a group at the same time.
Advantages
Can get a lot of responses at once
Cost-Effective
Disadvantages
May limit who you can ask
Group influence may affect responses
Example: A professor hands out a questionnaire to all students in a lecture hall to evaluate the course.
CATI (Computer-Assisted Telephone Interviewing)
Telephone surveys where the interviewer follows a computer-guided script
Advantages
Data can be entered as it is collected
Well-organized flow because the computer manages the survey
90% of telephone surveys are done this way
Can have complex surveys
Skip patterns: based on the response to the previous question
Example: "Are you married?" If you're researching married couples, a "No" answer triggers a skip pattern past the marriage questions
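A skip pattern can be sketched in a few lines of Python; this is a hypothetical illustration (question IDs and wording are invented), not actual CATI software:

```python
# Hypothetical sketch of a CATI skip pattern: which question comes next
# depends on the answer to the current one. IDs and wording are invented.

def next_question(current_id, answer):
    """Return the ID of the next question to ask."""
    if current_id == "Q1_married":
        # Researching married couples: "no" skips the marriage questions
        return "Q2_spouse_age" if answer == "yes" else "Q9_demographics"
    return "Q9_demographics"

print(next_question("Q1_married", "yes"))  # Q2_spouse_age
print(next_question("Q1_married", "no"))   # Q9_demographics
```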
Disadvantages
Costs money
Cost of software and computers
CAPI (Computer-Assisted Personal Interviewing)
Face-to-face interviews where the interviewer enters responses into a computer.
Advantages
Respondents can fill in the data themselves
People are much more likely to tell the truth in these surveys
Disadvantages
expensive
Example: A census worker uses a tablet to record household information during door-to-door interviews
Exit Polls
Survey given to someone after they vote
Advantages
Done around elections; used to project who is likely to win
Gives a breakdown of the voter demographics
Large data sets
Disadvantages
sample bias (not all voters participate)
Web Surveys
Surveys conducted online through websites, emails, or social media.
Advantages
fast data collection
low cost
easy to analyze
Disadvantages
exclude those who don’t use the internet
potential for low usable response rates
Example: A beauty brand emails customers a link to a digital survey asking for feedback on a new product
Phraseology
The way a question is worded can influence responses.
Has a big impact on the results/answers
Can introduce bias or steer people toward the one answer you want them to choose
Question Order
The sequence of questions affects how respondents answer.
Typically sensitive and controversial questions are at the end
Helps build rapport with the person
Example: Asking about political affiliation before asking about policy opinions may influence responses.
Filter Questions
Questions used to determine if a respondent should answer a following question.
Example: "Have you ever purchased skincare products? (Yes/No)" → If "Yes," proceed to questions about brand preferences
Open-Ended
Leads to many different responses
Example: "What's the biggest problem in this country today?"
Responses have to be coded
You don't know how people will respond; allows them to give an opinion you may not have expected
Allows for more depth, but harder to analyze
Close-Ended
Respondents select from predefined options.
Not a lot of choice in the matter
Example: Likert Scale - How satisfied are you? (Very Satisfied, Satisfied, Neutral, Dissatisfied, Very Dissatisfied)
Easier to analyze but is more restrictive
Non-Response Rates
The percentage of people who do not respond to a survey
Many people no longer trust strangers enough to answer calls or do face-to-face surveys
It takes more work to get good response rates, particularly for face-to-face surveys
Race of Interviewer
People are likely to change their response based on the race of the interviewer
Effect of Reference Point
A respondent's answers may be influenced by prior information or context provided in the survey.
People are likely to change their response based on how the reference point frames the question
Example: How are public schools doing in the nation vs. How are your public schools doing
Can lead to biased responses based on prior questions or external comparisons.
Coding
The process of categorizing topics for analysis
Occupation Coding
Difficult due to broad job categories and classification issues.
Race Coding
Racial Categories
Example: The U.S. Census has changed racial categories since 1790, influenced by politics 
Challenges:
Changing racial identities, especially with genetic testing leading to multiracial claims.
Who assigns race can affect how individuals are classified at birth and death
Event Data
Recording the number of times a specified event has taken place
Example: A political scientist analyzing global protests by using news reports and government records.
Textual Data
Information derived from written or spoken text, such as news articles, interviews, or social media posts
Speeches, written documents, seating arrangements
Example: A researcher studying public opinion by analyzing tweets about a presidential debate
Content Analysis
The process of interpreting and coding textual material to identify patterns and themes.
Evaluate texts like documents or speeches
Example: Analyzing political speeches to count how often certain words, like "freedom" or "justice," are used.
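The word-counting side of content analysis can be sketched in Python; the speech text here is made up for illustration:

```python
# Minimal content-analysis sketch: count how often target words like
# "freedom" and "justice" appear in a speech. The speech is invented.
from collections import Counter
import re

speech = "Freedom is not free. Justice demands freedom for all."
words = re.findall(r"[a-z]+", speech.lower())  # lowercase and tokenize
counts = Counter(words)

for target in ("freedom", "justice"):
    print(target, counts[target])
# freedom 2
# justice 1
```

Real studies would code many documents the same way, then compare counts across speakers or over time.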
Experimental Data
Data collected from controlled experiments where variables are manipulated.
Experimental Group
Exposed to the effect being tested
Control Group
Not exposed to the effect being tested
Placebo Effect
Psychological effect where a person experiences changes simply because they believe they are receiving treatment
Example: A patient given a sugar pill (placebo) reports pain relief, even though the pill has no active ingredients.
studies should be blind
Paired Testing
A method where two individuals with similar characteristics, except for one key difference, go through the same process
Used to test for discrimination or bias
Example: Two testers try to buy a house; the only difference between them is race
Non-Obtrusive Measures
Ways of collecting data without directly interacting with subjects, reducing bias
Example: Analyzing foot traffic in a store using security camera footage instead of customer surveys
Focus Groups
A small-group discussion led by a researcher.
People are asked questions in a group, which allows for in-depth discussion and lets the researcher see how subjects interact.
Example: First used in a political campaign with the Willie Horton ad
Multi-mode Research (Triangulation)
Using more than one research strategy/approach
Example: A researcher studying climate change using satellite images, historical temperature data, and interviews with scientists
Split-Ballots
A method where different versions of a survey question are given to different groups to test wording effects.
Why are they used:
Identify Bias: See if wording influences responses (e.g., "welfare" vs. "government assistance").
Improve Accuracy: Select the best phrasing to reduce misinterpretation.
Compare Responses: Test different question structures to find the most reliable way to measure public opinion.
Pre-Tests
A trial run of a survey
Why are they used:
Conducted before the actual study to identify and fix issues.
Example: Researchers distribute a draft questionnaire to a small group to check for unclear wording or confusion before launching a nationwide survey
Sampling
A smaller group selected from the population to represent it in a study.
Population
The entire group that a study aims to analyze.
🔹 Example: All registered voters in the U.S. during an election poll
Random Sample
A sample where every individual in the population has an equal chance of being selected.
🔹 Example: Drawing names from a hat to select participants for a research study.
Stratified Random Sample
Population is divided into subgroups (strata) and a random sample is taken from each
Example: A researcher surveys 100 students, ensuring equal representation of freshmen, sophomores, juniors, and seniors
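The 100-student example can be sketched as follows (the roster is invented; 25 students are drawn per class year to total 100):

```python
# Stratified random sample sketch: divide students by class year (the
# strata), then draw 25 at random from each stratum for 100 total.
# The roster names are invented for illustration.
import random

random.seed(0)  # reproducible draw
roster = {
    "freshman":  [f"F{i}" for i in range(50)],
    "sophomore": [f"S{i}" for i in range(50)],
    "junior":    [f"J{i}" for i in range(50)],
    "senior":    [f"N{i}" for i in range(50)],
}

sample = {year: random.sample(students, 25)
          for year, students in roster.items()}
print(sum(len(picks) for picks in sample.values()))  # 100
```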
Cluster Sample
The population is divided into groups (clusters), and entire clusters are randomly selected.
Used with face-to-face surveys
Used to save money
Example: Conducting 1,500 surveys is expensive
Instead, go to 300 random locations (clusters)
and select 5 households in each cluster
300 x 5 = 1,500
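That arithmetic can be sketched directly (the location and household names are invented for illustration):

```python
# Cluster-sample sketch matching the note's arithmetic: randomly pick
# 300 locations (clusters), then 5 households within each, giving
# 300 * 5 = 1,500 interviews total. Names are invented.
import random

random.seed(1)
all_clusters = [f"location_{i}" for i in range(2000)]
chosen = random.sample(all_clusters, 300)  # 300 random clusters

interviews = []
for cluster in chosen:
    households = [f"{cluster}_hh_{j}" for j in range(40)]
    interviews.extend(random.sample(households, 5))  # 5 households each

print(len(interviews))  # 1500
```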
Snowball Sampling
Participants recruit other participants, often used for hard-to-reach populations.
Disproportionate Sampling
Some groups in the population are intentionally oversampled/undersampled to ensure enough representation
Example: A survey on racial attitudes might oversample Black respondents to ensure enough data for analysis.
Quota Sample
A non-probability sample where interviewers fill set quotas for certain groups (e.g., age, gender) rather than selecting at random
Convenience Sample
A sample selected based on ease of access rather than random selection.
Example: A professor surveys students in their own class instead of the entire university.
may not represent the larger population, leading to bias
Non-probability Sample
A sampling method where not everyone in the population has an equal chance of being selected
Example: Quota & Convenience Sample
Weighting Data
Adjusting survey results to make the sample more representative of the actual population.
Example: If a poll has too few young voters, researchers give their responses more weight to match real-world demographics.
🔹 Why It Matters: Helps correct imbalances in survey samples and improve accuracy.
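As a sketch with invented numbers: if young voters are 30% of the population but only 15% of the sample, each young respondent is weighted by population share divided by sample share:

```python
# Weighting sketch: weight = population_share / sample_share per group.
# Groups underrepresented in the sample get weights above 1; the shares
# below are invented for illustration.

population_share = {"young": 0.30, "older": 0.70}
sample_share     = {"young": 0.15, "older": 0.85}

weights = {g: population_share[g] / sample_share[g] for g in population_share}
print(weights)  # young counted double (2.0), older weighted down (~0.82)
```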
Sampling Error
Error that occurs even with a pure random sample
Error due to the luck of the draw
Sample may be different from the population
Example: A poll predicts a candidate has 52% support, but the true percentage is 50% due to a ±2% margin of error
Non-Sampling Error
Errors caused by survey design, question wording, or respondent behavior, not by random sampling.
Bias in the way the sample was drawn
Sample may be systematically different from the population
Example: A survey asking about income may have biased responses because people underreport their earnings.
Margin of Error
A statistical estimate of how much survey results might differ from the true population value.
🔹 Issue: A bogus concept that can give too much certainty to an uncertain process.
🔹 Example: A poll shows a candidate at 52% with a ±3% margin of error, but real-world inaccuracies (e.g., biased sampling, question wording) can make the results less reliable than they seem.
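For reference, the standard margin-of-error formula for a proportion at 95% confidence is MOE = z * sqrt(p * (1 - p) / n) with z ≈ 1.96; note it only captures sampling error, not the non-sampling problems described above (the numbers here are illustrative):

```python
# Margin of error for a sample proportion: z * sqrt(p * (1 - p) / n).
# Captures only random sampling error, not bias or wording effects.
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of ~1,000 people at p = 0.5 gives roughly a ±3-point margin.
print(round(margin_of_error(0.5, 1000), 3))  # 0.031
```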