Quantitative research
Descriptive and Causal Research
Descriptive- Surveys
Causal- Experiments
Quantitative Primary Research
Original research that gathers numerical data from a representative sample to describe a target market
Why choose a survey?
Projectable results
Descriptive results
4 functions of a survey
1. Use specific questions to address the research objective
2. Standardize questions and response categories
3. Motivate respondents
4. Reduce response error
When doing a survey, what must you ask yourself?
- Am I being clear/motivational in the instructions?
- Am I asking the right questions to address the objective?
- Is it easy to read and answer?
- Is it as brief as possible?
Best practices for a survey
- No more than 5-7 minutes
- Mostly close-ended Qs
- Cluster related Content
- Optimize ability to analyze and report
2 Common universal survey components
1. Overall measurement = gauging sentiment
2. Demographic questions = describe sample representation
3 Basic Measurement concepts
1. Attributes
2. Definitions
3. Measurement
3 Question Response types?
1. Open ended
2. Categorical
3. Continuous
Categorical questions
Nominal- no natural order
Ordinal- can place in order
Continuous (metric)
Ratio- a number attached to an attribute (has a true zero)
Interval- scales with equal intervals but no true zero
NPS
Net Promoter Score; measures customer loyalty. (0-6 Detractors, 7-8 Passives, 9-10 Promoters)
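A minimal sketch of the NPS arithmetic the card describes, using a made-up list of 0-10 scores (the data and function name are illustrative, not from the cards):

```python
def nps(scores):
    """Net Promoter Score = % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical responses: 4 promoters, 3 passives, 3 detractors
print(nps([10, 9, 9, 10, 8, 7, 8, 5, 6, 3]))  # 10.0
```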
Balanced scales
Do you have the same number of positive and negative options?
*remember: neutral (the midpoint) is not zero
Likert Scales
Measure intensity- a range of options from one extreme to the other
Best practices for demographic questions
- Don't necessarily help the RO (research objective)
- Necessary to include to describe the sample representation
- Respondent profiles (sample compared to target population to assess representativeness)
Common survey questions
1. Satisfaction
2. New Product Development
3. Brand Perception
4. Advertising concept
Satisfaction Questions
To gauge feedback on current products and services
- understand levels of engagement
Comparing Customer Satisfaction Results
- Trending monitors data over time
- Benchmarking compares data to peer organizations
New Product Development Questions
To gain feedback from a sample of target market about new product direction
Brand Perception Questions
Gather unbiased perceptions; almost always blind
Advertising Concept
To test advertising concepts prior to full development and production
Distribution Method 6 Factors to Consider
1. Topic Sensitivity
2. Time Requirements- when needed
3. How invested respondents are
4. Survey Length
5. Survey Complexity
6. Cost
Survey invitation components
1. Research Credentials
2. Purpose of the survey
3. Reason the respondent is being asked to participate
4. How responses will be used and if confidential
5. Survey Due date
6. Incentive
Survey Software
Qualtrics, Survey Monkey, and Google Forms
Why do satisfaction surveys have much higher response rates?
People can say what they think, good or bad
Softening the list
Alerting potential respondents ahead of the invitation to participate
How to increase response rates
- Soften the list
- Create a compelling invitation
- Send reminders
- Deploy soon after the event
- Offer incentives
Types of errors in surveys
sampling and non-sampling errors
Non-sampling errors (response)
Response- when someone answers a question in a way that misrepresents the truth
Ex: 1. Deliberate falsification
2. Lazy respondents
3. Extreme bias
Non-sampling errors (Non-response)
- Failure to take part or answer specific questions on the survey
Ex: Non-contacts, refusal to participate, refusal to answer specific questions, break-off
Sampling calculations apply only to the _1._______, not the _2.________
1. Final obtained sample, 2. initial sample
Sources for sample names
-Company lists
-List brokers
-Research Panels
margin of error
the range of percentage points in which the sample accurately reflects the population
"How precise do I want to be?"
confidence level
the estimated probability that a population parameter lies within a given confidence interval
- Reliability
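A minimal sketch of how the margin of error and confidence level cards above fit together, assuming a simple random sample, the worst-case proportion p = 0.5, and the standard z-values for each confidence level (the sample size is made up):

```python
import math

# Standard z-values for common confidence levels
Z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}

def margin_of_error(n, p=0.5, confidence=0.95):
    """Margin of error for a sample proportion: z * sqrt(p * (1 - p) / n)."""
    return Z[confidence] * math.sqrt(p * (1 - p) / n)

# e.g., 400 completed surveys at 95% confidence -> roughly +/- 4.9 points
print(round(100 * margin_of_error(400), 1))  # 4.9
```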
Data Cleaning
Process of correcting or removing inaccurate data.
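A minimal sketch of the kind of cleaning step this card refers to; treating incomplete responses and straight-liners (the "lazy respondents" from the response-error card) as data to drop is an assumption, and the rows are made up:

```python
responses = [
    {"id": 1, "q1": 4, "q2": 5, "q3": 4},
    {"id": 2, "q1": 3, "q2": None, "q3": 2},  # skipped a question -> drop
    {"id": 3, "q1": 5, "q2": 5, "q3": 5},     # same answer everywhere -> drop
]

def clean(rows):
    kept = []
    for r in rows:
        answers = [r["q1"], r["q2"], r["q3"]]
        if None in answers:            # item non-response
            continue
        if len(set(answers)) == 1:     # straight-lining / lazy respondent
            continue
        kept.append(r)
    return kept

print(clean(responses))  # only respondent 1 survives
```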
Data Analysis
The process of compiling, analyzing, and interpreting the results of primary and secondary data collection.
- Making the raw data meaningful
Data Analysis- managerial skills
- Know how to pick good consultants/tools
- Healthy skepticism
- Ability to spot data patterns
Common types of Data Analysis
- Summarizing single variables
- Differences and relationships between variables
Summarizing Single Variables
- Looking at the variables tested and summarizing frequency, average, and % response
Ex: What % of shoppers used coupons?
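A minimal sketch of summarizing a single variable (frequency, % response, average), using made-up answers to the coupon question in the example above:

```python
from collections import Counter

# Hypothetical categorical variable: did the shopper use a coupon?
used_coupon = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes"]
counts = Counter(used_coupon)                      # frequency
pct_yes = 100 * counts["yes"] / len(used_coupon)   # % response
print(counts, f"{pct_yes:.1f}% used coupons")

# Hypothetical continuous variable: satisfaction on a 1-5 scale
satisfaction = [4, 5, 3, 4, 4, 5, 2, 4]
print(sum(satisfaction) / len(satisfaction))       # average (mean score)
```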
Differences and relationships between variables
Looking at differences and focusing on those that are significant
Responsibilities as a manager in data analysis
1. Identify most significant learnings and differences
2. Support differences with statistical analysis
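A minimal sketch of "support differences with statistical analysis", assuming two made-up groups of satisfaction scores and a two-sample t-test; scipy and the 0.05 threshold are assumptions, not something the cards specify:

```python
from scipy import stats

# Hypothetical 1-5 satisfaction scores for two customer segments
coupon_users = [4, 5, 4, 4, 5, 3, 4, 5]
non_users    = [3, 3, 4, 2, 3, 4, 3, 3]

t_stat, p_value = stats.ttest_ind(coupon_users, non_users)
# A small p-value (commonly < 0.05) supports calling the difference significant
print(round(t_stat, 2), round(p_value, 4))
```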
Written Reports
Title Page
Table of contents
Exec summary
Research design
Key learning and findings
Recommendations
Executive summary
Overview of research results and actionable recommendations
- Highlight vital information
Written report (Research design)
Business problem
key questions
Objectives
How research was conducted
Key learnings
Detailed portion highlighting critical observations and learnings
Data Visualization
Percentages- line charts for data over time
Mean scores
Open-ended comments- word clouds, verbatims
Market Research Process
Plan
1. State business problem
2. State research objective
3. List key questions
4. Design the study
Execute
5. Collect Data
6. Analyze Results
7. Report and Recommend
Types of Research
exploratory, descriptive, causal
Qualitative Research
Exploratory
Tools
IDI (in-depth interview)
Ethnography
Stakeholder advisory board
Focus groups