Intro
Outbound marketing includes search ads, display ads, and video ads. Up to this point, the reading has explained that marketers usually measure these ads with simple metrics such as impressions, click-through rate (CTR), and conversions.
But Section 2.1.4 tells us that real measurement is much more complicated. Companies are now trying to answer deeper questions, and this section outlines five major challenges that make measuring outbound marketing effectiveness hard.
The five challenges are:
Correlation vs. Causation
Attribution
Dynamics (Delayed Effects of Advertising)
Online–Offline Interaction
Customer Lifetime Value (CLV)
Let’s break each of these down.
1⃣ Correlation vs. Causation
What’s the problem?
Just because two things happen together (e.g., someone saw an ad and made a purchase) does not mean the ad caused the purchase.
Why this matters:
Most simple ad metrics—clicks, impressions, sales right after exposure—can mistakenly make ads look more effective than they really are.
Real example from the reading:
A major eBay field experiment found that:
Many consumers who clicked on search ads would have bought anyway, either via organic search or by going directly to the site.
As a result, the return on paid search ads was far lower than traditional metrics suggested.
Google responded by saying their own studies show many search ad clicks are incremental—but the key point is:
👉 You must run experiments to know the truth.
Easy analogy:
If it rains the day you carry an umbrella, that doesn’t mean your umbrella caused the rain.
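The reading's prescription here is a randomized experiment (a holdout test): show ads to one randomly chosen group, withhold them from another, and compare outcomes. Below is a minimal sketch of that lift calculation; the group sizes and conversion rates are made-up illustrative numbers, not figures from the eBay study.

```python
# Minimal sketch: estimating incremental conversions from a randomized holdout test.
# All numbers are illustrative, not taken from the reading.

exposed_users = 100_000      # randomly assigned to see the ads
holdout_users = 100_000      # randomly assigned to see no ads

exposed_conversions = 3_000  # purchases in the exposed group
holdout_conversions = 2_700  # purchases in the control (holdout) group

exposed_rate = exposed_conversions / exposed_users   # 3.0%
holdout_rate = holdout_conversions / holdout_users   # 2.7%

# Incremental (causal) conversions: purchases that would NOT have happened without the ads.
incremental_rate = exposed_rate - holdout_rate
incremental_conversions = incremental_rate * exposed_users

# Share of observed conversions that the ads actually caused.
incrementality = incremental_conversions / exposed_conversions

print(f"Naive view: ads 'drove' {exposed_conversions} conversions")
print(f"Experiment: only {incremental_conversions:.0f} were incremental "
      f"({incrementality:.0%} of the total)")
```

With these made-up numbers, a naive metric credits the ads with 3,000 sales, but the experiment shows only 10% of them were truly incremental, which is exactly the kind of gap the eBay study exposed.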
2⃣ Attribution
What is attribution?
It’s assigning credit for a sale across all the ads and touchpoints a consumer experienced.
The problem:
People rarely buy after seeing just one ad. Instead, their journey may include:
Seeing a Facebook display ad
Seeing a YouTube video
Googling the brand later and clicking a search ad
Finally buying
But last-click attribution (most common method) gives 100% of the credit to the final click—usually the search ad.
This is misleading because:
Other channels (e.g., display, video, TV) helped create awareness or interest.
Search ads tend to show up when people are already close to buying, so they get too much credit.
More advanced models exist, including:
First click
Linear
Time decay
Position-based
Regression / model-based
Experiment-based (most scientifically accurate)
But many of these models still rely on arbitrary rules rather than evidence of causal impact; the sketch below shows how much the chosen rule changes who gets the credit.
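To make that concrete, here is a minimal sketch that splits credit for one purchase across the journey described above under last-click, linear, and time-decay rules. The touchpoint names and the decay rate are illustrative assumptions, not prescriptions from the reading.

```python
# Minimal sketch: how different attribution rules split credit for one sale.
# The touchpoint journey and the decay parameter are illustrative assumptions.

touchpoints = ["facebook_display", "youtube_video", "branded_search_click"]

def last_click(path):
    # 100% of the credit to the final touchpoint (the common default).
    return {tp: (1.0 if i == len(path) - 1 else 0.0) for i, tp in enumerate(path)}

def linear(path):
    # Equal credit to every touchpoint.
    return {tp: 1.0 / len(path) for tp in path}

def time_decay(path, decay=0.5):
    # More credit to touchpoints closer to the purchase; each step further
    # back in time gets its weight multiplied by 'decay'.
    weights = [decay ** (len(path) - 1 - i) for i in range(len(path))]
    total = sum(weights)
    return {tp: w / total for tp, w in zip(path, weights)}

for name, model in [("last click", last_click), ("linear", linear), ("time decay", time_decay)]:
    credit = model(touchpoints)
    shares = ", ".join(f"{tp}: {share:.0%}" for tp, share in credit.items())
    print(f"{name:>10}: {shares}")
```

Same purchase, three very different answers: last click gives the search ad everything, linear gives each channel a third, and time decay lands somewhere in between. None of the rules is derived from data, which is why experiment-based attribution is considered the most scientifically accurate.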
3⃣ Dynamics: The Delayed Impact of Advertising
The issue:
Not all purchases happen right after someone sees an ad—especially for:
Cars
Financial products
Expensive electronics
Long-term services
These decisions take weeks or months, involving research and multiple touchpoints.
Google calls this the “Zero Moment of Truth (ZMOT)”—the long evaluation period before purchase.
The reading shows:
A heat map of car searches demonstrates months of online searching before people finally buy.
A bank’s advertising study found that ignoring long-term effects led to underestimating ROI by up to 40%.
Takeaway:
If you only measure short-term clicks or conversions, you will misjudge the real value of your ads.
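One way to see the size of this mistake is to compare the ROI you would report from a short measurement window against the ROI over the full, delayed decision cycle. The sketch below uses made-up spend, margin, and monthly-sales figures; it is not the bank study's model, just an illustration of the same underestimation effect.

```python
# Minimal sketch: why a short measurement window understates ROI when purchase
# cycles are long. Spend, margin, and monthly sales are illustrative assumptions,
# not figures from the bank study in the reading.

ad_spend = 100_000.0           # one-time campaign cost
margin_per_sale = 400.0        # profit per incremental sale

# Incremental sales caused by the campaign, by month after it ran.
# Most of the effect arrives with a delay (research, comparison, etc.).
incremental_sales_by_month = [150, 90, 60, 45, 35, 30, 25, 20]

def roi(sales):
    profit = sum(sales) * margin_per_sale
    return (profit - ad_spend) / ad_spend

short_window = incremental_sales_by_month[:1]   # only the first month
full_window = incremental_sales_by_month        # the whole decision cycle

print(f"ROI measured over 1 month : {roi(short_window):.0%}")
print(f"ROI measured over 8 months: {roi(full_window):.0%}")
print(f"Share of impact missed by the short window: "
      f"{1 - sum(short_window) / sum(full_window):.0%}")
```

With these illustrative numbers, the one-month view makes the campaign look like a money-loser while the full-cycle view shows a healthy positive return, the same direction of error as the bank's 40% underestimate.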
4⃣ Online–Offline Interaction
Most businesses operate in both the online and offline world:
Online ads → Offline store sales
Offline ads (TV, print) → Online searches and purchases
Customers may research online but buy in-store (or vice versa)
Challenges:
How do online and offline channels support each other?
Example: Seeing a TV ad may increase the likelihood of clicking a search ad later.
How do you measure offline results from online ads?
For example, IKEA used Facebook to run an experiment showing 11% more store visits among people exposed to the ads.
If a manager doesn’t measure these cross-channel effects, they may undervalue certain channels.
5⃣ Customer Lifetime Value (CLV)
Traditional ad metrics (CTR, conversions, installs) focus on immediate results.
But businesses should really care about:
👉 How valuable a customer is in the long run.
Example from the reading:
Many mobile game companies used to optimize for cost per install (CPI).
But only 35% of users return the day after installing, and 94% have stopped playing within a month.
Only 4% ever spend money.
So optimizing ads for “installs” makes little sense if most installs never become paying customers.
Now companies optimize for Customer Lifetime Value (CLV).
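A quick back-of-the-envelope comparison shows why a "cheap" install can still lose money once you look at lifetime value. The 4% payer rate comes from the figures above; the cost per install and the average spend per paying user are made-up assumptions.

```python
# Minimal sketch: comparing cost per install against expected lifetime value.
# The 4% payer rate is from the reading; cost and spend figures are assumed.

cost_per_install = 2.00        # what the ad network charges per install (assumed)
payer_rate = 0.04              # only 4% of installs ever spend money (from the reading)
avg_revenue_per_payer = 30.00  # average lifetime spend of a paying user (assumed)

# Expected lifetime value of one install = P(user pays) * average spend if they pay.
expected_clv_per_install = payer_rate * avg_revenue_per_payer

print(f"Cost per install           : ${cost_per_install:.2f}")
print(f"Expected CLV per install   : ${expected_clv_per_install:.2f}")
print(f"Expected profit per install: ${expected_clv_per_install - cost_per_install:.2f}")
# With these numbers the campaign looks cheap on a CPI basis but loses money on CLV.
```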
Other implications:
Customers acquired online might be more price-sensitive.
They may also churn sooner, reducing lifetime value.
This means:
👉 Channels that look cheaper (e.g., online acquisition) may not be better once you consider long-term revenue.
🎯 Summary of Section 2.1.4 in Simple Terms
Measuring outbound marketing is hard because:
Clicks don’t always mean the ad caused the sale.
Multiple ads help influence the same purchase, making it difficult to assign credit.
Some ads create value over the long term—not immediately.
Online ads influence offline behavior, and vice versa.
Not all customers acquired are equally valuable in the long run.
The big message:
To measure ads accurately, marketers must move beyond simple metrics (like CTR) and use more advanced tools like experiments, multi-touch attribution, and CLV-based assessment.