A cutting-edge data science model is only as good as the way its impact is measured. Pinterest upgraded everyone's preferred impact metric, CTR (click-through rate), after carefully analyzing its problems.

 

Every data person adores the CTR metric because:

  • It is simple to explain to leadership, product managers do not question it, and measuring it is straightforward: divide the number of clicks by the number of impressions (see the snippet after this list).
  • CTR data is readily available, so it is easy to calculate, track, and analyze.
  • And, as a bonus, lots of advertisers want high CTRs too, so advertiser and user interests appear aligned!
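For reference, a minimal sketch of that calculation in Python (the function name and zero-impression guard are my own additions):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks divided by impressions."""
    # Guard against division by zero when an item has no impressions yet.
    if impressions == 0:
        return 0.0
    return clicks / impressions

# Example: 30 clicks out of 1,000 impressions -> CTR of 3%.
print(ctr(30, 1_000))  # 0.03
```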

 

So, just throw a lot of advanced machine learning at the problem to predict and maximize CTRs and you’re done, right? Not so fast!

CTR Suffers From Some Serious Shortcomings: “CTR Is Not the Key to User Engagement”

CTR only reports whether a certain item was clicked, regardless of whether it was relevant. Some may claim that the user clicked because the content or ad was relevant, but this is not always the case; the click could be driven by positional bias, clickbait, or other factors, which are discussed in more depth below. CTR only measures short-term gains without any consideration of user engagement, so we need a metric that tracks engagement as well.

 

“You can fool all the people some of the time and some of the people all the time but you cannot fool all the people all the time” - Abraham Lincoln

Problem 1 - Positional Bias

Ads higher up on the page are more likely to be clicked, but this does not imply that the user finds them more valuable. If you fail to take this into account, you may wrongly conclude that users prefer the ads at the top of the page when, in reality, they are just more inclined to pay attention to them.

Problem 2 - CTR Promotes Clickbait

Items with a high click-through rate (CTR) may be "clickbaity": they entice the user to click without providing anything helpful, instead sending them to ad-loaded slideshows that generate revenue for the advertiser but annoy the user.

Problem 3 - CTR Ignores Other Signals

A click does not imply a positive interaction; the user may still dislike that specific piece of content. "Hide ad" actions and drop-offs are strong negative signals that CTR does not take into account.

Google Ads, for example, offers controls for users to give explicit dislike feedback on ads. There is no point in showing an ad with a high CTR but a strong dislike signal; it will eventually push the user off the platform.

Problem 4 - Some Things Are Not Meant to Be Clicked

Some ads are not meant to be clicked at all: video, brand-awareness, or pitch ads, such as an Independence Day discount at a nearby store or a short video of a new car launch. For these, CTR undervalues the ad even when it does exactly what it was meant to do.

 

Okay! We have now listed the problems with CTR, but what is the solution?

Going Beyond CTR

Pinterest adjusted its ads measurement metric to address the issues with CTR described above: a metric that incorporates engagement and more accurately captures the user experience.

 

User Metric

Let's discuss the Numerator first - Weighted Engagement on Ads

 

Instead of just looking at CTR, we look at a weighted average of different actions — Clicks, Hides (Stop ads), Watch Time (on Video Ads), and Saves, among others — on the ads.
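As a rough sketch of how such a weighted score could be computed (the action names and weight values below are illustrative assumptions, not Pinterest's actual configuration):

```python
# Illustrative weights: positive actions add to the score, hides subtract.
# These action names and values are assumptions for the sketch.
ACTION_WEIGHTS = {
    "click": 1.0,
    "save": 2.0,
    "video_watch_30s": 1.5,  # e.g. count of 30-second watch events
    "hide": -5.0,            # strong negative signal
}

def weighted_engagement(action_counts: dict[str, int]) -> float:
    """Sum of per-action counts multiplied by their weights."""
    return sum(
        ACTION_WEIGHTS.get(action, 0.0) * count
        for action, count in action_counts.items()
    )
```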

 

Consider two ads: B is clicked more often than A, but it is hidden by a lot more people.

Conclusion - Once hides are weighted in, ad A is clearly superior to ad B: it earns higher overall weighted engagement, and fewer users hide it or disengage from it, even though CTR alone shows the opposite picture.
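A small worked example with made-up counts shows the inversion; the numbers and weights are purely hypothetical:

```python
# Hypothetical counts per 1,000 impressions for two ads.
ad_a = {"click": 20, "hide": 1}
ad_b = {"click": 35, "hide": 20}

# CTR alone favours B.
ctr_a, ctr_b = 20 / 1_000, 35 / 1_000          # 0.020 vs 0.035

# A simple weighted score (click weight 1, hide weight -5) favours A.
score_a = 1.0 * 20 + (-5.0) * 1                 # 15.0
score_b = 1.0 * 35 + (-5.0) * 20                # -65.0

print(ctr_a < ctr_b)      # True: B "wins" on CTR
print(score_a > score_b)  # True: A wins once hides are weighted in
```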

 

If we choose the actions and weights carefully, this numerator addresses Problems 2, 3, and 4 above.

 

Coming to the Denominator - Weighted Engagement on Neighbouring Organic Content

 

To address Problem 1 (positional bias), we need to dig a bit deeper and recognize that CTR drops for both ads and organic content as users scroll through the feed.

 

LEARNING (this holds true in the majority of businesses) - If an ad or a piece of feed content in the 10th position of the carousel gets a CTR similar to one in the 1st position, the item in the 10th position is actually better. This positional-bias hypothesis can also be verified with A/B testing on the platform.

 

To normalize the effect of positional bias on ad content, we compare the average engagement rate on ads to that of the organic content in the spots before and after them. This is where the denominator comes in.

Normalising positional bias with organic-content neighbours
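Putting the numerator and denominator together, here is a sketch of what the final ratio could look like; the helper names, weights, and two-neighbour window are my own assumptions, not Pinterest's implementation:

```python
# Illustrative action weights (assumed, not Pinterest's actual values).
ACTION_WEIGHTS = {"click": 1.0, "save": 2.0, "hide": -5.0}

def weighted_engagement(action_counts: dict[str, int]) -> float:
    """Weighted sum of engagement actions for a single item (ad or organic pin)."""
    return sum(ACTION_WEIGHTS.get(a, 0.0) * n for a, n in action_counts.items())

def ad_metric(ad_counts: dict[str, int],
              neighbour_organic_counts: list[dict[str, int]]) -> float:
    """Weighted engagement on the ad, normalised by the average weighted
    engagement of the organic content in the slots just before and after it."""
    baseline = sum(weighted_engagement(c) for c in neighbour_organic_counts)
    baseline /= max(len(neighbour_organic_counts), 1)
    # Avoid dividing by zero (or a negative baseline) in this simple sketch.
    if baseline <= 0:
        return float("nan")
    return weighted_engagement(ad_counts) / baseline

# Example: an ad in slot 5, normalised by the organic pins in slots 4 and 6.
ad = {"click": 20, "save": 3, "hide": 2}
neighbours = [{"click": 30, "save": 5}, {"click": 18, "save": 2}]
print(ad_metric(ad, neighbours))
```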

 

Experimenting with the Above Setup and the New Ads Metric

Comparing CTR vs. the New Ad Metric

From the results table, we can infer that Treatment has a lower CTR than Control but performs much better on the weighted engagement metric.
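Since the results table is not reproduced here, the snippet below illustrates the described pattern with made-up numbers (these are not Pinterest's actual results):

```python
# Made-up aggregate counts per 100,000 ad impressions; illustrative only.
control   = {"impressions": 100_000, "click": 3_000, "save": 200, "hide": 900}
treatment = {"impressions": 100_000, "click": 2_600, "save": 550, "hide": 150}

WEIGHTS = {"click": 1.0, "save": 2.0, "hide": -5.0}  # assumed weights

def ctr(group):
    return group["click"] / group["impressions"]

def weighted_engagement(group):
    return sum(WEIGHTS[action] * group[action] for action in WEIGHTS)

print(f"CTR: control={ctr(control):.3f}, treatment={ctr(treatment):.3f}")
print(f"Weighted: control={weighted_engagement(control):,.0f}, "
      f"treatment={weighted_engagement(treatment):,.0f}")
# Treatment has the lower CTR but the higher weighted engagement score.
```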

Conclusion

I understand that digesting and practically applying this new metric can be difficult, because CTR has been used to gauge ad and content relevance for a long time, but change brings growth.


 

NOTE: If your ads monetization model depends purely on CTR, you can place a higher weight on clicks when computing weighted engagement scores, but dropping the engagement signals entirely and relying solely on CTR will make users unhappy in the long run.
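As a small illustration of that note, up-weighting clicks is just a change to the (assumed) weight table:

```python
# If revenue depends heavily on clicks, raise the click weight, but keep the
# negative hide weight so user dissatisfaction still counts against an ad.
ACTION_WEIGHTS = {"click": 3.0, "save": 2.0, "hide": -5.0}  # assumed values
```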