When was the last time you clicked on an ad? Can’t remember?

That’s probably because most people never click on ads. Studies have shown that only 16% of people ever click, and half of those (8%) account for 85% of all clicks on display ads.

But if only a tiny minority of people are clicking, why do some marketers only look at post-click conversions to evaluate their campaigns? Doesn’t this leave out a huge swath of people who saw an ad, were influenced by it, but didn’t click? Plenty of us can remember a time when we saw an ad, were reminded of a product we were previously shopping for, and opened a new tab to search for the brand.

Google, Facebook, and Yahoo have all released studies showing that view-through conversions are a vital piece of information in evaluating and optimizing campaigns. They’ve also used rigorous A/B tests to show that there is significant and provable lift associated with a user viewing an ad, even if that user did not click.

Determining how to value view-through conversions, however, isn’t necessarily simple. Vendors that sell clicks are motivated to tell you not to count them at all. But when done intelligently, a blended ad attribution model that includes both post-click and post-view conversions paints a more accurate picture of your campaign performance. Even better, it puts the right incentives in place for your vendor to drive more incremental conversions, rather than just chasing clicks.

Set an intelligent view-through window and/or percentage allocation.

To count view-through conversions properly, some advertisers shorten the attribution lookback window. The lookback window determines how much time can pass between an impression and a conversion in order for the conversion to be attributed to the campaign. For example, if the lookback window is set to three days and the conversion happens on day four, then the conversion doesn’t count.
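The lookback-window rule above can be sketched in a few lines of Python. This is an illustrative sketch, not any platform's actual attribution code; the names `LOOKBACK_WINDOW` and `is_attributable` are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical example: attribute a conversion to a campaign only if it
# occurred within the lookback window after the impression was served.
LOOKBACK_WINDOW = timedelta(days=3)

def is_attributable(impression_time: datetime, conversion_time: datetime) -> bool:
    """True if the conversion falls within the lookback window."""
    elapsed = conversion_time - impression_time
    return timedelta(0) <= elapsed <= LOOKBACK_WINDOW

# As in the example above: with a 3-day window, a conversion on day 2
# counts, but a conversion on day 4 does not.
impression = datetime(2024, 1, 1)
print(is_attributable(impression, impression + timedelta(days=2)))  # True
print(is_attributable(impression, impression + timedelta(days=4)))  # False
```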

Another approach is to keep a longer window, but to use a discounted percentage when valuing view-through conversions. For example, an advertiser might count view-through conversions for 30 days, but only count each view-through conversion at a 15% rate, because that is the ratio of view-through conversions that they’ve learned through testing truly had an impact on the campaign.
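The discounted-percentage approach amounts to simple weighted arithmetic: count post-click conversions at full value and post-view conversions at the calibrated rate. A minimal sketch, where the 15% weight and the function name `blended_conversions` are illustrative (your own testing should determine the actual weight):

```python
# Hypothetical blended count: clicks at 100%, views discounted to the
# rate that testing suggests reflects real incremental impact.
VIEW_THROUGH_WEIGHT = 0.15  # illustrative; calibrate through your own tests

def blended_conversions(click_through: int, view_through: int,
                        vt_weight: float = VIEW_THROUGH_WEIGHT) -> float:
    """Return a blended conversion total for campaign reporting."""
    return click_through + vt_weight * view_through

# E.g. 40 post-click and 200 post-view conversions in a 30-day window:
print(blended_conversions(40, 200))  # 70.0
```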


In our own testing, AdRoll has seen that conversion behavior varies depending on what kind of business is running the campaign (e.g. retail versus B2B) and what action the campaign is trying to drive (e.g. a purchase versus a white paper download).

It’s clear that view-through conversions should be counted differently depending on the behavior they are meant to measure, and that buying cycles differ across industries. Our data shows that retail tends to convert more quickly than B2B, which suggests that, overall, view-through windows should be longer for B2B campaigns than for retail campaigns. And indeed, we see that practice broadly validated in the average windows used by our 20,000 advertisers.

But each business is different, and it’s important to arrive at an attribution model that makes sense for your particular case. Calibrating the exact weight of view-through conversions can be tricky, but the investment pays off: an intelligent blended attribution model lets you accurately measure your vendor’s performance and ensure your campaigns are driving toward the right goal.


To learn more about solving attribution to boost ROI for your organization, sign up for our free webinar with Forrester: Marketing Metrics that Actually Matter!