- Intro
- Advertising Assisted Conversion Types
  - Facebook: One Day View
  - AdWords: View-through Conversion
- The A/B Test
- A/B test setup
- Results
- What we learned
Intro
We employ various methods to cross-check the health of our tools and data: the test traffic feature in Google Analytics, the sandbox environment in Adjust (for app tracking), and A/A tests in Optimizely.
However, how do we handle advertising platforms?
Advertising platforms operate on a pay-to-play basis. We validate conversions and clicks from all ads within Google Analytics as a single source of truth, but what about other metrics that are specific to ad platforms?
In 2016, we set out to understand how creatives influence the assisted conversion types of ad platforms. Some might argue that a test not focused on conversions is pointless, but assisted conversions are part of the conversion flow, even if marketing attribution models and conversion windows are not particularly "glamorous."
Advertising Assisted Conversion Types
Facebook: One Day View
This is the lookback period after people view your ad, during which you attribute action-based results (such as clicks and conversions) to that ad.
AdWords: View-through Conversion
This is the period of time after an ad impression during which a view-through conversion can be recorded. View-through conversions occur when a user doesn't interact with the ad but converts later.
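Mechanically, both definitions reduce to the same check: did the conversion land inside the lookback window that follows an un-clicked impression? A minimal sketch of that logic in Python, with hypothetical timestamps and a configurable window (the actual window length depends on each platform's settings):

```python
from datetime import datetime, timedelta

def is_view_through_conversion(impression_time, conversion_time,
                               clicked=False, window_days=1):
    # A click would make this a click-through conversion instead.
    if clicked:
        return False
    # Attribute the conversion only if it falls inside the lookback window.
    elapsed = conversion_time - impression_time
    return timedelta(0) <= elapsed <= timedelta(days=window_days)

# Hypothetical example: the user saw the ad at noon and converted that
# evening, so a one-day view window (as on Facebook) attributes it.
seen = datetime(2016, 5, 10, 12, 0)
converted = datetime(2016, 5, 10, 21, 30)
print(is_view_through_conversion(seen, converted, window_days=1))  # True
```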
The A/B Test
We decided to carry out an unconventional test.
The A/B test was not designed to identify a winning variation but to establish a benchmark for advertising assisted conversions. How do you set a benchmark for such a test? Ideally, you would compare a variation with a creative against a variation without one, which isn't feasible, so we approximated the "no creative" condition.
We used two variations of "empty" banners that contain no information about the advertised company, efood, and look like generic placeholders, plus two variations of "branded" banners, one featuring food and one displaying an animal.
The test took place on the Google Display Network (GDN). We chose GDN because "empty" banners would be irrelevant in the context of Facebook ads, as they include the advertiser's logo and name, making them appear more like errors than tests.
Here is how we set up the GDN test:
A/B test setup
| Setting | Description |
| --- | --- |
| Target | View-through conversions |
| Hypothesis | If the group of empty banners generates fewer view-through conversions than the branded banners, then branded banners indeed influence users more. The difference between empty and branded banners would be the benchmark we seek. |
| Audience | We used two types of audiences, one low-performing (new cookies) and one high-performing (visitors – no conversions), to compare performance between groups. We avoided existing customers because their performance could skew our results. |
| Rotation | To avoid the auto-optimization most networks apply, we set the campaign to "even" rotation so all banners would be served equally. However, due to the performance difference, there was a slight bias. |
| Budget | As it was a test, we didn't allocate a large budget (a 3-digit number), but it was still significant enough to generate traction. We always reserve a small budget for tests. |
| Time | We selected a regular time period without heavy seasonality that could negatively impact overall performance. |
It's important to note that our intent was not to construct a tightly controlled A/B test in a laboratory setting. Instead, we aimed to design something meaningful for us, capable of generating sufficient traction to produce data worthy of analysis.
After running the campaign for a week, the results were as follows:
Results
| Banner | Audience | Clicks | CTR | View-through conversions |
| --- | --- | --- | --- | --- |
| Empty banner (1) | Low performance | 869 | 0.31% | 117 |
| Empty banner (2) | Low performance | 676 | 0.38% | 86 |
| Empty banner (1) | High performance | 240 | 0.60% | 174 |
| Empty banner (2) | High performance | 206 | 0.66% | 161 |
| Branded banner (animal) | Low performance | 357 | 0.18% | 116 |
| Branded banner (food) | Low performance | 160 | 0.22% | 27 |
| Branded banner (animal) | High performance | 114 | 0.43% | 179 |
| Branded banner (food) | High performance | 73 | 0.46% | 48 |
To sum up:
| Banner group | Audience | Clicks | CTR | View-through conversions |
| --- | --- | --- | --- | --- |
| Empty banners | Low/High performance | 1991 | 0.49% | 538 |
| Branded banners | Low/High performance | 704 | 0.32% | 370 |
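As a sanity check, the summary figures can be reproduced from the per-banner rows, and the gap between the two groups is exactly the benchmark our hypothesis defined. A quick sketch using the view-through numbers from the results table:

```python
# View-through conversions per row of the results table above.
empty = [117, 86, 174, 161]    # empty banners, low + high audiences
branded = [116, 27, 179, 48]   # branded banners, low + high audiences

empty_total, branded_total = sum(empty), sum(branded)
print(empty_total, branded_total)   # 538 370

# The benchmark from our hypothesis: the difference between the groups.
print(empty_total - branded_total)  # 168
```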
The tables above reveal a couple of insights:
What we learned
π‘ CTR & Clicks
CTR and clicks were significantly higher for empty banners. We don't have a clear explanation for this; perhaps users found the unusual empty banners intriguing, generating more clicks (and, consequently, a higher CTR). Surprisingly, the low-performance audience generated more clicks than the high-performance audience in both groups.
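Since CTR is just clicks divided by impressions, the impression volume behind each row can be backed out from the table. A rough sketch for two of the low-performance rows (the reported CTRs are rounded, so these figures are approximate, and the row labels are ours, not GDN's):

```python
# (clicks, reported CTR) per row of the results table.
rows = {
    "empty banner (1), low performance":      (869, 0.0031),
    "branded banner (animal), low perf.":     (357, 0.0018),
}
for name, (clicks, ctr) in rows.items():
    # impressions = clicks / CTR; approximate due to CTR rounding
    print(f"{name}: ~{clicks / ctr:,.0f} impressions")
```

The implied impression counts also make the "slight bias" mentioned under Rotation visible: despite the even setting, the empty banners were served noticeably more often.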
π‘ View-through Conversions
View-through conversions were also substantially higher for empty banners. One logical observation is that the high-performance audience generated more view-throughs in both groups.
π‘ Audience Performance
The low-performance audience generated more clicks, while the high-performance audience yielded better outcomes. The click volume from the low-performance audience, in both the empty and branded banner groups, is puzzling, though the better outcomes in the high-performance group require no further explanation.
π‘ Animal vs Food
The animal banner significantly outperformed the food banner. It seems a squirrel can indeed beat a pizza!
After this test, we didn't dedicate all our GDN budget to empty banners. We didn't shift the entire budget to another platform either. Instead, we adjusted our ad strategy and gained confidence to invest more in ad A/B testing.