One accurate measurement is worth more than a thousand expert opinions

– Admiral Grace Hopper


Since Ogury's inception in 2014, we haven't been short of great, innovative ideas from both our brilliant people and astute clients. The challenge comes in deciding which ideas to pursue, even after categorizing them as short-, medium-, or long-term options. We needed a way to easily test and compare different ideas and obtain reliable results. That's when we turned to online controlled experiments, also known as A/B testing.

What is A/B testing?

A/B tests are sometimes called online controlled experiments, field experiments, or split tests. They are heavily used at companies like Airbnb, Amazon, Booking.com, eBay, Facebook, Google, Microsoft, Netflix, Twitter, and Uber, which run thousands to tens of thousands of experiments every year to evaluate and validate different ideas, hypotheses, and opinions. The basic idea is to split users at random into groups, or variations, each exposed to a different version of the variable under test. You then run the experiment for a set period of time, analyze the key metrics for each group, and interpret the results.
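The mechanics above can be sketched in a few lines of Python. This is an illustrative example with hypothetical helper names, not any particular company's tooling: users are assigned to variations at random, and a conversion metric is computed per group.

```python
import random

def assign_variant(user_ids, variants=("control", "treatment")):
    """Randomly assign each user to one variation (uniform split)."""
    return {uid: random.choice(variants) for uid in user_ids}

def conversion_rate(events):
    """events: list of (variant, converted) pairs -> conversion rate per variant."""
    totals, conversions = {}, {}
    for variant, converted in events:
        totals[variant] = totals.get(variant, 0) + 1
        conversions[variant] = conversions.get(variant, 0) + int(converted)
    return {v: conversions[v] / totals[v] for v in totals}
```

After the experiment window closes, you would compare the rates returned by `conversion_rate` for each group before declaring a winner.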

Cupcakes, you say

Let’s take a look at a real-life example. Imagine you want to assess the effect of adding lemon zest to cupcakes. You bake two batches of cupcakes: one with lemon zest and one without (plain). You hand the plain cupcakes to one group of friends, while another group gets the new lemon zest version. To avoid bias, each person is assigned to their group at random. After they’ve eaten the cupcakes, you gather feedback from your friends. If the lemon zest version gains the most votes, it becomes your go-to cupcake recipe.

In the AdTech industry, it’s not always simple to gather user feedback. We can run surveys, but the response rate is often too low to produce statistically significant results. A/B testing is a great solution, as it gives us and advertisers a more scientific method to optimize ad spend instead of relying on guesswork. A big bonus for us in AdTech is that online advertising generates a huge amount of traffic. With such a large sample size readily available, reaching statistical significance is not a problem for us.
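With large samples, a standard way to check whether a difference between two groups is real rather than noise is a two-proportion z-test. The sketch below is a generic statistical example using only the Python standard library, not a description of any specific platform:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.
    conv_*: number of conversions; n_*: group sizes.
    Returns (z, p_value); the normal approximation holds for large samples."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value
```

For example, 500 conversions out of 10,000 versus 600 out of 10,000 yields a p-value well below 0.05, so the difference would normally be treated as significant.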

Ogury’s A/B test ice-breaker

Our first step was to build an in-house experimentation platform that met our experiment design and analysis requirements. The platform comprised two major parts: a real-time delivery decision-maker system, which split inventory traffic across different strategies, and an analysis layer with dashboards, which enabled us to compare results for those strategies.
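One common way for a real-time decision maker to split traffic is deterministic hashing, so the same user always lands in the same strategy across requests. This is a hypothetical sketch of that general technique, not Ogury's actual implementation:

```python
import hashlib

def bucket(user_id: str, experiment: str, variants=("A", "B")):
    """Deterministically map a user to a variant for a given experiment.
    Hashing the (experiment, user) pair keeps assignments stable per user
    while remaining independent across experiments."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

Because the mapping is a pure function of the inputs, no assignment table has to be stored or synchronized, which suits a low-latency delivery path.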

Once the team produced the first experimentation design, we ran our first test in our ads delivery process. 

We tested several ways of choosing the best ads to display to customers. The approach that achieved the best result used Ogury’s proprietary historical mobile journey data to predict user engagement rates. This first A/B test measured a 16% lift in the accomplished rate, the rate at which customers engage with an ad they see. After this successful ice-breaker, we were able to scale both the number of experiments we ran and the maturity of our platform.
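For reference, relative lift is the change in a metric expressed as a percentage of the control value. A tiny illustration with made-up rates (these are not Ogury's actual figures):

```python
def lift(control_rate, treatment_rate):
    """Relative lift of the treatment over the control, in percent."""
    return (treatment_rate - control_rate) / control_rate * 100

# e.g. moving an accomplished rate from 5.0% to 5.8% is a 16% relative lift
print(round(lift(0.050, 0.058), 1))
```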

The experimentation platform dashboard displays the results from the A/B tests.

We’re doing a walk-run

As a rough rule of thumb, an organization’s A/B testing stage is determined by the number of experiments it conducts:

  • Crawl phase: approximately one test per month (~10/year)
  • Walk phase: approx. one test per week (~50/year)
  • Run phase: approx. daily testing (~250/year)
  • Fly phase: thousands per year!

Today at Ogury, we are between the walk and run phases. The culture of A/B testing has grown so much that it’s now a fixture of many product engineering meetings. Someone will always ask, “Did you run an experiment?”, or respond to an idea with, “Let’s run an experiment on that”. It’s almost second nature now. This reinforces our data-driven culture at Ogury, as part of which we make sure experiment results are highlighted to the entire company.

Vodafone is a leading telecommunications company in the UK. With new legislation coming into force to give consumers greater flexibility over their mobile phone contracts, Vodafone needed to drive awareness and consideration of its new offering for sole traders. Vodafone partnered with Ogury to identify and reach a specific audience on mobile with impactful, engaging and fully on-screen formats.

Challenge

Vodafone launched a new solution for its business customers, giving them greater flexibility and control of their digital lives. The Vodafone EVO for Sole Traders offering enables business customers to pay for their phone and airtime separately, creating a monthly plan that suits their needs. Vodafone needed a mobile technology partner that could accurately identify its target audience and reach them at scale with engaging video ads in a brand-safe and fraud-free environment, all while respecting consumer privacy. 

Solution

Ogury leveraged its proprietary audience data to define the sole trader audience: individuals who operate their own business with no employees. As a result, they wear multiple hats, from social media manager to e-commerce manager to executive assistant. Knowing these behaviors, and the apps and sites they frequently use, enabled Ogury to pinpoint the audience and ensure the campaign reached the most motivated sole traders in the UK.

Ogury delivered the campaign to sole traders using its proprietary method Ad Chooser to drive brand awareness with impactful fully on-screen videos. With Ad Chooser, the consumer can select their ad experience, which boosts their ability to remember the ad. 


Ogury Ad Chooser

Results

Thanks to Ogury’s solution, the campaign achieved an average video completion rate (VCR) of 89% and viewability of 94%. Moreover, the campaign’s Impact Survey reported a 77% lift in ad recall among those who had seen the ad, indicating that it was memorable and had been delivered to the intended audience.

Download Vodafone’s case study