Interview transcript
Aleksandra Semenenko
Director of Data Science at Artefact
"Mastering Marketing ROI thanks to Incrementality Testing"
Hi everyone, I’m Emmanuel Malherbe. I’m the Director of Research at Artefact, and I have the pleasure of hosting the Data Coffee. The concept is simple: we have one topic, one expert, and one coffee. The topic today is MROI, and the expert is Aleksandra. Hi Aleksandra!
Hi Emmanuel. I’m the Director of Consulting and Data at Artefact, and I take care of topics related to marketing measurement. Last time we had a coffee, we talked about MMM, that is, marketing mix modeling. Today, I thought it would be great to continue our discussion and talk about the other measurement approaches that exist.
What would be the difference compared to what we discussed last time?
The main difference would be the approach that companies use to test the incrementality of their marketing campaigns. It’s not always clear that you need to start with MMM from the beginning. As we discussed last time, MMM gives you amazing results. It’s really the strategic tool that you want to use, but at the same time, it requires a big data foundation, and you need to understand the whole business agenda that you want to cover with the model.
So, it’s quite demanding in terms of resources and data.
Exactly. It’s a big undertaking. Sometimes you want to start a bit easier, perhaps with foundational data, or take a step back and redefine your business questions. That’s when we go for incrementality testing, which we’re going to talk about today.
Excellent! Can you tell me a bit more about incrementality testing?
Sure, with pleasure. I’ll start with a small story. Imagine you are a CMO or head of marketing in a big company. You sell products all around Europe, the States, and Asia. You have a big production line and big demand on the market. In every region, you have a head of marketing or a marketing department. Once a year, all of your teams come to you, and you decide what you’re going to do for the next year – the activation plan.
Now, the UK team tells you that online video formats are booming and you need to invest there. So, you need to allocate part of the budget to shoot an expensive video campaign. Then, your Italian team tells you that yellow banners are not working; they need to be red, or they’re out of business. Basically, every team comes to you with their marketing plans, and you have 25 markets with 25 marketing teams. How do you feel?
I feel completely overwhelmed, to be honest.
It’s a lot, right? Many clients, like you in this case, don’t have a good tool to understand where to invest their money. Also, you’re not running an MMM, so you don’t have a plan at this point. You need something else. I’ll give you an example. One of our clients, a very global company with 25 markets, ran incrementality testing. They found that 63% of all marketing initiatives proposed for the year had a negative ROI. This means you spend more on these campaigns than you gain in revenue, which is not what you want. However, 37% were great, so those needed to scale. Imagine you do it in France, it works out great, and you have a huge ROI. You want to expand this learning to other countries, but it’s only 37%. Without a tool to test it, you wouldn’t guess the percentage was that low. You might think it’s 50% or 60%, but it was 37% in this example, which was shocking. Coming back to the question of incrementality experiments, we use them to understand what to scale and what not to scale. We’re very tactical. We receive campaign plans from all market leads across different brands and geographies. We try every idea on a small scope, not the full country or budget straight away. We see if it works, and if it does, we scale it. If it doesn’t, we stop it and learn from it.
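To make the mechanics concrete, here is a minimal sketch of what the read-out of such a test could look like. The figures, the region split, and the simple average-of-controls baseline are hypothetical illustrations, not Artefact's actual methodology or a real client case.

```python
# Minimal sketch of an incrementality read-out (hypothetical data).
# Assumption: one test region received the new campaign, comparable control
# regions kept business as usual during the same weeks.

import numpy as np

# Weekly revenue (in k EUR) during the test period
test_region = np.array([120, 125, 131, 128])   # region exposed to the new campaign
control_regions = np.array([
    [118, 117, 119, 120],                       # comparable regions without the campaign
    [122, 121, 123, 122],
    [119, 120, 118, 121],
])

campaign_cost = 25.0                            # k EUR spent on the test campaign

# Counterfactual: what the test region would have done without the campaign,
# approximated here by the average of the control regions (a real study would
# typically use matching or synthetic-control methods to build this baseline).
counterfactual = control_regions.mean(axis=0)

incremental_revenue = (test_region - counterfactual).sum()
roi = (incremental_revenue - campaign_cost) / campaign_cost

# A negative ROI is the signal not to scale the campaign nationally.
print(f"Incremental revenue: {incremental_revenue:.1f} k EUR")
print(f"Incremental ROI: {roi:.0%}")
```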
It sounds very agile compared to the big yearly plan.
Absolutely. In a mature organization where your big marketing measurement programs are aligned, these two things work together. Experiments become part of your business as usual. For example, we had a client who dedicated 80% of their marketing budget to incrementality tests in the first year. After learning everything that year, they decreased it to 20% the next year. Incrementality experiments should become your business as usual. When you always test new things and get feedback from the market, you can feed these results into big MMM initiatives that provide a strategic overview. MMM relies on data foundations. Imagine you’re working with influencers, a type of marketing channel where you invest a lot, but it doesn’t last long. For MMM, that would mean only one or two data points, which is not enough. But if you run incrementality experiments, you can feed those results into your MMM. Incrementality testing also encourages you to try different things in your channels. The worst thing for MMM is when your TV activation is flat: always doing the same thing and putting the same money into TV every day, month, and year. For a statistical model, it becomes a constant term; your sales change but your media doesn’t, so the model cannot attribute any of that variation to TV. Incrementality testing pushes you to play with your TV investments. Make smaller investments in a small scope to understand whether a new campaign works well, and invest as usual in the rest of the country. If the small campaign works, increase your overall investment in TV, perhaps taking budget from channels that didn’t work, like the 63% we talked about previously.
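The "flat TV spend" issue described above can be illustrated with a toy regression (invented numbers, purely to show the statistical intuition): when a channel's spend never varies, its column in the design matrix is collinear with the intercept, so the model cannot recover its effect; varying the spend, as an experiment forces you to do, makes the coefficient estimable.

```python
# Toy illustration (hypothetical numbers): flat TV spend is indistinguishable
# from the model intercept, so its coefficient cannot be identified.

import numpy as np

weeks = 12
rng = np.random.default_rng(0)

baseline = 100.0
tv_effect = 0.8                                   # "true" incremental revenue per unit of TV spend

# Case 1: TV spend is flat every week -> column is collinear with the intercept
tv_flat = np.full(weeks, 50.0)

# Case 2: an experiment deliberately varies TV spend across weeks
tv_varied = np.array([50, 50, 30, 30, 70, 70, 50, 50, 20, 20, 80, 80], dtype=float)

for name, tv in [("flat", tv_flat), ("varied", tv_varied)]:
    sales = baseline + tv_effect * tv + rng.normal(0, 2, weeks)
    X = np.column_stack([np.ones(weeks), tv])     # intercept + TV spend
    rank = np.linalg.matrix_rank(X)
    coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
    # With flat spend, X is rank-deficient and the returned coefficient is an
    # arbitrary minimum-norm solution, not the true effect; with varied spend,
    # the estimate lands close to 0.8.
    print(f"TV {name}: design-matrix rank = {rank}, estimated TV coefficient = {coef[1]:.2f}")
```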
It’s really not just about a lot of data but getting diverse and rich data.
Absolutely, and it gives you different types of business insights. MMM is something you run once or twice a year to inform your strategy for the year, while incrementality testing is more tactical, working with operational teams on the ground. Both can bring a similar revenue uplift. MMM typically ranges between 10% and 20% over one to three years, depending on your business type. Incrementality tests range between 7% and 15% over one year, as you restart testing with a new learning agenda each year. The effect is immediate but short. Potentially, you can achieve comparable results with both: one is long-term and strategic, the other short-term and tactical, but both bring business results.
This is very clear, Sasha.
Thanks a lot for this fascinating discussion.
Thanks a lot for your attention and see you for the next Data Coffee.