Advantage+ and A/B Testing: Balancing Automation and Creativity
Meta recently announced a significant change to the structure of its advertising campaigns with the introduction of Advantage+. This consolidation of campaigns and ad sets has left some advertisers wondering about the future of testing on the platform: with the rise of AI and its ability to test multiple creative combinations automatically, is there still a place for manual A/B testing?
Advantage+ simplifies campaign creation and management: the only level at which activity can be split out is the ad level. This means specific budgets can no longer be allocated to interest audiences, demographics, locations, product sets, or ads. That said, one benefit of Advantage+ is the ability to see the split between new and returning customers, provided you supply first-party and website data on existing customers. This level of insight is not available in other campaign structures on Meta, and it can be valuable when analysing campaign performance, especially for advertisers focused on lifetime value or on driving new customers.
However, this raises the question of how to test different creatives and strategies beyond the new-versus-returning customer split.
Meta claims that it can automatically test up to 150 creative combinations and deliver the highest-performing ad variation to the highest-value shoppers, which has led some to believe that manual A/B testing is no longer necessary. It is important to note, though, that automation can only take us so far. While it’s true that AI can quickly test a vast number of creatives, it may not be able to identify the nuances and subtleties that come with manual A/B testing.
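To make that 150 figure concrete: the cap applies to combinations of creative assets, which multiply quickly. A minimal sketch, with hypothetical asset counts (the pool sizes below are illustrative, not Meta's limits):

```python
from itertools import product

# Hypothetical creative asset pools; the counts are illustrative only.
images = [f"img_{i}" for i in range(5)]
headlines = [f"headline_{h}" for h in range(6)]
primary_texts = [f"text_{t}" for t in range(5)]

# Every image x headline x primary-text pairing is one creative combination.
combos = list(product(images, headlines, primary_texts))
print(len(combos))  # 5 * 6 * 5 = 150 creative combinations
```

Even modest asset pools hit the cap, which is exactly why automated delivery is attractive for breadth, and why targeted manual tests remain useful for depth.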
To find a balance between automation and creativity, there are a few approaches to consider.
One is to A/B test business-as-usual (BAU) activity against Advantage+ and compare the results. This allows us to continue testing different audiences and optimisation strategies outside of Advantage+, while also showing which approach drives performance against client objectives.
Another is to A/B test creatives in a separate campaign and add the winners to the Advantage+ campaign. This gives us more creative and branding control over what enters Advantage+, meaning we can continuously learn which creatives resonate with our audience while still leveraging the automation Advantage+ offers.
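Whichever split you run, "comparing the results" should mean checking the difference is statistically meaningful, not just eyeballing conversion rates. A minimal sketch of a two-proportion z-test on hypothetical campaign results (the figures are invented for illustration):

```python
from statistics import NormalDist


def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing the conversion rates of two variants.

    conv_a / n_a: conversions and impressions (or clicks) for variant A;
    conv_b / n_b: the same for variant B. Returns (z, p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided p-value
    return z, p_value


# Hypothetical results: BAU campaign (A) vs Advantage+ (B)
z, p = two_proportion_z_test(conv_a=120, n_a=10_000, conv_b=155, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If p falls below your chosen threshold (0.05 is conventional), the gap between the two cells is unlikely to be noise; otherwise, keep the test running, which matters given the longer learning periods discussed below.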
It’s important to keep in mind that testing periods on Advantage+ should be longer than we’re used to: it takes roughly twice as long to generate reliable learnings as it does with traditional campaign structures, so patience is essential.
In conclusion, Advantage+ has streamlined campaign creation and management on Meta, but it’s not the be-all and end-all of advertising success. There is still a place for manual A/B testing, especially for more nuanced and complex campaigns. By leveraging first-party data and taking the time to analyse results, advertisers can find a balance between automation and creativity that drives better performance on Meta.