Michelle Yang

A/B Testing in Social Advertising

How to optimise your strategy & improve your social media campaigns



Why should we do A/B testing on social media?


In marketing, we tend to think we know best what our customers want. But often, results and our expectations don’t line up. Different age groups, genders, geo-locations, or even what is happening on the other side of the country could make a beautifully curated campaign go haywire.


This is where A/B testing comes into play.


A/B testing is an excellent tool for advertisers to gain insights into the behaviours of their target demographics. By better understanding your audience’s needs and online habits, you can optimise your marketing strategy and improve future campaigns.


What is A/B testing on Facebook?


Facebook A/B testing allows us to test different variables, such as ad format, ad copy, audience groups or ad placement, to determine which performs best. For example, you could assume that customers who have visited your website have a higher intent to shop than customers who have liked your Facebook page. An A/B test on these two audience groups lets you compare both strategies at the end of the testing period and see whether that assumption holds.


How to do A/B testing on Facebook


A/B testing on Facebook allows you to compare different versions of your ads to find which one will most effectively promote your brand. For the most conclusive results, ensure everything is identical except for the one variable you’d like to test.


Where do we begin?


First, you need to know which objective you want to A/B test, because different objectives dictate the key performance indicators (KPIs) used to find the winning version when the test concludes.


Facebook allows you to test the following objectives:

  • Awareness

  • Traffic

  • Engagement

  • Leads

  • App promotion

  • Sales


Once you’ve decided on the objective, you can move on to the variable or question you’d like to answer. For example, you could start with, ‘How do our email subscribers engage with us?’ or ‘Does our target audience interact more with sales messages or brand messages?’ Then, the question can be refined to a more specific level, such as ‘Do we get a higher return-on-ad-spend from our email subscribers?’ or ‘Do we get more landing page views if we use more sales-oriented copy in the ads?’ This will help you gain insights into how your email subscribers engage with your brand socially and make it clear which key metrics the test needs to measure.
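Before you open Ads Manager, it can help to write the test down as a simple plan: the objective, the refined question, the single variable you’re changing, and the KPI that will decide the winner. The sketch below is just one way to note this down in Python; the field names and example values are our own assumptions for illustration, not anything Facebook requires.

```python
from dataclasses import dataclass


@dataclass
class ABTestPlan:
    """A written-down hypothesis for one A/B test (illustrative only)."""
    objective: str     # one of Facebook's objectives, e.g. "Sales" or "Traffic"
    question: str      # the refined question the test should answer
    variable: str      # the single thing that differs between versions
    variant_a: str
    variant_b: str
    primary_kpi: str   # the metric that decides the winner


# Example: testing email subscribers against Facebook page likers
plan = ABTestPlan(
    objective="Sales",
    question="Do we get a higher return-on-ad-spend from our email subscribers?",
    variable="audience",
    variant_a="Email subscriber custom audience",
    variant_b="Facebook page likers",
    primary_kpi="return-on-ad-spend",
)
print(plan)
```

Writing the plan out first keeps the test honest: only one variable changes, and the winning metric is agreed before any results come in.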


Target audience and size


A general rule of thumb is that the Facebook algorithm prefers scale: the more you put in, the more you’re likely to get out. So, in this case, the bigger the audience, the better. If the audience group contains fewer than 100 people, the ad is likely to under-deliver. To avoid under-delivery, we recommend broadening the audience to at least 10,000 people.
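The 10,000 figure is about giving Facebook enough room to deliver, but a classic sample-size estimate is also a useful sanity check on whether your audience can produce a conclusive result. The sketch below uses a standard two-proportion calculation; the baseline conversion rate and the lift you care about are assumptions you would swap for your own numbers.

```python
from scipy.stats import norm


def required_sample_per_variant(p_baseline, p_variant, alpha=0.05, power=0.8):
    """Approximate people needed per variant to detect the difference between
    two conversion rates (standard two-proportion sample-size formula)."""
    z_alpha = norm.ppf(1 - alpha / 2)   # e.g. 1.96 for 95% confidence
    z_beta = norm.ppf(power)            # e.g. 0.84 for 80% power
    p_bar = (p_baseline + p_variant) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p_baseline * (1 - p_baseline)
                             + p_variant * (1 - p_variant)) ** 0.5) ** 2
    return numerator / (p_baseline - p_variant) ** 2


# Assumed numbers: a 2% baseline conversion rate and a hoped-for lift to 2.5%
n = required_sample_per_variant(0.02, 0.025)
print(f"~{n:,.0f} people reached per variant")  # roughly 13,800 with these assumed rates
```

The smaller the difference you’re trying to detect, the more people you need, which is another reason a tiny audience rarely produces a clear winner.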


You also need to ensure this audience will only be exposed to the A/B testing ad and will not be included in other ad sets, as this would likely lead to inconclusive results.


Duration of test


Facebook’s conversion window defaults to seven days. For best results, we recommend running the test for at least two weeks. Of course, it never hurts to know more; you can A/B test on Facebook for a maximum of 30 days. The ideal testing duration depends on your objective and offering. For example, if you sell a premium product and it takes consumers longer to purchase after seeing your ad, you’re better off running the test for an extended period, such as 21 days, to allow ample time for them to decide and make a purchase.
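A simple way to think about duration: allow for the conversion window plus however long your customers typically take to decide, and keep the result between the recommended two weeks and Facebook’s 30-day cap. The heuristic below is only a sketch of that reasoning, with assumed numbers.

```python
def suggested_test_days(conversion_window=7, typical_days_to_decide=7,
                        minimum=14, maximum=30):
    """Rule-of-thumb test length: decision time plus the conversion window,
    kept between the recommended minimum and Facebook's 30-day maximum."""
    return min(max(typical_days_to_decide + conversion_window, minimum), maximum)


print(suggested_test_days())                           # 14 days for a quick purchase cycle
print(suggested_test_days(typical_days_to_decide=14))  # 21 days for a premium product
```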


Advertising budget


As we know, Facebook prefers scale. The higher the budget, the more delivery Facebook can give each version of your ad, and the more data you’ll have to compare when the test ends.


A lower budget can lead to under-delivery. For example, suppose you would like to test two different ad formats on an audience group of 100,000 people. In that case, the budget needs to be high enough to reach a meaningful share of that audience ($100 will not work, unfortunately). If the budget is too low, the Facebook algorithm will struggle to deliver the ads in the first place.
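As a rough sanity check, you can estimate a budget floor from the audience size, how often you want each person to see the ad, and a typical CPM (cost per 1,000 impressions). The CPM and frequency below are assumptions for illustration; plug in figures from your own account.

```python
def minimum_test_budget(audience_size, target_frequency, cpm):
    """Rough budget floor: enough spend to show the ad to the whole audience
    at the target frequency, at the assumed CPM (cost per 1,000 impressions)."""
    impressions_needed = audience_size * target_frequency
    return impressions_needed / 1000 * cpm


# Assumed inputs: 100,000 people, each seeing the ad ~2 times, at a $12 CPM
budget = minimum_test_budget(100_000, 2, 12.0)
print(f"Estimated minimum spend across both versions: ${budget:,.0f}")  # $2,400
```

Even this back-of-the-envelope figure makes it obvious why $100 can’t cover a 100,000-person test.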


Results


The metrics you review will depend on the objective you chose for the test. For example, if the objective is Sales, you could look at revenue (naturally), return-on-ad-spend and cost-per-acquisition. But if the objective is Traffic, you could look at landing page views and cost-per-link-click.
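To make the comparison concrete, here is a minimal sketch of how those metrics are calculated from each version’s raw results; the numbers are made up for illustration.

```python
def sales_kpis(spend, revenue, purchases):
    """KPIs for a Sales objective: return-on-ad-spend and cost-per-acquisition."""
    return {
        "ROAS": revenue / spend,
        "CPA": spend / purchases,
    }


def traffic_kpis(spend, landing_page_views, link_clicks):
    """KPIs for a Traffic objective: cost per landing page view and per link click."""
    return {
        "cost_per_landing_page_view": spend / landing_page_views,
        "cost_per_link_click": spend / link_clicks,
    }


# Illustrative results for two versions of a Sales test
version_a = sales_kpis(spend=1200, revenue=4800, purchases=60)  # ROAS 4.0, CPA $20
version_b = sales_kpis(spend=1200, revenue=3600, purchases=80)  # ROAS 3.0, CPA $15
print(version_a, version_b)
```

Notice that in this made-up example version A wins on return-on-ad-spend while version B wins on cost-per-acquisition, which is exactly why it pays to decide your primary KPI before the test starts.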


After reviewing your results, you may find there is no obvious winner. However, there are a few ways to troubleshoot an inconclusive A/B test.
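One common reason for ‘no obvious winner’ is that the difference between the two versions is simply too small for the amount of data collected. A standard two-proportion z-test is one way to check; the conversion counts below are assumptions for illustration.

```python
from scipy.stats import norm


def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test on the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * norm.sf(abs(z))
    return z, p_value


# Assumed results: 210 conversions from 10,000 people vs 240 from 10,000
z, p = two_proportion_z_test(210, 10_000, 240, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p is about 0.15 here, so the gap could easily be noise
```

If the p-value is large, the usual fixes are to run the test for longer, widen the audience, or test a bolder difference between the two versions.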


What’s next?


A/B tests can provide valuable information to help you create more successful advertising campaigns.


There are a few ways you can approach the results. If the objective was Traffic, you could rerun the test with a different objective, say, Sales, to see whether the results hold. Because these two objectives are different, the KPIs will need to be re-aligned so that you can find the winning version. The audience group that was more likely to click on the ad and visit the website might not be the group that is more likely to convert and purchase.


If you want to test with the same objective but different audience groups, you can use the winning ad creative in the rerun to find out which group engages best with that creative.


Need a hand with your A/B testing criteria? Or want to chat about your social media marketing strategy? Shoot us an email at hello@socialmotive.com.au or get in touch here.
