Diagram Views

A/B Testing Ideas for Paid Social

Sarah Corley
#Inbound Marketing, #Social Media
Published on June 30, 2020

A/B testing your social media ads can help improve the results of your campaigns. Whether it's more clicks, conversions or improved engagement, we'll teach you how to craft effective A/B tests for social ads.

As marketers, we should make data the foundation of every decision we make for clients.

A house can’t stand without a strong base, and the same applies to a marketing strategy. Without data guiding decisions, time and money are wasted, and you’ll end up with less-than-desirable test results.
 
On a web page or a landing page, it’s easy to try different colors, copy and layouts to see what performs best. In fact, in one of the most famous examples, Google tested 40 different shades of blue to land on the blue you now see in Gmail and across its homepage. Once you've planned your paid social campaign, it's time to think about creating different types of assets to test.

On various social platforms like Facebook, Twitter, Instagram and LinkedIn, you can A/B test different versions of your paid social media ads to see what resonates with your audiences. In fact, A/B testing your ads can help you make content marketing and advertising decisions in real time!


What is A/B Testing? 


A/B testing is also referred to as split testing. It allows you to test two or more versions of an ad and determine a winner based on an end goal, usually conversion rate. (Testing several variables at once is known as multivariate testing, which we'll touch on for audiences below.) Instead of just guessing which content might perform best, A/B testing your paid social media ads gives you the best bang for your buck and insights that help shape future online strategies.

 


A/B Testing Basics: 

  • Before A/B testing, decide which key performance indicators (KPIs) matter most to you. By forming a hypothesis, testing it and making decisions based on statistical significance, you'll make smarter decisions related to your goals (see the sketch after this list for one way to check significance).
  • A/B test your ads by changing one variable at a time so you know what is and isn’t working. On a web page, this could be a button color. On social media ads, it could be a variation of the ad creative, the thumbnail of a video, etc.
  • Retest audiences periodically, as responses can shift while your audience grows and changes over time.
  • A/B testing needs a high volume of traffic to be effective. Focus on your top-performing ads to ensure a sufficient sample size and maximize results.
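
If you want to gut-check statistical significance yourself rather than rely on a platform dashboard, a standard two-proportion z-test is one way to compare conversion (or click) rates between two variants. Below is a minimal Python sketch; the conversion counts and impression totals are hypothetical, so swap in your own numbers.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical example: 120 conversions out of 4,000 impressions for Version A
# vs. 95 conversions out of 4,100 impressions for Version B.
rate_a, rate_b, z, p = two_proportion_z_test(120, 4000, 95, 4100)
print(f"Version A: {rate_a:.2%}  Version B: {rate_b:.2%}  z = {z:.2f}  p = {p:.3f}")
# A common rule of thumb: treat p < 0.05 as statistically significant.
```

If the p-value stays high, the test probably needs more traffic before you can call a winner, which is exactly why sample size matters so much.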


Paid Social A/B tests to try:  


Copy: This is one of the most common A/B tests and probably one of the easiest to start with. The text in your ad is worth testing because audiences respond differently to different language, and small wording changes can shift your results.

  • Copy tests to consider:
    • Post length: shorter captions vs. longer captions vs. bullet points
    • Post style: Which will perform better: a question, a bold statement, or a statistic?
    • Use of emojis 
               

[Ad image: Version A]

Version A: 1,034 likes; 56,407 impressions

[Ad image: Version B]

Version B: 418 likes; 16,954 impressions 

Above is an example of A/B test ads and results that we ran for a client. The only difference between the two is the text. These paid social media ads ran for one week with the same budget, cost per result, image and sample size. You can see how Version A outperformed Version B in likes and impressions based on copy alone. Why is that? The copy in Ad A focused on building rapport and community, while the copy in Ad B focused on the client and what they could provide to the customer.

Images: Split testing media can also be a great way to find out what visually resonates with your audience and what deters them. For example: will ads with people in them outperform a staged or landscape image? Will GIFs perform better than static images? Will images with brighter colors outperform those with dark colors? Text overlay or no overlay? A single product shot or a styled shot? These are all options you can A/B test, and the data will give you more insight.


Headlines: If your headlines aren’t attention-grabbing, users may not even read your ad before scrolling right past it. Try experimenting with active language instead of a question.

Experiment with different hooks: The first few seconds of a video ad or a boosted reel are crucial for grabbing attention. Test different hooks such as bold text, catchy visuals, or a quick intro to see which one leads to higher engagement. 

Test different music and audio: Testing different background music tracks, voiceover, or other audio can lead to more views and shares. Try different tracks to see what gets the best response. 

Video thumbnail variations: Thumbnails can make or break the clickthrough rate for boosted reels and video ads on social. Testing different stills or visuals can help you discover what captures attention.

Call-to-Action (CTA): Each ad should have only one call-to-action. Whether it’s “visit” or “shop now,” your CTA button wording in your paid social media ads is another component to test. 

Various Ad Formats: Experiment with different ad formats like carousel, video, and single image ads, but keep variations within the same format for A/B testing. Testing drastically different formats within the same ad set can quickly drain your budget and lead to skewed results. Instead, compare elements like text overlays or thumbnails within a specific format to get clearer insights on what resonates most with your audience.  

Audience Targeting: Test different audience segments using a variety of targeting characteristics; these tests often produce some of the most useful data. Multivariate testing on audiences might look like the following (see the sketch after this list for how quickly the combinations multiply):

  • City/State/Zip Codes
  • Countries
  • Gender
  • Education Level
  • Interests/Hobbies
  • Groups
  • Behaviors 
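
To see why audience tests eat budget quickly, here is a small Python sketch (with made-up targeting values, not tied to any ad platform's API) that enumerates every combination of a few targeting dimensions. Each combination would need its own ad set and its own share of spend.

```python
from itertools import product

# Hypothetical targeting dimensions to combine into test audiences.
# Values are illustrative placeholders, not real platform targeting options.
locations = ["Chicago, IL", "Austin, TX"]
age_ranges = [(25, 34), (35, 44)]
interests = ["home improvement", "interior design"]

audiences = [
    {"location": loc, "age_range": ages, "interest": interest}
    for loc, ages, interest in product(locations, age_ranges, interests)
]

for i, aud in enumerate(audiences, start=1):
    print(f"Ad set {i}: {aud}")
# 2 locations x 2 age ranges x 2 interests = 8 audience variants to fund,
# which is why audience tests burn through spend quickly as you add dimensions.
```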

If you choose to A/B test manually (outside of platform-specific tools), keep the audience broader to reduce the risk of overlap and get more accurate insights. We'll cover manual A/B testing shortly. 

Placement: Where will your ad appear? In a story? In the feed? In the right-hand sidebar on desktop? Try different placements to find out which one is most effective for your ad.

Day and Time: When are your paid social media ads performing best? 

Device type: Any patterns here? 
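
If you export raw results from your ads manager, a few lines of pandas can surface day-of-week and device patterns. The file name and column names below are assumptions; adjust them to match whatever your platform actually exports.

```python
import pandas as pd

# Hypothetical export from your ads manager; assumed columns:
# timestamp, device, impressions, clicks.
df = pd.read_csv("ad_results.csv", parse_dates=["timestamp"])
df["day_of_week"] = df["timestamp"].dt.day_name()

def ctr_by(column):
    """Click-through rate aggregated by a single dimension."""
    totals = df.groupby(column)[["clicks", "impressions"]].sum()
    totals["ctr"] = totals["clicks"] / totals["impressions"]
    return totals.sort_values("ctr", ascending=False)

print(ctr_by("day_of_week"))
print(ctr_by("device"))
```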

Manual A/B Testing vs. On-Platform Testing: While built-in A/B testing tools on social platforms can be convenient, there are situations where our team runs manual A/B testing, as it provides more flexibility and long-term benefits:

Maintaining Campaign Momentum: When you test manually, you can run your campaign continuously without setting a strict start and end date, unlike platform A/B testing tools that often require you to define a testing period. This means that if one variation performs exceptionally well, you can keep the ad running without interrupting its momentum. There's no need to restart a new campaign, which can sometimes lead to a drop in performance as the ad has to re-enter the learning phase.

Greater Flexibility: Manual testing allows you to control how variations are set up, giving you more freedom to adjust budgets, targeting, or creative elements on the fly. With platform-specific tools, your options might be more rigid, limiting how much you can experiment or adapt during the test.

Better Budget Management: By running ads manually within a campaign, you can more easily allocate budgets to high-performing ads as soon as you see results, rather than waiting for the platform's A/B tool to conclude the test. This real-time optimization can lead to better results without wasting budget on underperforming ads.
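
As a rough illustration of that kind of real-time reallocation, here is one simple heuristic in Python. It is a sketch, not a platform feature, and the ad names and numbers are hypothetical.

```python
def reallocate_budget(daily_budget, results):
    """Split tomorrow's budget in proportion to each ad's conversion rate.

    `results` maps ad name -> (conversions, impressions). This is a simple
    heuristic; in practice you would usually keep a minimum spend per ad so
    weaker variants still gather data.
    """
    rates = {ad: conv / imp for ad, (conv, imp) in results.items()}
    total = sum(rates.values())
    return {ad: round(daily_budget * rate / total, 2) for ad, rate in rates.items()}

# Hypothetical mid-test numbers for two ad variants sharing a $100/day budget.
print(reallocate_budget(100, {"Ad A": (42, 2000), "Ad B": (18, 2100)}))
# -> roughly {'Ad A': 71.01, 'Ad B': 28.99}
```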

Overall, we recommend manual A/B testing when you want to maintain campaign performance and have the flexibility to act quickly on insights without losing traction. As mentioned earlier, keep your audience as broad as possible for manual testing to minimize the risk of overlap.

A/B testing is a powerful tool for refining your social ad campaigns, but success depends on a strategic approach. Whether you're experimenting with different ad formats, testing variations within the same format, using on-platform tools, or choosing to run tests manually, the key is to keep your tests controlled and data-driven. By understanding your audience, setting clear goals, and using the right testing methods, you can optimize ad performance, make data-backed decisions, and maximize your marketing budget. Remember, consistent testing and learning are the keys to staying ahead in the competitive world of social advertising.