What’s an A/B test you’ve performed that surprised you and why?
Tell us about an A/B test you've conducted in email marketing. Give us some context about your company and industry, then tell us about the parameters of your experiment. What were the results and why did they surprise you?
What’s an A/B test you’ve performed that surprised you and why?
I learned about A/B testing during a digital marketing course, though I haven't run one in a live campaign yet. We used a simulation tool to test different versions of ads and see which performed better. I was surprised to find that a simple design with fewer colours sometimes got more engagement than a more colourful, eye-catching version. It taught me that user preferences are not always what we expect, and that testing is essential to find out what really works.
What’s an A/B test you’ve performed that surprised you and why?
While I haven't used A/B testing in the real world, during college I took a social media marketing course that used a simulation program tasking students with running A/B tests for a fictional brand to drive engagement. We could edit nearly every design aspect of the "posts," from the target social media platform to the size and type of image in the post.
What’s an A/B test you’ve performed that surprised you and why?
As a retail marketing intern in the food industry, one A/B test I performed involved comparing two in-store product placements for a new snack launch—one at eye-level on the main aisle and another on the checkout counter display.
Surprisingly, the checkout counter display outperformed the eye-level aisle placement by a significant margin in terms of impulse purchases. This was unexpected because conventional wisdom suggests eye-level placement drives more sales. The result highlighted the power of last-minute purchase decisions in food retail and the importance of strategic placement near the point of sale.
What’s an A/B test you’ve performed that surprised you and why?
As the Chief Marketing Officer at Paysita Finance, a digital financial services company in the fintech industry, I regularly lead performance-driven email marketing campaigns. One of our goals was to increase user engagement and conversion rates for our newly launched USD transactional accounts feature.
Context & Objective:
We wanted to determine whether emphasizing emotional appeal or practical benefits in our subject lines would lead to higher open and click-through rates. The audience consisted of existing users segmented by activity level and demographics.
A/B Test Parameters:
• Version A (Emotional appeal): Subject line – “Enjoy Freedom with Your New USD Account”
• Version B (Practical value): Subject line – “Withdraw USD Anytime — No Limits, No Stress”
The email content was identical for both versions, focusing on the ease of using the new USD feature.
Results:
• Version A (Emotional): 18% open rate, 4.1% click-through rate
• Version B (Practical): 26% open rate, 7.3% click-through rate
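As a side note, a gap like 18% vs. 26% can be sanity-checked for statistical significance with a two-proportion z-test. A minimal sketch follows; the sample sizes are hypothetical, since the original results report only the rates:

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical list sizes -- the campaign reported only percentages.
n_a, n_b = 5000, 5000
opens_a = round(0.18 * n_a)  # Version A (emotional): 18% open rate
opens_b = round(0.26 * n_b)  # Version B (practical): 26% open rate

# Pooled two-proportion z-test on open rates.
p_a, p_b = opens_a / n_a, opens_b / n_b
p_pool = (opens_a + opens_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(f"z = {z:.2f}, p = {p_value:.4g}")
```

At list sizes in the thousands, an 8-point open-rate lift is far outside what random noise would produce, so the result can be acted on with confidence.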
Why It Surprised Me:
We anticipated that emotional messaging would resonate more, especially since financial freedom is a strong value in our brand identity. However, the practical subject line outperformed significantly. This revealed that our audience responded more to direct, benefit-driven communication than aspirational language, especially when dealing with transactional financial products.
Takeaway:
This test reshaped how we approached subject lines and copy. We began using more straightforward language in email marketing, emphasizing specific, tangible benefits—and saw continued improvements in engagement across campaigns.
What’s an A/B test you’ve performed that surprised you and why?
As a marketing cloud product expert and strategist, I have run various A/B tests for customers across different industries. In one test for a client in the automobile industry, we experimented with different send times and with the tone of the copy and design. We found that emails sent during the day with vibrant colours, and emails sent in the evening with subtle colours, performed better, matching customers' energy levels at those times.