Tell us about an A/B test you've conducted in email marketing. Give us some context about your company and industry, then tell us about the parameters of your experiment. What were the results and why did they surprise you?
The most surprising A/B test I've performed involved using less text and more labeled images in emails sent later in the evening. The results showed a significant increase in engagement and conversions, suggesting that our audience preferred more visual content and less text during late hours. This indicated that readers might be more inclined to interact with visually driven content, such as images or videos, when they are winding down for the day and not in the mood for extensive reading.
I had seen an article suggesting the use of all lowercase letters in a subject line and thought it was silly. I ended up running an A/B test of proper capitalization vs. all lowercase in the subject line, and to my surprise, the all-lowercase subject line had a significantly higher open rate.
The most interesting A/B test I've run recently has actually been around send times, not content. We have been using smart send times (the times when people are most likely to open email) to target the best times for us to send.
As a learning and development company, we wanted to see if click-through rates could be improved by using buttons or images for links to case studies. We found that the sample groups preferred clicking meaningful text links, which provided more context.
I split-tested two subject lines. One was "Therapy Product Sale Ending soon" and the other was "Final hours for discounted therapy products". I expected the second one to perform better, since I didn't capitalize every first letter and it used the word "discounted" instead of "sale". I was completely wrong: the first subject line performed much better, with almost double the number of opens and clicks.
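A quick way to sanity-check whether a gap like "almost double the opens" is statistically meaningful, rather than noise from a small send, is a two-proportion z-test. The sketch below uses only the standard library; the open and send counts are hypothetical, not figures from the test described above.

```python
from math import sqrt, erf

def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
    """Two-sided two-proportion z-test comparing two open rates."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # Pooled proportion under the null hypothesis (no difference)
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: variant A nearly doubles variant B's open rate
z, p = two_proportion_z(opens_a=220, sent_a=1000, opens_b=120, sent_b=1000)
print(f"z = {z:.2f}, significant at 5%: {p < 0.05}")
```

With lists of a thousand or more recipients, a near-doubling of opens is almost always significant; with a few dozen sends, the same ratio can easily be chance, which is why running the numbers matters before declaring a winner.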
I ran an informal A/B test when selling my handmade products to friends. They chose version A because I had written some quotes on it, and passed over version B, which was plain but still attractive.
One that I did in the past was changing how pricing for airline tickets was displayed. Version A listed the lowest price first in a left-to-right lineup; version B listed the highest price first. Although most people still bought the lowest price, we saw a significant uptick in higher booking values with version B.
This isn't an A/B test in marketing; rather, as a BPO QA, I implemented two different coaching and training approaches for two groups of agents. After a week, I evaluated their progress to identify the most effective methods for coaching and training.