Tell us about an A/B test you've conducted in email marketing. Give us some context about your company and industry, then tell us about the parameters of your experiment. What were the results and why did they surprise you?
One A/B test I performed compared subject lines: one was framed as a question and the other was not. Surprisingly, both produced the same open rate, despite the common advice that a subject line framed as a question performs better.
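When two subject-line variants appear to produce the same open rate, a quick way to check that the tie is not just sample noise is a two-proportion z-test on the send and open counts. Below is a minimal Python sketch; the counts are hypothetical, since the response above does not report its actual numbers.

```python
from math import sqrt
from statistics import NormalDist

def open_rate_ztest(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test: is the gap between two open rates beyond chance?"""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)            # pooled open rate
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))                  # two-sided p-value
    return p_a, p_b, z, p_value

# Hypothetical counts: question-style subject (A) vs. plain subject (B)
p_a, p_b, z, p = open_rate_ztest(opens_a=412, sends_a=2000, opens_b=405, sends_b=2000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p:.3f}")
```

A large p-value here means the test could not distinguish the two subject lines at that sample size, which is the statistical form of "both gave the same open rates."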
For my Substack newsletter (Substack is a newsletter publishing platform for writers), I A/B tested the best days and times to send an email to reach my audience. Initially I would post whenever I finished writing. After looking at the analytics and my readers' time zones, I found that the best slots were Wednesday and Friday evenings.
My experience with A/B testing comes from running ads on other platforms, testing YouTube titles and thumbnails, and publishing my Substack newsletter.
I have yet to use an email marketing tool for my main website. My online business focuses on providing self-development and personal finance content to singles.
We tested two different video ads on LinkedIn for a telecoms company. One (B) did much better than the other (A), which was a surprise since the marketing team had expected A to do better than B.
We conduct research within various specific professions. Within one area of healthcare, we tested send timing. We had historically sent invitations at 0600 ET so that we were in the inbox early. When we tested a 1700 ET send, we were surprised to find that one specific segment strongly exceeded its prior open/click/conversion rates at that time, while all other segments underperformed.
I work in the IT industry; we sell software, websites, digital marketing services, etc.
We send cold emails to prospective clients who are searching organically for digital assistance. I performed A/B testing on the subject line and on personalization: in version A I sent a portfolio matched to the client's business niche, and in version B I sent random websites. The response rate increased significantly.
The most surprising A/B test I've performed involved using less text and more labeled images in emails sent later in the evening. The results showed a significant increase in engagement and conversions, suggesting that our audience preferred more visual content and less text during late hours. This indicated that readers might be more inclined to interact with visually driven content, such as images or videos, when they are winding down for the day and not in the mood for extensive reading.
I had seen an article suggesting the use of all lowercase letters in a subject line and thought it was silly. I ended up running an A/B test of standard capitalization vs. all lowercase in the subject line, and to my surprise the all-lowercase subject line had a significantly higher open rate.
The most interesting A/B test I've run recently has actually been around send times, not content. We have been using smart sending times (the times when people are most engaged with opening email) to target the best times for us to send emails.