Let’s start by looking at how A/B testing, a feature that’s only available for one-time campaigns, impacted performance. In nearly every case, the average open rates, click-through rates, and conversion rates for campaigns that used A/B testing were lower than for campaigns that didn’t.
This doesn’t mean A/B testing is a waste. The whole point is to determine which variation performs better. In many cases, that means the underperforming variation will bring the overall performance metrics down.
The benefit of A/B testing is that you can then apply what you learned from the test — that personalized subject lines prompt better open rates, that a more prominent call-to-action converts better, etc. — to future campaigns to improve your overall email performance.
As for drip campaigns, the results were mixed. One-time campaigns that didn’t incorporate a drip sequence generally performed better, with newsletters being the exception.
Among automated campaigns, leveraging a drip sequence proved effective for boosting the average open rate and the average conversion rate for pre-arrival messages.
There’s a lot you can learn from using both drip campaigns and A/B testing, so consider adding them to your toolbox.