8 Effective Strategies for Testing Different Email Sending Frequencies
In the quest to perfect email marketing strategies, we turned to eight seasoned professionals, including CEOs and Founders, for their firsthand experiences in testing email-sending frequencies. From optimizing cadence with segmented frequency tests to leveraging user feedback, discover the varied approaches and successful outcomes they’ve achieved.
- Segmented Frequency Test Optimizes Cadence
- Split-Test Reveals Twice-Weekly Engagement Boost
- Methodical Testing Balances Engagement and Retention
- Predictive Sending Increases Opens and Revenue
- Timing Nudges to Match Conversion Patterns
- A/B Testing Segments for Engagement Metrics
- Continuous Testing Aligns with Customer Expectations
- User Feedback Informs Email Frequency Strategy
Segmented Frequency Test Optimizes Cadence
We implemented a segmented frequency test to optimize our email-sending cadence. We divided our subscriber list into three groups based on engagement levels: high, medium, and low. Each group received emails at different frequencies—twice weekly, weekly, and biweekly, respectively—over a three-month period.
Our most successful outcome was with the medium-engagement group. By increasing their email frequency from biweekly to weekly, we saw a 22% increase in click-through rates and a 15% boost in conversions. Interestingly, unsubscribe rates remained stable, indicating that the increased frequency didn’t negatively impact subscriber satisfaction.
This test revealed that tailoring email frequency to engagement levels can significantly improve campaign performance. It’s crucial to find the sweet spot where you’re staying top-of-mind without overwhelming subscribers.
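A tiered cadence like this can be sketched in a few lines of code. Everything below is a hypothetical illustration: the engagement scores, tier thresholds, and send intervals are assumptions, not the values used in the test above.

```python
from datetime import timedelta

# Illustrative cadences per engagement tier (assumed values, not the
# actual test's configuration): twice weekly, weekly, and biweekly.
CADENCES = {
    "high": timedelta(days=3),
    "medium": timedelta(days=7),
    "low": timedelta(days=14),
}

def engagement_tier(score: float) -> str:
    """Bucket a subscriber by a 0-1 engagement score (thresholds are assumed)."""
    if score >= 0.6:
        return "high"
    if score >= 0.2:
        return "medium"
    return "low"

def next_send_interval(score: float) -> timedelta:
    """Time to wait before the next email for a given engagement score."""
    return CADENCES[engagement_tier(score)]

print(next_send_interval(0.4))  # a medium-engagement subscriber: 7 days, 0:00:00
```

In practice the score would come from your ESP's engagement data (opens and clicks over a recent window); the point is simply that cadence becomes a function of engagement rather than a single list-wide setting.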
Tristan Harris
Demand Generation Senior Marketing Manager, Thrive Digital Marketing Agency
Split-Test Reveals Twice-Weekly Engagement Boost
As an agency, we run split-tests with every client campaign, and the results over hundreds of campaigns spell it out beyond doubt: twice-weekly emails draw more engagement than lower or higher frequencies. The variance isn’t huge, but the results are consistent in about 75% of cases.
We typically start by running campaigns weekly to split-test and optimize copy and creatives. Then we test frequencies using 1, 2, and 3 emails per week as the single variable, with all other aspects of copy and creatives remaining consistent across the three campaigns.
- 1 per week gets the highest open rate (1pw: 29%, 2pw: 24%, 3pw: 20%).
- 2 per week gets the highest CTR and conversion rate (CTR/conversion: 1pw 10%/3%, 2pw 12%/5%, 3pw 8%/2%).
- 3 per week gets the highest unsubscribe rate (1pw: 0.5%, 2pw: 0.7%, 3pw: 1.5%).
Ben Hilton
Founder & Managing Director, Switch Jam Digital
Methodical Testing Balances Engagement and Retention
At our company, we use a methodical approach to test email-sending frequencies to optimize engagement without overwhelming our subscribers. Initially, we segment our audience based on their interaction levels—high, medium, and low. For each segment, we tailor the frequency of emails, starting with a baseline derived from industry standards and previous campaign data. We gradually adjust the frequency upwards or downwards in controlled intervals, closely monitoring metrics such as open rates, click-through rates, and unsubscribe rates. This method allows us to find a balance that maximizes engagement while maintaining a healthy subscription rate.
One successful test involved our client in the e-commerce sector, where we experimented with sending frequency for their promotional emails. We tested three different frequencies: once a week, twice a week, and three times a week. The segment that received emails twice a week showed the highest engagement levels without any significant increase in unsubscribe rates. Sales from emails also increased by 18% in this segment compared to the others. This test was instrumental in defining our ongoing email strategy for that client, proving that optimal frequency can significantly impact campaign effectiveness and sales outcomes.
Jason Hennessey
CEO, Hennessey Digital
Predictive Sending Increases Opens and Revenue
Probably the biggest lesson we’ve learned is understanding open tendencies, and I think ActiveCampaign really nails this with predictive sending. It allows us to send not only at an optimized time (e.g., 11 AM on a Tuesday) but also when each contact normally opens and reads emails. This has helped us increase opens by around 10-15% on many of our client sends, even for one of our accounts with 500k sends. It’s massively improved opens and revenue from email.
Ross Jenkins
CEO, DigitalME
Timing Nudges to Match Conversion Patterns
The timing metric we do our best to hit, whenever possible, is the time between initial contact and purchase decision. We know that moving services are something that our customers plan ahead for, and we want to time our final nudges just right so that customers keep us in mind around decision time. One of the simplest metrics we use for this is simply tracking the time from when customers opt into our emails to when they eventually convert. This has helped us develop a range of conversion times, and we send nudge emails at the beginning, middle, and end of this range in order to maximize our chances without bombarding anyone.
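That range-of-conversion-times idea can be sketched with standard-library quantiles. The opt-in-to-conversion durations and percentile choices below are made-up placeholders for illustration, not Bellhop's actual data:

```python
import statistics

# Hypothetical days from email opt-in to conversion for past customers.
conversion_days = [12, 18, 21, 25, 27, 30, 34, 41, 45, 60]

# Take the 10th and 90th percentiles as the edges of the conversion window,
# and the median as its middle.
cuts = statistics.quantiles(conversion_days, n=10)  # 9 cut points: 10th..90th pct
early, late = cuts[0], cuts[-1]
middle = statistics.median(conversion_days)

# Send a nudge at the beginning, middle, and end of the window.
nudge_days = sorted({round(early), round(middle), round(late)})
print(nudge_days)
```

With more data you might pick wider or narrower percentiles, but the mechanic is the same: derive the nudge schedule from the observed distribution instead of a fixed calendar.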
Nick Valentino
VP of Market Operations, Bellhop
A/B Testing Segments for Engagement Metrics
First and foremost, it’s crucial to have an efficient marketing tool that allows you to schedule the frequency of your emails and test their performance. By doing so, you can avoid being tagged as spam, which could significantly harm your sender reputation and deliverability rates!
A good email marketing platform should offer features such as A/B testing, analytics, and deliverability tracking. Additionally, you can track engagement metrics to fine-tune your strategies and achieve the best possible results.
For example, one successful test I conducted involved segmenting our email list based on user engagement levels. We divided our audience into three groups: highly engaged, moderately engaged, and less engaged subscribers. Each group received emails at a different frequency: the highly engaged group three times a week, the moderately engaged group twice a week, and the less engaged group once a week.
After a month, we analyzed the open rates, click-through rates, and unsubscribe rates for each segment. We found that the highly engaged group maintained steady engagement metrics, while the moderately and less engaged groups showed improved open and click-through rates without significant increases in unsubscribes.
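Analyzing a test like this comes down to computing the same three rates for each segment. A minimal sketch, with invented counts rather than the campaign's actual numbers:

```python
# Invented per-segment counts after a month of sending (illustrative only).
segments = {
    "high":   {"sent": 12000, "opened": 3600, "clicked": 1440, "unsubs": 60},
    "medium": {"sent": 8000,  "opened": 2400, "clicked": 640,  "unsubs": 40},
    "low":    {"sent": 4000,  "opened": 1000, "clicked": 200,  "unsubs": 20},
}

def rates(counts: dict) -> dict:
    """Open rate, click-through rate, and unsubscribe rate for one segment."""
    sent = counts["sent"]
    return {
        "open_rate": counts["opened"] / sent,
        "ctr": counts["clicked"] / sent,
        "unsub_rate": counts["unsubs"] / sent,
    }

report = {name: rates(c) for name, c in segments.items()}
for name, r in report.items():
    print(name, {metric: f"{value:.1%}" for metric, value in r.items()})
```

Any email platform's export gives you these raw counts; normalizing them per segment is what makes frequencies with different list sizes comparable.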
David Rubie-Todd
Co-Founder & Marketing Director, Glide
Continuous Testing Aligns with Customer Expectations
We were fortunate at Gigli to understand early on the importance of testing marketing techniques and other brand-related strategies. So when we started our email marketing campaigns, we were serious about testing what worked for our customers in particular and what didn’t. We found the right frequency for our audience early on through trial and error, sending emails at different frequencies for different (but similar) campaigns.
But we didn’t stop there.
We understand that customer needs change, especially as a business grows its audience and reputation. So we created milestones at which we run a series of tests to confirm that our email frequency still aligns with customer expectations, or whether we need to adjust it slightly.
Kam Talebi
CEO, Gigli
User Feedback Informs Email Frequency Strategy
We operate a creator job board email list. Instead of using the conventional method of sending more emails and seeing what works, we opted to ask for our users’ opinions.
Our standard procedure includes sending out regular emails, which incorporate a feedback survey on frequency and timeframe.
This survey, included in every email, offers recipients the option to receive more frequent emails containing job opportunities. By engaging our audience this way, we improve our click rates without pushing subscribers to remove themselves from our list.
Victor Hsi
Founder, UGC Creator