When you talk about a “long tail” strategy, you probably think it starts and ends with search engine marketing, but it should also be part of your email campaign plan.
If you don’t track results after your campaign has ended, you’re likely to under-report your success. With the pressure on email marketing to deliver results, this can be a huge oversight.
But it's easy to see why search dominates the concept of the long tail: the term entered the marketing vernacular in 2004 with Chris Anderson's Wired article "The Long Tail," later expanded into his book "The Long Tail: Why the Future of Business Is Selling Less of More," which discussed the profit potential of selling less popular products with longer shelf lives, rather than high-demand, high-cost items that encourage impulse buys.
Anderson applied the concept of the long tail to search marketing in 2006. He argued that less popular but more specific keywords are more effective than general or expensive, high-traffic keywords in both paid and organic search. They may generate fewer hits, but they're more likely to lead searchers to the right results faster, he said.
The long tail of email marketing also helps with attribution
Email campaigns have always had a long tail, but we didn’t typically discuss it that way.
The long tail of email occurs when someone acts on your email days, weeks, or even months after you send it, whether they opened it the day it arrived and then did something else, or left it unopened until the right moment came along.
This long tail is one of email's many benefits, but one that often goes uncounted because campaign reporting closes too early. That hurts your email program: you likely won't get full credit for the revenue, engagement, and value you contribute to your company. Without that credit, attribution suffers, and your budget can suffer with it.
Learn more: Email Marketing Strategy: A Guide for Marketers
Why people are overlooking the long tail of email
This all has to do with how you report your campaign results. Understaffed, busy email teams typically have time for only a quick look at the first reporting period's numbers before moving on to the next campaign.
Another problem: reporting periods are often set arbitrarily, based on how often you send emails, rather than on data that shows how long it took recipients to respond.
But conversions and revenue happen over a longer period of time, thanks to email's “nudge effect,” which can prompt customers to open an email campaign weeks after it arrives in their inbox.
Low-commitment actions like downloads and time-sensitive actions like signups tend to happen immediately. Spending money, purchases that require a longer consideration period, date-based reservations and the like may take far longer to convert.
The problem is that we don't track performance over time and often don't count activity that occurs after the reporting period ends, whether a week, a month, or even longer after campaign launch. That's why I say our campaigns could have performed much better than the official reports show.
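As a quick illustration, here's a minimal sketch in Python, using a hypothetical export of conversion records, of how you might measure what share of a campaign's conversions falls outside a fixed reporting window:

```python
from datetime import date

# Hypothetical export: one row per conversion, with the campaign's
# send date and the date the conversion was recorded.
conversions = [
    {"send_date": date(2024, 3, 8), "converted": date(2024, 3, 9)},
    {"send_date": date(2024, 3, 8), "converted": date(2024, 3, 30)},
    {"send_date": date(2024, 3, 8), "converted": date(2024, 4, 14)},
]

REPORTING_WINDOW_DAYS = 4  # e.g., a four-day reporting period

days_to_convert = [(row["converted"] - row["send_date"]).days for row in conversions]
inside = sum(1 for d in days_to_convert if d < REPORTING_WINDOW_DAYS)
outside = len(days_to_convert) - inside

print(f"{inside} conversions inside the window, {outside} in the long tail")
print(f"Long tail share: {outside / len(days_to_convert):.0%}")
```

Run against real data, the long-tail share tells you how much performance your current reporting window is leaving uncounted.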
New research from the Data & Marketing Association's Consumer Email Tracker 2023 highlights the need to revisit campaign performance continually. One finding: 19% of consumers save discount, offer, or sales emails to use at a later date. They're interested, but now isn't the right time to act.
What do you get from revisiting a campaign?
When was the last time you reviewed your results long after the reporting period had ended? You'll probably find email activity (opens, clicks, potential conversions) that, while not nearly as significant as the activity from the first reporting period, still contributes to your campaign's performance.
We found that users who saved the email for later had a higher intent, which resulted in a higher conversion rate than in the first reporting period.
As a result, your campaign metrics may be more positive than initially thought, especially when considering conversions.
Learn more: 7 key email metrics to track beyond opens and clicks
Case study: Finding the right reporting period to measure success
This case study from work I did for a client makes clear the need to include long-tail results in your reporting.
Here's what I discovered:
Our client sends campaigns twice a week and tracks each one over a four-day period. When we dug into the reports in Google Analytics, we discovered that one campaign's four-day window had dramatically understated its success: extended tracking showed 128% more revenue.
This was not an anomaly: we reviewed multiple campaigns and found similar results.
Using this data, they were able to define appropriate reporting periods, which resulted in increased email attribution and budget growth.
- Original tracking date range: March 8 to March 11
  - 114 transactions
  - 1,294 web visits
  - 9% conversion rate
  - Revenue: £8,326
- Expanded tracking date range: March 8 to May 31
  - 303 transactions
  - 2,317 web visits
  - 13% conversion rate
  - Revenue: £19,022.30
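Comparing the two windows shows how much the long tail added: transactions rose from 114 to 303 (up 166%), web visits from 1,294 to 2,317 (up 79%), and revenue from £8,326 to £19,022.30 (up roughly 128%), which is where the 128% figure above comes from.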
We were surprised by how many of the customers who visited the website from this email in April showed high intent to purchase. Although visits were lower than in the previous 23 days, the conversion rate was an impressive 37%.
5 steps to conducting long-tail research
It's easy to find out whether your reports are missing campaign activity, especially conversions and revenue. Follow these five steps to set up a useful tracking process:
1. Make sure your analytics software is set up correctly: Tag every email campaign and program so you can track the success of your emails.
2. Check the dashboard: In your Google Analytics dashboard, select 15-20 campaigns you've sent within the past year.
3. Find campaign activity: For each campaign, look at all activity to date, or until nothing more is recorded. Note how many weeks pass before activity stops. This gives you insight into your long-tail potential.
4. Expand your monthly reports: Don't just run the regular report for the period. Go back a month and pull the latest metrics for those campaigns as well. If you find long-tail activity, add a page with updated metrics for previous campaigns to the new report (see the sketch after this list for one way to pull those numbers).
5. Find the cutoff: Decide when to stop investigating activity. Your investment has to pay off over time, so be realistic. Long-tail research will help you find the right cutoff point.
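To illustrate steps 2-4, here's a minimal sketch in Python using the GA4 Data API via the google-analytics-data client library. The property ID, date ranges, and metric choices are placeholders; your own setup may use different dimensions or conversion events, so treat this as a starting point rather than a drop-in script:

```python
# A minimal sketch, assuming GA4 and the google-analytics-data client
# library. GA_PROPERTY_ID and the date ranges are placeholders.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange,
    Dimension,
    Metric,
    RunReportRequest,
)

GA_PROPERTY_ID = "123456789"  # replace with your GA4 property ID


def campaign_report(start_date: str, end_date: str):
    """Pull sessions and conversions per email campaign for one date range."""
    client = BetaAnalyticsDataClient()
    request = RunReportRequest(
        property=f"properties/{GA_PROPERTY_ID}",
        dimensions=[Dimension(name="sessionCampaignName")],
        metrics=[Metric(name="sessions"), Metric(name="conversions")],
        date_ranges=[DateRange(start_date=start_date, end_date=end_date)],
    )
    return client.run_report(request)


# Steps 2-4: compare the original reporting window with an expanded one
# to see how much activity the short window misses.
short_window = campaign_report("2024-03-08", "2024-03-11")
long_window = campaign_report("2024-03-08", "2024-05-31")

for row in long_window.rows:
    campaign = row.dimension_values[0].value
    sessions = row.metric_values[0].value
    conversions = row.metric_values[1].value
    print(f"{campaign}: {sessions} sessions, {conversions} conversions")
```

Running the same report with both date ranges, campaign by campaign, is what surfaces the long-tail gap between your official numbers and the full picture.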
A/B testing and its impact on long-tail conversions
A typical A/B testing scenario sends a control message to 10% of your list, a variation to another 10%, and the winning version to the remaining 80%. But if you factor in the long tail, the early winner might not be the version that ultimately converts best.
This standard A/B testing procedure is fine for tests that use opens and clicks as success metrics, but it doesn't work for conversions because it ignores conversions that occur long after the short two- to three-hour test period.
To account for this, do a 50-50 split, record your results, update them over time to capture long-tail conversions, and set your test period based on how long your campaigns stay active according to your analytics program. A sketch of this approach follows.
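Here's a minimal sketch, with hypothetical counts, of re-scoring a 50-50 split on conversions as the long tail accrues. A simple two-proportion z-test is one way to check whether the gap between variants is meaningful:

```python
# A minimal sketch of evaluating a 50-50 split on conversions rather than
# opens/clicks. The counts are hypothetical; in practice they would come
# from your analytics program after the long-tail window has closed.
from math import sqrt


def conversion_rate_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return p_a, p_b, z


# Re-run this as the long tail accrues: the early winner can flip.
p_a, p_b, z = conversion_rate_test(conv_a=118, n_a=5000, conv_b=96, n_b=5000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}")  # |z| > 1.96 ~ significant at 5%
```

Re-running the same test at the end of each extended window shows whether the early winner holds up once slower conversions arrive.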
Learn more: 7 common problems that prevent successful A/B/n email testing
Tail length varies by campaign
As we mentioned earlier, some campaigns, brands, or products may not have a long tail. Daily flash sales or campaigns with strict deadlines may generate opens and clicks, but not conversions. A season-opening cruise line campaign may have a much longer active period. But you won't know until you check.
My guess is that you'll find far more conversions occurring than you expected, conversions that aren't being counted. Collecting and reporting that data will give you a better picture of how your email program actually performs and how it contributes to your company's revenue, attribution, and budget.