Why email metrics mislead innovation teams
Jacob Dutton
25 Sept 2025

Email campaigns feel scientific. You get numbers: open rates, click-through rates, replies, unsubscribes. The data looks clean, the setup is straightforward, and the results arrive quickly.
Most teams use email campaigns to test whether people are interested in their concept: send a series of emails explaining your solution, measure engagement, and assume high engagement equals validated demand.
What's a Message Engagement Test?
You build an email list, create a sequence of messages that introduce your concept over time, and track how recipients engage. The theory is that people who consistently open, click, and reply are your potential customers.
Here's what message engagement actually measures
Message engagement tells you whether your messaging resonates with people who are already interested in your topic area. If you're building productivity software and message people about productivity challenges, high engagement means you can write compelling content about productivity.
This is useful information, but it's not the same as market validation.
Consider what actually drives email behavior. People open emails because subject lines interest them. They click links because headlines promise interesting content. They reply when something sparks a thought or question.
None of these actions require the person to have a problem your solution could solve. Someone might find your productivity content fascinating while being perfectly satisfied with their current productivity methods.
The engagement trap
Teams often optimise for message engagement metrics without connecting those metrics to business outcomes. A 40% open rate feels successful compared to industry averages, but what does it predict about conversion rates?
We worked with a team developing financial planning software that achieved impressive message engagement: 34% opens, 18% clicks, and 127 replies across their campaign. When they launched their beta, only 9% of engaged subscribers actually tried the product.
The issue wasn't that their solution was bad; it was that their message campaign had attracted people interested in financial content, not people actively seeking financial planning solutions.
When message engagement tests work well
Message engagement testing shines in specific contexts. If you already know your target audience has validated problems, message campaigns can help you understand messaging preferences, feature priorities, and communication timing.
They're also valuable for testing different positioning approaches with qualified audiences. Send different value propositions to similar segments and see which generates stronger engagement and follow-through.
Sequential campaigns can test customer education assumptions. What information do prospects need before they're ready to consider your solution? How much context is required before your value proposition makes sense?
What to measure beyond opens and clicks
Track actions that require more investment than casual clicking.
People who reply with specific questions about implementation show stronger intent than people who simply click through to read more.
Monitor unsubscribe patterns.
High unsubscribes after certain emails might indicate messaging problems, but they could also indicate you're successfully filtering out uncommitted subscribers.
Follow up on engagement with actual conversion opportunities.
Offer demos, trials, or consultations to email engagers and track conversion rates. This bridges the gap between email interest and real intent.
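As a minimal sketch of this idea, you could segment subscribers by engagement signal and compare conversion rates per segment. The `Subscriber` fields and the sample data below are hypothetical; the point is simply that replies and clicks should be scored as separate segments, not lumped into one "engaged" bucket.

```python
from dataclasses import dataclass

@dataclass
class Subscriber:
    opened: bool
    clicked: bool
    replied: bool
    converted: bool  # took a real conversion step: demo, trial, or consultation

def conversion_rate(subscribers, predicate):
    """Conversion rate among the segment selected by `predicate`."""
    segment = [s for s in subscribers if predicate(s)]
    if not segment:
        return 0.0
    return sum(s.converted for s in segment) / len(segment)

# Hypothetical subscriber list, purely for illustration.
subs = [
    Subscriber(opened=True, clicked=True, replied=True, converted=True),
    Subscriber(opened=True, clicked=True, replied=False, converted=False),
    Subscriber(opened=True, clicked=False, replied=False, converted=False),
    Subscriber(opened=False, clicked=False, replied=False, converted=False),
]

rate_clickers = conversion_rate(subs, lambda s: s.clicked)   # 0.5 in this sample
rate_repliers = conversion_rate(subs, lambda s: s.replied)   # 1.0 in this sample
```

If repliers convert at a much higher rate than clickers, that gap is the evidence that casual clicking was never the signal worth optimising.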
Three ways to design better message engagement tests
1. Focus on specific behaviors rather than broad topics.
Instead of "productivity tips," target "people who work weekends regularly" or "teams missing project deadlines." Specific behaviors indicate specific problems.
2. Test problem recognition before solution introduction.
Start by validating whether recipients recognise and care about the problems you're solving. High engagement with problem-focused content is more predictive than engagement with solution-focused content.
3. Create meaningful commitment steps.
Include opportunities for recipients to take actions beyond clicking: scheduling calls, completing detailed surveys, or requesting customised information. These actions require more investment and indicate stronger intent.
Try this approach
Design an email sequence that tests problem recognition before solution introduction.
Email 1: describe a specific challenge without mentioning your solution.
Email 2: explore the costs and consequences of that challenge.
Email 3: ask recipients to share their own experiences with the challenge.
Measure engagement with problem content separately from solution content. High engagement with problems combined with low engagement with solutions might indicate market timing issues rather than solution problems.
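One way to keep the two measurements separate is to tag every engagement event with the type of email that triggered it, then aggregate per type. This is a sketch, assuming you can export events as simple `(email_type, action)` pairs; the event names here are illustrative.

```python
from collections import defaultdict

# Hypothetical event log exported from your email tool: (email_type, action).
events = [
    ("problem", "open"), ("problem", "click"), ("problem", "reply"),
    ("problem", "open"), ("problem", "click"),
    ("solution", "open"), ("solution", "open"),
]

def engagement_by_type(events):
    """Count opens, clicks, and replies separately for each email type."""
    counts = defaultdict(lambda: defaultdict(int))
    for email_type, action in events:
        counts[email_type][action] += 1
    return counts

stats = engagement_by_type(events)
```

Comparing `stats["problem"]` against `stats["solution"]` makes the pattern in the text visible: strong problem engagement alongside weak solution engagement points at timing, not at the solution itself.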
The reality
Message engagement tests can provide validation insights under specific conditions. They're not standalone validation methods, but they can be valuable components of broader testing approaches.
Use them to test messaging and communication with audiences you've already qualified through other means. Don't use them as primary evidence of market demand.