The simplest way to test if your solution makes sense

Jacob Dutton

3 Jul 2025

The Sketch Mockup test is the fastest way to check whether your solution concept makes sense to customers before writing a single line of code.

Most teams jump straight from idea to wireframes to development. They spend weeks perfecting digital mockups and months building features, only to discover that customers don't understand the value proposition or get confused by the core workflow.

Sketch Mockup tests force you to think through the user experience when all you have is paper and a pen.

What's a Sketch Mockup test?

A hand-drawn interface sketched on paper, with someone manually simulating the software's reactions to customer interactions. You walk customers through your solution concept using nothing but sketches and role-play.

No pixels, no code, no fancy design tools. Just paper, markers, and the core question: does this actually solve their problem in a way they understand?

The magic happens when customers try to "use" your sketched interface, and you discover all the assumptions that seemed obvious in your head but make no sense to them.

How a major QSR chain avoided £3M in wasted system development

A leading quick-service restaurant chain we work with was developing an internal inventory management system for their 800+ locations. Store managers were struggling with food waste, stockouts, and manual ordering processes that took hours each week.

The IT team's solution focused on predictive analytics: using historical sales data and weather patterns to automatically generate optimal order quantities. They planned an 18-month development project to build sophisticated forecasting algorithms.

Before building the system, they decided to test the core concept with Sketch Mockup testing across 12 restaurants. They created hand-drawn screens showing the key workflows and visited stores to test with actual managers.

The setup: Paper sketches of the system interface spread across the manager's office desk. One team member acted as the "computer," manually updating screens and calculations as managers worked through their weekly ordering process.

What they expected: Managers would appreciate automated forecasting and trust the system's recommendations.

What actually happened:

Managers ignored the predictions immediately.

When shown forecasted order quantities, managers said "That's not right" within seconds and started manually adjusting numbers. They had intuitive knowledge about local events, seasonal patterns, and customer preferences that no algorithm captured.

The real bottleneck emerged.

As managers walked through their actual ordering workflow, it became clear that the problem wasn't calculating quantities; it was coordinating between multiple suppliers with different order deadlines, minimum quantities, and delivery schedules. Managers spent most of their time juggling logistics, not forecasting demand.

Trust was the critical factor.

Managers needed to see the reasoning behind recommendations. When the "computer" showed a suggested order without explanation, managers asked "Why?" and "What if this is wrong?" They wanted transparency, not automation.

The workflow reality surfaced.

Managers didn't order in isolation; they constantly checked with neighbouring locations about supplier issues, shared surplus inventory, and coordinated deliveries. The system needed to support collaboration, not replace human judgment.

Based on these insights, the team completely redesigned their approach:

  • Scrapped the predictive analytics engine (saving 12 months of algorithm development)

  • Built a coordination platform that simplified supplier management

  • Created transparency features showing the logic behind any suggestions

  • Added location-to-location communication and inventory sharing tools

  • Focused on workflow efficiency rather than automated decision-making

The repositioned system:

  • Launched 8 months ahead of the original timeline

  • Reduced average ordering time from 4 hours to 45 minutes per week

  • Decreased food waste by 23% across pilot locations

  • Achieved 89% adoption rate vs. typical 31% for new internal systems

The total cost of the Sketch Mockup testing: £2,400 for travel across 12 locations over 2 weeks. The value: avoiding £3M in development costs for a system managers would never have trusted or used properly.

How to run a Sketch Mockup test

The power of sketch mockups lies in their limitations. Because everything is manual and low-fidelity, customers focus on whether the concept makes sense rather than getting distracted by design details.

1. Sketch the essential flow

Draw only the screens customers need to complete their core task. Don't worry about visual design, just focus on information, buttons, and workflow. Stick figures and rough shapes are perfect.

2. Test with real scenarios

Don't ask customers to imagine using your app. Give them a real task: "You just had a business lunch that cost £47. Show me how you'd handle this expense." Watch them try to navigate your sketches.

3. Play the computer

One person facilitates while another acts as the "computer", manually changing screens, writing in text fields, and showing results. This shows where your logic breaks down or confuses users.

4. Focus on moments of confusion

When customers pause, ask "what?" or try to click somewhere unexpected, dig deeper. These moments reveal assumptions that need testing with higher-fidelity experiments.

Common mistakes that waste the opportunity:

  • Making sketches too detailed or polished

  • Testing features instead of complete user flows

  • Asking what customers think instead of watching what they do

  • Explaining how things work instead of letting customers figure it out

  • Focusing on visual feedback rather than workflow insights

Try this next week

Take your current product or service concept. Sketch the 3-5 key screens customers would see when solving their main problem. Find 2-3 potential customers and walk them through a real scenario using only your sketches. Don't explain; just observe where they get confused or excited.

You'll likely find that the workflow you think is obvious actually makes no sense to customers, and the features you think are important aren't what they care about.
