Field Notes

Practical observations. No filler.

Things learned from 15 years of CRO, A/B tests, and staring at bounce rates. Updated when something is worth saying — not on a publishing schedule.

01 Web Strategy (Originally shared on LinkedIn)

How to turn customer problems into website improvements

This is the workflow I use at Pipedrive. High-level by design — there are dozens of nuances in practice — but enough to give you the shape of it.

1. Gather problems from everywhere

Research team findings, sales call recordings, partner feedback, direct customer conversations, support tickets. The goal isn't to find new problems — it's to collect what already exists but lives in different places.

2. Group by journey stage

Bucket everything under three categories: Discovery (problems before people understand you), Onboarding (friction in getting started), Churn (reasons people leave or get stuck). This already starts pointing you toward which parts of the funnel need attention.

3. Map to pages and website elements

For each problem, ask: which page could influence this? Then: is it a copy problem, a visual problem, or a structural problem? "I didn't know you had X feature" points to the homepage or features page, probably copy. "Your product seems complex" points to homepage visuals. "I wasn't sure if this was for my team size" could be messaging or missing social proof. Some problems map to one page. Some touch multiple.
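The mapping in step 3 can live in a simple structured list rather than anyone's head. A minimal sketch in Python — the entries and field names below are hypothetical examples, not real research findings:

```python
# Each problem maps to the pages it touches and the kind of fix it suggests
# (copy, visual, or structural). All entries are hypothetical examples.
problem_map = [
    {"problem": "I didn't know you had X feature",
     "pages": ["homepage", "features"], "fix_type": "copy"},
    {"problem": "Your product seems complex",
     "pages": ["homepage"], "fix_type": "visual"},
    {"problem": "I wasn't sure if this was for my team size",
     "pages": ["homepage", "pricing"], "fix_type": "structural"},
]

# Group by page to see which parts of the site carry the most problems.
by_page = {}
for entry in problem_map:
    for page in entry["pages"]:
        by_page.setdefault(page, []).append(entry["problem"])

# Pages with the most mapped problems are where attention goes first.
print(sorted(by_page, key=lambda p: len(by_page[p]), reverse=True))
```

Even a spreadsheet with the same three columns does the job; the point is that one problem can touch several pages, and counting per page tells you where to focus.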

4. Reframe as How Might We questions

"Customers don't understand our pricing" becomes "How might we make pricing feel intuitive before they reach the pricing page?" "Users think we're only for large teams" becomes "How might we signal flexibility on the homepage?" When you focus on problems alone, it's easy to feel stuck. HMW questions make them solvable.

5. Generate ideas

This is where the ideation system you use matters. Feed it well-structured problems with full context. When the input is good, the output surprises you: ideas you wouldn't have reached on your own, at least not that fast.

Pretty cool to realise how many ways there are to use a website to influence and solve customer problems — once you've mapped the problems to pages first.

02 Surveys

The question you're afraid to ask is usually the right one

Most survey questions are designed to confirm what you already believe. The ones that produce useful data are the ones that make you slightly nervous to send.

"Did you find what you were looking for?" is a safe question. "What stopped you from signing up?" is not. The first tells you something went wrong. The second tells you what.

The pattern I see repeatedly: teams write survey questions that assume the product is fine and look for confirmation. They ask "How satisfied were you?" instead of "What's missing?" They ask "How easy was it to use?" instead of "What confused you?"

The uncomfortable question works because it gives the visitor permission to be honest. Most people won't volunteer criticism — but if you ask directly, they'll tell you exactly what you need to hear.

A good survey question should make you feel slightly exposed when you send it. That's usually a sign you're asking the thing that actually matters.

03 CRO

50 responses beats 50,000 pageviews for most SaaS decisions

A/B testing is a precision instrument. Most SaaS companies try to use it like a compass. The two are not interchangeable.

To detect a 5% lift in conversion with 95% confidence, a typical pricing page test needs somewhere between 5,000 and 15,000 visitors per variant. If your pricing page gets 800 visitors a month, that's a 10-month test — before you've accounted for seasonality, product changes, or the fact that the hypothesis will probably shift.
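The required sample size depends heavily on the baseline conversion rate, which is why the range above is so wide. A sketch of the standard two-proportion sample-size formula (normal approximation), assuming a 30% baseline — say, pricing page to signup click — and 80% power; both are illustrative assumptions, not figures from the original:

```python
from math import ceil, sqrt
from statistics import NormalDist

def visitors_per_variant(baseline, relative_lift, alpha=0.05, power=0.80):
    """Sample size per variant for a two-sided, two-proportion z-test."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2)))
    return ceil((num / (p2 - p1)) ** 2)

# 30% baseline, 5% relative lift (0.30 -> 0.315):
print(visitors_per_variant(0.30, 0.05))  # roughly 15,000 per variant
```

Drop the baseline to a few percent and the required traffic balloons into the hundreds of thousands per variant, which is the practical point: small relative lifts at low baseline rates are out of reach for low-traffic sites.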

50 honest survey responses will not give you statistical significance. But they will give you a pattern. And "7 out of the last 12 visitors who didn't convert said they didn't understand what the Starter plan included" is a more actionable finding than a 12-month A/B test that couldn't reach significance anyway.

  • A/B tests tell you which variant won. Surveys tell you why.
  • A/B tests require traffic you may not have. Surveys work with 20 responses.
  • A/B tests take months to conclude. Surveys give directional answers in days.
  • They're not competitors. But for early-stage or low-traffic sites, surveys come first.

The companies most in need of conversion improvements are usually too small to run A/B tests correctly. Surveys are how they get there anyway.
