A good customer interview asks about past behavior, not future intentions. Never ask whether someone would use your product. Ask about the last time they experienced the problem, what they did about it, how much it cost them, and what solutions they already tried. The goal is not to pitch your idea. It is to understand their reality so clearly that you know whether your solution fits it. Run 20 to 30 interviews before drawing any conclusions.
- Why most customer interviews fail
- The core principle: past behavior over future intention
- Who to interview and how to find them
- The interview structure that works
- Questions that destroy signal and what to ask instead
- How to interpret what you hear
- After the interview: turning conversations into decisions
Customer interviews are the cheapest way to avoid building something nobody wants. One honest 30-minute conversation with the right person can tell you something that six months of building cannot. The problem is not that founders skip customer interviews. Most founders know they should do them. The problem is that most founders run them badly, come away with polite encouragement instead of real signal, and then build anyway on the false confidence that creates.
This guide covers exactly how to run interviews that give you the truth, not the version of the truth people think you want to hear. Before you read further, it helps to understand where customer interviews sit in the broader picture of why startups succeed or fail. Our guide on why most businesses fail before they start gives you that full context.
Why Most Customer Interviews Fail
The failure mode is almost always the same. The founder sits down with a potential customer, explains their idea, and then asks some version of whether the person thinks it is a good idea, whether they would use it, and what features they would want. The person on the other end, not wanting to be rude, says it sounds interesting, that they would probably use it, and suggests a few features they would like to see.
The founder leaves feeling validated. They have data. People said yes. The problem is that none of what they heard is a business signal.
| What you hear | Examples | Feels like | Is actually |
|---|---|---|---|
| Compliments | "This is really cool." "What a great idea." "I wish something like this existed." | Validation | Politeness |
| Opinions | "I would probably use this." "I think people would pay for that." "That sounds useful." | Intent | Speculation |
| Wishlists | "It would be cool if it also did X." "You should add Y." "What would make this perfect is Z." | Product direction | Imagination |
The three interview mistakes that create this bad data are consistent across almost every founder who runs early-stage interviews poorly. They pitch instead of listening, turning the conversation into a sales call before they have anything to sell. They ask about hypothetical future behavior instead of real past behavior. And they talk to the wrong people, often friends and family who cannot give them honest negative feedback even when they try.
In a 2024 survey by product strategist Lenny Rachitsky, 78% of product professionals said they wished they had done more customer interviews in the early stages of their product. The most common regret: building the wrong thing for 6 to 12 months because they assumed instead of asked. That is an expensive lesson to learn when 20 conversations costing nothing could have taught it first.
The Core Principle: Past Behavior Over Future Intention
Everything in a good customer interview flows from one idea. People cannot reliably predict their own future behavior, but they can accurately describe their past behavior. When you ask someone whether they would use your product, you are asking them to predict the future. They will give you a socially comfortable answer, not a true one. When you ask them what they did the last time they faced the problem you are trying to solve, they can only tell you the truth because it already happened.
The critical test for whether a problem is real:
If the person has not tried to solve the problem themselves, it is not a real problem. It is a complaint. Real problems motivate action. People search for solutions, pay for imperfect alternatives, build workarounds, and complain loudly in the communities they belong to. A problem that produces none of those behaviors is not urgent enough to build a business around.
This single test eliminates more bad ideas than any other.
This principle also changes how you interpret silence. If you ask someone to describe the last time they experienced your target problem and they struggle to remember a specific instance, that is data. Important data. It tells you the problem is either infrequent, not painful enough to be memorable, or may not exist in the form you assumed. For more on how to use this kind of insight to shape your entire strategy before building, read our guide on problem-first vs product-first business strategy.
Who to Interview and How to Find Them
Talking to the wrong people produces worse results than not interviewing at all, because it gives you false confidence. Your interviews are only as good as the match between the people you talk to and the people who would actually need your product.
The right person to interview is someone who currently has the problem, has tried to solve it in some way, and is frustrated enough with existing solutions that they are still looking for something better. People who have struggled enough to look for solutions are your most informative source.
Where to find the right interview candidates
Start in the places where your target customers already complain about the problem unprompted: online communities, forums, and professional groups where the pain shows up in public. Cold outreach to strangers who match your customer profile beats interviewing friends and family, who cannot give you honest negative feedback even when they try. And use the referral question at the end of every interview so each conversation recruits the next one.
On how many interviews to run: aim for 20 to 30 quality conversations with strangers who match your target customer profile. After 15 to 20 interviews you will start hearing the same patterns. If you do not hear patterns after 20, your customer segment is too broad. Narrow it down and start again.
The sweet spot for each interview is 30 to 45 minutes. Shorter than 30 minutes rarely surfaces the real insight because people give their surface-level polished answer first and only get to the honest and useful version after they feel comfortable. Longer than 45 minutes risks fatigue and tangents that do not inform your core questions.
The Interview Structure That Works
The structure matters almost as much as the questions. A customer interview that jumps straight to questions feels like an interrogation. One that eases in, earns trust, and lets the conversation breathe consistently produces better and more honest information.
Before the interview
Write down your three riskiest assumptions before every session. Not your questions. Your assumptions. The things you are most likely to be wrong about that would most affect whether your business works. Then design your questions to surface evidence that either confirms or challenges each one. This prevents the most common trap in customer interviewing, which is unconsciously steering the conversation toward confirmation of what you already believe.
Prepare six to eight open-ended questions focused entirely on their past experience, not your idea. And commit to one rule you cannot break: you will not mention your product, your idea, or what you are building during a problem discovery interview. Not once. The moment you mention your solution, the conversation stops being research and starts being a sales call. You will get different answers.
Opening the conversation (first 5 minutes)
Start with context and permission. Tell them you are trying to understand how people currently deal with a specific type of problem, not pitch anything. Then ask a warm-up question about their role and their typical week. This serves two purposes: it builds rapport and it starts them talking in the descriptive mode you need them in rather than the evaluative mode they will default to if you open with your idea.
The core discovery questions (20 to 25 minutes)
These are the questions that do the work. Every one of them is anchored to past behavior, not future intention. Use them in roughly this order but follow the conversation where it leads rather than treating them as a rigid script.
Opening the problem
"Tell me about the last time you experienced [the problem]. What happened?"
Gets you a specific story with real context. Specific stories are far more informative than general descriptions.
Digging for real cost
"How much did you pay to solve it last time? What workarounds have you hacked together?"
Existing spend is your strongest demand signal. If they have never paid anything or built any workaround, the problem is not urgent enough.
Understanding existing solutions
"What did you try first to solve this? What was wrong with it?"
Maps the competitive landscape and tells you where existing solutions fall short, which is where your opportunity lives.
Measuring frequency and cost
"How often does this happen? What does it cost you in time or money when it does?"
Frequency times cost equals the total value of the problem. This is the foundation of your pricing ceiling.
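As a back-of-the-envelope sketch of that calculation (all numbers below are invented for illustration, not benchmarks):

```python
# Hypothetical example of the frequency-times-cost estimate.
# Replace these with the numbers your interviewee actually gives you.
occurrences_per_month = 8      # "How often does this happen?"
cost_per_occurrence = 50.0     # time lost converted to dollars, plus direct spend

monthly_problem_value = occurrences_per_month * cost_per_occurrence
annual_problem_value = monthly_problem_value * 12

print(monthly_problem_value)   # 400.0
print(annual_problem_value)    # 4800.0
```

The annual figure is a rough ceiling on what this customer could rationally pay per year for a complete solution, which is why the question matters so much for pricing.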
Understanding the ideal outcome
"If this problem were completely solved, what would that look like for you?"
Gets you the outcome they want, not the feature they think they want. These are very different and the distinction matters.
Closing (final 5 minutes)
End every interview with two questions. First: "Who else should I talk to about this?" If your interview was useful and the person is genuinely engaged with the problem, they will almost always give you a name or two. This compounds your recruiting effort for free. Second: "Is there anything you expected me to ask that I did not?" This consistently surfaces the most important insight of the entire conversation, the thing they were waiting to say that your questions never quite reached.
Questions That Destroy Signal and What to Ask Instead
The gap between a useful customer interview and a misleading one is almost entirely in the wording of individual questions. Here are the most common bad questions, why they fail, and what to ask instead.
| Bad question | Why it fails | Better question |
|---|---|---|
| "Would you use this product?" | Gets social compliance, not truth. People say yes to be polite. | "Tell me about the last time you faced this problem." |
| "Do you think this is a good idea?" | Gets politeness. Asks them to evaluate your work, not describe their experience. | "How are you currently handling this? Walk me through it." |
| "Would you pay $X for this?" | Hypothetical pricing is almost always lower than real willingness to pay. | "How much did you spend trying to solve this last time?" |
| "What features would you want?" | Gets wishlists, not needs. People request features they would never actually use. | "Walk me through exactly what happens when this problem occurs." |
| "Do you have this problem?" | Leads the witness and gets a yes-or-no answer with no useful context. | "Describe a recent frustrating experience with X. What happened?" |
| "Doesn't it bother you when X happens?" | Suggests the answer. People will agree with a loaded question almost universally. | "What is the most frustrating part of dealing with X right now?" |
The pattern across all the bad questions is the same: they tell the person what to think rather than asking them what they actually experience. Good questions create space for the person to describe their reality. Bad questions fill that space with your assumptions before they can.
The most important thing you can do in a customer interview is stay silent after asking a question. Let the silence sit for longer than is comfortable. Most of the best insight in a customer interview comes in the second or third sentence of someone's answer, after they have said the polished version and started giving you the honest one. Jumping in to fill silence kills that moment every time.
How to Interpret What You Hear
Running 20 interviews is only valuable if you can distinguish between signal and noise in what you collect. Not everything you hear means something. Not every enthusiastic response is a buying signal. And not every hesitation is a red flag.
The green flags you are looking for are specific and behavioral. Someone using unprompted emotional language about the problem, saying things like "this drives me insane" or "I waste so much time on this." Someone who has paid money, even small amounts, trying to solve the problem with existing tools. Someone who asks whether they can sign up or be notified when you launch, without being asked. Someone who immediately refers you to three other people who have the same problem.
The red flags are equally specific. Polite interest with no urgency and no follow-up questions from them about your timeline. No current attempt to solve the problem at all, suggesting they can live with it indefinitely. Asking what your product costs before asking what it does, which suggests they are looking for a cheap alternative to something they already use rather than a genuine solution to a painful problem.
If fewer than 30% of interview subjects show strong interest or engagement after 20 interviews, that is a signal to reassess your customer segment or your problem hypothesis before going further. The problem might be real but your target customer definition might be too broad. Narrow it until the people you are talking to are the ones who feel the pain most acutely.
After the Interview: Turning Conversations Into Decisions
The interview itself is only half the work. The other half is what you do with what you heard. Most founders take scattered notes during interviews, review them once, and then proceed based on a general feeling of whether things went well. That process loses most of the signal.
Write up your notes immediately after each interview while the conversation is still clear in your mind. Capture specific quotes, specific behaviors the person described, specific numbers they gave you, and specific red or green flags from the criteria above. Do this within an hour of each conversation.
After ten interviews, start looking for patterns across your notes. Not impressions. Specific repeating phrases, behaviors, and problems. The patterns that matter are the ones that appear across multiple conversations without you prompting them. After 15 to 20 interviews, you will start hearing the same things from different people. If you do not hear consistent patterns after 20 interviews, your customer segment is too broad. Narrow it down and start another round.
Decision criteria after 20 to 30 interviews
Proceed if you have 3 or more consistent patterns in how people describe the problem, at least 30% who have paid for something trying to solve it, and at least 2 people who asked when they can sign up.
Narrow and repeat if you have mixed signals, inconsistent problem descriptions, or the enthusiasm does not convert to any behavioral signal. Your customer definition is probably too broad.
Pivot the problem if fewer than 30% show strong interest, nobody has paid anything trying to solve it, and the problem does not come up as something urgent or memorable in people's day-to-day experience.
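The three criteria above can be sketched as a simple decision rule. This is an illustrative toy, not a substitute for judgment; the function name and inputs are invented for the example, and the thresholds mirror the rules of thumb in this guide:

```python
def interview_decision(pattern_count, paid_fraction,
                       signup_requests, strong_interest_fraction):
    """Map results from 20-30 interviews onto proceed / narrow / pivot.

    pattern_count: consistent patterns in how people describe the problem
    paid_fraction: share of interviewees who paid for something to solve it
    signup_requests: people who asked, unprompted, when they can sign up
    strong_interest_fraction: share showing strong interest or engagement
    """
    # Proceed: 3+ consistent patterns, 30%+ have paid, 2+ asked to sign up.
    if pattern_count >= 3 and paid_fraction >= 0.30 and signup_requests >= 2:
        return "proceed"
    # Pivot: under 30% strong interest and nobody has paid anything.
    if strong_interest_fraction < 0.30 and paid_fraction == 0:
        return "pivot the problem"
    # Everything else is mixed signal: narrow the segment and repeat.
    return "narrow and repeat"

print(interview_decision(4, 0.40, 3, 0.55))   # proceed
print(interview_decision(1, 0.00, 0, 0.10))   # pivot the problem
print(interview_decision(2, 0.20, 1, 0.40))   # narrow and repeat
```

Writing the thresholds down like this forces you to decide, before reviewing your notes, what evidence would count as a yes, which is the same discipline as listing your riskiest assumptions before each interview.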
Customer interviews do not stop once you have a product. They evolve. The questions change from "does this problem exist" to "are we solving it well enough." But the discipline of asking about past behavior rather than future intention, listening more than talking, and interpreting behavior signals over verbal signals stays constant throughout the life of any company that takes its customers seriously.
For the next steps after your interviews confirm there is a real problem worth solving, our guide on 10 questions every startup must answer before building anything gives you the full pre-build checklist. And for the financial side of validating whether the opportunity is large enough to sustain a business, see our guide on how much money you need to start a business.