The best product research often happens after the product is sold.

That sounds backward, but complex products reveal their truth in implementation. The customer has budget at stake. Real users are involved. Actual data appears. Legacy systems resist. Workflows stop being diagrams. The gap between the product's intended use and the customer's lived reality becomes impossible to ignore.

This is product discovery in disguise.

Not discovery as a workshop. Not discovery as a set of interviews. Discovery as the moment when the product is forced to earn its place inside real work.

Companies that learn from implementation get better. Companies that treat implementation as cleanup repeat the same mistakes with more confidence.

Implementation produces high-quality evidence

Pre-sale discovery has limits.

Customers speculate. Buyers simplify. Users describe the process they think they follow. Stakeholders understate edge cases. Everyone is influenced by what they want the purchase to mean.

Implementation is less forgiving.

The actual data export arrives. The system of record contains fields no one mentioned. The approval path is informal. The team has three versions of the same process. The "standard workflow" has regional exceptions. The executive sponsor wants one outcome, frontline users optimize for another, and the product's default assumptions suddenly look provincial.

This evidence is valuable because it is not abstract.

It shows where value gets delayed, where trust breaks, where configuration becomes consulting, where users need different language, and where the product has confused flexibility with burden.

The implementation team sees what customers can actually do, not what they said they would do.

The danger is treating field truth as anecdote

Product teams often discount implementation evidence because it arrives messily.

It comes through Slack threads, escalation calls, customer notes, custom work requests, migration problems, training questions, and frustrated delivery people. It is not packaged as clean research. It does not always have a defensible sample size. It is tangled with customer-specific context.

So the organization calls it anecdotal.

Sometimes it is. But "anecdotal" can become a lazy way to ignore the most expensive evidence the company has.

A single implementation problem may be customer-specific. The fifth version of the same problem is a product signal. A recurring migration pain may be more than services friction. Repeated training confusion may be more than documentation debt. Scope expansion may be more than customer misbehavior. Each may mean the product's conceptual model does not match how the market works.

Implementation evidence needs synthesis, not dismissal.

Discovery during implementation has a different question

Classic product discovery asks, "What should we build?"

Implementation discovery asks, "What must be true for what we built to create value?"

That is a different lens.

The answer may be a new feature. It may also be a better default, a narrower first use case, a migration tool, a readiness checklist, a partner capability, a clearer role definition, a simpler configuration surface, a stronger proof milestone, or a change in which customers the company should sell to.

Implementation discovery covers the whole value path, including the product artifact.

This is why it belongs in the implementation economy. The customer realizes value when the product becomes a trusted workflow with measurable outcomes. Anything that blocks that path is relevant evidence.

Product must stay close without taking over delivery

There is a bad version of this idea where product teams parachute into every implementation and create chaos.

That is not the goal.

Delivery teams need clear ownership. Customers need consistent implementation leadership. Product cannot turn every rollout into an open-ended research session.

But product does need structured exposure to implementation truth.

That can look like:

  • regular reviews of implementation failure patterns
  • tagging delivery issues by root cause
  • listening to selected kickoff, workflow, and go-live calls
  • shadowing difficult migrations
  • reviewing custom requests for repeatability
  • tracking which product assumptions create delivery labor
  • converting recurring implementation work into roadmap candidates
  • distinguishing customer-specific edge cases from market-wide gaps

The point is a feedback system, not random heroic involvement.

Product should not own every implementation. Product should own learning from the pattern.

Services teams need a language for product signals

Implementation teams often know the truth before anyone else. They can feel which parts of the product are fragile, confusing, or overpromised.

But feeling is not enough. The organization needs a language for converting delivery pain into product signal.

Useful categories include:

Readiness gap: the customer lacks the data, owner, workflow clarity, or capacity required for value.

Product gap: the product cannot handle a common requirement without custom work.

Packaging gap: the sold package does not match what the implementation actually requires.

Positioning gap: the customer expected an outcome the product does not produce in that form.

Configuration burden: flexibility has been pushed onto the customer or delivery team instead of being resolved through sensible defaults.

Trust gap: users cannot inspect, explain, or rely on the product enough to change behavior.

Workflow mismatch: the product supports a clean process that does not exist in the customer's environment.

These categories help avoid the useless debate between "the customer is difficult" and "the product is bad." Reality is usually more precise.
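The tagging idea can be sketched as a tiny classifier. Everything here is a hypothetical illustration, not an existing tool: the `Signal` tags mirror the gap categories above, and `signal_patterns` applies a crude distinct-customer threshold to separate pattern from anecdote.

```python
from dataclasses import dataclass
from enum import Enum

class Signal(Enum):
    # Hypothetical tags mirroring the gap categories above.
    READINESS = "readiness gap"
    PRODUCT = "product gap"
    PACKAGING = "packaging gap"
    POSITIONING = "positioning gap"
    CONFIGURATION = "configuration burden"
    TRUST = "trust gap"
    WORKFLOW = "workflow mismatch"

@dataclass
class DeliveryIssue:
    customer: str
    summary: str
    signal: Signal

def signal_patterns(issues, threshold=3):
    """Return signals reported by at least `threshold` distinct customers:
    a crude filter separating product pattern from customer anecdote."""
    customers_by_signal = {}
    for issue in issues:
        customers_by_signal.setdefault(issue.signal, set()).add(issue.customer)
    return {s: len(c) for s, c in customers_by_signal.items() if len(c) >= threshold}

# Illustrative issues from imaginary customers.
issues = [
    DeliveryIssue("acme", "export missing approval fields", Signal.PRODUCT),
    DeliveryIssue("globex", "no data owner assigned", Signal.READINESS),
    DeliveryIssue("initech", "export lacks audit columns", Signal.PRODUCT),
    DeliveryIssue("umbrella", "custom export columns requested", Signal.PRODUCT),
]
print(signal_patterns(issues))  # {<Signal.PRODUCT: 'product gap'>: 3}
```

Counting distinct customers rather than raw tickets is the point of the sketch: one customer filing the same complaint five times is still an anecdote, while three customers hitting the same gap once each is a signal.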

Implementation can protect the roadmap from fantasy

Roadmaps are vulnerable to abstraction. Strategy decks can make markets look coherent. Customer interviews can make needs look stable. Sales calls can make every request sound urgent.

Implementation adds friction to the story.

It asks: can this be deployed? Who will use it? What data does it require? What workflow does it replace? What happens when the recommendation is wrong? Who owns the exception? How long before the customer sees value? How much services work is hidden inside the feature?

These questions make roadmaps less theatrical.

They also reveal productization opportunities. If the implementation team repeats the same setup, cleanup, explanation, training, or validation work across customers, that is not simply "services." It may be the raw material of product advantage.

But the company has to be honest. Not all services work should become product. Some work is genuinely customer-specific. Some should be handled by partners. Some should be priced as expert services. Some should be eliminated by saying no to the wrong customer.

Implementation discovery helps make those choices.

Practical implications

Create a recurring implementation evidence review with product, services, sales, and customer success. Keep it focused on patterns, not anecdotes.

Track implementation friction by type. Do not let every problem become a generic "customer issue." Name the root cause.

Protect delivery teams from becoming unstructured research assistants. Their job is to realize customer value. The learning system should reduce their burden over time.

Require roadmap proposals for complex products to answer implementation questions: what must be configured, migrated, trained, trusted, and measured before value appears?

Finally, reward the teams that surface uncomfortable truth. Implementation evidence is often politically inconvenient because it exposes overpromising and product gaps. That is why it matters.

Implementation is product discovery with consequences.

Treat each implementation surprise as a classified signal: readiness gap, product gap, packaging gap, positioning gap, configuration burden, trust gap, or workflow mismatch. The category decides who has to act.

The customer paid for the product. Now the product has to prove what it really is.


This is part 3 of 10 in The Implementation Economy.