
Product Management Concept Questions

This section covers common conceptual questions in PM interviews with structured response frameworks.

Product Sense

Q1: How do you prioritize features?

Frameworks:

| Framework | Formula | Use Case |
| --- | --- | --- |
| RICE | (Reach x Impact x Confidence) / Effort | Quantitative comparison |
| ICE | Impact x Confidence x Ease | Simpler scoring |
| Value vs Effort | 2x2 matrix | Visual prioritization |
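As an illustration, the two scoring formulas above can be sketched as simple functions. The feature names and score values below are hypothetical:

```python
def rice_score(reach, impact, confidence, effort):
    """RICE: (Reach x Impact x Confidence) / Effort."""
    return (reach * impact * confidence) / effort

def ice_score(impact, confidence, ease):
    """ICE: Impact x Confidence x Ease."""
    return impact * confidence * ease

# Hypothetical features: reach (users/quarter), impact (0.25-3),
# confidence (0-1), effort (person-months).
features = {
    "bulk export": rice_score(reach=5000, impact=1, confidence=0.8, effort=2),
    "dark mode":   rice_score(reach=20000, impact=0.5, confidence=0.9, effort=1),
}
for name, score in sorted(features.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.0f}")
```

The absolute numbers matter less than the ranking they produce; scoring forces explicit assumptions that can then be debated.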

Value vs Effort quadrants:

| Quadrant | Characteristics | Action |
| --- | --- | --- |
| Quick wins | High value, low effort | Do first |
| Big bets | High value, high effort | Plan carefully |
| Fill-ins | Low value, low effort | If time permits |
| Money pits | Low value, high effort | Do not do |
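The 2x2 above amounts to a simple classification rule. A minimal sketch, assuming value and effort are scored on a 1-10 scale with 5 as the cutoff (both assumptions, not part of the framework itself):

```python
def quadrant(value, effort, threshold=5):
    """Classify a feature on a value-vs-effort 2x2 (1-10 scores assumed)."""
    high_value = value >= threshold
    high_effort = effort >= threshold
    if high_value and not high_effort:
        return "Quick win: do first"
    if high_value and high_effort:
        return "Big bet: plan carefully"
    if not high_value and not high_effort:
        return "Fill-in: if time permits"
    return "Money pit: do not do"

print(quadrant(value=8, effort=2))  # Quick win: do first
```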

Anchor prioritization to: business goals, user needs, strategic fit.

Q2: How do you decide what to build?

Process:

| Step | Action |
| --- | --- |
| 1 | Understand the goal: What business outcome are we seeking? |
| 2 | Identify user problems: What pain points exist? |
| 3 | Generate solutions: Multiple approaches to solve the problem |
| 4 | Evaluate: Impact, effort, risk, strategic fit |
| 5 | Validate: Test assumptions before building |

Starting point: Always begin with problem identification, not solution.

Q3: What makes a good product?

Criteria:

| Criterion | Definition |
| --- | --- |
| Solves a real problem | Addresses a genuine user need |
| Provides value | Users would miss it if removed |
| Is usable | Users can accomplish their goals |
| Is delightful | Creates a positive experience beyond the functional |
| Is sustainable | The business model works |

Q4: How do you know if a feature is successful?

Success criteria categories (defined before launch):

| Category | Metrics |
| --- | --- |
| Adoption | % of users who try the feature |
| Engagement | Usage frequency and depth |
| Retention | Continued usage over time |
| Business | Revenue, conversion impact |
| Quality | Sentiment, bug rate |

Example (new search feature):

  • Adoption: 70% of users try within first week
  • Engagement: 3+ searches per session
  • Retention: Return rate increases 5%
  • Quality: No increase in support tickets
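The example targets above can be encoded as a launch scorecard checked against observed data. A sketch with hypothetical metric names and numbers:

```python
# Hypothetical targets mirroring the search-feature example:
# each metric has a target and a direction ("min" = at least, "max" = at most).
targets = {
    "adoption_rate": (0.70, "min"),        # 70% try within first week
    "searches_per_session": (3.0, "min"),  # 3+ searches per session
    "retention_lift": (0.05, "min"),       # return rate up 5 points
    "support_ticket_delta": (0.0, "max"),  # no increase in tickets
}

def evaluate(observed):
    """Return pass/fail per metric against the predefined targets."""
    results = {}
    for metric, (target, direction) in targets.items():
        value = observed[metric]
        results[metric] = value >= target if direction == "min" else value <= target
    return results

observed = {"adoption_rate": 0.74, "searches_per_session": 3.2,
            "retention_lift": 0.04, "support_ticket_delta": -0.01}
print(evaluate(observed))
```

Defining the scorecard before launch, as the section notes, is the point: it prevents retrofitting success criteria to whatever the data shows.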

Q5: How do you handle conflicting stakeholder priorities?

Resolution process:

| Step | Action |
| --- | --- |
| 1 | Understand underlying needs (why do they want this?) |
| 2 | Find common ground (what goals do they share?) |
| 3 | Use data to inform the decision |
| 4 | Align on criteria before deciding |
| 5 | Escalate when consensus is not possible |

Key principle: Separate positions from interests. Different feature requests may serve the same underlying goal.

Metrics & Analytics

Q6: How would you measure success for [YouTube/Netflix/Spotify]?

Framework: North Star + Supporting Metrics + Guardrails

YouTube example:

| Category | Metrics |
| --- | --- |
| North Star | Total watch time |
| Supporting | DAU/MAU, watch time per user, videos watched, creator uploads |
| Guardrails | User satisfaction, content quality, creator monetization |

Rationale: Watch time captures value better than views (duration matters) and is a leading indicator of ad revenue.

Q7: A key metric dropped 10%. How do you investigate?

Structured debugging:

| Step | Action |
| --- | --- |
| 1. Verify | Is the data correct? Instrumentation issues? |
| 2. When | When did it start? Correlate with releases/events |
| 3. Who | Which segments are affected? |
| 4. What | What related metrics also changed? |
| 5. Why | Form and test hypotheses |

Common causes:

  • Bug or deployment
  • External event (holiday, competitor, news)
  • Seasonality
  • Actual user behavior change

Q8: What is the difference between leading and lagging indicators?

| Type | Definition | Examples | Characteristics |
| --- | --- | --- | --- |
| Lagging | Outcome metrics | Revenue, churn, NPS | Definitive but slow |
| Leading | Predictive metrics | Engagement, activation, feature usage | Faster iteration |

Example (subscription service):

  • Lagging: Monthly revenue, annual retention
  • Leading: Feature adoption, session frequency, content consumption

Q9: How do you set OKRs?

Components:

| Component | Characteristics |
| --- | --- |
| Objectives | Qualitative, inspiring, time-bound |
| Key Results | Quantitative, measurable, stretching (70% achievement = success) |

OKR quality comparison:

| Quality | Objective | Key Result |
| --- | --- | --- |
| Poor | Launch new feature | Ship by Q2 |
| Good | Become the go-to tool for team collaboration | Increase weekly active teams from 10K to 25K |

Good OKRs connect to company strategy and focus on outcomes, not outputs.

Q10: When would you not run an A/B test?

| Scenario | Rationale |
| --- | --- |
| Obvious improvement | Bug fix, legal requirement |
| Insufficient traffic | Cannot reach significance in a reasonable time |
| Irreversible change | Cannot show different experiences |
| Strategic decisions | Long-term bets not measurable short-term |
| Qualitative changes | Brand, aesthetic (requires different methods) |

A/B tests are for optimization. Innovation often requires other research methods.

Strategy

Q11: How do you approach entering a new market?

Framework:

| Step | Questions |
| --- | --- |
| Why this market? | Size, growth, fit with core business |
| Who are we serving? | Target segment, their needs |
| What is the landscape? | Competition, substitutes |
| How do we win? | Differentiation, unfair advantages |
| What is the path? | Go-to-market, milestones |

Key questions: What is our right to win? What do we uniquely bring?

Q12: How do you think about competitive moats?

| Moat Type | Description | Example |
| --- | --- | --- |
| Network effects | More users = more value | Social networks |
| Switching costs | Painful to leave | Enterprise software |
| Scale economies | Bigger = cheaper | AWS, manufacturing |
| Brand | Trust and recognition | Apple |
| Data | Proprietary data improves the product | Google Search |
| Regulation | Licenses, compliance | Banking |

Evaluation criterion: How defensible? Can competitors copy?

Q13: How do you identify product-market fit?

Positive signals:

| Signal | Description |
| --- | --- |
| Retention | Users keep coming back |
| Word of mouth | Users tell others (NPS > 40) |
| Organic growth | Growth without paid marketing |
| Clear value prop | Users can articulate why they use it |
| Sean Ellis test | >40% would be "very disappointed" if the product went away |
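The Sean Ellis test reduces to a single percentage over survey responses. A minimal sketch with a hypothetical survey:

```python
def sean_ellis_score(responses):
    """Share of surveyed users who would be 'very disappointed'
    if the product went away; > 40% is the common PMF bar."""
    very = sum(1 for r in responses if r == "very disappointed")
    return very / len(responses)

# Hypothetical survey of 100 users.
survey = (["very disappointed"] * 46
          + ["somewhat disappointed"] * 34
          + ["not disappointed"] * 20)
score = sean_ellis_score(survey)
print(f"{score:.0%} -> {'PMF signal' if score > 0.40 else 'keep iterating'}")
```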

Warning signs:

  • Relying entirely on paid acquisition
  • High churn
  • Users cannot explain value
  • No passionate users

Q14: Build vs buy vs partner?

| Option | When to Use |
| --- | --- |
| Build | Core competency, competitive advantage, have expertise, long-term investment |
| Buy | Not core, exists in market, faster time-to-market, lower ongoing cost |
| Partner | Strategic opportunity, access to users/data, reduce risk, test before committing |

Evaluation questions:

  • Is this core to our differentiation?
  • Do we have the expertise?
  • What is the opportunity cost?
  • How does it evolve over time?

Q15: How do you think about pricing?

Pricing lenses:

| Lens | Question |
| --- | --- |
| Value-based | What is it worth to customers? |
| Cost-based | What does it cost us? |
| Competition-based | What do alternatives cost? |

Pricing models:

| Model | Characteristics |
| --- | --- |
| Per user/seat | Predictable, scales with value |
| Usage-based | Aligns with value, harder to predict |
| Flat rate | Simple, may leave money on the table |
| Freemium | Acquisition tool, conversion challenge |

Execution

Q16: How do you write a good PRD?

PRD sections:

| Section | Content |
| --- | --- |
| Problem statement | What are we solving and for whom? |
| Goals | Success metrics, definition of done |
| Non-goals | What we are explicitly NOT doing |
| User stories | As a [user], I want [goal], so that [benefit] |
| Requirements | Functional and non-functional |
| Open questions | What we still need to figure out |

Characteristics of good PRDs: Focus on why (not just what), living documents, clear scope, readable by all stakeholders.

Q17: How do you manage technical debt?

Definition: Shortcuts taken to ship faster that create future work.

Management strategies:

| Strategy | Description |
| --- | --- |
| Track | Maintain a backlog of tech debt items |
| Quantify | Measure impact on velocity, bugs, outages |
| Allocate | Reserve ~20% of each sprint for debt |
| Prioritize | Address debt blocking key features |
| Prevent | Quality standards, code review |

PM role: Balance feature velocity with platform health. Make trade-offs explicit.

Q18: How do you work with engineering?

Effective partnership characteristics:

| Characteristic | Description |
| --- | --- |
| Shared context | Engineers understand the why |
| Early involvement | Engineers shape solutions, not just implement them |
| Clear prioritization | No ambiguity about what matters |
| Flexibility | The how is engineering's domain |
| Trust | No micromanagement |

Anti-patterns:

  • Throwing specs over the wall
  • Changing requirements mid-sprint
  • Ignoring engineering estimates
  • Not involving eng in discovery

Q19: How do you handle a feature request from the CEO?

Process:

| Step | Action |
| --- | --- |
| 1. Listen | Understand the underlying need, not just the request |
| 2. Contextualize | How does this fit with current priorities? |
| 3. Evaluate | Impact, effort, strategic fit |
| 4. Communicate trade-offs | If we do this, what don't we do? |
| 5. Recommend | Have a point of view, defer appropriately |

Approach: Neither automatic agreement nor automatic pushback. Be curious about the why and transparent about implications.

Q20: Describe a product you would kill. Why?

Criteria for killing features/products:

| Criterion | Description |
| --- | --- |
| Low usage | Few users, declining trend |
| High cost | Maintenance burden, opportunity cost |
| Strategic misfit | Does not align with direction |
| Better alternatives | Users can switch easily |

Process:

  1. Validate with data
  2. Understand remaining users
  3. Provide migration path
  4. Communicate clearly
  5. Learn from it