
Cross-Functional Reviews

A cross-functional review is the structured walkthrough of a product spec with the non-product functions whose work it touches: engineering (feasibility), design (usability), legal (compliance), finance (business viability), sales (deal impact), customer support (ticket impact), and ops (delivery impact). Marty Cagan's 'four big risks' framework names this explicitly: the viability risk (does it work for our business?) is the one product teams systematically skip until launch. Cross-functional reviews are the cheapest way to catch the show-stoppers BEFORE engineering builds. The pattern: circulate the spec 48 hours ahead, hold a 30-minute review with each function, log every concern as a decision-needed item. The goal isn't consensus; it's surfacing the constraints that would otherwise show up at launch.

Also known as: XFN Reviews, Spec Review, Pre-Build Review, Stakeholder Walkthrough, Viability Check

The Trap

The trap is the rubber-stamp review: the meeting gets scheduled, no one reads the spec, everyone nods, and the launch surfaces a SOC 2 issue legal could have caught in 5 minutes. The opposite trap is design-by-committee: every function gets veto power, the spec accretes everyone's edge cases, and the build time doubles. The deepest trap is treating the review as a one-time event. Specs evolve during the build; constraints surface late; without a re-review at the prototype stage, the team ships something legal hasn't actually seen.

What to Do

Run two reviews per major project: (1) Spec review, before commit: circulate 48 hours ahead, hold async-friendly 30-min walkthroughs with each function, log every concern with an owner and a resolution path. (2) Pre-launch review, before ship: walk the working prototype past the same functions; verify nothing changed materially since the spec. Use a standing review checklist: legal/privacy, finance/pricing, sales enablement, support readiness, ops capacity. If a function's representative can't attend twice, escalate: that function will block the launch later if it doesn't bless it now.
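The concern log at the heart of both reviews can be sketched as a small data structure. This is an illustrative sketch, not a prescribed tool: the field names, statuses, and the rule that only legal/compliance concerns block commit are assumptions drawn from the pattern above.

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    OPEN = "open"
    ADDRESSED = "addressed"
    DEFERRED = "deferred"
    ACCEPTED = "accepted"   # PM explicitly accepts the risk

@dataclass
class Concern:
    function: str           # e.g. "legal", "finance", "support"
    summary: str
    owner: str              # who drives the resolution
    blocking: bool = False  # per the article, only legal/compliance truly block
    status: Status = Status.OPEN

@dataclass
class ReviewLog:
    spec: str
    concerns: list[Concern] = field(default_factory=list)

    def log(self, concern: Concern) -> None:
        self.concerns.append(concern)

    def unresolved_blockers(self) -> list[Concern]:
        return [c for c in self.concerns if c.blocking and c.status is Status.OPEN]

    def ready_to_commit(self) -> bool:
        # Commit only once every blocking concern has a resolution path.
        return not self.unresolved_blockers()
```

The point of the sketch: concerns are logged with an owner and a status, the PM decides how each one resolves, and only open blocking items gate the commit.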

In Practice

Marty Cagan's SVPG writing on the 'four big risks' (value, usability, feasibility, viability) explicitly calls out that product teams reliably test value and usability, sometimes test feasibility, and almost never test viability until after launch. He recommends walking every prototype past finance, legal, sales ops, and customer support before commit. Cagan's coaching engagements regularly find that a single 30-minute legal review surfaces a compliance issue that would have killed the feature post-launch, at 1/100th the cost of finding it after code ships. (Source: SVPG Four Big Risks)

Pro Tips

  • 01

    Make the review an async-first process. Send the spec 48 hours ahead, ask each function to comment in writing, then hold a 30-minute live meeting only for the items that need real-time discussion. This respects everyone's time AND produces a written record for the decision log.

  • 02

    Sales is the most over-weighted voice in cross-functional reviews. Their input on prospect demand is golden; their input on what to BUILD is the source of most feature factories. Listen for the problem they're hearing, not the solution they're proposing.

  • 03

Customer support is the most under-weighted voice. They live with every shipped decision. If support says 'this will double our ticket volume,' believe them; they have the receipts.

Myth vs Reality

Myth

“Cross-functional reviews slow down shipping”

Reality

They slow down BAD shipping. The 30 minutes spent in legal review is orders of magnitude cheaper than the 6-week post-launch scramble to fix a compliance issue. Teams skipping XFN reviews aren't shipping faster; they're shipping more rework.

Myth

“If everyone has veto power, nothing ships”

Reality

XFN reviews aren't about veto power; they're about surfacing constraints. The PM remains the decision-maker. Functions raise concerns; the PM decides which to address, which to defer, which to accept. Veto power belongs only to legal/compliance for genuinely blocking issues.

Try it

Run the numbers.

Pressure-test the concept against your own knowledge: answer the challenge or try the live scenario.

🧪

Scenario Challenge

You're 3 weeks into building a new auto-renewal billing feature. A finance teammate stops by your desk and asks 'wait, are you handling failed-payment retries the same way our existing system does, or differently?' You don't know โ€” finance wasn't in the spec review.

Industry benchmarks

Is your number good?

Calibrate against real-world tiers. Use these ranges as targets, not absolutes.

Cross-Functional Review Coverage

Mid-stage and growth product orgs

Strong: Every major spec reviewed by 4+ functions pre-build

Healthy: 2-3 functions per spec

Light: Eng + design only

None: PM commits unilaterally

Source: SVPG / Cagan four big risks practice
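One way to run the numbers on your own coverage: a tiny function that maps who actually attends spec review to the tiers above. The thresholds mirror the article's ranges; the exact ordering of checks (eng + design alone counts as Light regardless of count) is an assumed reading of the table, and the function names are placeholders.

```python
def coverage_tier(functions: set[str]) -> str:
    """Map pre-build review attendance to the benchmark tiers."""
    core = {"engineering", "design"}
    if not functions:
        return "None"       # PM commits unilaterally
    if functions <= core:
        return "Light"      # eng + design only
    if len(functions) >= 4:
        return "Strong"     # 4+ functions per major spec
    return "Healthy"        # 2-3 functions per spec

print(coverage_tier({"engineering", "design", "legal", "finance"}))  # Strong
```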

Real-world cases

Companies that lived this.

Verified narratives with the numbers that prove (or break) the concept.

🧭

Marty Cagan / SVPG: Four Big Risks

2008-present

success

Marty Cagan has codified the 'four big risks' framework across two editions of Inspired and the follow-up Empowered. His coaching data across hundreds of product teams shows a consistent pattern: teams over-invest in value and usability discovery, under-invest in feasibility discovery, and almost entirely skip viability discovery (the 'will this work for our business?' question). Cross-functional reviews are the operational practice that closes this gap. SVPG-coached teams typically install a standing 30-minute review with finance, legal, sales ops, and support BEFORE engineering commits.

Risks Most Under-Tested

Viability + feasibility

Recommended Review Cadence

Per major spec

Typical Issues Surfaced

60-80% pre-build vs. post-launch

Cost Multiplier

100x cheaper pre-build

The cheapest place to surface a launch-blocking issue is in a 30-minute meeting before code is written. Cross-functional reviews aren't process for process's sake; they're risk insurance with a 100x ROI.
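The 100x figure is easy to sanity-check with back-of-envelope arithmetic. The attendee and team sizes below are assumptions for illustration, not numbers from SVPG.

```python
# Assumed figures for illustration only.
review_person_hours = 0.5 * 5      # one 30-minute review, 5 attendees
rework_person_hours = 6 * 40 * 3   # 6-week post-launch scramble, 3 engineers full-time
ratio = rework_person_hours / review_person_hours
print(f"rework is ~{ratio:.0f}x the cost of the review")
```

Even with conservative assumptions, the ratio lands in the same order of magnitude as the cited 100x.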

🛡️

Hypothetical: B2B Fintech Compliance Save

Hypothetical

success

Hypothetical: A fintech PM circulates a spec for a new instant-transfer feature. The spec review brings in legal, compliance, treasury, and customer support. Compliance flags that the feature would need updated KYC checks for cross-state transfers (not in scope). Treasury flags the float exposure. Support flags an expected support-volume spike. The team re-scopes to in-state-only for v1, adds a treasury cap, and pre-staffs support. The launch ships clean. Without the review, the team estimates they'd have shipped, then spent 6-8 weeks retrofitting compliance checks while carrying penalty risk.

Functions Reviewed

Legal, compliance, treasury, support

Issues Surfaced Pre-Build

3 material

Estimated Rework Avoided

6-8 engineering weeks

Status

Hypothetical illustration

In regulated industries, the cross-functional review isn't optional process; it's the difference between a clean launch and a quarter of compliance-driven rework. Compliance issues surfaced early are scope decisions; surfaced late, they're crises.

Decision scenario

The Skipped Legal Review

You're shipping a new user-data export feature that lets customers download their account data. Legal asked to review it; engineering is 3 weeks into the build and 2 weeks from ship. Your eng lead says 'we don't need to slow down; it's just a download button.'

Build Progress

60% complete

Time to Ship

2 weeks

Legal Review Status

Requested but not held

Data Types Exposed

User profile + activity log

01

Decision 1

You can either ship on schedule and hold legal review post-launch, or pause to run the review now.

Option 1: Ship on schedule; legal can review after launch and we'll iterate if needed.
You ship. Two weeks later legal reviews and flags that the export includes IP addresses and device IDs of users WHO INTERACTED WITH the account holder: third-party data exposure under GDPR. The feature is pulled, a public apology is issued, and a regulatory inquiry is opened. The team spends 8 weeks retrofitting consent and data-minimization logic. The original 'speed' cost a quarter.
Engineering Rework: +8 weeks · Customer Trust: Damaged · Regulatory Risk: Material

Option 2: Pause the build for 48 hours, run a focused legal + privacy review, log decisions, then resume.
The review surfaces the third-party data exposure issue and a missing consent flow. Engineering implements data minimization (1 week of work, on top of the original timeline) and you ship 1 week late but with no regulatory exposure. Legal becomes a willing partner on the next 3 specs. The 1-week slip prevented an 8-week scramble.
Slip: +1 week · Regulatory Risk: Eliminated · Legal Relationship: Strengthened


Beyond the concept

Turn Cross-Functional Reviews into a live operating decision.

Use this concept as the framing layer, then move into a diagnostic if it maps directly to a current bottleneck.

Typical response time: 24h · No retainer required
