The 6 functions every product team should measure
Most teams measure engineering output. Here are the six functions that determine whether your product team is actually effective.
Ask a VP Product how their team is performing and you will get one of two answers.
The first is a dashboard: velocity, story points, DORA metrics, maybe a sprint burndown. These are engineering output metrics. They are useful and they are incomplete.
The second is a feeling: "I think we are doing okay." This is honest and it is unmeasurable.
Neither answer satisfies a board that wants to know whether the product org is both efficient and effective. Neither helps a new VP Product figure out where to focus in their first 90 days.
The gap is not the data. The gap is the framework.
Six functions, not one
A product team operates across six distinct functions. Engineering is one. Here are all six, and what measurement looks like for each.
1. Strategy
What it covers: Roadmap alignment, decision quality, prioritization rigor, strategic clarity.
What good looks like: The team can explain why they are building what they are building. Roadmap ties to company objectives. Prioritization is evidence-based, not opinion-based. Strategic bets are documented and revisited.
What bad looks like: The roadmap is a feature list ordered by whoever spoke loudest. Nobody can explain the "why" behind the current sprint. Strategy lives in the CEO's head and nowhere else.
How most teams measure it: They do not.
2. Design
What it covers: Discovery process, design maturity, user research integration, design system health.
What good looks like: Design is involved before engineering starts building. Discovery happens before delivery. There is a design system that is used, not just documented. User research informs decisions rather than validating them after the fact.
What bad looks like: Design is a pixel factory. Engineers build first, designers make it pretty. There is no design system (or one that nobody follows). User research is a formality.
How most teams measure it: They do not.
3. Development
What it covers: Engineering practices, technical health, delivery cadence, code quality.
What good looks like: DORA metrics are healthy. Technical debt is managed, not ignored. CI/CD is fast and reliable. Code review is thorough and timely.
What bad looks like: Deployments are scary. Technical debt is growing faster than features. Build times are measured in coffee breaks.
How most teams measure it: DORA, velocity, story points. This is the one function most teams already track.
4. Operations
What it covers: Process maturity, documentation, team health, operational discipline.
What good looks like: Processes are documented and followed. Onboarding a new team member takes days, not months. Meetings have agendas and outcomes. The team operates with predictable rhythm.
What bad looks like: Tribal knowledge everywhere. Onboarding is "sit next to Sarah and figure it out." Retros happen but nothing changes. The team is busy but chaotic.
How most teams measure it: Maybe a team health survey. Usually gut feel.
5. Go-to-Market
What it covers: Launch readiness, adoption measurement, positioning clarity, market feedback.
What good looks like: Every feature has a launch plan. Adoption is measured, not assumed. The team knows how to position what they build. Market feedback flows back into the product process.
What bad looks like: Features ship and nobody notices. "Launch" means merging to main. There is no measurement of whether anyone uses what was built. Marketing finds out about new features from the changelog.
How most teams measure it: They do not.
6. Intelligence
What it covers: Data infrastructure, feedback loops, experimentation culture, outcome tracking.
What good looks like: The team knows which features drive outcomes. There is a culture of experimentation. Feedback loops are short and actionable. Decisions are informed by data, not just instinct.
What bad looks like: Analytics are set up but nobody looks at them. A/B testing is "something we should do." The team ships features and moves on without measuring impact.
How most teams measure it: Maybe product analytics (Amplitude, Mixpanel). Rarely connected to operational decisions.
The pattern
Notice the pattern: most teams measure Development, partially measure Intelligence, and do not measure the other four at all.
This means the VP Product is making decisions about a six-function organization with data from one function. That is like flying a plane with only the altimeter working. You know how high you are but not where you are going, how fast, or whether the engines are healthy.
What changes when you measure all six
When you have visibility across all six functions, three things happen:
1. You find the real bottleneck. It is rarely where you think. Teams that feel slow often have strong engineering (Development) but weak prioritization (Strategy) or poor launch processes (Go-to-Market). The fix is not "ship faster." The fix is "ship the right things to the right people."
2. You can explain it to the board. Instead of "I think we are doing well," you can say "Our development is strong at 4.1/5 but our go-to-market process is lagging at 1.9/5. We are investing in GTM process improvement this quarter because that is where the highest leverage is."
3. You stop optimizing the wrong thing. Without cross-function measurement, teams default to optimizing what they can measure: engineering speed. With a full diagnostic, you optimize for impact, not output.
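If you do score each function (on the 1-to-5 scale used in the board example above), finding the bottleneck is mechanical; the hard part is collecting honest scores. A minimal sketch, with entirely hypothetical numbers:

```python
# Hypothetical cross-function scorecard. The six functions and the
# 1-to-5 scale come from the article; the scores below are made up.
scores = {
    "Strategy": 2.4,
    "Design": 2.8,
    "Development": 4.1,   # the one function most teams already track
    "Operations": 2.2,
    "Go-to-Market": 1.9,
    "Intelligence": 3.0,
}

# The bottleneck is the lowest-scoring function, not the loudest one.
bottleneck = min(scores, key=scores.get)
print(f"Invest here first: {bottleneck} ({scores[bottleneck]}/5)")
```

With these numbers the answer is Go-to-Market at 1.9/5, even though the team's instinct would likely be to push Development past 4.1. That is the whole point of the cross-function view.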
Start with what you can see
You do not need to build a measurement system from scratch. Start with what is observable:
- Your product pages, documentation, and public presence reveal a lot about Design, Strategy, and Go-to-Market maturity
- Your engineering tools (GitHub, Linear) already contain signals about Development and Operations
- Your analytics tools contain signals about Intelligence
A product operations diagnostic pulls these signals together into a cross-function view. In 2 minutes, you can see where effort and impact diverge across all six functions.
That is the starting point, not the ending point.
The ending point is a product team that is both efficient and effective, one that knows not just how fast it ships, but whether it is shipping the right things.
Darren Card
Founder, Dacard.ai
See your diagnostic
Free. No sign-up required. Results in 2 minutes.