Discovery Flow Metrics
The Discovery Flow tracks research and validation work from initial discovery through to production deployment. Use these metrics to understand how effectively your team validates ideas before implementation.
Overview
Grain: One row per discovery issue
Refresh: Every 60 minutes
Primary Use Case: Research process effectiveness and idea validation
Discovery issues represent research or investigation work that may lead to validated features. The Discovery Flow tracks:
- How long discoveries take to validate
- What percentage of discoveries result in features
- How quickly validated work moves to production
Duration Measures
All duration measures are calculated from specific timestamps. Understanding exactly what each measures is crucial for correct interpretation.
Discovery Duration
What it measures: How long the discovery phase took from start to finish.
| Aspect | Value |
|---|---|
| Start Point | Discovery issue created |
| End Point | Discovery issue closed |
| Unit | Days |
| NULL when | Discovery not yet closed |
Available aggregations:
- avgDiscoveryDurationDays - Average across all discoveries
- medianDiscoveryDurationDays - Median (recommended for typical performance)
Interpretation:
- Lower is generally better - Faster validation cycles
- Very low values may indicate insufficient research
- Very high values may indicate scope creep or blocked work
Typical ranges:
- Quick spikes: 1-3 days
- Standard research: 5-10 days
- Deep investigation: 2-4 weeks
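The duration logic above can be sketched as follows. The field names (`created_at`, `closed_at`) and sample data are illustrative assumptions, not the product's actual schema:

```python
from datetime import date
from statistics import mean, median

def discovery_duration_days(created_at, closed_at):
    """Days from discovery creation to close; None (NULL) while still open."""
    if closed_at is None:
        return None
    return (closed_at - created_at).days

# Hypothetical discoveries: (created_at, closed_at)
discoveries = [
    (date(2024, 1, 1), date(2024, 1, 4)),   # quick spike: 3 days
    (date(2024, 1, 2), date(2024, 1, 9)),   # standard research: 7 days
    (date(2024, 1, 3), None),               # still open -> excluded from aggregates
]

durations = [d for d in (discovery_duration_days(c, z) for c, z in discoveries)
             if d is not None]
avg_discovery_duration_days = mean(durations)       # 5.0
median_discovery_duration_days = median(durations)  # 5.0
```

Note that open discoveries are NULL and therefore drop out of both aggregations, so a long-running open discovery does not inflate the average.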
Discovery to PR
What it measures: Time between completing discovery and starting implementation.
| Aspect | Value |
|---|---|
| Start Point | Discovery issue closed |
| End Point | First PR created |
| Unit | Days |
| NULL when | Discovery not closed, or no PR created yet |
Available aggregations:
- avgDiscoveryToPrDays - Average time to start implementation
- medianDiscoveryToPrDays - Median time (recommended)
Interpretation:
- Measures handoff efficiency between research and implementation
- High values indicate bottlenecks in work prioritization
- Negative values are possible if PR work starts before discovery closes
Typical ranges:
- Well-prioritized: 0-2 days
- Normal backlog: 3-7 days
- Backlog issues: 2+ weeks
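A minimal sketch of this measure, including the negative and NULL cases described above (function and field names are illustrative assumptions):

```python
from datetime import date

def discovery_to_pr_days(discovery_closed, first_pr_created):
    """Days from discovery close to first PR; None (NULL) if either is missing.
    Negative when the PR was opened before the discovery closed (parallel work)."""
    if discovery_closed is None or first_pr_created is None:
        return None
    return (first_pr_created - discovery_closed).days

print(discovery_to_pr_days(date(2024, 1, 10), date(2024, 1, 12)))  # 2 (normal handoff)
print(discovery_to_pr_days(date(2024, 1, 10), date(2024, 1, 10)))  # 0 (same-day start)
print(discovery_to_pr_days(date(2024, 1, 10), date(2024, 1, 8)))   # -2 (PR before close)
print(discovery_to_pr_days(date(2024, 1, 10), None))               # None (no PR yet)
```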
Discovery to Production
What it measures: Time from completing discovery to reaching production.
| Aspect | Value |
|---|---|
| Start Point | Discovery issue closed |
| End Point | First production deployment completed |
| Unit | Days |
| NULL when | Discovery not closed, or no production deployment yet |
Available aggregations:
- avgDiscoveryToProductionDays - Average time to production
- medianDiscoveryToProductionDays - Median time (recommended)
Interpretation:
- End-to-end implementation time from validated idea to production
- Includes PR creation, review, merge, and deployment
- High values indicate slow delivery pipeline or complex implementations
Typical ranges:
- Fast delivery: 3-7 days
- Standard delivery: 1-3 weeks
- Complex features: 1-2 months
Total Lead Time
What it measures: Complete end-to-end time from discovery creation to production.
| Aspect | Value |
|---|---|
| Start Point | Discovery issue created |
| End Point | First production deployment completed |
| Unit | Days |
| NULL when | No production deployment yet |
Available aggregations:
- avgTotalLeadTimeDays - Average total lead time
- medianTotalLeadTimeDays - Median (uses successful deployments)
- p90TotalLeadTimeDays - 90th percentile (for capacity planning)
Interpretation:
- Most comprehensive metric for idea-to-production time
- Combines discovery duration + implementation time
- P90 is useful for setting expectations with stakeholders
Typical ranges:
- High-performing: 1-2 weeks
- Standard: 2-4 weeks
- Needs improvement: 1+ months
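The three aggregations can be sketched like this. The lead-time values are hypothetical, and the p90 uses a simple nearest-rank definition, which may differ from the exact percentile method the product uses:

```python
import math
from statistics import mean, median

def percentile(values, p):
    """Nearest-rank percentile: smallest value with at least p% of data at or below it."""
    ordered = sorted(values)
    rank = math.ceil(p / 100 * len(ordered))  # 1-based rank
    return ordered[rank - 1]

# Hypothetical total lead times in days (discovery created -> production)
lead_times = [6, 8, 9, 10, 11, 12, 14, 18, 21, 40]

avg_total_lead_time_days = mean(lead_times)            # 14.9 (pulled up by the 40-day outlier)
median_total_lead_time_days = median(lead_times)       # 11.5 (typical case)
p90_total_lead_time_days = percentile(lead_times, 90)  # 21 (stakeholder expectation)
```

The spread between the three illustrates why p90 is the safer number to quote to stakeholders: most work lands near the median, but the tail is much longer.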
Validation Metrics
Validation Status
Discoveries can have four validation states:
| Status | Meaning |
|---|---|
| validated | Discovery resulted in validated features |
| invalidated | Discovery was rejected/not pursued |
| closed_unvalidated | Closed without explicit validation decision |
| in_progress | Still open/being researched |
Validation Rate
Measure: validationRate
Percentage of completed discoveries that were validated into features.
validationRate = (validated_count / completed_count) * 100

Where completed_count = validated + invalidated + closed_unvalidated
Interpretation:
- 50-70% is typical for healthy discovery processes
- Very high rates (90%+) may indicate bias toward validation
- Very low rates (below 30%) may indicate poor initial filtering
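The formula can be sketched directly from the count measures. The counts below are hypothetical; note that in_progress discoveries are excluded from the denominator:

```python
def validation_rate(validated, invalidated, closed_unvalidated, in_progress):
    """Percentage of *completed* discoveries that were validated.
    in_progress discoveries do not count toward the denominator."""
    completed = validated + invalidated + closed_unvalidated
    if completed == 0:
        return None  # no completed discoveries yet -> rate undefined
    return validated / completed * 100

# Hypothetical counts: 12 validated, 5 invalidated, 3 closed without a decision, 4 open
rate = validation_rate(validated=12, invalidated=5, closed_unvalidated=3, in_progress=4)
print(round(rate, 1))  # 60.0 -> inside the healthy 50-70% band
```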
Count Measures
| Measure | Description |
|---|---|
| count | Total number of discoveries |
| validatedCount | Discoveries validated into features |
| invalidatedCount | Discoveries marked as invalid |
| closedUnvalidatedCount | Closed without validation decision |
| inProgressCount | Still being researched |
Conversion Metrics
Track how discoveries progress through the delivery pipeline:
Discovery to PR Conversion Rate
Measure: discoveryToPrConversionRate
Percentage of discoveries that resulted in at least one PR.
discoveryToPrConversionRate = (discoveries_with_pr / total_discoveries) * 100

Interpretation:
- Measures implementation rate of discoveries
- Some discoveries may not require code changes (documentation, process changes)
- Very low rates may indicate discoveries aren’t actionable
Discovery to Merge Conversion Rate
Measure: discoveryToMergeConversionRate
Percentage of discoveries with at least one merged PR.
Interpretation:
- Measures completion rate through code review
- Gap between “to PR” and “to merge” indicates review bottlenecks
Discovery to Deployment Conversion Rate
Measure: discoveryToDeploymentConversionRate
Percentage of discoveries that reached successful production deployment.
Interpretation:
- Ultimate success metric - validated ideas in production
- Should track closely with merge rate for healthy pipelines
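The three conversion rates form a funnel over per-discovery flags. A minimal sketch, assuming hypothetical boolean fields modeled on the dimensions listed later in this page:

```python
# Hypothetical discoveries with pipeline flags (field names are illustrative)
discoveries = [
    {"has_pr": True,  "has_merged_pr": True,  "has_successful_production_deployment": True},
    {"has_pr": True,  "has_merged_pr": True,  "has_successful_production_deployment": True},
    {"has_pr": True,  "has_merged_pr": False, "has_successful_production_deployment": False},
    {"has_pr": False, "has_merged_pr": False, "has_successful_production_deployment": False},
]

def conversion_rate(discoveries, flag):
    """Share of all discoveries where the given pipeline flag is set."""
    return sum(d[flag] for d in discoveries) / len(discoveries) * 100

to_pr = conversion_rate(discoveries, "has_pr")                                        # 75.0
to_merge = conversion_rate(discoveries, "has_merged_pr")                              # 50.0
to_deploy = conversion_rate(discoveries, "has_successful_production_deployment")      # 50.0
# to_merge == to_deploy here: merged work is reaching production (healthy pipeline).
# The 25-point gap between to_pr and to_merge would point at a review bottleneck.
```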
Relationship Metrics
Average Validated Feature Count
Measure: avgValidatedFeatureCount
Average number of features generated per discovery.
Interpretation:
- Scope indicator - do discoveries spawn one feature or many?
- High values may indicate discoveries are too broad
- Value of 1.0-2.0 is typical
Average Linked PR Count
Measure: avgLinkedPrCount
Average number of PRs associated with each discovery.
Interpretation:
- Implementation complexity indicator
- High values suggest complex or poorly scoped discoveries
Dimensions (Filters)
Filter discovery metrics by:
| Dimension | Description |
|---|---|
| provider | Issue tracking provider (e.g., Jira, Linear) |
| state | Current issue state |
| validationStatus | validated, invalidated, closed_unvalidated, in_progress |
| isDiscoveryComplete | Whether discovery is closed |
| hasPr | Whether any PR exists |
| hasMergedPr | Whether any PR is merged |
| hasProductionDeployment | Whether deployed to production |
| hasSuccessfulProductionDeployment | Whether successfully deployed |
Via Joins:
- Projects.id / Projects.name - Filter by project
- Teams.id / Teams.name - Filter by team
- Users.id / Users.name - Filter by discovery author
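Dimensions and joined fields can be combined in a single query. The payload below is a hypothetical sketch of what such a request might look like; the actual request format depends on your analytics API and is not specified on this page:

```python
# Hypothetical query payload: measures filtered by a dimension and a joined field.
# The shape of this dict is an illustrative assumption, not a documented API.
query = {
    "measures": ["validationRate", "medianDiscoveryDurationDays"],
    "filters": [
        # Only discoveries that were validated into features...
        {"dimension": "validationStatus", "operator": "equals", "values": ["validated"]},
        # ...owned by one team, via the Teams join
        {"dimension": "Teams.name", "operator": "equals", "values": ["Platform"]},
    ],
}
```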
Common Analysis Patterns
Discovery Process Health
Measures:
- validationRate (target: 50-70%)
- medianDiscoveryDurationDays (target: < 2 weeks)
- inProgressCount (watch for growing backlog)

Handoff Efficiency
Measures:
- medianDiscoveryToPrDays (target: < 1 week)
- discoveryToPrConversionRate (compare to validation rate)

End-to-End Performance
Measures:
- medianTotalLeadTimeDays (primary KPI)
- p90TotalLeadTimeDays (for SLA planning)
- discoveryToDeploymentConversionRate (success rate)

Q: What’s the difference between “Discovery Duration” and “Discovery to Production”?
Discovery Duration measures only the research phase (created → closed). Discovery to Production measures the time after the discovery closes until code reaches production. Total Lead Time combines both.
Q: Why is Discovery to PR measured from closed, not created?
This measures handoff efficiency - how quickly the team starts implementation after validation. If measured from creation, it would conflate research time with implementation time.
Q: A discovery shows 0 days for “Discovery to PR” - is that wrong?
No, this indicates the PR was created on the same day the discovery closed. This is good! It means implementation started immediately after validation.
Q: Can Discovery to PR be negative?
Technically yes, if a PR is created before the discovery closes (parallel work). This is unusual but not necessarily wrong - sometimes implementation starts during validation.
Related
- Analytics Overview - Understanding the three flows
- Delivery Flow - Track work items through delivery
- Deployment Flow - Deployment pipeline metrics
- DORA Metrics - DevOps performance framework
- SPACE Framework - Developer productivity framework