SPACE Framework

The SPACE framework provides a multidimensional approach to measuring developer productivity. Unlike single-metric approaches, SPACE recognizes that productivity is complex and requires multiple perspectives.

SPACE is a framework developed by researchers at GitHub, Microsoft, and the University of Victoria. It argues that developer productivity cannot be measured by any single metric and requires looking at five dimensions:

  • Satisfaction and well-being
  • Performance
  • Activity
  • Communication and collaboration
  • Efficiency and flow

| Dimension | Key Metrics | Source Cube |
| --- | --- | --- |
| Satisfaction | avgSatisfaction, npsScore | Survey Responses |
| Performance | changeFailureRate, validationRate | Deployment Flow, Discovery Flow |
| Activity | count (sessions), productionDeploymentCount | Sessions, Deployment Flow |
| Communication | avgCollaboration, avgPrToReviewDays | Survey Responses, Delivery Flow |
| Efficiency | medianTotalLeadTimeDays, avgInterruptionRate | Delivery Flow, Metrics |

Satisfaction captures how fulfilled developers feel with their work, tools, and environment. Well-being includes factors like burnout and work-life balance.

| Metric | Description | Source |
| --- | --- | --- |
| avgSatisfaction | Average job satisfaction (1-5 scale) | Survey Responses |
| npsScore | Net Promoter Score (promoter % - detractor %) | Survey Responses |
| avgWorkLifeBalance | Work-life balance rating (1-5 scale) | Survey Responses |

Track these metrics over time to identify trends; sudden drops may indicate team health issues. Compare across teams to distinguish systemic problems from individual team challenges.

Measures:
- avgSatisfaction
- npsScore
- avgWorkLifeBalance
Dimensions: Teams.name, time (monthly)
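The npsScore formula in the table above can be sketched as follows. This is a minimal illustration, assuming the standard 0-10 "would you recommend" NPS scale (promoters 9-10, detractors 0-6); the actual input scale used by the Survey Responses cube may differ.

```python
def nps_score(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Assumes standard 0-10 NPS ratings; returns a value in [-100, 100].
    """
    if not ratings:
        raise ValueError("no ratings")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# 4 promoters, 1 detractor out of 10 responses -> NPS 30
print(nps_score([10, 9, 9, 10, 8, 8, 7, 7, 8, 3]))
```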

Performance focuses on outcomes and quality, not just quantity. It measures whether the work being done achieves its intended goals.

| Metric | Description | Source |
| --- | --- | --- |
| changeFailureRate | % of production deployments that fail | Deployment Flow |
| validationRate | % of discoveries that become validated features | Discovery Flow |

These are outcome-oriented metrics. Change failure rate indicates deployment quality, while validation rate shows how well research leads to valuable features.

Measures:
- changeFailureRate
- validationRate
Filter: isProduction = true (for change failure rate)
Dimensions: Projects.name, Teams.name
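The changeFailureRate calculation, including the isProduction filter mentioned above, can be sketched like this. The record shape is a hypothetical illustration, not the Deployment Flow cube's actual schema.

```python
def change_failure_rate(deployments):
    """% of production deployments that failed.

    Each deployment is a dict with 'isProduction' and 'failed' flags
    (hypothetical shape for illustration).
    """
    prod = [d for d in deployments if d["isProduction"]]
    if not prod:
        return 0.0
    return 100 * sum(d["failed"] for d in prod) / len(prod)

deploys = [
    {"isProduction": True,  "failed": False},
    {"isProduction": True,  "failed": True},
    {"isProduction": True,  "failed": False},
    {"isProduction": False, "failed": True},  # staging: excluded by the filter
]
print(change_failure_rate(deploys))  # 1 failure in 3 production deploys, about 33.3%
```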

Activity measures the actions developers take. The SPACE framework cautions against using activity metrics alone: they are easy to game and, on their own, say little about productivity.

| Metric | Description | Source |
| --- | --- | --- |
| count | Total AI coding sessions | Sessions |
| productionDeploymentCount | Production deployments | Deployment Flow |
| totalDurationHours | Total time in AI sessions | Sessions |

Use activity metrics as context, not targets. High session counts with low productivity may indicate tooling problems. Low activity during high-stress periods may be appropriate.

Measures:
- Sessions.count
- Sessions.totalDurationHours
- productionDeploymentCount
Dimensions: Users.name, provider, time (weekly)
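The weekly per-user rollup of Sessions.count and Sessions.totalDurationHours described above can be sketched as follows. The session tuples are hypothetical sample data, and ISO week numbers stand in for the cube's weekly time dimension.

```python
from collections import defaultdict
from datetime import date

# Hypothetical session records: (user, session start date, duration in hours)
sessions = [
    ("alice", date(2024, 5, 6), 1.5),
    ("alice", date(2024, 5, 7), 0.5),
    ("bob",   date(2024, 5, 8), 2.0),
    ("alice", date(2024, 5, 14), 1.0),  # falls in the following ISO week
]

# Roll up count and totalDurationHours per (user, ISO week)
weekly = defaultdict(lambda: {"count": 0, "totalDurationHours": 0.0})
for user, day, hours in sessions:
    key = (user, day.isocalendar()[1])  # ISO week number
    weekly[key]["count"] += 1
    weekly[key]["totalDurationHours"] += hours

for (user, week), stats in sorted(weekly.items()):
    print(user, week, stats)
```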

Collaboration metrics capture how effectively team members work together. This includes code reviews, knowledge sharing, and cross-functional participation.

| Metric | Description | Source |
| --- | --- | --- |
| avgCollaboration | Team collaboration rating (1-5 scale) | Survey Responses |
| avgPrToReviewDays | Time waiting for first review | Delivery Flow |
| avgPrToApprovalDays | Time from PR creation to approval | Delivery Flow |

Long review times indicate collaboration bottlenecks. Self-reported collaboration scores provide context that metrics alone can’t capture.

Measures:
- avgCollaboration
- avgPrToReviewDays
- avgPrToApprovalDays
Dimensions: Teams.name, Projects.name
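A sketch of how avgPrToReviewDays could be computed from raw PR timestamps. The record shape is hypothetical; PRs that have not yet received a review are excluded, which is one reasonable convention but an assumption about how the Delivery Flow cube handles them.

```python
from datetime import datetime

def avg_pr_to_review_days(prs):
    """Average days from PR creation to first review.

    PRs with no review yet are skipped (an assumed convention).
    """
    waits = [
        (pr["first_review"] - pr["created"]).total_seconds() / 86400
        for pr in prs
        if pr.get("first_review") is not None
    ]
    return sum(waits) / len(waits) if waits else None

prs = [
    {"created": datetime(2024, 5, 1, 9), "first_review": datetime(2024, 5, 2, 9)},  # 1 day
    {"created": datetime(2024, 5, 1, 9), "first_review": datetime(2024, 5, 4, 9)},  # 3 days
    {"created": datetime(2024, 5, 3, 9), "first_review": None},                     # not yet reviewed
]
print(avg_pr_to_review_days(prs))  # (1 + 3) / 2 = 2.0
```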

Efficiency captures how smoothly work flows through the system. Flow state is the productive mental state when developers can focus without interruptions.

| Metric | Description | Source |
| --- | --- | --- |
| medianTotalLeadTimeDays | End-to-end delivery time | Delivery Flow |
| avgInterruptionRate | Interruptions during AI sessions | Metrics |
| avgTaskCompletionTime | Time to complete tasks | Metrics |

Lead time measures system efficiency. Interruption rate indicates how often developers lose flow state. Track these together to understand both system-level and individual-level efficiency.

Measures:
- medianTotalLeadTimeDays
- avgInterruptionRate
- avgTaskCompletionTime
Dimensions: Teams.name, type (feature/bug/chore)
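A small illustration of why the lead-time metric above uses the median rather than the mean: lead-time distributions are long-tailed, and a single stuck item can dominate an average. The sample values are made up.

```python
from statistics import mean, median

def median_total_lead_time_days(lead_times):
    """Median end-to-end delivery time in days; robust to long-tail outliers."""
    return median(lead_times)

# One 30-day outlier barely moves the median, but roughly doubles the mean
lead_times = [2.0, 3.5, 4.0, 5.0, 30.0]
print(median_total_lead_time_days(lead_times))  # 4.0
print(mean(lead_times))                          # 8.9
```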

A comprehensive SPACE dashboard should include at least one metric from each dimension:

SPACE Dashboard Query:
Satisfaction:
- avgSatisfaction (Survey Responses)
- npsScore (Survey Responses)
Performance:
- changeFailureRate (Deployment Flow)
Activity:
- Sessions.count
- Sessions.totalDurationHours
Communication:
- avgPrToReviewDays (Delivery Flow)
Efficiency:
- medianTotalLeadTimeDays (Delivery Flow)
Time dimension: Monthly trend
Team dimension: For comparison
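The dashboard outline above can be expressed as a single Cube-style query object. The measure and cube names come from this page, but the exact API shape (cube prefixes, the time-dimension field) is an assumption for illustration.

```python
# Hypothetical Cube-style query combining one or more measures per SPACE dimension.
space_dashboard_query = {
    "measures": [
        "SurveyResponses.avgSatisfaction",       # Satisfaction
        "SurveyResponses.npsScore",              # Satisfaction
        "DeploymentFlow.changeFailureRate",      # Performance
        "Sessions.count",                        # Activity
        "Sessions.totalDurationHours",           # Activity
        "DeliveryFlow.avgPrToReviewDays",        # Communication
        "DeliveryFlow.medianTotalLeadTimeDays",  # Efficiency
    ],
    "dimensions": ["Teams.name"],  # team dimension for comparison
    "timeDimensions": [
        # assumed time-dimension name; monthly trend as in the outline
        {"dimension": "Sessions.startedAt", "granularity": "month"}
    ],
}

print(len(space_dashboard_query["measures"]), "measures across 5 SPACE dimensions")
```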

GuideMode’s Survey Responses cube includes SPACE-specific questions for different team types:

| SPACE Dimension | Discovery Teams | Delivery Teams |
| --- | --- | --- |
| Satisfaction | discoverySatisfaction | jobSatisfaction |
| Performance | stakeholderConfidence | codeQualityConfidence |
| Activity | customerTouchpointFrequency | deploymentFrequency |
| Collaboration | crossFunctionalParticipation | codeReviewQuality |
| Efficiency | avgTimeToFirstValidation | avgBuildCicdSatisfaction |

See Surveys & Assessments for complete survey metrics.
Best practices for applying SPACE:


  1. Use multiple dimensions - No single metric captures productivity
  2. Combine quantitative and qualitative - Survey data provides context for metrics
  3. Avoid gaming - Don’t use activity metrics as targets
  4. Team context matters - Discovery teams need different metrics than delivery teams
  5. Trend over absolute - Track changes over time, not arbitrary targets