Sessions & AI Analytics Cubes

These cubes provide analytics for AI coding sessions uploaded from the GuideMode Desktop app. They track session metadata, performance metrics, and the impact of AI tools on developer productivity.

Sessions Cube

Source Table: agent_sessions
Description: Analytics for AI agent sessions, including uploads, processing status, and metadata.

| Dimension | Type | Description |
| --- | --- | --- |
| id | string | Unique session ID |
| provider | string | AI provider (claude-code, copilot, etc.) |
| sessionId | string | Provider's session ID |
| fileName | string | Uploaded file name |
| processingStatus | string | Processing status (pending, completed, failed) |
| assessmentStatus | string | Assessment completion status |
| sessionStartTime | time | When the session started |
| sessionEndTime | time | When the session ended |
| createdAt | time | Record creation time |
| uploadedAt | time | When uploaded |
| userId | string | User who uploaded |
| projectId | string | Associated project |
| Measure | Description |
| --- | --- |
| count | Total sessions |
| pendingCount | Sessions awaiting processing |
| completedCount | Successfully processed sessions |
| failedCount | Sessions whose processing failed |
| Measure | Description |
| --- | --- |
| totalFileSize | Total file size (bytes) |
| avgFileSize | Average file size (bytes) |
| Measure | Description |
| --- | --- |
| totalDuration | Total session time (ms) |
| totalDurationHours | Total session time (hours) |
| totalDurationDays | Total session time (days) |
| avgDuration | Average session duration (ms) |
| Measure | Description |
| --- | --- |
| uniqueUsers | Count of distinct users |
| uniqueProjects | Count of distinct projects |
  • Upload Monitoring: Track session upload status
  • Provider Analysis: Compare usage across AI providers
  • Time Investment: Measure total time spent in AI sessions
  • User Engagement: Track active users and projects
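For instance, upload monitoring can be expressed over the measures and dimensions listed above; the grouping here is illustrative, not a prescribed query:

```
measures: [Sessions.count]
dimensions: [Sessions.processingStatus, Sessions.provider]
```

This surfaces how many uploads are pending, completed, or failed per provider.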

Metrics Cube

Source Table: session_metrics
Description: Detailed performance, usage, quality, and engagement metrics extracted from AI coding sessions.

| Dimension | Type | Description |
| --- | --- | --- |
| id | string | Metric record ID |
| sessionId | string | Associated session |
| provider | string | AI provider |
| timestamp | time | Metric timestamp |
| createdAt | time | Record creation time |
| usedPlanMode | boolean | Whether plan mode was used |
| usedTodoTracking | boolean | Whether todo tracking was used |
| Measure | Description |
| --- | --- |
| avgResponseLatency | Average AI response latency (ms) |
| maxResponseLatency | Maximum response latency (ms) |
| avgTaskCompletionTime | Average task completion time (ms) |
| maxTaskCompletionTime | Maximum task completion time (ms) |

| Measure | Description |
| --- | --- |
| avgReadWriteRatio | Ratio of read to write operations |
| avgInputClarityScore | Average input clarity score |
| totalReadOperations | Total file read operations |
| avgReadOperations | Average reads per session |
| totalWriteOperations | Total file write operations |
| avgWriteOperations | Average writes per session |
| totalUserMessages | Total user messages |
| avgUserMessages | Average messages per session |

| Measure | Description |
| --- | --- |
| totalErrors | Total errors across sessions |
| avgErrors | Average errors per session |
| maxErrors | Maximum errors in a session |
| totalRecoveryAttempts | Total recovery attempts |
| avgRecoveryAttempts | Average recoveries per session |
| totalFatalErrors | Total fatal errors |
| avgFatalErrors | Average fatal errors per session |

| Measure | Description |
| --- | --- |
| avgInterruptionRate | Average interruption rate |
| avgSessionLength | Average session length (minutes) |
| maxSessionLength | Maximum session length (minutes) |
| totalSessionLength | Total session time (minutes) |
| totalSessionLengthHours | Total time (hours) |
| totalSessionLengthDays | Total time (days) |
| totalInterruptions | Total interruptions |
| avgInterruptions | Average interruptions per session |

| Measure | Description |
| --- | --- |
| avgTaskSuccessRate | Average task success rate |
| totalIterations | Total iterations |
| avgIterations | Average iterations per task |
| avgProcessQualityScore | Average quality score |
| totalOverTopAffirmations | Total over-the-top affirmations |
| operationSuccessRate | % of successful operations |

| Measure | Description |
| --- | --- |
| totalExitPlanMode | Total plan mode exits |
| avgExitPlanMode | Average plan mode exits per session |
| totalTodoWrites | Total todo list updates |
| avgTodoWrites | Average todo list updates per session |
| planModeUsageRate | % of sessions using plan mode |
| todoTrackingUsageRate | % of sessions using todo tracking |

| Measure | Description |
| --- | --- |
| totalGitFilesChanged | Total files changed via git |
| avgGitFilesChanged | Average files changed |
| totalGitLinesAdded | Total lines added |
| avgGitLinesAdded | Average lines added |
| totalGitLinesRemoved | Total lines removed |
| avgGitLinesRemoved | Average lines removed |
| totalGitLinesModified | Total lines modified |
| totalGitNetLinesChanged | Net lines changed |
| totalLinesRead | Total lines read |

| Measure | Description |
| --- | --- |
| avgGitLinesReadPerLineChanged | Lines read per line changed |
| avgGitReadsPerFileChanged | Reads per file changed |
| avgGitLinesChangedPerMinute | Lines changed per minute |
| avgGitLinesChangedPerToolUse | Lines changed per tool use |

| Measure | Description |
| --- | --- |
| totalInputTokens | Total input tokens |
| avgInputTokens | Average input tokens |
| totalOutputTokens | Total output tokens |
| avgOutputTokens | Average output tokens |
| totalCacheCreated | Total cache tokens created |
| totalCacheRead | Total cache tokens read |
| avgContextLength | Average context length |
| maxContextLength | Maximum context length |
| avgContextUtilization | Average context utilization % |
| maxContextUtilization | Maximum context utilization % |
| totalCompactEvents | Total context compaction events |
| avgTokensPerMessage | Average tokens per message |
| avgMessagesUntilFirstCompact | Messages before first compaction |
  • Performance Analysis: Track response times and latency
  • Quality Monitoring: Measure success rates and errors
  • Efficiency Metrics: Analyze code changes per session
  • Feature Adoption: Track plan mode and todo usage
  • Token Analysis: Monitor context window utilization
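Token analysis, for example, can combine the context measures above; the grouping here is illustrative:

```
measures: [Metrics.avgContextUtilization, Metrics.totalCompactEvents]
dimensions: [Metrics.provider]
```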

AI Productivity Cube

Source: Cross-join of pull_requests and agent_sessions
Description: Measures the impact of AI tool usage on developer productivity by comparing AI-assisted and non-AI-assisted PRs.

| Dimension | Type | Description |
| --- | --- | --- |
| prId | string | Pull request ID |
| projectId | string | Project ID |
| teamId | string | Team ID |
| hasAiSession | string | Whether the PR had AI assistance ('true'/'false') |
| aiProvider | string | AI provider used (if any) |
| createdAt | time | PR creation time |
| mergedAt | time | PR merge time |
| Measure | Description |
| --- | --- |
| totalPrCount | Total PRs |
| aiAssistedCount | PRs with AI assistance |
| aiAdoptionRate | % of PRs with AI assistance |

| Measure | Description |
| --- | --- |
| avgAiCycleTimeHours | Average cycle time for AI-assisted PRs |
| avgNonAiCycleTimeHours | Average cycle time for non-AI PRs |
| cycleTimeImprovementPercent | % improvement from AI assistance |

AI Adoption Rate:

```
aiAdoptionRate = (aiAssistedCount / totalPrCount) * 100
```

Cycle Time Improvement:

```
cycleTimeImprovementPercent =
  ((avgNonAiCycleTimeHours - avgAiCycleTimeHours) / avgNonAiCycleTimeHours) * 100
```

A positive improvement means AI-assisted PRs merge faster.
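As a worked example with illustrative numbers (30 AI-assisted PRs out of 120 total; average cycle times of 36 h for AI-assisted and 48 h for non-AI PRs):

```
aiAdoptionRate = (30 / 120) * 100 = 25
cycleTimeImprovementPercent = ((48 - 36) / 48) * 100 = 25
```

That is, 25% of PRs used AI assistance, and AI-assisted PRs merged 25% faster.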

  • AI Impact Assessment: Measure productivity gains from AI tools
  • Adoption Tracking: Monitor team adoption of AI assistants
  • Provider Comparison: Compare effectiveness of different AI providers
  • ROI Justification: Quantify AI tool benefits
| Metric | Good | Needs Attention |
| --- | --- | --- |
| aiAdoptionRate | > 50% | < 20% |
| cycleTimeImprovementPercent | > 20% | Negative |
| avgAiCycleTimeHours | < avgNonAi | > avgNonAi |

Sessions cube joins:

| Join | Description |
| --- | --- |
| Users | User who uploaded |
| Projects | Associated project |
| Teams | Via user membership |
| TeamsViaProjects | Via project assignment |

Metrics cube joins:

| Join | Description |
| --- | --- |
| Sessions | Parent session |
| UsersViaSessions | User via session |
| ProjectsViaSessions | Project via session |

The AI Productivity cube uses raw SQL joins and doesn’t have standard cube joins. Filter using dimensions directly.
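In Cube's standard query format, such a filter might look like the following; this is a sketch of the generic filter object (member, operator, values), not taken from this schema:

```
filters: [
  { member: "AIProductivity.hasAiSession", operator: "equals", values: ["true"] }
]
```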

Sessions by provider:

```
measures: [Sessions.count, Sessions.totalDurationHours]
dimensions: [Sessions.provider]
```

Average metrics by provider:

```
measures: [Metrics.avgSessionLength, Metrics.avgTaskSuccessRate]
dimensions: [Metrics.provider]
```

AI productivity by team:

```
measures: [AIProductivity.aiAdoptionRate, AIProductivity.cycleTimeImprovementPercent]
dimensions: [AIProductivity.teamId]
```
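Provider comparison can reuse the same measures; an illustrative query (note that only AI-assisted PRs carry an aiProvider value):

```
measures: [AIProductivity.aiAssistedCount, AIProductivity.avgAiCycleTimeHours]
dimensions: [AIProductivity.aiProvider]
```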