These cubes provide analytics for AI coding sessions uploaded from the GuideMode Desktop app. They track session metadata, performance metrics, and the impact of AI tools on developer productivity.
Source Table: agent_sessions
Description: Analytics for AI agent sessions including uploads, processing status, and metadata.
| Dimension | Type | Description |
| --- | --- | --- |
| id | string | Unique session ID |
| provider | string | AI provider (claude-code, copilot, etc.) |
| sessionId | string | Provider's session ID |
| fileName | string | Uploaded file name |
| processingStatus | string | Processing status (pending, completed, failed) |
| assessmentStatus | string | Assessment completion status |
| sessionStartTime | time | When session started |
| sessionEndTime | time | When session ended |
| createdAt | time | Record creation time |
| uploadedAt | time | When uploaded |
| userId | string | User who uploaded |
| projectId | string | Associated project |
**Session Counts**

| Measure | Description |
| --- | --- |
| count | Total sessions |
| pendingCount | Sessions awaiting processing |
| completedCount | Successfully processed sessions |
| failedCount | Failed processing sessions |

**File Size**

| Measure | Description |
| --- | --- |
| totalFileSize | Total file size (bytes) |
| avgFileSize | Average file size (bytes) |

**Duration**

| Measure | Description |
| --- | --- |
| totalDuration | Total session time (ms) |
| totalDurationHours | Total session time (hours) |
| totalDurationDays | Total session time (days) |
| avgDuration | Average session duration (ms) |

**Uniqueness**

| Measure | Description |
| --- | --- |
| uniqueUsers | Count of distinct users |
| uniqueProjects | Count of distinct projects |
- **Upload Monitoring:** Track session upload status
- **Provider Analysis:** Compare usage across AI providers
- **Time Investment:** Measure total time spent in AI sessions
- **User Engagement:** Track active users and projects
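The use cases above map onto simple aggregation queries. As an illustrative sketch (the payload follows Cube's standard REST query shape; the `last 7 days` range and the ordering are example choices, not part of the cube definitions), an upload-monitoring query against the Sessions cube could look like:

```python
import json

# Illustrative Cube-style query: pending and failed uploads per provider
# over the last 7 days, providers with the most failures first.
upload_monitoring_query = {
    "measures": ["Sessions.pendingCount", "Sessions.failedCount"],
    "dimensions": ["Sessions.provider"],
    "timeDimensions": [
        {"dimension": "Sessions.uploadedAt", "dateRange": "last 7 days"}
    ],
    "order": {"Sessions.failedCount": "desc"},
}

# Serialize to JSON for the body of a Cube REST API request.
print(json.dumps(upload_monitoring_query, indent=2))
```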
Source Table: session_metrics
Description: Detailed performance, usage, quality, and engagement metrics extracted from AI coding sessions.
| Dimension | Type | Description |
| --- | --- | --- |
| id | string | Metric record ID |
| sessionId | string | Associated session |
| provider | string | AI provider |
| timestamp | time | Metric timestamp |
| createdAt | time | Record creation time |
| usedPlanMode | boolean | Whether plan mode was used |
| usedTodoTracking | boolean | Whether todo tracking was used |
**Performance**

| Measure | Description |
| --- | --- |
| avgResponseLatency | Average AI response latency (ms) |
| maxResponseLatency | Maximum response latency (ms) |
| avgTaskCompletionTime | Average task completion time (ms) |
| maxTaskCompletionTime | Maximum task completion time (ms) |

**Usage**

| Measure | Description |
| --- | --- |
| avgReadWriteRatio | Ratio of read to write operations |
| avgInputClarityScore | Average input clarity score |
| totalReadOperations | Total file read operations |
| avgReadOperations | Average reads per session |
| totalWriteOperations | Total file write operations |
| avgWriteOperations | Average writes per session |
| totalUserMessages | Total user messages |
| avgUserMessages | Average messages per session |

**Errors & Recovery**

| Measure | Description |
| --- | --- |
| totalErrors | Total errors across sessions |
| avgErrors | Average errors per session |
| maxErrors | Maximum errors in a session |
| totalRecoveryAttempts | Total recovery attempts |
| avgRecoveryAttempts | Average recoveries per session |
| totalFatalErrors | Total fatal errors |
| avgFatalErrors | Average fatal errors per session |

**Session Length & Interruptions**

| Measure | Description |
| --- | --- |
| avgInterruptionRate | Average interruption rate |
| avgSessionLength | Average session length (minutes) |
| maxSessionLength | Maximum session length (minutes) |
| totalSessionLength | Total session time (minutes) |
| totalSessionLengthHours | Total time (hours) |
| totalSessionLengthDays | Total time (days) |
| totalInterruptions | Total interruptions |
| avgInterruptions | Average interruptions per session |

**Quality**

| Measure | Description |
| --- | --- |
| avgTaskSuccessRate | Average task success rate |
| totalIterations | Total iterations |
| avgIterations | Average iterations per task |
| avgProcessQualityScore | Average quality score |
| totalOverTopAffirmations | Total over-the-top affirmations |
| operationSuccessRate | % successful operations |

**Feature Adoption**

| Measure | Description |
| --- | --- |
| totalExitPlanMode | Total plan mode exits |
| avgExitPlanMode | Average per session |
| totalTodoWrites | Total todo list updates |
| avgTodoWrites | Average per session |
| planModeUsageRate | % sessions using plan mode |
| todoTrackingUsageRate | % sessions using todo tracking |

**Git Changes**

| Measure | Description |
| --- | --- |
| totalGitFilesChanged | Total files changed via git |
| avgGitFilesChanged | Average files changed |
| totalGitLinesAdded | Total lines added |
| avgGitLinesAdded | Average lines added |
| totalGitLinesRemoved | Total lines removed |
| avgGitLinesRemoved | Average lines removed |
| totalGitLinesModified | Total lines modified |
| totalGitNetLinesChanged | Net lines changed |
| totalLinesRead | Total lines read |

**Efficiency Ratios**

| Measure | Description |
| --- | --- |
| avgGitLinesReadPerLineChanged | Lines read per line changed |
| avgGitReadsPerFileChanged | Reads per file changed |
| avgGitLinesChangedPerMinute | Lines changed per minute |
| avgGitLinesChangedPerToolUse | Lines changed per tool use |

**Tokens & Context**

| Measure | Description |
| --- | --- |
| totalInputTokens | Total input tokens |
| avgInputTokens | Average input tokens |
| totalOutputTokens | Total output tokens |
| avgOutputTokens | Average output tokens |
| totalCacheCreated | Total cache tokens created |
| totalCacheRead | Total cache tokens read |
| avgContextLength | Average context length |
| maxContextLength | Maximum context length |
| avgContextUtilization | Average context utilization % |
| maxContextUtilization | Maximum context utilization % |
| totalCompactEvents | Total context compaction events |
| avgTokensPerMessage | Average tokens per message |
| avgMessagesUntilFirstCompact | Messages before compaction |
- **Performance Analysis:** Track response times and latency
- **Quality Monitoring:** Measure success rates and errors
- **Efficiency Metrics:** Analyze code changes per session
- **Feature Adoption:** Track plan mode and todo usage
- **Token Analysis:** Monitor context window utilization
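For the token-analysis use case, combining the context-utilization measures with a provider grouping shows which provider runs closest to its context limits. A hypothetical query sketch (the ordering is an example choice):

```python
# Illustrative Cube-style query: context-window pressure per provider.
token_query = {
    "measures": [
        "Metrics.avgContextUtilization",
        "Metrics.maxContextLength",
        "Metrics.totalCompactEvents",
    ],
    "dimensions": ["Metrics.provider"],
    "order": {"Metrics.avgContextUtilization": "desc"},
}

print(token_query["measures"])
```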
Source: Cross-join of pull_requests and agent_sessions
Description: Measures the impact of AI tool usage on developer productivity by comparing AI-assisted vs non-AI-assisted PRs.
| Dimension | Type | Description |
| --- | --- | --- |
| prId | string | Pull request ID |
| projectId | string | Project ID |
| teamId | string | Team ID |
| hasAiSession | string | Whether PR had AI assistance ('true'/'false') |
| aiProvider | string | AI provider used (if any) |
| createdAt | time | PR creation time |
| mergedAt | time | PR merge time |
**Adoption**

| Measure | Description |
| --- | --- |
| totalPrCount | Total PRs |
| aiAssistedCount | PRs with AI assistance |
| aiAdoptionRate | % PRs with AI assistance |

**Cycle Time**

| Measure | Description |
| --- | --- |
| avgAiCycleTimeHours | Average cycle time for AI-assisted PRs |
| avgNonAiCycleTimeHours | Average cycle time for non-AI PRs |
| cycleTimeImprovementPercent | % improvement from AI assistance |
AI Adoption Rate:

```
aiAdoptionRate = (aiAssistedCount / totalPrCount) * 100
```

Cycle Time Improvement:

```
cycleTimeImprovementPercent =
  ((avgNonAiCycleTimeHours - avgAiCycleTimeHours) / avgNonAiCycleTimeHours) * 100
```

A positive value means AI-assisted PRs merge faster.
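Both formulas are easy to sanity-check locally. A minimal sketch (the sample numbers are illustrative, not real data):

```python
def ai_adoption_rate(ai_assisted_count: int, total_pr_count: int) -> float:
    """aiAdoptionRate = (aiAssistedCount / totalPrCount) * 100"""
    return ai_assisted_count / total_pr_count * 100


def cycle_time_improvement_percent(avg_non_ai_hours: float, avg_ai_hours: float) -> float:
    """A positive result means AI-assisted PRs merge faster."""
    return (avg_non_ai_hours - avg_ai_hours) / avg_non_ai_hours * 100


# Illustrative numbers: 30 of 120 PRs AI-assisted; 36h vs. 48h average cycle time.
print(ai_adoption_rate(30, 120))                   # → 25.0
print(cycle_time_improvement_percent(48.0, 36.0))  # → 25.0
```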
- **AI Impact Assessment:** Measure productivity gains from AI tools
- **Adoption Tracking:** Monitor team adoption of AI assistants
- **Provider Comparison:** Compare effectiveness of different AI providers
- **ROI Justification:** Quantify AI tool benefits
| Metric | Good | Needs Attention |
| --- | --- | --- |
| aiAdoptionRate | > 50% | < 20% |
| cycleTimeImprovementPercent | > 20% | Negative |
| avgAiCycleTimeHours | Below avgNonAiCycleTimeHours | Above avgNonAiCycleTimeHours |
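These thresholds can be encoded as small alerting helpers; the "good" / "ok" / "needs attention" labels and the middle band are illustrative choices, not part of the cube definitions:

```python
def classify_adoption_rate(rate_percent: float) -> str:
    # Thresholds from the health-indicator table: > 50% good, < 20% needs attention.
    if rate_percent > 50:
        return "good"
    if rate_percent < 20:
        return "needs attention"
    return "ok"


def classify_cycle_time_improvement(improvement_percent: float) -> str:
    # > 20% good; any negative value means AI-assisted PRs merge slower.
    if improvement_percent > 20:
        return "good"
    if improvement_percent < 0:
        return "needs attention"
    return "ok"
```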
**Sessions cube joins**

| Join | Description |
| --- | --- |
| Users | User who uploaded |
| Projects | Associated project |
| Teams | Via user membership |
| TeamsViaProjects | Via project assignment |

**Metrics cube joins**

| Join | Description |
| --- | --- |
| Sessions | Parent session |
| UsersViaSessions | User via session |
| ProjectsViaSessions | Project via session |
The AI Productivity cube uses raw SQL joins and doesn’t have standard cube joins. Filter using dimensions directly.
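Because of this, scoping an AI Productivity query to one team or to AI-assisted PRs is done with dimension filters rather than joins. A hypothetical sketch (`team-123` is a made-up ID; note that `hasAiSession` is the string dimension described above, not a boolean):

```python
# Illustrative filtered query: average AI-assisted cycle time for one team.
filtered_query = {
    "measures": ["AIProductivity.avgAiCycleTimeHours"],
    "filters": [
        {
            "member": "AIProductivity.teamId",
            "operator": "equals",
            "values": ["team-123"],  # hypothetical team ID
        },
        {
            "member": "AIProductivity.hasAiSession",
            "operator": "equals",
            "values": ["true"],  # string dimension, not a boolean
        },
    ],
}

print(len(filtered_query["filters"]))
```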
Sessions by provider:

```yaml
measures: [Sessions.count, Sessions.totalDurationHours]
dimensions: [Sessions.provider]
```

Average metrics by provider:

```yaml
measures: [Metrics.avgSessionLength, Metrics.avgTaskSuccessRate]
dimensions: [Metrics.provider]
```

AI productivity by team:

```yaml
measures: [AIProductivity.aiAdoptionRate, AIProductivity.cycleTimeImprovementPercent]
dimensions: [AIProductivity.teamId]
```