FEATURES / ACTIVITY CLASSIFICATION

Know what your team actually does, not what Jira says

Your current tools count commits and PRs. They can't tell the difference between deep coding work, firefighting, review marathons, and meeting-driven days. Gitrevio's activity classifier uses ML models trained on engineering workflows to categorize every hour of your team's week into meaningful activity types — automatically.

How it classifies a week of work

# Activity classification — Sarah Chen, week of April 14
Deep coding     ████████████░░░░  38% (15.2h)
Code review     ███████░░░░░░░░░  22% ( 8.8h)
Debugging       ████░░░░░░░░░░░░  12% ( 4.8h)
Meetings        ████░░░░░░░░░░░░  11% ( 4.4h)
Context switch  ███░░░░░░░░░░░░░   9% ( 3.6h)
Mentoring       ██░░░░░░░░░░░░░░   5% ( 2.0h)
Admin/process   █░░░░░░░░░░░░░░░   3% ( 1.2h)
# Classifier confidence: 94%
# Signals: 847 events across 5 integrations

The classifier ingests signals from every tool your engineers touch. Commit patterns, PR timelines, file types touched, Jira transitions, calendar events, review comment patterns, Slack activity windows — hundreds of events per day, per person.

It doesn't just look at what happened — it understands the shape of the work. A developer who commits to three files in one repo over four hours is doing deep coding. The same developer touching five repos in one hour is context switching.

No self-reporting. No time-tracking tools. No interruptions. Your engineers keep working. Gitrevio watches the signals they already produce.
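Gitrevio's actual classifier is a trained ML model, but the "shape of the work" idea can be sketched with a toy heuristic. Everything below — the event structure, field names, and thresholds — is invented for illustration and is not Gitrevio's API.

```python
from collections import namedtuple

# Hypothetical commit event: repo name, files touched, hour of day.
Commit = namedtuple("Commit", ["repo", "files", "hour"])

def shape_of_work(commits):
    """Toy heuristic: few files in one repo over several hours looks like
    deep coding; many repos inside a short window looks like context
    switching. (Illustrative thresholds, not the real model.)"""
    repos = {c.repo for c in commits}
    files = set()
    for c in commits:
        files.update(c.files)
    span = max(c.hour for c in commits) - min(c.hour for c in commits)
    if len(repos) >= 3 and span <= 1:
        return "context_switching"
    if len(repos) == 1 and len(files) <= 3 and span >= 3:
        return "deep_coding"
    return "unclassified"

# One repo, three files, four hours: the deep-coding shape from the text.
focused = [Commit("api", ["a.py"], 9), Commit("api", ["b.py"], 11),
           Commit("api", ["c.py"], 13)]
# Five repos inside one hour: the context-switching shape.
scattered = [Commit(r, ["f"], 10) for r in ["api", "web", "cli", "ops", "db"]]
```

The real model weighs many more signals per event; the point is that the category falls out of the pattern across events, not from any single commit.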

What the classifier sees

Each activity type is defined by a distinct pattern of signals, not a single event. The model weighs dozens of features to distinguish between categories with overlapping indicators.

Deep coding
Sustained commits to few files, low task switching, test writes, long uninterrupted sessions
Code review
PR comment timing, review depth, files viewed vs commented, suggestion-to-approval ratio
Debugging
Hot-fix branches, log file changes, rapid commit-revert cycles, stack trace references in comments
Mentoring
Review comments on junior PRs, pair programming patterns, explanatory comment length, repeated reviewer pairing
Meetings
Calendar blocks, gaps in commit activity, Slack silence during meeting windows, post-meeting commit bursts
Context switching
3+ repos in one hour, task-switching velocity, incomplete commits, branch hopping across projects
Admin/process
Jira grooming, sprint planning artifacts, config file updates, CI pipeline maintenance, access requests
Documentation
Markdown and README edits, wiki contributions, ADR creation, inline comment density spikes
Firefighting
Off-hours commits, incident channel activity, rollback patterns, rapid deploy cycles to production
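One way to picture "a pattern of signals, not a single event" is a weighted feature vector per category: each observed signal contributes to every category it belongs to, and the highest-scoring category wins. The feature names and weights below are made up for illustration — the real model learns its parameters from training data.

```python
# Hypothetical feature weights per activity type (illustrative values,
# not the trained model's parameters).
WEIGHTS = {
    "deep_coding": {"sustained_commits": 3.0, "few_files": 2.0, "test_writes": 1.5},
    "debugging":   {"hotfix_branch": 3.0, "commit_revert_cycle": 2.5, "stack_trace_ref": 2.0},
    "code_review": {"pr_comments": 3.0, "files_viewed": 1.0},
}

def classify(features):
    """Score each category against the set of observed feature names,
    returning the best category and a rough confidence share."""
    scores = {cat: sum(w for f, w in fw.items() if f in features)
              for cat, fw in WEIGHTS.items()}
    best = max(scores, key=scores.get)
    total = sum(scores.values()) or 1.0
    return best, scores[best] / total

# A hot-fix branch plus a stack trace outweighs one review signal.
category, confidence = classify({"hotfix_branch", "stack_trace_ref", "pr_comments"})
```

This also shows why categories with overlapping indicators (a review comment can appear during debugging) still resolve cleanly: no single feature decides the outcome.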

Team-level patterns tell a bigger story

Individual activity breakdowns are useful. Team-level comparisons are where the real insights emerge. When your backend team spends 45% of their time in deep coding but your frontend team only hits 28%, that's not a talent gap — it's a process signal.

The frontend team's 31% debugging rate tells you they're spending more time fixing than building. Maybe it's a testing gap. Maybe it's a flaky CI pipeline. Maybe it's tech debt nobody wants to touch. Now you have the data to find out.

Track these patterns over time to see if your interventions actually work. Moved standup to async? Watch the meeting percentage drop. Invested in test coverage? Watch debugging hours shrink.

# Team comparison — week of April 14
Backend (6 engineers)
Deep coding     ██████████████░░  45%
Code review     ███████░░░░░░░░░  22%
Debugging       ████░░░░░░░░░░░░  14%
Meetings        ███░░░░░░░░░░░░░   8%
Other           ████░░░░░░░░░░░░  11%
Frontend (4 engineers)
Deep coding     █████████░░░░░░░  28%
Code review     █████░░░░░░░░░░░  15%
Debugging       ██████████░░░░░░  31%
Meetings        ████░░░░░░░░░░░░  14%
Other           ████░░░░░░░░░░░░  12%
! Frontend debugging is 2.2x backend rate
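Given per-team percentage breakdowns like the ones above, a ratio flag like the 2.2x callout is a simple comparison. The data structure and the 2.0 threshold here are assumptions for illustration, not Gitrevio's configuration.

```python
# Activity mix per team, in percent (numbers from the comparison above).
teams = {
    "backend":  {"deep_coding": 45, "code_review": 22, "debugging": 14, "meetings": 8},
    "frontend": {"deep_coding": 28, "code_review": 15, "debugging": 31, "meetings": 14},
}

def flag_outliers(teams, activity, ratio_threshold=2.0):
    """Flag teams whose share of an activity is at least ratio_threshold
    times the lowest team's share for that same activity."""
    baseline = min(t[activity] for t in teams.values())
    return {name: round(t[activity] / baseline, 1)
            for name, t in teams.items()
            if t[activity] / baseline >= ratio_threshold}

flags = flag_outliers(teams, "debugging")  # frontend's 31% vs backend's 14%
```

Run weekly, a check like this turns one-off observations ("frontend debugs a lot") into a trend you can watch before and after a process change.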

What this unlocks

Identify meeting-heavy teams

Find teams where meetings consume more than 20% of the week — and track whether async experiments actually reduce that number.

Find context-switching hotspots

Spot engineers who are spread across too many repos and projects. Context switching kills deep work — now you can measure it.

Understand velocity differences

When two teams have different velocity, activity classification tells you why. One team codes 45% of the time; the other only 28%.

Measure mentoring investment

See how much senior engineer time actually goes to mentoring. Track whether new hires get the review attention they need.

Detect burnout patterns

Declining deep coding percentage, increasing context switching, growing off-hours work — early warning signals you can act on before it's too late.

Validate process changes

Moved retros to async? Added a code review SLA? See the before-and-after in activity data, not just subjective survey responses.

See what your engineers actually do all week.

Get started free