
DORA Metrics

CodeStax tracks the four DORA (DevOps Research and Assessment) metrics to help your team measure and improve engineering performance. Access them from Dashboard → Reviews → DORA.

The Four Metrics

1. Review Frequency

How often your team submits code for security review.

| Rating | Threshold         | What It Means               |
|--------|-------------------|-----------------------------|
| Elite  | Multiple per day  | Continuous delivery cadence |
| High   | Weekly to daily   | Regular review habit        |
| Medium | Monthly to weekly | Room for improvement        |
| Low    | Less than monthly | Reviews are infrequent      |

Why it matters: Teams that review frequently catch issues earlier and ship smaller, safer changes.

How it’s calculated: Count of completed PR reviews and scans per time period, normalized per active developer.
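The normalization above can be sketched in a few lines. This is an illustrative calculation based on the description on this page, not CodeStax's actual implementation; the function name and parameters are hypothetical.

```python
# Illustrative sketch of Review Frequency: completed reviews per time
# period, normalized per active developer (per the description above).
def reviews_per_dev_per_day(completed_reviews: int, active_devs: int, period_days: int) -> float:
    """Average completed reviews per developer per day over the period."""
    return completed_reviews / (active_devs * period_days)

# Example: 90 completed reviews by 3 developers over 30 days
# -> 1.0 review per developer per day (roughly daily, i.e. a "High" cadence).
freq = reviews_per_dev_per_day(90, active_devs=3, period_days=30)
```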

2. Lead Time for Changes

Time from code commit to completed security review.

| Rating | Threshold        | What It Means             |
|--------|------------------|---------------------------|
| Elite  | Less than 1 hour | Fast feedback loop        |
| High   | 1 hour to 1 day  | Same-day reviews          |
| Medium | 1 day to 1 week  | Delays in review pipeline |
| Low    | More than 1 week | Significant bottleneck    |

Why it matters: Long lead times mean developers wait days for security feedback, leading to context-switching and delayed fixes.

How it’s calculated: Median time between PR creation and review completion across all reviewed PRs.
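The median calculation above can be sketched as follows; the tuple layout of (created, completed) timestamps is an assumption for illustration.

```python
from datetime import datetime, timedelta
from statistics import median

# Sketch of Lead Time for Changes: median time between PR creation and
# review completion, as described above. Input shape is illustrative.
def lead_time(prs: list[tuple[datetime, datetime]]) -> timedelta:
    """Median of (review_completed - pr_created) across reviewed PRs."""
    return median(completed - created for created, completed in prs)

t0 = datetime(2024, 1, 1, 9, 0)
prs = [
    (t0, t0 + timedelta(hours=1)),
    (t0, t0 + timedelta(hours=4)),
    (t0, t0 + timedelta(hours=26)),
]
# Median of 1h, 4h, 26h is 4h -> "High" (1 hour to 1 day).
mid = lead_time(prs)
```

A median is used rather than a mean so a handful of long-stalled PRs does not dominate the metric.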

3. Change Failure Rate

Percentage of changes that introduce new security issues.

| Rating | Threshold | What It Means                |
|--------|-----------|------------------------------|
| Elite  | 0–5%      | Almost all changes are clean |
| High   | 5–10%     | Occasional issues caught     |
| Medium | 10–25%    | Frequent security regressions |
| Low    | Above 25% | Systemic quality issues      |

Why it matters: A high failure rate indicates gaps in secure coding practices, missing pre-commit checks, or inadequate developer training.

How it’s calculated: (PR reviews with risk score > 50) / (total PR reviews completed) over the selected period.
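The ratio above can be written out directly. This is a sketch of the stated formula; the risk-score threshold of 50 comes from this page, while the function shape is hypothetical.

```python
# Sketch of Change Failure Rate: share of completed PR reviews whose
# risk score exceeds 50, per the formula above.
def change_failure_rate(risk_scores: list[int]) -> float:
    """Fraction of reviews with risk score > 50; 0.0 if no reviews."""
    if not risk_scores:
        return 0.0
    failures = sum(1 for score in risk_scores if score > 50)
    return failures / len(risk_scores)

# Example: 2 of 20 completed reviews exceed the threshold -> 10%,
# right at the High/Medium boundary.
rate = change_failure_rate([60, 70] + [10] * 18)
```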

4. Mean Time to Review (MTTR)

Average time to complete a security review once triggered.

| Rating | Threshold             | What It Means                   |
|--------|-----------------------|---------------------------------|
| Elite  | Less than 10 minutes  | Automated and fast              |
| High   | 10–30 minutes         | Efficient pipeline              |
| Medium | 30 minutes to 2 hours | Possible queue congestion       |
| Low    | More than 2 hours     | Scanner or infrastructure issues |

Why it matters: Slow reviews block merges and slow down the entire development pipeline.

How it’s calculated: Average duration from scan trigger to review completion across all reviews.
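Mapping an average duration onto the rating bands in the table above might look like this; the thresholds are taken from this page, and the function itself is an illustration rather than CodeStax's code.

```python
from datetime import timedelta

# Illustrative mapping from an average review duration to the MTTR
# rating bands documented above.
def mttr_rating(avg_duration: timedelta) -> str:
    if avg_duration < timedelta(minutes=10):
        return "Elite"
    if avg_duration < timedelta(minutes=30):
        return "High"
    if avg_duration < timedelta(hours=2):
        return "Medium"
    return "Low"
```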

Dashboard Views

Trend Charts

Each metric is displayed as a line chart over the selected time range (7 days, 30 days, 90 days). Trend direction arrows indicate whether you are improving or regressing.

Team Comparison

Compare DORA metrics across repositories to identify which teams are performing well and which need support.

Summary Cards

Four cards at the top of the page show:

  • Current value for each metric
  • Rating badge (Elite / High / Medium / Low)
  • Trend direction vs. previous period

Using DORA Metrics Effectively

  • Set team goals: Use the rating thresholds as targets — aim for “High” initially, then “Elite”
  • Identify bottlenecks: If Lead Time is high but MTTR is low, the bottleneck is before the scan (process, not tooling)
  • Track over time: Week-over-week trends matter more than absolute numbers
  • Combine with scan data: Pair Change Failure Rate with the types of issues found to target training

CLI Access

Retrieve DORA metrics programmatically:

codestax dora --repo my-org/my-repo --period 30d

See CLI Reference for full usage.

API Access

curl -H "X-API-Key: $CODESTAX_API_KEY" \
  "https://codestax.co/api/v1/dora/metrics?repo_id=123&period=30d"

Response includes all four metrics with current values, ratings, and trend data.
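The same request can be built from Python using only the standard library. The endpoint and parameters come from the curl example above; the commented-out response handling is a sketch, since the exact response schema is not documented here.

```python
import json
import os
import urllib.parse
import urllib.request

# Build the DORA metrics request shown in the curl example above.
params = urllib.parse.urlencode({"repo_id": 123, "period": "30d"})
url = f"https://codestax.co/api/v1/dora/metrics?{params}"
request = urllib.request.Request(
    url,
    headers={"X-API-Key": os.environ.get("CODESTAX_API_KEY", "")},
)

# To execute the call (response field names depend on the API schema):
# with urllib.request.urlopen(request, timeout=10) as resp:
#     data = json.load(resp)
```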