Waterline is an AI-powered ticket progress tracker that connects your GitHub repositories to Jira or GitHub Issues and computes a deterministic completion percentage for each ticket. Instead of checking PRs manually or asking for updates in standups, you get a structured breakdown — down to the specific functions and classes that implement each acceptance criterion.

Who it’s for

Waterline is built for the people closest to software delivery:
  • Engineering managers who want a live signal on ticket completeness without reading every diff
  • Developers who want to verify their implementation covers all acceptance criteria before closing a ticket
  • QA engineers who want a structured view of what’s implemented before writing test cases

How it works

Waterline follows a four-step flow from your first connection to a scored result.
1. Connect your repository

Authorize Waterline to read your GitHub repository. Waterline crawls it and builds a symbol-level index — every function, class, and method gets a semantic summary stored in a vector database.
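A minimal sketch of what symbol-level indexing could look like, using Python's `ast` module. This is illustrative only: the record fields are hypothetical, and a real indexer would also attach an LLM-generated semantic summary and an embedding before writing each record to the vector database.

```python
import ast

def extract_symbols(source: str, path: str) -> list[dict]:
    """Walk a Python module and collect one record per function,
    method, or class. A production indexer would summarize and
    embed each record before storing it."""
    tree = ast.parse(source)
    symbols = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            symbols.append({
                "path": path,      # file the symbol lives in
                "name": node.name, # symbol identifier
                "kind": type(node).__name__,
                "lineno": node.lineno,
            })
    return symbols

code = "class Billing:\n    def charge(self, amount):\n        return amount\n"
index = extract_symbols(code, "billing.py")
```

Indexing at symbol granularity rather than file granularity is what lets evidence later point at a specific function instead of a whole module.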
2. Connect your issue tracker

Link Jira or GitHub Issues via OAuth. Waterline reads ticket summaries, descriptions, and acceptance criteria — no admin permissions required.
3. Analyze a ticket

Enter a ticket key (e.g. PROJ-123). Waterline finds the most relevant code symbols via semantic search, scores their relevance with an LLM, extracts the acceptance criteria from the ticket, and maps evidence to each criterion.
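The retrieve-score-map loop above can be sketched as follows. Both expensive steps are stubbed out here: semantic search plus LLM relevance scoring is replaced by a crude keyword-overlap heuristic, and the evidence threshold is an assumed value, not Waterline's actual one.

```python
def keyword_overlap(criterion: str, summary: str) -> float:
    """Stand-in for vector search + LLM scoring: the fraction of
    the criterion's words that appear in a symbol's summary."""
    words = set(criterion.lower().split())
    hits = sum(1 for w in words if w in summary.lower())
    return hits / len(words)

def map_evidence(criteria, symbols, threshold=0.4):
    """Attach the best-matching symbol (if any) to each criterion."""
    mapping = {}
    for crit in criteria:
        scored = [(keyword_overlap(crit, s["summary"]), s["name"]) for s in symbols]
        best_score, best_name = max(scored)
        mapping[crit] = (best_name, best_score) if best_score >= threshold else None
    return mapping

symbols = [
    {"name": "refund_order", "summary": "refund an order and notify the user"},
    {"name": "export_csv", "summary": "export reports to csv"},
]
evidence = map_evidence(["refund an order"], symbols)
```

The output of this step, one (symbol, confidence) pair per criterion, is what feeds the scoring stage described next.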
4. Get a progress score

Waterline returns a deterministic percentage with a per-criterion breakdown. Each criterion is scored SATISFIED (confidence of 75% or higher), PARTIAL (40% up to 75%), or UNSATISFIED (below 40%).
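A sketch of the threshold step, using the 75%/40% cutoffs stated above. The status mapping follows the docs; the aggregation weights (full credit for SATISFIED, half for PARTIAL) are an assumption for illustration, not Waterline's published formula.

```python
def criterion_status(confidence: float) -> str:
    """Map evidence confidence to a status via fixed thresholds,
    so the same evidence always yields the same status."""
    if confidence >= 0.75:
        return "SATISFIED"
    if confidence >= 0.40:
        return "PARTIAL"
    return "UNSATISFIED"

def progress(confidences: list[float]) -> int:
    """Assumed aggregation: average per-criterion credit,
    rounded to a whole percentage."""
    credit = {"SATISFIED": 1.0, "PARTIAL": 0.5, "UNSATISFIED": 0.0}
    statuses = [criterion_status(c) for c in confidences]
    return round(100 * sum(credit[s] for s in statuses) / len(statuses))
```

Because the thresholds are fixed code, not an LLM output, re-running the same evidence can never drift the score.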

Key properties

Symbol-level precision

Waterline indexes individual functions and classes, not whole files. It can pinpoint exactly which part of the codebase addresses each acceptance criterion.

Deterministic scoring

The progress percentage is computed from fixed confidence thresholds — not a direct LLM output. The same evidence always produces the same score.

Incremental sync

Every push to your repo triggers an incremental re-index via webhook. Only changed files are re-processed, so the index stays fresh without burning LLM credits on unchanged code.
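A sketch of how a webhook handler could keep the index incremental. The payload shape mirrors a GitHub push event (commits carrying `added`/`modified`/`removed` file lists); `reindex_file` is a hypothetical stand-in for the expensive summarize-and-embed step.

```python
def reindex_file(path: str) -> list[str]:
    """Stand-in for the LLM-backed summarize-and-embed step."""
    return [f"symbols-of:{path}"]

def handle_push(payload: dict, index: dict) -> dict:
    """Update the symbol index in place: evict removed files,
    re-process added/modified ones, touch nothing else."""
    changed, removed = set(), set()
    for commit in payload.get("commits", []):
        changed.update(commit.get("added", []))
        changed.update(commit.get("modified", []))
        removed.update(commit.get("removed", []))
    for path in removed:
        index.pop(path, None)
    for path in changed - removed:
        index[path] = reindex_file(path)  # only changed files hit the LLM
    return index

index = {"old.py": ["symbols-of:old.py"], "keep.py": ["symbols-of:keep.py"]}
payload = {"commits": [{"added": ["a.py"], "modified": [], "removed": ["old.py"]}]}
handle_push(payload, index)
```

Untouched files keep their existing records, which is what keeps re-indexing cheap on large repositories.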

Provider-agnostic

Works with Anthropic (Claude), OpenAI (GPT), or a fully local Ollama setup. You can use different models for different tasks — for example, Claude for indexing and GPT-4o-mini for analysis.
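One way per-task model routing like this could be configured, sketched below. The keys, model names, and routing function are all illustrative assumptions, not Waterline's actual settings schema.

```python
# Hypothetical per-task routing table: each pipeline stage names
# its own provider and model, so indexing and analysis can differ.
TASK_MODELS = {
    "indexing": {"provider": "anthropic", "model": "claude-3-5-haiku"},
    "analysis": {"provider": "openai", "model": "gpt-4o-mini"},
}

def client_for(task: str) -> str:
    """Resolve a pipeline task to a provider:model identifier."""
    cfg = TASK_MODELS[task]
    return f"{cfg['provider']}:{cfg['model']}"
```

Swapping either entry for an Ollama-served local model would change only this table, not the pipeline code.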

Choose how to run Waterline

You can use Waterline through the hosted version at getwaterline.dev, or run it on your own infrastructure with Docker Compose.

Hosted version

Sign up at getwaterline.dev and connect your repo in minutes. There is no infrastructure to run and no LLM keys to manage; Waterline provides everything. Free during beta.

Self-hosted quickstart

Run the full stack on your own infrastructure with Docker Compose. Bring your own LLM API keys and keep your code on your servers.