GitHub Issues — Quality Standards Guide

This guide defines the quality standards for all GitHub issues created in the XO Data Ops project (#15). It is the single reference for anyone — BI, data engineering, or operations — creating or reviewing issues in this project.

Project scope note: This project (#15) is the XO Data Ops write space. The [Product] project (#6) is read-only reference — do not create or edit issues there.


Issue Hierarchy at a Glance

Epic (rare — always tied to a Milestone, max 2-3 active)
 ├── Feature A [one area label]
 │    ├── Task(s) [same area label as parent]
 │    ├── Bug(s) [if found during feature work]
 │    └── Docs (companion — always created with Feature)
 └── Feature B [different area label]

Standalone (no epic):
 ├── Feature + Docs companion
 ├── Task (independent work)
 ├── Bug (independent fix)
 ├── Chore (maintenance)
 └── Docs (standalone)
  • Epics are containers that carry NO area label. They are always tied to a Milestone. Max 2-3 active at any time. They stay open until all children close.
  • Child issues (Feature, Bug, Task, Docs) are the actual units of work. Each should be M effort or smaller (1-2 days). If it's larger, split it.

Issue Types

Epic

An epic groups a multi-sprint initiative into a trackable parent issue.

| Field | Value |
| --- | --- |
| Title format | [Epic] Imperative description |
| Example | [Epic] Build WBP CSAT Analytics Pipeline |
| Native Type | Epic |
| Label | None (children carry area labels) |
| Stays open until | All child checkboxes are checked |
| Milestone | Required — tied 1:1 to a Milestone for external commitments |
| Create when | Initiative has 3+ issues or spans more than 1 sprint |

Body structure:

## Goal
What is the end state we're building toward?

## Scope
What is in scope? What is explicitly out of scope?

## Child Issues
- [ ] #N Title of child issue
- [ ] #N Title of child issue
- [ ] #N Title of child issue

## Milestone
Which milestone does this belong to?

Epic rules:

- Break children down to M effort or smaller; L or XL children need further splitting.
- Progress is visible as the task list completion percentage on GitHub.
- One epic per initiative — do not create nested epics.
- Epics carry NO area label; their children carry the area context.
- Always tied to a Milestone for external commitments.
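
The task-list completion percentage GitHub displays on an epic can be reproduced with a few lines. This is an illustrative sketch (the checkbox regex and the sample body are assumptions, not project code):

```python
import re

def task_list_progress(body: str) -> tuple[int, int]:
    """Count (checked, total) '- [ ]' / '- [x]' items in an issue body."""
    items = re.findall(r"^\s*- \[( |x|X)\]", body, flags=re.MULTILINE)
    done = sum(1 for mark in items if mark.lower() == "x")
    return done, len(items)

epic_body = """\
## Child Issues
- [x] #101 Build Bronze loader
- [ ] #102 Add Silver model
- [ ] #103 Document the pipeline
"""

done, total = task_list_progress(epic_body)
print(f"{done}/{total} complete")  # → 1/3 complete
```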


Feature

A Feature issue represents new functionality being added to the platform.

| Field | Value |
| --- | --- |
| Title format | [Feature] Imperative description |
| Example | [Feature] Add CSAT Gold model from WBP Bronze |
| Native Type | Feature |
| Label | Area label (pipelines / snowflake / xoos / platform) |
| Template | .github/ISSUE_TEMPLATE/feature.md |

Body structure:

## Goal
What are we building and why? One or two sentences.

## Acceptance Criteria
- [ ] Criterion 1 (specific, independently verifiable)
- [ ] Criterion 2
- [ ] Criterion 3

## Background / Context
Upstream dependencies, relevant ADRs, related issues.

## Out of Scope
Explicitly state what this issue does NOT cover.

Bug

A Bug issue tracks something that is broken and needs to be fixed.

| Field | Value |
| --- | --- |
| Title format | [Bug] Imperative description |
| Example | [Bug] Fix null handling in Gladly extraction |
| Native Type | Bug |
| Label | Area label (pipelines / snowflake / xoos / platform) |
| Template | .github/ISSUE_TEMPLATE/bug.md |

Body structure:

## What's Broken
One-sentence description of the failure.

## Expected Behavior
What should happen?

## Actual Behavior
What is happening instead? Include error messages, stack traces, Snowflake query results.

## Steps to Reproduce
1.
2.
3.

## Impact
- [ ] Blocking production pipeline
- [ ] Incorrect data in Snowflake (specify table/layer)
- [ ] Broken DAG / Airflow task
- [ ] Other: ___

## Environment
- [ ] Production
- [ ] Dev / staging
- [ ] Local

## Additional Context
DAG run IDs, Snowflake query IDs, Airflow task logs, S3 paths.

Spike

A Spike is time-boxed research that ends with a written artifact regardless of the outcome. A spike without output never closes.

| Field | Value |
| --- | --- |
| Title format | [Spike] Imperative description |
| Example | [Spike] Evaluate dbt metrics vs Cube for aggregates |
| Native Type | Task |
| Label | Area label (pipelines / snowflake / xoos / platform) |
| Template | .github/ISSUE_TEMPLATE/spike.md |
| Max duration | 2 days (timeboxed — commit to this before starting) |

Body structure:

## Goal
What question are we trying to answer? Be precise — vague goals never end.

## Timebox
Duration: [e.g., 2 days]
Deadline: YYYY-MM-DD

## Background
Why is this worth answering now? What decision or work does it unblock?

## What We Already Know
Prior art, ADRs, constraints that scope this spike.

## Questions to Answer
1.
2.
3.

## Output Format (required — choose at least one)
- [ ] Written findings doc in docs/
- [ ] Comparison table / scored matrix
- [ ] Prototype / POC code (link to branch or PR)
- [ ] ADR draft
- [ ] Other: ___

## Blocked By
Any hard dependencies.

Critical rule: Every spike must define its output format before work starts. A spike closes only when the artifact is delivered via PR.

The former [Decision] issue type has been folded into [Spike]. If a spike produces an architecture decision, the output is an ADR in docs/decisions/.


Task

A Task is a unit of implementation work. Tasks can be children of Features or standalone.

| Field | Value |
| --- | --- |
| Title format | [Task] Imperative description |
| Example | [Task] Add RECORD_KEY hashing to CND extraction config |
| Native Type | Task |
| Label | Area label (pipelines / snowflake / xoos / platform) |

Use the Feature template body — Goal and Acceptance Criteria are still required.


Chore

A Chore is maintenance, cleanup, refactoring, or dependency updates with no user-facing behavior change.

| Field | Value |
| --- | --- |
| Title format | [Chore] Imperative description |
| Example | [Chore] Upgrade pandas to 2.2 across xo-core |
| Native Type | Chore |
| Label | Area label (pipelines / snowflake / xoos / platform) |

Use the Feature template body — Goal and Acceptance Criteria are still required.


Docs

A Docs issue covers documentation-only work: ADRs, guides, README updates, reference table changes.

| Field | Value |
| --- | --- |
| Title format | [Docs] Imperative description |
| Example | [Docs] Write ADR for Bronze load strategy |
| Native Type | Docs |
| Label | Area label (pipelines / snowflake / xoos / platform) |

Use the Feature template body — Goal and Acceptance Criteria are still required.


Issue Title Convention

All titles must use an imperative verb and follow the [Type] prefix pattern.

| ✅ Correct | ❌ Wrong |
| --- | --- |
| [Feature] Add CSAT Silver model | [Feature] Adding CSAT Silver model |
| [Bug] Fix null agent IDs in Gold fact | [Bug] Null agent IDs |
| [Spike] Evaluate Cube for semantic layer | [Spike] Look into Cube? |

Imperative verbs to use: Add, Fix, Build, Create, Remove, Migrate, Evaluate, Investigate, Update, Refactor, Document, Deploy
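
The convention above can be checked mechanically with a regex. This is a sketch only (the type and verb lists mirror this guide; the function name is illustrative):

```python
import re

TYPES = r"Epic|Feature|Bug|Spike|Task|Chore|Docs"
VERBS = ("Add", "Fix", "Build", "Create", "Remove", "Migrate", "Evaluate",
         "Investigate", "Update", "Refactor", "Document", "Deploy")

def title_ok(title: str) -> bool:
    """True if the title is '[Type] ImperativeVerb ...'."""
    m = re.match(rf"^\[({TYPES})\] (\w+)", title)
    return bool(m) and m.group(2) in VERBS

print(title_ok("[Feature] Add CSAT Silver model"))    # → True
print(title_ok("[Feature] Adding CSAT Silver model")) # → False
print(title_ok("[Bug] Null agent IDs"))               # → False
```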


Task Sizing & Effort Guidelines

Tasks are the smallest actionable unit. A Task should be completable in one PR by one person.

| Effort | Duration | Appropriate for |
| --- | --- | --- |
| XS | < 2h | Quick fix, config change, single-file edit |
| S | Half day | Small Task, single-concern change |
| M | 1-2 days | Maximum for a Task — ideal size |
| L | 3-5 days | Feature (break into M-sized Tasks) |
| XL | > 1 week | Epic candidate (break into Features) |

If a Task is estimated at L or higher, it's not a Task — it's a Feature that needs to be broken down.
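
The sizing rule reduces to a small threshold lookup. A sketch, with thresholds taken from the effort table above (function and return shape are illustrative):

```python
def classify_effort(estimated_days: float) -> tuple[str, bool]:
    """Return (effort bucket, needs_split) per the sizing guidelines."""
    if estimated_days <= 0.25:   # < 2h
        return "XS", False
    if estimated_days <= 0.5:    # half day
        return "S", False
    if estimated_days <= 2:      # 1-2 days, maximum for a Task
        return "M", False
    if estimated_days <= 5:      # 3-5 days: Feature, break into M Tasks
        return "L", True
    return "XL", True            # > 1 week: Epic candidate

print(classify_effort(1.5))  # → ('M', False)
print(classify_effort(4))    # → ('L', True)
```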

| Situation | What to use |
| --- | --- |
| Work has 1 PR, checklist of steps | Task with acceptance criteria checkboxes |
| Work needs 2-3 independent deliverables | Feature with child Tasks |
| Work spans 3+ issues across areas | Epic with child Features |
| Multiple related items for similar sources | One Feature with Tasks per source (not separate issues per source) |

Anti-pattern to avoid: Creating a series of near-identical issues for individual items (e.g., one issue per Bronze table for the same client). Instead, group by source or client as one Feature with Tasks per source.


Docs Companion Rules

| Parent Type | Companion Docs? | Signal |
| --- | --- | --- |
| Feature | Always | Every new capability needs its docs updated |
| Bug | Sometimes | If the bug reveals a docs gap (external report, misuse pattern) |
| Task | Rarely | Only if it changes interfaces others depend on |
| Chore | Almost never | Only if it reveals undocumented fragility |

Label System

Every issue must have exactly one area label. That is the only label type used. Epics do not get area labels. Their children carry the area context.

Area Labels (pick exactly one)

| Label | Covers | CI enforced |
| --- | --- | --- |
| pipelines | Extract → Load: extractors, staging, DAGs, Airflow, orchestration | Yes |
| snowflake | Bronze DDL, dbt models (Silver/Gold), schemachange, schema design | Yes |
| xoos | Embedded data views, XOOS product features | Yes |
| platform | Packages (xo-core/foundry/lens/bosun), CI/CD, tooling, docs, project ops | Yes |

The label-check.yml CI workflow fails any PR where neither the PR nor its linked issue has one of these area labels.
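
The enforcement amounts to a set intersection: pass if the PR or its linked issue carries at least one area label. A sketch of that logic only (the real check lives in label-check.yml; the function and sample labels here are illustrative):

```python
AREA_LABELS = {"pipelines", "snowflake", "xoos", "platform"}

def labels_pass(pr_labels: set[str], issue_labels: set[str]) -> bool:
    """Pass if the PR or its linked issue has at least one area label."""
    return bool(AREA_LABELS & (pr_labels | issue_labels))

print(labels_pass({"pipelines"}, set()))         # → True
print(labels_pass(set(), {"snowflake"}))         # → True
print(labels_pass({"good first issue"}, set()))  # → False
```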

Issue Types (native GitHub, not labels)

Issue typing is handled by GitHub's native Issue Types — not labels. Every issue must have exactly one:

| Type | When to use |
| --- | --- |
| Feature | New capability |
| Bug | Something broken |
| Task | Implementation work or time-boxed research (spike) — spike prefix = timebox + mandatory artifact |
| Chore | Maintenance, cleanup, refactor |
| Docs | Documentation only |
| Epic | Multi-sprint initiative grouping child issues |

Custom Project Fields

Every issue in the project board has these custom fields:

| Field | Options | Required for Refined? |
| --- | --- | --- |
| Priority | P1 - Critical / P2 - High / P3 - Normal / P4 - Low | Yes |
| Domain | WarbyParker / CondeNast / (unset = internal) | No |
| Effort | XS (<2h) / S (half day) / M (1-2 days) / L (3-5 days) / XL (>1 week) | Yes |
| Iteration | Current biweekly sprint | Set at sprint planning |

Definition of "Refined" — The Quality Gate

No issue moves to "In Progress" until it is Refined. An issue is Refined when ALL of these are true:

  • Title uses [Type] prefix and an imperative verb
  • Native Issue Type is set (Feature / Bug / Task / Chore / Docs / Epic)
  • Area label is set (pipelines / snowflake / xoos / platform)
  • Priority field is set in the project board (P1–P4)
  • Effort field is set in the project board
  • Iteration field answered — either assigned to the current sprint, or explicitly left in the backlog
  • Acceptance criteria are written as a checkbox list in the body

If any of these are missing, the issue stays in Backlog. A reviewer or the original author must fill in the gaps before sprint planning can pull it in.
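
The gate can be checked mechanically. This sketch assumes a simple dict shape for an issue (the field names are illustrative, not the GitHub API schema):

```python
import re

AREA_LABELS = {"pipelines", "snowflake", "xoos", "platform"}
NATIVE_TYPES = {"Feature", "Bug", "Task", "Chore", "Docs", "Epic"}

def missing_for_refined(issue: dict) -> list[str]:
    """Return the quality-gate criteria this issue still fails."""
    gaps = []
    if not re.match(r"^\[(Epic|Feature|Bug|Spike|Task|Chore|Docs)\] \w",
                    issue.get("title", "")):
        gaps.append("title prefix")
    if issue.get("type") not in NATIVE_TYPES:
        gaps.append("native issue type")
    if not AREA_LABELS & set(issue.get("labels", [])):
        gaps.append("area label")
    if issue.get("priority") not in {"P1", "P2", "P3", "P4"}:
        gaps.append("priority")
    if not issue.get("effort"):
        gaps.append("effort")
    if "sprint_decision" not in issue:
        gaps.append("sprint decision")
    body = issue.get("body", "")
    if "- [ ]" not in body and "- [x]" not in body:
        gaps.append("acceptance criteria")
    return gaps

issue = {"title": "[Task] Add RECORD_KEY hashing", "type": "Task",
         "labels": ["pipelines"], "priority": "P2", "effort": "S",
         "sprint_decision": "backlog", "body": "- [ ] Hashing added"}
print(missing_for_refined(issue))  # → []
```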


Issue Lifecycle

Backlog → Refined → In Progress → In Review → Done

| Stage | Requirements | Who |
| --- | --- | --- |
| Backlog | Has a title and native Issue Type | Anyone |
| Refined | All 7 quality gate criteria met | Author or team lead |
| In Progress | Assignee set + branch created (name includes issue number) | Developer picks it up |
| In Review | PR open with Closes #N in body | Developer |
| Done | PR merged (auto-close) or issue manually closed with explanation | Reviewer merges |

Acceptance Criteria Standards

Acceptance criteria define the closure conditions for an issue. They must be:

  1. Checkboxes — use - [ ] markdown syntax
  2. Independently verifiable — each item can be checked off separately
  3. Specific — not vague; include what to check and how

Good examples:

- [ ] Silver model row count matches Bronze row count (±0 for same batch)
- [ ] `dbt test --select csat_silver` passes with zero failures
- [ ] Gold fact table is visible in Tableau with correct agent names
- [ ] Re-running the pipeline for the same date produces identical output (idempotency)

Bad examples (too vague):

- [ ] Data looks correct
- [ ] Pipeline works
- [ ] Model is done


Branch Naming Convention

Every branch must include the issue number. This creates automatic linking in GitHub.

`<type>/<issue-number>-<short-description>`

| Type prefix | Use for |
| --- | --- |
| feature/ | Feature issues |
| bug/ | Bug issues |
| spike/ | Spike issues |
| chore/ | Chore issues |
| docs/ | Docs issues |

Examples:

- feature/42-add-csat-gold-model
- bug/57-fix-null-agent-ids
- spike/61-evaluate-cube-metrics
- docs/70-add-github-quality-standards

Tip: Create the branch via the GitHub issue UI ("Create a branch" button) — it auto-names it correctly and links the branch to the issue.


Pull Request Requirements

Every PR must:

  1. Link the issue — include Closes #N in the body (triggers auto-close on merge)
  2. Use the PR template from .github/pull_request_template.md
  3. Have an area label (on the PR or inherited from the linked issue)
  4. Require 1 review before merge — no self-merges

Minimum PR body:

## Summary
What does this PR do? One or two sentences.

## Linked Issue
Closes #N

## Type of Change
<!-- Feature | Bug fix | Chore | Docs | Spike output -->

## Testing
[Checked items relevant to the change]
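
The Closes #N link that triggers auto-close can be extracted from a PR body like this. A sketch using GitHub's documented closing keywords (close/fix/resolve and their variants); the function name and sample body are illustrative:

```python
import re

def closed_issues(pr_body: str) -> list[int]:
    """Find issue numbers referenced by closing keywords in a PR body."""
    return [int(n) for n in re.findall(
        r"\b(?:close[sd]?|fix(?:es|ed)?|resolve[sd]?)\s+#(\d+)",
        pr_body, flags=re.IGNORECASE)]

body = "## Summary\nAdd CSAT Gold model.\n\n## Linked Issue\nCloses #42\n"
print(closed_issues(body))  # → [42]
```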

Sprint Rhythm

Sprints are biweekly (2 weeks). The Sprint Board view filtered to the current iteration is the daily driver.

| Day | Activity |
| --- | --- |
| Monday | Sprint planning — review Refined column, pull issues into current iteration, assign owners |
| Daily | Async standup — update issue status, move between columns, flag blockers with a comment |
| Thursday | Mid-sprint check — any blocked items? Stale In Progress? Scope creep? |
| Friday | Sprint close — move unfinished items back to Refined, add retro notes |

Epic Creation Guide

When to create an epic

  • The initiative spans 3 or more issues
  • Work will take more than one sprint
  • Multiple people will contribute to a single goal

How to structure an epic

  1. Create the epic issue first with [Epic] prefix
  2. Break the work into child issues — each should be M effort (1-2 days) or smaller
  3. Add child issue numbers to the epic's task list: - [ ] #N Title
  4. Set a Milestone on both the epic and children (for external commitments)
  5. Keep the epic open until all task list items are checked

Child issue sizing rule

| Effort | Max duration | Action |
| --- | --- | --- |
| XS | < 2h | OK |
| S | Half day | OK |
| M | 1-2 days | OK — ideal size |
| L | 3-5 days | Try to split into 2 M issues |
| XL | > 1 week | Must split — too large to track |

Closing and Archiving Issues

Normal closure (automatic)

When a PR with Closes #N merges to main, GitHub automatically closes the issue and moves it to Done.

Manual closure (only when no work was done)

Close manually only when:

- Issue was rejected before work started
- Issue was cancelled or is out of scope
- Issue is a duplicate of another

Always add a comment explaining why before manually closing.

Stale issues

Issues in Backlog with no activity for 4+ weeks should be reviewed at sprint planning. Either:

- Refine and pull in, or
- Close with a reason comment


Quick Reference Card

| What you're creating | Use this type | Required body sections |
| --- | --- | --- |
| New pipeline, model, or feature | Feature | Goal, Acceptance Criteria, Background, Out of Scope |
| Something is broken | Bug | What's Broken, Expected/Actual, Steps to Reproduce, Impact |
| Research needed | Spike | Goal, Timebox, Questions, Output Format |
| Implementation work | Task | Goal, Acceptance Criteria |
| Tech debt / cleanup | Chore | Goal, Acceptance Criteria |
| ADR / guide update | Docs | Goal, Acceptance Criteria |
| Multi-sprint initiative | Epic | Goal, Scope, Child Issues |

See also:

- github-projects-workflow.md — Full workflow guide including Claude's role
- ADR 016 — Decision record for adopting these standards
- XO Data Ops Project Board