Open Source Project Standard

This standard defines the minimum professional baseline expected of any public repository: smaller than the foundation-style governance of large multi-vendor projects, larger than "we wrote a README and called it a day".

The goal is a repository that behaves like a reliable public institution: clear decision rights, explicit contribution rules, predictable releases, automated quality gates, documented security reporting, and visible maintainer responsiveness.

Baseline components

Required for every adopter:

Contributor rights: DCO vs CLA

The inbound-rights policy must be singular and easy to explain. Pick one:

- DCO (Developer Certificate of Origin): contributors certify they have the right to submit each change via a Signed-off-by trailer on the commit. Low friction, no paperwork.
- CLA (Contributor License Agreement): contributors sign an agreement before their first contribution lands. Heavier process, but it gives the project an explicit license grant.

Do not run both. Do not conflate DCO sign-off with cryptographic commit signing: sign-off is a rights certification, signing is an authenticity control. A serious project may use both, but they solve different problems.
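For reference, a DCO sign-off is just a trailer that `git commit -s` appends to the commit message (the identity shown here is a placeholder):

```text
fix: handle empty config file

Signed-off-by: Jane Doe <jane@example.com>
```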

Issues vs Discussions

Enable Discussions from day one. Without a separate venue for questions, the issue tracker becomes a support forum within weeks. A SUPPORT.md file at the repo root makes the split explicit for contributors.
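A minimal SUPPORT.md sketch that makes the routing explicit (adapt the wording to your project):

```markdown
# Support

- Questions and ideas → GitHub Discussions
- Confirmed bugs and feature requests → GitHub Issues
- Security reports → see SECURITY.md (never the public tracker)
```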

Governance, sized to the project

Skip foundation-style governance unless contributor volume justifies it. Use the smallest model that works, then grow:

| Scale | Model | What to document |
| --- | --- | --- |
| Solo maintainer | One person decides everything | CODEOWNERS (catch-all), inbound-rights policy in CONTRIBUTING.md. No GOVERNANCE.md needed yet. |
| 2–5 maintainers | Maintainer committee with role split | Add GOVERNANCE.md: who merges, who triages, who releases, escalation path. |
| 5+ maintainers, multiple areas | Committee + area owners | Path-scoped CODEOWNERS, dedicated reviewers per area, optional release-manager rotation. |
| Foundation-scale | SIGs / TSC / councils | Charters per group, ownership maps, explicit RFC process. Only adopt when contributor volume forces it. |
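A path-scoped CODEOWNERS for the multi-area row could look like this (handles and paths are placeholders; note that the last matching pattern wins):

```text
# Catch-all: the maintainer committee reviews everything by default
*            @example-org/maintainers

# Area owners take precedence for their paths
/docs/       @alice
/src/net/    @bob @carol
```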

Anti-pattern: copying the SIG/TSC structure of Kubernetes or Rust into a two-maintainer project. The research behind this standard repeatedly warns against premature governance.

Release engineering

Health metrics

Track responsiveness and operating discipline, not stars or downloads; those are the signals that tell potential adopters a project is healthy.

Review monthly at a fixed cadence. No tooling is mandated; GitHub’s built-in graphs (Pulse, Insights → Contributors, Traffic) plus the OpenSSF Scorecard cover the basics for free.
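One responsiveness metric worth computing at that monthly review is median time to first maintainer response. A minimal sketch, assuming you have already pulled ISO-8601 timestamps from the GitHub REST API (the sample data below is invented for illustration):

```python
from datetime import datetime
from statistics import median

def median_response_hours(pairs):
    """Median hours from issue creation to first maintainer response.

    `pairs` is a list of (created_at, first_response_at) ISO-8601 strings,
    in the format the GitHub REST API returns.
    """
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    deltas = [
        (datetime.strptime(resp, fmt) - datetime.strptime(created, fmt)).total_seconds() / 3600
        for created, resp in pairs
    ]
    return median(deltas)

# Hypothetical sample: three issues, first responses after 2h, 26h, and 4h.
sample = [
    ("2024-05-01T10:00:00Z", "2024-05-01T12:00:00Z"),
    ("2024-05-02T09:00:00Z", "2024-05-03T11:00:00Z"),
    ("2024-05-04T08:00:00Z", "2024-05-04T12:00:00Z"),
]
print(median_response_hours(sample))  # median of 2, 26, 4 -> 4.0
```

Median is preferred over mean here because one stale issue should not drown out otherwise fast triage.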

Trust signals worth adopting

In rough order of effort vs reward:

  1. OpenSSF Scorecard (already wired up in this repo) — automatic score, published badge.
  2. GitHub Security features: secret scanning, Dependabot alerts, optional code scanning. Mostly settings, not files.
  3. OpenSSF Best Practices Badge — application-based; aim for “Passing” early.
  4. Signed commits on protected branches — authenticity control orthogonal to DCO.
  5. SBOM + artifact attestations — only meaningful once you publish binaries or containers.
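Item 2 is mostly repository settings, but Dependabot version updates do take one file. A minimal sketch; the ecosystem and cadence are assumptions to adapt:

```yaml
# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
```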

AI collaboration readiness

A repository should expose explicit rules, state, routing, and quality criteria so AI-generated work remains reviewable and controllable by humans. See the Human-AI Collaboration baseline and AGENTS.md for what that looks like in practice.
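What that can look like, as a minimal AGENTS.md sketch (the build command, paths, and rules below are placeholders, not part of this standard):

```markdown
# AGENTS.md

- Build/test: `make test` must pass before any change is proposed.
- Scope: agents may edit `src/` and `docs/`, never release artifacts.
- Quality gate: all AI-generated changes go through normal human review.
```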

What this standard does NOT mandate

These are common recommendations elsewhere, deliberately not required here, because they tend to be cargo-culted before they are useful.

Adopt them when they earn their place, not before.


Source: docs/open-source-project-standard.md