Methodology
The Concordance Incident Index is a public, CC BY 4.0-licensed dataset of major, publicly documented software incidents, each mapped to the Concordance engineering protocols that the affected company's own published root-cause analysis identifies as having failed.
Inclusion criteria
An incident is eligible for inclusion if it meets at least one of the following:
- SEC 8-K filing — the affected company filed a material cybersecurity incident disclosure.
- Published RCA / post-mortem — the affected company published an official root cause analysis with named root causes.
- Major impact — caused ≥$10M in reported economic impact, OR affected ≥10M users, OR caused a ≥4-hour outage of a major service.
- Regulatory action — triggered enforcement action by a regulator (e.g., the FTC, a state attorney general, an EU authority, or a sector-specific regulator).
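The criteria above can be sketched as a single predicate. This is an illustrative sketch only: the field names (sec_8k_filed, outage_hours, etc.) are hypothetical and not the Index's actual schema.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    # All field names are hypothetical illustrations of the criteria above.
    sec_8k_filed: bool = False        # SEC 8-K material-incident disclosure
    has_published_rca: bool = False   # official RCA / post-mortem with named root causes
    economic_impact_usd: float = 0.0
    users_affected: int = 0
    outage_hours: float = 0.0
    regulatory_action: bool = False

def is_eligible(i: Incident) -> bool:
    # "Major impact" is any one of the three thresholds.
    major_impact = (
        i.economic_impact_usd >= 10_000_000
        or i.users_affected >= 10_000_000
        or i.outage_hours >= 4
    )
    # An incident qualifies if it meets at least one criterion.
    return (i.sec_8k_filed or i.has_published_rca
            or major_impact or i.regulatory_action)
```

For example, a five-hour outage of a major service qualifies on the impact criterion alone, even with no SEC filing or published RCA.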
Sources
Every entry in the Index links to one or more primary sources — the affected company's own RCA, an SEC filing, a CISA or NIST advisory, or first-hand reporting from established outlets. The factual claims (what happened, when, with what impact, and what the root cause was) trace to those sources. Concordance does not make independent factual assertions about incidents.
Protocol mapping
The mapping from a published root cause to one or more of the 50 Concordance engineering protocols is editorial. It follows one principle: if the root cause says X failed, and X falls within the definition of Concordance protocol Y, we map X → Y. The mapping is shown alongside the source RCA so readers can audit the editorial choice. Disagreements with a specific mapping are welcome (see Errata below).
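As a rough sketch, an entry pairs the sourced root cause with its editorial protocol mapping. The shape and names below are hypothetical illustrations, not the Index's actual schema; the protocol identifier is invented.

```python
# Hypothetical shape of one Index entry. Every name here is an
# illustrative assumption; the real schema may differ.
entry = {
    "slug": "example-incident-2024",
    # Factual claim, traced to a primary source:
    "root_cause": "Expired TLS certificate on an internal service",
    "sources": ["https://example.com/official-rca"],
    # Editorial mapping (X -> Y), shown so readers can audit it.
    # "certificate-lifecycle" is an invented protocol identifier:
    "protocols": ["certificate-lifecycle"],
    "review_status": "draft",
}
```

Keeping the sourced claim and the editorial mapping in separate fields is what makes the mapping auditable: a reader can accept the first while contesting the second.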
Review status
Each entry carries a review status:
- Draft — entry has been written from primary sources but has not been reviewed by a second editor.
- Reviewed — entry has been read end-to-end by an editor independent of the original drafter.
- Verified — the affected company or a recognised independent expert has confirmed the mapping.
The Index launches with most entries at Draft status. Status is shown publicly on every entry so readers can weigh each entry accordingly. Errata reports (below) feed into the review pipeline.
Errata and corrections
To report a factual error, suggest a missing source, or contest a protocol mapping, email hello@concordancelabs.com with the entry slug and the proposed correction. Acknowledged corrections are credited in the entry's history.
License
The Index dataset and per-entry content are licensed under Creative Commons Attribution 4.0 International (CC BY 4.0). You may use, share, adapt, and republish — including commercially — provided you credit "Concordance Incident Index" with a link to getconcordance.com/labs/incidents.
Programmatic access
The Index is served as JSON at /api/incidents. No authentication is required, and reads are not rate-limited. Filter parameters: protocol, industry, severity, ai, and since.
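A minimal sketch of building a filtered request, assuming the API is served from the getconcordance.com host linked above. The example parameter values are assumptions — accepted values for each filter are not documented here.

```python
from urllib.parse import urlencode

# Host assumed from the site link in the License section.
BASE = "https://getconcordance.com/api/incidents"

# Illustrative filter values; the accepted formats are an assumption.
params = {
    "protocol": "incident-response",  # hypothetical protocol identifier
    "ai": "true",
    "since": "2024-01-01",
}
url = f"{BASE}?{urlencode(params)}"

# A plain GET of this URL returns the filtered JSON (no auth needed), e.g.:
#   import urllib.request, json
#   data = json.load(urllib.request.urlopen(url))
```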
Roadmap
v0.1 launches with a curated foundation of headline-tier incidents (~10 at first, growing to ~250 over the following weeks). NSF SBIR Phase I research scales the methodology to 2,000–5,000 public-repo analyses to test four falsifiable hypotheses about AI's impact on engineering practice quality. Read the thesis.