How the station works
Every dispatch is a cached, revisioned, hash-verified object built from primary public sources. Here's the pipeline.
Pipeline
- Every fifteen minutes, a GitHub Actions workflow polls each upstream feed and stores the raw payload with a SHA-256 content hash.
- Payloads are normalised into the shared `Dispatch` schema. New dispatches are created; existing dispatches bump `revision` when material data changes.
- Lifecycle rules move dispatches between `active`, `resolving`, and `archived`.
- The static site rebuilds and deploys. A tiny client-side overlay polls `/api/live/*` to patch fresher values on top of the snapshot for time-critical fields (wind, pressure, advisory number, felt count).
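The ingest step above can be sketched as follows. The `StoredPayload` shape and `ingest` helper are hypothetical illustrations, not the actual workflow code; only the SHA-256 content-hash idea comes from the pipeline description.

```typescript
import { createHash } from "node:crypto";

// Hash the raw payload so unchanged upstream responses can be skipped.
function contentHash(payload: string): string {
  return createHash("sha256").update(payload, "utf8").digest("hex");
}

// Hypothetical stored record shape for a polled feed payload.
interface StoredPayload {
  feed: string;
  hash: string;
  fetchedAt: string;
  body: string;
}

// Store a new payload only when its content hash differs from the last one.
function ingest(feed: string, body: string, previous?: StoredPayload): StoredPayload | null {
  const hash = contentHash(body);
  if (previous && previous.hash === hash) return null; // no material change
  return { feed, hash, fetchedAt: new Date().toISOString(), body };
}
```

Keying on the content hash rather than the fetch time means a feed that returns identical bytes every fifteen minutes produces no new records.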
Severity scales
Every dispatch keeps its native scale verbatim — Saffir-Simpson for tropical, moment magnitude (Mw) for quakes, CAP severity for NWS warnings. A normalised 0-100 score is computed alongside so cross-hazard comparisons on the Desk are meaningful.
Severity guides per class live on the dispatch pages themselves. See e.g. the sidebar on any earthquake dispatch.
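A minimal sketch of how a native scale might map onto the shared 0-100 band. The breakpoints below are assumptions for illustration; the real mapping lives in `src/lib/severity.ts` and differs.

```typescript
// Illustrative only: assumed breakpoints, not the values in src/lib/severity.ts.
// Each native scale is mapped linearly onto 0..100 and clamped.
function normalisedSeverity(hazard: "quake" | "tropical", value: number): number {
  switch (hazard) {
    case "quake":
      // Moment magnitude Mw, assumed range 3..9.
      return Math.min(100, Math.max(0, ((value - 3) / 6) * 100));
    case "tropical":
      // Saffir-Simpson category, assumed range 0 (tropical storm) .. 5.
      return Math.min(100, Math.max(0, (value / 5) * 100));
  }
}
```

The native value is always kept verbatim on the dispatch; the normalised score exists only so the Desk can sort unlike hazards against each other.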
Lifecycle
- Severe weather: `active` until expiry · `resolving` for 24 hours · then `archived`.
- Earthquake: `active` for 1 hour · `resolving` for 7 days · then `archived`.
- Tropical: `active` until final advisory · `resolving` for 14 days · then `archived`. A post-tropical system that reintensifies can return to `active`.
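The resolving windows above can be expressed as a small lookup. This is a hypothetical helper, not the site's actual state machine (which also handles reintensification back to `active`).

```typescript
type Phase = "active" | "resolving" | "archived";

// Resolving windows per hazard class, in hours, from the lifecycle rules above.
const RESOLVING_HOURS: Record<string, number> = {
  "severe-weather": 24,
  earthquake: 7 * 24,
  tropical: 14 * 24,
};

// Given when a dispatch left the active phase, compute its current phase.
function phaseFor(hazard: string, resolvedAt: Date, now: Date): Phase {
  const hours = (now.getTime() - resolvedAt.getTime()) / 3_600_000;
  return hours < RESOLVING_HOURS[hazard] ? "resolving" : "archived";
}
```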
Editorial rules
Dispatch briefs are composed by a deterministic rule bank (`src/lib/editorial.ts`), with no LLM and no external calls. The engine picks the first rule that fires and leads with its fragment, then appends the cleaned upstream summary. Rules cover things like "fourth warning today in this county," "largest quake on the board," "tropical system reintensifying," "tornado emergency," and "felt by 500+ people." Rule names are exposed in `briefExplained()` for downstream feeds.
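The first-match composition can be sketched like this. The `Rule` shape, the two sample rules, and `composeBrief` are hypothetical stand-ins; the real bank in `src/lib/editorial.ts` has its own shapes and many more rules.

```typescript
// Hypothetical rule shape; the real bank in src/lib/editorial.ts differs.
interface Rule {
  name: string;
  fires: (d: { feltCount?: number; warningsToday?: number }) => boolean;
  fragment: string;
}

const RULES: Rule[] = [
  { name: "widely-felt", fires: d => (d.feltCount ?? 0) >= 500, fragment: "Felt by 500+ people." },
  { name: "warning-cluster", fires: d => (d.warningsToday ?? 0) >= 4, fragment: "Fourth warning today in this county." },
];

// The first rule that fires leads the brief; the cleaned upstream summary follows.
// The winning rule's name is returned so it can surface in downstream feeds.
function composeBrief(d: { feltCount?: number; warningsToday?: number }, summary: string) {
  const rule = RULES.find(r => r.fires(d));
  return { brief: rule ? `${rule.fragment} ${summary}` : summary, rule: rule?.name };
}
```

Because the bank is ordered and deterministic, the same dispatch data always yields the same brief, which keeps revisions diffable.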
Immutability & URLs
Archived dispatches are immutable. The URL `/events/{id}/` is durable: it never 404s, never redirects, never reshuffles. Post-archival corrections arrive as new linked records via `supersedes` rather than edits.
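One way to picture the `supersedes` link: a correction is a new record pointing back at the archived original, so readers can walk forward to the latest version. The record shape and `correctionChain` helper below are hypothetical illustrations of that design, not the site's actual code.

```typescript
// Hypothetical record shape: corrections point back via `supersedes`;
// the archived original is never edited.
interface DispatchRecord {
  id: string;
  supersedes?: string; // id of the archived dispatch this record corrects
}

// Walk forward from an archived id through its chain of corrections.
function correctionChain(records: DispatchRecord[], id: string): string[] {
  const chain = [id];
  let next = records.find(r => r.supersedes === chain[chain.length - 1]);
  while (next) {
    chain.push(next.id);
    next = records.find(r => r.supersedes === chain[chain.length - 1]);
  }
  return chain;
}
```

Append-only corrections mean an archived URL can keep serving exactly what it served on day one while still pointing at newer information.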
Known limitations
- Our data is not authoritative. Always click through to the source when making a decision.
- Live overlay freshness depends on upstream availability and our Cloudflare edge cache TTL (60 to 90 seconds).
- Population-exposure estimates used in severity normalisation are approximations; see the constants in `src/lib/severity.ts`.