The editorial pipeline

How a published study moves from the literature into a scored SupplementScore entry. Last updated 2026-05-01.

The short version. A study only ever influences a score after it has been independently re-read by a second editor, classified by funder type, mapped to its supplement, and time-stamped. Nothing is auto-ingested.

From study to score, end to end

1. Discovery

Weekly · automated alerts

Each Monday, a literature scan runs against PubMed, the Cochrane Library, ClinicalTrials.gov, bioRxiv, and the EFSA/FDA bulletin feeds. The query set is keyed to the 734 supplements currently in the database, plus a watch list of 60 emerging compounds.

The scan returns roughly 250–400 candidate items per week. About 80% are filtered out at this stage: animal-only studies, single-case reports, conference abstracts without full data, and articles in non-indexed journals.
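The first-pass exclusions above can be sketched as a single predicate. This is a minimal illustration, not the real scan code; the field names (`study_type`, `has_full_data`, `journal_indexed`) are assumed for the example.

```python
# Illustrative first-pass filter for weekly scan candidates.
# Field names are assumptions, not the real feed schema.
EXCLUDED_TYPES = {"animal-only", "single-case-report"}

def passes_discovery_filter(item: dict) -> bool:
    """Return True if a candidate item survives the ~80% first-pass cull."""
    if item.get("study_type") in EXCLUDED_TYPES:
        return False
    # Conference abstracts are kept only when full data accompanies them.
    if item.get("study_type") == "conference-abstract" and not item.get("has_full_data"):
        return False
    # Articles in non-indexed journals are dropped.
    if not item.get("journal_indexed", False):
        return False
    return True
```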

2. Triage

Same week · editor of the day

An editor reads each surviving abstract and classifies it as: pivotal (changes a score), confirmatory (reinforces an existing score), null (no change but worth citing), retraction-watch, or signal-only (interesting but not yet actionable).

Pivotal items are routed for full-text review. Everything else is filed against the relevant supplement entry as a citation, with the classification logged.
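The five triage labels and the routing rule can be captured in a few lines. A minimal sketch; the enum and the route strings are illustrative, not the editorial tooling's actual identifiers.

```python
from enum import Enum

class Triage(Enum):
    PIVOTAL = "pivotal"                    # changes a score
    CONFIRMATORY = "confirmatory"          # reinforces an existing score
    NULL = "null"                          # no change, but worth citing
    RETRACTION_WATCH = "retraction-watch"
    SIGNAL_ONLY = "signal-only"            # interesting, not yet actionable

def route(classification: Triage) -> str:
    """Pivotal items go to full-text review; everything else is
    filed against the supplement entry as a citation."""
    return "full-text-review" if classification is Triage.PIVOTAL else "file-as-citation"
```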

3. Full-text review

Within 7 days of triage · single reviewer

The full text is read alongside the protocol (where available). The reviewer extracts: study design, sample size, blinding, primary endpoint, effect size with confidence interval, dropout, and funder. They write a one-paragraph extraction that lives in the citation record.

Funder type is set explicitly: public, nonprofit, industry-sponsored, mixed, or undisclosed. Tier-1 supplements require a public or nonprofit pivotal study; industry-only evidence keeps a supplement at tier 2 even if the effect size is large.
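The extraction fields and the funder rule translate naturally into a record plus a tier cap. A sketch under assumed names; only the field list, the five funder types, and the industry-only cap come from the text above.

```python
from dataclasses import dataclass

FUNDER_TYPES = {"public", "nonprofit", "industry-sponsored", "mixed", "undisclosed"}

@dataclass
class Extraction:
    """One reviewer's extraction, stored in the citation record."""
    design: str
    sample_size: int
    blinded: bool
    primary_endpoint: str
    effect_size: float
    ci: tuple          # (low, high) confidence interval
    dropout_rate: float
    funder: str        # one of FUNDER_TYPES
    summary: str       # the one-paragraph extraction

def max_tier(pivotal_funders: set) -> int:
    """Tier 1 requires a public or nonprofit pivotal study;
    industry-only evidence caps the supplement at tier 2."""
    return 1 if pivotal_funders & {"public", "nonprofit"} else 2
```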

4. Independent second-read

Within 7 days of step 3 · different reviewer

A second editor reads the same paper without seeing the first reviewer's extraction. They produce their own one-paragraph summary and effect-size estimate.

If the two summaries diverge meaningfully on direction, magnitude, or tier implication, both reviewers meet (async, in writing) and either reach consensus or escalate to a third reviewer. Disagreements are logged so we can audit our own reliability over time.
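One way to operationalize "diverge meaningfully on direction or magnitude" for the two effect-size estimates. The 25% relative threshold is an illustrative assumption, not a documented cutoff.

```python
def diverges(e1: float, e2: float, rel_tol: float = 0.25) -> bool:
    """True if two independent effect-size estimates disagree on
    direction, or differ by more than rel_tol of the larger magnitude.
    rel_tol=0.25 is an assumed threshold for illustration."""
    if (e1 > 0) != (e2 > 0):
        return True  # direction disagreement
    denom = max(abs(e1), abs(e2))
    if denom == 0:
        return False  # both estimates are zero
    return abs(e1 - e2) / denom > rel_tol
```

A `True` result would trigger the written async meeting and, failing consensus, escalation to a third reviewer.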

5. Score recalculation

Within 24 hours of consensus

The reviewer applies the formula e×7 + s×4 + r×3 + o×2 + c×2 + d×2 with the new evidence input. The composite score (0–100) and the tier (T1 ≥72, T2 60–71, T3 40–59, T4 <40) are recomputed.
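The weights sum to 20, so the 0–100 range implies each input is scored 0–5; that scale is an inference from the stated maximum, not an explicit statement in the text. A minimal sketch of the recalculation:

```python
# Documented weights: e*7 + s*4 + r*3 + o*2 + c*2 + d*2 (sum of weights = 20).
WEIGHTS = {"e": 7, "s": 4, "r": 3, "o": 2, "c": 2, "d": 2}

def composite_score(inputs: dict) -> int:
    """Assumes each input is on a 0-5 scale, giving a 0-100 composite."""
    return sum(WEIGHTS[k] * inputs[k] for k in WEIGHTS)

def tier(score: int) -> int:
    """Documented cutoffs: T1 >= 72, T2 60-71, T3 40-59, T4 < 40."""
    if score >= 72:
        return 1
    if score >= 60:
        return 2
    if score >= 40:
        return 3
    return 4
```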

Any movement of more than 4 points or any tier change writes a row to _archive/score-changes.csv with the citation, the reviewers, and a one-sentence rationale.
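The audit-trail condition above can be sketched as a guard around a CSV append. The column order and function name are assumptions; only the trigger (more than 4 points or a tier change) and the destination file come from the text.

```python
import csv

def _tier(s: int) -> int:
    return 1 if s >= 72 else 2 if s >= 60 else 3 if s >= 40 else 4

def log_if_notable(old_score, new_score, citation, reviewers, rationale, writer):
    """Append a row (intended for _archive/score-changes.csv) when the
    score moves more than 4 points or the tier changes. `writer` is a
    csv.writer; column order is illustrative."""
    if abs(new_score - old_score) > 4 or _tier(new_score) != _tier(old_score):
        writer.writerow([citation, reviewers, rationale, old_score, new_score])
        return True
    return False
```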

6. Tier-1 gate (if applicable)

Same day · automated

Any supplement promoted to tier 1 must satisfy the citation gate: at least one pivotal study with a public or nonprofit funder, indexed by PubMed, with a PMID stored in the entry. The gate runs as a CI check (scripts/tier1_gate.py) on every deploy. A failure blocks the deploy.
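The gate is a simple existence check over an entry's citations. This is a sketch of the logic, not the contents of scripts/tier1_gate.py; the entry shape is assumed.

```python
def tier1_gate(entry: dict) -> bool:
    """Sketch of the citation gate: a tier-1 entry must cite at least
    one pivotal study with a public or nonprofit funder and a stored
    PMID. Entry/citation field names are illustrative."""
    if entry.get("tier") != 1:
        return True  # the gate only constrains tier-1 entries
    return any(
        c.get("classification") == "pivotal"
        and c.get("funder") in {"public", "nonprofit"}
        and c.get("pmid")
        for c in entry.get("citations", [])
    )
```

In CI, a `False` result would fail the check and block the deploy.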

7. Publication

Next deploy · usually within 48 hours

The change is committed, the score updates, and the supplement's "last reviewed" date is bumped. The citation appears in the bibliography and (where the study is open-access) in the supplement's per-page references.

For pivotal updates, an entry is added to the homepage "What changed this week" feed and surfaced in the "New this week" callout for seven days.

8. Reader feedback loop

Continuous

Every supplement page carries a "Report inaccuracy" link. Submissions land in our editorial inbox, are triaged within 3 business days, and re-enter at step 3 (full-text review) when a reader provides a citation we haven't already considered.

What we won't do. We will not promote a supplement on the basis of mechanistic plausibility, animal data, or industry-only trials. We will not de-list a supplement on the basis of a single null study without first looking at the broader literature. And we never accept editorial input from manufacturers, distributors, or retailers — see the funder policy for the full statement.

How long does the cycle take?

Median time from a study being indexed in PubMed to a score change going live on the site: 21 days. Faster for high-priority safety signals (target: 72 hours), slower for marginal-effect studies that get parked in the watch list.

What about retractions?

A monthly retraction-watch task runs against every PMID we cite (scripts/retraction_watch.py). Any retracted study is removed from the citation set within one deploy cycle, and any score that depended on it is recalculated. The retraction itself is logged in the supplement's history so the change is visible to readers, not silently rewritten.
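The monthly sweep reduces to an intersection between cited PMIDs and known retractions, grouped by supplement. A sketch only; how scripts/retraction_watch.py actually obtains the retracted-PMID set is not shown here, and the data shapes are assumed.

```python
def sweep_retractions(citations: dict, retracted: set):
    """citations maps supplement name -> set of cited PMIDs (assumed shape).
    Returns (pmids_to_remove, supplements_to_recalculate)."""
    all_cited = set().union(*citations.values()) if citations else set()
    pmids_to_remove = all_cited & retracted
    supplements_to_recalculate = {
        name for name, pmids in citations.items() if pmids & retracted
    }
    return pmids_to_remove, supplements_to_recalculate
```

Each affected supplement would then have its score recomputed and the retraction logged in its visible history.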
