Methodology

Fact Check Categorization

  • True
    Evidence from credible, preferably primary sources supports the statement as accurate.
    For example: the Federal Register shows a final rule published on the stated date; Treasury/agency disbursement data matches the amounts and recipients claimed.
  • Close
    Not 100% exact, but close enough that a reasonable person would accept it.
    For example, a leader may claim "GDP has increased by 2.3%" when actual growth was 2.25%. This category catches good-faith inaccuracies, where a figure is simplified for brevity or mildly exaggerated (see the closeness sketch after this list).
  • Misleading
    Technically correct but framed in a misleading way.
    For example: touting a "record funding increase" in nominal dollars while inflation-adjusted funding fell; citing a selective timeframe to claim "record-low unemployment" while omitting a recent uptick.
  • Unverifiable
    Not verifiable or falsifiable (e.g., opinion, intent, or unfalsifiable claims).
    For example: statements of intent ("we will hold bad actors accountable") without measurable criteria; subjective claims ("the best program") without objective standards.
  • Unclear
    Evidence is incomplete or developing; future updates may clarify.
    For example: an investigation is announced but no report exists yet; a draft rule is proposed and the final policy scope is still undetermined.
  • False
    Credible evidence contradicts the statement.
    For example: a claim says a program ended, but the agency’s official site still lists current enrollment; a statement says a law was struck down, but court dockets show no such ruling.
  • Tech Error
    Verification couldn’t be completed due to technical access/rendering issues.
    For example: the primary source website is down or heavily rate-limited; the official PDF is corrupted or requires login and an alternate source is pending.
    We typically try to revisit items classified as Tech Error, though it's not always possible to resolve them.
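
For readers who work with our data programmatically, the categorization maps naturally onto an enumeration. The sketch below is illustrative only; the Verdict name and string values are assumptions for this example, not a published schema.

```python
from enum import Enum

class Verdict(Enum):
    """Fact-check verdicts, ordered roughly from supported to unresolved."""
    TRUE = "true"                  # credible, preferably primary, sources support the claim
    CLOSE = "close"                # inexact, but within good-faith tolerance
    MISLEADING = "misleading"      # technically correct, deceptively framed
    UNVERIFIABLE = "unverifiable"  # opinion, intent, or otherwise unfalsifiable
    UNCLEAR = "unclear"            # evidence incomplete or still developing
    FALSE = "false"                # credible evidence contradicts the claim
    TECH_ERROR = "tech_error"      # verification blocked by technical access issues
```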
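
The "Close" category implies a tolerance judgment. As a rough illustration, a relative-error check like the one below can approximate it; the is_close helper and its 5% threshold are assumptions for this sketch, since in practice the call is editorial rather than a fixed cutoff.

```python
def is_close(claimed: float, actual: float, tolerance: float = 0.05) -> bool:
    """Return True when a claimed figure is within a relative tolerance of
    the actual figure. The 5% default is an illustrative assumption, not a
    rule we apply mechanically."""
    if actual == 0:
        return claimed == 0
    return abs(claimed - actual) / abs(actual) <= tolerance

# The GDP example above: claimed 2.3% vs. actual 2.25%.
# Relative error is |2.3 - 2.25| / 2.25 ≈ 0.022, within the 5% band.
assert is_close(2.3, 2.25)
```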