Uncertainty-Aware Web-Conditioned Scientific Fact-Checking

Atomic scientific fact-checking with uncertainty-gated web corroboration

Ashwin Vinod, Katrin Erk

The University of Texas at Austin / University of Massachusetts Amherst

Figure: Overview of the Atomic+Search pipeline.

Abstract

Scientific fact-checking is vital for assessing claims in specialized domains such as biomedicine and materials science, yet existing systems often hallucinate or apply inconsistent reasoning when verifying technical claims against a provided evidence snippet. We present a modular pipeline centered on atomic predicate-argument decomposition and calibrated, uncertainty-gated corroboration.

Claims are decomposed into atomic facts, aligned to local snippets, verified by a compact evidence-grounded checker, and only uncertain facts trigger domain-restricted web search over authoritative sources. The system supports both binary and tri-valued classification and abstains with NEI when retrieved evidence conflicts with the provided context.

Introduction

Scientific claim verification needs to be both accurate and conservative. Sentence-level verifiers often miss scope, negation, and quantitative details, while unrestricted search can increase cost and make evidence provenance harder to control. This paper focuses on the single-document setting, where one evidence artifact is primary and external corroboration should be selective rather than default.

The core idea is to verify at atomic granularity. Instead of judging the whole claim at once, the system extracts short predicate-argument facts, scores each one against a local snippet, and only escalates uncertain cases to authoritative web sources such as PubMed, NIH, WHO, CDC, FDA, and ClinicalTrials.gov.
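Escalation to authoritative sources can be implemented with a domain-restricted query. The sketch below is illustrative, not the paper's exact implementation: the allow-list mirrors the sources named above, and the `site:` filter syntax is an assumption about the underlying search backend.

```python
# Hypothetical sketch: restrict web corroboration to authoritative domains
# by appending site: filters to the query for an uncertain atomic fact.
AUTHORITATIVE_DOMAINS = [
    "pubmed.ncbi.nlm.nih.gov",
    "nih.gov",
    "who.int",
    "cdc.gov",
    "fda.gov",
    "clinicaltrials.gov",
]

def restricted_query(fact: str) -> str:
    """Build a search query that only matches the allow-listed domains."""
    site_filter = " OR ".join(f"site:{d}" for d in AUTHORITATIVE_DOMAINS)
    return f"{fact} ({site_filter})"

query = restricted_query("metformin lowers HbA1c in type 2 diabetes")
```

Keeping the allow-list explicit is what makes evidence provenance controllable: every retrieved document is traceable to one of a handful of vetted hosts.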

Methodology

The pipeline proceeds in five steps:

  1. Atomic decomposition: break the claim into predicate-argument facts of at most 25 words.
  2. Snippet selection: align each fact to the most relevant local evidence window.
  3. Grounded verification: score fact-snippet pairs with MiniCheck-7B.
  4. Uncertainty-aware search: only facts in the uncertainty band trigger web corroboration.
  5. Claim-level judgment: aggregate supported and refuted facts into Supported, Refuted, or NEI.
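
The five steps above can be sketched as a single control flow. This is a minimal sketch under stated assumptions: the `decompose`, `align`, `score`, and `corroborate` helpers are hypothetical stand-ins (e.g. `score` would wrap a MiniCheck-7B call), and the uncertainty band `[0.4, 0.6]` is an illustrative threshold, not the paper's tuned value.

```python
from typing import Callable, List, Tuple

def verify_claim(
    claim: str,
    context: str,
    decompose: Callable[[str], List[str]],
    align: Callable[[str, str], str],
    score: Callable[[str, str], float],   # P(supported | fact, snippet)
    corroborate: Callable[[str], float],  # re-score with web evidence
    band: Tuple[float, float] = (0.4, 0.6),
) -> str:
    probs = []
    for fact in decompose(claim):          # 1. atomic decomposition
        snippet = align(fact, context)     # 2. snippet selection
        p = score(fact, snippet)           # 3. grounded verification
        if band[0] <= p <= band[1]:        # 4. uncertainty-gated search
            p = corroborate(fact)
        probs.append(p)
    # 5. claim-level judgment: all facts clearly supported -> Supported,
    #    any fact clearly refuted -> Refuted, otherwise abstain.
    if all(p > band[1] for p in probs):
        return "Supported"
    if any(p < band[0] for p in probs):
        return "Refuted"
    return "NEI"
```

The gating in step 4 is what keeps search selective: confidently supported or refuted facts never leave the local document.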

A key design choice is that retrieved evidence never overrides the provided context. If corroborating sources conflict with the original document, the system abstains with NEI.
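That priority rule can be stated as a small resolution function. The label names and the two-argument interface here are illustrative assumptions, not the paper's exact API; the point is only that web evidence may confirm the local verdict, never replace it.

```python
from typing import Optional

def resolve(local_label: str, web_label: Optional[str]) -> str:
    """Provided context stays primary; web evidence may only confirm it.

    Any disagreement between the local verdict and retrieved evidence
    triggers abstention rather than an override.
    """
    if web_label is None or web_label == local_label:
        return local_label
    return "NEI"  # conflict between context and retrieved evidence
```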

Results

The full Atomic+Search pipeline improves over sentence-level MiniCheck, tool-augmented LLM baselines, and recent retrieval-and-verification systems across biomedical and out-of-domain evaluations.

Main Results

| System | BIONLI Bal. Acc. | BIONLI Recall | BIONLI F1 | PubMedFact1k Macro-F1 | Climate Bal. Acc. | Climate Recall (S) | Climate F1 (S) |
|---|---|---|---|---|---|---|---|
| GPT-o1 | 65.8% | 60.2% | 64.9% | 71.2% | 69.7% | 56.1% | 65.05% |
| MiniCheck | 61.35% | 59.8% | 60.7% | - | 69.1% | 55.0% | 64.0% |
| GPT-5 Mini | 62.9% | 58.7% | 61.8% | 68.5% | 67.9% | 54.2% | 63.1% |
| Qwen 32B MAD | 62.5% | 59.1% | 61.3% | 61.8% | 67.3% | 53.6% | 62.6% |
| RARR | 66.4% | 62.3% | 65.3% | 72.3% | 70.4% | 57.8% | 66.3% |
| GPT-5 Mini + Search | 66.9% | 62.7% | 65.8% | 72.5% | 71.2% | 58.5% | 67.0% |
| Atomic+Search (ours) | 68.7% | 65.4% | 66.7% | 73.7% | 73.83% | 62.14% | 70.04% |

Ablation

| Variant | F1 | Delta vs. full |
|---|---|---|
| Atomic+Search (full) | 66.7% | baseline |
| No-Search | 62.0% | -4.7 |
| No-Atomic | 60.3% | -6.4 |
| MajorityVote (No-Judge) | 52.1% | -14.6 |

Figure: Per-fact support probabilities before and after web corroboration.

Key Contributions

Atomic fact decomposition: verify claims at a more local and interpretable unit than sentence-level checking.

Uncertainty-gated web corroboration: use external evidence selectively rather than by default.

Conservative three-way prediction: output Supported, Refuted, or NEI, with abstention under evidence conflict.

New dataset release: introduce PubMedFact1k for biomedical three-way scientific claim verification.

Conclusion

Atomic+Search combines atomic structure, calibrated verification, and selective corroboration to improve scientific fact-checking while keeping rationales and provenance explicit. The gains come from structure and decision policy, not from simply switching to a larger model.

BibTeX Citation

@inproceedings{vinod2026uncertainty,
  title={Uncertainty-Aware Web-Conditioned Scientific Fact-Checking},
  author={Vinod, Ashwin and Erk, Katrin},
  year={2026}
}