LEAP: an LLM-based evidence augmented pipeline for table-based fact verification

Hanwen Zhang, Qingyi Si, Peng Fu, Zheng Lin, Zhigang Lu, Weiping Wang

Research output: Chapter in Book / Conference Paper › Chapter › peer-review

Abstract

The reasoning ability of Large Language Models (LLMs) on Table-based Fact Verification (TFV) has been explored extensively. Existing TFV approaches fall broadly into two paradigms: prompting and fine-tuning. However, most prompting methods depend heavily on powerful closed-source LLMs, which raises data-breach concerns, while fine-tuning methods mainly generate end-to-end answers and often lack explainability. In this paper, we introduce LEAP, a three-stage pipeline framework for TFV. It provides a step-by-step solution through instruction tuning by decomposing the statement into sub-questions. It consists of three sub-modules: a sentence decomposer, a table-based question answerer, and a table-based evidence-augmented verifier. To specialize the sub-modules in their respective tasks, we construct reasoning evidence via data distillation and pseudo-labeling. Through diverse experiments on four TFV benchmarks, we demonstrate that LEAP achieves state-of-the-art performance; in particular, it surpasses human performance by 2.02% accuracy on Infotabs. As an effective method for scenarios that require data privacy, the LEAP framework exhibits strong generalization and versatility across different backbone models and datasets. Furthermore, our experiments validate the effectiveness of each sub-module, showing that each is suitable as a standalone model for its respective task.
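The three-stage pipeline described in the abstract can be sketched as plain orchestration code. This is a purely illustrative sketch: every function name and every piece of logic below is an assumption for demonstration, not the authors' implementation — in LEAP each stage is an instruction-tuned LLM sub-module, whereas here toy string heuristics stand in for the models.

```python
# Illustrative-only sketch of a decompose -> answer -> verify pipeline.
# All names and logic are hypothetical stand-ins for LEAP's LLM sub-modules.

def decompose(statement: str) -> list[str]:
    # Stage 1 (sentence decomposer): split a statement into sub-questions.
    # Toy stand-in: one sub-question per comma-separated clause.
    return [f"Is it true that {clause.strip()}?" for clause in statement.split(",")]

def answer(table: dict[str, str], question: str) -> str:
    # Stage 2 (table-based question answerer): retrieve evidence from the table.
    # Toy stand-in: return the cell whose column name appears in the question.
    for column, value in table.items():
        if column.lower() in question.lower():
            return value
    return "unknown"

def verify(statement: str, evidence: list[str]) -> str:
    # Stage 3 (evidence-augmented verifier): decide entailed vs. refuted.
    # Toy stand-in: entailed iff every evidence value occurs in the statement.
    ok = all(e != "unknown" and e in statement for e in evidence)
    return "entailed" if ok else "refuted"

def leap(table: dict[str, str], statement: str) -> str:
    sub_questions = decompose(statement)
    evidence = [answer(table, q) for q in sub_questions]
    return verify(statement, evidence)

table = {"capital": "Tokyo", "population": "125 million"}
print(leap(table, "the capital is Tokyo"))   # entailed (under the toy logic)
print(leap(table, "the capital is Kyoto"))   # refuted (evidence not in statement)
```

The point of the sketch is the data flow — sub-questions carry evidence from the table to the verifier — which is what gives the pipeline its explainability compared with an end-to-end answer.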

Original language: English
Title of host publication: Advanced Data Mining and Applications: 21st International Conference, ADMA 2025, Kyoto, Japan, October 22-24, 2025, Proceedings, Part I
Editors: Masatoshi Yoshikawa, Xiaofeng Meng, Yang Cao, Chuan Xiao, Weitong Chen, Yanda Wang
Place of Publication: Singapore
Publisher: Springer
Pages: 206-221
Number of pages: 16
ISBN (Electronic): 9789819534531
ISBN (Print): 9789819534524
DOIs
Publication status: Published - 2026
Event: ADMA (Conference), Kyoto, Japan
Duration: 22 Oct 2025 - 24 Oct 2025
Conference number: 21st

Publication series

Name: Lecture Notes in Computer Science
Volume: 16197 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: ADMA (Conference)
Country/Territory: Japan
City: Kyoto
Period: 22/10/25 - 24/10/25

Keywords

  • Decomposing
  • Instruction Tuning
  • Large Language Models
  • Table-based Fact Verification
