Abstract
The reasoning ability of Large Language Models (LLMs) on Table-based Fact Verification (TFV) has been explored extensively. Existing TFV approaches fall into two main paradigms: prompting and fine-tuning. However, most prompting methods depend heavily on powerful closed-source LLMs, which raises data-breach concerns, while fine-tuning methods mainly generate end-to-end answers and often lack explainability. In this paper, we introduce LEAP, a three-stage pipeline framework for TFV. It provides a step-by-step solution through instruction tuning by decomposing each statement into sub-questions. It consists of three sub-modules: a sentence decomposer, a table-based question answerer, and a table-based evidence-augmented verifier. To specialize the sub-modules in their respective tasks, we construct reasoning evidence through data distillation and pseudo-labeling. Through diverse experiments on four TFV benchmarks, we demonstrate that LEAP achieves state-of-the-art performance; in particular, it even surpasses human performance by 2.02% accuracy on Infotabs. As an effective method for scenarios that require data privacy, our LEAP framework exhibits strong generalization and versatility across different backbone models and datasets. Furthermore, our experiments validate the effectiveness of each sub-module, showing that each is suitable as a standalone model for its respective task.
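The decompose-answer-verify flow described in the abstract can be illustrated with a minimal sketch. Note that all function names and the toy rule-based logic below are hypothetical placeholders for LEAP's instruction-tuned sub-modules, which the abstract does not specify in code:

```python
# Illustrative three-stage pipeline in the spirit of LEAP:
# statement decomposition -> table-based QA -> evidence-augmented verification.
# The heuristics here are toy stand-ins for the paper's tuned LLM sub-modules.

def decompose(statement: str) -> list[str]:
    """Stage 1 (sentence decomposer): split a compound statement
    into simpler sub-questions (toy heuristic: split on ' and ')."""
    clauses = [c.strip() for c in statement.split(" and ")]
    return [f"Is it true that {c}?" for c in clauses]

def answer(table: list[list[str]], question: str) -> bool:
    """Stage 2 (table-based question answerer): a toy lookup that
    accepts a claim if its named entities and numbers appear as cells."""
    cells = {str(v).lower() for row in table for v in row}
    claim = question.removeprefix("Is it true that ").rstrip("?")
    keys = [t for t in claim.split() if t[0].isupper() or t.isdigit()]
    return all(k.lower() in cells for k in keys)

def verify(evidence: list[bool]) -> str:
    """Stage 3 (evidence-augmented verifier): aggregate the
    per-sub-question evidence into a final verification label."""
    return "ENTAILED" if all(evidence) else "REFUTED"

def leap_pipeline(table: list[list[str]], statement: str) -> str:
    questions = decompose(statement)
    evidence = [answer(table, q) for q in questions]
    return verify(evidence)

table = [["player", "goals"], ["Messi", "30"], ["Ronaldo", "25"]]
print(leap_pipeline(table, "Messi scored 30 and Ronaldo scored 25"))  # ENTAILED
print(leap_pipeline(table, "Messi scored 31"))                        # REFUTED
```

In the actual framework, each stage would be an instruction-tuned model rather than a heuristic; the value of the pipeline shape is that each sub-module's intermediate output (sub-questions, per-question answers) is inspectable, which is what gives the approach its explainability.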
| Original language | English |
|---|---|
| Title of host publication | Advanced Data Mining and Applications: 21st International Conference, ADMA 2025, Kyoto, Japan, October 22-24, 2025, Proceedings, Part I |
| Editors | Masatoshi Yoshikawa, Xiaofeng Meng, Yang Cao, Chuan Xiao, Weitong Chen, Yanda Wang |
| Place of Publication | Singapore |
| Publisher | Springer |
| Pages | 206-221 |
| Number of pages | 16 |
| ISBN (Electronic) | 9789819534531 |
| ISBN (Print) | 9789819534524 |
| DOIs | |
| Publication status | Published - 2026 |
| Event | 21st International Conference on Advanced Data Mining and Applications (ADMA 2025), Kyoto, Japan, 22 Oct 2025 → 24 Oct 2025 |
Publication series
| Name | Lecture Notes in Computer Science |
|---|---|
| Volume | 16197 LNCS |
| ISSN (Print) | 0302-9743 |
| ISSN (Electronic) | 1611-3349 |
Conference
| Conference | ADMA (Conference) |
|---|---|
| Country/Territory | Japan |
| City | Kyoto |
| Period | 22/10/25 → 24/10/25 |
Keywords
- Decomposing
- Instruction Tuning
- Large Language Models
- Table-based Fact Verification