Large Language Models are Complex Table Parsers

Cited: 0
Authors
Zhao, Bowen [1]
Ji, Changkai [2]
Zhang, Yuejie [1]
He, Wen [3]
Wang, Yingwen [3]
Wang, Qing [3]
Feng, Rui [1,2,3]
Zhang, Xiaobo [3]
Affiliations
[1] Fudan Univ, Sch Comp Sci, Shanghai Key Lab Intelligent Informat Proc, Shanghai 200433, Peoples R China
[2] Fudan Univ, Acad Engn & Technol, Shanghai, Peoples R China
[3] Fudan Univ, Natl Childrens Med Ctr, Childrens Hosp, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
With the Generative Pre-trained Transformer 3.5 (GPT-3.5) exhibiting remarkable reasoning and comprehension abilities in Natural Language Processing (NLP), most Question Answering (QA) research has centered on general GPT-based QA tasks, neglecting the specific challenges posed by Complex Table QA. In this paper, we propose to incorporate GPT-3.5 to address such challenges: complex tables are reconstructed into tuples, and specific prompt designs are employed for dialogues. Specifically, we encode each cell's hierarchical structure, position information, and content as a tuple. By enhancing the prompt template with an explanatory description of the meaning of each tuple and the logical reasoning process of the task, we effectively improve the hierarchical-structure awareness of GPT-3.5, enabling it to better parse complex tables. Extensive experiments on the Complex Table QA datasets HiTAB (open domain) and AIT-QA (aviation domain) show that our approach significantly outperforms previous work on both datasets, achieving state-of-the-art (SOTA) performance.
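The abstract's core idea of serializing each cell as a tuple of (hierarchy, position, content) can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' implementation: the function names (`table_to_tuples`, `tuples_to_prompt`) and the exact tuple layout are illustrative assumptions, showing only the general pattern of flattening a hierarchical table into tuples and prefixing an explanatory description for the GPT prompt.

```python
# Hypothetical sketch of the tuple encoding described in the abstract:
# each cell becomes (row-header path, column-header path, value), so the
# hierarchical structure and position survive the flattening.

def table_to_tuples(row_headers, col_headers, values):
    """Flatten a hierarchical table into cell tuples.

    row_headers: list of row-header paths, e.g. [("Revenue", "Domestic"), ...]
    col_headers: list of column-header paths, e.g. [("2023", "Q1"), ...]
    values:      2-D list; values[i][j] pairs with row_headers[i]
                 and col_headers[j].
    """
    cells = []
    for i, row_path in enumerate(row_headers):
        for j, col_path in enumerate(col_headers):
            cells.append((row_path, col_path, values[i][j]))
    return cells


def tuples_to_prompt(cell_tuples):
    """Render the tuples as prompt lines, led by an explanatory description
    of what each tuple means (mirroring the enhanced prompt template)."""
    lines = ["Each line is (row-header path | column-header path | value):"]
    for row_path, col_path, value in cell_tuples:
        lines.append(
            f"({' > '.join(row_path)} | {' > '.join(col_path)} | {value})"
        )
    return "\n".join(lines)


if __name__ == "__main__":
    rows = [("Revenue", "Domestic"), ("Revenue", "Overseas")]
    cols = [("2023", "Q1"), ("2023", "Q2")]
    vals = [[10, 12], [7, 9]]
    print(tuples_to_prompt(table_to_tuples(rows, cols, vals)))
```

The ` > `-joined header paths are one plausible way to keep multi-level headers readable inside a flat prompt; the paper's actual serialization may differ.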
Pages: 14786-14802
Page count: 17