Transformer-based highlights extraction from scientific papers

Cited by: 6
Authors
La Quatra, Moreno [1 ]
Cagliero, Luca [1 ]
Affiliations
[1] Politecn Torino, Dipartimento Automat & Informat, Corso Duca Abruzzi 24, I-10129 Turin, Italy
Keywords
Highlights extraction; Transformer model; Extractive summarization
DOI
10.1016/j.knosys.2022.109382
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Highlights are short sentences used to annotate scientific papers. They complement the abstract by conveying the main findings. To automate the process of paper annotation, highlights extraction aims at selecting three to five sentences from a paper via supervised learning. Existing approaches rely on ad hoc linguistic features, which depend on the analyzed context, and apply recurrent neural networks, which are not effective in learning long-range text dependencies. This paper leverages the attention mechanism adopted in transformer models to improve the accuracy of sentence relevance estimation. Unlike existing approaches, it relies on the end-to-end training of a deep regression model. To attend to patterns relevant to highlights content, it also enriches sentence encodings with a section-level contextualization. The experimental results, achieved on three different benchmark datasets, show that the designed architecture achieves significant performance improvements over the state of the art. (c) 2022 Published by Elsevier B.V.
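The pipeline described in the abstract (transformer sentence encodings enriched with section-level context, scored end-to-end by a regression head, with the top-ranked sentences kept as highlights) can be sketched as follows. This is a minimal illustration, not the authors' released implementation: the bert-base-uncased backbone, the [CLS] pooling, the pairing of each sentence with its section heading, and the SentenceRelevanceRegressor / score_sentences names are assumptions made for the example.

```python
# Illustrative sketch (not the paper's official code): score paper sentences for
# highlight relevance with a transformer encoder plus a small regression head.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumed backbone for this sketch


class SentenceRelevanceRegressor(nn.Module):
    def __init__(self, model_name: str = MODEL_NAME):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Regression head mapping the pooled sentence encoding to a relevance score.
        self.head = nn.Sequential(nn.Linear(hidden, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]       # [CLS] token as sentence representation
        return self.head(cls).squeeze(-1)       # one relevance score per sentence


def score_sentences(sentences, sections, model, tokenizer, device="cpu"):
    """Pair each sentence with its section context (sentence as text A, section
    heading as text B) and return one relevance score per sentence."""
    enc = tokenizer(sentences, sections, padding=True, truncation=True,
                    max_length=256, return_tensors="pt").to(device)
    with torch.no_grad():
        return model(enc["input_ids"], enc["attention_mask"]).cpu().tolist()


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = SentenceRelevanceRegressor().eval()
    sents = ["We propose a transformer-based highlights extractor.",
             "Related work on extractive summarization is reviewed."]
    sects = ["Introduction", "Related work"]
    scores = score_sentences(sents, sects, model, tokenizer)
    # Rank sentences and keep the top 3-5 as candidate highlights.
    ranked = sorted(zip(sents, scores), key=lambda x: -x[1])
    print(ranked[:3])
```

In practice the regression head would be trained end-to-end against reference relevance labels (e.g., overlap between each sentence and the author-written highlights), which is what makes the model a deep regressor rather than a feature-based ranker.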
Pages: 9
Related papers
50 records in total
  • [31] Transformer-based Planning for Symbolic Regression
    Shojaee, Parshin
    Meidani, Kazem
    Farimani, Amir Barati
    Reddy, Chandan K.
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [32] Transformer-based Bug/Feature Classification
    Ozturk, Ceyhun E.
    Yilmaz, Eyup Halit
    Koksal, Omer
    [J]. 2023 31ST SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE, SIU, 2023,
  • [33] Adapting transformer-based language models for heart disease detection and risk factors extraction
    Houssein, Essam H.
    Mohamed, Rehab E.
    Hu, Gang
    Ali, Abdelmgeid A.
    [J]. JOURNAL OF BIG DATA, 2024, 11 (01)
  • [35] Transformer-Based Fire Detection in Videos
    Mardani, Konstantina
    Vretos, Nicholas
    Daras, Petros
    [J]. SENSORS, 2023, 23 (06)
  • [36] Swin transformer-based supervised hashing
    Peng, Liangkang
    Qian, Jiangbo
    Wang, Chong
    Liu, Baisong
    Dong, Yihong
    [J]. APPLIED INTELLIGENCE, 2023, 53 (14) : 17548 - 17560
  • [37] Transformer-Based Deep Survival Analysis
    Hu, Shi
    Fridgeirsson, Egill A.
    van Wingen, Guido
    Welling, Max
    [J]. SURVIVAL PREDICTION - ALGORITHMS, CHALLENGES AND APPLICATIONS, VOL 146, 2021, 146 : 132 - 148
  • [38] Transformer-based ripeness segmentation for tomatoes
    Shinoda, Risa
    Kataoka, Hirokatsu
    Hara, Kensho
    Noguchi, Ryozo
    [J]. SMART AGRICULTURAL TECHNOLOGY, 2023, 4
  • [39] A transformer-based network for speech recognition
    Tang L.
    [J]. International Journal of Speech Technology, 2023, 26 (2) : 531 - 539
  • [40] EEG Classification with Transformer-Based Models
    Sun, Jiayao
    Xie, Jin
    Zhou, Huihui
    [J]. 2021 IEEE 3RD GLOBAL CONFERENCE ON LIFE SCIENCES AND TECHNOLOGIES (IEEE LIFETECH 2021), 2021, : 92 - 93