Label informed hierarchical transformers for sequential sentence classification in scientific abstracts

Cited: 1
|
Authors
Takola, Yaswanth Sri Sai Santosh [1 ]
Aluru, Sai Saketh [2 ]
Vallabhajosyula, Anoop [3 ]
Sanyal, Debarshi Kumar [4 ,7 ]
Das, Partha Pratim [5 ,6 ]
Affiliations
[1] Tech Univ Munich, Sch Computat Informat & Technol, Munich, Germany
[2] Shaw India Pvt Ltd, Hyderabad, India
[3] Arizona State Univ, Sch Comp & Augmented Intelligence, Tempe, AZ USA
[4] Indian Assoc Cultivat Sci, Sch Math & Computat Sci, Kolkata, India
[5] Ashoka Univ, Dept Comp Sci, Sonipat, India
[6] Indian Inst Technol Kharagpur, Dept Comp Sci & Engn, Kharagpur, India
[7] Indian Assoc Cultivat Sci, Sch Math & Computat Sci, Kolkata 700032, West Bengal, India
Keywords
discourse segmentation; hierarchical transformers; scholarly data; scientific abstracts; sequential sentence classification;
DOI
10.1111/exsy.13238
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Segmenting scientific abstracts into discourse categories like background, objective, method, result, and conclusion is useful in many downstream tasks like search, recommendation and summarization. This task of classifying each sentence in the abstract into one of a given set of discourse categories is called sequential sentence classification. Existing machine learning-based approaches to this problem consider the content of only the abstract to obtain the neural representation of each sentence, which is then labelled with a discourse category. But this ignores the semantic information offered by the discourse labels themselves. In this paper, we propose LIHT, Label Informed Hierarchical Transformers - a method for sequential sentence classification that explicitly and hierarchically exploits the semantic information in the labels to learn label-aware neural sentence representations. The hierarchical model helps to capture not only the fine-grained interactions between the discourse labels and the words in the abstract at the sentence level but also the potential dependencies that may exist in the label sequence. Thus, LIHT generates label-aware contextual sentence representations that are then labelled with a conditional random field. We evaluate LIHT on three publicly available datasets, namely, PUBMED-RCT, NICTA-PIBOSO and CSAbstract. The incremental gain in F1-score in all the three cases over the respective state-of-the-art approaches is around 1%. Though the gains are modest, LIHT establishes a new performance benchmark for this task and is a novel technique of independent interest. We also perform an ablation study to identify the contribution of each component of LIHT in the observed performance, and a case study to visualize the roles of the different components of our model.
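The architecture the abstract describes, label-aware sentence representations fed into a sequence-level encoder, can be sketched in a few lines of PyTorch. This is only an illustrative sketch, not the authors' implementation: all dimensions and embeddings below are hypothetical stand-ins for learned parameters, and the paper's final CRF decoding layer is replaced here by a per-sentence argmax for brevity.

```python
import torch

torch.manual_seed(0)

NUM_LABELS, D = 5, 32   # background/objective/method/result/conclusion
NUM_SENT, NUM_TOK = 8, 12

# Hypothetical stand-ins for learned embeddings.
label_emb = torch.randn(NUM_LABELS, D)       # one vector per discourse label
tok_emb = torch.randn(NUM_SENT, NUM_TOK, D)  # word vectors per sentence

# Sentence level: each word attends over the label embeddings, so the pooled
# sentence vector mixes in label semantics ("label-informed" representation).
scores = tok_emb @ label_emb.T / D ** 0.5    # (S, T, L) word-label similarities
attn = scores.softmax(dim=-1)                # attention over labels per word
label_aware_tok = tok_emb + attn @ label_emb # inject label info into each word
sent_repr = label_aware_tok.mean(dim=1)      # (S, D) pooled sentence vectors

# Abstract level: a transformer encoder contextualizes the sentence sequence,
# capturing dependencies between neighbouring sentences' discourse roles.
encoder = torch.nn.TransformerEncoder(
    torch.nn.TransformerEncoderLayer(d_model=D, nhead=4, batch_first=True),
    num_layers=1,
)
ctx = encoder(sent_repr.unsqueeze(0)).squeeze(0)  # (S, D)
logits = ctx @ label_emb.T                        # score each label per sentence
pred = logits.argmax(dim=-1)                      # (S,) one discourse label each
```

The paper decodes the sentence-level scores with a conditional random field, which models transitions between adjacent labels (e.g. "method" rarely precedes "background"); the argmax here ignores those transition scores.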
Pages: 13