Extractive-Abstractive Summarization of Judgment Documents Using Multiple Attention Networks

Cited by: 1
Authors
Gao, Yan [1 ]
Liu, Zhengtao [1 ]
Li, Juan [2 ]
Guo, Fan [1 ,2 ]
Xiao, Fei [1 ,2 ]
Affiliations
[1] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
[2] Cent South Univ, Sch Law, Changsha, Peoples R China
Source
Funding
National Natural Science Foundation of China
Keywords
Judgment documents; Automatic summarization; Attention network; Encoder-decoder
DOI
10.1007/978-3-030-89391-0_28
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Judgment documents contain rich legal information, yet they are lengthy and structurally complex, which makes effective summarization essential. By analyzing the structural features of Chinese judgment documents, we propose an automatic summarization method consisting of an extraction model and an abstraction model. In the extraction model, all sentences are encoded by a self-attention network and classified into key and non-key sentences. In the abstraction model, the initial summary is refined into a final summary by a unidirectional-bidirectional attention network. Such summaries can improve efficiency in case handling and make judgment documents more accessible to general readers. Experimental results on the CAIL2020 dataset are satisfactory.
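The abstract is concrete enough to sketch the extract-then-abstract pipeline it describes. The following is a minimal, hypothetical PyTorch sketch only: the module names, layer sizes, the choice of k = 5 key sentences, and the use of a standard Transformer encoder-decoder in place of the paper's unidirectional-bidirectional attention network are all assumptions for illustration, not the authors' implementation.

# Hypothetical sketch of the two-stage pipeline from the abstract.
# All dimensions and module choices are illustrative assumptions.
import torch
import torch.nn as nn

class SentenceExtractor(nn.Module):
    """Stage 1: encode sentence embeddings with self-attention and
    score each sentence as key / non-key (assumed layer sizes)."""
    def __init__(self, d_model=256, n_heads=4, n_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.classifier = nn.Linear(d_model, 1)  # key-sentence logit

    def forward(self, sent_embs):                # (batch, n_sents, d_model)
        h = self.encoder(sent_embs)
        return self.classifier(h).squeeze(-1)    # (batch, n_sents)

class Abstractor(nn.Module):
    """Stage 2: rewrite the extracted draft into the final summary.
    The paper's unidirectional-bidirectional attention network is
    approximated here by a plain attention-based encoder-decoder."""
    def __init__(self, d_model=256, n_heads=4):
        super().__init__()
        self.seq2seq = nn.Transformer(d_model, n_heads,
                                      num_encoder_layers=2,
                                      num_decoder_layers=2,
                                      batch_first=True)

    def forward(self, draft_embs, target_embs):
        return self.seq2seq(draft_embs, target_embs)

# Usage: keep the top-k scoring sentences as the initial summary,
# then feed them to the abstractor for refinement (teacher-forced).
extractor, abstractor = SentenceExtractor(), Abstractor()
sents = torch.randn(1, 30, 256)                  # 30 sentence embeddings
scores = extractor(sents)
topk = scores.topk(5, dim=-1).indices            # k = 5 is an assumption
draft = sents.gather(1, topk.unsqueeze(-1).expand(-1, -1, 256))
refined = abstractor(draft, torch.randn(1, 20, 256))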
Pages: 486-494
Page count: 9