Extractive-Abstractive Summarization of Judgment Documents Using Multiple Attention Networks

Cited by: 1
Authors
Gao, Yan [1 ]
Liu, Zhengtao [1 ]
Li, Juan [2 ]
Guo, Fan [1 ,2 ]
Xiao, Fei [1 ,2 ]
Affiliations
[1] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
[2] Cent South Univ, Sch Law, Changsha, Peoples R China
Source
Funding
National Natural Science Foundation of China;
Keywords
Judgment documents; Automatic summarization; Attention network; Encoder-decoder;
DOI
10.1007/978-3-030-89391-0_28
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Judgment documents contain rich legal information, but they are also lengthy and structurally complex, which makes effective summarization necessary. By analyzing the structural features of Chinese judgment documents, we propose an automatic summarization method consisting of an extraction model and an abstraction model. In the extraction model, all sentences are encoded by a self-attention network and classified as key or non-key sentences. In the abstraction model, the initial summary is refined into a final summary by a unidirectional-bidirectional attention network. Such summaries can improve efficiency in case handling and make judgment documents more accessible to general readers. The experimental results on the CAIL2020 dataset are satisfactory.
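The extractive step described in the abstract (encode all sentences with self-attention, then classify each as key or non-key) can be sketched as follows. This is a minimal, hypothetical NumPy illustration of the general technique, not the authors' implementation: the weight matrices, dimensions, and the linear scoring head are all assumptions made for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over sentence embeddings.

    X: (n_sentences, d) matrix of sentence embeddings.
    Returns context-aware representations of the same shape.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
    return scores @ V

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(5, d))  # 5 toy sentence embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
H = self_attention(X, Wq, Wk, Wv)

# Hypothetical linear head: sigmoid score per sentence,
# thresholded into key / non-key labels.
w = rng.normal(size=d)
key_scores = 1.0 / (1.0 + np.exp(-(H @ w)))
key_sentences = key_scores > 0.5
```

In a trained model the embeddings, attention weights, and scoring head would be learned from labeled judgment documents; the sentences flagged as key would then form the initial summary passed to the abstraction model.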
Pages: 486-494
Page count: 9