A Transformer-based Prior Legal Case Retrieval Method

Cited: 0
Authors
Ozturk, Ceyhun E. [1 ]
Ozcelik, S. Baris [2 ]
Koc, Aykut [3 ]
Affiliations
[1] Bilkent Univ, Dept of Electrical & Electronics Engineering, ASELSAN Research Center, Ankara, Türkiye
[2] Bilkent Univ, Faculty of Law, Ankara, Türkiye
[3] Bilkent Univ, Dept of Electrical & Electronics Engineering, National Magnetic Resonance Research Center, Ankara, Türkiye
Keywords
Natural language processing; legal tech; deep learning; prior legal case retrieval; legal NLP; Turkish NLP
DOI
10.1109/SIU59756.2023.10223938
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In this work, BERTurk-Legal, a transformer-based language model for retrieving prior legal cases, is introduced. BERTurk-Legal is pre-trained on a dataset from the Turkish legal domain; this dataset contains no labels related to the prior court case retrieval task. Masked language modeling is used to train BERTurk-Legal in a self-supervised manner. With zero-shot classification, BERTurk-Legal achieves state-of-the-art results on a dataset of legal cases from the Court of Cassation of Turkey. The experimental results show the necessity of developing language models specific to the Turkish legal domain.
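The abstract states that BERTurk-Legal is pre-trained with masked language modeling. As a minimal illustrative sketch (not the authors' code), the standard BERT-style masking procedure that such self-supervised pre-training relies on can be written as follows; `MASK_ID`, the toy token ids, and the helper name are assumptions for illustration only:

```python
import random

MASK_ID = 103  # assumed [MASK] token id, as in common BERT vocabularies


def mask_tokens(token_ids, vocab_size, mask_prob=0.15, seed=0):
    """BERT-style masking: select ~15% of positions as prediction targets;
    of those, 80% become [MASK], 10% become a random token, and 10% are
    left unchanged. Returns (masked_inputs, labels), where labels is -100
    (ignored by the loss) everywhere except at selected positions."""
    rng = random.Random(seed)
    inputs = list(token_ids)
    labels = [-100] * len(token_ids)  # -100 = position not predicted
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_prob:
            labels[i] = tok  # the model must recover this original token
            r = rng.random()
            if r < 0.8:
                inputs[i] = MASK_ID          # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(vocab_size)  # 10%: random token
            # else: 10%: keep the original token
    return inputs, labels


# Toy usage: mask a short sequence of (hypothetical) token ids
ids = [5, 17, 42, 8, 99, 23, 7, 64]
masked, labels = mask_tokens(ids, vocab_size=30000)
```

During pre-training, the model is then trained to predict the original token at every position where `labels` is not `-100`, which is what makes the objective self-supervised: no retrieval labels are required.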
Pages: 4
Related papers
50 records total
  • [1] Legal Information Retrieval and Entailment Using Transformer-based Approaches
    Kim, Mi-Young
    Rabelo, Juliano
    Babiker, Housam Khalifa Bashier
    Rahman, Md Abed
    Goebel, Randy
    REVIEW OF SOCIONETWORK STRATEGIES, 2024, 18 (01): : 101 - 121
  • [3] NEIGHBOR-AUGMENTED TRANSFORMER-BASED EMBEDDING FOR RETRIEVAL
    Zhang, Jihai
    Lin, Fangquan
    Jiang, Wei
    Yang, Cheng
    Liu, Gaoge
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 3893 - 3897
  • [4] A Transformer-Based Framework for Biomedical Information Retrieval Systems
    Hall, Karl
    Jayne, Chrisina
    Chang, Victor
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT VI, 2023, 14259 : 317 - 331
  • [5] Transformer-Based Distillation Hash Learning for Image Retrieval
    Lv, Yuanhai
    Wang, Chongyan
    Yuan, Wanteng
    Qian, Xiaohao
    Yang, Wujun
    Zhao, Wanqing
    ELECTRONICS, 2022, 11 (18)
  • [6] Transformer-Based Video Deinterlacing Method
    Song, Chao
    Li, Haidong
    Zheng, Dong
    Wang, Jie
    Jiang, Zhaoyi
    Yang, Bailin
    NEURAL INFORMATION PROCESSING, ICONIP 2023, PT V, 2024, 14451 : 357 - 369
  • [7] Improving Efficiency and Robustness of Transformer-based Information Retrieval Systems
    Begoli, Edmon
    Srinivasan, Sudarshan
    Mahbub, Maria
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 3433 - 3435
  • [8] Improving Transformer-Based Neural Machine Translation with Prior Alignments
    Nguyen, Thien
    Nguyen, Lam
    Tran, Phuoc
    Nguyen, Huu
    COMPLEXITY, 2021, 2021
  • [9] TransHash: Transformer-based Hamming Hashing for Efficient Image Retrieval
    Chen, Yongbiao
    Zhang, Sheng
    Liu, Fangxin
    Chang, Zhigang
    Ye, Mang
    Qi, Zhengwei
    PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2022, 2022, : 127 - 136
  • [10] A Transformer-based Neural Architecture Search Method
    Wang, Shang
    Tang, Huanrong
    Ouyang, Jianquan
    PROCEEDINGS OF THE 2023 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION, GECCO 2023 COMPANION, 2023, : 691 - 694