Using Bidirectional Encoder Representations from Transformers (BERT) to predict criminal charges and sentences from Taiwanese court judgments

Cited by: 1
Authors
Peng, Yi-Ting [1 ]
Lei, Chin-Laung [1 ]
Affiliations
[1] Natl Taiwan Univ, Dept Elect Engn, Taipei City, Taiwan
Keywords
Natural language processing; Bidirectional encoder representations from transformers (BERT); Legal artificial intelligence (Legal AI); Artificial intelligence law (AI Law); Legal judgment prediction
DOI
10.7717/peerj-cs.1841
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
People unfamiliar with the law may not know which behaviors are considered criminal or the lengths of the sentences tied to those behaviors. This study used criminal judgments from district courts in Taiwan to predict the criminal charge and the sentence length that would be handed down. It pioneers the use of Taiwanese criminal judgments as a dataset and proposes improvements based on Bidirectional Encoder Representations from Transformers (BERT). The work is divided into two parts: criminal charge prediction and sentence prediction. Injury and public endangerment judgments were used as training data for sentence prediction. The study also proposes an effective solution to BERT's 512-token input limit. The results show that training the BERT model on Taiwanese criminal judgments is feasible: accuracy reached 98.95% for criminal charge prediction, 72.37% for sentence prediction in injury trials, and 80.93% for sentence prediction in public endangerment trials.
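The abstract does not spell out the paper's workaround for BERT's 512-token input limit. A common approach to classifying long documents such as court judgments (offered here only as an illustrative sketch, not necessarily the authors' method) is to split the tokenized text into overlapping windows, classify each window, and aggregate the per-window predictions. All function names and the stride value below are illustrative assumptions:

```python
# Hedged sketch: sliding-window chunking for texts longer than BERT's
# 512-token limit. Token IDs are assumed to come from a BERT tokenizer;
# 101 ([CLS]) and 102 ([SEP]) are the standard bert-base vocabulary IDs.
CLS, SEP = 101, 102
MAX_LEN = 512

def chunk_token_ids(token_ids, max_len=MAX_LEN, stride=128):
    """Split token_ids into overlapping windows, each wrapped with
    [CLS] ... [SEP] so every window is a valid BERT input."""
    body = max_len - 2  # room left after the two special tokens
    windows, start = [], 0
    while True:
        piece = token_ids[start:start + body]
        windows.append([CLS] + piece + [SEP])
        if start + body >= len(token_ids):
            break
        start += body - stride  # consecutive windows overlap by `stride` tokens

    return windows

def aggregate(per_window_probs):
    """Mean-pool class probabilities across windows (one common choice;
    max-pooling over windows is another)."""
    n = len(per_window_probs)
    return [sum(p[i] for p in per_window_probs) / n
            for i in range(len(per_window_probs[0]))]
```

Each window would then be passed through the fine-tuned classifier, and mean-pooling the per-window class probabilities yields a single document-level prediction for the judgment.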
Pages: 24