Harnessing Pre-Trained Sentence Transformers for Offensive Language Detection in Indian Languages

Citations: 0
Author affiliations:
[1] MKSSS Cummins College of Engineering for Women, Maharashtra, Pune, India
[2] Unknown
[3] Unknown
Source: CEUR Workshop Proceedings, pp. 427-434
Keywords (Engineering Village): BERT; Hate-speech detection; Indian regional language; Indicnlp; Language detection; Language processing; Low resource languages; Natural language processing; Natural languages; Offensive language detection; Offensive languages; Sentence-BERT; Speech detection; Text classification; Transformer
DOI: not available
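The indexed keywords point to the common pattern of pairing pre-trained sentence embeddings with a lightweight classifier for offensive-language detection. The sketch below illustrates that general pattern only, not the paper's actual pipeline; `embed()` is a hypothetical stand-in (a crude character-bigram hash) for a real multilingual Sentence-BERT encoder.

```python
# Sketch of embedding-plus-classifier offensive-language detection.
# ASSUMPTION: embed() below is a toy placeholder; a real system would call a
# pre-trained multilingual sentence transformer instead.
import math
from collections import defaultdict

DIM = 16  # toy embedding dimensionality

def embed(text: str) -> list[float]:
    # Hypothetical placeholder: hash character bigrams into a small vector,
    # then L2-normalize so cosine similarity is a plain dot product.
    vec = [0.0] * DIM
    for a, b in zip(text, text[1:]):
        vec[(ord(a) * 31 + ord(b)) % DIM] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(u: list[float], v: list[float]) -> float:
    # Vectors are already unit-length, so the dot product is the cosine.
    return sum(a * b for a, b in zip(u, v))

def train_centroids(labelled: list[tuple[str, str]]) -> dict[str, list[float]]:
    # Average the embeddings of each class to get one centroid per label.
    sums = defaultdict(lambda: [0.0] * DIM)
    counts = defaultdict(int)
    for text, label in labelled:
        for i, x in enumerate(embed(text)):
            sums[label][i] += x
        counts[label] += 1
    return {lab: [x / counts[lab] for x in s] for lab, s in sums.items()}

def classify(text: str, centroids: dict[str, list[float]]) -> str:
    # Assign the label whose centroid is most similar to the text embedding.
    e = embed(text)
    return max(centroids, key=lambda lab: cosine(e, centroids[lab]))
```

In a real system the nearest-centroid step would typically be replaced by a trained classification head, but the structure (encode once, classify in embedding space) is the same.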
Related Papers (50 total):
[1] Li, Bohan; Zhou, Hao; He, Junxian; Wang, Mingxuan; Yang, Yiming; Li, Lei. On the Sentence Embeddings from Pre-trained Language Models. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 9119-9130.
[2] Tay, Yi; Dehghani, Mostafa; Gupta, Jai; Aribandi, Vamsi; Bahri, Dara; Qin, Zhen; Metzler, Donald. Are Pre-trained Convolutions Better than Pre-trained Transformers? 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Vol. 1, 2021: 4349-4359.
[3] Desai, Shrey; Durrett, Greg. Calibration of Pre-trained Transformers. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 295-302.
[4] Yoo, Byunghee; Kim, Jinwoo; Park, Seongeun; Ahn, Changbum R.; Oh, Taekeun. Harnessing Generative Pre-Trained Transformers for Construction Accident Prediction with Saliency Visualization. Applied Sciences-Basel, 2024, 14 (2).
[5] Kakwani, Divyanshu; Kunchukuttan, Anoop; Golla, Satish; Gokul, N. C.; Bhattacharyya, Avik; Khapra, Mitesh M.; Kumar, Pratyush. IndicNLPSuite: Monolingual Corpora, Evaluation Benchmarks and Pre-trained Multilingual Language Models for Indian Languages. Findings of the Association for Computational Linguistics, EMNLP 2020, 2020: 4948-4961.
[6] Behrendt, Finn; Bhattacharya, Debayan; Krueger, Julia; Opfer, Roland; Schlaefer, Alexander. Nodule Detection in Chest Radiographs with Unsupervised Pre-trained Detection Transformers. 2023 IEEE 20th International Symposium on Biomedical Imaging (ISBI), 2023.
[7] Zhang, Zhengyan; Zeng, Zhiyuan; Lin, Yankai; Xiao, Chaojun; Wang, Xiaozhi; Han, Xu; Liu, Zhiyuan; Xie, Ruobing; Sun, Maosong; Zhou, Jie. Emergent Modularity in Pre-trained Transformers. Findings of the Association for Computational Linguistics, ACL 2023, 2023: 4066-4083.
[8] Menzner, Tim; Leidner, Jochen L. Experiments in News Bias Detection with Pre-trained Neural Transformers. Advances in Information Retrieval, ECIR 2024, Part IV, 2024, 14611: 270-284.
[9] Huang, James Y.; Huang, Kuan-Hao; Chang, Kai-Wei. Disentangling Semantics and Syntax in Sentence Embeddings with Pre-trained Language Models. 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 1372-1379.
[10] Casola, Silvia; Lauriola, Ivano; Lavelli, Alberto. Pre-trained Transformers: An Empirical Comparison. Machine Learning with Applications, 2022, 9.