Harnessing Pre-Trained Sentence Transformers for Offensive Language Detection in Indian Languages

Cited by: 0
Authors: Not listed
Affiliations: [1] MKSSS Cummins College of Engineering for Women, Pune, Maharashtra, India; [2] Not specified; [3] Not specified
Source: CEUR Workshop Proceedings, pp. 427-434
Keywords (indexed by Engineering Village): BERT; Hate-speech detection; Indian regional language; IndicNLP; Language detection; Language processing; Low-resource languages; Natural language processing; Natural languages; Offensive language detection; Offensive languages; Sentence-BERT; Speech detection; Text classification; Transformer
DOI: Not available
Abstract: Not available
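
The keyword string above outlines the approach the title describes: multilingual Sentence-BERT embeddings used as features for offensive-language classification in low-resource Indian languages. Below is a minimal sketch of such a pipeline, assuming the open-source sentence-transformers and scikit-learn libraries; the checkpoint name, toy data, and logistic-regression head are illustrative assumptions, not the paper's reported configuration.

    # Sketch: sentence-embedding features + a linear classifier for
    # offensive-language detection. Assumed setup, not the paper's pipeline.
    from sentence_transformers import SentenceTransformer
    from sklearn.linear_model import LogisticRegression

    # A multilingual Sentence-BERT checkpoint that covers several Indian
    # languages (illustrative choice).
    encoder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

    # Toy labeled examples (1 = offensive, 0 = not offensive); real work
    # would use a labeled corpus for the target Indian language.
    train_texts = ["example offensive sentence", "example benign sentence"]
    train_labels = [1, 0]

    # Encode sentences into fixed-size dense vectors.
    X_train = encoder.encode(train_texts)

    # Fit a simple classifier on top of the frozen embeddings.
    clf = LogisticRegression(max_iter=1000).fit(X_train, train_labels)

    # Score new text the same way.
    X_new = encoder.encode(["another sentence to classify"])
    print(clf.predict(X_new))
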
Related Papers (showing items [21]-[30] of 50)
  • [21] On the Transferability of Pre-trained Language Models for Low-Resource Programming Languages
    Chen, Fuxiang
    Fard, Fatemeh H.
    Lo, David
    Bryksin, Timofey
    30TH IEEE/ACM INTERNATIONAL CONFERENCE ON PROGRAM COMPREHENSION (ICPC 2022), 2022, : 401 - 412
  • [22] Pashto poetry generation: deep learning with pre-trained transformers for low-resource languages
    Ullah, Imran
    Ullah, Khalil
    Khan, Hamad
    Aurangzeb, Khursheed
    Anwar, Muhammad Shahid
    Syed, Ikram
    PeerJ Computer Science, 2024, 10 : 1 - 23
  • [23] Software Vulnerabilities Detection Based on a Pre-trained Language Model
    Xu, Wenlin
    Li, Tong
    Wang, Jinsong
    Duan, Haibo
    Tang, Yahui
    2023 IEEE 22ND INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS, TRUSTCOM, BIGDATASE, CSE, EUC, ISCI 2023, 2024, : 904 - 911
  • [24] Adapting Pre-trained Language Models to Rumor Detection on Twitter
    Slimi, Hamda
    Bounhas, Ibrahim
    Slimani, Yahya
    JOURNAL OF UNIVERSAL COMPUTER SCIENCE, 2021, 27 (10) : 1128 - 1148
  • [26] Solving ESL Sentence Completion Questions via Pre-trained Neural Language Models
    Liu, Qiongqiong
    Liu, Tianqiao
    Zhao, Jiafu
    Fang, Qiang
    Ding, Wenbiao
    Wu, Zhongqin
    Xia, Feng
    Tang, Jiliang
    Liu, Zitao
    ARTIFICIAL INTELLIGENCE IN EDUCATION (AIED 2021), PT II, 2021, 12749 : 256 - 261
  • [27] Probing Power by Prompting: Harnessing Pre-trained Language Models for Power Connotation Framing
    Khanehzar, Shima
    Cohn, Trevor
    Mikolajczak, Gosia
    Frermann, Lea
    17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 873 - 885
  • [28] Improving the Performance of Pre-trained Systems in Sentence Retrieval
    Rughbeer, Yastil
    Pillay, Anban W.
    Jembere, Edgar
    2021 IEEE 24TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION), 2021, : 908 - 915
  • [29] Pre-Trained Language Models and Their Applications
    Wang, Haifeng
    Li, Jiwei
    Wu, Hua
    Hovy, Eduard
    Sun, Yu
    ENGINEERING, 2023, 25 : 51 - 65
  • [30] Hyperbolic Pre-Trained Language Model
    Chen, Weize
    Han, Xu
    Lin, Yankai
    He, Kaichen
    Xie, Ruobing
    Zhou, Jie
    Liu, Zhiyuan
    Sun, Maosong
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2024, 32 : 3101 - 3112