Pashto poetry generation: deep learning with pre-trained transformers for low-resource languages

Cited by: 0
Authors:
Ullah, Imran [1 ]
Ullah, Khalil [1 ]
Khan, Hamad [1 ]
Aurangzeb, Khursheed [2 ]
Anwar, Muhammad Shahid [3 ]
Syed, Ikram [3 ]
Affiliations:
[1] Department of Software Engineering, University of Malakand, Chakdara, Pakistan
[2] Department of Computer Engineering, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia
[3] Department of AI and Software, Gachon University, Seongnam-si, Republic of Korea
DOI: 10.7717/PEERJ-CS.2163
Pages: 1-23
Related papers (50 items)
  • [41] Learning to Select Pre-trained Deep Representations with Bayesian Evidence Framework
    Kim, Yong-Deok
    Jang, Taewoong
    Han, Bohyung
    Choi, Seungjin
    2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2016, : 5318 - 5326
  • [42] An Approach to Run Pre-Trained Deep Learning Models on Grayscale Images
    Ahmad, Ijaz
    Shin, Seokjoo
    3RD INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE IN INFORMATION AND COMMUNICATION (IEEE ICAIIC 2021), 2021, : 177 - 180
  • [43] Hypernymy Detection for Low-Resource Languages via Meta Learning
    Yu, Changlong
    Han, Jialong
    Zhang, Haisong
    Ng, Wilfred
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 3651 - 3656
  • [44] Poetry to Prose Conversion in Sanskrit as a Linearisation Task: A case for Low-Resource Languages
    Krishna, Amrith
    Sharma, Vishnu Dutt
    Santra, Bishal
    Chakraborty, Aishik
    Satuluri, Pavankumar
    Goyal, Pawan
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 1160 - 1166
  • [45] S-Prompts Learning with Pre-trained Transformers: An Occam's Razor for Domain Incremental Learning
    Wang, Yabin
    Huang, Zhiwu
    Hong, Xiaopeng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [46] PDALN: Progressive Domain Adaptation over a Pre-trained Model for Low-Resource Cross-Domain Named Entity Recognition
    Zhang, Tao
    Xia, Congying
    Yu, Philip S.
    Liu, Zhiwei
    Zhao, Shu
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 5441 - 5451
  • [47] Explainable Pre-Trained Language Models for Sentiment Analysis in Low-Resourced Languages
    Mabokela, Koena Ronny
    Primus, Mpho
    Celik, Turgay
    BIG DATA AND COGNITIVE COMPUTING, 2024, 8 (11)
  • [48] Extremely Low Resource Text Simplification with Pre-trained Transformer Language Model
    Maruyama, Takumi
    Yamamoto, Kazuhide
    PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2019, : 53 - 58
  • [49] An Ensemble Voting Method of Pre-Trained Deep Learning Models for Orchid Recognition
    Ou, Chia-Ho
    Hu, Yi-Nuo
    Jiang, Dong-Jie
    Liao, Po-Yen
    2023 IEEE INTERNATIONAL SYSTEMS CONFERENCE, SYSCON, 2023,
  • [50] Analysis of Layer Efficiency and Layer Reduction on Pre-trained Deep Learning Models
    Nugraha, Brilian Tafjira
    Su, Shun-Feng
    2018 INTERNATIONAL CONFERENCE ON SYSTEM SCIENCE AND ENGINEERING (ICSSE), 2018,