Pashto poetry generation: deep learning with pre-trained transformers for low-resource languages

Cited by: 0
Authors
Ullah, Imran [1 ]
Ullah, Khalil [1 ]
Khan, Hamad [1 ]
Aurangzeb, Khursheed [2 ]
Anwar, Muhammad Shahid [3 ]
Syed, Ikram [3 ]
Affiliations
[1] Software Engineering, University of Malakand, Chakdara, Pakistan
[2] Department of Computer Engineering, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia
[3] Department of AI and Software, Gachon University, Seongnam-Si, Korea, Republic of
DOI: 10.7717/PEERJ-CS.2163
Pages: 1 - 23
Related papers (50 total)
  • [21] Deep Learning-based POS Tagger and Chunker for Odia Language Using Pre-trained Transformers
    Dalai, Tusarkanta
    Mishra, Tapas Kumar
    Sa, Pankaj K.
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2024, 23 (02)
  • [22] A Deep Learning Sentiment Analyser for Social Media Comments in Low-Resource Languages
    Kastrati, Zenun
    Ahmedi, Lule
    Kurti, Arianit
    Kadriu, Fatbardh
    Murtezaj, Doruntina
    Gashi, Fatbardh
    ELECTRONICS, 2021, 10 (10)
  • [23] Biologically Inspired Design Concept Generation Using Generative Pre-Trained Transformers
    Zhu, Qihao
    Zhang, Xinyu
    Luo, Jianxi
    JOURNAL OF MECHANICAL DESIGN, 2023, 145 (04)
  • [24] PAIR: Planning and Iterative Refinement in Pre-trained Transformers for Long Text Generation
    Hua, Xinyu
    Wang, Lu
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 781 - 793
  • [25] Low Resource Summarization using Pre-trained Language Models
    Munaf, Mubashir
    Afzal, Hammad
    Mahmood, Khawir
    Iltaf, Naima
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2024, 23 (10)
  • [26] Efficient Entity Candidate Generation for Low-Resource Languages
    Garcia-Duran, Alberto
    Arora, Akhil
    West, Robert
    LREC 2022: THIRTEEN INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 6429 - 6438
  • [27] Learning to Switch off, Switch on, and Integrate Modalities in Large Pre-trained Transformers
    Duseja, Tejas
    Annervaz, K. M.
    Duggani, Jeevithiesh
    Zacharia, Shyam
    Free, Michael
    Dukkipati, Ambedkar
    2024 IEEE 7TH INTERNATIONAL CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL, MIPR 2024, 2024, : 403 - 409
  • [28] On Pre-trained Image Features and Synthetic Images for Deep Learning
    Hinterstoisser, Stefan
    Lepetit, Vincent
    Wohlhart, Paul
    Konolige, Kurt
    COMPUTER VISION - ECCV 2018 WORKSHOPS, PT I, 2019, 11129 : 682 - 697
  • [29] Improving stance detection accuracy in low-resource languages: a deep learning framework with ParsBERT
    Rahimi, Mohammad
    Kiani, Vahid
    INTERNATIONAL JOURNAL OF DATA SCIENCE AND ANALYTICS, 2024, : 517 - 535
  • [30] Efficient Utilization of Large Pre-trained Models for Low Resource ASR
    Vieting, Peter
    Luescher, Christoph
    Dierkes, Julian
    Schlueter, Ralf
    Ney, Hermann
    2023 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING WORKSHOPS, ICASSPW, 2023,