Pashto poetry generation: deep learning with pre-trained transformers for low-resource languages

Cited by: 0
Authors
Ullah, Imran [1 ]
Ullah, Khalil [1 ]
Khan, Hamad [1 ]
Aurangzeb, Khursheed [2 ]
Anwar, Muhammad Shahid [3 ]
Syed, Ikram [3 ]
Affiliations
[1] Software Engineering, University of Malakand, Chakdara, Pakistan
[2] Department of Computer Engineering, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia
[3] Department of AI and Software, Gachon University, Seongnam-si, Republic of Korea
DOI: 10.7717/PEERJ-CS.2163
Pages: 1-23
Related papers (50 in total)
  • [1] Pashto poetry generation: deep learning with pre-trained transformers for low-resource languages
    Ullah, Imran
    Ullah, Khalil
    Khan, Hamad
    Aurangzeb, Khursheed
    Anwar, Muhammad Shahid
    Syed, Ikram
    PEERJ COMPUTER SCIENCE, 2024, 10
  • [2] On the Transferability of Pre-trained Language Models for Low-Resource Programming Languages
    Chen, Fuxiang
    Fard, Fatemeh H.
    Lo, David
    Bryksin, Timofey
    30TH IEEE/ACM INTERNATIONAL CONFERENCE ON PROGRAM COMPREHENSION (ICPC 2022), 2022, : 401 - 412
  • [3] Investigating Pre-trained Audio Encoders in the Low-Resource Condition
    Yang, Hao
    Zhao, Jinming
    Haffari, Gholamreza
    Shareghi, Ehsan
    INTERSPEECH 2023, 2023, : 1498 - 1502
  • [4] DIONYSUS: A Pre-trained Model for Low-Resource Dialogue Summarization
    Li, Yu
    Peng, Baolin
    He, Pengcheng
    Galley, Michel
    Yu, Zhou
    Gao, Jianfeng
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 1368 - 1386
  • [5] Text data augmentation and pre-trained Language Model for enhancing text classification of low-resource languages
    Ziyaden, Atabay
    Yelenov, Amir
    Hajiyev, Fuad
    Rustamov, Samir
    Pak, Alexandr
    PEERJ COMPUTER SCIENCE, 2024, 10
  • [6] Enriching the Transfer Learning with Pre-Trained Lexicon Embedding for Low-Resource Neural Machine Translation
    Maimaiti, Mieradilijiang
    Liu, Yang
    Luan, Huanbo
    Sun, Maosong
    TSINGHUA SCIENCE AND TECHNOLOGY, 2022, 27 (01) : 150 - 163
  • [7] A Comparative Study of Pre-trained Encoders for Low-Resource Named Entity Recognition
    Chen, Yuxuan
    Mikkelsen, Jonas
    Binder, Arne
    Alt, Christoph
    Hennig, Leonhard
    PROCEEDINGS OF THE 7TH WORKSHOP ON REPRESENTATION LEARNING FOR NLP, 2022, : 46 - 59
  • [8] Efficient Fine-Tuning for Low-Resource Tibetan Pre-trained Language Models
    Zhou, Mingjun
    Daiqing, Zhuoma
    Qun, Nuo
    Nyima, Tashi
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT VII, 2024, 15022 : 410 - 422
  • [9] Adapting Pre-trained Language Models to Low-Resource Text Simplification: The Path Matters
    Garbacea, Cristina
    Mei, Qiaozhu
    CONFERENCE ON LIFELONG LEARNING AGENTS, VOL 199, 2022, 199