Pashto poetry generation: deep learning with pre-trained transformers for low-resource languages

Cited by: 0
Authors
Ullah, Imran [1 ]
Ullah, Khalil [1 ]
Khan, Hamad [1 ]
Aurangzeb, Khursheed [2 ]
Anwar, Muhammad Shahid [3 ]
Syed, Ikram [3 ]
Affiliations
[1] Software Engineering, University of Malakand, Chakdara, Pakistan
[2] Department of Computer Engineering, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia
[3] Department of AI and Software, Gachon University, Seongnam-si, Republic of Korea
DOI: 10.7717/PEERJ-CS.2163
Pages: 1-23
Related papers (50 total)
  • [11] Pre-trained Word Embedding based Parallel Text Augmentation Technique for Low-Resource NMT in Favor of Morphologically Rich Languages
    Hailu, Tulu Tilahun
    Yu, Junqing
    Fantaye, Tessfu Geteye
    PROCEEDINGS OF THE THIRD INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND APPLICATION ENGINEERING (CSAE2019), 2019
  • [12] Ensemble Learning with Pre-Trained Transformers for Crash Severity Classification: A Deep NLP Approach
    Jaradat, Shadi
    Nayak, Richi
    Paz, Alexander
    Elhenawy, Mohammed
    ALGORITHMS, 2024, 17 (07)
  • [13] Intent Classification Using Pre-trained Language Agnostic Embeddings for Low Resource Languages
    Yadav, Hemant
    Gupta, Akshat
    Rallabandi, Sai Krishna
    Black, Alan W.
    Shah, Rajiv Ratn
    INTERSPEECH 2022, 2022: 3473-3477
  • [14] TWilBert: Pre-trained deep bidirectional transformers for Spanish Twitter
    Gonzalez, Jose Angel
    Hurtado, Lluis-F.
    Pla, Ferran
    NEUROCOMPUTING, 2021, 426: 58-69
  • [15] Harnessing Pre-Trained Sentence Transformers for Offensive Language Detection in Indian Languages
    MKSSS Cummins College of Engineering for Women, Maharashtra, Pune, India
    CEUR WORKSHOP PROCEEDINGS: 427-434
  • [16] Embedding Articulatory Constraints for Low-resource Speech Recognition Based on Large Pre-trained Model
    Lee, Jaeyoung
    Mimura, Masato
    Kawahara, Tatsuya
    INTERSPEECH 2023, 2023: 1394-1398
  • [17] Named-Entity Recognition for a Low-resource Language using Pre-Trained Language Model
    Yohannes, Hailemariam Mehari
    Amagasa, Toshiyuki
    37TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, 2022: 837-844
  • [18] Pre-Trained Multilingual Sequence-to-Sequence Models: A Hope for Low-Resource Language Translation?
    Lee, En-Shiun Annie
    Thillainathan, Sarubi
    Nayak, Shravan
    Ranathunga, Surangika
    Adelani, David Ifeoluwa
    Su, Ruisi
    McCarthy, Arya D.
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022: 58-67
  • [19] Word Representation Learning in Multimodal Pre-Trained Transformers: An Intrinsic Evaluation
    Pezzelle, Sandro
    Takmaz, Ece
    Fernandez, Raquel
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2021, 9: 1563-1579
  • [20] Taming Pre-trained Language Models with N-gram Representations for Low-Resource Domain Adaptation
    Diao, Shizhe
    Xu, Ruijia
    Su, Hongjin
    Jiang, Yilei
    Song, Yan
    Zhang, Tong
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021: 3336-3349