Reaching Quality and Efficiency with a Parameter-Efficient Controllable Sentence Simplification Approach

Cited: 0
Authors
Menta, Antonio [1 ]
Garcia-Serrano, Ana [1 ]
Affiliations
[1] UNED, ETSI Informat, C de Juan del Rosal 14, Madrid 28040, Spain
Keywords
Text Simplification; Transfer Learning; Language Models
DOI
10.2298/CSIS230912017M
CLC Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
The task of Automatic Text Simplification (ATS) aims to transform texts to improve their readability and comprehensibility. Current solutions are based on Large Language Models (LLMs). These models achieve high performance but require powerful computing resources and large amounts of data to be fine-tuned for specific, technical domains, which prevents most researchers from adapting them to their area of study. The main contributions of this research are as follows: (1) an accurate solution for settings where powerful resources are not available, which exploits transfer learning across domains with a set of linguistic features and a reduced-size pre-trained language model (T5-small), making the approach accessible to a broader range of researchers and individuals; (2) an evaluation of the model on two well-known datasets, TurkCorpus and ASSET, together with an analysis of the influence of control tokens on the SimpleText corpus, focusing on the Computer Science and Medicine domains. Finally, a detailed discussion comparing the approach with state-of-the-art models for sentence simplification is included.
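The abstract describes conditioning a T5-small model on control tokens that encode linguistic features of the desired output. The sketch below is a minimal illustration, not the authors' released code: it assumes ACCESS-style control tokens (character-compression ratio, Levenshtein similarity, word-frequency rank, dependency-tree depth) prepended to the source sentence; the token names and target values are chosen for illustration only, and a checkpoint fine-tuned on simplification data would be required for the generated text to be an actual simplification.

# Minimal sketch (illustrative assumptions, not the paper's released code).
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Hypothetical ACCESS-style control tokens: compression ratio, Levenshtein
# similarity, word-frequency rank, and dependency-tree depth of the target.
controls = ["<NC_0.95>", "<LS_0.75>", "<WR_0.75>", "<DTD_0.75>"]
tokenizer.add_tokens(controls)                 # register before fine-tuning
model.resize_token_embeddings(len(tokenizer))  # grow embeddings to match

source = ("Genetic variants in the APOE gene modulate susceptibility "
          "to late-onset Alzheimer's disease.")
inputs = tokenizer(" ".join(controls) + " " + source, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

In this style of training, each source-target pair carries control-token values computed from the pair itself, so at inference time the requested values steer attributes such as output length and lexical complexity.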
Pages: 24
Related Papers (9 items)
  • [1] Ma, Qianli; Lin, Zhenxi; Yan, Jiangyue; Chen, Zipeng; Yu, Liuhong: MODE-LSTM: A Parameter-efficient Recurrent Network with Multi-Scale for Sentence Classification. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020, pp. 6705-6715.
  • [2] Zhang, Taolin; Bai, Jiawang; Lu, Zhihe; Lian, Dongze; Wang, Genping; Wang, Xinchao; Xia, Shu-Tao: Parameter-Efficient and Memory-Efficient Tuning for Vision Transformer: A Disentangled Approach. Computer Vision - ECCV 2024, Pt. XLV, vol. 15103, 2025, pp. 346-363.
  • [3] Ma, Xinyu; Guo, Jiafeng; Zhang, Ruqing; Fan, Yixing; Cheng, Xueqi: Scattered or Connected? An Optimized Parameter-efficient Tuning Approach for Information Retrieval. Proceedings of the 31st ACM International Conference on Information and Knowledge Management (CIKM 2022), 2022, pp. 1471-1480.
  • [4] Mahmoud, Somaia; Nabil, Emad; Torki, Marwan: Automatic Scoring of Arabic Essays: A Parameter-Efficient Approach for Grammatical Assessment. IEEE Access, vol. 12, 2024, pp. 142555-142568.
  • [5] Ahanger, Mohammad Munzir; Wani, M. Arif: Improving Scientific Literature Classification: A Parameter-Efficient Transformer-Based Approach. International Journal of Electrical and Computer Engineering Systems, 14(10), 2023, pp. 1115-1123.
  • [6] Goswami, Joyeeta; Prajapati, Kaushal Kumar; Saha, Ashim; Saha, Apu Kumar: Parameter-efficient fine-tuning large language model approach for hospital discharge paper summarization. Applied Soft Computing, vol. 157, 2024.
  • [7] Li, Dong; Tang, Jintao; Li, Shasha; Wang, Ting: ADT: An Additive Delta-Tuning approach for parameter-efficient tuning in pre-trained language models. 2024 6th International Conference on Natural Language Processing (ICNLP 2024), 2024, pp. 382-386.
  • [8] Spasov, Simeon; Passamonti, Luca; Duggento, Andrea; Lio, Pietro; Toschi, Nicola: A parameter-efficient deep learning approach to predict conversion from mild cognitive impairment to Alzheimer's disease. NeuroImage, vol. 189, 2019, pp. 276-287.
  • [9] Radhakrishnan, Srijith; Yang, Chao-Han Huck; Khan, Sumeer Ahmad; Kiani, Narsis A.; Gomez-Cabrero, David; Tegner, Jesper N.: A Parameter-Efficient Learning Approach to Arabic Dialect Identification with Pre-Trained General-Purpose Speech Model. Interspeech 2023, 2023, pp. 1958-1962.