Language Representation Models: An Overview

Cited: 10
Authors
Schomacker, Thorben [1 ]
Tropmann-Frick, Marina [1 ]
Affiliations
[1] Hamburg Univ Appl Sci, Dept Comp Sci, D-20099 Hamburg, Germany
Keywords
natural language processing; neural networks; transformer; embeddings; multi-task learning; attention-based models; deep learning;
DOI
10.3390/e23111422
Chinese Library Classification (CLC): O4 [Physics]
Discipline code: 0702
Abstract
In the last few decades, text mining has been used to extract knowledge from free texts. Applying neural networks and deep learning to natural language processing (NLP) tasks has led to many accomplishments on real-world language problems. The developments of the last five years have produced techniques that make transfer learning practical in NLP. The advances in the field have been substantial, and the milestone of outperforming the human baseline on the General Language Understanding Evaluation (GLUE) benchmark has been achieved. This paper presents a targeted literature review that outlines, describes, explains, and puts into context the crucial techniques behind this milestone: neural language models that represent vital steps towards a general language representation model.
Pages: 15
Related papers (50 total)
  • [31] Probing Task-Oriented Dialogue Representation from Language Models
    Wu, Chien-Sheng
    Xiong, Caiming
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020: 5036-5051
  • [32] Physical model language: Towards a unified representation for continuous and discrete models
    Chabanas, M
    Promayon, E
    MEDICAL SIMULATION, PROCEEDINGS, 2004, 3078: 256-266
  • [33] Consistent Stakeholder Modifications of Formal Models via a Natural Language Representation
    Gabrysiak, Gregor
    Eichler, Daniel
    Hebig, Regina
    Giese, Holger
    2013 1ST INTERNATIONAL WORKSHOP ON NATURAL LANGUAGE ANALYSIS IN SOFTWARE ENGINEERING (NATURALISE), 2013: 1-8
  • [34] KNOWLEDGE REPRESENTATION - AN OVERVIEW
    CERCONE, N
    INDIAN JOURNAL OF TECHNOLOGY, 1987, 25 (12): 521-543
  • [35] OVERVIEW OF LANGUAGE
    CAMERON, MH
    SAUNDERS, MT
    LANGUAGE AND SPEECH, 1977, 20 (JUL-): 217-231
  • [36] Language representation
    Rutten, Geert-Jan
    Ramsey, Nick
    JOURNAL OF NEUROSURGERY, 2007, 106 (04): 726-727
  • [37] Language Representation Projection: Can We Transfer Factual Knowledge across Languages in Multilingual Language Models?
    Xu, Shaoyang
    Li, Junzhuo
    Xiong, Deyi
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023: 3692-3702
  • [38] Space-Efficient Representation of Entity-centric Query Language Models
    Van Gysel, Christophe
    Hannemann, Mirko
    Pusateri, Ernest
    Oualil, Youssef
    Oparin, Ilya
    INTERSPEECH 2022, 2022: 679-683
  • [39] Pretrain-KGE: Learning Knowledge Representation from Pretrained Language Models
    Zhang, Zhiyuan
    Liu, Xiaoqian
    Zhang, Yi
    Su, Qi
    Sun, Xu
    He, Bin
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020: 259-266
  • [40] Exploring the representation of Chinese cultural symbols dissemination in the era of large language models
    Zhang, Yixiao
    He, Yuan
    Xia, Yining
    Wang, Yanbo
    Dong, Xianghui
    Yao, Junchen
    INTERNATIONAL COMMUNICATION OF CHINESE CULTURE, 2024, 11 (02): 215-237