Transformers: State-of-the-Art Natural Language Processing

Cited by: 0
Authors
Wolf, Thomas [1 ]
Debut, Lysandre [1 ]
Sanh, Victor [1 ]
Chaumond, Julien [1 ]
Delangue, Clement [1 ]
Moi, Anthony [1 ]
Cistac, Pierric [1 ]
Rault, Tim [1 ]
Louf, Remi [1 ]
Funtowicz, Morgan [1 ]
Davison, Joe [1 ]
Shleifer, Sam [1 ]
von Platen, Patrick [1 ]
Ma, Clara [1 ]
Jernite, Yacine [1 ]
Plu, Julien [1 ]
Xu, Canwen [1 ]
Le Scao, Teven [1 ]
Gugger, Sylvain [1 ]
Drame, Mariama [1 ]
Lhoest, Quentin [1 ]
Rush, Alexander M. [1 ]
Affiliations
[1] Hugging Face, Brooklyn, NY 11201 USA
Keywords
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments.
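As a brief illustration of the unified API and community-hosted pretrained models described in the abstract, the sketch below loads one checkpoint name through both the lower-level Auto classes and the high-level pipeline helper. It assumes the transformers package and a PyTorch backend are installed; the specific checkpoint is chosen only for illustration and is not part of this record.

from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Lower-level unified API: one checkpoint name resolves both tokenizer and model.
checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"  # illustrative community model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# High-level API: a pipeline bundles the tokenizer and pretrained model behind one call.
classifier = pipeline("sentiment-analysis", model=model, tokenizer=tokenizer)
print(classifier("Transformers makes state-of-the-art NLP easy to use."))
# Expected output shape: [{'label': 'POSITIVE', 'score': 0.99...}]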
Pages: 38 - 45
Page count: 8
Related papers (50 in total)
  • [1] DEEP LEARNING IN NATURAL LANGUAGE PROCESSING: A STATE-OF-THE-ART SURVEY
    Chai, Junyi
    Li, Anming
    [J]. PROCEEDINGS OF 2019 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS (ICMLC), 2019, : 535 - 540
  • [2] The fusion of fuzzy theories and natural language processing: A state-of-the-art survey
    Liu, Ming
    Zhang, Hongjun
    Xu, Zeshui
    Ding, Kun
    [J]. APPLIED SOFT COMPUTING, 2024, 162
  • [3] Assessing receptive vocabulary using state-of-the-art natural language processing techniques
    Crossley, Scott
    Holmes, Langdon
    [J]. JOURNAL OF SECOND LANGUAGE STUDIES, 2023, 6 (01) : 1 - 28
  • [4] Asian language processing: current state-of-the-art
    Huang, Chu-Ren
    Tokunaga, Takenobu
    Lee, Sophia Yat Mei
    [J]. LANGUAGE RESOURCES AND EVALUATION, 2006, 40 (3-4) : 203 - 218
  • [6] "Transforming" Personality Scale Development: Illustrating the Potential of State-of-the-Art Natural Language Processing
    Fyffe, Shea
    Lee, Philseok
    Kaplan, Seth
    [J]. ORGANIZATIONAL RESEARCH METHODS, 2024, 27 (02) : 265 - 300
  • [6] Connectionist natural language processing: The state of the art
    Christiansen, MH
    Chater, N
    [J]. COGNITIVE SCIENCE, 1999, 23 (04) : 417 - 437
  • [7] Natural language processing with transformers: a review
    Tucudean, Georgiana
    Bucos, Marian
    Dragulescu, Bogdan
    Caleanu, Catalin Daniel
    [J]. PEERJ COMPUTER SCIENCE, 2024, 10
  • [8] State-of-the-art trends in distribution transformers perfection
    Puylo, G. V.
    Kuzmenko, I. S.
    Tongalyuk, V. V.
    [J]. ELECTRICAL ENGINEERING & ELECTROMECHANICS, 2008, (02) : 48+
  • [9] Pattern Recognition and Natural Language Processing: State of the Art
    Kocaleva, Mirjana
    Stojanov, Done
    Stojanovik, Igor
    Zdravev, Zoran
    [J]. TEM JOURNAL-TECHNOLOGY EDUCATION MANAGEMENT INFORMATICS, 2016, 5 (02): : 236 - 240