共 15 条
- [4] Pre-trained models for natural language processing: A survey[J]. QIU XiPeng,SUN TianXiang,XU YiGe,SHAO YunFan,DAI Ning,HUANG XuanJing.Science China(Technological Sciences). 2020(10)
- [5] 自然语言处理[M]. - 电子工业出版社 , 车万翔, 2021
- [6] Pre-Trained Models: Past, Present and Future[J] . Han Xu,Zhang Zhengyan,Ding Ning,Gu Yuxian,Liu Xiao,Huo Yuqi,Qiu Jiezhong,Zhang Liang,Han Wentao,Huang Minlie,Jin Qin,Lan Yanyan,Liu Yang,Liu Zhiyuan,Lu Zhiwu,Qiu Xipeng,Song Ruihua,Tang Jie,Zhu Jun.AI Open . 2021 (prep)
- [7] Lawformer: A pre-trained language model for Chinese legal long documents[J] . Xiao Chaojun,Hu Xueyu,Liu Zhiyuan,Tu Cunchao,Sun Maosong.AI Open . 2021
- [8] BioBERT: a pre-trained biomedical language representation model for biomedical text mining[J] . Lee Jinhyuk,Yoon Wonjin,Kim Sungdong,Kim Donghyeon,Kim Sunkyu,So Chan Ho,Kang Jaewoo.Bioinformatics (Oxford, England) . 2020 (4)
- [9] Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context[J] . Zihang Dai,Zhilin Yang,Yiming Yang,Jaime G. Carbonell,Quoc V. Le,Ruslan Salakhutdinov.CoRR . 2019
- [10] RoBERTa: A Robustly Optimized BERT Pretraining Approach[J] . Yinhan Liu,Myle Ott,Naman Goyal,Jingfei Du,Mandar Joshi,Danqi Chen,Omer Levy,Mike Lewis,Luke Zettlemoyer,Veselin Stoyanov.CoRR . 2019