A novel fusion framework for sequential data using pre-trained model

Cited by: 0
Authors:
Ruan, Tao [1 ]
Jin, Canghong [2 ]
Xu, Lei [1 ]
Ding, Jianchao [3 ]
Ying, Shengyu [4 ]
Wu, Minghui [2 ]
Li, Huanqiang [1 ]
Affiliations:
[1] Zhejiang Institute of Transportation Co. Ltd., Hangzhou 310028, China
[2] Zhejiang University City College, China
[3] Zhejiang Provincial Public Security Department, Expressway Traffic Police Corps, China
[4] Zhejiang University, China
Keywords: none listed
DOI: not available
Pages: 593 - 598
Related papers (50 in total):
  • [41] MULTILINGUAL TEXT CLASSIFIER USING PRE-TRAINED UNIVERSAL SENTENCE ENCODER MODEL
    Orlovskiy, O. V.
    Sohrab, Khalili
    Ostapov, S. E.
    Hazdyuk, K. P.
    Shumylyak, L. M.
    RADIO ELECTRONICS COMPUTER SCIENCE CONTROL, 2022, (03) : 102 - 108
  • [42] Classification of Indian Dance Forms using Pre-Trained Model-VGG
    Biswas, Snigdha
    Ghildiyal, Anirudh
    Sharma, Sachin
    2021 SIXTH INTERNATIONAL CONFERENCE ON WIRELESS COMMUNICATIONS, SIGNAL PROCESSING AND NETWORKING (WISPNET), 2021, : 278 - 282
  • [43] Pre-trained Language Model for Biomedical Question Answering
    Yoon, Wonjin
    Lee, Jinhyuk
    Kim, Donghyeon
    Jeong, Minbyul
    Kang, Jaewoo
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2019, PT II, 2020, 1168 : 727 - 740
  • [44] BERTweet: A pre-trained language model for English Tweets
    Dat Quoc Nguyen
    Thanh Vu
    Anh Tuan Nguyen
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING: SYSTEM DEMONSTRATIONS, 2020, : 9 - 14
  • [45] CommitBERT: Commit Message Generation Using Pre-Trained Programming Language Model
    Jung, Tae-Hwan
    NLP4PROG 2021: THE 1ST WORKSHOP ON NATURAL LANGUAGE PROCESSING FOR PROGRAMMING (NLP4PROG 2021), 2021, : 26 - 33
  • [46] Tuning Pre-trained Model via Moment Probing
    Gao, Mingze
    Wang, Qilong
    Lin, Zhenyi
    Zhu, Pengfei
    Hu, Qinghua
    Zhou, Jingbo
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 11769 - 11779
  • [47] Chinese Grammatical Correction Using BERT-based Pre-trained Model
    Wang, Hongfei
    Kurosawa, Michiki
    Katsumata, Satoru
    Komachi, Mamoru
    1ST CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 10TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (AACL-IJCNLP 2020), 2020, : 163 - 168
  • [48] Vietnamese Sentence Paraphrase Identification using Pre-trained Model and Linguistic Knowledge
    Dien Dinh
    Nguyen Le Thanh
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12 (08) : 796 - 806
  • [49] Session Search with Pre-trained Graph Classification Model
    Ma, Shengjie
    Chen, Chong
    Mao, Jiaxin
    Tian, Qi
    Jiang, Xuhui
    PROCEEDINGS OF THE 46TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2023, 2023, : 953 - 962
  • [50] Online Active Model Selection for Pre-trained Classifiers
    Karimi, Mohammad Reza
    Guerel, Nezihe Merve
    Karlas, Bojan
    Rausch, Johannes
    Zhang, Ce
    Krause, Andreas
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130 : 307 - +