BERT-Caps: A Transformer-Based Capsule Network for Tweet Act Classification

Cited by: 29
Authors
Saha, Tulika [1 ]
Ramesh Jayashree, Srivatsa [1 ]
Saha, Sriparna [1 ]
Bhattacharyya, Pushpak [1 ]
Affiliations
[1] IIT Patna, Dept Comp Sci & Engn, Patna 801106, Bihar, India
Keywords
Twitter; Task analysis; Bit error rate; Hidden Markov models; Pragmatics; Speech recognition; Bidirectional Encoder Representations from Transformers (BERT); capsule networks; speech acts;
DOI
10.1109/TCSS.2020.3014128
CLC Number
TP3 [computing technology; computer technology]
Subject Classification Code
0812
Abstract
Identification of speech acts provides essential cues for understanding the pragmatics of a user utterance. It typically helps in comprehending the communicative intention of a speaker. This holds true for conversations or discussions on any forum, including social media platforms such as Twitter. This article presents a novel tweet act classifier (speech act classifier for Twitter) for assessing the content and intent of tweets, thereby exploring the valuable communication among tweeters. Building on the recent success of Bidirectional Encoder Representations from Transformers (BERT), a language representation model that provides pretrained deep bidirectional representations learned from vast unlabeled data, we introduce BERT-Caps, which is built on top of BERT. The proposed model learns traits and attributes by leveraging the joint optimization of features from the BERT and capsule layers to develop a robust classifier for the task. Some Twitter-specific symbols are also included in the model to observe their influence and importance. The proposed model attained a benchmark accuracy of 77.52% and outperformed several strong baselines and state-of-the-art approaches.
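The abstract pairs BERT features with a capsule layer. As a rough illustration of the capsule side only, here is a minimal NumPy sketch of dynamic routing between capsules (in the style of standard capsule networks); the dimensions, the random stand-in for BERT token features, and all names below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Capsule non-linearity: shrinks short vectors toward 0, long ones toward unit length.
    sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq / (1.0 + sq)) * s / np.sqrt(sq + eps)

def dynamic_routing(u_hat, iterations=3):
    # u_hat: (num_in, num_out, dim_out) prediction vectors from lower-level capsules.
    b = np.zeros(u_hat.shape[:2])                             # routing logits
    for _ in range(iterations):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # coupling coefficients
        s = np.einsum("ij,ijd->jd", c, u_hat)                 # weighted sum per class capsule
        v = squash(s)                                         # (num_out, dim_out)
        b = b + np.einsum("ijd,jd->ij", u_hat, v)             # agreement update
    return v

# Random stand-in for BERT token features already projected into capsule predictions:
# 16 token capsules routed to 7 class capsules of dimension 8 (all sizes hypothetical).
rng = np.random.default_rng(0)
u_hat = rng.normal(size=(16, 7, 8))
v = dynamic_routing(u_hat)
probs = np.linalg.norm(v, axis=-1)  # capsule length read as class confidence
```

In a full model, `u_hat` would come from learned transformation matrices applied to BERT's token representations, and the capsule lengths would feed the classification loss; here the routing loop alone is shown.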
Pages: 1168-1179 (12 pages)