BERT-Caps: A Transformer-Based Capsule Network for Tweet Act Classification

Cited by: 29
Authors
Saha, Tulika [1 ]
Ramesh Jayashree, Srivatsa [1 ]
Saha, Sriparna [1 ]
Bhattacharyya, Pushpak [1 ]
Institution
[1] IIT Patna, Dept Comp Sci & Engn, Patna 801106, Bihar, India
Keywords
Twitter; Task analysis; Bit error rate; Hidden Markov models; Pragmatics; Speech recognition; Bidirectional Encoder Representations from Transformers (BERT); capsule networks; speech acts;
DOI
10.1109/TCSS.2020.3014128
CLC number
TP3 [Computing technology, computer technology];
Subject classification code
0812 ;
Abstract
Identification of speech acts provides essential cues for understanding the pragmatics of a user utterance and typically helps in comprehending the communicative intention of a speaker. This holds true for conversations or discussions on any forum, including social media platforms such as Twitter. This article presents a novel tweet act classifier (speech act classifier for Twitter) for assessing the content and intent of tweets, thereby exploring the valuable communication among tweeters. Building on the recent success of Bidirectional Encoder Representations from Transformers (BERT), a language representation model that provides pretrained deep bidirectional representations learned from vast unlabeled data, we introduce BERT-Caps, which is built on top of BERT. The proposed model learns traits and attributes by jointly optimizing features from the BERT and capsule layers to develop a robust classifier for the task. Twitter-specific symbols are also included in the model to observe their influence and importance. The proposed model attained a benchmark accuracy of 77.52% and outperformed several strong baselines and state-of-the-art approaches.
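The record contains no code, but the abstract's core idea of feeding BERT token representations into a capsule layer can be sketched with the standard dynamic routing-by-agreement mechanism of capsule networks. Everything below is a hypothetical illustration, not the authors' implementation: the random array stands in for BERT token embeddings, and all dimensions (`n_classes`, `dim_out`, `n_iters`) are made-up placeholders.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    # Capsule nonlinearity: shrinks a vector's norm into [0, 1)
    # while preserving its direction.
    sq = np.sum(v ** 2, axis=axis, keepdims=True)
    return (sq / (1.0 + sq)) * v / np.sqrt(sq + eps)

def capsule_layer(tokens, n_classes=7, dim_out=16, n_iters=3, seed=0):
    """Route token-level 'primary capsules' (here: stand-ins for BERT
    token embeddings) to class capsules via dynamic routing-by-agreement.
    Dimensions and initialization are illustrative, not from the paper."""
    rng = np.random.default_rng(seed)
    n_tokens, dim_in = tokens.shape
    # Hypothetical per-(token, class) transformation matrices: dim_in -> dim_out
    W = rng.normal(0.0, 0.1, size=(n_tokens, n_classes, dim_out, dim_in))
    # Prediction vectors u_hat[i, j] = W[i, j] @ tokens[i]
    u_hat = np.einsum('ijok,ik->ijo', W, tokens)
    b = np.zeros((n_tokens, n_classes))  # routing logits
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # coupling coeffs
        s = np.einsum('ij,ijo->jo', c, u_hat)  # weighted sum per class capsule
        v = squash(s)                          # class capsule outputs
        b = b + np.einsum('ijo,jo->ij', u_hat, v)  # agreement update
    return v  # shape (n_classes, dim_out); ||v_j|| acts like a class score

# Stand-in for a BERT output: 12 tokens with a (toy) hidden size of 32.
tokens = np.random.default_rng(1).normal(size=(12, 32))
caps = capsule_layer(tokens)
print(caps.shape)  # (7, 16)
```

In a full classifier along the lines the abstract describes, the capsule norms would be trained end-to-end together with BERT, so gradients jointly optimize both the transformer features and the routing transformations.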
Pages: 1168 - 1179
Page count: 12