HTNet: A Hybrid Model Boosted by Triple Self-attention for Crowd Counting

Cited: 0
Authors
Li, Yang [1 ]
Yin, Baoqun [1 ]
Affiliations
[1] Univ Sci & Technol China, Hefei, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Crowd Counting; Deep Learning; Self-Attention; Hybrid Model;
DOI
10.1007/978-981-99-8555-5_23
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The rapid development of convolutional neural networks (CNNs) has enabled significant progress in crowd counting research. However, the fixed-size convolutional kernels of traditional methods make it difficult to handle problems such as drastic scale changes and complex background interference. To address these challenges, we propose a hybrid crowd counting model. Firstly, we apply a global self-attention module (GAM) after the CNN backbone to capture wider contextual information. Secondly, because the feature map size is gradually recovered in the decoding stage, a local self-attention module (LAM) is employed there to reduce computational complexity. With this design, the model fuses features from global and local perspectives to better cope with scale changes. Additionally, to establish the interdependence between the spatial and channel dimensions, we further design a novel channel self-attention module (CAM) and combine it with LAM. Finally, we construct a simple yet effective double-head module that outputs a foreground segmentation map in addition to an intermediate density map; the two are multiplied pixel-wise to suppress background interference. Experimental results on several benchmark datasets demonstrate that our method achieves remarkable improvements.
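The abstract outlines the overall pipeline: CNN backbone, global self-attention (GAM), a decoder with local (LAM) and channel (CAM) self-attention, and a double head whose density and foreground outputs are fused pixel-wise. Below is a minimal PyTorch-style sketch of that pipeline for illustration only; the record does not include the authors' implementation, so the backbone, layer sizes, window size, and module internals are assumptions rather than the paper's configuration.

```python
# Hedged sketch of the HTNet-style pipeline described in the abstract.
# All hyperparameters (dim, heads, window, backbone depth) are illustrative assumptions.
import torch
import torch.nn as nn


class GlobalSelfAttention(nn.Module):
    """GAM (sketch): multi-head self-attention over the flattened backbone feature map."""
    def __init__(self, dim, heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):                        # x: (B, C, H, W)
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)    # (B, H*W, C)
        out, _ = self.attn(tokens, tokens, tokens)
        tokens = self.norm(tokens + out)
        return tokens.transpose(1, 2).reshape(b, c, h, w)


class LocalSelfAttention(nn.Module):
    """LAM (sketch): self-attention restricted to non-overlapping windows to cut cost."""
    def __init__(self, dim, heads=4, window=8):
        super().__init__()
        self.window = window
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):                        # (B, C, H, W); H, W divisible by window
        b, c, h, w = x.shape
        s = self.window
        x = x.reshape(b, c, h // s, s, w // s, s)
        x = x.permute(0, 2, 4, 3, 5, 1).reshape(-1, s * s, c)   # each window is a batch item
        out, _ = self.attn(x, x, x)
        out = out.reshape(b, h // s, w // s, s, s, c)
        return out.permute(0, 5, 1, 3, 2, 4).reshape(b, c, h, w)


class ChannelSelfAttention(nn.Module):
    """CAM (sketch): attention computed across channels, i.e. channels act as tokens."""
    def __init__(self, dim):
        super().__init__()
        self.scale = dim ** -0.5

    def forward(self, x):                        # (B, C, H, W)
        b, c, h, w = x.shape
        f = x.flatten(2)                         # (B, C, H*W)
        attn = torch.softmax(f @ f.transpose(1, 2) * self.scale, dim=-1)  # (B, C, C)
        return (attn @ f).reshape(b, c, h, w) + x


class HTNetSketch(nn.Module):
    def __init__(self, dim=256):
        super().__init__()
        # Placeholder backbone; the paper presumably uses a pretrained CNN.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, dim, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.gam = GlobalSelfAttention(dim)
        self.decoder = nn.Sequential(LocalSelfAttention(dim), ChannelSelfAttention(dim))
        self.density_head = nn.Conv2d(dim, 1, 1)                            # intermediate density map
        self.segment_head = nn.Sequential(nn.Conv2d(dim, 1, 1), nn.Sigmoid())  # foreground mask

    def forward(self, img):
        feat = self.backbone(img)
        feat = self.gam(feat)
        feat = self.decoder(feat)
        density = torch.relu(self.density_head(feat))
        mask = self.segment_head(feat)
        return density * mask                    # pixel-wise suppression of background


if __name__ == "__main__":
    model = HTNetSketch()
    out = model(torch.randn(1, 3, 256, 256))     # 256 / 8 = 32, divisible by the window size
    print(out.shape)                             # torch.Size([1, 1, 32, 32])
```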
Pages: 290 - 301
Page count: 12
Related Papers
50 records in total
  • [21] Hybrid self-attention NEAT: a novel evolutionary self-attention approach to improve the NEAT algorithm in high dimensional inputs
    Khamesian, Saman
    Malek, Hamed
    EVOLVING SYSTEMS, 2024, 15 (02) : 489 - 503
  • [23] GSAP: A Hybrid GRU and Self-Attention Based Model for Dual Medical NLP Tasks
    Liu, Huey-Ing
    Chen, Meng-Wei
    Kao, Wei-Chun
    Yeh, Yao-Wen
    Yang, Cheng-Xuan
    2022-14TH INTERNATIONAL CONFERENCE ON KNOWLEDGE AND SMART TECHNOLOGY (KST 2022), 2022, : 80 - 85
  • [24] Protein–protein interaction site prediction by model ensembling with hybrid feature and self-attention
    Hanhan Cong
    Hong Liu
    Yi Cao
    Cheng Liang
    Yuehui Chen
    BMC Bioinformatics, 24
  • [26] A self-attention hybrid emoji prediction model for code-mixed language: (Hinglish)
    Himabindu, Gadde Satya Sai Naga
    Rao, Rajat
    Sethia, Divyashikha
    SOCIAL NETWORK ANALYSIS AND MINING, 2022, 12 (01)
  • [27] DEEPCHORUS: A HYBRID MODEL OF MULTI-SCALE CONVOLUTION AND SELF-ATTENTION FOR CHORUS DETECTION
    He, Qiqi
    Sun, Xiaoheng
    Yu, Yi
    Li, Wei
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 411 - 415
  • [28] Hybrid LSTM Self-Attention Mechanism Model for Forecasting the Reform of Scientific Research in Morocco
    Fahim, Asmaa
    Tan, Qingmei
    Mazzi, Mouna
    Sahabuddin, Md
    Naz, Bushra
    Ullah Bazai, Sibghat
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2021, 2021
  • [29] Reinforced Self-Attention Network: a Hybrid of Hard and Soft Attention for Sequence Modeling
    Shen, Tao
    Zhou, Tianyi
    Long, Guodong
    Jiang, Jing
    Wang, Sen
    Zhang, Chengqi
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 4345 - 4352
  • [30] Hierarchical Self-Attention Hybrid Sparse Networks for Document Classification
    Huang, Weichun
    Tao, Ziqiang
    Huang, Xiaohui
    Xiong, Liyan
    Yu, Jia
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2021, 2021