Chinese traditional painting style automatic classification based on dual-channel feature fusion with multi-attention mechanism

Cited: 0
|
Authors
Liu, Yunzhu [1 ]
Wu, Lei [1 ]
Affiliations
[1] Shandong Normal Univ, Fac Fine Arts, Jinan 250014, Shandong, Peoples R China
Keywords
Image classification; Chinese traditional painting; attention mechanism; Swin-Transformer;
DOI
10.1142/S1793962324500387
CLC Classification Number
TP301 [Theory, Methods];
Discipline Code
081202 ;
Abstract
Existing classification models for traditional Chinese paintings mostly ignore shallow detail features, which leads to imprecise style classification. To address this problem, this paper proposes an automatic classification model for traditional Chinese painting styles based on dual-channel feature fusion with a multi-attention mechanism. First, a spatial attention mechanism is introduced to enhance the Swin-Transformer framework and capture the salient features of ancient Chinese painting images. Second, a dual-channel attention mechanism is constructed to extract both global semantic features and local features of these images. Finally, the extracted features are fused and classified with a softmax classifier. To verify the feasibility and validity of the proposed model, this paper performs simulations on a Chinese painting dataset and compares the results with existing algorithms. The average classification accuracy of the proposed model is 90.6%, an improvement of 3.14%, and it outperforms existing models in both visual effects and objective data comparisons.
Pages: 17
Related Papers
50 records
  • [31] LPI Radar Signal Recognition Based on Dual-Channel CNN and Feature Fusion
    Quan, Daying
    Tang, Zeyu
    Wang, Xiaofeng
    Zhai, Wenchao
    Qu, Chongxiao
    SYMMETRY-BASEL, 2022, 14 (03):
  • [32] Tool wear prediction based on parallel dual-channel adaptive feature fusion
    Yang, Jinfei
    Wu, Jinxin
    Li, Xianwang
    Qin, Xuejing
    INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2023, 128 (1-2): : 145 - 165
  • [33] Bearing Fault Diagnosis Method Based on Attention Mechanism and Multi-Channel Feature Fusion
    Gao, Hongfeng
    Ma, Jie
    Zhang, Zhonghang
    Cai, Chaozhi
    IEEE ACCESS, 2024, 12 : 45011 - 45025
  • [34] Study on classification of attention deficit hyperactivity disorder based on dual-channel autoencoder
    Zhu, Saisai
    Sun, Daoqing
    Liu, Zifan
    Zhao, Shanhui
    Yang, Yang
    Chen, Fulong
    JOURNAL OF ELECTRONIC IMAGING, 2024, 33 (02)
  • [35] A Baseline Similarity Attention-Based Dual-Channel Feature Fusion Network for Machine Remaining Useful Life Prediction
    Hu, Yawei
    Li, Xuanlin
    Wang, Hang
    Liu, Yongbin
    Liu, Xianzeng
    Cao, Zheng
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73 : 1 - 12
  • [36] Multi-attention network for pedestrian intention prediction based on spatio-temporal feature fusion
    Zhang, Xiaofei
    Wang, Xiaolan
    Zhang, Weiwei
    Wang, Yansong
    Liu, Xintian
    Wei, Dan
    PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART D-JOURNAL OF AUTOMOBILE ENGINEERING, 2023, 238 (13) : 4202 - 4215
  • [37] Glaucoma multi-classification using the novel syndrome mechanism-based dual-channel network
    Fang, Lingling
    Qiao, Huan
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 86
  • [38] Semantic segmentation of remote sensing images based on dual-channel attention mechanism
    Jiang, Jionghui
    Feng, Xi'an
    Huang, Hui
    IET IMAGE PROCESSING, 2024, 18 (09) : 2346 - 2356
  • [39] MM-UNet: Multi-attention mechanism and multi-scale feature fusion UNet for tumor image segmentation
    Xing, Yaozheng
    Yuan, Jie
    Liu, Qixun
    Peng, Shihao
    Yan, Yan
    Yao, Junyi
    2023 2ND ASIA CONFERENCE ON ALGORITHMS, COMPUTING AND MACHINE LEARNING, CACML 2023, 2023, : 253 - 257
  • [40] Hyperspectral Image Classification Based on Multi-attention Mechanism and Compiled Graph Neural Networks
    Jie S.
    Jing Y.
    Shujie D.
    Shaobo L.
    Jianjun H.
    Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2024, 55 (03): : 183 - 192, 212