CephaNN: A Multi-Head Attention Network for Cephalometric Landmark Detection

Cited by: 27
Authors:
Qian, Jiahong [1]
Luo, Weizhi [1]
Cheng, Ming [1]
Tao, Yubo [1,2]
Lin, Jun [3]
Lin, Hai [1,2]
Affiliations:
[1] Zhejiang Univ, State Key Lab CAD&CG, Hangzhou 310058, Peoples R China
[2] Zhejiang Univ, Innovat Ctr Minimally Invas Tech & Device, Hangzhou 310058, Peoples R China
[3] Zhejiang Univ, Coll Med, Affiliated Hosp 1, Dept Stomatol, Hangzhou 310058, Peoples R China
Funding:
National Natural Science Foundation of China
Keywords:
Heating systems; Neural networks; Kernel; Feature extraction; Annotations; Two dimensional displays; Deep learning; Cephalometric landmark detection; multi-head attention; neural network; intermediate supervision; region enhance;
DOI:
10.1109/ACCESS.2020.3002939
Chinese Library Classification (CLC):
TP [Automation and Computer Technology]
Discipline Code:
0812
Abstract:
Cephalometric landmark detection is a crucial step in orthodontic and orthognathic treatments. To detect cephalometric landmarks accurately, we propose a novel multi-head attention neural network (CephaNN). CephaNN is an end-to-end network based on heatmaps of the annotated landmarks, and it consists of two parts: the multi-head part and the attention part. In the multi-head part, we adopt multi-head subnets to gain comprehensive knowledge of various subspaces of a cephalogram, and intermediate supervision is applied to accelerate convergence. Based on the feature maps learned by the multi-head part, the attention part applies the multi-attention mechanism to obtain a refined detection. To address the class-imbalance problem, we propose a region enhancing (RE) loss that emphasizes the effective regions of the regressed heatmaps. Experiments on the benchmark dataset demonstrate that CephaNN achieves state-of-the-art performance, with a detection accuracy of 87.61% within the clinically accepted 2.0-mm range. Furthermore, CephaNN is effective in classifying anatomical types and robust in a real-world application on a 75-landmark dataset.
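The abstract outlines the full pipeline: multi-head subnets regress landmark heatmaps under intermediate supervision, an attention stage fuses their features into a refined prediction, and an RE loss up-weights the informative heatmap regions. Below is a minimal PyTorch sketch of that flow; the module sizes, the attention formulation, and the exact form of the RE loss (here a threshold-based weighted MSE) are assumptions for illustration, not the paper's published implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class HeadSubnet(nn.Module):
    # One "head" subnet: a small conv stack that regresses per-landmark heatmaps.
    def __init__(self, in_ch, n_landmarks):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.out = nn.Conv2d(64, n_landmarks, 1)  # one heatmap channel per landmark

    def forward(self, x):
        feat = self.body(x)
        # Return features (for attention fusion) and heatmaps (for intermediate supervision).
        return feat, self.out(feat)


class CephaNNSketch(nn.Module):
    # Multi-head part (intermediately supervised subnets) followed by an
    # attention part that fuses the head features into a refined final prediction.
    def __init__(self, n_heads=4, n_landmarks=19):
        super().__init__()
        self.stem = nn.Conv2d(1, 32, 7, stride=2, padding=3)  # grayscale cephalogram in
        self.heads = nn.ModuleList([HeadSubnet(32, n_landmarks) for _ in range(n_heads)])
        self.attn = nn.Conv2d(64 * n_heads, n_heads, 1)  # per-head spatial attention logits
        self.refine = nn.Conv2d(64, n_landmarks, 3, padding=1)

    def forward(self, x):
        x = F.relu(self.stem(x))
        feats, inter_maps = zip(*[h(x) for h in self.heads])
        weights = torch.softmax(self.attn(torch.cat(feats, dim=1)), dim=1)  # (B, H, h, w)
        stacked = torch.stack(feats, dim=1)                                 # (B, H, 64, h, w)
        fused = (stacked * weights.unsqueeze(2)).sum(dim=1)  # attention-weighted fusion
        return list(inter_maps), self.refine(fused)


def re_loss(pred, target, enhance=10.0, thresh=0.1):
    # Assumed form of the region-enhancing loss: an MSE whose weight is boosted
    # inside high-response ground-truth regions, so the few foreground pixels
    # are not swamped by the near-zero background (the class-imbalance problem).
    w = torch.where(target > thresh, torch.full_like(target, enhance), torch.ones_like(target))
    return (w * (pred - target) ** 2).mean()


model = CephaNNSketch()
img = torch.randn(2, 1, 256, 256)   # batch of cephalograms
gt = torch.rand(2, 19, 128, 128)    # ground-truth Gaussian heatmaps (stem halves resolution)
inter, final = model(img)
loss = re_loss(final, gt) + sum(re_loss(m, gt) for m in inter)  # final + intermediate supervision
loss.backward()

The n_landmarks=19 default mirrors the public benchmark configuration; the 75-landmark setting mentioned in the abstract would only change that parameter.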
Pages: 112633-112641
Page count: 9
Related Papers
50 in total
  • [41] A Hierarchical Structured Multi-Head Attention Network for Multi-Turn Response Generation
    Lin, Fei
    Zhang, Cong
    Liu, Shengqiang
    Ma, Hong
IEEE ACCESS, 2020, 8: 46802-46810
  • [42] Self Multi-Head Attention for Speaker Recognition
    India, Miquel
    Safari, Pooyan
    Hernando, Javier
INTERSPEECH 2019, 2019: 4305-4309
  • [43] Attention induced multi-head convolutional neural network for human activity recognition
    Khan, Zanobya N.
    Ahmad, Jamil
    APPLIED SOFT COMPUTING, 2021, 110
  • [44] Multi-Head Attention Affinity Diversity Sharing Network for Facial Expression Recognition
    Zheng, Caixia
    Liu, Jiayu
    Zhao, Wei
    Ge, Yingying
    Chen, Wenhe
    ELECTRONICS, 2024, 13 (22)
  • [45] DOUBLE MULTI-HEAD ATTENTION FOR SPEAKER VERIFICATION
    India, Miquel
    Safari, Pooyan
    Hernando, Javier
2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021: 6144-6148
  • [46] Learning Sentences Similarity By Multi-Head Attention
    Wang, Ming Yang
    Li, Chen Jiang
    Sun, Jian Dong
    Xu, Wei Ran
    Gao, Sheng
    Zhang, Ya Hao
    Wang, Pu
    Li, Jun Liang
PROCEEDINGS OF 2018 INTERNATIONAL CONFERENCE ON NETWORK INFRASTRUCTURE AND DIGITAL CONTENT (IEEE IC-NIDC), 2018: 16-19
  • [47] VIDEO SUMMARIZATION WITH ANCHORS AND MULTI-HEAD ATTENTION
    Sung, Yi-Lin
    Hong, Cheng-Yao
    Hsu, Yen-Chi
    Liu, Tyng-Luh
2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2020: 2396-2400
  • [48] Software and Hardware Fusion Multi-Head Attention
    Hu, Wei
    Xu, Dian
    Liu, Fang
    Fan, Zimeng
KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2022, PT III, 2022, 13370: 644-655
  • [49] Classification of Heads in Multi-head Attention Mechanisms
    Huang, Feihu
    Jiang, Min
    Liu, Fang
    Xu, Dian
    Fan, Zimeng
    Wang, Yonghao
KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2022, PT III, 2022, 13370: 681-692
  • [50] Diversifying Multi-Head Attention in the Transformer Model
    Ampazis, Nicholas
    Sakketou, Flora
MACHINE LEARNING AND KNOWLEDGE EXTRACTION, 2024, 6 (04): 2618-2638