Embedding the Self-Organisation of Deep Feature Maps in the Hamburger Framework can Yield Better and Interpretable Results

Cited by: 0
Authors
Humphreys, Jack [1 ]
Hagenbuchner, Markus [1 ]
Wang, Zhiyong [2 ]
Tsoi, Ah Chung [1 ]
Affiliations
[1] Univ Wollongong, Sch Comp & Informat Technol, Wollongong, NSW, Australia
[2] Univ Sydney, Sch Comp Sci, Sydney, NSW, Australia
Keywords
Neural networks; Self-Organizing Feature Maps; Deep Learning; Supervised Learning; Unsupervised Learning
DOI
10.1109/IJCNN54540.2023.10191477
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Neural networks that explicitly utilise the global correlation structure of their features have become vastly more popular since the Transformer architecture was developed. We propose to embed unsupervised Self-Organising Maps within neural networks as a means to model the global correlation structure. By enforcing topological preservation therein, such a neural network is able to represent more complex correlation structures and to produce interpretable visualisations as a byproduct. We validate this approach by comparing it with the existing state-of-the-art attention substitute within its own 'Hamburger' framework and by illustrating the maps learnt by the module. Overall, this paper serves as a proof of concept for integrating Self-Organising Maps within a supervised network.
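To illustrate the general idea described in the abstract, the following is a minimal, hypothetical sketch (not the authors' implementation) of a Self-Organising Map used as a module inside a PyTorch network: a 2-D grid of prototype vectors is updated in an unsupervised, topology-preserving way during training, and the features are re-encoded through soft assignments to the grid. The grid size, neighbourhood width, learning rate, and the way the SOM output is mixed back into the feature stream are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class SOMLayer(nn.Module):
    """Hypothetical SOM module; grid shape and schedules are assumptions."""

    def __init__(self, feat_dim, grid_h=8, grid_w=8, lr=0.1, sigma=2.0):
        super().__init__()
        # Codebook: one prototype vector per grid cell (updated without backprop).
        self.codebook = nn.Parameter(torch.randn(grid_h * grid_w, feat_dim),
                                     requires_grad=False)
        coords = torch.stack(torch.meshgrid(torch.arange(grid_h),
                                            torch.arange(grid_w),
                                            indexing="ij"), dim=-1)
        self.register_buffer("coords", coords.reshape(-1, 2).float())
        self.lr, self.sigma = lr, sigma

    @torch.no_grad()
    def _som_update(self, x):
        # x: (N, feat_dim). Find the best-matching unit (BMU) for each sample.
        dists = torch.cdist(x, self.codebook)                   # (N, cells)
        bmu = dists.argmin(dim=1)                                # (N,)
        # Gaussian neighbourhood around each BMU on the 2-D grid.
        grid_d = torch.cdist(self.coords[bmu], self.coords)     # (N, cells)
        h = torch.exp(-grid_d.pow(2) / (2 * self.sigma ** 2))   # (N, cells)
        # Batch SOM update: move every prototype toward the inputs,
        # weighted by its neighbourhood response.
        num = h.t() @ x                                          # (cells, feat_dim)
        den = h.sum(dim=0, keepdim=True).t() + 1e-8              # (cells, 1)
        self.codebook.data.lerp_(num / den, self.lr)

    def forward(self, x):
        # x: (batch, tokens, feat_dim); flatten tokens for the SOM.
        flat = x.reshape(-1, x.shape[-1])
        if self.training:
            self._som_update(flat)
        # Soft assignment to the grid gives a topology-aware re-encoding,
        # which can also be visualised as a 2-D map of the learnt prototypes.
        attn = torch.softmax(-torch.cdist(flat, self.codebook), dim=-1)
        out = attn @ self.codebook                               # (N, feat_dim)
        return out.reshape_as(x)
```

In a setup of this kind, the module would sit where a global-context block (such as the Hamburger module) normally sits, and the learnt codebook grid can be plotted directly to obtain the interpretable visualisations mentioned in the abstract.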
Pages: 8