Survey on hypergraph neural networks

Cited by: 0
Authors
Lin J. [1 ,2 ,3 ]
Ye Z. [1 ,2 ]
Zhao H. [1 ,2 ]
Li Z. [1 ,2 ]
Affiliations
[1] College of Computer, Qinghai Normal University, Xining
[2] State Key Laboratory of Tibetan Intelligent Information Processing and Application (Qinghai Normal University), Xining
[3] Department of Information Engineering, Xining Urban Vocational & Technical College, Xining
Keywords
Classification; Graph; Graph neural networks; Hypergraph; Hypergraph neural network;
DOI
10.7544/issn1000-1239.202220483
Abstract
In recent years, graph neural networks, aided by large amounts of data and powerful computing resources, have achieved remarkable results in application fields such as recommendation systems and natural language processing; they mainly handle graph data with pairwise relationships. However, in many real-world networks, such as scientific collaboration networks and protein networks, the relationships between objects are more complex than pairwise. Directly representing such complex relationships as pairwise relations in a graph leads to a loss of information. The hypergraph is a flexible modeling tool that can express higher-order relationships a graph cannot fully describe, compensating for this limitation of graphs. In light of this, scholars have begun to study how to develop neural networks on hypergraphs and have successively proposed many hypergraph neural network models. This paper surveys the existing hypergraph neural network models. Firstly, it comprehensively reviews the development of hypergraph neural networks over the past three years. Secondly, it proposes a new classification scheme based on the design methods of hypergraph neural networks and elaborates on representative models. Then, it introduces the application areas of hypergraph neural networks. Finally, future research directions for hypergraph neural networks are summarized and discussed. © 2024 Science Press. All rights reserved.
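The abstract contrasts pairwise graph edges with hyperedges that can join any number of nodes. As an illustration only (the survey covers many models and does not fix one), the sketch below builds a small node-by-hyperedge incidence matrix and applies the widely used HGNN-style propagation X' = Dv^(-1/2) H W De^(-1) H^T Dv^(-1/2) X Θ; the toy hypergraph and all variable names here are hypothetical.

```python
import numpy as np

# Toy hypergraph: 4 nodes, 2 hyperedges.
# Hyperedge e0 = {v0, v1, v2} (e.g. a three-author paper in a collaboration
# network), e1 = {v2, v3}. A plain graph would have to expand e0 into three
# pairwise edges, losing the fact that all three nodes share one relation.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)          # incidence matrix (nodes x hyperedges)

w = np.array([1.0, 1.0])                     # hyperedge weights
Dv = H @ w                                   # node degrees: [1, 1, 2, 1]
De = H.sum(axis=0)                           # hyperedge degrees: [3, 2]
Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))

# Normalized propagation operator Dv^(-1/2) H W De^(-1) H^T Dv^(-1/2)
P = Dv_inv_sqrt @ H @ np.diag(w) @ np.diag(1.0 / De) @ H.T @ Dv_inv_sqrt

X = np.eye(4)                                # toy node features
Theta = np.random.default_rng(0).normal(size=(4, 2))  # "learnable" weights
X_out = P @ X @ Theta                        # one hypergraph convolution layer
```

Because the weight and degree matrices are diagonal, P is symmetric, so information flows between all nodes sharing a hyperedge in a single layer, which is exactly the higher-order coupling the abstract says pairwise graphs cannot express.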
Pages: 362-384
Page count: 22
Related references
151 in total
  • [1] Krizhevsky A., Sutskever I., Hinton G.E., ImageNet classification with deep convolutional neural networks[J], Communications of The ACM, 60, 6, pp. 84-90, (2012)
  • [2] Elman J.L., Distributed representations, simple recurrent networks, and grammatical structure[J], Machine Learning, 7, 2, pp. 195-225, (1991)
  • [3] LeCun Y., Bengio Y., Hinton G., Deep learning[J], Nature, 521, 7553, pp. 436-444, (2015)
  • [4] Hochreiter S., Schmidhuber J., Long short-term memory[J], Neural Computation, 9, 8, pp. 1735-1780, (1997)
  • [5] Chung J., Gulcehre C., Cho K., Et al., Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling[J], (2014)
  • [6] Bingbing X., Keting C., Junjie H., Et al., A survey on graph convolutional neural network[J], Chinese Journal of Computers, 43, 5, pp. 755-780, (2020)
  • [7] Bruna J., Zaremba W., Szlam A., Et al., Spectral Networks and Locally Connected Networks on Graphs[J], (2014)
  • [8] Shuai M., Jianwei L., Xin Z., Survey on graph neural network[J], Journal of Computer Research and Development, 59, 1, pp. 47-80, (2022)
  • [9] Han L., Mingyu Y., Zhengyang L., Et al., Survey on graph neural network acceleration architectures[J], Journal of Computer Research and Development, 58, 6, pp. 1204-1229, (2021)
  • [10] Velickovic P., Cucurull G., Casanova A., Et al., Graph Attention Networks[J], (2017)