Graph manifold learning with non-gradient decision layer

Cited by: 0
Authors
Jiao, Ziheng [1 ,2 ,3 ]
Zhang, Hongyuan [1 ,2 ,3 ]
Zhang, Rui [2 ,3 ]
Li, Xuelong [2 ,3 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian, Peoples R China
[2] Northwestern Polytech Univ, Sch Artificial Intelligence Opt & Elect iOPEN, Xian, Peoples R China
[3] Northwestern Polytech Univ, Key Lab Intelligent Interact & Applicat, Minist Ind & Informat Technol, Xian, Peoples R China
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Graph convolution network; Manifold learning; Analytical solution; DIMENSIONALITY REDUCTION; NEURAL-NETWORK; CLASSIFICATION; REGRESSION; FRAMEWORK;
DOI
10.1016/j.neucom.2024.127390
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Generally, a graph convolution network (GCN) uses graph convolution operators to extract deep representations and a softmax layer to make predictions. Although GCN successfully captures the connectivity relationships among nodes by aggregating information over the graph, the softmax-based decision layer may yield suboptimal performance in semi-supervised learning with scarce labels, since it ignores the inner distribution of the graph nodes. Besides, gradient descent typically requires thousands of iterations to converge. To address these issues, we propose a novel graph deep model with a non-gradient decision layer for graph mining. First, manifold learning is unified with label local-structure preservation to capture the topological information and make accurate predictions with limited label support. Moreover, this layer is theoretically proven to admit an analytical solution, so it acts as a non-gradient decision layer in the graph convolution network. In particular, a joint optimization method is designed for this graph model, which greatly accelerates convergence. Finally, extensive experiments show that the proposed model achieves excellent performance compared with current models.
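The core idea in the abstract — replace the gradient-trained softmax decision layer with one solved in closed form on top of graph-propagated features — can be sketched as follows. This is not the authors' exact formulation (their layer unifies manifold learning with label local-structure preservation); here the analytical decision layer is illustrated with ridge regression on the labeled nodes, and the graph, `reg`, and the propagation depth `k` are all hypothetical choices for the toy example.

```python
import numpy as np

def normalize_adj(A):
    # Symmetrically normalized adjacency with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_propagate(A, X, k=2):
    # k rounds of parameter-free feature smoothing over the graph
    S = normalize_adj(A)
    H = X.copy()
    for _ in range(k):
        H = S @ H
    return H

def analytic_decision_layer(H, Y, labeled_idx, reg=1e-2):
    # Closed-form decision layer: instead of training a softmax by gradient
    # descent, solve W = (H_l^T H_l + reg * I)^{-1} H_l^T Y_l analytically
    # on the labeled nodes, then score every node.
    H_l, Y_l = H[labeled_idx], Y[labeled_idx]
    W = np.linalg.solve(H_l.T @ H_l + reg * np.eye(H.shape[1]), H_l.T @ Y_l)
    return H @ W

# Toy graph: two triangles (0-1-2 and 3-4-5) joined by the edge 2-3.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
X = np.eye(6)                # identity features (one-hot per node)
Y = np.zeros((6, 2))
Y[0, 0] = Y[5, 1] = 1.0      # a single labeled node per class
scores = analytic_decision_layer(gcn_propagate(A, X), Y, [0, 5])
pred = scores.argmax(axis=1)
```

The single linear solve replaces the thousands of gradient iterations a softmax head would need, which is the practical point of a non-gradient decision layer.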
Pages: 10