Graph Evolving and Embedding in Transformer

Cited by: 0
Authors
Chien, Jen-Tzung [1 ]
Tsao, Chia-Wei [1 ]
Affiliations
[1] Natl Yang Ming Chiao Tung Univ, Inst Elect & Comp Engn, Hsinchu, Taiwan
Keywords
DOI
Not available
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
This paper presents a novel graph representation that tightly integrates two information sources, the node embedding matrix and the weight matrix, in graph representation learning. A new parameter-updating method is proposed to dynamically represent the graph network using a specialized transformer. This graph-evolved-and-embedded transformer is built from the weights and node embeddings of graph-structured data, yielding an attention-based graph learning machine. In the proposed method, each transformer layer is composed of two attention layers. The first layer calculates the weight matrix of the graph convolutional network together with the self-attention within that matrix itself. The second layer estimates the node embedding and weight matrix together with the cross-attention between them. Graph representation learning is enhanced by these two attention layers. Experiments on three financial prediction tasks demonstrate that this transformer captures temporal information and improves the F1 score and the mean reciprocal rank.
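The two-attention-layer structure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the weight matrix `W` and the node embedding matrix `X` are both row-wise feature matrices of the same shape, and uses plain scaled dot-product attention (without learned projections) for the self-attention within `W` and the cross-attention between `X` and `W`.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Scaled dot-product attention: softmax(QK^T / sqrt(d)) V.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
n, d = 5, 8                       # n nodes, feature dimension d (illustrative)
W = rng.standard_normal((n, d))   # weight-matrix features (hypothetical stand-in)
X = rng.standard_normal((n, d))   # node embeddings (hypothetical stand-in)

# Attention layer 1: self-attention within the weight matrix itself.
W_out = attention(W, W, W)

# Attention layer 2: cross-attention between node embeddings and the weights.
X_out = attention(X, W_out, W_out)

print(W_out.shape, X_out.shape)
```

Both outputs keep the (n, d) shape, so the pair of attention layers can be stacked per transformer layer, as the abstract describes.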
Pages: 538-545
Page count: 8
Related papers
50 items in total
  • [1] TGformer: A Graph Transformer Framework for Knowledge Graph Embedding
    Shi, Fobo
    Li, Duantengchuan
    Wang, Xiaoguang
    Li, Bing
    Wu, Xindong
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2025, 37 (01) : 526 - 541
  • [2] AirObject: A Temporally Evolving Graph Embedding for Object Identification
    Keetha, Nikhil Varma
    Wang, Chen
    Qiu, Yuheng
    Xu, Kuan
    Scherer, Sebastian
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 8397 - 8406
  • [3] Evolving graph convolutional network with transformer for CT segmentation
    Cui, Hui
    Jin, Qiangguo
    Wu, Xixi
    Wang, Linlin
    Zhang, Tiangang
    Nakaguchi, Toshiya
    Xuan, Ping
    Feng, David Dagan
    APPLIED SOFT COMPUTING, 2024, 165
  • [4] Graph-Level Embedding for Time-Evolving Graphs
    Wang, Lili
    Huang, Chenghan
    Ma, Weicheng
    Cao, Xinyuan
    Vosoughi, Soroush
    COMPANION OF THE WORLD WIDE WEB CONFERENCE, WWW 2023, 2023, : 5 - 8
  • [5] Position-Aware Relational Transformer for Knowledge Graph Embedding
    Li, Guangyao
    Sun, Zequn
    Hu, Wei
    Cheng, Gong
    Qu, Yuzhong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (08) : 11580 - 11594
  • [6] LAPTRAN: TRANSFORMER EMBEDDING GRAPH LAPLACIAN FOR POINT CLOUD PART SEGMENTATION
    Li, Abiao
    Lv, Chenlei
    Fang, Yuming
    Zuo, Yifan
    2023 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2023, : 3070 - 3074
  • [7] Deep learning for patent landscaping using transformer and graph embedding
    Choi, Seokkyu
    Lee, Hyeonju
    Park, Eunjeong
    Choi, Sungchul
    TECHNOLOGICAL FORECASTING AND SOCIAL CHANGE, 2022, 175
  • [8] Dynamic-GTN: Learning an Node Efficient Embedding in Dynamic Graph with Transformer
    Hoang, Thi-Linh
    Ta, Viet-Cuong
    PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT II, 2022, 13630 : 430 - 443
  • [9] Molecular representation contrastive learning via transformer embedding to graph neural networks
    Liu, Yunwu
    Zhang, Ruisheng
    Li, Tongfeng
    Jiang, Jing
    Ma, Jun
    Yuan, Yongna
    Wang, Ping
    APPLIED SOFT COMPUTING, 2024, 164
  • [10] MKGViLT: visual-and-language transformer based on medical knowledge graph embedding
    CUI Wencheng
    SHI Wentao
    SHAO Hong
    High Technology Letters, 2025, 31 (01) : 73 - 85