DeviceGPT: A Generative Pre-Training Transformer on the Heterogenous Graph for Internet of Things

Cited by: 0
Authors
Ren, Yimo [1 ,2 ]
Wang, Jinfa [1 ,2 ]
Li, Hong [1 ,2 ]
Zhu, Hongsong [1 ,2 ]
Sun, Limin [1 ,2 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
[2] Chinese Acad Sci, Inst Informat Engn, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Internet of Things; Pre-training; Self-Supervised; Graph Representation Learning;
DOI
10.1145/3539618.3591972
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Recently, graph neural networks (GNNs) have been adopted to model a wide range of structured data in both academia and industry. With the rapid development of Internet technology, applications for Internet devices, including device identification, geolocation, and others, are increasingly important, yet their performance still needs improvement. Building on the reported successes of GNNs, this paper proposes DeviceGPT, a generative pre-training transformer on a heterogeneous graph that is trained via self-supervised learning to capture interaction-rich device information from large-scale databases. Experiments on a dataset constructed from real-world data show that DeviceGPT achieves competitive results across multiple Internet applications.
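The abstract's core idea, self-supervised pre-training over a heterogeneous device graph, can be illustrated with a toy sketch. This is not the authors' code: the graph, the neighbor-mean "reconstruction," and all names below are hypothetical stand-ins for the paper's transformer-based generative objective.

```python
# Hypothetical sketch: a masked-reconstruction pretext task on a tiny
# heterogeneous graph (typed nodes), standing in for generative
# self-supervised pre-training. All names and values are illustrative.
from typing import Dict, List, Tuple

# Toy heterogeneous graph: node id -> (node type, feature vector)
nodes: Dict[str, Tuple[str, List[float]]] = {
    "dev1": ("device",  [1.0, 0.0]),
    "dev2": ("device",  [0.9, 0.1]),
    "srv1": ("service", [0.0, 1.0]),
}
edges = [("dev1", "srv1"), ("dev2", "srv1"), ("dev1", "dev2")]

def neighbors(n: str) -> List[str]:
    """All nodes sharing an (undirected) edge with n."""
    return [b if a == n else a for a, b in edges if n in (a, b)]

def reconstruct(masked: str) -> List[float]:
    """Predict the masked node's features as the mean of its neighbors'."""
    feats = [nodes[m][1] for m in neighbors(masked)]
    dim = len(feats[0])
    return [sum(f[i] for f in feats) / len(feats) for i in range(dim)]

def sq_error(a: List[float], b: List[float]) -> float:
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Self-supervised signal: reconstruction error on the masked node.
pred = reconstruct("dev1")
loss = sq_error(pred, nodes["dev1"][1])
```

In a real model the neighbor mean would be replaced by a learned transformer encoder with per-type parameters, and the loss would be minimized over many masked nodes; the sketch only shows where the self-supervised signal comes from (no labels are needed).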
Pages: 1929 - 1933 (5 pages)
Related Papers
50 items in total
  • [41] Context-Aware Transformer Pre-Training for Answer Sentence Selection
    Di Liello, Luca
    Garg, Siddhant
    Moschitti, Alessandro
    [J]. 61ST CONFERENCE OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 2, 2023, : 458 - 468
  • [42] DiT: Self-supervised Pre-training for Document Image Transformer
    Li, Junlong
    Xu, Yiheng
    Lv, Tengchao
    Cui, Lei
    Zhang, Cha
    Wei, Furu
    [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 3530 - 3539
  • [43] Pre-training and Evaluating Transformer-based Language Models for Icelandic
    Dadason, Jon Fridrik
    Loftsson, Hrafn
    [J]. LREC 2022: THIRTEENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 7386 - 7391
  • [44] Graph Contrastive Multi-view Learning: A Pre-training Framework for Graph Classification
    Adjeisah, Michael
    Zhu, Xinzhong
    Xu, Huiying
    Ayall, Tewodros Alemu
    [J]. KNOWLEDGE-BASED SYSTEMS, 2024, 299
  • [45] MEGA: Meta-Graph Augmented Pre-Training Model for Knowledge Graph Completion
    Wang, Yashen
    Ouyang, Xiaoye
    Guo, Dayu
    Zhu, Xiaoling
    [J]. ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2024, 18 (01)
  • [46] PointGPT: Auto-regressively Generative Pre-training from Point Clouds
    Chen, Guangyan
    Wang, Meiling
    Yang, Yi
    Yu, Kai
    Yuan, Li
    Yue, Yufeng
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [47] Multimodal Pre-Training Based on Graph Attention Network for Document Understanding
    Zhang, Zhenrong
    Ma, Jiefeng
    Du, Jun
    Wang, Licheng
    Zhang, Jianshu
    [J]. IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 6743 - 6755
  • [48] RecipeGPT: Generative Pre-training Based Cooking Recipe Generation and Evaluation System
    Lee, Helena H.
    Shu, Ke
    Achananuparp, Palakorn
    Prasetyo, Philips Kokoh
    Liu, Yue
    Lim, Ee-Peng
    Varshney, Lav R.
    [J]. WWW'20: COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2020, 2020, : 181 - 184
  • [49] Self-attention Based Text Matching Model with Generative Pre-training
    Zhang, Xiaolin
    Lei, Fengpei
    Yu, Shengji
    [J]. 2021 IEEE INTL CONF ON DEPENDABLE, AUTONOMIC AND SECURE COMPUTING, INTL CONF ON PERVASIVE INTELLIGENCE AND COMPUTING, INTL CONF ON CLOUD AND BIG DATA COMPUTING, INTL CONF ON CYBER SCIENCE AND TECHNOLOGY CONGRESS DASC/PICOM/CBDCOM/CYBERSCITECH 2021, 2021, : 84 - 91
  • [50] A Novel Distilled Generative Essay Polish System via Hierarchical Pre-Training
    Yang, Qichuan
    Zhang, Liuxin
    Zhang, Yang
    Gao, Jinghua
    Wang, Siyun
    [J]. 2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,