BRep-BERT: Pre-training Boundary Representation BERT with Sub-graph Node Contrastive Learning

Cited by: 1
Authors
Lou, Yunzhong [1 ]
Li, Xueyang [1 ]
Chen, Haotian [1 ]
Zhou, Xiangdong [1 ]
Affiliations
[1] Fudan Univ, Sch Comp Sci, Shanghai, Peoples R China
Keywords
Boundary Representation; GNN; Transformer; Contrastive Learning; Few-Shot;
DOI
10.1145/3583780.3614795
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Obtaining effective entity feature representations is crucial in the field of Boundary Representation (B-Rep), a key parametric representation method in Computer-Aided Design (CAD). However, the lack of large-scale labeled databases and the scarcity of task-specific label sets pose significant challenges. To address these problems, we propose an innovative unsupervised neural network approach called BRep-BERT, which extends the concept of BERT to the B-Rep domain. Specifically, we utilize a Graph Neural Network (GNN) tokenizer to generate discrete entity labels that carry geometric and structural semantic information. We construct new entity representation sequences based on the structural relationships and pre-train the model through the Masked Entity Modeling (MEM) task. To address the attention sparsity issue in large-scale geometric models, we incorporate graph structure information and a learnable relative position encoding into the attention module to optimize feature updates. Additionally, we employ geometric sub-graphs and multi-level contrastive learning techniques to enhance the model's ability to learn regional features. Comparisons with previous methods demonstrate that BRep-BERT achieves state-of-the-art performance on both full-data training and few-shot learning tasks across multiple B-Rep datasets. In particular, BRep-BERT outperforms previous methods by a significant margin in few-shot learning scenarios. Comprehensive experiments demonstrate the substantial advantages and potential of BRep-BERT for B-Rep data representation. Code will be released at https://github.com/louyz1026/Brep_Bert.
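To make the Masked Entity Modeling idea in the abstract concrete, the following minimal PyTorch sketch masks a fraction of per-face entity features, adds a learnable attention bias indexed by graph distance (a stand-in for the structure-aware relative position encoding the abstract mentions), and predicts discrete labels that a GNN tokenizer might assign to the masked entities. This is an illustration only, not the authors' released code: the class names, the shortest-path distance matrix, the vocabulary size, and the tokenizer labels are all assumptions made here for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphBiasedSelfAttention(nn.Module):
    # Single-head self-attention whose scores receive a learnable bias indexed by the
    # graph distance between two B-Rep entities (hypothetical stand-in for the paper's
    # structure-aware relative position encoding).
    def __init__(self, dim, max_dist=8):
        super().__init__()
        self.qkv = nn.Linear(dim, dim * 3)
        self.dist_bias = nn.Embedding(max_dist + 1, 1)  # one scalar bias per clipped distance
        self.scale = dim ** -0.5

    def forward(self, x, dist):
        # x: (N, dim) entity features; dist: (N, N) integer graph distances
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = (q @ k.t()) * self.scale
        dist = dist.clamp(max=self.dist_bias.num_embeddings - 1)
        attn = attn + self.dist_bias(dist).squeeze(-1)   # add structural bias to raw scores
        return F.softmax(attn, dim=-1) @ v

class MaskedEntityModel(nn.Module):
    # Replaces masked entity features with a learnable [MASK] embedding and predicts the
    # discrete token id each masked entity would receive from a (hypothetical) GNN tokenizer.
    def __init__(self, dim=128, vocab=512, max_dist=8):
        super().__init__()
        self.mask_embed = nn.Parameter(torch.zeros(dim))
        self.attn = GraphBiasedSelfAttention(dim, max_dist)
        self.head = nn.Linear(dim, vocab)

    def forward(self, feats, dist, mask):
        # feats: (N, dim) per-face features; mask: (N,) bool, True = entity is masked
        x = torch.where(mask.unsqueeze(-1), self.mask_embed.expand_as(feats), feats)
        x = self.attn(x, dist)
        return self.head(x)                              # (N, vocab) logits

# Toy usage: 6 faces with random features, fake graph distances, fake tokenizer labels.
N, dim, vocab = 6, 128, 512
feats = torch.randn(N, dim)
dist = torch.randint(0, 5, (N, N))
labels = torch.randint(0, vocab, (N,))                   # ids a GNN tokenizer might assign
mask = torch.tensor([True, False, True, False, False, True])
model = MaskedEntityModel(dim, vocab)
logits = model(feats, dist, mask)
loss = F.cross_entropy(logits[mask], labels[mask])       # MEM loss on masked entities only
print(loss.item())

The sub-graph contrastive component described in the abstract would add a separate objective over regional (sub-graph) representations; it is omitted here to keep the sketch focused on the MEM pre-training signal.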
Pages: 1657-1666
Page count: 10