TTVAE: Transformer-based generative modeling for tabular data generation

Cited by: 1
Authors
Wang, Alex X. [1 ]
Nguyen, Binh P. [1 ,2 ]
Affiliations
[1] Victoria Univ Wellington, Sch Math & Stat, Wellington 6012, New Zealand
[2] Ho Chi Minh City Open Univ, Fac Informat Technol, 97 Vo Van Tan,Dist 3, Ho Chi Minh City 70000, Vietnam
Keywords
Generative AI; Tabular data; Transformer; Latent space interpolation; SMOTE;
DOI
10.1016/j.artint.2025.104292
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Tabular data synthesis presents unique challenges, yet Transformer models remain underexplored for it, in contrast to the widespread use of Variational Autoencoders (VAEs) and Generative Adversarial Networks (GANs). To address this gap, we propose the Transformer-based Tabular Variational AutoEncoder (TTVAE), which leverages the attention mechanism to capture complex data distributions. Attention enables the model to learn intricate relationships among heterogeneous features, a task that is often difficult for traditional methods. TTVAE also integrates interpolation into the latent space during data generation: the model is trained once to establish a low-dimensional representation of the real data, after which various latent interpolation methods can efficiently generate synthetic latent points. In extensive experiments on diverse datasets, TTVAE consistently achieves state-of-the-art performance, demonstrating its adaptability across feature types and data sizes. Empowered by attention and latent-space interpolation, TTVAE addresses the complex challenges of tabular data synthesis and establishes itself as a powerful solution.
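The abstract describes a two-stage pipeline: encode real rows into a low-dimensional latent space once, then create new latent points by interpolating between neighboring latent codes (in the spirit of SMOTE, as the keywords suggest) before decoding them. A minimal NumPy sketch of such a SMOTE-style latent interpolation step follows; the function name, neighborhood size `k`, and uniform mixing coefficient are illustrative assumptions rather than the paper's actual implementation:

```python
import numpy as np

def interpolate_latents(z, n_samples, k=5, seed=0):
    """SMOTE-style interpolation among latent codes.

    z: (n, d) array of latent codes produced by a trained encoder.
    Returns an (n_samples, d) array of synthetic latent points,
    each lying on a segment between a code and one of its k nearest
    neighbours; these would then be passed through the decoder.
    """
    rng = np.random.default_rng(seed)
    n, d = z.shape
    out = np.empty((n_samples, d))
    for i in range(n_samples):
        a = rng.integers(n)
        # Euclidean distances from z[a] to every latent code
        dist = np.linalg.norm(z - z[a], axis=1)
        # k nearest neighbours, skipping z[a] itself at index 0
        nbrs = np.argsort(dist)[1:k + 1]
        b = rng.choice(nbrs)
        lam = rng.random()  # uniform mixing coefficient in [0, 1)
        out[i] = z[a] + lam * (z[b] - z[a])
    return out
```

Decoding these synthetic latent points through the trained decoder would yield the synthetic tabular rows; because the interpolation operates purely in the latent space, the (expensive) model is trained only once.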
Pages: 17