A transformer-based deep learning framework to predict employee attrition

Cited: 0
|
Authors
Li, Wenhui [1 ]
Affiliations
[1] Shandong Normal Univ, Sch Informat Sci & Engn, Shandong, Peoples R China
Keywords
Data science; Machine learning; Artificial intelligence; Attrition prediction; Deep learning;
DOI
10.7717/peerj-cs.1570
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In all areas of business, employee attrition has a detrimental impact on the accuracy of profit management. With modern computing technology, it is possible to construct a model that predicts employee attrition and thereby reduces business owners' costs. Although models of this kind have never been evaluated under real-world conditions, several implementations were developed and applied to the IBM HR Employee Attrition dataset to evaluate how they may be incorporated into a decision support system and how they affect strategic decisions. In this study, a Transformer-based neural network characterized by contextual embeddings adapted to tabular data was implemented as a computational technique for predicting employee turnover. Experimental outcomes showed that this model achieved significantly better prediction performance than other state-of-the-art models. In addition, this study showed that deep learning in general, and Transformer-based networks in particular, are promising for handling tabular and imbalanced data.
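The abstract's core idea, building contextual embeddings over tabular data with self-attention, can be sketched roughly as follows. This is a minimal NumPy illustration of a TabTransformer-style layer, not the paper's actual implementation: each categorical column (e.g. department, job role, an overtime flag from the IBM HR dataset) is embedded as a token, and one self-attention pass makes the per-column embeddings contextual. All class and variable names here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class TabularAttentionSketch:
    """Toy TabTransformer-style layer (illustrative, untrained weights):
    each categorical column becomes one embedding token, and a single
    self-attention pass turns those tokens into contextual embeddings."""

    def __init__(self, cardinalities, d_model=8, seed=0):
        rng = np.random.default_rng(seed)
        # one embedding table per categorical column
        self.tables = [rng.normal(0.0, 0.1, (c, d_model)) for c in cardinalities]
        self.Wq = rng.normal(0.0, 0.1, (d_model, d_model))
        self.Wk = rng.normal(0.0, 0.1, (d_model, d_model))
        self.Wv = rng.normal(0.0, 0.1, (d_model, d_model))
        self.d = d_model

    def forward(self, rows):
        # rows: (batch, n_columns) integer category indices
        tokens = np.stack([t[rows[:, i]] for i, t in enumerate(self.tables)], axis=1)
        q, k, v = tokens @ self.Wq, tokens @ self.Wk, tokens @ self.Wv
        attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(self.d))
        # contextual embeddings; a real model would flatten these into an
        # MLP head that outputs the attrition probability
        return attn @ v

# usage: 3 hypothetical categorical columns (department, job role, overtime)
model = TabularAttentionSketch(cardinalities=[4, 9, 2])
rows = np.array([[0, 3, 1], [2, 5, 0]])
out = model.forward(rows)
print(out.shape)  # one contextual embedding per column per row
```

In a full model, several such attention layers would be stacked and trained end-to-end, with class weighting or resampling to handle the imbalance the abstract mentions.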
Pages: 12
Related papers
50 records in total
  • [1] Deep Learning Based Employee Attrition Prediction
    Gurler, Kerem
    Pak, Burcu Kuleli
    Gungor, Vehbi Cagri
    [J]. ARTIFICIAL INTELLIGENCE APPLICATIONS AND INNOVATIONS, AIAI 2023, PT I, 2023, 675 : 57 - 68
  • [2] Super-resolution reconstruction of turbulent flows with a transformer-based deep learning framework
    Xu, Qin
    Zhuang, Zijian
    Pan, Yongcai
    Wen, Binghai
    [J]. PHYSICS OF FLUIDS, 2023, 35 (05)
  • [3] A Transformer-based Framework for Multivariate Time Series Representation Learning
    Zerveas, George
    Jayaraman, Srideepika
    Patel, Dhaval
    Bhamidipaty, Anuradha
    Eickhoff, Carsten
    [J]. KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021, : 2114 - 2124
  • [4] Transformer-based contrastive learning framework for image anomaly detection
    Fan, Wentao
    Shangguan, Weimin
    Chen, Yewang
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (10) : 3413 - 3426
  • [6] A multi-label transformer-based deep learning approach to predict focal visual field progression
    Chen, Ling
    Tseng, Vincent S.
    Tsung, Ta-Hsin
    Lu, Da-Wen
    [J]. GRAEFES ARCHIVE FOR CLINICAL AND EXPERIMENTAL OPHTHALMOLOGY, 2024, 262 (07) : 2227 - 2235
  • [7] GIT: A Transformer-Based Deep Learning Model for Geoacoustic Inversion
    Feng, Sheng
    Zhu, Xiaoqian
    Ma, Shuqing
    Lan, Qiang
    [J]. JOURNAL OF MARINE SCIENCE AND ENGINEERING, 2023, 11 (06)
  • [8] Transformer-based deep learning model for forced oscillation localization
    Matar, Mustafa
    Estevez, Pablo Gill
    Marchi, Pablo
    Messina, Francisco
    Elmoudi, Ramadan
    Wshah, Safwan
    [J]. INTERNATIONAL JOURNAL OF ELECTRICAL POWER & ENERGY SYSTEMS, 2023, 146
  • [9] Characterization of groundwater contamination: A transformer-based deep learning model
    Bai, Tao
    Tahmasebi, Pejman
    [J]. ADVANCES IN WATER RESOURCES, 2022, 164
  • [10] stEnTrans: Transformer-Based Deep Learning for Spatial Transcriptomics Enhancement
    Xue, Shuailin
    Zhu, Fangfang
    Wang, Changmiao
    Min, Wenwen
    [J]. BIOINFORMATICS RESEARCH AND APPLICATIONS, PT I, ISBRA 2024, 2024, 14954 : 63 - 75