Unsupervised Transformer-Based Anomaly Detection in ECG Signals

Cited by: 16
Authors
Alamr, Abrar [1 ]
Artoli, Abdelmonim [1 ]
Affiliations
[1] King Saud Univ, Coll Comp & Informat Sci, Comp Sci Dept, Riyadh 11543, Saudi Arabia
Keywords
unsupervised transformers; deep learning; anomaly detection; ECG signal; time series
DOI
10.3390/a16030152
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Anomaly detection is a fundamental data-processing task that addresses a range of problems in healthcare sensor data. Technology has made it easier to collect large and highly variable time series data; however, complex predictive analysis models are required to ensure consistency and reliability. With the growth in the size and dimensionality of collected data, deep learning techniques such as autoencoders (AE), recurrent neural networks (RNN), and long short-term memory (LSTM) networks have gained attention and are recognized as state-of-the-art anomaly detection techniques. Recently, transformer-based architectures have been proposed as an improved, attention-based knowledge representation scheme. We present an unsupervised transformer-based method to evaluate and detect anomalies in electrocardiogram (ECG) signals. The model architecture comprises two parts: an embedding layer and a standard transformer encoder. We introduce, implement, test, and validate our model on two well-known datasets: ECG5000 and MIT-BIH Arrhythmia. Anomalies are detected based on the loss between real and predicted ECG time series sequences. We found that using a transformer encoder as an alternative model for anomaly detection enables better performance on ECG time series data. The suggested model has a remarkable ability to detect anomalies in ECG signals and outperforms deep learning approaches found in the literature on both datasets. On the ECG5000 dataset, the model detects anomalies with 99% accuracy, 99% F1-score, 99% AUC, 98.1% recall, and 100% precision. On the MIT-BIH Arrhythmia dataset, it achieves 89.5% accuracy, 92.3% F1-score, 93% AUC, 98.2% recall, and 87.1% precision.
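The abstract only outlines the approach: an embedding layer feeding a standard transformer encoder, with anomalies scored by the reconstruction loss between real and predicted ECG sequences. The PyTorch sketch below illustrates that reconstruction-based idea under stated assumptions; the layer sizes, learned positional encoding, MSE loss, and fixed threshold are illustrative choices, not the authors' exact configuration, and the 140-sample window simply matches the ECG5000 beat length.

```python
# Minimal sketch (assumptions noted above): embedding layer + transformer
# encoder trained to reconstruct normal ECG beats; a beat whose
# reconstruction error exceeds a threshold is flagged as anomalous.
import torch
import torch.nn as nn


class ECGTransformerReconstructor(nn.Module):
    def __init__(self, seq_len=140, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # Project each scalar ECG sample to a d_model-dimensional embedding.
        self.embed = nn.Linear(1, d_model)
        # Learned positional encoding over the fixed-length beat window.
        self.pos = nn.Parameter(torch.zeros(1, seq_len, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Project back to the signal domain to reconstruct the input beat.
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):                  # x: (batch, seq_len)
        z = self.embed(x.unsqueeze(-1)) + self.pos
        z = self.encoder(z)
        return self.head(z).squeeze(-1)    # reconstruction: (batch, seq_len)


def anomaly_scores(model, beats):
    """Per-beat reconstruction error used as the anomaly score."""
    model.eval()
    with torch.no_grad():
        recon = model(beats)
    return ((recon - beats) ** 2).mean(dim=1)


# Usage sketch: train on normal beats only, then threshold the score.
model = ECGTransformerReconstructor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
normal_beats = torch.randn(32, 140)        # placeholder for ECG5000-style beats
loss = nn.functional.mse_loss(model(normal_beats), normal_beats)
loss.backward()
optimizer.step()
threshold = 0.05                           # would be tuned on a validation split
is_anomaly = anomaly_scores(model, normal_beats) > threshold
```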
Pages: 15
Related Papers
50 records in total
  • [41] Memory-Token Transformer for Unsupervised Video Anomaly Detection
    Li, Youyu
    Song, Xiaoning
    Xu, Tianyang
    Feng, Zhenhua
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 3325 - 3332
  • [42] An Improved Transformer U-Net-based Reconstruction Method for Unsupervised Anomaly Detection
    Tang, Jialong
    Lv, Zheng
    Liu, Ying
    Zhao, Jun
    2024 14TH ASIAN CONTROL CONFERENCE, ASCC 2024, 2024, : 153 - 158
  • [43] Transformer-based unsupervised contrastive learning for histopathological image classification
    Wang, Xiyue
    Yang, Sen
    Zhang, Jun
    Wang, Minghui
    Zhang, Jing
    Yang, Wei
    Huang, Junzhou
    Han, Xiao
    MEDICAL IMAGE ANALYSIS, 2022, 81
  • [44] Time Series Anomaly Detection Using Transformer-Based GAN With Two-Step Masking
    Shin, Ah-Hyung
    Kim, Seong Tae
    Park, Gyeong-Moon
    IEEE ACCESS, 2023, 11 : 74035 - 74047
  • [45] Robust transformer-based anomaly detection for nuclear power data using maximum correntropy criterion
    Yi, Shuang
    Zheng, Sheng
    Yang, Senquan
    Zhou, Guangrong
    He, Junjie
    NUCLEAR ENGINEERING AND TECHNOLOGY, 2024, 56 (04) : 1284 - 1295
  • [46] Multivariate time series anomaly detection via separation, decomposition, and dual transformer-based autoencoder
    Fu, Shiyuan
    Gao, Xin
    Li, Baofeng
    Zhai, Feng
    Lu, Jiansheng
    Xue, Bing
    Yu, Jiahao
    Xiao, Chun
    APPLIED SOFT COMPUTING, 2024, 159
  • [47] Decouple and Resolve: Transformer-Based Models for Online Anomaly Detection From Weakly Labeled Videos
    Liu, Tianshan
    Zhang, Cong
    Lam, Kin-Man
    Kong, Jun
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2023, 18 : 15 - 28
  • [48] DCT-GAN: Dilated Convolutional Transformer-Based GAN for Time Series Anomaly Detection
    Li, Yifan
    Peng, Xiaoyan
    Zhang, Jia
    Li, Zhiyong
    Wen, Ming
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (04) : 3632 - 3644
  • [49] A Dual Transformer-Based Deep Learning Model for Passenger Anomaly Behavior Detection in Elevator Cabs
    Ji, Yijin
    Sun, Haoxiang
    Xu, Benlian
    Lu, Mingli
    Zhou, Xu
    Shi, Jian
    INTERNATIONAL JOURNAL OF SWARM INTELLIGENCE RESEARCH, 2024, 15 (01)
  • [50] A Novel Transformer-Based Anomaly Detection Model for the Reactor Coolant Pump in Nuclear Power Plants
    Zhou, Guangrong
    Zheng, Sheng
    Yang, Senquan
    Yi, Shuang
    SCIENCE AND TECHNOLOGY OF NUCLEAR INSTALLATIONS, 2024, 2024