Anomaly detection in time series data is a crucial task in many real-world applications. Existing approaches have achieved strong results in this domain, yet they struggle with imbalanced datasets. The novelty of this work lies in the introduction of an Improved Attention Mechanism-based Transformer model, which integrates attention mechanisms tailored to time series data to enhance anomaly detection. This approach effectively captures both short- and long-term dependencies, addressing the limitations of prior methods on imbalanced data. The proposed model comprises three stages: preprocessing, information extraction, and anomaly detection. First, the input time series is preprocessed using an improved z-score normalization scheme. The preprocessed data is then used to extract information for anomaly detection, carried out by the Improved Attention Mechanism-based Transformer, which captures the correlation of every time point as well as long-distance temporal relationships among distinct positions in the sequence. In addition, an improved loss function is employed during training of the model. Finally, in contrast to conventional methods, the proposed model achieves a detection accuracy exceeding 0.968 at a training percentage of 90, whereas existing models obtain lower scores.
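The abstract does not specify the details of the improved z-score normalization. As a minimal sketch, assuming a common robust variant that replaces the mean with the median and the standard deviation with the scaled median absolute deviation (MAD), the preprocessing step could look like:

```python
import numpy as np

def robust_zscore(x: np.ndarray) -> np.ndarray:
    """Robust z-score normalization (illustrative sketch).

    The paper's 'improved z-score normalization' is not specified here;
    this assumes the widely used robust variant based on the median and
    the median absolute deviation (MAD), which is less distorted by the
    very anomalies the model is trying to detect.
    """
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    # 0.6745 scales MAD to be comparable to the standard deviation
    # under a normal distribution; the epsilon avoids division by zero.
    return 0.6745 * (x - med) / (mad + 1e-8)

# Usage: the last point is an obvious outlier and receives a large score,
# while the in-range points stay close to zero.
series = np.array([10.0, 10.2, 9.9, 10.1, 50.0])
z = robust_zscore(series)
```

Because the median and MAD are insensitive to a small fraction of extreme values, this normalization keeps the scale of the normal regime stable even when anomalies are present, which is one plausible motivation for improving on the plain mean/std z-score.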