Stock Price Prediction Using CNN-BiLSTM-Attention Model

Cited by: 31
Authors
Zhang, Jilin [1 ]
Ye, Lishi [1 ]
Lai, Yongzeng [2 ]
Affiliations
[1] Fujian Univ Technol, Sch Comp Sci & Math, Fuzhou 350108, Peoples R China
[2] Wilfrid Laurier Univ, Dept Math, Waterloo, ON N2L 3C5, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
stock price prediction; deep learning; CNN; BiLSTM; attention mechanism; VOLATILITY;
DOI
10.3390/math11091985
CLC Classification Number
O1 [Mathematics];
Discipline Classification Codes
0701; 070101;
Abstract
Accurate stock price prediction plays an important role in stock investment. Because stock price data are characterized by high frequency, nonlinearity, and long memory, predicting stock prices precisely is challenging. Various forecasting methods have been proposed, from classical time series methods to machine-learning-based methods such as random forest (RF), recurrent neural networks (RNN), convolutional neural networks (CNN), Long Short-Term Memory (LSTM) neural networks, and their variants. Each method can reach a certain level of accuracy but also has its limitations. In this paper, a CNN-BiLSTM-Attention-based model is proposed to boost the accuracy of predicting stock prices and indices. First, the temporal features of the sequence data are extracted using a convolutional neural network (CNN) and a bidirectional long short-term memory (BiLSTM) network. Then, an attention mechanism is introduced to assign weights to the information features automatically; finally, the prediction results are output through a dense layer. The proposed method was first used to predict the price of the Chinese stock index (the CSI 300 index) and was found to be more accurate than each of three other methods: LSTM, CNN-LSTM, and CNN-LSTM-Attention. To investigate whether the proposed model is robustly effective in predicting stock indices, three other stock indices in China and eight international stock indices were selected for testing, and the robust effectiveness of the CNN-BiLSTM-Attention model in predicting stock prices was confirmed. Comparing this method with the LSTM, CNN-LSTM, and CNN-LSTM-Attention models shows that the CNN-BiLSTM-Attention model achieves the highest prediction accuracy in almost all cases.
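The pipeline described in the abstract (CNN feature extraction, BiLSTM, attention weighting, dense output) can be sketched as the following minimal PyTorch module. The layer sizes, kernel size, window length, and attention formulation here are illustrative assumptions, not the paper's actual hyperparameters.

```python
import torch
import torch.nn as nn

class CNNBiLSTMAttention(nn.Module):
    """Sketch of a CNN-BiLSTM-Attention regressor; sizes are assumed."""
    def __init__(self, n_features=5, conv_channels=32, lstm_hidden=64):
        super().__init__()
        # 1D convolution over the time axis extracts local temporal features
        self.conv = nn.Conv1d(n_features, conv_channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()
        # Bidirectional LSTM captures dependencies in both time directions
        self.bilstm = nn.LSTM(conv_channels, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Attention scores one weight per time step; softmax normalizes them
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        # Dense layer maps the attention-weighted summary to the prediction
        self.out = nn.Linear(2 * lstm_hidden, 1)

    def forward(self, x):                     # x: (batch, time, features)
        h = self.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        h, _ = self.bilstm(h)                 # (batch, time, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)  # (batch, time, 1)
        context = (w * h).sum(dim=1)          # weighted summary over time
        return self.out(context)              # (batch, 1) predicted price

model = CNNBiLSTMAttention()
x = torch.randn(8, 30, 5)  # 8 samples, 30-day window, 5 features per day
y = model(x)
print(y.shape)             # torch.Size([8, 1])
```

In this sketch the attention layer learns to emphasize the most informative time steps before the dense layer produces the final prediction, mirroring the three-stage structure the abstract describes.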
Pages: 18
Related Papers
50 records total
  • [41] Fiber Nonlinearity Impairment Compensation Algorithm Based on the CNN-BiLSTM-Attention Model
    Chen, Zhixuan
    Zhang, Hongbo
    Zhang, Min
    Cai, Ju
    Liu, Jiao
    Du, Jie
    Zhang, Qianwu
    ELECTRIC POWER INFORMATION AND COMMUNICATION TECHNOLOGY, 2023, 21 (11) : 7 - 12
  • [42] STOCK PRICE PREDICTION USING LSTM, RNN AND CNN-SLIDING WINDOW MODEL
    Selvin, Sreelekshmy
    Vinayakumar, R.
    Gopalakrishnan, E. A.
    Menon, Vijay Krishna
    Soman, K. P.
    2017 INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTING, COMMUNICATIONS AND INFORMATICS (ICACCI), 2017, : 1643 - 1647
  • [43] Hybrid Model for Dam Deformation Prediction Based on TimeGAN and CNN-BiLSTM-Attention
    Yuan, Jiafan
    Li, Danyang
    Li, Jialin
    Qin, Xue
    Mao, Peng
    YELLOW RIVER, 2024, 46 (12) : 127 - 130+143
  • [44] Monthly Runoff Prediction with CNN-BiLSTM-Attention Based on Optimization Algorithms
    Zhu, Hao
    Hu, Yuanzhao
    Yin, Mingcai
    Jia, Hui
    Zhang, Jishi
    YANGTZE RIVER, 2023, 54 (12) : 96 - 104
  • [45] Stock Price Trend Prediction using MRCM-CNN
    Duan, Jufang
    Xu, Xiangyang
    2020 CHINESE AUTOMATION CONGRESS (CAC 2020), 2020, : 3455 - 3460
  • [46] A Novel Industrial Intrusion Detection Method based on Threshold-optimized CNN-BiLSTM-Attention using ROC Curve
    Lan, Mindi
    Luo, Jun
    Chai, Senchun
    Chai, Ruiqi
    Zhang, Chen
    Zhang, Baihai
    PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 7384 - 7389
  • [47] CNN-BiLSTM-Attention: A multi-label neural classifier for short texts with a small set of labels
    Lu, Guangyao
    Liu, Yuling
    Wang, Jie
    Wu, Hongping
    INFORMATION PROCESSING & MANAGEMENT, 2023, 60 (03)
  • [48] Atmospheric Temperature Prediction Based on a CNN-BiLSTM-Attention Fusion Neural Network
    Wang, Yi
    Pu, Yunwei
    CHINA WATER TRANSPORT (SECOND HALF), 2023, 23 (01) : 25 - 27
  • [49] Two-channel Attention Mechanism Fusion Model of Stock Price Prediction Based on CNN-LSTM
    Sun, Lin
    Xu, Wenzheng
    Liu, Jimin
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2021, 20 (05)
  • [50] A Stock Closing Price Prediction Model Based on CNN-BiSLSTM
    Wang, Haiyao
    Wang, Jianxuan
    Cao, Lihui
    Li, Yifan
    Sun, Qiuhong
    Wang, Jingyang
    COMPLEXITY, 2021, 2021