Improving state-of-the-art models for analysing Chaotic Time Series datasets has long attracted the attention of researchers and scientists. Since the introduction of traditional Neural Networks, new Neural Network architectures have emerged at an increasing pace; among them are Recurrent Neural Networks and, in particular, Long Short-Term Memory. Long Short-Term Memory has been widely used to forecast regular Time Series data, but when applied to Chaotic Time Series it suffers considerably from the complexity and nonlinearity of this type of dataset, as well as from the missing data that arises in long time series. In this project, a Dilated Long Short-Term Attention model was implemented: an attention mechanism was integrated inside the cell, and multiple layers were stacked with different dilation sizes. The proposed model was evaluated on data generated by the Mackey-Glass equation as well as on real-world datasets. The experimental results show that the Dilated Long Short-Term Attention model improves the Mean Absolute Error, Root Mean Percentage Error, and R-squared of the predicted values compared with the common Recurrent Neural Network and Long Short-Term Memory. These results indicate that the proposed model achieves high prediction accuracy with small error, which supports the application of deep learning algorithms to Chaotic Time Series datasets.
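The Mackey-Glass equation mentioned above is a standard chaotic benchmark generator. As a minimal sketch (the parameter values beta=0.2, gamma=0.1, n=10, tau=17 are the common choices in the literature and are assumed here, not taken from the text), the series can be simulated with a simple forward-Euler integration:

```python
def mackey_glass(length=2000, tau=17.0, beta=0.2, gamma=0.1,
                 n=10, dt=0.1, x0=1.2):
    """Simulate the Mackey-Glass delay differential equation
        dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t)
    with forward-Euler steps and a constant initial history x(t) = x0."""
    delay_steps = int(round(tau / dt))
    # buffer holding the delayed values x(t - tau)
    history = [x0] * delay_steps
    x = x0
    series = []
    for _ in range(length):
        x_tau = history.pop(0)
        x = x + dt * (beta * x_tau / (1.0 + x_tau ** n) - gamma * x)
        history.append(x)
        series.append(x)
    return series
```

A forecasting dataset would then be built by windowing this series into input/target pairs; the exact window lengths used in the experiments are not specified here.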