KSII Transactions on Internet and Information Systems
Monthly Online Journal (eISSN: 1976-7277)

Time-Series Forecasting Based on Multi-Layer Attention Architecture

Vol. 18, No. 1, January 31, 2024
DOI: 10.3837/tiis.2024.01.001

Abstract

Time-series forecasting is used extensively in the real world. Recent research has shown that Transformers, built around the self-attention mechanism, perform well on such problems. However, most existing Transformer models for time-series prediction adopt the traditional encoder-decoder architecture, which is complex and inefficient to process, limiting the ability to mine deep temporal dependencies by increasing model depth. In addition, the quadratic computational complexity of the self-attention mechanism increases computational overhead and further reduces processing efficiency. To address these issues, this paper designs an efficient time-series forecasting model based on a multi-layer attention architecture. The model has the following characteristics: (i) it abandons the traditional encoder-decoder Transformer architecture and instead builds the predictor from stacked multi-layer attention blocks, improving its ability to mine deep temporal dependencies; (ii) a cross-attention module is designed to enhance information exchange between the historical and predictive sequences; (iii) a recently proposed sparse attention mechanism is applied to the model to reduce computational overhead and improve processing efficiency. Experiments on multiple datasets show that our model significantly outperforms current advanced Transformer methods for time-series forecasting, including LogTrans, Reformer, and Informer.

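To illustrate point (ii) of the abstract, the following is a minimal sketch (not the authors' implementation) of a cross-attention block in which forecast-position queries attend to an encoded historical sequence. It assumes PyTorch and uses standard full attention for clarity, whereas the paper applies a sparse attention mechanism to reduce the quadratic cost; all names and dimensions are illustrative.

import torch
import torch.nn as nn


class CrossAttentionBlock(nn.Module):
    """Lets a prediction-length query sequence attend to the historical sequence."""

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, pred_tokens, history_tokens):
        # Queries come from the predictive sequence, keys/values from history,
        # so information flows from observed values into the forecast positions.
        attn_out, _ = self.attn(pred_tokens, history_tokens, history_tokens)
        x = self.norm1(pred_tokens + attn_out)
        return self.norm2(x + self.ff(x))


if __name__ == "__main__":
    batch, hist_len, pred_len, d_model = 8, 96, 24, 64
    history = torch.randn(batch, hist_len, d_model)  # encoded historical sequence
    queries = torch.randn(batch, pred_len, d_model)  # placeholder forecast tokens
    out = CrossAttentionBlock(d_model)(queries, history)
    print(out.shape)  # torch.Size([8, 24, 64])
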


Cite this article

[IEEE Style]
N. Wang and X. Zhao, "Time-Series Forecasting Based on Multi-Layer Attention Architecture," KSII Transactions on Internet and Information Systems, vol. 18, no. 1, pp. 1-14, 2024. DOI: 10.3837/tiis.2024.01.001.

[ACM Style]
Na Wang and Xianglian Zhao. 2024. Time-Series Forecasting Based on Multi-Layer Attention Architecture. KSII Transactions on Internet and Information Systems, 18, 1, (2024), 1-14. DOI: 10.3837/tiis.2024.01.001.

[BibTeX Style]
@article{tiis:90384, title="Time-Series Forecasting Based on Multi-Layer Attention Architecture", author="Na Wang and Xianglian Zhao", journal="KSII Transactions on Internet and Information Systems", DOI={10.3837/tiis.2024.01.001}, volume={18}, number={1}, year="2024", month={January}, pages={1-14}}