24-04-2024 | Original Article
A multi-level attention long short-term memory neural network based on rival rise algorithm for traffic volume prediction
Authors:
Kaili Liao, Wuneng Zhou
Published in:
International Journal of Machine Learning and Cybernetics
Abstract
In the domain of traffic volume forecasting, critical factors such as weather conditions have often been neglected. Consequently, we present a novel multi-level attention mechanism (MA) that jointly accounts for weather and spatial-temporal factors, and we embed this mechanism in a Long Short-Term Memory model (MALSTM) to effectively capture pertinent features from high-dimensional data. However, the performance of deep learning methods is significantly influenced by hyperparameters. In this paper, we therefore introduce the Rival Rise Algorithm (RRA), a novel nature-inspired intelligence algorithm that mimics competitive behaviors found in nature. RRA proceeds in two stages: the 'self-improve stage' and the 'imitate-improve stage'. To evaluate the algorithm's efficiency, we employ 13 commonly used mathematical optimization benchmark problems, covering both unimodal and multimodal cases. The results conclusively show that the proposed RRA surpasses SFOA, DSFGWO, GWO, SSA, and CSSA in terms of accuracy, stability, and robustness. Furthermore, as a case study, we apply the RRA-MALSTM model to predict traffic flows at an intersection in Shenzhen, China, and metro passenger flows at a station in Minneapolis, USA. Compared with traditional prediction models on the Shenzhen dataset, MALSTM achieves 2.3401 versus STLSTM's 2.3719, an improvement of 1.3%, while RRA yields improvement rates of 2.1% and 2.2%, respectively. Additionally, MALSTM outperforms LSTM (MAE 230.89, RMSE 339.56) by 18.1% and 23.3%, showcasing the benefits of multi-level attention, while RRA outperforms GWO, the least effective of the eight algorithms, improving MAE by 1.5% and RMSE by 2.9%.
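The abstract names a two-stage structure for RRA but does not give its update rules, which are defined in the paper itself. As a rough, hypothetical illustration only, the sketch below shows one generic way such a two-stage competitive optimizer could be organized: each candidate first makes a random local move ('self-improve'), then moves toward the current best rival ('imitate-improve'). All step-size choices and update formulas here are assumptions, not the actual RRA.

```python
import random

def rra_sketch(f, dim, pop_size=20, iters=200, lb=-5.0, ub=5.0, seed=0):
    """Hypothetical sketch of a two-stage competitive optimizer.

    Minimizes f over [lb, ub]^dim. The two stages loosely mirror the
    'self-improve' and 'imitate-improve' stages named in the abstract;
    the real RRA update equations are given in the paper, not here.
    """
    rng = random.Random(seed)
    pop = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=f)
    for t in range(iters):
        # Assumed linearly shrinking perturbation radius.
        step = (ub - lb) * 0.1 * (1.0 - t / iters)
        for i, x in enumerate(pop):
            # 'Self-improve' stage: greedy random local perturbation.
            cand = [min(ub, max(lb, xj + rng.uniform(-step, step))) for xj in x]
            if f(cand) < f(x):
                pop[i] = x = cand
            # 'Imitate-improve' stage: greedy move toward the best rival.
            cand = [xj + rng.random() * (bj - xj) for xj, bj in zip(x, best)]
            if f(cand) < f(x):
                pop[i] = cand
        best = min(pop + [best], key=f)
    return best, f(best)
```

On a simple unimodal benchmark such as the sphere function, this sketch converges toward the minimum, which is the kind of behavior the 13 benchmark problems are used to measure.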