Abstract:
Accurate traffic forecasting is vital to an intelligent transportation system. Although many deep learning models have achieved state-of-the-art performance for short-term traffic forecasting of up to 1 hour, long-term traffic forecasting that spans multiple hours remains a major challenge. To that end, we develop Graph Pyramid Autoformer (GPA), an attention-based spatial-temporal graph neural network that uses a novel pyramid autocorrelation attention mechanism. This mechanism enables learning from long temporal sequences on graphs and improves long-term forecasting accuracy. We demonstrate the efficacy of GPA on two benchmark traffic datasets: Los Angeles' METR-LA and the Bay Area's PEMS-BAY. Our model outperforms a range of existing state-of-the-art methods, delivering up to a 25% improvement in the accuracy of long-term traffic forecasts. Our code is available at: https://github.com/WeiheneZlExplainable-Graph-Autoformer.
Publication date:
December 1, 2023
Publication type:
Conference Paper
Citation:
Zhong, W., Mallick, T., Macfarlane, J., Meidani, H., & Balaprakash, P. (2023). Graph Pyramid Autoformer for Long-Term Traffic Forecasting. 2023 International Conference on Machine Learning and Applications (ICMLA), 384–391. https://doi.org/10.1109/ICMLA58977.2023.00060