Random Projection Layers for Multidimensional Time Series Forecasting

CCM Yeh, Y Fan, X Dai, V Lai, PO Aboagye… - arXiv preprint arXiv:2402.10487, 2024 - arxiv.org
All-Multi-Layer Perceptron (all-MLP) mixer models have been shown to be effective for time series forecasting problems. However, when such a model is applied to high-dimensional time series (e.g., the time series in a spatial-temporal dataset), its performance is likely to degrade due to overfitting issues. In this paper, we propose an all-MLP time series forecasting architecture, referred to as RPMixer. Our method leverages the ensemble-like behavior of deep neural networks, where each individual block within the network acts like a base learner in an ensemble model, especially when identity mapping residual connections are incorporated. By integrating random projection layers into our model, we increase the diversity among the blocks' outputs, thereby enhancing the overall performance of RPMixer. Extensive experiments conducted on large-scale spatial-temporal forecasting benchmark datasets demonstrate that our proposed method outperforms alternative methods, including both spatial-temporal graph models and general forecasting models.
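The abstract describes mixer blocks that combine identity-mapping residual connections with non-trainable random projection layers so that each block behaves like a diverse base learner. Below is a minimal PyTorch sketch of that idea under stated assumptions: the module names (RPBlock, RPMixerSketch), the placement of the frozen random projection on the series dimension, and hyperparameters such as proj_dim and n_blocks are illustrative choices, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class RPBlock(nn.Module):
    """One mixer block: a frozen random projection across series plus a
    temporal MLP, each wrapped in an identity residual connection.
    This is a sketch of the idea in the abstract, not the paper's code."""

    def __init__(self, n_series: int, seq_len: int, proj_dim: int):
        super().__init__()
        # Random projection across the series (spatial) dimension; the weights
        # are drawn once and frozen, so every block mixes channels differently.
        self.rand_proj = nn.Linear(n_series, proj_dim, bias=False)
        nn.init.normal_(self.rand_proj.weight, std=n_series ** -0.5)
        self.rand_proj.weight.requires_grad_(False)
        self.back_proj = nn.Linear(proj_dim, n_series)
        # Trainable MLP that mixes along the time dimension, per series.
        self.temporal_mlp = nn.Sequential(
            nn.Linear(seq_len, seq_len),
            nn.ReLU(),
            nn.Linear(seq_len, seq_len),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_series)
        # Channel mixing through the frozen random projection + identity residual.
        x = x + self.back_proj(torch.relu(self.rand_proj(x)))
        # Temporal mixing + identity residual.
        x = x + self.temporal_mlp(x.transpose(1, 2)).transpose(1, 2)
        return x


class RPMixerSketch(nn.Module):
    """Stack of RPBlocks followed by a linear forecasting head."""

    def __init__(self, n_series: int, seq_len: int, horizon: int,
                 proj_dim: int = 64, n_blocks: int = 4):
        super().__init__()
        self.blocks = nn.Sequential(
            *[RPBlock(n_series, seq_len, proj_dim) for _ in range(n_blocks)]
        )
        self.head = nn.Linear(seq_len, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_series) -> forecast: (batch, horizon, n_series)
        x = self.blocks(x)
        return self.head(x.transpose(1, 2)).transpose(1, 2)


# Usage example on a toy spatial-temporal input.
model = RPMixerSketch(n_series=207, seq_len=96, horizon=24)
forecast = model(torch.randn(8, 96, 207))  # -> shape (8, 24, 207)
```

Because the random projection weights are fixed at initialization, only the back-projection and temporal MLPs are trained; in the residual view, each block then contributes a distinct correction on top of the identity path, which is the ensemble-like diversity the abstract attributes to RPMixer.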