Scaling Law for Time Series Forecasting

Jingzhe Shi, Qinwei Ma, Huan Ma, Lei Li

Research output: Contribution to conference › Paper › Research


Abstract

Scaling laws that reward large datasets, complex models, and enhanced data granularity have been observed in various fields of deep learning. Yet, studies on time series forecasting have cast doubt on the scaling behaviors of deep learning methods in this domain: while more training data improves performance, more capable models do not always outperform less capable ones, and longer input horizons may hurt performance for some models. We propose a theory of scaling laws for time series forecasting that can explain these seemingly abnormal behaviors. Our theory accounts for the impact of dataset size and model complexity, as well as time series data granularity, with particular focus on the look-back horizon, an aspect that has been unexplored in previous theories. Furthermore, we empirically evaluate various models on a diverse set of time series forecasting datasets, which (1) verifies the validity of scaling laws with respect to dataset size and model complexity within the realm of time series forecasting, and (2) validates our theoretical framework, particularly regarding the influence of the look-back horizon. We hope our findings may inspire new models targeting time series forecasting datasets of limited size, as well as large foundational datasets and models for time series forecasting, in future work. Code for our experiments will be made public at: https://github.com/JingzheShi/ScalingLawForTimeSeriesForecasting
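The abstract describes scaling laws relating forecasting loss to dataset size and model complexity. As an illustrative sketch only: a common way to examine such behavior is to fit a saturating power law, loss ≈ A·D^(−α) + C, to held-out loss as a function of training-set size D. The paper's exact functional form is not given here, and the `power_law` helper and all data values below are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical scaling-law ansatz: loss = A * D^(-alpha) + C, where D is the
# training-set size. A saturating power law of this shape is standard in
# scaling-law studies; it is not necessarily the paper's exact form.
def power_law(d, a, alpha, c):
    return a * np.power(d, -alpha) + c

# Hypothetical (training-set size, held-out loss) measurements.
sizes = np.array([1e3, 3e3, 1e4, 3e4, 1e5])
losses = np.array([0.92, 0.71, 0.55, 0.46, 0.41])

# Fit the three parameters; p0 provides a rough initial guess.
params, _ = curve_fit(power_law, sizes, losses, p0=[5.0, 0.3, 0.3], maxfev=10_000)
a, alpha, c = params
print(f"fitted: loss ~ {a:.2f} * D^(-{alpha:.2f}) + {c:.2f}")
```

Under this ansatz, the fitted exponent α summarizes how quickly performance improves with more data, and the offset C captures the irreducible loss floor that more data cannot remove.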
Original language: Undefined/Unknown
Publication date: 2024
Number of pages: 31
Publication status: Published - 2024
Event: 38th Conference on Neural Information Processing Systems (NeurIPS 2024) - Vancouver, BC, Canada
Duration: 8 Dec 2024 – 15 Dec 2024

Conference

Conference: 38th Conference on Neural Information Processing Systems (NeurIPS 2024)
Country/Territory: Canada
City: Vancouver, BC
Period: 08/12/2024 – 15/12/2024

Bibliographical note

20 pages
