In domains where there is inherent randomness in the process, simple (ensemble) models tend to outperform complex ones. Nonlinear models can capture nonlinear patterns but they also tend to fit noise.
Neural network models have been shown to work really well on text, images, sound, etc., but those kinds of data have no inherent randomness in them. A piece of text is a piece of text.
Most time series forecasting, by contrast, is trying to predict quantities with complex, unmeasured causal drivers, like natural gas prices. Past behavior is no guarantee of future behavior, so capturing the nonlinear behavior of the past more closely can actually degrade future performance. Simple models tend to be more robust because they don't overly bias toward any one trend.
9 months ago (things may have changed since) someone showed simple time series models outperforming Chronos:
https://github.com/Nixtla/nixtla/tree/main/experiments/amazo...
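To make the noise-fitting point concrete, here's a minimal toy sketch (unrelated to the Nixtla benchmark above; the series, model choices, and seed are all made up for illustration): a seasonal-naive baseline against an over-flexible polynomial fit, compared out of sample.

    # Toy illustration: simple seasonal-naive baseline vs. an over-flexible
    # polynomial fit on a noisy synthetic series. Everything here is made up.
    import numpy as np

    rng = np.random.default_rng(0)

    n, season = 120, 12
    t = np.arange(n)
    # Mild trend + seasonality + heavy noise, standing in for a "random" process.
    y = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / season) + rng.normal(0.0, 2.0, n)

    t_train, t_test = t[:-season], t[-season:]
    y_train, y_test = y[:-season], y[-season:]

    # Simple model: seasonal naive, i.e. repeat the last observed season.
    naive_pred = y_train[-season:]

    # "Complex" model: a high-degree polynomial in time, standing in for an
    # over-flexible nonlinear model that chases in-sample noise.
    poly = np.polynomial.Polynomial.fit(t_train, y_train, deg=15)
    poly_pred = poly(t_test)

    def mae(truth, pred):
        return float(np.mean(np.abs(truth - pred)))

    print(f"seasonal naive   test MAE:     {mae(y_test, naive_pred):.2f}")
    print(f"deg-15 poly      test MAE:     {mae(y_test, poly_pred):.2f}")
    print(f"deg-15 poly      in-sample MAE: {mae(y_train, poly(t_train)):.2f}")

Typically the polynomial fits the training data noticeably better yet loses on the held-out season, which is the noise-fitting / robustness trade-off in miniature.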