MOMENT: A Family of Open Time-series Foundation Models
We introduce MOMENT, a family of open-source foundation models for general-purpose time series analysis. Pre-training large models on time series data is challenging due to (1) the absence of a large and cohesive public time series repository, and (2) diverse time series characteristics that make multi-dataset training onerous. Additionally, (3) experimental benchmarks to evaluate these models, especially in scenarios with limited resources, time, and supervision, are still in their nascent stages. To address these challenges, we compile a large and diverse collection of public time series, called the Time Series Pile, and systematically tackle time series-specific challenges to unlock large-scale multi-dataset pre-training. We then build on recent work to design a benchmark for evaluating time series foundation models on diverse tasks and datasets in limited-supervision settings. Experiments on this benchmark demonstrate the effectiveness of our pre-trained models with minimal data and task-specific fine-tuning. Finally, we present several interesting empirical observations about large pre-trained time series models. Pre-trained models (AutonLab/MOMENT-1-large) and the Time Series Pile (AutonLab/Timeseries-PILE) are available on Hugging Face.
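Both released artifacts live on the Hugging Face Hub. The sketch below shows one plausible way to pull them into a Python session; it assumes the `momentfm` package exposes a `MOMENTPipeline.from_pretrained` entry point (as in its README) and uses `huggingface_hub.snapshot_download` to fetch the Time Series Pile, so the exact names and task options should be checked against the model card and repository documentation.

```python
# Minimal sketch: loading the MOMENT checkpoint and the Time Series Pile.
# Assumes `pip install momentfm huggingface_hub`; interfaces may differ by release.
from momentfm import MOMENTPipeline
from huggingface_hub import snapshot_download

# Load the pre-trained MOMENT-1-large checkpoint.
# The task name ("reconstruction") is an assumed option; other tasks such as
# embedding, forecasting, or classification may require different kwargs.
model = MOMENTPipeline.from_pretrained(
    "AutonLab/MOMENT-1-large",
    model_kwargs={"task_name": "reconstruction"},
)
model.init()

# Download a local snapshot of the Time Series Pile dataset repository.
pile_path = snapshot_download(
    repo_id="AutonLab/Timeseries-PILE",
    repo_type="dataset",
)
print(f"Time Series Pile downloaded to: {pile_path}")
```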
Further reading
- Access the paper on arXiv.org