You can now listen to the accompanying podcast here: https://soundcloud.com/eghbal-rahimikia/revisiting-time-series-foundation-models-in-finance
All models can now be loaded directly from GitHub. The repository includes utilities and setup instructions. 🔗 https://github.com/DeepIntoStreams/TSFM_Finance
We are pleased to introduce FinText-TSFM, a comprehensive suite of time series foundation models (TSFMs) comprising 613 models pre-trained for quantitative finance. This release accompanies the paper "Re(Visiting) Time Series Foundation Models in Finance" by Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025).
Finance-Native Pre-training:
Models are pre-trained from scratch on large-scale financial time series datasets — including daily excess returns across 89 markets and over 2 billion observations — to ensure full temporal and domain alignment.
Bias-Free Design:
Pre-training strictly follows a chronological expanding-window setup, avoiding any look-ahead bias or information leakage.
Each variant includes 23 separately pre-trained models, corresponding to each year from 2000 to 2023, with training data starting in 1990.
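The expanding-window setup above can be sketched as follows. This is an illustrative reconstruction, assuming the model associated with a given year is trained only on data available through the end of the preceding year; the function and variable names are hypothetical, not from the repository.

```python
# Hedged sketch of a chronological expanding-window pre-training schedule:
# each yearly model sees only data strictly before its own year, so no
# look-ahead information can leak into pre-training.

def expanding_window_schedule(data_start=1990, first_model_year=2000,
                              last_model_year=2023):
    """Return (model_year, train_start_year, train_end_year) triples."""
    return [(year, data_start, year - 1)
            for year in range(first_model_year, last_model_year + 1)]

schedule = expanding_window_schedule()
# Earliest model trains on 1990-1999; the latest trains on 1990-2022.
```

Under this assumption, the training window only ever grows forward in time, which is what rules out information leakage across the yearly model snapshots.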
Model Families:
This release includes variants of the Chronos and TimesFM architectures adapted for financial time series.
Model Collections:
Performance Insights:
Our findings show that off-the-shelf TSFMs underperform in zero-shot forecasting of financial returns, while finance-pretrained models deliver substantial gains in both predictive accuracy and portfolio performance.
Evaluation Scope:
Models are benchmarked on the U.S. and seven international markets using rolling context windows of 5, 21, 252, and 512 days, producing over 18 million out-of-sample forecasts spanning 22 years (2001–2023) of daily excess returns, evaluated on both statistical and economic performance criteria.
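The rolling-window evaluation above can be sketched as follows. This is a minimal illustration on synthetic data, assuming each forecast consumes the most recent window of daily excess returns as context and targets the next day's value; the helper function is hypothetical and stands in for the actual TSFM forecasting call.

```python
# Hedged sketch: building (context, target) pairs for rolling-window,
# one-step-ahead evaluation with the context lengths used in the paper
# (5, 21, 252, and 512 trading days).
import numpy as np

def rolling_windows(returns, window):
    """Yield (context, target) pairs: `window` past returns and the next value."""
    for t in range(window, len(returns)):
        yield returns[t - window:t], returns[t]

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=600)  # synthetic daily excess returns

for window in (5, 21, 252, 512):
    pairs = list(rolling_windows(returns, window))
    # Each context holds exactly `window` observations; the target is
    # always strictly after the context, so evaluation stays out-of-sample.
```

Scaled across markets, assets, and the 2001–2023 period, this pairing scheme is what drives the forecast count into the millions.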
Please cite the accompanying paper if you use these models:
Re(Visiting) Time Series Foundation Models in Finance.
Rahimikia, Eghbal; Ni, Hao; Wang, Weiguan.
SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5770562
This project was made possible through computational and institutional support from:
Developed by:
Alliance Manchester Business School, University of Manchester
Department of Mathematics, University College London (UCL)
Powered by:
Isambard-AI, Bristol Centre for Supercomputing (BriCS)
The Bede Supercomputer