Update README.md
initial model card

README.md CHANGED
@@ -1,3 +1,31 @@
----
-license: apache-2.0
----
+---
+license: apache-2.0
+library_name: timesfm
+pipeline_tag: time-series-forecasting
+---
+
+# TimesFM
+
+TimesFM (Time Series Foundation Model) is a pretrained time-series foundation model developed by Google Research for time-series forecasting.
+
+**Resources and Technical Documentation**:
+* Paper: [A decoder-only foundation model for time-series forecasting](https://arxiv.org/abs/2310.10688), ICML 2024.
+* [Google Research blog](https://research.google/blog/a-decoder-only-foundation-model-for-time-series-forecasting/)
+* [GitHub repo](https://github.com/google-research/timesfm)
+
+**Authors**: Google Research
+
+This checkpoint is not an officially supported Google product. See [TimesFM in BigQuery](https://cloud.google.com/bigquery/docs/timesfm-model) for official Google support.
+
+## Checkpoint `timesfm-2.5-200m`
+
+`timesfm-2.5-200m` is the third open model checkpoint.
+
+### Data
+
+`timesfm-2.5-200m` is pretrained on the following data sources:
+
+- [GiftEvalPretrain](https://huggingface.co/datasets/Salesforce/GiftEvalPretrain)
+- [Wikimedia Pageviews](https://meta.wikimedia.org/wiki/Pageviews_Analysis), cutoff end of 2022.
+- [Google Trends](https://trends.google.com/trends/) top queries, cutoff end of 2022 (see the [paper](https://arxiv.org/abs/2310.10688) for details).
+- Synthetic data.
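
For orientation, below is a minimal forecasting sketch using the open-source `timesfm` Python package (`pip install timesfm`). It follows the 2.x loading pattern documented in the GitHub repo; whether the 2.5 checkpoint loads through this same interface, and the exact Hugging Face repo id, are assumptions here, so treat the repo's own instructions as authoritative.

```python
# Hedged sketch: follows the timesfm 2.x API from the GitHub repo.
# The repo id and compatibility with the 2.5 checkpoint are assumptions.
import numpy as np
import timesfm

tfm = timesfm.TimesFm(
    hparams=timesfm.TimesFmHparams(
        backend="cpu",            # or "gpu" if available
        per_core_batch_size=32,
        horizon_len=128,          # forecast horizon in time steps
    ),
    checkpoint=timesfm.TimesFmCheckpoint(
        huggingface_repo_id="google/timesfm-2.5-200m",  # assumed repo id
    ),
)

# Forecast a batch of univariate context series. `freq` is a coarse frequency
# bucket: 0 = high frequency (daily or finer), 1 = medium (weekly/monthly), 2 = low.
point_forecast, quantile_forecast = tfm.forecast(
    inputs=[np.sin(np.arange(256) / 10.0)],  # one toy context series
    freq=[0],
)
print(point_forecast.shape)  # (num_series, horizon_len)
```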