---
pipeline_tag: time-series-forecasting
tags:
- model_hub_mixin
- pytorch_model_hub_mixin
- time series foundation models
- pretrained models
- time series
---
|
|
|
Moirai is a large pre-trained time series model based on the Masked Encoder architecture. It is a universal forecaster capable of addressing diverse forecasting tasks across multiple domains, frequencies, and numbers of variates in a zero-shot manner.
|
|
|
This is a version of [Moirai small](https://huggingface.co/Salesforce/moirai-1.1-R-small) trained by Faculty AI. It was pre-trained on the [LOTSA data](https://huggingface.co/datasets/Salesforce/lotsa_data) using the [codebase](https://github.com/SalesforceAIResearch/uni2ts/tree/main/cli/conf/pretrain) provided by Woo et al. (2024). Both the dataset and codebase are licensed under the [Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0). For more details on the model architecture, training, and results, please refer to the [paper](https://arxiv.org/abs/2402.02592).
|
|
|
### Usage
|
|
|
Please follow the [Installation](https://github.com/SalesforceAIResearch/uni2ts?tab=readme-ov-file#%EF%B8%8F-installation) instructions and the [Getting Started](https://github.com/SalesforceAIResearch/uni2ts?tab=readme-ov-file#-getting-started) section of the uni2ts repo. To use the model trained by Faculty AI, simply pass `FacultyAI/moirai-small` when fetching the model weights:
|
|
|
```python
from uni2ts.model.moirai import MoiraiForecast, MoiraiModule

model = MoiraiForecast(
    module=MoiraiModule.from_pretrained("FacultyAI/moirai-small"),
    ...  # remaining forecast arguments, e.g. prediction_length and context_length
)
```
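
For a complete zero-shot forecast, the sketch below adapts the Getting Started pattern from the uni2ts repo to the Faculty AI weights. It is a minimal sketch, not a definitive recipe: `my_series.csv`, the 20-step horizon, and the 200-step context length are placeholder values, and the constructor and predictor calls assume the `MoiraiForecast` API as documented in uni2ts.

```python
import pandas as pd
from gluonts.dataset.pandas import PandasDataset
from gluonts.dataset.split import split
from uni2ts.model.moirai import MoiraiForecast, MoiraiModule

# Placeholder data: a wide-format CSV with a datetime index and one column per series.
df = pd.read_csv("my_series.csv", index_col=0, parse_dates=True)
ds = PandasDataset(dict(df))

# Hold out the last 100 time steps and build rolling test windows.
train, test_template = split(ds, offset=-100)
test_data = test_template.generate_instances(
    prediction_length=20,  # forecast horizon
    windows=5,             # number of evaluation windows
    distance=20,           # steps between consecutive windows
)

model = MoiraiForecast(
    module=MoiraiModule.from_pretrained("FacultyAI/moirai-small"),
    prediction_length=20,
    context_length=200,
    patch_size="auto",
    num_samples=100,
    target_dim=1,
    feat_dynamic_real_dim=ds.num_feat_dynamic_real,
    past_feat_dynamic_real_dim=ds.num_past_feat_dynamic_real,
)

# Wrap the model in a gluonts predictor and generate sample forecasts.
predictor = model.create_predictor(batch_size=32)
forecasts = predictor.predict(test_data.input)
```

Each element of `forecasts` is a gluonts sample-based forecast, so point predictions can be read off via `forecast.mean` or `forecast.quantile(0.5)`.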
|
|
|
### References
|
|
|
Woo, G., Liu, C., Kumar, A., Xiong, C., Savarese, S., & Sahoo, D. (2024). Unified Training of Universal Time Series Forecasting Transformers. arXiv preprint arXiv:2402.02592.
|
|