Chronos-2

Chronos-2 is a 120M-parameter, encoder-only time series foundation model for zero-shot forecasting. It supports univariate, multivariate, and covariate-informed tasks within a single architecture. Inspired by the T5 encoder, Chronos-2 produces multi-step-ahead quantile forecasts and uses a group attention mechanism for efficient in-context learning across related series and covariates. Trained on a combination of real-world and large-scale synthetic datasets, it achieves state-of-the-art zero-shot accuracy among public models on fev-bench, GIFT-Eval, and Chronos Benchmark II. Chronos-2 is also highly efficient, delivering over 300 time series forecasts per second on a single A10G GPU and supporting both GPU and CPU inference.

Links

Paper: https://arxiv.org/abs/2510.15821

Overview

Capability | Chronos-2 | Chronos-Bolt | Chronos
Univariate Forecasting | ✅ | ✅ | ✅
Cross-learning across items | ✅ | ❌ | ❌
Multivariate Forecasting | ✅ | ❌ | ❌
Past-only (real/categorical) covariates | ✅ | ❌ | ❌
Known future (real/categorical) covariates | ✅ | 🧩 | 🧩
Max. Context Length | 8192 | 2048 | 512
Max. Prediction Length | 1024 | 64 | 64

🧩 Chronos and Chronos-Bolt do not natively support future covariates, but they can be combined with external covariate regressors (see the AutoGluon tutorial). This workaround only models per-timestep covariate effects, not effects across time. In contrast, Chronos-2 supports all covariate types natively.
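
A rough, self-contained sketch of that workaround, assuming AutoGluon's TimeSeriesPredictor API: Chronos-Bolt forecasts the target while a CatBoost covariate regressor ("CAT") models per-timestep covariate effects. The synthetic data, the "temperature" covariate name, and the hyperparameter values are illustrative assumptions; see the AutoGluon tutorial for the authoritative recipe.

# Sketch: Chronos-Bolt + external covariate regressor via AutoGluon
# Requires: pip install autogluon.timeseries
import numpy as np
import pandas as pd
from autogluon.timeseries import TimeSeriesDataFrame, TimeSeriesPredictor

# Synthetic hourly series whose target depends on a known covariate ("temperature" is made up)
timestamps = pd.date_range("2024-01-01", periods=500, freq="h")
temperature = 10 + 5 * np.sin(np.arange(500) * 2 * np.pi / 24)
df = pd.DataFrame({
    "item_id": "A",
    "timestamp": timestamps,
    "target": 2 * temperature + np.random.normal(scale=1.0, size=500),
    "temperature": temperature,
})
data = TimeSeriesDataFrame.from_data_frame(df, id_column="item_id", timestamp_column="timestamp")

# Hold out the last 24 steps; their covariate values play the role of known future covariates
history = data.slice_by_timestep(None, -24)
known_covariates = data.slice_by_timestep(-24, None).drop(columns=["target"])

predictor = TimeSeriesPredictor(
    prediction_length=24,
    target="target",
    known_covariates_names=["temperature"],
).fit(
    history,
    hyperparameters={
        # Chronos-Bolt handles the target; CatBoost models the per-timestep covariate effects
        "Chronos": {"model_path": "bolt_small", "covariate_regressor": "CAT", "target_scaler": "standard"}
    },
    enable_ensemble=False,
)
predictions = predictor.predict(history, known_covariates=known_covariates)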

Usage

Local usage

For experimentation and local inference, you can use the chronos-forecasting inference package.

Install the package

pip install "chronos-forecasting>=2.0"

Make zero-shot predictions using the pandas API

import pandas as pd  # requires: pip install 'pandas[pyarrow]'
from chronos import Chronos2Pipeline

pipeline = Chronos2Pipeline.from_pretrained("amazon/chronos-2", device_map="cuda")

# Load historical target values and past values of covariates
context_df = pd.read_parquet("https://autogluon.s3.amazonaws.com/datasets/timeseries/electricity_price/train.parquet")

# (Optional) Load future values of covariates
test_df = pd.read_parquet("https://autogluon.s3.amazonaws.com/datasets/timeseries/electricity_price/test.parquet")
future_df = test_df.drop(columns="target")

# Generate predictions with covariates
pred_df = pipeline.predict_df(
    context_df,
    future_df=future_df,
    prediction_length=24,  # Number of steps to forecast
    quantile_levels=[0.1, 0.5, 0.9],  # Quantiles for probabilistic forecast
    id_column="id",  # Column identifying different time series
    timestamp_column="timestamp",  # Column with datetime information
    target="target",  # Column(s) with time series values to predict
)
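
Covariates are optional: the future covariates above are marked "(Optional)", so omitting future_df should yield a purely history-based zero-shot forecast. The minimal sketch below builds a small synthetic single-series DataFrame and reuses only the predict_df arguments already shown; the data values are made up for illustration.

# Minimal sketch: zero-shot forecast of a single synthetic univariate series, no covariates
import numpy as np
import pandas as pd
from chronos import Chronos2Pipeline

pipeline = Chronos2Pipeline.from_pretrained("amazon/chronos-2", device_map="cuda")

# Long-format input with the same id/timestamp/target columns as the example above
timestamps = pd.date_range("2024-01-01", periods=336, freq="h")
values = np.sin(np.arange(336) * 2 * np.pi / 24) + np.random.normal(scale=0.1, size=336)
context_df = pd.DataFrame({"id": "series_1", "timestamp": timestamps, "target": values})

pred_df = pipeline.predict_df(
    context_df,                       # history only; future_df omitted
    prediction_length=24,
    quantile_levels=[0.1, 0.5, 0.9],
    id_column="id",
    timestamp_column="timestamp",
    target="target",
)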

Deploying a Chronos-2 endpoint to SageMaker

For production use, we recommend deploying Chronos-2 endpoints to Amazon SageMaker.

First, update the SageMaker SDK to make sure that all the latest models are available.

pip install -U sagemaker

Deploy an inference endpoint to SageMaker.

from sagemaker.jumpstart.model import JumpStartModel

model = JumpStartModel(
    model_id="pytorch-forecasting-chronos-2",
    instance_type="ml.g5.2xlarge",
)
predictor = model.deploy()

Now you can send time series data to the endpoint in JSON format.

import pandas as pd
df = pd.read_csv("https://raw.githubusercontent.com/AileenNielsen/TimeSeriesAnalysisWithPython/master/data/AirPassengers.csv")

payload = {
    "inputs": [
        {"target": df["#Passengers"].tolist()}
    ],
    "parameters": {
        "prediction_length": 12,
    }
}
forecast = predictor.predict(payload)["predictions"]

For more details about the endpoint API, check out the example notebook.
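
Once you are done experimenting, delete the endpoint so the ml.g5.2xlarge instance stops accruing charges. These are standard SageMaker SDK cleanup calls, not Chronos-specific APIs.

# Clean up the SageMaker resources created by model.deploy()
predictor.delete_model()
predictor.delete_endpoint()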

Training data

More details about the training data are available in the technical report.

Citation

If you find Chronos-2 useful for your research, please consider citing the associated paper:

@article{ansari2025chronos2,
  title        = {Chronos-2: From Univariate to Universal Forecasting},
  author       = {Abdul Fatir Ansari and Oleksandr Shchur and Jaris Küken and Andreas Auer and Boran Han and Pedro Mercado and Syama Sundar Rangapuram and Huibin Shen and Lorenzo Stella and Xiyuan Zhang and Mononito Goswami and Shubham Kapoor and Danielle C. Maddix and Pablo Guerron and Tony Hu and Junming Yin and Nick Erickson and Prateek Mutalik Desai and Hao Wang and Huzefa Rangwala and George Karypis and Yuyang Wang and Michael Bohlke-Schneider},
  year         = {2025},
  url          = {https://arxiv.org/abs/2510.15821}
}