Improve Forecast Accuracy with TimeGPT

This guide demonstrates how to improve forecast accuracy using TimeGPT. We use hourly electricity price data from Germany as an illustrative example. Before you begin, make sure you have initialized the NixtlaClient object with your API key.

Forecasting Results Overview

Below is a summary of our experiments and the corresponding accuracy improvements. We progressively refine forecasts by adding fine-tuning steps, adjusting loss functions, increasing the number of fine-tuned parameters, incorporating exogenous variables, and switching to a long-horizon model.

| Steps | Description | MAE | MAE Improvement (%) | RMSE | RMSE Improvement (%) |
| --- | --- | --- | --- | --- | --- |
| 0 | Zero-Shot TimeGPT | 18.5 | N/A | 20.0 | N/A |
| 1 | Add Fine-Tuning Steps | 11.5 | 38% | 12.6 | 37% |
| 2 | Adjust Fine-Tuning Loss | 9.6 | 48% | 11.0 | 45% |
| 3 | Fine-tune More Parameters | 9.0 | 51% | 11.3 | 44% |
| 4 | Add Exogenous Variables | 4.6 | 75% | 6.4 | 68% |
| 5 | Switch to Long-Horizon Model | 6.4 | 65% | 7.7 | 62% |
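For reference, the improvement percentages above are relative reductions against the zero-shot baseline (step 0); because of rounding, a recomputed value may differ from the table by a point. A minimal sketch of the calculation:

```python
# Improvement = (baseline error - new error) / baseline error, as a percentage.
def pct_improvement(baseline: float, score: float) -> int:
    """Percent error reduction relative to the zero-shot baseline."""
    return round(100 * (baseline - score) / baseline)

# e.g. adding fine-tuning steps: MAE 18.5 -> 11.5
print(pct_improvement(18.5, 11.5))  # 38
```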

Step-by-Step Guide

Step 1: Install and Import Packages

Make sure all necessary libraries are installed and imported. Then set up the Nixtla client (replace with your actual API key).

import numpy as np
import pandas as pd
from utilsforecast.evaluation import evaluate
from utilsforecast.plotting import plot_series
from utilsforecast.losses import mae, rmse
from nixtla import NixtlaClient

nixtla_client = NixtlaClient(
    # api_key='my_api_key_provided_by_nixtla'
)

Step 2: Load the Dataset

We use hourly electricity price data from Germany (unique_id == "DE"). The final two days (48 data points) form the test set.

df = pd.read_csv('https://raw.githubusercontent.com/Nixtla/transfer-learning-time-series/main/datasets/electricity-short-with-ex-vars.csv')
df['ds'] = pd.to_datetime(df['ds'])

df_sub = df.query('unique_id == "DE"')

df_train = df_sub.query('ds < "2017-12-29"')
df_test = df_sub.query('ds >= "2017-12-29"')

df_train.shape, df_test.shape
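If the cutoff date is not known in advance, the same 48-point holdout can be taken positionally instead of with a date query. A sketch under that assumption (`split_last_h` is a hypothetical helper, not part of any library used here):

```python
import pandas as pd

def split_last_h(df: pd.DataFrame, h: int = 48) -> tuple:
    """Hold out the last h observations of each series as the test set."""
    df = df.sort_values(['unique_id', 'ds']).reset_index(drop=True)
    test = df.groupby('unique_id', group_keys=False).tail(h)
    train = df.drop(test.index)
    return train, test
```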

Hourly electricity price for Germany (training period highlighted).

Step 3: Benchmark Forecast with TimeGPT

Info: We first generate a zero-shot forecast using TimeGPT, which captures overall trends but may struggle with short-term fluctuations.

fcst_timegpt = nixtla_client.forecast(
    df=df_train[['unique_id', 'ds', 'y']],
    h=2*24,
    target_col='y',
    level=[90, 95]
)

Evaluation Metrics

| unique_id | metric | TimeGPT |
| --- | --- | --- |
| DE | mae | 18.519 |
| DE | rmse | 20.038 |

Zero-shot TimeGPT Forecast
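The metric tables in this guide can be produced with `evaluate`, `mae`, and `rmse` from `utilsforecast` (imported in Step 1). The underlying computation reduces to the following pandas-only sketch (`score_forecast` is a hypothetical helper; column names follow this guide, with `y` for actuals and `TimeGPT` for predictions):

```python
import numpy as np
import pandas as pd

def score_forecast(test_df: pd.DataFrame, fcst_df: pd.DataFrame) -> pd.DataFrame:
    """MAE and RMSE per series, joining actuals and predictions on (unique_id, ds)."""
    merged = test_df[['unique_id', 'ds', 'y']].merge(
        fcst_df[['unique_id', 'ds', 'TimeGPT']], on=['unique_id', 'ds']
    )
    err = merged['y'] - merged['TimeGPT']
    out = err.abs().groupby(merged['unique_id']).mean().rename('mae').reset_index()
    out['rmse'] = np.sqrt((err ** 2).groupby(merged['unique_id']).mean()).values
    return out
```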

Step 4: Methods to Enhance Forecasting Accuracy

Use the following strategies to refine and improve your forecast:

4.1 Add Fine-tuning Steps

Further fine-tuning typically reduces forecasting errors by adjusting the internal weights of the TimeGPT model, allowing it to better adapt to your specific data.

fcst_df = nixtla_client.forecast(
    df=df_train[['unique_id', 'ds', 'y']],
    h=24*2,
    finetune_steps=30,
    level=[90, 95]
)

Add 30 Fine-tuning Steps

Evaluation result:

| unique_id | metric | TimeGPT |
| --- | --- | --- |
| DE | mae | 11.458 |
| DE | rmse | 12.643 |

4.2 Fine-tune Using Different Loss Functions

Trying different loss functions (e.g., MAE, MSE) can yield better results for specific use cases.

fcst_df = nixtla_client.forecast(
    df=df_train[['unique_id', 'ds', 'y']],
    h=24*2,
    finetune_steps=30,
    finetune_loss='mae',
    level=[90, 95]
)

Fine-tune with MAE loss function

Evaluation result:

| unique_id | metric | TimeGPT |
| --- | --- | --- |
| DE | mae | 9.641 |
| DE | rmse | 10.956 |

4.3 Adjust Number of Fine-tuned Parameters

The finetune_depth parameter controls how many of the model's layers are fine-tuned, ranging from 1 (fewest parameters) to 5 (most). In this example, a depth of 2 lowers MAE further but slightly increases RMSE, so treat depth as a value to tune rather than assuming deeper is always better.

fcst_df = nixtla_client.forecast(
    df=df_train[['unique_id', 'ds', 'y']],
    h=24*2,
    finetune_steps=30,
    finetune_depth=2,
    finetune_loss='mae',
    level=[90, 95]
)

Fine-tune with depth of 2

Evaluation result:

| unique_id | metric | TimeGPT |
| --- | --- | --- |
| DE | mae | 9.002 |
| DE | rmse | 11.348 |

4.4 Forecast with Exogenous Variables

Incorporate external data (e.g., weather conditions) to boost predictive performance.

# Build the future exogenous frame: the test period without the target column
future_ex_vars_df = df_test.drop(columns=['y'])
future_ex_vars_df.head()
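Before passing a future frame as X_df, it's worth sanity-checking that it covers exactly the h future timestamps per series and excludes the target. A minimal validation sketch (`check_future_exog` is a hypothetical helper, not part of the nixtla package):

```python
import pandas as pd

def check_future_exog(X_df: pd.DataFrame, h: int,
                      id_col: str = 'unique_id', time_col: str = 'ds',
                      target_col: str = 'y') -> None:
    """Basic sanity checks on the future exogenous frame passed as X_df."""
    assert target_col not in X_df.columns, 'X_df must not contain the target column'
    counts = X_df.groupby(id_col)[time_col].count()
    assert (counts == h).all(), f'each series needs exactly {h} future rows'
```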

# Make the forecast with historical and future exogenous variables
fcst_df = nixtla_client.forecast(
    df=df_train,
    X_df=future_ex_vars_df,
    h=24*2,
    level=[90, 95]
)

Add exogenous variables

Evaluation result:

| unique_id | metric | TimeGPT |
| --- | --- | --- |
| DE | mae | 4.603 |
| DE | rmse | 6.359 |

4.5 Use a Long-Horizon Model

For longer forecasting periods, models optimized for multi-step predictions tend to perform better. You can enable this by setting the model parameter to timegpt-1-long-horizon. Note that in this example it improves substantially on the zero-shot baseline but does not beat the forecast with exogenous variables.

fcst_df = nixtla_client.forecast(
    df=df_train[['unique_id', 'ds', 'y']],
    h=24*2,
    model='timegpt-1-long-horizon',
    level=[90, 95]
)

Use a Long-Horizon Model

Evaluation result:

| unique_id | metric | TimeGPT |
| --- | --- | --- |
| DE | mae | 6.366 |
| DE | rmse | 7.738 |

Step 5: Conclusion and Next Steps

Key takeaways:

Each of the following strategies improved accuracy over the zero-shot baseline in this example. We recommend systematically experimenting with each approach to find the best combination for your data.

  • Increase the number of fine-tuning steps.

  • Experiment with different loss functions.

  • Fine-tune more parameters with finetune_depth.

  • Incorporate exogenous variables.

  • Switch to the long-horizon model for extended forecasting periods.

Success: Small refinements—like adding exogenous data or adjusting fine-tuning parameters—can significantly improve your forecasting results.


Result Summary

| Steps | Description | MAE | MAE Improvement (%) | RMSE | RMSE Improvement (%) |
| --- | --- | --- | --- | --- | --- |
| 0 | Zero-Shot TimeGPT | 18.5 | N/A | 20.0 | N/A |
| 1 | Add Fine-Tuning Steps | 11.5 | 38% | 12.6 | 37% |
| 2 | Adjust Fine-Tuning Loss | 9.6 | 48% | 11.0 | 45% |
| 3 | Fine-tune More Parameters | 9.0 | 51% | 11.3 | 44% |
| 4 | Add Exogenous Variables | 4.6 | 75% | 6.4 | 68% |
| 5 | Switch to Long-Horizon Model | 6.4 | 65% | 7.7 | 62% |