Improve Forecast Accuracy with TimeGPT
Advanced techniques to enhance TimeGPT forecast accuracy for energy and electricity.
This guide demonstrates how to improve forecast accuracy using TimeGPT. We use hourly electricity price data from Germany as an illustrative example. Before you begin, make sure you have initialized the NixtlaClient object with your API key.
Forecasting Results Overview
Below is a summary of our experiments and the corresponding accuracy improvements. We progressively refine forecasts by adding fine-tuning steps, adjusting loss functions, increasing the number of fine-tuned parameters, incorporating exogenous variables, and switching to a long-horizon model.
Steps | Description | MAE | MAE Improvement (%) | RMSE | RMSE Improvement (%) |
---|---|---|---|---|---|
0 | Zero-Shot TimeGPT | 18.5 | N/A | 20.0 | N/A |
1 | Add Fine-Tuning Steps | 11.5 | 38% | 12.6 | 37% |
2 | Adjust Fine-Tuning Loss | 9.6 | 48% | 11.0 | 45% |
3 | Fine-tune More Parameters | 9.0 | 51% | 11.3 | 44% |
4 | Add Exogenous Variables | 4.6 | 75% | 6.4 | 68% |
5 | Switch to Long-Horizon Model | 6.4 | 65% | 7.7 | 62% |
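The improvement percentages in the table follow directly from the zero-shot baseline. As a quick sanity check, the MAE column can be recomputed as:

```python
# Recompute the MAE improvement column relative to the zero-shot baseline (step 0).
baseline_mae = 18.5
step_mae = {1: 11.5, 2: 9.6, 3: 9.0, 4: 4.6, 5: 6.4}

improvement = {
    step: round((baseline_mae - mae) / baseline_mae * 100)
    for step, mae in step_mae.items()
}
print(improvement)  # {1: 38, 2: 48, 3: 51, 4: 75, 5: 65}
```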
Step-by-Step Guide
1. Install and Import Packages
Make sure all necessary libraries are installed and imported. Then set up the Nixtla client (replace with your actual API key).
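A minimal setup sketch is shown below. It assumes the `nixtla` package (`pip install nixtla`) and reads the key from a `NIXTLA_API_KEY` environment variable, which is one common convention; you can also pass the key string directly.

```python
import os

# Read the key from the environment, falling back to a placeholder.
API_KEY = os.environ.get("NIXTLA_API_KEY", "<your-api-key>")

# The import is wrapped so the sketch also runs where `nixtla`
# is not installed; in a real session, import it unconditionally.
try:
    from nixtla import NixtlaClient
    nixtla_client = NixtlaClient(api_key=API_KEY)
except ImportError:
    nixtla_client = None
```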
2. Load the Dataset
We use hourly electricity price data from Germany (unique_id == "DE"). The final two days (48 data points) form the test set.
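The split can be sketched as follows. Since the raw dataset is not reproduced here, the example uses a synthetic stand-in series with the same shape (columns `unique_id`, `ds`, `y`):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the hourly German electricity price series.
rng = np.random.default_rng(0)
timestamps = pd.date_range("2017-12-01", periods=5 * 24, freq="h")
df = pd.DataFrame({
    "unique_id": "DE",
    "ds": timestamps,
    "y": rng.normal(50, 10, len(timestamps)),
})

# Hold out the final two days (48 hourly points) as the test set.
train_df = df.iloc[:-48]
test_df = df.iloc[-48:]
print(len(train_df), len(test_df))  # 72 48
```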
Dataset Load Output
Hourly electricity price for Germany (training period highlighted).
3. Benchmark Forecast with TimeGPT
Info: We first generate a zero-shot forecast using TimeGPT, which captures overall trends but may struggle with short-term fluctuations.
Forecasting Log Output
Evaluation Metrics
unique_id | metric | TimeGPT |
---|---|---|
DE | mae | 18.519 |
DE | rmse | 20.038 |
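The MAE and RMSE above are standard point-forecast metrics. Minimal NumPy implementations (the guide itself may use a helper library for evaluation, but the definitions are as follows):

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error: average magnitude of the forecast errors."""
    return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))

def rmse(y_true, y_pred):
    """Root mean squared error: penalizes large errors more heavily."""
    return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

# Toy example with four hourly prices.
y_true = [30.0, 32.0, 35.0, 31.0]
y_pred = [28.0, 33.0, 36.0, 29.0]
print(mae(y_true, y_pred))   # 1.5
print(rmse(y_true, y_pred))  # ~1.58
```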
Zero-shot TimeGPT Forecast
4. Methods to Enhance Forecasting Accuracy
Use these strategies to refine and improve your forecast:
Add Fine-tuning Steps
Further fine-tuning iterations generally reduce errors and produce more accurate forecasts.
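In the SDK, this is controlled by the `finetune_steps` argument of `forecast`. A sketch, with an illustrative value of 30 steps (the right number depends on your data; too many steps can overfit):

```python
# Illustrative settings; 30 fine-tuning iterations is an arbitrary choice here.
forecast_kwargs = dict(h=48, finetune_steps=30)

# With a configured client and training frame, the call would look like:
# fcst_df = nixtla_client.forecast(df=train_df, **forecast_kwargs)
```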
Fine-tune Using Different Loss Functions
Trying different loss functions (e.g., MAE, MSE) can yield better results for specific use cases.
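The loss is selected via the `finetune_loss` argument. A sketch of a small sweep pattern, keeping whichever loss scores best on a validation window (the list of accepted values below is taken from the SDK's documented options):

```python
# Loss functions accepted by TimeGPT fine-tuning; "default" is the model's
# built-in objective, the rest are standard forecasting metrics.
supported_losses = ["default", "mae", "mse", "rmse", "mape", "smape"]

def kwargs_for(loss):
    """Build forecast kwargs for one candidate loss function."""
    assert loss in supported_losses
    return dict(h=48, finetune_steps=30, finetune_loss=loss)

# For each candidate: fit, forecast, and evaluate on held-out data, e.g.
# fcst_df = nixtla_client.forecast(df=train_df, **kwargs_for("mae"))
```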
Adjust Number of Fine-tuned Parameters
Including more parameters in fine-tuning helps TimeGPT adapt closely to the dataset, improving accuracy.
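This is exposed through the `finetune_depth` argument, which takes values from 1 (update only a small subset of parameters) to 5 (fine-tune the full model). Deeper fine-tuning adapts more closely to the series but takes longer and can overfit on short histories:

```python
# Illustrative depth; the best value is data-dependent.
depth_kwargs = dict(h=48, finetune_steps=30, finetune_depth=2)

# fcst_df = nixtla_client.forecast(df=train_df, **depth_kwargs)
```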
Forecast with Exogenous Variables
Incorporate external data (e.g., weather conditions) to boost predictive performance.
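Historical exogenous values ride along as extra columns in the training frame, while their future values over the forecast horizon are passed separately via `X_df`. A sketch with a hypothetical temperature feature (the column name and values are illustrative, not from the guide's dataset):

```python
import pandas as pd

# Training frame: target `y` plus a historical exogenous column.
train = pd.DataFrame({
    "unique_id": "DE",
    "ds": pd.date_range("2017-12-01", periods=24, freq="h"),
    "y": range(24),
    "temperature": [5.0] * 24,
})

# Future exogenous values for the forecast horizon: same columns, no `y`.
future_X = pd.DataFrame({
    "unique_id": "DE",
    "ds": pd.date_range("2017-12-02", periods=24, freq="h"),
    "temperature": [4.0] * 24,
})

# fcst_df = nixtla_client.forecast(df=train, X_df=future_X, h=24)
```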
Use a Long-Horizon Model
For extended forecasting periods, a model specifically trained for multi-step predictions often performs better.
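Switching models is a single argument in the SDK; `timegpt-1-long-horizon` is the variant intended for horizons that extend well beyond one seasonal period:

```python
# Select the long-horizon variant via the `model` argument.
long_horizon_kwargs = dict(h=48, model="timegpt-1-long-horizon")

# fcst_df = nixtla_client.forecast(df=train_df, **long_horizon_kwargs)
```

Note from the results table that on this 48-hour task the long-horizon model did not beat the best fine-tuned configuration; it tends to pay off as the horizon grows.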
5. Conclusion and Next Steps
Key takeaways:
- Increase the number of fine-tuning steps.
- Experiment with different loss functions.
- Fine-tune a larger share of the model's parameters.
- Incorporate exogenous data.
- Consider a specialized long-horizon forecasting model.
These strategies offer consistent improvements in forecast accuracy. We recommend systematically experimenting with each approach to find the best combination for your data.
Success: Small refinements—like adding exogenous data or adjusting fine-tuning parameters—can significantly improve your forecasting results.
Result Summary
Steps | Description | MAE | MAE Improvement (%) | RMSE | RMSE Improvement (%) |
---|---|---|---|---|---|
0 | Zero-Shot TimeGPT | 18.5 | N/A | 20.0 | N/A |
1 | Add Fine-Tuning Steps | 11.5 | 38% | 12.6 | 37% |
2 | Adjust Fine-Tuning Loss | 9.6 | 48% | 11.0 | 45% |
3 | Fine-tune More Parameters | 9.0 | 51% | 11.3 | 44% |
4 | Add Exogenous Variables | 4.6 | 75% | 6.4 | 68% |
5 | Switch to Long-Horizon Model | 6.4 | 65% | 7.7 | 62% |