Controlling the Anomaly Detection Process
Learn how to refine TimeGPT’s anomaly detection process by tuning parameters for improved accuracy and alignment with specific use cases.
This notebook shows you how to refine TimeGPT’s anomaly detection process. By tuning parameters, you can align anomaly detection with specific use cases and improve accuracy.
Why Anomaly Detection?
TimeGPT leverages forecast errors to identify anomalies in your time-series data. By optimizing parameters, you can detect subtle deviations and customize results for specific use cases.
Key Parameters
• detection_size: the size of the data window used to calculate anomaly thresholds.
• level: the confidence level for the prediction intervals that define anomaly thresholds.
• freq: the frequency of your data (e.g., “D” for daily), so detection aligns with the series’ sampling interval.
1. Install and Import Dependencies
In your environment, install and import the necessary libraries:
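A minimal setup sketch, assuming pip and a standard Python environment, using the nixtla SDK together with pandas:

```python
# Install the Nixtla SDK (run once in your environment)
# pip install nixtla

import pandas as pd
from nixtla import NixtlaClient
```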
2. Initialize the Nixtla Client
Create an instance of NixtlaClient with your API key:
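For example (substitute your own key; the placeholder below is not a real credential):

```python
# Instantiate the client with your API key
nixtla_client = NixtlaClient(api_key='my_api_key_provided_by_nixtla')
```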
If you are using an Azure AI endpoint, set the base_url parameter:
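The sketch below uses a placeholder endpoint URL; replace it with your own deployment:

```python
# Point the client at an Azure AI endpoint instead of the default Nixtla API
nixtla_client = NixtlaClient(
    base_url='your_azure_ai_endpoint',
    api_key='your_api_key',
)
```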
3. Conduct a Baseline Detection
Load a portion of the Peyton Manning dataset to illustrate the default anomaly detection process:
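One way to load it is sketched below; the CSV URL is assumed to be the public copy used in other Nixtla examples, so swap in your own source if needed:

```python
# Load the (log-transformed) Peyton Manning Wikipedia page-views series
df = pd.read_csv(
    'https://datasets-nixtla.s3.amazonaws.com/peyton-manning.csv',
    parse_dates=['ds'],
)
df.tail()
```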
| | unique_id | ds | y |
|---|---|---|---|
| 2764 | 0 | 2015-07-05 | 6.499787 |
| 2765 | 0 | 2015-07-06 | 6.859615 |
| 2766 | 0 | 2015-07-07 | 6.881411 |
| 2767 | 0 | 2015-07-08 | 6.997596 |
| 2768 | 0 | 2015-07-09 | 7.152269 |
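With the data loaded, a baseline run might look like the following sketch. It assumes the client’s detect_anomalies_online method, and the specific values for h, level, detection_size, and freq are illustrative rather than prescriptive:

```python
# Baseline online anomaly detection with TimeGPT's default model settings
anomaly_df = nixtla_client.detect_anomalies_online(
    df=df,
    time_col='ds',
    target_col='y',
    freq='D',            # daily data
    h=14,                # forecast horizon for each detection window
    level=80,            # confidence level used to set anomaly thresholds
    detection_size=150,  # trailing window of observations to scan
)
```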
Baseline Detection Log Output
Baseline Anomaly Detection Visualization
4. Fine-Tuned Detection
TimeGPT detects anomalies based on forecast errors. By improving your model’s forecasts, you can strengthen anomaly detection performance. The following parameters can be fine-tuned:
• finetune_steps: Number of additional training iterations run on your own data
• finetune_depth: How much of the model is adjusted during fine-tuning (higher values tune more parameters)
• finetune_loss: Loss function used during fine-tuning (e.g., “mae”)
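A sketch of how these arguments might be passed to the same detection call; the specific values (10 steps, depth 2, MAE loss) are illustrative assumptions:

```python
# Re-run detection while lightly fine-tuning TimeGPT on this series
anomaly_df_ft = nixtla_client.detect_anomalies_online(
    df=df,
    time_col='ds',
    target_col='y',
    freq='D',
    h=14,
    level=80,
    detection_size=150,
    finetune_steps=10,    # additional training iterations on this series
    finetune_depth=2,     # how much of the model is adjusted
    finetune_loss='mae',  # loss function used during fine-tuning
)
```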
Fine-tuned Detection Log Output
Fine-tuned TimeGPT Anomaly Detection
5. Adjusting Forecast Horizon and Step Size
You can refine the resolution and sensitivity of anomaly detection by adjusting the forecast horizon (h) and the interval between successive detection windows (step_size).
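As a sketch (reusing the client and dataset from above, with assumed values), smaller h and step_size yield more frequent, finer-grained detection windows:

```python
# Shorter horizon and step size: finer-grained, more frequent detection windows
anomaly_df_short = nixtla_client.detect_anomalies_online(
    df=df,
    time_col='ds',
    target_col='y',
    freq='D',
    h=7,                 # shorter forecast horizon per window
    step_size=7,         # slide the detection window forward 7 days at a time
    level=80,
    detection_size=150,
)
```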
Adjusted Horizon and Step Size Visualization
Choosing h and step_size depends on the nature of your data:
• Frequent or short anomalies: use smaller h and step_size
• Smooth or longer trends: use larger h and step_size
You have successfully refined anomaly detection using TimeGPT. Experiment with different fine-tuning strategies, horizons, and step sizes to tailor alerts for your unique data patterns.