POST /v2/finetune
Foundational Time Series Model Multi Series Finetuning
curl --request POST \
  --url https://api.nixtla.io/v2/finetune \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '{
  "series": {
    "y": [
      0,
      1,
      2,
      3,
      4,
      5,
      6,
      7,
      8,
      9,
      10,
      11,
      12,
      13,
      14,
      15,
      16,
      17,
      18,
      19,
      20,
      21,
      22,
      23,
      24,
      25,
      26,
      27,
      28,
      29,
      30,
      31,
      32,
      33,
      34,
      35
    ],
    "sizes": [
      36
    ]
  },
  "finetune_steps": 10,
  "freq": "MS",
  "model": "timegpt-1"
}'
{
  "input_tokens": 1,
  "output_tokens": 1,
  "finetune_tokens": 1,
  "finetuned_model_id": "<string>"
}
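The same request can be issued from Python. Below is a minimal sketch using only the standard library (`urllib.request`), mirroring the cURL example above; the bearer token is assumed to come from your own configuration and the live call is left commented out.

```python
import json
import urllib.request

API_URL = "https://api.nixtla.io/v2/finetune"


def build_finetune_payload(y, sizes, freq="MS", finetune_steps=10, model="timegpt-1"):
    """Assemble the JSON body shown in the cURL example above."""
    return {
        "series": {"y": y, "sizes": sizes},
        "finetune_steps": finetune_steps,
        "freq": freq,
        "model": model,
    }


def finetune(payload, token):
    """POST the payload with a bearer token and return the parsed JSON response."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)


# Same series as the cURL example: 36 monthly values in one series.
payload = build_finetune_payload(list(range(36)), [36])
# finetune(payload, token)  # requires a valid API key
```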

Authorizations

Authorization
string
header
required

HTTPBearer

Body

application/json
series
object
required
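The schema does not expand the `series` object inline, but the request example above implies its shape: the target values of all series concatenated into one flat `y` array, with `sizes` giving the length of each individual series in order. A sketch of packing two series this way (the helper name is ours, not part of the API):

```python
def pack_series(*series):
    """Concatenate several series into the flat {y, sizes} layout
    implied by the `series` field in the request example."""
    y, sizes = [], []
    for s in series:
        y.extend(s)       # values of this series, appended in order
        sizes.append(len(s))  # so the server can split y back apart
    return {"y": y, "sizes": sizes}


series_a = [10.0, 12.0, 13.0]
series_b = [5.0, 6.0, 7.0, 8.0]
packed = pack_series(series_a, series_b)
```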
freq
string
required

The frequency of the data represented as a string. 'D' for daily, 'M' for monthly, 'H' for hourly, and 'W' for weekly frequencies are available.

model
any

Model to use as a string. Common options are (but not restricted to) timegpt-1 and timegpt-1-long-horizon. Available options vary by user; contact ops@nixtla.io for more information. We recommend timegpt-1-long-horizon if you want to forecast more than one seasonal period given the frequency of your data.

finetune_steps
integer
default:10

The number of tuning steps used to train the large time model on the data. Set this value to 0 for zero-shot inference, i.e., to make predictions without any further model tuning.

Required range: x >= 0
finetune_loss
enum<string>
default:default

The loss used to train the large time model on the data. Select from ['default', 'mae', 'mse', 'rmse', 'mape', 'smape', 'poisson']. It is only used if finetune_steps is larger than 0. The default is a robust loss function that is less sensitive to outliers.

Available options:
default,
mae,
mse,
rmse,
mape,
smape,
poisson
finetune_depth
enum<integer>
default:1

The depth of the finetuning. Uses a scale from 1 to 5, where 1 means little finetuning, and 5 means that the entire model is finetuned. By default, the value is set to 1.

Available options:
1,
2,
3,
4,
5
output_model_id
string | null

ID to assign to the finetuned model

finetuned_model_id
string | null

ID of previously finetuned model
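Taken together, `output_model_id` and `finetuned_model_id` suggest a chained workflow: name the model produced by one finetuning run, then pass that ID back in a later request to continue from it. A sketch of the two request bodies, with field usage inferred from the descriptions above and a hypothetical model ID of our choosing:

```python
# First request: finetune from the base model and assign an ID to the result.
first = {
    "series": {"y": list(range(36)), "sizes": [36]},
    "freq": "MS",
    "finetune_steps": 10,
    "output_model_id": "my-finetuned-model",  # hypothetical ID we choose
}

# Later request: continue finetuning from the previously produced model.
second = {
    "series": {"y": list(range(36)), "sizes": [36]},
    "freq": "MS",
    "finetune_steps": 10,
    "finetuned_model_id": "my-finetuned-model",  # ID assigned in the first run
}
```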

Response

Successful Response

input_tokens
integer
required
Required range: x >= 0
output_tokens
integer
required
Required range: x >= 0
finetune_tokens
integer
required
Required range: x >= 0
finetuned_model_id
string
required
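Per the schema above, a successful response carries three non-negative token counts plus the finetuned model's ID. A short sketch of parsing and sanity-checking such a response (the JSON string and model ID below are illustrative, not real API output):

```python
import json

# Illustrative response body matching the schema above.
raw = (
    '{"input_tokens": 1, "output_tokens": 1, '
    '"finetune_tokens": 1, "finetuned_model_id": "my-finetuned-model"}'
)

resp = json.loads(raw)

# All three token counts are required integers with range x >= 0.
for field in ("input_tokens", "output_tokens", "finetune_tokens"):
    assert isinstance(resp[field], int) and resp[field] >= 0

# The ID to pass back as finetuned_model_id in subsequent requests.
model_id = resp["finetuned_model_id"]
```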