docs: tutorial on reusing fine-tuned models #579

Open · jmoralez wants to merge 2 commits into main
Conversation

@jmoralez (Member) commented Dec 19, 2024

Adds a tutorial on how to save, use and delete fine-tuned models.
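
In outline, the workflow the tutorial covers looks roughly like this (a sketch only; method and parameter names such as `finetune`, `finetuned_model_id`, `finetuned_models`, and `delete_finetuned_model` are assumed from the client and may not match the notebook exactly):

```python
import pandas as pd
from nixtla import NixtlaClient

client = NixtlaClient(api_key="YOUR_API_KEY")

# Toy monthly series in the long format the client expects (unique_id, ds, y).
df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": pd.date_range("2020-01-01", periods=48, freq="MS"),
    "y": range(48),
})

# Save: fine-tune TimeGPT and keep the resulting model on Nixtla's side.
model_id = client.finetune(df=df, finetune_steps=10)

# Use: reference the stored model in later forecast calls.
fcst = client.forecast(df=df, h=12, finetuned_model_id=model_id)

# List the fine-tuned models stored for this account.
print(client.finetuned_models())

# Delete: remove the stored model once it is no longer needed.
client.delete_finetuned_model(model_id)
```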

github-actions bot commented Dec 19, 2024

Experiment Results

Experiment 1: air-passengers

Description:

| variable | experiment |
| --- | --- |
| h | 12 |
| season_length | 12 |
| freq | MS |
| level | None |
| n_windows | 1 |

Results:

| metric | timegpt-1 | timegpt-1-long-horizon | SeasonalNaive | Naive |
| --- | --- | --- | --- | --- |
| mae | 12.6793 | 11.0623 | 47.8333 | 76 |
| mape | 0.027 | 0.0232 | 0.0999 | 0.1425 |
| mse | 213.936 | 199.132 | 2571.33 | 10604.2 |
| total_time | 1.6901 | 1.1235 | 0.0042 | 0.0032 |

Plot:

Experiment 2: air-passengers

Description:

| variable | experiment |
| --- | --- |
| h | 24 |
| season_length | 12 |
| freq | MS |
| level | None |
| n_windows | 1 |

Results:

| metric | timegpt-1 | timegpt-1-long-horizon | SeasonalNaive | Naive |
| --- | --- | --- | --- | --- |
| mae | 58.1031 | 58.4587 | 71.25 | 115.25 |
| mape | 0.1257 | 0.1267 | 0.1552 | 0.2358 |
| mse | 4040.21 | 4110.79 | 5928.17 | 18859.2 |
| total_time | 0.7642 | 0.7424 | 0.0036 | 0.0032 |

Plot:

Experiment 3: electricity-multiple-series

Description:

| variable | experiment |
| --- | --- |
| h | 24 |
| season_length | 24 |
| freq | H |
| level | None |
| n_windows | 1 |

Results:

| metric | timegpt-1 | timegpt-1-long-horizon | SeasonalNaive | Naive |
| --- | --- | --- | --- | --- |
| mae | 178.293 | 268.129 | 269.23 | 1331.02 |
| mape | 0.0234 | 0.0311 | 0.0304 | 0.1692 |
| mse | 121589 | 219467 | 213677 | 4.68961e+06 |
| total_time | 0.7579 | 1.2174 | 0.0046 | 0.0042 |

Plot:

Experiment 4: electricity-multiple-series

Description:

| variable | experiment |
| --- | --- |
| h | 168 |
| season_length | 24 |
| freq | H |
| level | None |
| n_windows | 1 |

Results:

| metric | timegpt-1 | timegpt-1-long-horizon | SeasonalNaive | Naive |
| --- | --- | --- | --- | --- |
| mae | 465.497 | 346.972 | 398.956 | 1119.26 |
| mape | 0.062 | 0.0436 | 0.0512 | 0.1583 |
| mse | 835021 | 403760 | 656723 | 3.17316e+06 |
| total_time | 0.9292 | 0.9546 | 0.0048 | 0.0042 |

Plot:

Experiment 5: electricity-multiple-series

Description:

| variable | experiment |
| --- | --- |
| h | 336 |
| season_length | 24 |
| freq | H |
| level | None |
| n_windows | 1 |

Results:

| metric | timegpt-1 | timegpt-1-long-horizon | SeasonalNaive | Naive |
| --- | --- | --- | --- | --- |
| mae | 558.673 | 459.764 | 602.926 | 1340.95 |
| mape | 0.0697 | 0.0565 | 0.0787 | 0.17 |
| mse | 1.22723e+06 | 739132 | 1.61572e+06 | 6.04619e+06 |
| total_time | 0.9568 | 1.3328 | 0.0049 | 0.0044 |

Plot:

@jmoralez marked this pull request as ready for review December 19, 2024 23:51
@@ -0,0 +1,831 @@
{
@ngupta23 (Member) commented Dec 20, 2024

Can we just rename the method to `dump`? Appending `model_` seems redundant.

@jmoralez (Member, Author) replied:

That's a method of pydantic's BaseModel, so I have no control over that. What we could do is something like NixtlaClient.finetuned_models(as_df=True) and have that return the dataframe instead. WDYT?

@ngupta23 (Member) replied:

Yes, that is a good idea.
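
For context, a rough sketch of the two options discussed above (the `as_df` flag is only the proposal from this thread, not an existing parameter, and the return type of `finetuned_models` is assumed here to be a list of pydantic objects):

```python
import pandas as pd
from nixtla import NixtlaClient

client = NixtlaClient(api_key="YOUR_API_KEY")

# Current behavior discussed above: finetuned_models() returns pydantic objects,
# so building a DataFrame goes through pydantic's BaseModel.model_dump on each entry.
models = client.finetuned_models()
df = pd.DataFrame([m.model_dump() for m in models])
print(df)

# Proposal from this thread (hypothetical flag, not necessarily the final API):
# df = client.finetuned_models(as_df=True)
```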

@ngupta23 (Member) commented:

Nice enhancement!

@@ -0,0 +1,831 @@
{
@ngupta23 (Member) commented Dec 20, 2024

Do we need to elaborate on what the limit is for storing finetuned models?

@jmoralez (Member, Author) replied:

We haven't settled on the limit; Max wants it to be related to the plan, but right now it's 50 for everyone. I'll put that in the doc for now.
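
To make the limit concrete in the tutorial, something like the following could work (a sketch only: the 50-model cap comes from the comment above, and the `id` attribute on the returned objects is an assumption):

```python
from nixtla import NixtlaClient

client = NixtlaClient(api_key="YOUR_API_KEY")

# Check how many of the (currently 50) fine-tuned model slots are in use.
models = client.finetuned_models()
print(f"{len(models)} of 50 fine-tuned models stored")

# Free a slot by deleting a model that is no longer needed.
# The `id` attribute is assumed; check the actual field name on the returned objects.
if models:
    client.delete_finetuned_model(models[0].id)
```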
