
Feature Request: auto-LGBM #55

Open · azev77 opened this issue Jun 5, 2020 · 9 comments
@azev77 commented Jun 5, 2020

You probably know there are various auto-tuning / self-tuning packages, such as AutoXGBoost.
It would be awesome if, at some point, there were options to automate HP tuning for LGBM.

@yalwan-iqvia (Collaborator)

Hi @azev77

Actually, I wasn't aware of AutoXGBoost. Is there something it does which is fundamentally different from more generic hyperparameter optimisers?

The reason I ask is that we have another project, in a sort of "open preview" state (https://github.com/IQVIA-ML/TreeParzen.jl), which is a hyperparameter optimiser and has an interface to MLJ. I am, in fact, currently writing up a smallish tutorial on its usage.

@azev77 (Author) commented Jun 8, 2020

The way I see it, there are generic hyperparameter optimizers (Hyperopt.jl, etc.), which can be applied to almost any model, and there are model-specific packages (LiquidSVM). Sometimes the model-specific ones are optimized for that specific purpose... sometimes not.

I'm excited about TreeParzen.jl; I look forward to test-driving it!
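For concreteness, here is roughly what I mean by pointing a generic optimizer at LGBM: a minimal sketch assuming LightGBM.jl's MLJ interface and MLJ's built-in tuning. The field names are assumed to mirror LightGBM's parameter names, and the ranges are illustrative guesses, not recommendations.

```julia
using MLJ

# Assumes LightGBM.jl is registered with MLJ under this name.
LGBMRegressor = @load LGBMRegressor pkg=LightGBM

model = LGBMRegressor()

# Illustrative ranges; field names assumed to mirror LightGBM's parameters.
rs = [
    range(model, :num_leaves,       lower = 16,   upper = 256),
    range(model, :learning_rate,    lower = 1e-3, upper = 0.3, scale = :log),
    range(model, :min_data_in_leaf, lower = 5,    upper = 100),
]

tuned = TunedModel(
    model      = model,
    tuning     = RandomSearch(),   # any generic strategy slots in here
    resampling = CV(nfolds = 5),
    range      = rs,
    measure    = rmse,
    n          = 50,               # number of sampled candidates
)

X, y = make_regression(500, 10)    # synthetic data, just for illustration
mach = machine(tuned, X, y)
fit!(mach)
fitted_params(mach).best_model     # the winning hyperparameter combination
```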

@yalwan-iqvia (Collaborator)

Thanks for the update. Without any particular knowledge of more method-specific hyperparameter optimisers, I can't say there is currently a plan for such a thing, but I will keep it in mind 😄

@azev77 (Author) commented Jun 8, 2020

Btw, here is a post on auto-tuning LGBM with optuna: https://medium.com/optuna/lightgbm-tuner-new-optuna-integration-for-hyperparameter-optimization-8b7095e99258

Optuna has examples for LGBM, XGB, Keras, MXNet, PyTorch, etc.

I'd love to write a simple script to help auto-tune LGBM in Julia.
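Something like optuna's stepwise idea (tune one parameter group at a time, fixing the best value before moving on) could be hand-rolled on top of MLJ. A rough sketch, with made-up grids and LightGBM.jl field names assumed:

```julia
using MLJ

LGBMRegressor = @load LGBMRegressor pkg=LightGBM

# Tune one parameter at a time, keeping the best value before moving on.
# The groups and grids below are illustrative, not optuna's exact choices.
function stepwise_tune!(model, X, y; folds = 5)
    groups = [
        (:num_leaves,       [15, 31, 63, 127, 255]),
        (:min_data_in_leaf, [5, 10, 25, 50, 100]),
        (:feature_fraction, [0.4, 0.6, 0.8, 1.0]),
    ]
    for (param, vals) in groups
        scores = map(vals) do v
            m = deepcopy(model)
            setproperty!(m, param, v)
            e = evaluate(m, X, y;
                         resampling = CV(nfolds = folds),
                         measure = rmse, verbosity = 0)
            e.measurement[1]
        end
        setproperty!(model, param, vals[argmin(scores)])  # rmse: lower is better
    end
    return model
end

X, y = make_regression(500, 10)   # synthetic data, just for illustration
stepwise_tune!(LGBMRegressor(), X, y)
```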

@yalwan-iqvia (Collaborator)

https://github.com/optuna/optuna/blob/7315f5f6032ce5a9b0e8ea30e82beb110f5e428b/examples/pruning/lightgbm_integration.py#L34

What do you think? Is the suggestion here that these are the most useful parameters for tuning?

It feels a bit example-specific; I wouldn't want to turn on the random subspace method or bagging in all applications, but they're unconditionally on in this example.

I guess what I am seeing is that LGBM-specific optimisation would typically embed some prior knowledge about LGBM, and that makes me wonder whether we could embed this into the MLJ interface for use with any hyperparameter optimiser (see the sketch below). I did something similar previously with SVM hyperparameters (which is a much more geometric setting, so one can use properties of the training data to infer ranges/starting points for the hyperparameters).

In general, I prefer to go with MLJ integration routes for this type of functionality, but I definitely think there's an opportunity for collaboration here; if we can nail down something suitably simple, we can possibly get it mainlined into this module.

Feel free to continue linking materials or code snippets here and we can see what comes out of it. I'll also send a link to the TreeParzen MLJ tutorial when it's up (it uses XGB rather than LightGBM, but it's more of a usage example).
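To make the "embed prior knowledge" idea concrete: one option is a helper that returns default MLJ ranges for an LGBM model, which any MLJ-compatible optimiser (grid, random search, TreeParzen, ...) could then consume. This is a sketch only; the field names and bounds loosely mirror the optuna example linked above and are assumptions, and bagging is opt-out to address the caveat above.

```julia
using MLJ

# Hypothetical helper: default search ranges for an LGBM model, loosely
# mirroring the optuna example. Field names are assumed to follow
# LightGBM's parameter names in the MLJ interface.
function default_lgbm_ranges(model; bagging::Bool = true)
    rs = Any[
        range(model, :lambda_l1,        lower = 1e-8, upper = 10.0, scale = :log),
        range(model, :lambda_l2,        lower = 1e-8, upper = 10.0, scale = :log),
        range(model, :num_leaves,       lower = 2,    upper = 256),
        range(model, :feature_fraction, lower = 0.4,  upper = 1.0),
        range(model, :min_data_in_leaf, lower = 5,    upper = 100),
    ]
    if bagging  # opt-out, since we wouldn't always want bagging switched on
        push!(rs, range(model, :bagging_fraction, lower = 0.4, upper = 1.0))
        push!(rs, range(model, :bagging_freq,     lower = 1,   upper = 7))
    end
    return rs
end
```

The appeal is that the LGBM-specific knowledge lives in one place: `TunedModel(model = model, range = default_lgbm_ranges(model), ...)` would then work with whichever tuning strategy the user prefers.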

@yalwan-iqvia (Collaborator)

This is embarrassingly rough, but take a look if you fancy:

https://github.com/IQVIA-ML/TreeParzen.jl/pull/36/files

@azev77 (Author) commented Jun 8, 2020

I just tested your XGB code; it looks like it's coming along nicely!

Btw, it's routine to benchmark any self-tuning model as follows (example from the AutoXGBoost paper):
[two screenshots: benchmark tables from the AutoXGBoost paper]

I was actually thinking of doing something like this with MLJ at some point before I even saw your package... Perhaps we can talk on Skype?
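In MLJ terms, the benchmark loop itself is simple: evaluate each model on each dataset under identical resampling. A sketch, with `models` and `datasets` as placeholders for whatever is being compared (e.g., a default LGBM vs. a self-tuned one):

```julia
using MLJ

# Compare each model against each dataset under the same resampling.
# `models` and `datasets` are placeholders supplied by the caller.
function benchmark(models::Dict, datasets::Dict; folds = 5)
    results = Dict()
    for (dname, (X, y)) in datasets, (mname, model) in models
        e = evaluate(model, X, y;
                     resampling = CV(nfolds = folds, shuffle = true),
                     measure = rmse, verbosity = 0)
        results[(dname, mname)] = e.measurement[1]
    end
    return results
end
```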

@yalwan-iqvia (Collaborator)

We should indeed arrange a chat; @iqml would also be interested in this.

@azev77 (Author) commented Jun 9, 2020

I sent an email via Discourse.
