Feature Request: auto-LGBM #55
Comments
Hi @azev77 Actually I wasn't aware of AutoXGBoost. Is there something it does which is fundamentally different from more generic hyperparameter optimisers? The reason I ask is that we have another project, in a sort of "open preview" state (https://github.com/IQVIA-ML/TreeParzen.jl), which is a hyperparameter optimiser and has an interface to MLJ. I am, in fact, currently writing up a small-ish tutorial on its usage.
The way I see it, there are generic hyperparameter optimizers (Hyperopt.jl, etc.) which can be applied to almost any model, and there are model-specific packages (LiquidSVM). Sometimes the model-specific ones are optimized for that specific purpose, sometimes not. I'm excited about TreeParzen.jl and look forward to test-driving it!
Thanks for the update. Without any particular knowledge about more method-specific hyperparameter optimisers, I can't say there is currently a plan for such a thing, but I will keep this in mind 😄
By the way, here is a post on auto-tuning LGBM with Optuna (https://medium.com/optuna/lightgbm-tuner-new-optuna-integration-for-hyperparameter-optimization-8b7095e99258). Optuna has examples for LGBM, XGB, Keras, MXNet, PyTorch, etc. I'd love to write a simple script to help auto-tune LGBM in Julia.
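To make the idea concrete, here is a minimal, untested sketch of what such a script could look like using MLJ's built-in `TunedModel` with `RandomSearch`, rather than Optuna. The parameter names (`num_leaves`, `min_data_in_leaf`, `feature_fraction`, `bagging_fraction`) and the bounds are my assumptions about what the MLJ wrapper exposes, loosely following the parameters the Optuna LightGBM tuner targets:

```julia
using MLJ

# Load the MLJ wrapper for the regressor (assumes it is registered with MLJ
# under this name via LightGBM.jl).
LGBMRegressor = @load LGBMRegressor pkg=LightGBM

# Toy data just to keep the sketch self-contained.
X, y = make_regression(500, 10)

model = LGBMRegressor()

# Roughly the parameters the Optuna LightGBM tuner searches over; the field
# names and bounds here are assumptions, not tested recommendations.
# Note: in LightGBM, bagging_fraction only has an effect when bagging_freq > 0.
r = [
    range(model, :num_leaves,       lower=15,  upper=255),
    range(model, :min_data_in_leaf, lower=5,   upper=100),
    range(model, :feature_fraction, lower=0.4, upper=1.0),
    range(model, :bagging_fraction, lower=0.4, upper=1.0),
]

tuned = TunedModel(model=model,
                   tuning=RandomSearch(),
                   resampling=CV(nfolds=5),
                   ranges=r,
                   measure=rms,
                   n=50)

mach = machine(tuned, X, y)
fit!(mach)
fitted_params(mach).best_model
```

Since `TunedModel` only needs a tuning strategy plus a set of ranges, a TreeParzen-style strategy could in principle be swapped in for `RandomSearch()` once its MLJ integration is ready.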
What do you think? Is the suggestion that these are the most useful parameters for tuning? It feels a bit example-specific: I wouldn't want to turn on the random-subspace method or bagging in all applications, but they are unconditionally enabled in that example.

I guess what I am seeing is that LGBM-specific optimisation would typically mean embedding some prior knowledge about LGBM, which makes me wonder whether we can embed that into the MLJ interface for use with any hyperparameter optimiser (see the sketch below). I did something similar previously with SVM hyperparameters, which are much more geometric, so one could use properties of the training data to infer ranges/starting points for the hyperparameters.

In general I prefer to go the MLJ-integration route for this type of functionality, but I definitely think there's an opportunity for collaboration here; if we can nail down something suitably simple, we could possibly get it mainlined into this module. Feel free to keep linking materials or code snippets here and we'll see what comes out of it. I'll also send a link to the TreeParzen MLJ tutorial when it's up (it uses XGB, not LightGBM, but it's more of a usage example).
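For the "embed prior knowledge into the MLJ interface" idea, a rough, hypothetical sketch: a helper that returns a default set of MLJ ranges for an LGBM model. The function name `default_lgbm_ranges`, the field names, and the bounds are all illustrative assumptions rather than an existing API; the point is only that any generic tuning strategy could consume the same ranges.

```julia
using MLJ

# Hypothetical helper: encode "sensible" LGBM search ranges once, so that any
# generic optimiser (Grid, RandomSearch, TreeParzen, ...) can reuse them.
# Field names and bounds are illustrative assumptions only.
function default_lgbm_ranges(model)
    return [
        range(model, :num_leaves,       lower=15,   upper=255),
        range(model, :learning_rate,    lower=1e-3, upper=0.3, scale=:log),
        range(model, :min_data_in_leaf, lower=5,    upper=100),
        range(model, :feature_fraction, lower=0.4,  upper=1.0),
    ]
end

# Any MLJ tuning strategy could then consume these ranges, e.g.:
# tuned = TunedModel(model=model, tuning=Grid(resolution=5),
#                    resampling=CV(nfolds=5), measure=rms,
#                    ranges=default_lgbm_ranges(model))
```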
This is embarrassingly rough, but take a look if you fancy: |
We should indeed arrange a chat; @iqml would also be interested in this.
I sent an email via Discourse |
You probably know there are various auto-tuning / self-tuning packages, such as AutoXGBoost, etc.
It would be awesome if, at some point, there were options to automate HP tuning for LGBM.