Please add CatBoost or any alternate package (pure Julia) which can beat it #992
Comments
There is a discussion about this here: JuliaAI/CatBoost.jl#9. It seems a pity you are not able to get what you want from EvoTrees.jl, which should be similar to CatBoost but pure Julia. In my view, it's a better use of limited resources to improve pure Julia implementations than to wrap Python/C implementations. And in the case of gradient tree boosters, we already have Julia and MLJ interfaces for XGBoost and LightGBM. Do we really need a fourth tree booster? Of course, if someone is interested in an MLJ interface for CatBoost.jl, then I am happy to provide guidance.
I also think EvoTrees is a good option in Julia. My request: please add an example of EvoTrees with TreeParzen optimization in MLJ, and show that it improves the quality of the results. Sorry for the discussion, but these are Julia-learner problems. Once I get proper results I will continue with Julia; otherwise I am forced back to Python. You could use the Boston or Ames MLJ datasets for the implementation.
I think these various tools are individually well documented. If you have a specific tutorial you'd like to see, please make a request at https://github.com/JuliaAI/DataScienceTutorials.jl/issues
Yes, they are well documented individually, which is really helpful for beginners and has helped me a lot. But there is a problem getting the best results. We chose Julia for fast, high-quality results. MLJ works well with all of these tools (especially the pure Julia packages), and MLJ + EvoTrees + TreeParzen (or Latin hypercube sampling, etc.) should beat Python-based XGBoost/LightGBM with Optuna optimization. You have given examples of how to use the tools, but not how to get the best out of them. As a Julia learner, this is just a suggestion. Thanks for your efforts and cooperation @ablaom
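For readers landing on this thread, the workflow being requested can be sketched with MLJ's generic tuning wrapper. This is a minimal, hedged example, not an official recipe: it uses `RandomSearch()` as a stand-in for TreeParzen-style optimization (TreeParzen.jl ships its own MLJ tuning strategy that could be substituted), a synthetic dataset from `make_regression` in place of Boston/Ames, and assumes the usual `EvoTreeRegressor` hyperparameter names (`nrounds`, `eta`, `max_depth`):

```julia
using MLJ  # assumes MLJ.jl and EvoTrees.jl are in the active environment

# Load the EvoTrees regressor model type through MLJ's model registry
EvoTreeRegressor = @load EvoTreeRegressor pkg=EvoTrees verbosity=0

# Synthetic regression data as a stand-in for Boston/Ames
X, y = make_regression(200, 5)

model = EvoTreeRegressor(nrounds=200)

# Hyperparameter ranges to search over
r_eta   = range(model, :eta, lower=0.01, upper=0.3, scale=:log)
r_depth = range(model, :max_depth, lower=3, upper=8)

# Wrap the model in a self-tuning model; swap `tuning` for the
# TreeParzen strategy from TreeParzen.jl to get the requested optimizer
tuned = TunedModel(
    model=model,
    tuning=RandomSearch(),
    resampling=CV(nfolds=5),
    range=[r_eta, r_depth],
    measure=rms,
    n=30,
)

mach = machine(tuned, X, y)
fit!(mach, verbosity=0)

# Inspect the best hyperparameters found
report(mach).best_model
```

The same pattern applies to any MLJ-compatible booster, so a like-for-like comparison against XGBoost or LightGBM only requires swapping the model type inside `TunedModel`.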
Yes, I will make the request as you mentioned.
Thank you very much, @ablaom.
You're welcome. I'm just providing guidance. The main work is being carried out by @tylerjthomas9.
Closed as completed: https://github.com/JuliaAI/CatBoost.jl#mlj-example |
I'm committed to learning Julia. I have tried MLJFlux, BetaML, MLJ models, EvoTrees, etc.,
but nothing is giving results similar to CatBoost in performance and quality (MLJFlux is close, but computationally expensive).
I'm testing regression models and need high-quality results for a research publication.
I appreciate the entire Julia team and MLJ. I like it.
I hope you will consider my request.
Thank you