
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow


migolan/xgboost

This branch is 1 commit ahead of, 115 commits behind dmlc/xgboost:master.


eXtreme Gradient Boosting


Community | Documentation | Resources | Contributors | Release Notes

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, Dask, Spark, PySpark) and can solve problems beyond billions of examples.
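As a quick illustration, below is a minimal sketch of training a model through the Python package's scikit-learn-compatible estimator (assuming xgboost and scikit-learn are installed); the dataset and parameter values are arbitrary stand-ins, not recommendations.

```python
# Minimal sketch: fit a gradient-boosted tree classifier on a toy dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each boosting round adds a tree fit to the gradients of the loss
# evaluated on the ensemble built so far; parameters here are illustrative.
model = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
model.fit(X_train, y_train)

print("test accuracy:", model.score(X_test, y_test))
```

The same estimator API scales from a single machine to the distributed backends listed above (Dask, Spark, etc.) through the corresponding xgboost integrations.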

License

© Contributors, 2021. Licensed under the Apache-2.0 license.

Contribute to XGBoost

XGBoost has been developed and used by a group of active community members. Your help is very valuable in making the package better for everyone. Check out the Community Page.

Reference

  • Tianqi Chen and Carlos Guestrin. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016.
  • XGBoost originated from a research project at the University of Washington.

Sponsors

Become a sponsor and get a logo here. See details at Sponsoring the XGBoost Project. The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).

Open Source Collective sponsors


Sponsors

[Become a sponsor]

NVIDIA

Backers

[Become a backer]

