XGBoost

XGBoost
Developer(s): The XGBoost Contributors
Initial release: March 27, 2014
Stable release: 2.1.4[1] / 7 February 2025
Written in: C++
Operating system: Linux, macOS, Microsoft Windows
Type: Machine learning
License: Apache License 2.0
Website: xgboost.ai

XGBoost[2] (eXtreme Gradient Boosting) is an open-source software library which provides a regularizing gradient boosting framework for C++, Java, Python,[3] R,[4] Julia,[5] Perl,[6] and Scala. It works on Linux, Microsoft Windows,[7] and macOS.[8] From the project description, it aims to provide a "Scalable, Portable and Distributed Gradient Boosting (GBM, GBRT, GBDT) Library". It runs on a single machine, as well as the distributed processing frameworks Apache Hadoop, Apache Spark, Apache Flink, and Dask.[9][10]
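A minimal sketch of how the library is typically used from Python, via its scikit-learn-style estimator interface; the synthetic dataset and the chosen hyperparameter values are illustrative assumptions, not taken from the article.

```python
import numpy as np
import xgboost as xgb

# Synthetic data for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))                   # 500 samples, 5 features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # binary labels

# Gradient-boosted tree classifier with a few common hyperparameters.
model = xgb.XGBClassifier(
    n_estimators=100,    # number of boosting rounds (trees)
    max_depth=3,         # maximum depth of each tree
    learning_rate=0.1,   # shrinkage applied to each tree's contribution
    reg_lambda=1.0,      # L2 regularization term on leaf weights
)
model.fit(X, y)
print(model.predict(X[:5]))
```

The `reg_lambda` parameter reflects the regularized objective that distinguishes XGBoost's "regularizing gradient boosting" formulation; the same model can also be trained through the lower-level `xgb.DMatrix`/`xgb.train` API.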

XGBoost gained much popularity and attention in the mid-2010s as the algorithm of choice for many winning teams of machine learning competitions.[11]

  1. ^ "Release 2.1.4 Patch Release". 7 February 2025. Retrieved 11 February 2025.
  2. ^ "GitHub project webpage". GitHub. June 2022. Archived from the original on 2021-04-01. Retrieved 2016-04-05.
  3. ^ "Python Package Index PYPI: xgboost". Archived from the original on 2017-08-23. Retrieved 2016-08-01.
  4. ^ "CRAN package xgboost". Archived from the original on 2018-10-26. Retrieved 2016-08-01.
  5. ^ "Julia package listing xgboost". Archived from the original on 2016-08-18. Retrieved 2016-08-01.
  6. ^ "CPAN module AI::XGBoost". Archived from the original on 2020-03-28. Retrieved 2020-02-09.
  7. ^ "Installing XGBoost for Anaconda in Windows". IBM. Archived from the original on 2018-05-08. Retrieved 2016-08-01.
  8. ^ "Installing XGBoost on Mac OSX". IBM. Archived from the original on 2018-05-08. Retrieved 2016-08-01.
  9. ^ "Dask Homepage". Archived from the original on 2022-09-14. Retrieved 2021-07-15.
  10. ^ "Distributed XGBoost with Dask — xgboost 1.5.0-dev documentation". xgboost.readthedocs.io. Archived from the original on 2022-06-04. Retrieved 2021-07-15.
  11. ^ "XGBoost - ML winning solutions (incomplete list)". GitHub. Archived from the original on 2017-08-24. Retrieved 2016-08-01.
