| Developer(s) | The XGBoost Contributors |
|---|---|
| Initial release | March 27, 2014 |
| Stable release | 2.1.4[1] |
| Written in | C++ |
| Operating system | Linux, macOS, Microsoft Windows |
| Type | Machine learning |
| License | Apache License 2.0 |
| Website | xgboost |
XGBoost[2] (eXtreme Gradient Boosting) is an open-source software library that provides a regularizing gradient boosting framework for C++, Java, Python,[3] R,[4] Julia,[5] Perl,[6] and Scala. It works on Linux, Microsoft Windows,[7] and macOS.[8] According to the project description, it aims to provide a "Scalable, Portable and Distributed Gradient Boosting (GBM, GBRT, GBDT) Library". It runs on a single machine as well as on the distributed processing frameworks Apache Hadoop, Apache Spark, Apache Flink, and Dask.[9][10]
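To illustrate the idea behind a regularized gradient boosting framework, the sketch below fits an ensemble of depth-1 regression trees (stumps) to squared-error residuals in pure Python. This is a conceptual sketch only, not XGBoost's implementation or API; all function names here are illustrative. The leaf-weight formula `sum(residuals) / (count + reg_lambda)` mirrors the shape of XGBoost's regularized leaf weight −G/(H + λ) for squared-error loss, where the Hessian sum H reduces to the leaf's sample count.

```python
def fit_stump(xs, residuals, reg_lambda=1.0):
    """Fit a single threshold split on a 1-D feature to the residuals,
    returning a predict function. Illustrative only, not XGBoost code."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue  # degenerate split, skip
        # Regularized leaf weights: sum of gradients over (count + lambda),
        # analogous in shape to XGBoost's -G/(H + lambda) for squared error.
        lw = sum(left) / (len(left) + reg_lambda)
        rw = sum(right) / (len(right) + reg_lambda)
        sse = (sum((r - lw) ** 2 for r in left)
               + sum((r - rw) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lw, rw)
    _, t, lw, rw = best
    return lambda x: lw if x <= t else rw

def gradient_boost(xs, ys, n_rounds=20, learning_rate=0.3):
    """Boosting loop: each round fits a stump to the current residuals
    (the negative gradient of squared-error loss) and shrinks its
    contribution by the learning rate."""
    preds = [0.0] * len(ys)
    stumps = []
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + learning_rate * stump(x) for p, x in zip(preds, xs)]
    return lambda x: sum(learning_rate * s(x) for s in stumps)

# Toy regression: a step function of x.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
model = gradient_boost(xs, ys)
```

After training, `model(0.5)` is close to 0 and `model(4.5)` is close to 1: each boosting round shrinks the remaining residual by a constant factor, so the ensemble converges geometrically toward the step function. Real GBDT libraries generalize this with deeper trees, arbitrary differentiable losses via gradients and Hessians, and additional regularization terms.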
XGBoost gained wide popularity and attention in the mid-2010s as the algorithm of choice for many winning teams in machine learning competitions.[11]