Project author: nickkunz

Project description:
Nested Cross-Validation for Bayesian Optimized Gradient Boosting

Language: Python
Repository: git://github.com/nickkunz/nestedhyperboost.git
Created: 2020-03-05T22:30:53Z
Project community: https://github.com/nickkunz/nestedhyperboost

License: GNU General Public License v3.0

Nested Cross-Validation for Bayesian Optimized Gradient Boosting


Description

A Python implementation that unifies Nested K-Fold Cross-Validation, Bayesian Hyperparameter Optimization, and Gradient Boosting. Designed for rapid prototyping on small to mid-sized data sets that can be manipulated within memory. It quickly obtains high-quality prediction results by abstracting away tedious hyperparameter tuning and implementation details in favor of usability and implementation speed. Bayesian Hyperparameter Optimization utilizes the Tree-structured Parzen Estimator (TPE) from the Hyperopt package. Gradient Boosting can be conducted in one of three ways: select between XGBoost, LightGBM, or CatBoost. XGBoost is applied using traditional Gradient Tree Boosting (GTB). LightGBM is applied using its novel Gradient-based One-Side Sampling (GOSS). CatBoost is applied using its novel Ordered Boosting. NestedHyperBoost can be applied to regression, multi-class classification, and binary classification problems.
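The nested scheme described above can be sketched in plain Python to show how the `k_outer` and `k_inner` folds relate: hyperparameters are tuned on the inner splits of each outer training set, and the tuned model is scored once on the held-out outer fold. This is an illustrative sketch only, not the package's implementation:

```python
from itertools import chain

def k_folds(indices, k):
    ## partition a list of row indices into k near-equal folds
    size, rem = divmod(len(indices), k)
    folds, start = [], 0
    for i in range(k):
        end = start + size + (1 if i < rem else 0)
        folds.append(indices[start:end])
        start = end
    return folds

def nested_cv(n_samples, k_outer=5, k_inner=5):
    ## yield (outer_train, outer_test, inner_splits) triples;
    ## tuning happens on inner_splits, final scoring on outer_test
    outer = k_folds(list(range(n_samples)), k_outer)
    for i, test in enumerate(outer):
        train = list(chain.from_iterable(
            f for j, f in enumerate(outer) if j != i
        ))
        inner = k_folds(train, k_inner)
        inner_splits = [
            (list(chain.from_iterable(
                f for j, f in enumerate(inner) if j != m
            )), val)
            for m, val in enumerate(inner)
        ]
        yield train, test, inner_splits
```

Because every outer test fold is never seen during tuning, the averaged outer-fold scores give a less optimistic performance estimate than tuning and scoring on the same folds.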

Features

  1. Consistent syntax across all Gradient Boosting methods.
  2. Supported Gradient Boosting methods: XGBoost, LightGBM, CatBoost.
  3. Returns custom object that includes common performance metrics and plots.
  4. Developed for readability, maintainability, and future improvement.

Requirements

  1. Python 3
  2. NumPy
  3. Pandas
  4. Matplotlib
  5. Scikit-Learn
  6. Hyperopt
  7. XGBoost
  8. LightGBM
  9. CatBoost

Installation

```sh
## install pypi release
pip install nestedhyperboost

## install developer version
pip install git+https://github.com/nickkunz/nestedhyperboost.git
```

Usage

```python
## load libraries
from nestedhyperboost import xgboost
from sklearn import datasets
import pandas

## load data
data_sklearn = datasets.load_iris()
data = pandas.DataFrame(data_sklearn.data, columns = data_sklearn.feature_names)
data['target'] = pandas.Series(data_sklearn.target)

## conduct nestedhyperboost
results = xgboost.xgb_ncv_classifier(
    data = data,
    y = 'target',
    k_inner = 5,
    k_outer = 5,
    n_evals = 10
)

## preview results
results.accu_mean()
results.conf_mtrx()
results.prfs_mean()

## preview plots
results.feat_plot()

## model and params
model = results.model
params = results.params
```
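Conceptually, a summary metric such as the mean accuracy reported after nested cross-validation is just the average of the per-outer-fold accuracies. The sketch below illustrates that aggregation with plain Python; the function names here are illustrative stand-ins, not the package's internals:

```python
def fold_accuracy(y_true, y_pred):
    ## fraction of correctly predicted labels in one outer test fold
    assert len(y_true) == len(y_pred)
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def mean_accuracy(fold_results):
    ## average accuracy across all outer test folds,
    ## given (y_true, y_pred) pairs for each fold
    accs = [fold_accuracy(y_true, y_pred) for y_true, y_pred in fold_results]
    return sum(accs) / len(accs)
```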

License

© Nick Kunz, 2019. Licensed under the GNU General Public License v3.0 (GPLv3).

Contributions

NestedHyperBoost is open to improvements and ongoing maintenance. Your help is valued in making the package better for everyone.

References

Bergstra, J., Bardenet, R., Bengio, Y., Kégl, B. (2011). Algorithms for Hyper-Parameter Optimization. https://papers.nips.cc/paper/4443-algorithms-for-hyper-parameter-optimization.pdf.

Bergstra, J., Yamins, D., Cox, D. D. (2013). Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures.
Proceedings of the 30th International Conference on International Conference on Machine Learning. 28:I115–I123.
http://proceedings.mlr.press/v28/bergstra13.pdf.

Chen, T., Guestrin, C. (2016). XGBoost: A Scalable Tree Boosting System. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 785–794.
https://www.kdd.org/kdd2016/papers/files/rfp0697-chenAemb.pdf.

Ke, G., Meng, Q., Finley, T., et al. (2017). LightGBM: A Highly Efficient Gradient Boosting Decision Tree. Proceedings of the 31st International Conference on Neural Information Processing Systems. 3146–3154. https://papers.nips.cc/paper/6907-lightgbm-a-highly-efficient-gradient-boosting-decision-tree.pdf.

Prokhorenkova, L., Gusev, G., Vorobev, A., et al. (2018). CatBoost: Unbiased Boosting with Categorical Features. Proceedings of the 32nd International Conference on Neural Information Processing Systems. 6639–6649.
http://learningsys.org/nips17/assets/papers/paper_11.pdf.