Article
Version 1
Preserved in Portico. This version is not peer-reviewed.
DEBoost: A Python Library for Weighted Distance Ensembling in Machine Learning
Received: 22 May 2020 / Approved: 23 May 2020 / Online: 23 May 2020 (04:54:39 CEST)
How to cite: Khoong, W. H. DEBoost: A Python Library for Weighted Distance Ensembling in Machine Learning. Preprints 2020, 2020050354. https://doi.org/10.20944/preprints202005.0354.v1
Abstract
In this paper, we introduce deboost, a Python library devoted to weighted distance ensembling of predictions for regression and classification tasks. It is built on the scikit-learn library for its default models and data-preprocessing functions. It offers a flexible choice of models for the ensemble, provided each model implements a predict method, as the models available from scikit-learn do. deboost is released under the MIT open-source license and can be downloaded from the Python Package Index (PyPI) at https://pypi.org/project/deboost. The source scripts are also available in a GitHub repository at https://github.com/weihao94/DEBoost.
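The weighted distance ensembling described above can be illustrated with a minimal sketch built only on scikit-learn primitives. This is an illustrative assumption of one plausible scheme, not DEBoost's actual API: each fitted model is weighted inversely to the Euclidean distance between its validation predictions and the true targets, and the ensemble prediction is the weighted sum.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import Ridge
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor

# Synthetic regression data (illustrative only)
X, y = make_regression(n_samples=300, n_features=5, noise=10.0, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

# Any estimators with a `predict` method will do, per the abstract
models = [Ridge(), DecisionTreeRegressor(random_state=0), KNeighborsRegressor()]

# Fit each model and measure the Euclidean distance between its
# validation predictions and the true validation targets
preds, dists = [], []
for m in models:
    m.fit(X_train, y_train)
    p = m.predict(X_val)
    preds.append(p)
    dists.append(np.linalg.norm(p - y_val))

# Weight each model inversely to its distance, normalized to sum to 1,
# so closer (better) models contribute more to the ensemble
weights = 1.0 / np.array(dists)
weights /= weights.sum()

# Weighted ensemble prediction over the validation set
ensemble_pred = sum(w * p for w, p in zip(weights, preds))
```

Other spatial or statistical distances (e.g. Manhattan or Mahalanobis) could replace the Euclidean norm in the same scheme; the paper's keywords suggest both families are in scope.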
Supplementary and Associated Material
https://pypi.org/project/deboost/: PyPI repository for Python library/package
https://github.com/weihao94/DEBoost: GitHub repository for the scripts
Keywords
ensemble learning; machine learning; Python; spatial distance; statistical distance; weighted ensemble
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright: This is an open-access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.