Abstract
Gradient Boosting Decision Trees (GBDTs) are popular machine learning algorithms, with dedicated implementations such as LightGBM and implementations in widely used toolkits like Scikit-Learn. Many implementations can only construct trees offline and greedily. We explore ways to convert existing GBDT implementations to known neural network architectures with minimal performance loss, so that decision splits can be updated in an online manner, and we provide extensions that allow split points to be altered by framing their selection as a neural architecture search problem. We provide learning bounds for our neural network.
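The abstract does not spell out the tree-to-network conversion, but the standard trick it alludes to is to relax each hard threshold comparison into a differentiable sigmoid gate, initialized from the fitted tree, so that split parameters can then be refined by gradient descent. The sketch below illustrates this relaxation for a single decision stump; the function names (`hard_split`, `soft_split`) and the `scale` parameter are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def hard_split(x, feature, threshold, left_value, right_value):
    """Hard decision stump, as found in a fitted GBDT tree node."""
    return np.where(x[:, feature] <= threshold, left_value, right_value)

def soft_split(x, feature, threshold, left_value, right_value, scale=10.0):
    """Differentiable relaxation of the same stump (illustrative).

    A sigmoid gate replaces the hard comparison; `scale` controls how
    sharply the gate saturates. As scale grows, the soft split recovers
    the hard split, so an existing tree's thresholds and leaf values can
    seed the network's weights before online gradient updates.
    """
    gate = 1.0 / (1.0 + np.exp(-scale * (threshold - x[:, feature])))
    return gate * left_value + (1.0 - gate) * right_value

# Example: a stump splitting on feature 0 at threshold 0.5.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=(5, 2))
print(hard_split(x, 0, 0.5, -1.0, 1.0))
print(soft_split(x, 0, 0.5, -1.0, 1.0, scale=50.0))  # close to the hard output
```

With a large `scale` the two outputs nearly coincide, while a moderate `scale` leaves the threshold trainable, which is what makes online updates to decision splits possible in this framing.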
URL
https://arxiv.org/abs/1904.11132