NeuroTreeModels.jl introduces NeuroTree, a differentiable tree operator for tabular data. It is implemented as a general neural network building block, similar to Dense or Conv, making it composable with the components of the Flux ecosystem.
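As a rough illustration of that composability, the sketch below drops a NeuroTree block into a Flux Chain next to a Dense layer. The constructor signature used here (input => output dimensions plus depth and ntrees keywords) is an assumption for illustration, not the package's documented API.

```julia
using Flux, NeuroTreeModels

# Hypothetical constructor signature, assumed for illustration:
# a NeuroTree block mapping 10 features to 8 outputs through an
# internal ensemble of 32 trees of depth 4.
tree = NeuroTreeModels.NeuroTree(10 => 8; depth=4, ntrees=32)

# Composable like any other Flux building block, e.g. Dense or Conv.
model = Chain(tree, Dense(8 => 1))

x = rand(Float32, 10, 64)  # 10 features × 64 observations
ŷ = model(x)               # forward pass: 1 × 64 predictions
```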
The library includes a ready-to-use regressor that follows the MLJ interface. Comprehensive benchmarks on common datasets show it is competitive with state-of-the-art boosted-tree models.
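A minimal sketch of what fitting the regressor through standard MLJ machinery might look like; the hyperparameter names used here (depth, nrounds) are guesses for illustration rather than confirmed keyword arguments.

```julia
using MLJ, NeuroTreeModels

# Synthetic regression data in MLJ's (table, target vector) form.
X = MLJ.table(rand(Float32, 200, 5))
y = rand(Float32, 200)

# Hyperparameter names below are assumptions, not documented API.
model = NeuroTreeRegressor(depth=4, nrounds=20)

mach = machine(model, X, y)  # bind model and data
fit!(mach)                   # train
ŷ = predict(mach, X)         # point predictions
```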
NeuroTree addresses the greediness of traditional tree-building algorithms by having all nodes and leaves learned simultaneously, and it incorporates the benefits of boosting and bagging through a built-in ensemble of trees.
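To make "learned simultaneously" concrete: since the whole ensemble is one differentiable function, a single backward pass produces gradients for every split and leaf parameter at once, and a standard optimiser updates them together, in contrast to greedy node-by-node induction. A sketch under the same assumed constructor as above:

```julia
using Flux, NeuroTreeModels

# Hypothetical layer constructor (same assumption as the earlier sketch).
model = NeuroTreeModels.NeuroTree(10 => 1; depth=4, ntrees=32)
opt_state = Flux.setup(Adam(1e-2), model)

x = rand(Float32, 10, 256)
y = rand(Float32, 1, 256)

# One gradient step: every node and leaf parameter of every tree in the
# ensemble receives a gradient from the same loss, with no greedy splitting.
grads = Flux.gradient(m -> Flux.Losses.mse(m(x), y), model)
Flux.update!(opt_state, model, grads[1])
```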
Computing the leaf weights consists of accumulating weights along each tree branch, which results in in-place element-wise operations that are unfriendly to auto-differentiation engines. This limitation is overcome with custom reverse rules defined through ChainRules, for both CPU and GPU.
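The pattern resembles the toy below: a hand-written forward kernel that accumulates branching probabilities along each root-to-leaf path by writing into an output buffer in place, paired with a ChainRulesCore.rrule so the AD engine uses an analytic pullback instead of tracing through the mutation. Everything here (the leaf_weights function, the tree layout) is a simplified stand-in for illustration, not the package's actual kernel.

```julia
using ChainRulesCore

# Toy stand-in: `nw[i]` is the probability of branching left at internal
# node `i` of a complete binary tree stored in breadth-first order (node 1
# is the root; children of node i are 2i and 2i + 1). Returns the
# probability of reaching each of the 2^depth leaves by accumulating the
# branching probabilities along each root-to-leaf path into `lw` in place.
function leaf_weights(nw::AbstractVector{T}, depth::Int) where {T}
    lw = ones(T, 2^depth)
    for leaf in 1:2^depth
        node = 1
        for d in (depth - 1):-1:0
            goleft = iszero(((leaf - 1) >> d) & 1)  # leaf index bit picks the branch
            lw[leaf] *= goleft ? nw[node] : one(T) - nw[node]
            node = 2node + (goleft ? 0 : 1)
        end
    end
    return lw
end

# Analytic reverse rule: the mutating loop above never has to be traced.
function ChainRulesCore.rrule(::typeof(leaf_weights), nw, depth)
    lw = leaf_weights(nw, depth)
    function leaf_weights_pullback(Δlw)
        Δnw = zero(nw)
        for leaf in 1:2^depth
            node = 1
            for d in (depth - 1):-1:0
                goleft = iszero(((leaf - 1) >> d) & 1)
                factor = goleft ? nw[node] : one(eltype(nw)) - nw[node]
                # ∂lw[leaf]/∂nw[node] = ±lw[leaf] / factor; a real kernel
                # would recompute partial products rather than divide.
                Δnw[node] += Δlw[leaf] * lw[leaf] / factor * (goleft ? 1 : -1)
                node = 2node + (goleft ? 0 : 1)
            end
        end
        return NoTangent(), Δnw, NoTangent()
    end
    return lw, leaf_weights_pullback
end
```

With the rule in place, a call like Zygote.gradient(p -> sum(leaf_weights(p, 3)), rand(7)) picks up the hand-written pullback; the package applies the same idea with dedicated CPU and GPU kernels.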
Benchmarks were run on some of the most common datasets for regression, classification, and ranking tasks, against state-of-the-art algorithms (XGBoost, LightGBM, CatBoost, EvoTrees). We show that NeuroTree is the single best performer on 2 of the 6 datasets (Higgs and YEAR) and close to best on 2 others, but performs worst on the 2 ranking tasks.
We also highlight the relevance of Julia's machine learning capabilities in the commercial context of a portfolio manager.
Jeremie is the Head of Science at Evovest. He joined the firm following his work as an applied research scientist at Element AI, where he expanded his deep learning acumen. He spent over 8 years at Intact in various R&D roles as an actuary (FCAS), where he drove the development of analytical solutions in areas including assessment, price optimization, and telematics. He also has consulting experience at Willis Towers Watson, providing pragmatic solutions to clients.