Questions tagged [natural-gradient-boosting]

10 questions
10
votes
4 answers

Can Boosted Trees predict below the minimum value of the training label?

I am using Gradient Boosted Trees (with CatBoost) for a regression task. Can GB trees predict a label that is below the minimum (or above the maximum) seen in training? For instance, if the minimum value the label takes is 10, would…
Yairh
  • 119
  • 1
  • 5
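A minimal sketch of how to check this empirically, using scikit-learn's GradientBoostingRegressor as a stand-in for CatBoost (both are additive tree ensembles, so the mechanism is the same); the data here is synthetic, not from the question:

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.RandomState(0)
    X = rng.uniform(0, 10, size=(500, 1))
    y = 10 + 5 * X.ravel() + rng.normal(scale=1.0, size=500)

    model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.1).fit(X, y)
    preds = model.predict(rng.uniform(0, 10, size=(2000, 1)))

    print("training label min:", y.min())
    print("prediction min:    ", preds.min())
    # Each prediction is the initial constant plus a sum of many leaf values,
    # so it can drift slightly past the training minimum/maximum, but a tree
    # ensemble cannot extrapolate far outside the observed label range.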
4
votes
1 answer

Why would GradientBoostingClassifier do better than XGBClassifier?

I am working on the Kaggle home loan model and, interestingly enough, the GradientBoostingClassifier gets a considerably better score than XGBClassifier. At the same time it seems not to overfit as much. (Note: I am running both algos with default…
callmeGuy
  • 175
  • 5
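One thing worth checking before reading too much into the gap: the two libraries ship different defaults (for example scikit-learn's GradientBoostingClassifier uses max_depth=3 and learning_rate=0.1, while XGBClassifier defaults to max_depth=6 and learning_rate=0.3), so "default vs default" is not an apples-to-apples comparison. A minimal sketch, with synthetic data standing in for the home-loan dataset:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score
    from xgboost import XGBClassifier

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

    # Same data, same CV folds, each library's own defaults.
    for name, model in [("sklearn GradientBoostingClassifier", GradientBoostingClassifier()),
                        ("xgboost XGBClassifier", XGBClassifier())]:
        scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print(name, round(scores.mean(), 4))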
3
votes
1 answer

Output value of a gradient boosting decision tree node that has just a single example in it

The general gradient boosting algorithm for tree-based classifiers is as follows: Input: training set $\{(x_{i},y_{i})\}_{i=1}^{n}$, a differentiable loss function $L(y,F(x))$, and a number of iterations $M$. Algorithm: Initialize model with a…
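For reference, the two steps the excerpt leads into are, in Friedman's standard notation (a sketch of the textbook algorithm, not quoted from the question body): the model is initialized with a constant, and each terminal region gets the output that minimizes the loss over the examples it contains; with a single example in a node, the sum collapses to one term.

$$F_0(x) = \operatorname*{arg\,min}_{\gamma} \sum_{i=1}^{n} L(y_i, \gamma), \qquad \gamma_{jm} = \operatorname*{arg\,min}_{\gamma} \sum_{x_i \in R_{jm}} L\big(y_i, F_{m-1}(x_i) + \gamma\big),$$

so for a leaf $R_{jm}$ containing only example $i$, the output is simply $\gamma_{jm} = \operatorname*{arg\,min}_{\gamma} L\big(y_i, F_{m-1}(x_i) + \gamma\big)$.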
2
votes
2 answers

House price inflation modelling

I have a data set of house prices and their corresponding features (rooms, square metres, etc.). An additional feature is the sale date of the house. The aim is to create a model that can estimate the price of a house as if it were sold today. For…
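One common way to set this up (a sketch with hypothetical column names, not the asker's data): turn the sale date into a "months before today" feature so the model can learn the market trend, then set that feature to zero at prediction time. Note this is mild extrapolation, since no training row has the feature at exactly zero.

    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor

    df = pd.DataFrame({
        "rooms": [3, 4, 2, 5, 3, 4],
        "sqm": [70, 95, 48, 120, 64, 102],
        "sold_date": pd.to_datetime(["2019-03-01", "2020-07-15", "2018-11-30",
                                     "2021-01-10", "2019-09-05", "2020-02-20"]),
        "price": [250_000, 340_000, 180_000, 450_000, 230_000, 360_000],
    })
    today = pd.Timestamp("2021-06-01")
    df["months_before_today"] = (today - df["sold_date"]).dt.days / 30.44

    features = ["rooms", "sqm", "months_before_today"]
    model = GradientBoostingRegressor().fit(df[features], df["price"])

    # Price a new house "as if sold today": time feature set to 0.
    new_house = pd.DataFrame({"rooms": [3], "sqm": [75], "months_before_today": [0.0]})
    print(model.predict(new_house))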
2
votes
1 answer

DecisionTreeRegressor under the hood of GradientBoostingClassifier

I'm inspecting the weak estimators of my GradientBoostingClassifier model, which was fit on a binary-class dataset. I noticed that all the weak estimators under this ensemble classifier are DecisionTreeRegressor objects. This seems strange to…
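This is expected behaviour rather than a bug: each boosting stage fits the continuous pseudo-residuals (negative gradients of the log-loss), so regression trees are used even for classification. A minimal sketch reproducing the observation on synthetic data:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier

    X, y = make_classification(n_samples=200, random_state=0)
    clf = GradientBoostingClassifier(n_estimators=5).fit(X, y)

    print(clf.estimators_.shape)        # (5, 1): one tree per boosting stage
    print(type(clf.estimators_[0, 0]))  # DecisionTreeRegressor, not a classifier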
2
votes
1 answer

How to enable GPU on GradientBoostingClassifier?

Is there a way to enable GPU on GradientBoostingClassifier?
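scikit-learn's GradientBoostingClassifier (and HistGradientBoostingClassifier) has no GPU option, so the usual workaround is to switch to a GPU-capable boosting library. A sketch of the relevant constructor arguments, assuming a CUDA-capable GPU and the corresponding packages installed:

    from xgboost import XGBClassifier

    clf = XGBClassifier(tree_method="hist", device="cuda")   # xgboost >= 2.0
    # older xgboost releases: XGBClassifier(tree_method="gpu_hist")
    # LightGBM equivalent:    LGBMClassifier(device="gpu")
    # CatBoost equivalent:    CatBoostClassifier(task_type="GPU")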
1
vote
1 answer

Does Gradient Boosting perform n-ary splits where n > 2?

I wonder whether algorithms such as GBM, XGBoost, CatBoost, and LightGBM perform more than two-way splits at a node of their decision trees. Can a node be split into 3 or more branches instead of only binary splits? Can more than one feature be used in…
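For the scikit-learn case the tree structure can be inspected directly: every internal node has exactly one left and one right child and tests a single feature against a single threshold (XGBoost, LightGBM, and CatBoost likewise grow binary trees). A minimal sketch:

    from sklearn.datasets import make_regression
    from sklearn.ensemble import GradientBoostingRegressor

    X, y = make_regression(n_samples=200, n_features=4, random_state=0)
    gbr = GradientBoostingRegressor(n_estimators=3, max_depth=2).fit(X, y)

    tree = gbr.estimators_[0, 0].tree_
    print(tree.children_left)    # one left-child id per node (-1 for leaves)
    print(tree.children_right)   # one right-child id per node (-1 for leaves)
    print(tree.feature)          # a single splitting feature per node (-2 for leaves)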
1
vote
1 answer

Handling Categorical Features on NGBoost

Recently I have been doing some research on NGBoost, but I could not find any parameter for categorical features. Is there a parameter that I missed? __init__(self, Dist=, Score=,…
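As far as I can tell, NGBoost exposes no categorical-features parameter (unlike CatBoost or LightGBM), so the usual workaround is to encode categoricals before fitting. A minimal sketch, assuming the ngboost package and a hypothetical "city" column:

    import pandas as pd
    from ngboost import NGBRegressor

    df = pd.DataFrame({
        "sqm":   [70, 95, 48, 120, 64, 102],
        "city":  ["oslo", "bergen", "oslo", "trondheim", "bergen", "oslo"],
        "price": [250_000, 340_000, 180_000, 450_000, 230_000, 360_000],
    })

    X = pd.get_dummies(df[["sqm", "city"]], columns=["city"])  # one-hot encode
    y = df["price"].astype(float)

    model = NGBRegressor(n_estimators=50).fit(X.values, y.values)
    print(model.predict(X.values))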
0
votes
1 answer

How to reconstruct a scikit-learn predictor for Gradient Boosting Regressor?

I would like to train on my datasets in scikit-learn but export the final Gradient Boosting Regressor elsewhere, so that I can make predictions directly on another platform. I am aware that we can obtain the individual decision trees used by the…
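For the default squared-error loss, the quantity an export has to reproduce is simply the initial constant plus learning_rate times the sum of every tree's output. A minimal sketch verifying that against scikit-learn's own predict (synthetic data, not the asker's):

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import GradientBoostingRegressor

    X, y = make_regression(n_samples=300, n_features=5, random_state=0)
    gbr = GradientBoostingRegressor(n_estimators=50, learning_rate=0.1).fit(X, y)

    # initial constant (mean of y) + learning_rate * sum of all tree outputs
    manual = gbr.init_.predict(X).ravel() + gbr.learning_rate * np.sum(
        [tree.predict(X) for tree in gbr.estimators_.ravel()], axis=0
    )
    print(np.allclose(manual, gbr.predict(X)))  # True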
0
votes
1 answer

Which other algorithms fit residuals like XGBoost?

XGBoost and standard gradient boosting train learners to fit the residuals rather than the observations themselves. I understand that this aspect of the algorithm matches the boosting mechanism, which allows it to iteratively fit the errors made by…
Bobby
  • 114
  • 6
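Essentially every gradient boosting implementation follows this scheme (scikit-learn's GradientBoosting* and HistGradientBoosting*, LightGBM, CatBoost, NGBoost); for squared error the negative gradients are literally the residuals. A minimal hand-rolled sketch of that loop:

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.tree import DecisionTreeRegressor

    X, y = make_regression(n_samples=300, n_features=5, noise=5.0, random_state=0)

    learning_rate, n_rounds = 0.1, 100
    prediction = np.full_like(y, y.mean())   # start from a constant model
    trees = []
    for _ in range(n_rounds):
        residuals = y - prediction           # what the next learner is fit to
        tree = DecisionTreeRegressor(max_depth=3).fit(X, residuals)
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)

    print(np.mean((y - prediction) ** 2))    # training MSE shrinks each round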