Questions tagged [natural-gradient-boosting]
10 questions
10 votes · 4 answers
Can Boosted Trees predict below the minimum value of the training label?
I am using Gradient Boosted Trees (with CatBoost) for a regression task. Can GB trees predict a label that is below the minimum (or above the maximum) seen in the training data?
For instance, if the minimum value of the label is 10, would…
Yairh · 119
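A quick way to probe the question above empirically is the sketch below, with synthetic data standing in for the asker's dataset (the labels are confined to [10, 20] by construction):

    import numpy as np
    from catboost import CatBoostRegressor

    # Synthetic labels confined to [10, 20]: any prediction below 10 or above 20
    # would show the ensemble moving outside the training label range.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))
    y = 10 + 10 * rng.random(500)

    model = CatBoostRegressor(iterations=200, verbose=False)
    model.fit(X, y)

    preds = model.predict(X)
    print("label range:     ", y.min(), y.max())
    print("prediction range:", preds.min(), preds.max())

Boosted trees predict piecewise-constant functions built from training residuals, so they cannot extrapolate a trend outside the training feature range the way a linear model would; whether individual predictions can land slightly outside the observed label range is exactly what the experiment checks.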
4 votes · 1 answer
Why would GradientBoostingClassifier do better than XGBClassifier?
I am working on the Kaggle home loan model and, interestingly enough, GradientBoostingClassifier has a considerably better score than XGBClassifier. At the same time it seems not to overfit as much. (Note: I am running both algorithms with default…
callmeGuy · 175
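For the comparison above, a reproducible baseline on synthetic data (an assumption-laden stand-in for the Kaggle home-loan data, which is not shown in the question):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score
    from xgboost import XGBClassifier

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

    models = {
        "GradientBoostingClassifier": GradientBoostingClassifier(random_state=0),
        "XGBClassifier": XGBClassifier(random_state=0, eval_metric="logloss"),
    }
    for name, clf in models.items():
        auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
        print(f"{name}: mean AUC = {auc:.4f}")

The defaults differ substantially (scikit-learn grows 100 depth-3 trees with learning_rate=0.1, while XGBoost defaults to deeper trees and a larger learning rate), so out-of-the-box scores say little until both are tuned.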
3 votes · 1 answer
Output value of a gradient boosting decision tree node that has just a single example in it
The general gradient boosting algorithm for tree-based classifiers is as follows:
Input: training set $\{(x_{i},y_{i})\}_{i=1}^{n}$, a differentiable loss function $L(y,F(x))$, and a number of iterations $M$.
Algorithm:
Initialize model with a…
figs_and_nuts · 775
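For the question above, the standard leaf-output step under binomial deviance (one Newton step per terminal region, following Friedman's formulation; the single-example case is simply the sum over one point):

$$\gamma_{jm} = \frac{\sum_{x_i \in R_{jm}} \tilde{y}_i}{\sum_{x_i \in R_{jm}} p_i\,(1-p_i)}, \qquad \tilde{y}_i = y_i - p_i .$$

A terminal region holding a single example $i$ therefore gives $\gamma_{jm} = \dfrac{y_i - p_i}{p_i(1-p_i)}$, which grows without bound when that example is confidently misclassified (e.g. $y_i = 1$ with $p_i \to 0$ yields $1/p_i$); implementations typically clip the leaf value or add a regularization term to the denominator.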
2 votes · 2 answers
House price inflation modelling
I have a data set of house prices and their corresponding features (rooms, square meters, etc.). An additional feature is the sold date of the house. The aim is to create a model that can estimate the price of a house as if it were sold today. For…
Melly Donald · 21
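One common way to set up the date feature so the model can answer "price as if sold today", sketched on hypothetical column names (sold_date, rooms, sqm, price are assumptions, not the asker's schema):

    import pandas as pd

    df = pd.DataFrame({
        "sold_date": pd.to_datetime(["2015-03-01", "2018-07-15", "2021-11-30"]),
        "rooms": [3, 4, 2],
        "sqm": [70, 95, 55],
        "price": [200_000, 310_000, 280_000],
    })

    # Encode the sale date as "months before today" so the model can absorb the
    # market-level price trend as an ordinary numeric feature.
    today = pd.Timestamp.today()
    df["months_ago"] = (today - df["sold_date"]).dt.days / 30.44

    X = df[["rooms", "sqm", "months_ago"]]
    y = df["price"]
    # At valuation time, set months_ago = 0 to ask for today's price.

Note that tree ensembles cannot extrapolate that trend beyond the dates seen in training, so an explicit inflation index or a time-aware model may still be needed.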
2 votes · 1 answer
DecisionTreeRegressor under the hood of GradientBoostingClassifier
I'm inspecting the weak estimators of my GradientBoostingClassifier model, which was fit on a binary classification dataset.
I noticed that all the weak estimators in this ensemble classifier are DecisionTreeRegressor objects. This seems strange to…
Oliver Foster · 862
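The observation in the question above can be reproduced directly; a minimal sketch:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier

    X, y = make_classification(n_samples=200, random_state=0)
    clf = GradientBoostingClassifier(n_estimators=5, random_state=0).fit(X, y)

    # Every stage fits the real-valued negative gradient of the log-loss, not the
    # class labels themselves, which is why the base learners are regressors.
    print(clf.estimators_.shape)        # (5, 1) for a binary problem
    print(type(clf.estimators_[0, 0]))  # DecisionTreeRegressor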
2 votes · 1 answer
How to enable GPU on GradientBoostingClassifier?
Is there a way to enable GPU on GradientBoostingClassifier?
callmeGuy · 175
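scikit-learn's GradientBoostingClassifier has no GPU option; the usual route is to switch to a library with GPU support. The parameters below are the commonly documented ones, and their availability depends on the library version and on having a GPU-enabled build:

    from xgboost import XGBClassifier
    from lightgbm import LGBMClassifier
    from catboost import CatBoostClassifier

    xgb = XGBClassifier(tree_method="gpu_hist")        # newer XGBoost versions: device="cuda"
    lgbm = LGBMClassifier(device="gpu")                # requires the GPU build of LightGBM
    cat = CatBoostClassifier(task_type="GPU", verbose=False)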
1 vote · 1 answer
Does Gradient Boosting perform n-ary splits where n > 2?
I wonder whether algorithms such as GBM, XGBoost, CatBoost, and LightGBM perform more than two splits at a node in their decision trees. Can a node be split into three or more branches instead of merely binary splits? Can more than one feature be used in…
Chong Lip Phang · 221
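One way to check this empirically is to dump the trees of a fitted booster; a sketch with XGBoost (the text dump is XGBoost's own format):

    from sklearn.datasets import make_classification
    from xgboost import XGBClassifier

    X, y = make_classification(n_samples=200, n_features=5, random_state=0)
    model = XGBClassifier(n_estimators=2, max_depth=2).fit(X, y)

    # Each internal node in the dump has one split feature and exactly two
    # children ("yes"/"no" branches): the splits are binary and univariate.
    # Multiway behaviour only arises from chaining several binary splits.
    for i, tree in enumerate(model.get_booster().get_dump()):
        print(f"--- tree {i} ---\n{tree}")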
1 vote · 1 answer
Handling Categorical Features on NGBoost
Recently I have been doing some research on NGBoost, but I could not see any parameter for categorical features. Is there any parameter that I missed?
__init__(self, Dist=, Score=,…
kaanbay · 23
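NGBoost's constructor (the __init__ shown above) has no categorical-features parameter, so the usual workaround is to encode categories before fitting; a minimal sketch with made-up data (city, size, price are hypothetical columns):

    import pandas as pd
    from ngboost import NGBRegressor

    df = pd.DataFrame({
        "city": ["A", "B", "A", "C", "B", "A"],
        "size": [70, 95, 55, 120, 80, 60],
        "price": [200.0, 310.0, 180.0, 400.0, 290.0, 210.0],
    })

    # One-hot encode the categorical column before it reaches NGBoost.
    X = pd.get_dummies(df[["city", "size"]], columns=["city"]).astype(float)
    y = df["price"]

    model = NGBRegressor(n_estimators=50, verbose=False)
    model.fit(X.values, y.values)
    print(model.predict(X.values))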
0 votes · 1 answer
How to reconstruct a scikit-learn predictor for Gradient Boosting Regressor?
I would like to train on my datasets in scikit-learn but export the final Gradient Boosting Regressor elsewhere so that I can make predictions directly on another platform.
I am aware that we can obtain the individual decision trees used by the…
Chong Lip Phang · 221
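For the export question above, the pieces needed are the initial estimator, the learning rate, and the individual trees; a sketch, assuming the default squared-error loss, where the decomposition reproduces predict() exactly:

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import GradientBoostingRegressor

    X, y = make_regression(n_samples=300, n_features=5, random_state=0)
    gbr = GradientBoostingRegressor(n_estimators=50, random_state=0).fit(X, y)

    # prediction = initial estimate + learning_rate * sum of tree outputs
    manual = gbr.init_.predict(X).ravel() + gbr.learning_rate * np.sum(
        [tree.predict(X) for tree in gbr.estimators_.ravel()], axis=0
    )
    assert np.allclose(manual, gbr.predict(X))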
0 votes · 1 answer
Which other algorithms fit residuals like XGBoost?
XGBoost and standard gradient boosting train learners to fit the residuals rather than the observations themselves. I understand that this aspect of the algorithm matches the boosting mechanism which allows it to iteratively fit errors made by…
Bobby · 114
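A short derivation that frames the question above: with squared loss the negative gradient at the current model is exactly the residual,

$$-\left[\frac{\partial L\big(y_i, F(x_i)\big)}{\partial F(x_i)}\right]_{F=F_{m-1}} = y_i - F_{m-1}(x_i) \quad\text{for}\quad L(y,F)=\tfrac12\,(y-F)^2,$$

so "fitting residuals" is the squared-loss special case of fitting negative gradients (pseudo-residuals), the scheme shared by classic GBM and by libraries such as LightGBM, CatBoost, and NGBoost.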