Questions tagged [pruning]
14 questions
6
votes
1 answer
What is Pruning & Truncation in Decision Trees?
Pruning & Truncation
As per my understanding
Truncation: Stop the tree while it is still growing so that it does not end up with leaves containing very few data points. One way to do this is to set a minimum number of training inputs to use on each…
Pluviophile
- 3,520
- 11
- 29
- 49
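A minimal sketch of the two ideas in the excerpt above, assuming scikit-learn's DecisionTreeClassifier (min_samples_leaf and ccp_alpha are scikit-learn's parameter names, not necessarily what the asker had in mind):

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Truncation (pre-pruning): stop splitting early by requiring a minimum
# number of training samples in every leaf.
truncated = DecisionTreeClassifier(min_samples_leaf=20).fit(X, y)

# Pruning (post-pruning): grow the tree fully, then collapse weak subtrees
# with minimal cost-complexity pruning (ccp_alpha > 0).
pruned = DecisionTreeClassifier(ccp_alpha=0.02).fit(X, y)

print(truncated.get_depth(), pruned.get_depth())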
4
votes
1 answer
Optuna Median Pruner n_warmup_steps
For gradient boosting models such as XGBoost and LGBM, does n_warmup_steps in optuna.pruners.MedianPruner refer to the minimum number of folds evaluated before pruning is triggered?
I.e. if the number of CV folds equals 5, then n_warmup_steps=1 means…
Kjetil Haukås
- 141
- 2
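A sketch of how n_warmup_steps interacts with per-fold reporting, assuming each CV fold is reported as one Optuna step (the objective and model choice below are hypothetical, and the comment on the warmup behaviour is my reading of the MedianPruner docs):

import numpy as np
import optuna
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold

X, y = load_breast_cancer(return_X_y=True)

def objective(trial):
    max_depth = trial.suggest_int("max_depth", 2, 6)
    scores = []
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    for step, (tr, va) in enumerate(cv.split(X, y)):
        model = GradientBoostingClassifier(max_depth=max_depth, random_state=0)
        model.fit(X[tr], y[tr])
        scores.append(model.score(X[va], y[va]))
        # One "step" per fold: report the running mean validation score.
        trial.report(float(np.mean(scores)), step)
        # MedianPruner skips pruning while step < n_warmup_steps, so with
        # n_warmup_steps=1 the first fold (step 0) is never pruned.
        if trial.should_prune():
            raise optuna.TrialPruned()
    return float(np.mean(scores))

study = optuna.create_study(
    direction="maximize",
    pruner=optuna.pruners.MedianPruner(n_startup_trials=5, n_warmup_steps=1),
)
study.optimize(objective, n_trials=20)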
1
vote
1 answer
Efficient Decision Tree Pruning
Is there an efficient way to handle pruning in a decision tree with Python?
Currently I'm doing this:
def do_best_tree(Xtrain, ytrain, Xtest, ytest):
    clf = DecisionTreeClassifier()
    clf.fit(Xtrain, ytrain)
    path =…
EzrielS
- 323
- 1
- 7
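The snippet above is cut off at path =…; one common way to complete this pattern with scikit-learn (a sketch, not necessarily the asker's original code) is to take the candidate alphas from cost_complexity_pruning_path and keep the subtree that scores best on the held-out set:

from sklearn.tree import DecisionTreeClassifier

def do_best_tree(Xtrain, ytrain, Xtest, ytest):
    # The candidate pruning strengths come from the cost-complexity
    # pruning path of a fully grown tree.
    path = DecisionTreeClassifier().cost_complexity_pruning_path(Xtrain, ytrain)
    best_score, best_clf = -1.0, None
    for alpha in path.ccp_alphas:
        clf = DecisionTreeClassifier(ccp_alpha=alpha)
        clf.fit(Xtrain, ytrain)
        score = clf.score(Xtest, ytest)
        if score > best_score:
            best_score, best_clf = score, clf
    return best_clf

Note that this refits one tree per alpha, which is presumably the cost the question is asking how to avoid.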
1
vote
1 answer
cost-complexity-pruning-path with pipeline
I'm using Kaggle's Titanic dataset. I'm using pipelines and I'm trying to prune my decision tree, and for that I want the cost_complexity_pruning_path. The last line of code produces the error:
ValueError: could not convert string to float: 'male' …
user5744148
- 113
- 2
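That error usually means cost_complexity_pruning_path was called on the raw Titanic frame, which still contains strings like 'male'. A sketch of one way around it (the toy data, column names, and the step names "prep"/"tree" are placeholders, since the asker's pipeline isn't shown): transform the features with the pipeline's preprocessing step first, then ask the tree for its pruning path.

import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder
from sklearn.tree import DecisionTreeClassifier

# Stand-in for the Titanic training data.
X_train = pd.DataFrame({"Sex": ["male", "female", "female", "male"],
                        "Pclass": [3, 1, 2, 3]})
y_train = [0, 1, 1, 0]

prep = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"), ["Sex"])],
    remainder="passthrough",
)
pipe = Pipeline([("prep", prep), ("tree", DecisionTreeClassifier())])
pipe.fit(X_train, y_train)

# The pruning path has to be computed on the *encoded* features; calling it
# on the raw frame raises the "could not convert string to float" error.
Xt = pipe.named_steps["prep"].transform(X_train)
path = pipe.named_steps["tree"].cost_complexity_pruning_path(Xt, y_train)
print(path.ccp_alphas)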
1
vote
0 answers
Search for redundant filters (channels) in CNN
When training a CNN, one specifies the number of channels in each layer. The input has 1 channel for a grayscale image and 3 for an RGB image; then the image resolution is usually decreased, whereas the number of channels increases (64, 128,…
spiridon_the_sun_rotator
- 294
- 1
- 7
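One common heuristic for what the question seems to be after (ranking conv filters/output channels by how little they contribute) is the L1-norm criterion from Li et al., "Pruning Filters for Efficient ConvNets". A sketch in PyTorch, assuming that frame of reference:

import torch
import torch.nn as nn

# Toy conv layer: 64 filters, each of shape (in_channels=3, 3, 3).
conv = nn.Conv2d(in_channels=3, out_channels=64, kernel_size=3)

with torch.no_grad():
    # Score each filter by the L1 norm of its weights; small-norm filters
    # are treated as redundancy candidates for removal.
    l1 = conv.weight.abs().sum(dim=(1, 2, 3))
    keep = torch.argsort(l1, descending=True)[:32]  # e.g. keep the top half
print(keep)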
1
vote
0 answers
How to apply pruning on a BERT model?
I have trained a BERT model using ktrain (a TensorFlow wrapper) to recognize emotion in text. It works, but it suffers from really slow inference, which makes my model unsuitable for a production environment. I have done some research and it seems…
Stamatis Tiniakos
- 85
- 8
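A sketch of plain magnitude pruning, using PyTorch's torch.nn.utils.prune on the Hugging Face PyTorch BERT rather than the asker's ktrain/TensorFlow stack (so it illustrates the idea, it is not a drop-in fix):

import torch.nn as nn
import torch.nn.utils.prune as prune
from transformers import BertModel

model = BertModel.from_pretrained("bert-base-uncased")

# Zero out the 30% smallest-magnitude weights in every Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # bake the zeros into the weights

Caveat: zeroed weights alone do not make dense matrix multiplications faster, so for real latency gains this is usually combined with or replaced by structured/head pruning, distillation (e.g. DistilBERT), or quantization.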
1
vote
0 answers
Different Decision Tree pruning method
I am trying to learn different pruning methods for decision trees. I have put together a list of methods below.
Reduced Error Pruning
Cost Complexity pruning
Minimum error pruning
Pessimistic Error Pruning
Critical Value Pruning
Error Based…
Surbhi
- 11
- 2
1
vote
1 answer
Structured and unstructured pruning for deep learning models
I was trying to understand structured and unstructured pruning techniques used for deep learning models: link 1 and link 2. To recap what I have understood: unstructured pruning is based on weight pruning, whereas structured pruning is basically…
root
- 125
- 1
- 7
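A sketch of the distinction using PyTorch's pruning utilities, assuming that is an acceptable frame of reference: unstructured pruning zeroes individual weights, while structured pruning removes whole slices such as entire neurons or channels.

import torch.nn as nn
import torch.nn.utils.prune as prune

layer_a = nn.Linear(128, 64)
layer_b = nn.Linear(128, 64)

# Unstructured: zero the 50% of individual weights with the smallest |w|.
prune.l1_unstructured(layer_a, name="weight", amount=0.5)

# Structured: remove whole rows of the weight matrix (entire output
# neurons), ranked by their L2 norm (n=2) along dim=0.
prune.ln_structured(layer_b, name="weight", amount=0.5, n=2, dim=0)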
1
vote
0 answers
Difference between rpart models, one with an information split and the other with rpart.control
What is the difference between these two models?
bankmodel <- rpart(y ~ ., data = train, method = "class", control = rpart.control(cp = 0))
info.model <- rpart(y~., data = train, parms=list(split="information"))
I see one is split using the…
cocoakrispies98
- 163
- 4
0
votes
1 answer
Pruning in Decision trees
The following is what I learned about the process of building and pruning a decision tree, mathematically (from An Introduction to Statistical Learning by Gareth James et al.):
Use recursive binary splitting to grow a large tree on the…
Kuljeet Keshav
- 115
- 2
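A sketch of that recipe with scikit-learn, assuming it is an acceptable stand-in for the book's notation: grow the full tree, take the alpha sequence from its cost-complexity path, and let K-fold cross-validation pick the subtree.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Step 1: grow a large tree and record the alphas of its nested subtrees.
alphas = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y).ccp_alphas

# Steps 2-4: choose the alpha (i.e. the subtree) by K-fold CV, then refit
# on all of the data (GridSearchCV refits the best model by default).
search = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    {"ccp_alpha": np.unique(alphas)},
    cv=5,
)
search.fit(X, y)
print(search.best_params_)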
0
votes
2 answers
Weight pruning of CNN
I was confused when I was reading about weight pruning in CNNs. Is it applied to all layers, including convolutional layers, or is it only done for dense layers?
root
- 125
- 1
- 7
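A sketch illustrating that, at least in PyTorch's pruning utilities, weight pruning can target any weighted layer, convolutional or dense; the toy model below is a placeholder:

import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 10),
)

# Collect the weights of both conv and dense layers and remove the
# globally smallest 20% of them.
to_prune = [(m, "weight") for m in model.modules()
            if isinstance(m, (nn.Conv2d, nn.Linear))]
prune.global_unstructured(to_prune, pruning_method=prune.L1Unstructured, amount=0.2)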
0
votes
0 answers
Pruning using BERTology
I am trying out some BERT-based models for a question answering task. I need models trained on SQuAD v2.0. To cut down on the inference time, I'm trying out pruning. I came across the BERTology example script for pruning. However, I'm confused…
satan 29
- 103
- 6
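As far as I understand it, the BERTology example ranks attention heads by importance and then drops whole heads; in transformers, dropping heads boils down to prune_heads. A minimal sketch (the checkpoint name and the head indices are placeholders; the asker would load their SQuAD v2.0 model and pick heads by a measured importance score):

from transformers import AutoModelForQuestionAnswering

model = AutoModelForQuestionAnswering.from_pretrained("bert-base-uncased")

# prune_heads takes {layer_index: [head_indices_to_remove]} and shrinks
# the attention projections of those layers accordingly.
model.prune_heads({0: [0, 1], 11: [2]})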
0
votes
0 answers
Difference between CNN filter and channel pruning
I've been reading up on DNN pruning techniques and came across filter pruning and channel pruning in CNNs. However, my understanding, confirmed by this paper, was:
Pruning a filter
in layer $i$ is equivalent to pruning the corresponding…
Alex P
- 101
- 1
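The quoted equivalence can be seen directly from the weight shapes. A small sketch (toy layer sizes, not from the question):

import torch
import torch.nn as nn

conv_i = nn.Conv2d(16, 32, kernel_size=3)  # weight shape: (32, 16, 3, 3)
conv_j = nn.Conv2d(32, 64, kernel_size=3)  # weight shape: (64, 32, 3, 3)

k = 7  # filter/channel to remove

with torch.no_grad():
    # Removing filter k of layer i deletes output channel k ...
    w_i = torch.cat([conv_i.weight[:k], conv_i.weight[k + 1:]], dim=0)
    # ... which forces deleting input channel k from every filter of
    # the next layer, i.e. "channel pruning" of layer i+1.
    w_j = torch.cat([conv_j.weight[:, :k], conv_j.weight[:, k + 1:]], dim=1)

print(w_i.shape, w_j.shape)  # torch.Size([31, 16, 3, 3]) torch.Size([64, 31, 3, 3])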
0
votes
0 answers
What is the difference between using Minimal Cost-Complexity Pruning and testing all possible tree depths in a decision tree?
I'm studying the sklearn decision tree classifier and I'm having some trouble understanding the concept of pruning. From what I understand, it consists of making the tree less deep in order to avoid overfitting. That can also be achieved by setting a…
Ajoa
- 1
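A small sketch of the practical difference, assuming scikit-learn (the parameter values are arbitrary): limiting max_depth cuts every branch at the same level, whereas minimal cost-complexity pruning grows the full tree and then collapses individual weak branches, so the surviving branches can have different depths.

from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Pre-pruning: a hard, uniform depth limit.
by_depth = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Post-pruning: collapse branches whose impurity decrease does not justify
# their added complexity (controlled by ccp_alpha).
by_alpha = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X, y)

print(by_depth.get_depth(), by_depth.get_n_leaves())
print(by_alpha.get_depth(), by_alpha.get_n_leaves())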