MLP stands for multi-layer perceptron, the most basic kind of feed-forward neural network. MLPs are also called DNNs (deep neural networks), as opposed to CNNs or RNNs (convolutional and recurrent neural networks).
Questions tagged [mlp]
107 questions
21
votes
2 answers
How to adjust the hyperparameters of an MLP classifier to get better performance
I am just getting started with the multi-layer perceptron, and I got this accuracy when classifying the DEAP data with an MLP. However, I have no idea how to adjust the hyperparameters to improve the result.
Here are the details of my code and…
Irving.ren
- 327
- 1
- 2
- 7
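For questions like this one, the usual first step is a systematic search over MLPClassifier's hyperparameters. A minimal sketch with scikit-learn's GridSearchCV (the grid values and synthetic data below are illustrative, not tuned for DEAP):

```python
# Hedged sketch: a small hyperparameter grid search for MLPClassifier.
# The data and grid values are placeholders, not the asker's setup.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, n_features=20, random_state=0)

param_grid = {
    "hidden_layer_sizes": [(32,), (64, 32)],
    "alpha": [1e-4, 1e-2],          # L2 regularization strength
    "learning_rate_init": [1e-3, 1e-2],
}
search = GridSearchCV(
    MLPClassifier(max_iter=500, random_state=0),
    param_grid,
    cv=3,            # 3-fold cross-validation per grid point
    n_jobs=-1,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

GridSearchCV scores each combination by cross-validation, so the "best" setting reflects generalization rather than training accuracy.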
11
votes
2 answers
How do I get the feature importance for an MLPClassifier?
I use the MLPClassifier from scikit-learn. I have about 20 features. Is there a scikit-learn method to get the feature importance? I found
clf.feature_importances_
but it seems that it only exists for decision trees.
jochen6677
- 561
- 2
- 4
- 9
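Since `feature_importances_` only exists for tree-based estimators, the usual model-agnostic workaround for an MLPClassifier is permutation importance. A minimal sketch with scikit-learn's `permutation_importance` on synthetic data:

```python
# Hedged sketch: permutation importance for a fitted MLPClassifier.
# Works for any estimator, unlike the tree-only feature_importances_.
from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                    random_state=0).fit(X, y)

# Shuffle each feature column in turn and measure how much the score drops:
# a large drop means the model relied on that feature.
result = permutation_importance(clf, X, y, n_repeats=5, random_state=0)
print(result.importances_mean)  # one mean importance per feature
```

Ideally the importance is computed on a held-out set rather than the training data, so it measures what the model actually uses to generalize.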
8
votes
4 answers
How to handle features which are not always available?
I have a feature in my feature vector that is not always available; for some samples it makes no sense to use it. I feed an sklearn MLPClassifier with this feature vector. Does the neural network learn by itself when the…
jochen6677
- 561
- 2
- 4
- 9
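An MLP will not learn on its own that a value is missing unless you encode that fact. One common approach is to impute the missing values and append a binary "was missing" indicator column; a minimal sketch with scikit-learn's SimpleImputer (toy data, not the asker's):

```python
# Hedged sketch: impute a sometimes-missing feature and add a
# missingness indicator so the MLP can learn to treat absent values
# differently. Toy data for illustration only.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

X = np.array([[1.0, 2.0], [np.nan, 3.0], [4.0, np.nan], [5.0, 6.0]])
y = np.array([0, 1, 0, 1])

pipe = make_pipeline(
    # add_indicator=True appends one 0/1 column per feature with missing values
    SimpleImputer(strategy="mean", add_indicator=True),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=0),
)
pipe.fit(X, y)
preds = pipe.predict(X)
print(preds)
```

With the indicator columns, the network sees both a plausible filled-in value and an explicit signal that the value was absent for that sample.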
8
votes
2 answers
What is the difference between multi-layer perceptron and generalized feed forward neural network?
I'm reading this paper: An artificial neural network model for rainfall forecasting in Bangkok, Thailand. The author created six models, two of which have the following architecture:
model B: simple multilayer perceptron with a sigmoid activation function…
hyTuev
- 267
- 2
- 9
6
votes
1 answer
Is a multi-layer perceptron exactly the same as a simple fully connected neural network?
I've been learning a little about StyleGANs lately, and somebody told me that a multi-layer perceptron (MLP) is used in parts of the architecture for transforming noise. When I saw this person's code, it just looked like a normal 8-layer fully…
zipline86
- 349
- 4
- 12
5
votes
1 answer
Can a neural network recognize the letter B as an A if you trained it so?
You have a neural network. And you have, say, pictures of $100,000$ hand-written letters (A-Z). Now you run a typical training, and the neural network will recognize an A as an A, a B as a B, ...
Now you repeat the training but with a change: you tell…
jochen6677
- 561
- 2
- 4
- 9
4
votes
1 answer
Why would one cross-validate the random state number?
Still learning about machine learning, I've stumbled across a Kaggle kernel (link) which I cannot understand.
Here are lines 72 and 73:
parameters = {'solver': ['lbfgs'],
'max_iter': [1000,1100,1200,1300,1400,1500,1600,1700,1800,1900,2000…
Dan Chaltiel
- 331
- 2
- 10
4
votes
2 answers
Adding more layers decreases accuracy
I have my ANN trained on the MNIST dataset. The hidden layer has 128 neurons and the input layer has 784 neurons. This gave me an accuracy of 94%. However, when I added one more layer with 64 neurons, the accuracy dropped significantly to 35%. What…
Pink
- 41
- 1
- 3
3
votes
1 answer
Why does the MAE still remain, at all?
This may seem to be a silly question, but I just wonder why the MAE doesn't reduce to values close to 0.
It's the result of an MLP with 2 hidden layers and 6 neurons per hidden layer, trying to estimate one output value depending on three input…
Turnvater
- 48
- 6
3
votes
1 answer
Neural Network regression negative performance
I have a problem with the performance of a multi-layer perceptron regressor (neural network) and I cannot figure out why.
Task: I am trying to improve a time series prediction. I have predictions of a physical parameter of the last 4 years along…
Mark
- 31
- 2
3
votes
4 answers
Are weights of a neural network reset between epochs?
If an epoch is defined as one pass of the neural network over the whole training data, how is it that, when starting the next epoch, the loss is almost always smaller than in the first one? Does this mean that after an epoch the…
user134132523
- 149
- 1
- 3
- 13
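To the question above: no, the weights are not reset between epochs; each epoch resumes from the weights the previous one left, which is why the loss keeps shrinking. A small sketch of this with scikit-learn's `partial_fit`, where each call is one pass over the given data:

```python
# Hedged sketch: each partial_fit call is one pass (epoch) over the data.
# The weights carry over between calls instead of being reset, so the
# training loss typically decreases epoch after epoch.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), random_state=0)

losses = []
for epoch in range(5):
    clf.partial_fit(X, y, classes=np.unique(y))  # resumes from current weights
    losses.append(clf.loss_)
print(losses)  # loss after each epoch
```

If the weights were reset at each epoch, every entry in `losses` would hover around the initial random-weights loss instead of trending downward.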
3
votes
1 answer
Understanding computations of Perceptron and Multi-Layer Perceptrons on Geometric level
I am currently watching the amazing Deep Learning lecture series from Carnegie Mellon University, but I am having a little bit of trouble understanding how perceptrons and MLPs make their decisions on a geometric level.
I would really like to…
Stefan Radonjic
- 716
- 1
- 7
- 20
3
votes
2 answers
Geometric interpretation of MLP output
I am really interested in the geometric interpretation of perceptron outputs, mainly as a way to better understand what the network is really doing, but I can't seem to find much information on this topic.
I know a perceptron with no hidden layers…
Juan González
- 33
- 3
3
votes
1 answer
How to utilize user feedback due to misclassification when the correct class label is unknown?
Suppose we are developing an app which is supposed to predict a dog's breed from its picture. We trained a classifier (in my case an MLP) using some dataset and shipped the app to users. Now suppose some user comes and takes a picture of a friend's…
Mehraban
- 133
- 5
3
votes
1 answer
Coding MLP: good practices?
I recently finished coding my own MLP neural network in Python. To make my code easier to read, I separated the MLP into classes: the network class, the layers class and the neuron class, where the neuron class would do all the calculations such as…
Liam F-A
- 53
- 4