
I have been doing machine learning for a while, but some bits and pieces still only come together after a lot of practice.

In neural networks, you adjust the weights by doing one pass through the network (the forward pass), then computing the partial derivatives of the cost with respect to the weights (the backward pass) after each training example, and subtracting those partial derivatives (scaled by the learning rate) from the current weights.

In turn, the calculation of the new weights is mathematically involved: to get the partial derivatives with respect to the weights, you compute the error at every layer of the neural net except the input layer.
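A minimal sketch of that per-example update, assuming a single linear neuron with a squared-error cost (the names w, b, x, y, lr are illustrative, not from any particular library):

    import numpy as np

    def sgd_step(w, b, x, y, lr=0.01):
        """One per-example gradient descent update for a linear neuron."""
        y_hat = w @ x + b        # forward pass
        error = y_hat - y        # dC/dy_hat for C = 0.5 * (y_hat - y) ** 2
        grad_w = error * x       # partial derivatives w.r.t. the weights
        grad_b = error           # partial derivative w.r.t. the bias
        # subtract the derivatives, scaled by the learning rate
        return w - lr * grad_w, b - lr * grad_b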

Is that not, by definition, an online algorithm, since the cost and the new weights are calculated after each training example?

Thanks!

2 Answers


There are three training modes for neural networks:

  • Stochastic gradient descent: adjust the weights after every single training example.
  • Batch training: adjust the weights after going through all of the data (an epoch).
  • Mini-batch training: adjust the weights after going through a mini-batch, a small fixed-size subset of the training data; 128 examples is a common choice.

Most of the time, mini-batch training seems to be used.
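To make the three modes concrete, here is a rough sketch for a linear model with mean squared error; the toy data, learning rate, and batch size of 128 are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 3))          # toy inputs (illustrative)
    y = X @ np.array([1.0, -2.0, 0.5])      # toy targets
    w = np.zeros(3)
    lr = 0.01

    def grad(w, X_part, y_part):
        """Gradient of the mean squared error of a linear model."""
        return X_part.T @ (X_part @ w - y_part) / len(y_part)

    # Stochastic gradient descent: one update per training example.
    for i in range(len(y)):
        w -= lr * grad(w, X[i:i+1], y[i:i+1])

    # Batch training: one update per epoch, using all the data at once.
    w -= lr * grad(w, X, y)

    # Mini-batch training: one update per mini-batch of 128 examples.
    for start in range(0, len(y), 128):
        w -= lr * grad(w, X[start:start+128], y[start:start+128])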

So the answer is:

No, the neural network learning algorithm is not, in general, an online algorithm: only pure stochastic gradient descent updates after every single example, and mini-batch training is far more common in practice.

Martin Thoma

You can train after each example, or after each epoch. This is the difference between stochastic gradient descent and batch gradient descent. See p. 84 of Sebastian Raschka's Python Machine Learning book for more.
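As a concrete illustration of the "after each example" mode (my own addition, not from the book), scikit-learn's SGDClassifier supports incremental updates via partial_fit:

    import numpy as np
    from sklearn.linear_model import SGDClassifier

    X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]])
    y = np.array([1, 0, 1, 0])

    clf = SGDClassifier()
    # partial_fit needs the full set of classes up front, because the
    # model only ever sees the data one example at a time.
    for xi, yi in zip(X, y):
        clf.partial_fit(xi.reshape(1, -1), [yi], classes=np.array([0, 1]))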

Russell Richie