Can we just use SGDClassifier with log loss instead of Logistic regression, would they have similar results ?
Yes. With SGD, you will need an optimal learning rate. – 10xAI Jan 04 '21 at 16:23
1 Answer
From a practical point of view, yes: you could use either option interchangeably and, in general, get similar results (check this scikit-learn functionality), but:
- SGD (Stochastic Gradient Descent) is one optimization algorithm among others
- log loss/hinge loss... are the loss functions used by the selected optimization strategy (SGD in your case) to find the optimal weights of such linear models, but other optimization solvers could be used with those algorithms. See this post for logistic regression with sklearn, where you have the solver parameter, which can be, among others, 'sag' (stochastic average gradient descent); others like 'lbfgs' are also accepted.
German C M