Wednesday, 28 September 2022

Optimizers in Machine Learning

An optimizer is an algorithm or function that updates a model's parameters, such as its weights, and adjusts attributes such as the learning rate, in order to reduce the loss and improve the model's accuracy.
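The core idea can be sketched with plain gradient descent on a toy loss. This is a minimal illustration, not any particular library's API: the quadratic loss L(w) = (w - 3)^2 and its gradient are chosen only for demonstration.

```python
# Illustrative quadratic loss L(w) = (w - 3)^2 with gradient 2*(w - 3).
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0    # initial parameter (weight)
lr = 0.1   # learning rate

# Repeatedly step against the gradient: w <- w - lr * dL/dw
for _ in range(100):
    w -= lr * grad(w)

print(round(w, 4))  # w converges toward the minimizer, 3.0
```

Every optimizer in the list below is a variation on this update rule, differing in how the step size and direction are computed.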

Different optimizers

  1. Gradient Descent
  2. Stochastic Gradient Descent
  3. Stochastic Gradient Descent with Momentum
  4. Mini Batch Gradient Descent
  5. Adagrad
  6. RMSProp
  7. AdaDelta
  8. Adam
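To show how these variants differ, here is a hedged sketch contrasting plain SGD with SGD plus momentum on the same toy quadratic loss; the function names and hyperparameter values are illustrative, not from any specific framework.

```python
import numpy as np

# Toy quadratic loss L(w) = ||w - target||^2; its gradient is 2*(w - target).
target = np.array([3.0, -2.0])

def grad(w):
    return 2.0 * (w - target)

def sgd(w, lr=0.1, steps=200):
    # Plain gradient descent: step directly against the gradient.
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def sgd_momentum(w, lr=0.1, beta=0.9, steps=200):
    # Momentum keeps a velocity term that accumulates past gradients,
    # smoothing the trajectory and speeding up consistent directions.
    v = np.zeros_like(w)
    for _ in range(steps):
        v = beta * v + grad(w)  # accumulate velocity
        w = w - lr * v          # step along the velocity
    return w

print(sgd(np.zeros(2)))           # approaches [3.0, -2.0]
print(sgd_momentum(np.zeros(2)))  # also approaches [3.0, -2.0]
```

Adagrad, RMSProp, AdaDelta, and Adam extend this pattern further by adapting the effective learning rate per parameter from running statistics of past gradients.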

