An optimizer is an algorithm or function used to adjust a model's parameters, such as its weights, to improve performance by reducing the loss and improving accuracy; some optimizers also adapt the effective learning rate per parameter.
Commonly used optimizers:
- Gradient Descent
- Stochastic Gradient Descent
- Stochastic Gradient Descent with Momentum
- Mini Batch Gradient Descent
- Adagrad
- RMSProp
- AdaDelta
- Adam
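To make the update rule concrete, here is a minimal sketch of Stochastic Gradient Descent with Momentum minimizing a simple quadratic loss. The loss function, learning rate, and momentum coefficient below are illustrative values chosen for this example, not taken from the post:

```python
# SGD with momentum on f(w) = (w - 3)^2, whose minimum is at w = 3.
def grad(w):
    # Gradient of the quadratic loss: d/dw (w - 3)^2 = 2(w - 3)
    return 2.0 * (w - 3.0)

w = 0.0      # initial weight
v = 0.0      # velocity (running, exponentially decaying sum of gradients)
lr = 0.1     # learning rate (illustrative)
beta = 0.9   # momentum coefficient (illustrative)

for _ in range(500):
    v = beta * v + grad(w)  # accumulate gradient into the velocity
    w = w - lr * v          # step along the velocity, not the raw gradient

# w has converged close to the minimum at 3
```

Compared with plain gradient descent (`w = w - lr * grad(w)`), the velocity term smooths out oscillations and speeds up progress along directions where gradients consistently agree, which is the idea the adaptive methods in the list (Adagrad, RMSProp, AdaDelta, Adam) build on.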