Gradient Descent
A Bhat edited this page Jun 4, 2020 · 12 revisions
- An optimization algorithm used to minimize a loss function by iteratively moving in the direction of steepest descent
- The descent direction is the negative of the gradient of the loss with respect to the model's parameters
- Each step updates the parameters of the model
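The update rule above can be sketched in a few lines of Python. The function names, learning rate, and the toy quadratic loss are illustrative choices, not from the original page:

```python
def gradient_descent(grad_fn, theta, lr=0.1, steps=100):
    """Repeatedly step against the gradient (direction of steepest descent)."""
    for _ in range(steps):
        theta = theta - lr * grad_fn(theta)  # parameter update: theta <- theta - lr * gradient
    return theta

# Toy example: minimize f(theta) = (theta - 3)^2, whose gradient is 2 * (theta - 3).
theta_min = gradient_descent(lambda t: 2 * (t - 3), theta=0.0)
# theta_min converges toward 3, the minimizer of f
```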
| Type | Examples processed per update |
|---|---|
| Batch gradient descent | All training examples |
| Stochastic gradient descent | 1 random example |
| Mini-batch gradient descent | N random examples, with N << total training examples |
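The three variants in the table differ only in how many examples contribute to each gradient update. A minimal sketch, assuming a linear model with mean-squared-error loss (the function name, `batch_size` convention, and loss choice are illustrative):

```python
import numpy as np

def gd_epoch(X, y, w, lr=0.01, batch_size=None):
    """One pass over the data.
    batch_size=None -> batch GD (all examples per update)
    batch_size=1    -> stochastic GD (one example per update)
    batch_size=N    -> mini-batch GD (N examples per update)
    """
    n = len(X)
    if batch_size is None:
        batch_size = n
    idx = np.random.permutation(n)  # shuffle so batches are random each pass
    for start in range(0, n, batch_size):
        b = idx[start:start + batch_size]
        # gradient of mean squared error for a linear model y_hat = X @ w
        grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)
        w = w - lr * grad
    return w
```

Smaller batches mean noisier but cheaper updates; larger batches mean smoother but more expensive ones, which is the main trade-off between the three variants.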
- What is the algorithm?
- How is it used in machine learning?
- Why are there different ways of doing gradient descent?