Gradient Descent

Description

  • An optimization algorithm used to minimize a loss function by iteratively moving in the direction of steepest descent
  • The descent direction is defined by the negative of the gradient of the loss with respect to the parameters (see the sketch below)
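
As a concrete illustration, here is a minimal sketch of the update rule on a one-dimensional quadratic loss. The loss function, starting point, learning rate, and iteration count are illustrative assumptions, not taken from this page.

```python
# Minimal gradient descent sketch: repeatedly step against the gradient.
# f(x) = (x - 3)^2 has its minimum at x = 3 (illustrative choice).

def loss(x):
    return (x - 3.0) ** 2

def grad(x):
    return 2.0 * (x - 3.0)

x = 0.0              # arbitrary starting point
learning_rate = 0.1  # step size (illustrative value)

for _ in range(100):
    x = x - learning_rate * grad(x)  # move opposite the gradient

print(x, loss(x))  # x converges toward 3.0, loss toward 0
```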

Usage

Types

| Type | Examples processed per pass |
| --- | --- |
| Batch gradient descent | All training examples |
| Stochastic gradient descent | 1 randomly chosen example |
| Mini-batch gradient descent | N randomly chosen examples, with N << the number of training examples |
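
The three variants differ only in how many examples contribute to each gradient estimate, so a single training loop with a configurable batch size can express all of them. The sketch below assumes a toy linear model with mean-squared-error loss, which is not from this page: `batch_size` equal to the dataset size gives batch gradient descent, 1 gives stochastic gradient descent, and anything in between gives mini-batch gradient descent.

```python
import random

def run_epoch(w, b, data, learning_rate, batch_size):
    """One pass over the data for y ≈ w * x + b, updating per batch."""
    random.shuffle(data)
    for start in range(0, len(data), batch_size):
        batch = data[start:start + batch_size]
        # Gradients of mean squared error, averaged over the batch
        grad_w = sum(2 * (w * x + b - y) * x for x, y in batch) / len(batch)
        grad_b = sum(2 * (w * x + b - y) for x, y in batch) / len(batch)
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b
    return w, b

# Toy data generated from y = 2x + 1 (illustrative)
data = [(x, 2 * x + 1) for x in range(10)]
w, b = 0.0, 0.0
for _ in range(1000):
    # batch_size=len(data) -> batch GD, 1 -> stochastic GD, 2 -> mini-batch GD
    w, b = run_epoch(w, b, data, learning_rate=0.005, batch_size=2)
print(w, b)  # approaches (2, 1)
```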

FAQ

  • What is the algorithm?
  • How is it used in machine learning?
  • Why are there different ways of doing gradient descent?

See also
