Gradient descent

An optimization method that adjusts model weights in small steps to reduce prediction error. Think of it as rolling a ball downhill: at each step the algorithm nudges the weights in the direction that most decreases the loss function, which is the direction opposite the gradient. The size of each step (the learning rate) governs both how fast training proceeds and whether the model converges on useful patterns.
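
A minimal sketch of the update rule on a one-parameter loss; the loss, its gradient, and the learning rate below are illustrative assumptions, not notation from this entry:

    # Sketch: minimize loss(w) = (w - 3)**2 by gradient descent.
    # The minimum is at w = 3; the derivative is 2 * (w - 3).

    def grad(w):
        return 2.0 * (w - 3.0)  # gradient of the quadratic loss

    w = 0.0               # starting weight
    learning_rate = 0.1   # step size: too large diverges, too small is slow

    for step in range(50):
        w -= learning_rate * grad(w)  # step opposite the gradient

    print(w)  # approaches 3.0, the loss minimum

In a real model the single weight becomes a vector and the gradient comes from backpropagation, but the update has the same shape: weights minus learning rate times gradient.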

See: Backpropagation; Loss function; Training