The algorithm for training neural networks that computes how much each weight contributed to the prediction error, then adjusts the weights to reduce that error. Backpropagation is how models "learn" from training data: the error signal propagates backward through the network, layer by layer, and each weight is updated in proportion to its contribution.
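The mechanism described above can be sketched directly in code. Below is a hedged NumPy example with a manual forward and backward pass for a tiny one-hidden-layer network; the XOR task, layer sizes, learning rate, and step count are illustrative assumptions, not tied to any particular framework.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic problem a linear model cannot solve.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# One hidden layer of 4 units (sizes chosen arbitrarily for illustration).
W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for step in range(5000):
    # Forward pass: compute predictions and the error (mean squared error).
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))

    # Backward pass: propagate the error derivative from output to input,
    # computing each weight's contribution to the error via the chain rule.
    d_out = 2.0 * (out - y) / len(X) * out * (1.0 - out)
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    d_W1 = X.T @ d_h
    d_b1 = d_h.sum(axis=0)

    # Update: nudge each weight against its gradient (gradient descent).
    W2 -= lr * d_W2; b2 -= lr * d_b2
    W1 -= lr * d_W1; b1 -= lr * d_b1

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Modern frameworks compute these gradients automatically (autodiff), but the loop is the same: forward pass, backward pass, weight update.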
See: Gradient descent; Training; Weights