Concepts (3)
Plain-language explanations of technical ideas.
Backpropagation
The algorithm that calculates how much each weight in a network contributed to the error, making gradient descent possible at scale.
Deep Learning · Training · Neural Networks
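A minimal sketch of the chain-rule bookkeeping backpropagation performs, using an assumed toy setup (a single sigmoid neuron with squared error) that is not part of the listing above:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical single-neuron "network": one input, one weight, one target.
x, w, target = 1.0, 0.5, 1.0

z = w * x                        # pre-activation
y = sigmoid(z)                   # prediction
loss = 0.5 * (y - target) ** 2   # squared error

# Backpropagation is the chain rule: dL/dw = dL/dy * dy/dz * dz/dw.
dL_dy = y - target               # derivative of the loss w.r.t. the output
dy_dz = y * (1.0 - y)            # derivative of the sigmoid
dz_dw = x                        # derivative of the pre-activation w.r.t. w
dL_dw = dL_dy * dy_dz * dz_dw    # how much this weight contributed to the error
```

In a real network the same per-layer derivatives are multiplied together layer by layer, which is what makes gradient descent tractable at scale.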
Gradient Descent
The algorithm that teaches a neural network to get better over time by nudging its weights in the right direction.
Deep Learning · Optimization · Training
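A minimal sketch of the update rule itself, on an assumed one-dimensional toy loss L(w) = (w - 3)² rather than a real network:

```python
# Gradient descent: repeatedly nudge the weight against the gradient.
w = 0.0      # starting weight
lr = 0.1     # learning rate: how big each nudge is

for _ in range(100):
    grad = 2.0 * (w - 3.0)   # dL/dw for L(w) = (w - 3)^2
    w -= lr * grad           # step in the direction that reduces the loss
# w converges toward 3.0, the minimum of the loss
```

Training a neural network is the same loop, with the gradient for every weight supplied by backpropagation.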
Vanishing Gradient Problem
Why deep networks and recurrent models struggle to learn when the error signal fades out before reaching the early layers.
Deep Learning · Training · RNN
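A minimal sketch of why the signal fades, under an assumed best case: the gradient reaching an early layer is a product of per-layer derivatives, and the sigmoid's derivative never exceeds 0.25, so the product shrinks geometrically with depth.

```python
# The sigmoid derivative sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))
# peaks at 0.25 (when z = 0), so 0.25 is the best case per layer.
max_sigmoid_grad = 0.25

signal = 1.0
for layer in range(20):          # a hypothetical 20-layer sigmoid network
    signal *= max_sigmoid_grad   # chain rule multiplies one factor per layer
# signal = 0.25 ** 20, far too small for early layers to learn from
```

This is one reason ReLU activations, careful initialization, and gated recurrent architectures (LSTM, GRU) became standard: they keep these per-layer factors closer to 1.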