SARATH THARAYIL

Building thoughtful software, writing notes, and shipping experiments across data, AI, and the web.

No cookies, no tracking. Preferences are stored locally in your browser. Anonymous view counts are kept server-side.

© 2026 Sarath Tharayil

Concepts (3)

Plain-language explanations of technical ideas.

Backpropagation

The algorithm that calculates how much each weight in a network contributed to the error, making gradient descent possible at scale.

Deep Learning · Training · Neural Networks

Gradient Descent

The algorithm that teaches a neural network to get better over time by nudging its weights in the right direction.

Deep Learning · Optimization · Training
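The nudging can be shown on a toy problem (a sketch, not code from this site): minimize f(x) = (x − 3)², whose gradient f′(x) = 2(x − 3) points uphill, so each step moves x the opposite way.

```python
def gradient_descent(x=0.0, lr=0.1, steps=100):
    """Minimize (x - 3)^2 by repeatedly stepping against the gradient."""
    for _ in range(steps):
        grad = 2 * (x - 3)  # derivative of (x - 3)^2
        x -= lr * grad      # nudge x downhill by lr times the gradient
    return x

print(gradient_descent())  # converges close to the minimum at x = 3
```

With a learning rate of 0.1, each step shrinks the distance to the minimum by a factor of 0.8; too large a rate would overshoot and diverge, which is why the step size matters as much as the direction.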

Vanishing Gradient Problem

Why deep networks and recurrent models struggle to learn when the error signal fades out before reaching the early layers.

Deep Learning · Training · RNN
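The fade-out can be made concrete with illustrative numbers (a hypothetical sketch, not from this site): backpropagation multiplies the error signal by each layer's local derivative, and the sigmoid's derivative never exceeds 0.25, so the product shrinks geometrically with depth.

```python
def backprop_signal(depth, local_grad=0.25, weight=1.0):
    """Error signal reaching the earliest layer of a `depth`-layer chain.

    local_grad=0.25 is the sigmoid derivative's maximum; real signals
    shrink at least this fast per layer when weights are near 1.
    """
    signal = 1.0                       # error signal at the output
    for _ in range(depth):
        signal *= local_grad * weight  # chain rule through one layer
    return signal

print(backprop_signal(2))   # 0.0625 -- still a usable signal
print(backprop_signal(20))  # ~9.1e-13 -- effectively zero for early layers
```

This is why ReLU activations (derivative 1 on the active side), residual connections, and gating in LSTMs help: they keep the per-layer factor near 1 so the product does not collapse.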