EASE AI

Bridge: Math Foundations → Machine Learning

Bridge 1 of 7 · See how Gradient Descent evolves into Linear Regression

From: Math Foundations

Gradient Descent

Abstract loss: f(x, y) = x² + y²
Update rule: θ = θ − α∇L(θ)
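The update rule above can be sketched in a few lines of Python. The learning rate, starting point, and iteration count here are illustrative choices, not values from the module:

```python
import numpy as np

def grad_f(theta):
    # Gradient of f(x, y) = x² + y² is (2x, 2y)
    return 2 * theta

theta = np.array([3.0, 4.0])  # arbitrary starting point
alpha = 0.1                   # learning rate (illustrative)

for _ in range(100):
    theta = theta - alpha * grad_f(theta)  # θ = θ − α∇L(θ)

print(theta)  # converges toward the minimum at (0, 0)
```

Each step moves θ against the gradient, shrinking both coordinates by a constant factor, so the iterates approach the minimizer (0, 0) geometrically.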
What changes?
1. Apply gradient descent to optimize w and b
2. The loss function measures prediction error on real data
3. Same optimization algorithm, applied to data fitting
4. Parameters have interpretable meaning (slope, intercept)
To: Machine Learning

Linear Regression

Model: ŷ = wx + b
MSE loss on real data: minimize L = Σ(y − ŷ)²
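The same update loop now minimizes prediction error instead of an abstract function. A minimal sketch, where the synthetic data (true slope 2, intercept 1) and hyperparameters are made up for illustration:

```python
import numpy as np

# Synthetic data from y = 2x + 1 plus small noise (illustrative)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50)
y = 2 * x + 1 + rng.normal(0, 0.1, 50)

w, b = 0.0, 0.0  # parameters to fit: slope and intercept
alpha = 0.1      # learning rate (illustrative)

for _ in range(1000):
    y_hat = w * x + b
    # Gradients of the squared-error loss, averaged over the data
    grad_w = -2 * np.mean((y - y_hat) * x)
    grad_b = -2 * np.mean(y - y_hat)
    # Same update rule as before: θ = θ − α∇L(θ)
    w, b = w - alpha * grad_w, b - alpha * grad_b

print(w, b)  # w ≈ 2, b ≈ 1
```

The only structural change from the abstract example is where the gradient comes from: it is now computed from the data, and the recovered w and b match the slope and intercept used to generate it.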

Key Insight

Gradient Descent from the Math module is exactly the same algorithm used to train Linear Regression. The only difference is the loss function: in Math you minimized an abstract function; in ML you minimize prediction error on real data.