Gradient Descent in Regression

Gradient Descent is an iterative optimization algorithm used to tune the parameters of a regression model so that the loss is minimized.
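
A minimal sketch of how this looks for simple linear regression with a mean squared error loss; the toy data, the gradient_descent function name, and the hyperparameter values below are illustrative assumptions, not part of the original text:

```python
import numpy as np

# Toy data: y is roughly 3*x + 2 plus a little noise (illustrative values)
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=100)
y = 3 * x + 2 + rng.normal(0, 0.1, size=100)

def gradient_descent(x, y, learning_rate=0.1, num_iterations=1000):
    """Fit y ≈ m*x + b by iteratively minimizing the mean squared error."""
    m, b = 0.0, 0.0
    n = len(x)
    for _ in range(num_iterations):
        error = (m * x + b) - y
        # Gradients of the MSE loss with respect to m and b
        grad_m = (2 / n) * np.dot(error, x)
        grad_b = (2 / n) * error.sum()
        # Update step: move against the gradient, scaled by the learning rate
        m -= learning_rate * grad_m
        b -= learning_rate * grad_b
    return m, b

m, b = gradient_descent(x, y)
print(f"m ≈ {m:.2f}, b ≈ {b:.2f}")  # should end up close to m=3, b=2
```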

Gradient descent step

The size of the step that gradient descent takes is called the learning rate. Choosing an appropriate learning rate is key to achieving convergence: if the value is too large, the algorithm overshoots and never reaches the optimum, but if it is too small, it takes too many iterations to get close to the minimum.
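
Continuing the sketch above, a rough comparison of how different learning rates behave on the same toy data; the specific values are illustrative, not prescriptive:

```python
# Reuses the toy data and the gradient_descent() sketch defined earlier.
m_good, b_good = gradient_descent(x, y, learning_rate=0.1)     # settles near m≈3, b≈2
m_slow, b_slow = gradient_descent(x, y, learning_rate=0.0001)  # makes little progress in 1000 iterations
m_big, b_big = gradient_descent(x, y, learning_rate=2.0, num_iterations=30)
# With a rate of 2.0, every update overshoots the minimum, so m_big and b_big
# grow in magnitude each iteration instead of settling near the true values.
```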
