Tuesday, April 25, 2017

Linear Regression Algorithms in Scikit-Learn

Hi,

While working with the different regression algorithms in the scikit-learn library, I would like to share some important tips for differentiating between the major linear regression algorithms in the Machine Learning space.

Below is a table comparing four linear regression algorithms:


The general idea of Gradient Descent (GD) is to tweak parameters iteratively in order to minimize a cost function.
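To make that idea concrete, here is a minimal NumPy sketch of how Batch Gradient Descent could be applied to linear regression. The toy data, learning rate eta, and number of iterations are just illustrative assumptions, not values from the table above:

import numpy as np

# Toy data: y = 4 + 3x plus noise (illustrative values only)
np.random.seed(42)
X = 2 * np.random.rand(100, 1)
y = 4 + 3 * X + np.random.randn(100, 1)

X_b = np.c_[np.ones((100, 1)), X]   # add the bias column x0 = 1
m = len(X_b)

eta = 0.1                            # learning rate (assumed value)
n_iterations = 1000
theta = np.random.randn(2, 1)        # random initialization of the parameters

for _ in range(n_iterations):
    # gradient of the MSE cost function computed over the FULL training set
    gradients = 2 / m * X_b.T.dot(X_b.dot(theta) - y)
    theta = theta - eta * gradients  # tweak the parameters one small step downhill

print(theta)  # should end up close to [[4.], [3.]]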

Batch and Stochastic Gradient Descent: at each step, Batch GD computes the gradients based on the full training set, while Stochastic GD computes them based on just one randomly picked instance.
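In scikit-learn, the stochastic flavour is available out of the box as SGDRegressor. A quick sketch of how it could be used; the hyperparameter values are only examples, and the parameter names follow recent scikit-learn versions:

import numpy as np
from sklearn.linear_model import SGDRegressor

# Same kind of toy data as in the previous sketch (illustrative only)
np.random.seed(42)
X = 2 * np.random.rand(100, 1)
y = (4 + 3 * X + np.random.randn(100, 1)).ravel()

# SGDRegressor updates the parameters one training instance at a time (Stochastic GD)
sgd_reg = SGDRegressor(max_iter=1000, tol=1e-3, eta0=0.1)
sgd_reg.fit(X, y)

print(sgd_reg.intercept_, sgd_reg.coef_)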

Mini-Batch Gradient Descent, on the other hand, computes the gradients based on small random sets of instances called mini-batches.
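scikit-learn does not ship a dedicated mini-batch regressor, but one common way to approximate the idea is to feed SGDRegressor.partial_fit with small random batches. This is only a rough sketch under that assumption (batch size and number of epochs are arbitrary, and partial_fit still steps through the instances inside each batch one at a time):

import numpy as np
from sklearn.linear_model import SGDRegressor

np.random.seed(42)
X = 2 * np.random.rand(100, 1)
y = (4 + 3 * X + np.random.randn(100, 1)).ravel()

sgd_reg = SGDRegressor(eta0=0.1, learning_rate="constant")
batch_size = 20      # size of each mini-batch (assumed)
n_epochs = 50

for epoch in range(n_epochs):
    shuffled = np.random.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = shuffled[start:start + batch_size]
        # take a gradient step using one small random mini-batch
        sgd_reg.partial_fit(X[idx], y[idx])

print(sgd_reg.intercept_, sgd_reg.coef_)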


There are more linear regression algorithms in sklearn that are not covered in this blog post; you can find them here: http://scikit-learn.org/stable/modules/sgd.html#regression


Hope this helps!
