Regression models are used to predict a continuous value. Predicting the price of a house from its features, such as size and location, is one of the common examples of regression. Here we predict a target variable Y based on an input variable X. Regression is categorized based on the degree of X.
‘Regression’ describes how an independent variable is numerically associated with the dependent variable; in other words, it helps estimate one variable’s value from another given value.
This is one of the most common and interesting types of regression technique. Here we predict a target variable Y based on the input variable X. A linear relationship should exist between the target variable and the predictor, and hence comes the name Linear Regression. Consider predicting the expenses of a family based on the number of members: we can easily see that there is a correlation between family expenses and member count. The hypothesis of linear regression is

Y = a + bX
Y represents expenses, X is the member count, and a and b are the coefficients of the equation. So in order to predict Y (expenses) given X (member count), we need to know the values of a and b (the model’s coefficients).
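The coefficients a and b can be found with an ordinary least-squares fit. Here is a minimal sketch using NumPy (an assumption on my part; any library or a hand-rolled solution works) with made-up expense figures for illustration:

```python
import numpy as np

# Hypothetical data: family member count vs. monthly expenses
# (illustrative values generated from Y = 200 + 150 * X)
members = np.array([1, 2, 3, 4, 5, 6], dtype=float)
expenses = np.array([350.0, 500.0, 650.0, 800.0, 950.0, 1100.0])

# Least-squares fit of Y = a + b*X; polyfit returns coefficients
# highest degree first, so we get [b, a]
b, a = np.polyfit(members, expenses, deg=1)
print(f"a = {a:.1f}, b = {b:.1f}")

# Predict expenses for a family of 7 using the fitted hypothesis
predicted = a + b * 7
print(f"predicted expenses for 7 members: {predicted:.1f}")
```

With perfectly linear data like this, the fit recovers the generating coefficients exactly; real data would leave some residual error.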
The degree of the regression is 2 (or higher) for polynomial regression. If the data does not fit a linear regression well, we can try polynomial regression.
Hence, if we try to use simple linear regression on the above graph, the linear regression line won’t fit very well: it is very difficult to fit a straight line to that data with a low value of error. Instead, we can use polynomial regression to fit a polynomial curve and achieve a lower error, i.e. a lower cost function. For degree 2, the polynomial regression hypothesis takes the form:

Y = a + bX + cX²
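Fitting such a curve works the same way as the linear case, just with a higher degree. A minimal sketch, again assuming NumPy and illustrative data generated from a known quadratic:

```python
import numpy as np

# Hypothetical curved data generated from Y = 3 + 2*X + 0.5*X^2
x = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y = 3 + 2 * x + 0.5 * x ** 2

# Degree-2 least-squares fit; polyfit returns coefficients
# highest degree first, so we get [c, b, a]
c, b, a = np.polyfit(x, y, deg=2)
print(f"a = {a:.2f}, b = {b:.2f}, c = {c:.2f}")

# Evaluate the fitted polynomial on the inputs and check the error
y_hat = np.polyval([c, b, a], x)
print("max fit error:", np.max(np.abs(y - y_hat)))
```

Because the sample data is exactly quadratic, the fitted curve passes through every point; on noisy real data the goal is simply a lower cost than the straight-line fit.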