Linear regression and logistic regression are two important backbones of data science and machine learning; most models and algorithms use these regression fundamentals in the background.

## What is Regression?

Regression analysis is a predictive modeling technique that investigates the relationship between a dependent variable Y and one or more independent variables X. In other words, regression shows how the dependent variable (y-axis) changes with respect to the independent variable (x-axis).

Regression analysis fits a line through a set of data points so that it best represents the overall data.
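A minimal sketch of fitting such a best-fit line with NumPy; the data points below are made up for illustration.

```python
import numpy as np

x = np.array([1, 2, 3, 4, 5], dtype=float)   # independent variable (X)
y = np.array([2, 4, 5, 4, 6], dtype=float)   # dependent variable (Y)

# np.polyfit(x, y, 1) returns the slope m and intercept c of the
# straight line y = m*x + c that minimizes the squared error.
m, c = np.polyfit(x, y, 1)
print(f"best-fit line: y = {m:.2f}x + {c:.2f}")

# Use the fitted line to predict an unseen data point.
print(f"prediction at x = 6: {m * 6 + c:.2f}")
```

Here `np.polyfit` with degree 1 performs ordinary least-squares fitting, which is the standard way to find the line that "most fits" a set of points.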

## Applications of Regression

- Weather forecasting
- Trend forecasting
- Sales and marketing, e.g. the relationship between age and income
- Trend and sales estimation
- Analyzing the impact of price changes
- Classifying email as spam or not
- Assessing risk in the financial services and insurance domains
- Return on investment: if a company invests funds to promote a brand, it can use regression to estimate the return on that marketing investment
- A company can use regression to find out which departments (HR, marketing, sales, admin, R&D) account for the largest share of salary costs relative to overall revenue
- Regression analysis is useful with customer surveys and feedback forms to check the quality of service
- A company can use a regression model to predict a future employee's salary with respect to their experience

# Linear Regression vs Logistic Regression

| Basis | Linear Regression | Logistic Regression |
| --- | --- | --- |
| Core concept | Data points fit along a straight line | Data points do not fit a straight line; they fall between two values |
| Used with | Continuous variables | Categorical variables |
| Output prediction | Value of the variable | Probability of occurrence of an event |
| Accuracy and goodness of fit | Measured by loss, R-squared, adjusted R-squared, etc. | Accuracy, precision, recall, F1 score, ROC curve, confusion matrix, etc. |
| Pattern | Maps continuous X to continuous Y | Maps continuous X to binary Y; used for categories such as true/false or yes/no |
| Formula | Y = mX + c | p = 1 / (1 + e^-(mX + c)) |
| Graph type | Linear (straight line) | Sigmoid (S-shaped curve) |

The regression line, also called the best-fit line, shows the relationship between the independent variable and the dependent variable.

A positive regression line is one where the dependent values on the Y-axis are directly proportional to the independent values on the X-axis; the slope is positive: Y = mx + c.

A negative regression line is one where the dependent values on the Y-axis decrease as the independent values on the X-axis increase; Y is inversely proportional to X, and the slope of the regression line is negative: Y = -mx + c.
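The sign of the fitted slope tells the two cases apart. A tiny sketch with invented data points:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])

y_up = np.array([2.0, 4.1, 5.9, 8.0])    # y grows as x grows
y_down = np.array([8.0, 6.1, 3.9, 2.0])  # y falls as x grows

m_up, _ = np.polyfit(x, y_up, 1)
m_down, _ = np.polyfit(x, y_down, 1)

print(m_up > 0)    # positive regression line: slope m is +ve
print(m_down < 0)  # negative regression line: slope is -ve
```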

Logistic regression works like a switch: the output is either 0 or 1. The data are not separated along a line; instead they are divided into two groups.

We can obtain the logistic regression formula by putting the linear regression output y into a sigmoid function.
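A hedged sketch of this transform: feed the linear score y = m*x + c into the sigmoid to get a probability, then threshold at 0.5 to get the 0/1 "switch" output. The coefficients here are made up for illustration, not fitted to data.

```python
import math

m, c = 1.2, -6.0  # illustrative coefficients (assumed, not fitted)

def sigmoid(z):
    # squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def classify(x, threshold=0.5):
    prob = sigmoid(m * x + c)  # probability of the event occurring
    return 1 if prob >= threshold else 0

print(classify(2))   # sigmoid(-3.6) is small, so output is 0
print(classify(8))   # sigmoid(3.6) is large, so output is 1
```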

Wrapping up: for linear regression vs logistic regression, looking at the data pattern makes it easy to see which regression will work well with which kind of dataset. Linear regression works great with continuous data points and provides good accuracy when predicting unseen data points. On the other hand, logistic regression works very accurately when the data points fall into two groups.
