# Statistics Learning - Multivariate logistic regression

A logistic regression with multiple variables and a two-class outcome.

## Generalized linear model

$$\Pr(Y = 1 \mid X) = p(X) = \frac{e^{B_0 + B_1 X_1 + \dots + B_i X_i}}{1 + e^{B_0 + B_1 X_1 + \dots + B_i X_i}}$$

Equivalently, applying the logit transformation gives the log-odds as a linear function of the predictors: $$\log \left( \frac{p(X)}{1 - p(X)} \right) = B_0 + B_1 X_1 + \dots + B_i X_i$$
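As a quick numeric sketch of the two formulas above, here is $p(X)$ evaluated by hand for hypothetical coefficients $B_0 = -1$, $B_1 = 0.5$, $B_2 = 2$ (values chosen for illustration, not from any fitted model):

```r
# Hypothetical coefficients and one observation (illustrative values only)
B <- c(-1, 0.5, 2)       # B0, B1, B2
x <- c(1, 3, -0.25)      # 1 for the intercept, then X1 and X2
log_odds <- sum(B * x)   # B0 + B1*X1 + B2*X2  ->  -1 + 1.5 - 0.5 = 0
p <- exp(log_odds) / (1 + exp(log_odds))  # inverse logit (sigmoid)
p                        # -> 0.5, since exp(0) / (1 + exp(0)) = 0.5
# plogis() computes the same inverse-logit transformation directly
all.equal(p, plogis(log_odds))
```

A log-odds of 0 corresponds exactly to a probability of 0.5, the decision boundary between the two classes.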

## R

```r
myLogisticRegressionModel <- glm(targetVariable ~ ., data = myDataFrame, family = binomial)
summary(myLogisticRegressionModel)
```

* The tilde (`~`) means "is modeled as".
* The dot (`.`) means all the other variables in the data frame.
* `family = binomial` tells `glm` to fit a logistic regression model.
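A minimal runnable sketch of the same call, using the built-in `mtcars` data: the two-class outcome `am` (0 = automatic, 1 = manual transmission) is modeled from `mpg` and `wt`. The choice of variables is illustrative only.

```r
# Fit a logistic regression: am modeled as a function of mpg and wt
fit <- glm(am ~ mpg + wt, data = mtcars, family = binomial)

summary(fit)   # coefficients, standard errors, and p-values
coef(fit)      # just the estimates: B0 (intercept), B_mpg, B_wt

# Predicted probabilities Pr(am = 1 | X) for the training rows
head(predict(fit, type = "response"))
```

Note that `predict` returns log-odds by default; `type = "response"` applies the inverse logit so the output is a probability.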

## Interpretation

The intercept is usually not of direct interest.

It is difficult to interpret the coefficients of a multiple regression model because correlations between the variables can affect their signs.

When we have correlated variables, they act as surrogates for each other, which can affect:

* the sign of the coefficient
* the p-value (significant or not)
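The surrogate effect is easy to reproduce with simulated data (seed and coefficients below are arbitrary): `x2` is nearly a copy of `x1`, but the outcome truly depends on `x1` only.

```r
set.seed(42)
n  <- 500
x1 <- rnorm(n)
x2 <- x1 + rnorm(n, sd = 0.1)          # x2 is strongly correlated with x1
y  <- rbinom(n, 1, plogis(1.5 * x1))   # outcome depends on x1 only

# Alone, x2 looks strongly predictive: it acts as a surrogate for x1
coef(summary(glm(y ~ x2, family = binomial)))

# Alongside x1, the coefficient and p-value of x2 change markedly
coef(summary(glm(y ~ x1 + x2, family = binomial)))
```

Comparing the two `summary` tables shows how adding a correlated variable can shift both the magnitude and the significance of a coefficient, which is why single-coefficient interpretation is risky in multiple regression.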
