# Statistics - log-likelihood function (cross-entropy)

The “log-likelihood function” measures how probable the observed data are under a model. For a binary outcome $X_i$ and predictors $B_1, B_2, \dots, B_k$, it is:

$$\sum_{i=1}^{N}\Big[(1-X_i)\log\big(1-\Pr[1 \mid B_1,B_2,\dots,B_k]\big)+X_i\,\log\big(\Pr[1 \mid B_1,B_2,\dots,B_k]\big)\Big]$$

The negative of the “log-likelihood function” is also referred to as the cross-entropy: maximizing the log-likelihood is equivalent to minimizing the cross-entropy.
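The sum above can be sketched in Python. This is a minimal illustration, not a library implementation; the function names and the averaging in the cross-entropy are choices made here, where `p[i]` stands for the model's predicted $\Pr[1 \mid B_1, B_2, \dots, B_k]$ for observation $i$:

```python
import math

def log_likelihood(x, p):
    # x: observed binary outcomes (0 or 1)
    # p: predicted probabilities Pr[1 | B_1, ..., B_k] for each observation
    return sum(xi * math.log(pi) + (1 - xi) * math.log(1 - pi)
               for xi, pi in zip(x, p))

def cross_entropy(x, p):
    # Cross-entropy: the negative log-likelihood, averaged per observation here.
    return -log_likelihood(x, p) / len(x)
```

For example, with two observations predicted at probability 0.5 each, the log-likelihood is $2\log(0.5) \approx -1.386$ and the cross-entropy is $\log 2 \approx 0.693$.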
