Logistic Regression is one of the supervised machine learning algorithms used for classification, i.e. to predict a discrete-valued outcome. It is a statistical technique that predicts the outcome of a dependent variable based on the observations provided in the training set.


Logistic Regression is one of the simplest machine learning algorithms and is easy to implement, yet it provides good training efficiency in many cases. For the same reasons, training a model with this algorithm does not require high computation power.

The predicted parameters (trained weights) give an inference about the importance of each feature. The direction of association, i.e. positive or negative, is also given, so we can use logistic regression to find out the relationship between the features and the target.

This algorithm allows models to be updated easily to reflect new data, unlike decision trees or support vector machines. The update can be done using stochastic gradient descent.

Logistic Regression outputs well-calibrated probabilities along with classification results. This is an advantage over models that only give the final classification as a result. If one training instance has a 95% probability for a class, and another has a 55% probability for the same class, we get an inference about which predictions the model is more confident in for the formulated problem.
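A short sketch of this, assuming scikit-learn and a made-up one-dimensional dataset: `predict_proba` exposes the probability behind each classification, so two predictions of the same class can still be ranked by confidence.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical 1-D data: points further right are more likely class 1.
X = np.array([[-3.0], [-2.0], [-1.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 0, 1, 1, 1])

model = LogisticRegression().fit(X, y)

# Both points below may be classified as class 1, but the probabilities
# tell us the first prediction is far more confident than the second.
p_far = model.predict_proba([[3.0]])[0, 1]   # well inside class 1
p_near = model.predict_proba([[0.2]])[0, 1]  # near the decision boundary
```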

In a low-dimensional dataset with a sufficient number of training examples, logistic regression is less prone to over-fitting.

Rather than starting straight away with a complex model, logistic regression is often used as a benchmark model to measure performance, as it is relatively quick and easy to implement.

Logistic Regression proves to be very efficient when the dataset has features that are linearly separable.


It has a very close relationship with neural networks. A neural network representation can be viewed as stacking together a lot of small logistic regression classifiers.

Due to its simple probabilistic interpretation, the training time of the logistic regression algorithm comes out to be far less than that of most complex algorithms, such as an Artificial Neural Network.

This algorithm can easily be extended to multi-class classification using a softmax classifier; this is known as Multinomial Logistic Regression.
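As a sketch, assuming scikit-learn: fitting `LogisticRegression` on a three-class target (the Iris dataset here) produces a multinomial model whose `predict_proba` returns one softmax probability per class, summing to 1.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Iris has 3 classes, so this fits a multinomial (softmax) logistic regression.
X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=500).fit(X, y)

# One probability per class for each sample; the row sums to 1.
proba = clf.predict_proba(X[:1])
```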

The resultant weights found after training a logistic regression model are highly interpretable. The weight w_i can be interpreted as the amount by which the log odds will increase if x_i increases by 1 and all other x's stay constant. Here i refers to a feature index, from i = 1 to n.
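A minimal sketch of this interpretation, assuming scikit-learn and hypothetical toy data where only the first feature drives the label: exponentiating a coefficient gives the factor by which the odds p/(1-p) are multiplied when that feature increases by 1.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical data: feature 0 drives the label, feature 1 is noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = (1.5 * X[:, 0] + 0.1 * rng.normal(size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)
w = model.coef_[0]

# Raising x_i by 1 adds w[i] to the log odds, i.e. multiplies the
# odds p/(1-p) by exp(w[i]), holding all other features constant.
odds_ratios = np.exp(w)
```

Here we would expect a large positive weight (odds ratio well above 1) for feature 0 and a weight near zero for the irrelevant feature 1.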


Logistic Regression is a statistical analysis model that attempts to predict precise probabilistic outcomes based on independent features. On high-dimensional datasets, this may lead to the model being over-fit on the training set, which means overstating the accuracy of predictions on the training set, so the model may not be able to predict accurate results on the test set. This usually happens when the model is trained on little training data with lots of features. So on high-dimensional datasets, regularization techniques should be considered to avoid over-fitting (but this makes the model more complex). Very high regularization factors may even lead to the model being under-fit on the training data.
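A short sketch of the regularization knob, assuming scikit-learn and a deliberately over-parameterized toy setup (100 features, 40 examples, pure-noise labels). Note that in scikit-learn `C` is the inverse regularization strength: a small `C` means a strong L2 penalty that shrinks the weights toward zero.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical worst case: many features, few examples, noise labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 100))
y = rng.integers(0, 2, size=40)

# C is the INVERSE regularization strength:
# large C = weak penalty (prone to over-fitting), small C = strong penalty.
weak = LogisticRegression(C=1e4, max_iter=2000).fit(X, y)
strong = LogisticRegression(C=1e-2, max_iter=2000).fit(X, y)

# Strong regularization shrinks the weight vector toward zero.
norm_weak = np.linalg.norm(weak.coef_)
norm_strong = np.linalg.norm(strong.coef_)
```

Pushing `C` too low corresponds to the under-fitting case mentioned above: the weights are shrunk so hard the model can no longer fit even real signal.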

Non-linear problems can't be solved with logistic regression since it has a linear decision surface. Linearly separable data is rarely found in real-world scenarios, so a transformation of the non-linear features is required. This can be done by increasing the number of features such that the data becomes linearly separable in higher dimensions.


It is difficult to capture complex relationships using logistic regression. More powerful and complex algorithms such as Neural Networks can easily outperform it.

The training features are known as independent variables. Logistic Regression requires moderate or no multicollinearity between independent variables. This means that if two independent variables have a high correlation, only one of them should be used. Repetition of information could lead to incorrect training of the parameters (weights) while minimizing the cost function. Multicollinearity can be removed using dimensionality reduction techniques.
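A minimal sketch of one simple screen for this, assuming NumPy and a hypothetical design matrix in which the third feature nearly duplicates the first: compute the pairwise correlation matrix and flag pairs above a chosen threshold, so that one feature of each flagged pair can be dropped before training.

```python
import numpy as np

# Hypothetical design matrix where feature 2 nearly duplicates feature 0.
rng = np.random.default_rng(0)
x0 = rng.normal(size=300)
x1 = rng.normal(size=300)
x2 = x0 + 0.01 * rng.normal(size=300)  # almost identical to x0
X = np.column_stack([x0, x1, x2])

# Flag feature pairs whose absolute correlation exceeds a threshold.
corr = np.corrcoef(X, rowvar=False)
threshold = 0.9
pairs = [(i, j)
         for i in range(corr.shape[0])
         for j in range(i + 1, corr.shape[1])
         if abs(corr[i, j]) > threshold]
```

Pairwise correlation only catches two-variable redundancy; variance inflation factors or PCA-style dimensionality reduction handle the general case.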

In Linear Regression the independent and dependent variables must be related linearly. Logistic Regression instead requires that the independent variables are linearly related to the log odds, log(p/(1-p)).

Only important and relevant features should be used to build a model; otherwise the probability predictions made by the model may be incorrect and the model's predictive value may degrade.

The presence of data values that deviate from the expected range in the dataset may lead to incorrect results, as this algorithm is sensitive to outliers.

Logistic Regression requires a large dataset and also sufficient training examples for all the categories it needs to identify.

It is required that each training example be independent of all the other examples in the dataset. If they are related in some way, the model will try to give more importance to those particular training examples. So the training data should not come from matched data or repeated measurements. For example, some scientific research techniques rely on multiple observations of the same individuals. This algorithm can't be used in such cases.


With this article at gaianation.net, you should now have a complete idea of the advantages and disadvantages of Logistic Regression. Enjoy.