sklearn.linear_model#
A variety of linear models.
User guide: see the Linear Models section for further details.
The following subsections are only rough guidelines: the same estimator can fall into multiple categories, depending on its parameters.
Linear classifiers#
LogisticRegression: Logistic Regression (aka logit, MaxEnt) classifier.
LogisticRegressionCV: Logistic Regression CV (aka logit, MaxEnt) classifier.
PassiveAggressiveClassifier: Passive Aggressive Classifier.
Perceptron: Linear perceptron classifier.
RidgeClassifier: Classifier using Ridge regression.
RidgeClassifierCV: Ridge classifier with built-in cross-validation.
SGDClassifier: Linear classifiers (SVM, logistic regression, etc.) with SGD training.
SGDOneClassSVM: Solves linear One-Class SVM using Stochastic Gradient Descent.
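All of these classifiers share the usual estimator API. As a minimal sketch (the toy 1-D data below is illustrative, not taken from the scikit-learn docs):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy 1-D data: class 0 for small x, class 1 for large x (illustrative only).
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])

clf = LogisticRegression().fit(X, y)
# Points on either side of the learned decision boundary near x = 1.5.
print(clf.predict([[0.5], [2.5]]))  # → [0 1]
```

The same fit/predict pattern applies to the other classifiers in this table; only the loss and regularization differ.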
Classical linear regressors#
LinearRegression: Ordinary least squares Linear Regression.
Ridge: Linear least squares with L2 regularization.
RidgeCV: Ridge regression with built-in cross-validation.
SGDRegressor: Linear model fitted by minimizing a regularized empirical loss with SGD.
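To see what the L2 penalty does in practice, here is a hedged sketch on assumed noise-free data: ridge shrinks the slope relative to ordinary least squares.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

# Noise-free line y = x, so OLS recovers a slope of exactly 1.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 2.0, 3.0])

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

# The L2 penalty shrinks the ridge slope below the OLS value of 1.0.
print(ols.coef_[0], ridge.coef_[0])
```

With centered data the ridge slope has the closed form Sxy / (Sxx + alpha), here 5 / 6 ≈ 0.833.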
Regressors with variable selection#
The following estimators have built-in variable selection fitting procedures, but any estimator using an L1 or elastic-net penalty also performs variable selection: typically SGDRegressor or SGDClassifier with an appropriate penalty.
ElasticNet: Linear regression with combined L1 and L2 priors as regularizer.
ElasticNetCV: Elastic Net model with iterative fitting along a regularization path.
Lars: Least Angle Regression model a.k.a. LAR.
LarsCV: Cross-validated Least Angle Regression model.
Lasso: Linear Model trained with L1 prior as regularizer (aka the Lasso).
LassoCV: Lasso linear model with iterative fitting along a regularization path.
LassoLars: Lasso model fit with Least Angle Regression a.k.a. Lars.
LassoLarsCV: Cross-validated Lasso, using the LARS algorithm.
LassoLarsIC: Lasso model fit with Lars using BIC or AIC for model selection.
OrthogonalMatchingPursuit: Orthogonal Matching Pursuit model (OMP).
OrthogonalMatchingPursuitCV: Cross-validated Orthogonal Matching Pursuit model (OMP).
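A hedged sketch of the variable-selection behavior (synthetic data; the alpha value is an arbitrary choice for illustration): the L1 penalty drives the coefficients of irrelevant features to exactly zero.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
X = rng.randn(50, 5)
y = 3.0 * X[:, 0]  # only feature 0 carries signal

lasso = Lasso(alpha=1.0).fit(X, y)
# The four irrelevant coefficients are driven to exactly zero,
# while feature 0 keeps a (shrunken) nonzero weight.
print(lasso.coef_)
```

The selected support can then be read off as `lasso.coef_ != 0`.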
Bayesian regressors#
ARDRegression: Bayesian ARD regression.
BayesianRidge: Bayesian ridge regression.
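One practical difference from the non-Bayesian regressors, sketched on assumed synthetic data: the Bayesian models can report predictive uncertainty via `predict(..., return_std=True)`.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.RandomState(0)
X = rng.randn(40, 2)
y = X @ np.array([1.0, -2.0]) + 0.1 * rng.randn(40)

br = BayesianRidge().fit(X, y)
# Unlike plain ridge, the Bayesian model can return a predictive
# standard deviation alongside the mean prediction.
mean, std = br.predict(X[:3], return_std=True)
print(mean.shape, std.shape)
```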
Multi-task linear regressors with variable selection#
These estimators fit multiple regression problems (or tasks) jointly, while inducing sparse coefficients. While the inferred coefficients may differ between the tasks, they are constrained to agree on the features that are selected (non-zero coefficients).
MultiTaskElasticNet: Multi-task ElasticNet model trained with L1/L2 mixed-norm as regularizer.
MultiTaskElasticNetCV: Multi-task L1/L2 ElasticNet with built-in cross-validation.
MultiTaskLasso: Multi-task Lasso model trained with L1/L2 mixed-norm as regularizer.
MultiTaskLassoCV: Multi-task Lasso model trained with L1/L2 mixed-norm as regularizer, with built-in cross-validation.
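The shared-support constraint described above can be sketched as follows (synthetic two-task data; the alpha value is an arbitrary assumption):

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.RandomState(0)
X = rng.randn(60, 4)
# Two tasks that share the same relevant feature (feature 0)
# but weight it differently.
Y = np.column_stack([2.0 * X[:, 0], -1.5 * X[:, 0]])

mtl = MultiTaskLasso(alpha=0.5).fit(X, Y)
# coef_ has shape (n_tasks, n_features); the zero columns
# (the unselected features) coincide across both tasks.
print(mtl.coef_)
```

The L1/L2 mixed norm penalizes whole coefficient columns, which is why a feature is dropped for all tasks at once rather than per task.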
Outlier-robust regressors#
Any estimator using the Huber loss would also be robust to outliers, e.g., SGDRegressor with loss='huber'.
HuberRegressor: L2-regularized linear regression model that is robust to outliers.
QuantileRegressor: Linear regression model that predicts conditional quantiles.
RANSACRegressor: RANSAC (RANdom SAmple Consensus) algorithm.
TheilSenRegressor: Theil-Sen Estimator: robust multivariate regression model.
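A hedged illustration of what "robust to outliers" buys (toy data with one planted outlier; not from the scikit-learn docs): a single corrupted point pulls the least-squares slope far from the truth, while the Huber fit stays close.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

X = np.linspace(0.0, 10.0, 30).reshape(-1, 1)
y = 2.0 * X.ravel()
y[-1] += 100.0  # one gross outlier at the right edge

huber = HuberRegressor().fit(X, y)
ols = LinearRegression().fit(X, y)

# The outlier drags the OLS slope well away from 2,
# while the Huber loss largely ignores it.
print(ols.coef_[0], huber.coef_[0])
```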
Generalized linear models (GLM) for regression#
These models allow for response variables to have error distributions other than a normal distribution.
GammaRegressor: Generalized Linear Model with a Gamma distribution.
PoissonRegressor: Generalized Linear Model with a Poisson distribution.
TweedieRegressor: Generalized Linear Model with a Tweedie distribution.
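As a sketch of the non-normal error distributions these models support (synthetic count data; the rate parameters and the small alpha are illustrative assumptions): a Poisson GLM with its log link keeps every prediction strictly positive, which plain least squares does not guarantee for count targets.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.RandomState(0)
X = rng.uniform(0.0, 1.0, size=(200, 1))
# Count targets drawn with a log-linear rate (illustrative parameters).
y = rng.poisson(lam=np.exp(1.0 + 2.0 * X.ravel()))

glm = PoissonRegressor(alpha=1e-4).fit(X, y)
# The log link keeps every prediction strictly positive.
print(glm.coef_[0], glm.predict(X).min())
```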
Miscellaneous#
PassiveAggressiveRegressor: Passive Aggressive Regressor.
enet_path: Compute elastic net path with coordinate descent.
lars_path: Compute Least Angle Regression or Lasso path using the LARS algorithm.
lars_path_gram: The lars_path in the sufficient stats mode.
lasso_path: Compute Lasso path with coordinate descent.
orthogonal_mp: Orthogonal Matching Pursuit (OMP).
orthogonal_mp_gram: Gram Orthogonal Matching Pursuit (OMP).
ridge_regression: Solve the ridge equation by the method of normal equations.
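Unlike the estimator classes above, these path helpers are plain functions. A hedged sketch with lasso_path (synthetic data; the alpha grid is an arbitrary assumption) shows coefficients growing as the penalty weakens:

```python
import numpy as np
from sklearn.linear_model import lasso_path

rng = np.random.RandomState(0)
X = rng.randn(50, 3)
y = 2.0 * X[:, 0]  # only feature 0 is informative

# coefs has shape (n_features, n_alphas); alphas are visited
# from strongest to weakest penalty.
alphas, coefs, _ = lasso_path(X, y, alphas=[1.0, 0.1, 0.01])
print(coefs)
```

The third return value holds the dual gaps at the end of optimization for each alpha.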