Results 1061 - 1070 of 1,699 for "document" (0.27 sec)

  1. Effect of varying threshold for self-training —...

    This example illustrates the effect of a varying threshold on self-training. The breast_cancer dataset is loaded, and labels are deleted such that only 50 out of 569 samples have labels. A SelfTrai...
    scikit-learn.org/stable/auto_examples/semi_supervised/plot_self_training_varying_threshold.html
    Mon Jul 07 14:36:35 UTC 2025
      102.7K bytes
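The self-training loop this example studies can be sketched as follows, assuming scikit-learn is installed; the base estimator and the threshold value below are illustrative choices, not necessarily those used in the linked example:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Hide all labels except 50; unlabeled samples are marked with -1
rng = np.random.RandomState(42)
y_masked = np.full_like(y, -1)
labeled = rng.choice(len(y), size=50, replace=False)
y_masked[labeled] = y[labeled]

# Pseudo-label only predictions whose probability exceeds the threshold
clf = SelfTrainingClassifier(SVC(probability=True), threshold=0.75)
clf.fit(X, y_masked)
acc = clf.score(X, y)
```

Raising the threshold pseudo-labels fewer, more confident samples per iteration; lowering it labels more aggressively at the risk of propagating mistakes.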
     
  2. L1-based models for Sparse Signals — scikit-lea...

The present example compares three l1-based regression models on a synthetic signal obtained from sparse and correlated features that are further corrupted with additive Gaussian noise: a Lasso, a...
    scikit-learn.org/stable/auto_examples/linear_model/plot_lasso_and_elasticnet.html
    Mon Jul 07 14:36:35 UTC 2025
      125.4K bytes
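A simplified version of that comparison, assuming scikit-learn; the synthetic signal here comes from `make_regression` rather than the exact construction of the linked example, and the penalty values are illustrative:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet, Lasso

# Sparse ground truth: only 5 of 50 coefficients are non-zero
X, y, true_coef = make_regression(n_samples=100, n_features=50,
                                  n_informative=5, coef=True,
                                  noise=1.0, random_state=0)

lasso = Lasso(alpha=0.5).fit(X, y)
enet = ElasticNet(alpha=0.5, l1_ratio=0.7).fit(X, y)

# The l1 penalty drives most irrelevant coefficients to exactly zero
n_nonzero = int(np.sum(lasso.coef_ != 0))
```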
     
  3. Imputing missing values before building an esti...

    Missing values can be replaced by the mean, the median or the most frequent value using the basic SimpleImputer. In this example we will investigate different imputation techniques: imputation by t...
    scikit-learn.org/stable/auto_examples/impute/plot_missing_values.html
    Mon Jul 07 14:36:35 UTC 2025
      125K bytes
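The basic `SimpleImputer` mentioned in the snippet works like this; the tiny array is a made-up illustration:

```python
import numpy as np
from sklearn.impute import SimpleImputer

X = np.array([[1.0, 2.0],
              [np.nan, 3.0],
              [7.0, np.nan]])

# Replace missing entries with the column mean ("median" and
# "most_frequent" are the other basic strategies)
imputer = SimpleImputer(strategy="mean")
X_filled = imputer.fit_transform(X)
# column means from the observed values are 4.0 and 2.5
```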
     
  4. 1.13. Feature selection — scikit-learn 1.7.0 do...

    The classes in the sklearn.feature_selection module can be used for feature selection/dimensionality reduction on sample sets, either to improve estimators’ accuracy scores or to boost their perfor...
    scikit-learn.org/stable/modules/feature_selection.html
    Mon Jul 07 14:36:34 UTC 2025
      73.8K bytes
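A minimal use of the `sklearn.feature_selection` module described here, assuming scikit-learn; the scoring function and `k` are illustrative:

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)

# Keep the 2 features with the highest ANOVA F-score against the target
selector = SelectKBest(score_func=f_classif, k=2).fit(X, y)
X_reduced = selector.transform(X)
```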
     
  5. 7.3. Preprocessing data — scikit-learn 1.7.0 do...

    The sklearn.preprocessing package provides several common utility functions and transformer classes to change raw feature vectors into a representation that is more suitable for the downstream esti...
    scikit-learn.org/stable/modules/preprocessing.html
    Mon Jul 07 14:36:32 UTC 2025
      198.2K bytes
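One of the common transformers from `sklearn.preprocessing`, as a sketch; the input matrix is invented for illustration:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[1.0, -1.0],
              [2.0, 0.0],
              [3.0, 1.0]])

# Standardize each column to zero mean and unit variance
scaler = StandardScaler().fit(X)
X_scaled = scaler.transform(X)
```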
     
  6. 1.10. Decision Trees — scikit-learn 1.7.0 docum...

    Decision Trees (DTs) are a non-parametric supervised learning method used for classification and regression. The goal is to create a model that predicts the value of a target variable by learning s...
    scikit-learn.org/stable/modules/tree.html
    Mon Jul 07 14:36:35 UTC 2025
      94.5K bytes
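A small classification tree of the kind the page describes, assuming scikit-learn; `max_depth=3` is an illustrative cap, not a recommendation:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Learn simple if-then-else splits; max_depth caps model complexity
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
train_acc = tree.score(X, y)
```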
     
  7. 1.16. Probability calibration — scikit-learn 1....

    When performing classification you often want not only to predict the class label, but also obtain a probability of the respective label. This probability gives you some kind of confidence on the p...
    scikit-learn.org/stable/modules/calibration.html
    Mon Jul 07 14:36:32 UTC 2025
      63.3K bytes
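Obtaining a probability alongside the class label, as the snippet describes, can be sketched with `CalibratedClassifierCV`; the base estimator, calibration method, and synthetic data below are illustrative choices:

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, random_state=0)

# LinearSVC has no predict_proba; wrap it to obtain calibrated probabilities
calibrated = CalibratedClassifierCV(LinearSVC(), method="sigmoid", cv=3)
calibrated.fit(X, y)

proba = calibrated.predict_proba(X)
row_sums = proba.sum(axis=1)  # each row is a probability distribution
```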
     
  8. 8.1. Toy datasets — scikit-learn 1.7.0 document...

scikit-learn comes with a few small standard datasets that do not require downloading any file from an external website. They can be loaded using the following functions: These datasets are usefu...
    scikit-learn.org/stable/datasets/toy_dataset.html
    Mon Jul 07 14:36:35 UTC 2025
      63.3K bytes
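Two of the bundled loaders, as an example; the datasets ship with scikit-learn itself:

```python
from sklearn.datasets import load_digits, load_iris

# Toy datasets are included in the package; no download needed
iris = load_iris()
digits = load_digits()
```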
     
  9. 9. Computing with scikit-learn — scikit-learn 1...

Strategies to scale computationally: bigger data (scaling with instances using out-of-core learning); Computational Performance (prediction latency, prediction throughput, tips and tricks); Paralle...
    scikit-learn.org/stable/computing.html
    Mon Jul 07 14:36:35 UTC 2025
      31.4K bytes
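The out-of-core strategy mentioned in this entry can be sketched with `partial_fit`; the random chunks below stand in for batches read from disk, and the estimator choice is illustrative:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Out-of-core learning: stream mini-batches through partial_fit instead of
# loading the whole dataset into memory at once
clf = SGDClassifier(random_state=0)
classes = np.array([0, 1])  # must be declared up front for partial_fit
rng = np.random.RandomState(0)

for _ in range(10):  # each iteration stands in for one chunk read from disk
    X_chunk = rng.randn(200, 5)
    y_chunk = (X_chunk[:, 0] > 0).astype(int)
    clf.partial_fit(X_chunk, y_chunk, classes=classes)

pred = clf.predict(rng.randn(5, 5))
```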
     
  10. Regularization path of L1- Logistic Regression ...

    Train l1-penalized logistic regression models on a binary classification problem derived from the Iris dataset. The models are ordered from strongest regularized to least regularized. The 4 coeffic...
    scikit-learn.org/stable/auto_examples/linear_model/plot_logistic_path.html
    Mon Jul 07 14:36:35 UTC 2025
      97K bytes
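A simplified version of such a regularization path, assuming scikit-learn; the grid of `C` values is illustrative and coarser than the linked example's:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X, y = X[y != 2], y[y != 2]  # reduce to a binary problem

# Trace coefficients from strong regularization (small C) to weak (large C)
cs = np.logspace(-2, 2, 5)
coefs = []
for C in cs:
    model = LogisticRegression(penalty="l1", solver="liblinear", C=C)
    coefs.append(model.fit(X, y).coef_.ravel().copy())
coefs = np.array(coefs)
```

As `C` shrinks, the l1 penalty dominates and more coefficients are forced to exactly zero.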
     