Documentation

AdaBoost

Adaptive boosting: trains weak learners in sequence, reweighting the training data after each round so that later learners focus on the examples with the largest errors.
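The reweighting step described above can be sketched in a few lines. This is an illustrative version of the classic (discrete) AdaBoost update for binary labels in {-1, +1}, not this application's exact implementation; the function name `adaboost_round` is chosen here for illustration.

```python
import math

def adaboost_round(weights, y_true, y_pred):
    """One AdaBoost reweighting round (labels in {-1, +1}).

    Misclassified examples get their weights increased, so the next
    weak learner focuses on them. Illustrative sketch only.
    """
    # Weighted error of the current weak learner.
    err = sum(w for w, t, p in zip(weights, y_true, y_pred) if t != p)
    err = min(max(err, 1e-10), 1 - 1e-10)    # guard against log(0) / div-by-zero
    alpha = 0.5 * math.log((1 - err) / err)  # this learner's vote weight
    # Up-weight mistakes (t != p), down-weight correct predictions.
    new = [w * math.exp(-alpha * t * p)
           for w, t, p in zip(weights, y_true, y_pred)]
    total = sum(new)
    return [w / total for w in new], alpha

# Weights start uniform; the single misclassified point gains weight.
w, alpha = adaboost_round([0.25] * 4, [1, 1, -1, -1], [1, 1, -1, 1])
```

After one round the misclassified example carries half of the total weight, which is exactly the "focus on examples with large errors" behaviour described above.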

When to use:

  • You have weak base learners (e.g. decision stumps)
  • You want a classic, well-understood ensemble method
  • You need something simpler than gradient boosting

Strengths: Simple, less prone to overfitting, interpretable.

Weaknesses: Sensitive to noise and outliers, slower than modern boosting.

Model Parameters

N Estimators (default: 50) Number of weak learners.

Learning Rate (default: 1.0) Shrinks the contribution of each learner; lower values typically require more estimators.

Loss (default: linear) Loss function used when updating example weights after each boosting round.

  • linear: Linear loss (default)
  • square: Square loss
  • exponential: Exponential loss

Random State (default: 42) Seed for reproducibility.
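The parameter names and defaults above match scikit-learn's `AdaBoostRegressor`; assuming that mapping (it is not confirmed by this page), an equivalent model can be configured like this:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor

# Synthetic data stands in for your own dataset.
X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=42)

# Assumed mapping from this page's parameters to scikit-learn's.
model = AdaBoostRegressor(
    n_estimators=50,    # N Estimators
    learning_rate=1.0,  # Learning Rate
    loss="linear",      # Loss: "linear", "square", or "exponential"
    random_state=42,    # Random State
)
model.fit(X, y)
predictions = model.predict(X[:3])
```

The `loss` choice only affects how example weights are updated between rounds; `"linear"` is the least sensitive of the three to outliers.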

