Documentation

AdaBoost

Train AdaBoost to predict categorical outcomes

Adaptive boosting that focuses on misclassified examples.

When to use:

  • You have weak base learners (e.g., decision stumps)
  • You want a classic, well-understood ensemble method
  • You want something simpler than gradient boosting

Strengths: simple, less prone to overfitting, interpretable
Weaknesses: sensitive to noisy data and outliers, slower than modern boosting libraries
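The core idea, focusing on misclassified examples, can be sketched by hand. The following is an illustrative sketch (assuming scikit-learn and synthetic data; none of these names come from the product): each round fits a decision stump on weighted data, then up-weights the points the stump got wrong.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y01 = make_classification(n_samples=300, random_state=0)
y = np.where(y01 == 0, -1, 1)    # labels in {-1, +1}

n = len(y)
w = np.full(n, 1.0 / n)          # start with uniform sample weights
F = np.zeros(n)                  # running ensemble score
for _ in range(20):
    stump = DecisionTreeClassifier(max_depth=1, random_state=0)
    stump.fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = w[pred != y].sum()     # weighted training error
    if err == 0 or err >= 0.5:   # perfect or no-better-than-chance stump
        break
    alpha = 0.5 * np.log((1 - err) / err)   # stump's vote weight
    F += alpha * pred
    w *= np.exp(-alpha * y * pred)          # up-weight misclassified points
    w /= w.sum()                            # renormalize to a distribution

train_acc = (np.sign(F) == y).mean()
```

The final prediction is the sign of the weighted vote `F`; points that keep getting misclassified accumulate weight, so later stumps concentrate on them.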

Model Parameters

N Estimators (default: 50) Number of weak learners.

Learning Rate (default: 1.0) Shrinks the contribution of each weak learner; lower values typically need more estimators.

Algorithm

  • SAMME.R: Real boosting using predicted class probabilities (default; usually converges faster)
  • SAMME: Discrete boosting using class labels only

Random State (default: 42) Seed for reproducibility.
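Putting the parameters above together, here is a minimal training sketch, assuming scikit-learn's `AdaBoostClassifier` and a synthetic dataset (the dataset and split are illustrative; the `algorithm` option is omitted because newer scikit-learn releases deprecate it in favor of SAMME):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic categorical-outcome data for illustration
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

clf = AdaBoostClassifier(
    n_estimators=50,     # number of weak learners
    learning_rate=1.0,   # shrinks each learner's contribution
    random_state=42,     # seed for reproducibility
)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)   # held-out accuracy
```

Lowering `learning_rate` while raising `n_estimators` is the usual tuning trade-off: smaller steps, more of them.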

