Documentation

AdaBoost

Adaptive boosting ensemble for regression

AdaBoost Regressor iteratively trains base estimators, increasing the weight of training examples on which previous models made large errors so that later models focus on the hard cases. It builds a strong regressor from many simple base learners.
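
The reweighting loop described above can be sketched as follows. This is a simplified, illustrative take on the AdaBoost.R2 scheme with a linear loss, not the exact code used by this product; the function names are hypothetical, and the full algorithm predicts with a weighted median of the learners (omitted here for brevity).

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaboost_r2_fit(X, y, n_rounds=10, max_depth=3):
    """Simplified AdaBoost.R2 training loop (illustrative sketch)."""
    n = len(X)
    w = np.full(n, 1.0 / n)               # start with uniform sample weights
    learners, betas = [], []
    for _ in range(n_rounds):
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, y, sample_weight=w)
        err = np.abs(tree.predict(X) - y)
        err = err / (err.max() + 1e-12)   # linear loss, scaled into [0, 1]
        avg_loss = np.dot(w, err)
        if avg_loss >= 0.5:               # stop if the learner is too weak
            break
        beta = avg_loss / (1.0 - avg_loss)
        w = w * beta ** (1.0 - err)       # shrink weights of well-fit examples
        w = w / w.sum()                   # renormalize to a distribution
        learners.append(tree)
        betas.append(beta)
    return learners, betas

# Fit on a small synthetic regression problem.
rng = np.random.RandomState(0)
X = rng.uniform(0, 6, size=(80, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, 80)
learners, betas = adaboost_r2_fit(X, y)
```

Examples with large errors keep their weight (beta raised to a power near 0), while well-fit examples are down-weighted, which is what steers each new round toward the previous rounds' mistakes.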

When to use:

  • Regression with outlier-free data (AdaBoost is sensitive to noise)
  • When a straightforward boosting approach is preferred
  • Combining weak regressors into a competitive ensemble

Input: Tabular data with the feature columns defined during training
Output: Continuous predicted value

Model Settings (set during training, used at inference)

N Estimators (default: 50) Number of boosting rounds.

Learning Rate (default: 1.0) Shrinks each estimator's contribution. Lower values improve generalization with more estimators.

Loss (default: linear) Loss function used to update sample weights after each boosting round: linear, square, or exponential.

Base Estimator Max Depth (default: 3) Maximum depth of each decision tree base learner.

Inference Settings

No dedicated inference-time settings.
