Documentation (English)

AdaBoost

Adaptive boosting ensemble for regression

The AdaBoost Regressor iteratively trains base estimators, re-weighting training examples on which previous models made large errors so that later estimators focus on the hard cases. In this way it builds a strong regressor from simple base learners.
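The re-weighting loop can be sketched with the AdaBoost.R2 scheme that underlies most AdaBoost regression implementations. This is an illustrative sketch on synthetic data, not the app's actual training code; scikit-learn combines stages with a weighted median, while a weighted mean is used here for brevity.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy 1-D regression problem (synthetic data, for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)

n_rounds = 10
w = np.full(len(X), 1.0 / len(X))    # start with uniform sample weights
stages, stage_weights = [], []

for _ in range(n_rounds):
    tree = DecisionTreeRegressor(max_depth=3)
    tree.fit(X, y, sample_weight=w)
    err = np.abs(tree.predict(X) - y)
    err = err / err.max()            # "linear" loss, normalized to [0, 1]
    eps = float(np.sum(w * err))     # weighted average loss of this stage
    if eps >= 0.5:                   # stop if the learner is too weak
        break
    beta = eps / (1.0 - eps)
    stages.append(tree)
    stage_weights.append(np.log(1.0 / beta))
    w = w * beta ** (1.0 - err)      # shrink weights of well-predicted examples
    w = w / w.sum()                  # re-normalize to a distribution

# Combine stages (weighted mean here; sklearn uses a weighted median).
alphas = np.array(stage_weights)
preds = np.array([t.predict(X) for t in stages])
y_hat = alphas @ preds / alphas.sum()
```

Examples the current stage predicts well get their weight multiplied by a factor near `beta` (< 1), while poorly predicted examples keep their weight, so the next tree concentrates on them.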

When to use:

  • Regression on data with few outliers (AdaBoost is sensitive to noisy data and outliers)
  • When a straightforward boosting approach is preferred
  • Combining weak regressors into a competitive ensemble

Input: Tabular data with the feature columns defined during training

Output: Continuous predicted value

Model Settings (set during training, used at inference)

N Estimators (default: 50) Number of boosting rounds.

Learning Rate (default: 1.0) Shrinks each estimator's contribution. Lower values improve generalization with more estimators.

Loss (default: linear) Loss function used to update sample weights after each boosting round: linear, square, or exponential.

Base Estimator Max Depth (default: 3) Maximum depth of each decision tree base learner.

Inference Settings

No dedicated inference-time settings.


