Documentation (English)

Gradient Boosting

Train a Gradient Boosting model to predict categorical outcomes

Classic gradient boosting algorithm from scikit-learn.

When to use:

  • You want something similar to XGBoost but simpler
  • You don't need cutting-edge performance
  • You want more straightforward hyperparameters

Strengths: Good accuracy, interpretable feature importance, built into scikit-learn.

Weaknesses: Slower than XGBoost/LightGBM, fewer features.
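A minimal sketch of the "interpretable feature importance" point, using scikit-learn's built-in GradientBoostingClassifier on synthetic data (the dataset here is illustrative, not part of the product):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Toy dataset purely for illustration
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# feature_importances_ sums to 1.0 across all features,
# so each value reads as a relative share of importance
for i, imp in enumerate(model.feature_importances_):
    print(f"feature {i}: {imp:.3f}")
```

Because the importances are normalized, they can be ranked or thresholded directly, which is what makes the model comparatively easy to interpret.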

Model Parameters

Similar to XGBoost but with fewer options. Key parameters:

  • N Estimators (n_estimators)
  • Max Depth (max_depth)
  • Learning Rate (learning_rate)
  • Subsample (subsample)
  • Min Samples Split (min_samples_split)
  • Min Samples Leaf (min_samples_leaf)
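A short sketch of how these parameters map onto scikit-learn's GradientBoostingClassifier; the specific values below are illustrative defaults, not recommendations from this documentation:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Illustrative dataset and split
X, y = make_classification(n_samples=500, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

clf = GradientBoostingClassifier(
    n_estimators=100,      # N Estimators: number of boosting stages
    max_depth=3,           # Max Depth: depth of each individual tree
    learning_rate=0.1,     # Learning Rate: shrinkage applied per stage
    subsample=0.8,         # Subsample: row fraction per tree (< 1.0 gives stochastic gradient boosting)
    min_samples_split=2,   # Min Samples Split: samples required to split a node
    min_samples_leaf=1,    # Min Samples Leaf: samples required at a leaf
    random_state=42,
)
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.3f}")
```

Setting subsample below 1.0 trades a little bias for lower variance, which is often the first knob to try after n_estimators and learning_rate.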
