
Extra Trees

Extremely randomized ensemble classifier with fast training

Extra Trees (Extremely Randomized Trees) builds an ensemble of decision trees using random split thresholds, making it faster to train than Random Forest while achieving similar or better accuracy on many datasets.
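To make the difference concrete, here is a minimal pure-Python sketch (an illustration, not the actual implementation) of how the two approaches pick a split threshold for a single feature: a Random-Forest-style exhaustive search versus the Extra-Trees-style random draw.

```python
import random

def exhaustive_threshold(values, labels):
    # Random-Forest-style split: try every candidate threshold and keep
    # the one that best separates the class labels (majority-count score).
    best_t, best_score = None, -1
    for t in sorted(set(values))[:-1]:
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = max(left.count(0), left.count(1)) + max(right.count(0), right.count(1))
        if score > best_score:
            best_t, best_score = t, score
    return best_t

def random_threshold(values, rng):
    # Extra-Trees-style split: draw the threshold uniformly at random
    # between the feature's min and max -- no search at all.
    return rng.uniform(min(values), max(values))

values = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
labels = [0, 0, 0, 1, 1, 1]
print(exhaustive_threshold(values, labels))           # 3.0 separates the classes perfectly
print(random_threshold(values, random.Random(0)))     # some value in [1.0, 12.0]
```

Skipping the exhaustive search is what makes training faster; the extra randomness is compensated by averaging over many trees.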

When to use:

  • When training speed is important alongside accuracy
  • Large datasets where Random Forest is too slow
  • Reducing variance through additional randomization

Input: Tabular data with the feature columns defined during training
Output: Predicted class label and class probabilities

Model Settings (set during training, used at inference)

N Estimators (default: 100) Number of trees. More trees stabilize predictions.

Max Depth (default: null, i.e. unlimited) Maximum depth per tree.

Max Features (default: sqrt) Features considered per split. sqrt is standard for classification.

Min Samples Split (default: 2) Minimum samples needed to split an internal node.

Min Samples Leaf (default: 1) Minimum samples in a leaf node.

Class Weight (default: null) Set to balanced for imbalanced datasets.
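Assuming a scikit-learn backend (an assumption; the product does not state its implementation), the settings above map one-to-one onto the `ExtraTreesClassifier` constructor:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier

# Small synthetic dataset purely for demonstration.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)

# Each argument mirrors one model setting above; values shown are the
# documented defaults, except class_weight, set here for imbalanced data.
clf = ExtraTreesClassifier(
    n_estimators=100,         # N Estimators
    max_depth=None,           # Max Depth (null = unlimited)
    max_features="sqrt",      # Max Features
    min_samples_split=2,      # Min Samples Split
    min_samples_leaf=1,       # Min Samples Leaf
    class_weight="balanced",  # Class Weight
    random_state=0,
)
clf.fit(X, y)
print(clf.predict(X[:3]))        # predicted class labels
print(clf.predict_proba(X[:3]))  # per-class probabilities
```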

Inference Settings

No dedicated inference-time settings. All trees vote on the final prediction.
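The vote is a soft vote: each tree emits a class-probability vector, the vectors are averaged, and the class with the highest mean probability wins. A minimal sketch with hypothetical per-tree outputs:

```python
def soft_vote(tree_probas):
    # Average the class-probability vectors from each tree, then
    # predict the class with the highest mean probability.
    n_trees = len(tree_probas)
    n_classes = len(tree_probas[0])
    mean = [sum(p[c] for p in tree_probas) / n_trees for c in range(n_classes)]
    return mean.index(max(mean)), mean

# Made-up outputs of three trees for one sample (two classes).
tree_probas = [[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]]
label, probas = soft_vote(tree_probas)
print(label, probas)  # class 0 wins on the averaged probabilities
```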


