Documentation

Extra Trees

Train Extra Trees to predict categorical outcomes

Similar to Random Forest but uses random splits instead of optimal splits.
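The parameter names in this page match scikit-learn's API, so a minimal sketch of the difference, assuming the scikit-learn implementations, compares the two ensembles side by side. Random Forest searches each node for the best split threshold over bootstrapped samples; Extra Trees draws candidate thresholds at random (and by default trains each tree on the full dataset), which is what makes training faster.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic classification data for illustration only.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Random Forest: optimal split per node, bootstrapped samples.
rf = RandomForestClassifier(n_estimators=100, random_state=0)
# Extra Trees: random split thresholds, full dataset per tree by default.
et = ExtraTreesClassifier(n_estimators=100, random_state=0)

rf_acc = cross_val_score(rf, X, y, cv=5).mean()
et_acc = cross_val_score(et, X, y, cv=5).mean()
```

On a given dataset the two scores are typically close; the randomized splits trade a little bias for lower variance and a cheaper node-splitting step.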

When to use:

  • You want faster training than Random Forest
  • You have a large dataset
  • The bias-variance tradeoff favors more randomness

Strengths: Faster training; even less prone to overfitting than Random Forest

Weaknesses: Slightly less accurate than Random Forest; models are still large

Model Parameters

Same as Random Forest: n_estimators, max_depth, min_samples_split, min_samples_leaf, etc.
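Assuming the model wraps scikit-learn's ExtraTreesClassifier (the parameter names listed above match that API), a hedged sketch of how these parameters are passed:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier

# Synthetic data for illustration only.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Same constructor parameters as RandomForestClassifier.
clf = ExtraTreesClassifier(
    n_estimators=200,     # number of trees in the ensemble
    max_depth=10,         # cap tree depth to limit model size
    min_samples_split=4,  # minimum samples required to split a node
    min_samples_leaf=2,   # minimum samples required at a leaf
    random_state=0,
)
clf.fit(X, y)
train_acc = clf.score(X, y)
```

Raising `min_samples_split` and `min_samples_leaf`, or lowering `max_depth`, shrinks the trees and the saved model, at some cost in fit quality.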
