Documentation

Decision Tree

Train a Decision Tree to predict categorical outcomes

Single tree that makes decisions using if-then-else rules.

When to use:

  • Need interpretable model
  • Want to visualize decisions
  • Teaching/explaining ML
  • Quick baseline

Strengths:

  • Highly interpretable
  • Easy to visualize
  • Handles non-linear relationships
  • No feature scaling needed

Weaknesses:

  • Overfits easily
  • Unstable (small changes in the data can produce a very different tree)
  • Usually less accurate than ensembles
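The interpretability payoff above can be seen directly by printing a trained tree's if-then-else rules. A minimal sketch, assuming the model is backed by scikit-learn's `DecisionTreeClassifier` (the library is not named in this page, so treat the API as an assumption):

```python
# Hedged sketch: assumes a scikit-learn backend.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=3, random_state=42)
clf.fit(X, y)

# Print the learned if-then-else rules -- this is the interpretability payoff.
print(export_text(clf, feature_names=load_iris().feature_names))
```

`export_text` shows each split as a readable rule; `plot_tree` gives the same structure as a diagram.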

Model Parameters

Max Depth Maximum depth of the tree. Lower values give simpler trees and help prevent overfitting.

Min Samples Split (default: 2) Minimum number of samples required to split an internal node.

Min Samples Leaf (default: 1) Minimum number of samples required in a leaf node.
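Both minimum-sample parameters act as pre-pruning: raising them forbids tiny splits and shrinks the tree. A quick sketch of the effect (again assuming a scikit-learn backend, which this page does not name):

```python
# Hedged sketch: assumes scikit-learn's DecisionTreeClassifier.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Defaults: leaves may hold a single sample, so the tree can grow large.
loose = DecisionTreeClassifier(min_samples_leaf=1, random_state=42).fit(X, y)

# Requiring at least 20 samples per leaf prunes the tree aggressively.
strict = DecisionTreeClassifier(min_samples_leaf=20, random_state=42).fit(X, y)

print(loose.get_n_leaves(), strict.get_n_leaves())
```

The stricter tree ends up with fewer leaves, trading a little training accuracy for less overfitting.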

Criterion

  • gini: Gini impurity (default)
  • entropy: Information gain
  • log_loss: Log loss
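The two classic criteria are simple to compute by hand. For class proportions p_k in a node, Gini impurity is 1 - Σ p_k², and entropy is -Σ p_k log₂ p_k; both are 0 for a pure node. A self-contained illustration (plain Python, no library assumptions):

```python
import math
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum(p_k^2). 0 for a pure node."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Shannon entropy: -sum(p_k * log2(p_k)). 0 for a pure node."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

print(gini(["a", "a", "b", "b"]))     # 0.5  (maximally impure 2-class node)
print(entropy(["a", "a", "b", "b"]))  # 1.0  (one full bit of uncertainty)
print(gini(["a", "a", "a", "a"]))     # 0.0  (pure node)
```

In practice the two criteria usually pick very similar splits; gini is slightly cheaper since it avoids the logarithm.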

Splitter

  • best: Choose best split (default)
  • random: Choose best random split (faster, more randomness)

Random State (default: 42) Seed for reproducibility.
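Putting the parameters above together: if the backend is scikit-learn (an assumption; the page does not name its library), each documented option maps onto a constructor argument of `DecisionTreeClassifier`:

```python
# Hedged sketch: parameter names assume scikit-learn's DecisionTreeClassifier.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

clf = DecisionTreeClassifier(
    max_depth=4,          # Max Depth
    min_samples_split=2,  # Min Samples Split (default)
    min_samples_leaf=1,   # Min Samples Leaf (default)
    criterion="gini",     # Criterion
    splitter="best",      # Splitter
    random_state=42,      # Random State, for reproducibility
)

X, y = load_iris(return_X_y=True)
clf.fit(X, y)
```

With a fixed `random_state`, repeated runs on the same data produce the same tree.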
