
Decision Tree

Train a decision tree to predict categorical outcomes.

A single tree that makes decisions using if-then-else rules.

When to use:

  • Need interpretable model
  • Want to visualize decisions
  • Teaching/explaining ML
  • Quick baseline

Strengths: highly interpretable, visualizable, handles non-linear relationships, no feature scaling needed.

Weaknesses: overfits easily, unstable (small changes in the data can produce a very different tree), usually less accurate than ensembles.
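The parameter names below match scikit-learn's `DecisionTreeClassifier`, so a minimal training sketch (assuming a scikit-learn backend and the bundled iris dataset as stand-in data) looks like this:

```python
# Minimal sketch: train a shallow decision tree as an interpretable baseline.
# Assumes a scikit-learn-style API; iris is illustrative stand-in data.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# A shallow tree keeps the if-then-else rules readable.
clf = DecisionTreeClassifier(max_depth=3, random_state=42)
clf.fit(X, y)

print(f"training accuracy: {clf.score(X, y):.3f}")
```

With `sklearn.tree.plot_tree(clf)` the fitted rules can then be visualized directly.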

Model Parameters

Max Depth Maximum tree depth. Lower values give simpler trees and help prevent overfitting.

Min Samples Split (default: 2) Minimum samples to create a split.

Min Samples Leaf (default: 1) Minimum samples in leaf nodes.
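These three parameters all limit how far the tree can grow. A quick sketch of their effect (scikit-learn assumed, iris as stand-in data): an unconstrained tree grows until it fits the training data, while depth and leaf-size limits keep it small.

```python
# Sketch: max_depth / min_samples_leaf constrain tree size (scikit-learn assumed).
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# No limits: the tree grows until the training data is fit (near) perfectly.
full = DecisionTreeClassifier(random_state=42).fit(X, y)

# Constrained: shallower tree with larger leaves, less prone to overfitting.
pruned = DecisionTreeClassifier(
    max_depth=2, min_samples_leaf=5, random_state=42
).fit(X, y)

print("nodes (full):  ", full.tree_.node_count)
print("nodes (pruned):", pruned.tree_.node_count)
```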

Criterion

  • gini: Gini impurity (default)
  • entropy: Information gain
  • log_loss: Log loss
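Both main criteria measure how mixed the classes in a node are; a split is chosen to reduce that impurity. A hand-computed sketch of the two formulas, applied to a node's class counts (pure Python, for illustration):

```python
# Sketch: the two main split criteria computed by hand for a node's class counts.
from math import log2

def gini(counts):
    """Gini impurity: 1 - sum(p_k^2). 0 = pure node; 0.5 = 50/50 binary node."""
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

def entropy(counts):
    """Shannon entropy in bits: -sum(p_k * log2(p_k)). 0 = pure node."""
    n = sum(counts)
    return -sum((c / n) * log2(c / n) for c in counts if c > 0)

print(gini([5, 5]))     # 0.5 (maximally impure binary node)
print(gini([10, 0]))    # 0.0 (pure node)
print(entropy([5, 5]))  # 1.0 bit
```

In practice gini and entropy usually produce very similar trees; gini is the default because it is slightly cheaper to compute.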

Splitter

  • best: Choose best split (default)
  • random: Choose best random split (faster, more randomness)

Random State (default: 42) Seed for reproducibility.
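Fixing the seed matters especially with `splitter="random"`, where split thresholds are drawn at random. A sketch (scikit-learn assumed, iris as stand-in data) showing that the same seed reproduces the same tree:

```python
# Sketch: random_state makes randomized splitting reproducible
# (scikit-learn assumed).
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

a = DecisionTreeClassifier(splitter="random", random_state=42).fit(X, y)
b = DecisionTreeClassifier(splitter="random", random_state=42).fit(X, y)

# Same seed -> identical trees, hence identical predictions.
print((a.predict(X) == b.predict(X)).all())
```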

