LightGBM

Microsoft's gradient boosting framework optimized for speed and memory efficiency.

When to use:

  • Large datasets (>10k rows)
  • Many features
  • Need fast training
  • Limited memory

Strengths: very fast, low memory footprint, handles large datasets, accurate.
Weaknesses: can overfit small datasets; many hyperparameters to tune.

Model Parameters

Num Leaves (default: 31) Maximum number of leaves in one tree. More leaves allow more complex trees but increase the risk of overfitting.

Learning Rate (default: 0.1) Shrinkage applied to each tree's contribution. Lower values typically need more boosting iterations but generalize better.

N Estimators (default: 100) Number of boosting iterations.

Max Depth (default: -1) Maximum tree depth (-1 = unlimited).

Feature Fraction (default: 1.0) Fraction of features randomly sampled for each tree.

Bagging Fraction (default: 1.0) Fraction of training rows randomly sampled per iteration (takes effect only when a bagging frequency is also set).

Min Data in Leaf (default: 20) Minimum samples in one leaf.

Reg Alpha, Reg Lambda (default: 0.0) L1 and L2 regularization strength on leaf weights.

Random State (default: 42) Seed for reproducibility.
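The parameters above can be collected into a plain dictionary using LightGBM's Python keyword names and passed to an estimator. A minimal sketch, mirroring the defaults listed on this page (the 0.0 values for `reg_alpha` and `reg_lambda` are LightGBM's own defaults, assumed here since this page does not list them):

```python
# Parameters from this page, keyed by LightGBM's Python argument names.
params = {
    "num_leaves": 31,         # maximum leaves per tree
    "learning_rate": 0.1,     # shrinkage per boosting step
    "n_estimators": 100,      # number of boosting iterations
    "max_depth": -1,          # -1 = unlimited depth
    "feature_fraction": 1.0,  # fraction of features sampled per tree
    "bagging_fraction": 1.0,  # fraction of rows sampled per iteration
    "min_data_in_leaf": 20,   # minimum samples in one leaf
    "reg_alpha": 0.0,         # L1 regularization (assumed default)
    "reg_lambda": 0.0,        # L2 regularization (assumed default)
    "random_state": 42,       # seed for reproducibility
}
```

With the `lightgbm` package installed, such a dictionary would typically be unpacked into the scikit-learn-style estimator, e.g. `lightgbm.LGBMClassifier(**params).fit(X, y)`.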
