Ridge Regression
Linear regression with L2 regularization to reduce overfitting
Ridge Regression adds an L2 penalty to linear regression, shrinking large coefficients toward zero. This improves generalization when features are correlated or the dataset is small relative to feature count.
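Concretely, Ridge minimizes the penalized least-squares objective ||y − Xw||² + α·||w||², so correlated features can no longer trade off arbitrarily large opposing coefficients. A minimal sketch of this shrinkage effect, assuming scikit-learn's `Ridge` (the library choice is an assumption; this document does not name one):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)   # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 3 * x1 + rng.normal(scale=0.1, size=n)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

# With collinear features, OLS coefficients can grow large in opposite
# directions; the L2 penalty pulls the coefficient vector toward zero.
print("OLS coef norm:  ", np.linalg.norm(ols.coef_))
print("Ridge coef norm:", np.linalg.norm(ridge.coef_))
```

Both models fit the data comparably well here; the difference shows up in the size and stability of the coefficients, which is what improves generalization.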
When to use:
- Linear relationships with many correlated features (multicollinearity)
- Preventing overfitting on small datasets with many features
- When all features should be kept (unlike Lasso, Ridge never zeroes out coefficients)
Input: Tabular data with the feature columns defined during training
Output: Continuous predicted value
Model Settings (set during training, used at inference)
Alpha (default: 1.0) Regularization strength. Higher values apply stronger shrinkage; setting it to 0 recovers ordinary (unregularized) linear regression.
Fit Intercept (default: true) Whether to include a bias term.
Solver (default: auto) Algorithm for finding the solution. auto selects the best solver based on data type and size.
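The settings above map directly onto constructor parameters if the underlying model is, for example, scikit-learn's `Ridge` (an assumption for illustration; parameter names may differ in other implementations):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=100)

# Alpha, Fit Intercept, and Solver as described in Model Settings.
model = Ridge(alpha=1.0, fit_intercept=True, solver="auto")
model.fit(X, y)

# Inference takes tabular data with the same feature columns used in training
# and returns one continuous value per row.
preds = model.predict(X[:3])
print(preds.shape)
```

Because the penalty is fixed at training time, there is nothing to tune at inference: `predict` only needs rows with the training-time feature columns.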
Inference Settings
No dedicated inference-time settings.