deepbox/ml

Linear Models

Regression and classification models based on linear combinations of features. All follow the fit/predict API.

LinearRegression

Ordinary Least Squares regression. Fits ŷ = Xw + b by minimizing ‖y − Xw‖². No regularization. Closed-form solution via normal equations or SVD.
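To make the closed form concrete, here is a self-contained sketch of what an OLS fit computes for a single feature, using plain arrays rather than the library's tensors (the `olsFit` helper is illustrative, not part of the deepbox API). For one feature the normal equations reduce to w = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)² and b = ȳ − w·x̄.

```typescript
// Closed-form OLS for one feature:
//   w = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)²,  b = ȳ − w·x̄
function olsFit(x: number[], y: number[]): { w: number; b: number } {
  const n = x.length;
  const xMean = x.reduce((s, v) => s + v, 0) / n;
  const yMean = y.reduce((s, v) => s + v, 0) / n;
  let num = 0;
  let den = 0;
  for (let i = 0; i < n; i++) {
    num += (x[i] - xMean) * (y[i] - yMean);
    den += (x[i] - xMean) ** 2;
  }
  const w = num / den;
  return { w, b: yMean - w * xMean };
}

// Same data as the usage example below: slope ≈ 0.6, intercept ≈ 2.2
const { w, b } = olsFit([1, 2, 3, 4, 5], [2, 4, 5, 4, 5]);
console.log(w, b);
```

With more than one feature the same idea generalizes to solving (XᵀX)w = Xᵀy, typically via SVD for numerical stability.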

Ridge

Linear regression with L2 regularization. Minimizes ‖y − Xw‖² + α‖w‖². Shrinks coefficients toward zero but never to exactly zero. α controls regularization strength.
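The shrinkage behavior is easiest to see in one dimension. On centered data the single-feature ridge solution is w = Σxᵢyᵢ / (Σxᵢ² + α): α only enlarges the denominator, so the coefficient shrinks smoothly toward zero but never reaches it. This sketch uses plain arrays; `ridge1d` is an illustrative helper, not part of the deepbox API.

```typescript
// One-feature ridge on centered data: w = Σxᵢyᵢ / (Σxᵢ² + α).
// Larger α ⇒ larger denominator ⇒ smaller |w|, but never exactly zero.
function ridge1d(x: number[], y: number[], alpha: number): number {
  let xy = 0;
  let xx = 0;
  for (let i = 0; i < x.length; i++) {
    xy += x[i] * y[i];
    xx += x[i] * x[i];
  }
  return xy / (xx + alpha);
}

const xCentered = [-2, -1, 0, 1, 2];
const yCentered = [-2, 0, 1, 0, 1];
console.log(ridge1d(xCentered, yCentered, 0));   // α = 0 recovers OLS (0.6)
console.log(ridge1d(xCentered, yCentered, 1));   // shrunk toward zero
console.log(ridge1d(xCentered, yCentered, 100)); // strongly shrunk, still nonzero
```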

Lasso

Linear regression with L1 regularization. Minimizes ‖y − Xw‖² + α‖w‖₁. Produces sparse solutions (some coefficients exactly zero). Useful for feature selection.
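The sparsity comes from soft-thresholding. For a single feature on centered data, minimizing ‖y − xw‖² + α|w| has the closed form w = S(Σxᵢyᵢ, α/2) / Σxᵢ², where S(z, t) = sign(z)·max(|z| − t, 0); the α/2 threshold follows from the objective above having no ½ factor on the squared term. Once α is large enough, w snaps to exactly zero — unlike ridge. The helpers below are an illustrative sketch, not the deepbox API.

```typescript
// Soft-thresholding operator: S(z, t) = sign(z) · max(|z| − t, 0)
function softThreshold(z: number, t: number): number {
  return Math.sign(z) * Math.max(Math.abs(z) - t, 0);
}

// One-feature lasso on centered data: w = S(Σxᵢyᵢ, α/2) / Σxᵢ²
function lasso1d(x: number[], y: number[], alpha: number): number {
  let xy = 0;
  let xx = 0;
  for (let i = 0; i < x.length; i++) {
    xy += x[i] * y[i];
    xx += x[i] * x[i];
  }
  return softThreshold(xy, alpha / 2) / xx;
}

const xCen = [-2, -1, 0, 1, 2];
const yCen = [-2, 0, 1, 0, 1];
console.log(lasso1d(xCen, yCen, 0));  // α = 0 recovers OLS (0.6)
console.log(lasso1d(xCen, yCen, 4));  // shrunk to 0.4
console.log(lasso1d(xCen, yCen, 20)); // coefficient driven exactly to 0
```

With many features there is no closed form; implementations typically apply this same soft-thresholding update one coordinate at a time (coordinate descent).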

LogisticRegression

Linear classifier using logistic (sigmoid) function. Outputs class probabilities. Supports binary and multi-class (one-vs-rest) classification. Regularized by default.
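There is no closed-form solution for logistic regression, so the weights are found iteratively. The sketch below fits a binary, unregularized model by batch gradient descent on the log-loss, using plain arrays — an illustration of the idea, not the deepbox implementation (which adds regularization by default). The key step is that the gradient per sample is simply (p − y)·x, where p = σ(wᵀx + b).

```typescript
const sigmoid = (z: number): number => 1 / (1 + Math.exp(-z));

// Minimal batch gradient descent on the log-loss (binary, unregularized).
function logisticFit(X: number[][], y: number[], lr = 0.1, iters = 5000) {
  const d = X[0].length;
  const w = new Array<number>(d).fill(0);
  let b = 0;
  for (let it = 0; it < iters; it++) {
    const gw = new Array<number>(d).fill(0);
    let gb = 0;
    for (let i = 0; i < X.length; i++) {
      const p = sigmoid(X[i].reduce((s, v, j) => s + v * w[j], b));
      const err = p - y[i]; // gradient of the log-loss w.r.t. the logit
      for (let j = 0; j < d; j++) gw[j] += err * X[i][j];
      gb += err;
    }
    for (let j = 0; j < d; j++) w[j] -= (lr / X.length) * gw[j];
    b -= (lr / X.length) * gb;
  }
  return { w, b };
}

// Same toy data as the usage example below.
const model = logisticFit([[1, 2], [2, 3], [3, 1], [4, 3]], [0, 0, 1, 1]);
const predict = (x: number[]): number =>
  sigmoid(x.reduce((s, v, j) => s + v * model.w[j], model.b)) >= 0.5 ? 1 : 0;
console.log(predict([1, 2]), predict([4, 3])); // should recover the training labels
```

Multi-class (one-vs-rest) classification repeats this binary fit once per class and picks the class with the highest probability.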

OLS

min ‖y − Xw‖²

Where:

  • X = Feature matrix
  • y = Target vector
  • w = Weight vector (the minimization is over w)

Ridge

min ‖y − Xw‖² + α‖w‖²

Where:

  • α = Regularization strength

Lasso

min ‖y − Xw‖² + α‖w‖₁

Where:

  • ‖w‖₁ = Sum of absolute weights

Logistic

P(y=1|x) = σ(wᵀx + b)

Where:

  • σ = Sigmoid function, σ(z) = 1 / (1 + e⁻ᶻ)
linear-models.ts
import { LinearRegression, Ridge, Lasso, LogisticRegression } from "deepbox/ml";
import { tensor } from "deepbox/ndarray";

const X = tensor([[1], [2], [3], [4], [5]]);
const y = tensor([2, 4, 5, 4, 5]);

// Linear Regression
const lr = new LinearRegression();
lr.fit(X, y);
const pred = lr.predict(tensor([[6]])); // Predict for x = 6
console.log(lr.coef);      // Slope
console.log(lr.intercept); // Intercept

// Ridge (L2 regularization)
const ridge = new Ridge({ alpha: 1.0 });
ridge.fit(X, y);

// Lasso (L1 regularization, sparse solutions)
const lasso = new Lasso({ alpha: 0.1 });
lasso.fit(X, y);

// Logistic Regression (classification)
const Xc = tensor([[1, 2], [2, 3], [3, 1], [4, 3]]);
const yc = tensor([0, 0, 1, 1]);
const logReg = new LogisticRegression({ maxIter: 100 });
logReg.fit(Xc, yc);
const classes = logReg.predict(tensor([[2.5, 2]]));