deepbox/nn

Module & Sequential

Base classes for building neural networks. Module is the building block; Sequential chains modules in order.

Module

Base class for all neural network modules. Subclass Module to create custom layers and models. Manages parameters, buffers, sub-modules, state serialization, hooks, and train/eval mode.

Sequential

extends Module

A sequential container that chains modules in the order they are passed. The output of each module is fed as input to the next. The simplest way to build feedforward networks.
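As a minimal sketch of that chaining behaviour (assuming only the Linear and ReLU layers used in the examples below), passing an input through a Sequential gives the same result as calling each module's forward() by hand:

import { Sequential, Linear, ReLU } from "deepbox/nn";
import { tensor } from "deepbox/ndarray";

const fc = new Linear(3, 5);
const act = new ReLU();
const net = new Sequential(fc, act);

const x = tensor([[0.5, -1.0, 2.0]]);

// Sequential threads the input through each module in order...
const viaSequential = net.forward(x);
// ...which is equivalent to chaining the forward() calls manually.
const viaManualChain = act.forward(fc.forward(x));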

Module API

  • .forward(input) — Run the forward pass. Override in subclasses.
  • .parameters() — Return all learnable parameters (GradTensors)
  • .train() — Set the module to training mode (enables dropout, etc.)
  • .eval() — Set to evaluation mode (disables dropout, uses running stats for batchnorm)
  • .stateDict() — Serialize all parameters and buffers to a plain object
  • .loadStateDict(dict) — Restore parameters from a state dict
  • .freezeParameters() — Disable gradient computation for all parameters
  • .unfreezeParameters() — Re-enable gradient computation
  • .registerForwardHook(fn) — Register a hook called after every forward() (see the sketch after this list)
  • .to(device) — Move module to a device
  • .children() — Iterate over immediate sub-modules
  • .namedParameters() — Iterate over (name, parameter) pairs
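The sketch below exercises the introspection, hook, and freezing methods from the list above. It is a hedged example, not canonical usage: the hook callback signature (module, input, output), the [name, parameter] destructuring, and iterating children()/namedParameters() with for...of are assumptions, since the API list does not spell them out.

import { Sequential, Linear, ReLU } from "deepbox/nn";
import { tensor } from "deepbox/ndarray";

const model = new Sequential(new Linear(4, 8), new ReLU(), new Linear(8, 2));

// Walk immediate sub-modules and named parameters (assumes both are iterable).
for (const child of model.children()) {
  console.log(child.constructor.name);
}
for (const [name, param] of model.namedParameters()) {
  console.log(name, param);
}

// Observe every forward pass; the (module, input, output) signature is an assumption.
model.registerForwardHook((module, input, output) => {
  console.log(`forward pass through ${module.constructor.name}`);
});

// Freeze for feature extraction, then unfreeze before fine-tuning.
model.freezeParameters();
const features = model.forward(tensor([[1, 2, 3, 4]]));
model.unfreezeParameters();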
module-sequential.ts
import { Module, Sequential, Linear, ReLU, Dropout } from "deepbox/nn";
import { tensor } from "deepbox/ndarray";

// Sequential: chain layers
const model = new Sequential(
  new Linear(10, 64),
  new ReLU(),
  new Dropout(0.2),
  new Linear(64, 32),
  new ReLU(),
  new Linear(32, 1)
);

const input = tensor([[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]]);
const output = model.forward(input); // shape: [1, 1]

// Access parameters for optimizer
const params = model.parameters(); // All GradTensors

// Save and load state
const state = model.stateDict();
model.loadStateDict(state);

// Train/eval mode
model.train(); // Enable dropout
model.eval();  // Disable dropout
custom-module.ts
import { Module, Linear, ReLU } from "deepbox/nn";
import type { Tensor } from "deepbox/ndarray";

// Custom module by extending Module
class MLP extends Module {
  private fc1: Linear;
  private fc2: Linear;
  private relu: ReLU;

  constructor(inputSize: number, hiddenSize: number, outputSize: number) {
    super();
    this.fc1 = new Linear(inputSize, hiddenSize);
    this.fc2 = new Linear(hiddenSize, outputSize);
    this.relu = new ReLU();
    this.registerModule("fc1", this.fc1);
    this.registerModule("fc2", this.fc2);
  }

  forward(x: Tensor): Tensor {
    x = this.fc1.forward(x);
    x = this.relu.forward(x);
    x = this.fc2.forward(x);
    return x;
  }
}
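Using the custom MLP is the same as using any built-in module. The short sketch below relies only on the Module API listed above; the assumption that sub-modules registered via registerModule are what makes their parameters visible to parameters() and stateDict() is an inference from the API, not stated by the docs.

import { tensor } from "deepbox/ndarray";

const mlp = new MLP(4, 16, 2);

// Forward pass: [1, 4] input -> [1, 2] output
const out = mlp.forward(tensor([[0.1, 0.2, 0.3, 0.4]]));

// Sub-modules registered via registerModule contribute their parameters here.
const trainable = mlp.parameters();

// Mode switching and state round-tripping work as on any Module.
mlp.eval();
const snapshot = mlp.stateDict();
mlp.loadStateDict(snapshot);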