LyCIL

A clean PyTorch Lightning (2.x) continual/incremental learning (CIL) library inspired by PyCIL (https://github.com/LAMDA-CL/PyCIL), with up-to-date methods, NPU support, a modern interface, and ready-to-run CLI examples.

Features

  • Modular LightningModule implementations for iCaRL and LUCIR that inherit from a common BaseIncremental.
  • ExemplarBuffer with herding selection, per-class quotas, and persistence (see the sketch after this list).
  • CosineClassifier with dynamic expansion (used by LUCIR), and a standard Linear head (used by iCaRL by default; switchable).
  • CIFAR-100 and ImageNet LightningDataModules with task-by-task class splits.
  • Custom LightningCLI subclass that orchestrates multi-task training loops.
  • Concise English docstrings and comments.
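
For reference, herding (the selection strategy used by iCaRL) greedily picks exemplars whose running feature mean stays closest to the class mean. Below is a minimal sketch of the idea, assuming features for one class have already been extracted; herding_select and its signature are illustrative, not LyCIL's actual API.

import torch

def herding_select(features: torch.Tensor, m: int) -> list[int]:
    """Pick m exemplar indices whose running mean best approximates the class mean."""
    feats = torch.nn.functional.normalize(features, dim=1)    # L2-normalize, as in iCaRL
    class_mean = feats.mean(dim=0)
    picked = torch.zeros(feats.size(0), dtype=torch.bool)
    selected: list[int] = []
    running_sum = torch.zeros_like(class_mean)
    for k in range(1, m + 1):
        # exemplar mean if each remaining candidate were added as the k-th exemplar
        candidate_means = (running_sum + feats) / k
        dists = (class_mean - candidate_means).norm(dim=1)
        dists[picked] = float("inf")                          # never reselect an index
        idx = int(dists.argmin())
        picked[idx] = True
        selected.append(idx)
        running_sum = running_sum + feats[idx]
    return selected

With a per-class quota, the same routine runs once per class, and the buffer stores (and can persist) the chosen samples.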

Quick Start

We highly recommend using uv to manage the environment.

# uv venv                           # create virtual env
uv sync --extra lightning           # CUDA Lightning
uv sync --extra lightning-npu       # NPU Lightning
Alternatively, install via pip or conda:
# conda or pip, with optional dependencies for lightning
pip install -e ".[lightning]"       # 'lightning', or 'lightning-npu'

Use the LightningCLI commands to train, validate, test, or predict with continual learning models. For example, to train Learning without Forgetting (LwF) on CIFAR-100:

# valid commands: {fit,validate,test,predict}
python examples/lwf-cli.py fit -c configs/smoketest_lwf.yml
# or pass individual config files
python examples/lwf-cli.py fit --trainer configs/trainer/smoketest.yml --model configs/model/lwf.yml --data configs/data/cifar100.yml
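
The custom LightningCLI subclass listed under Features is what turns a single fit invocation into a task-by-task loop. The sketch below only illustrates that pattern; set_task, expand_classifier, and update_exemplars are placeholder names, not confirmed LyCIL API.

from lightning.pytorch import LightningDataModule, LightningModule, Trainer

def run_incremental(model: LightningModule, datamodule: LightningDataModule, num_tasks: int) -> None:
    # One Trainer per task; the model and its exemplar buffer persist across tasks.
    for task_id in range(num_tasks):
        datamodule.set_task(task_id)                 # placeholder: expose only this task's classes
        model.expand_classifier(new_classes=10)      # placeholder: grow the head for the new classes
        trainer = Trainer(max_epochs=70, accelerator="auto", devices=1)
        trainer.fit(model, datamodule=datamodule)
        model.update_exemplars(datamodule)           # placeholder: refresh the exemplar buffer
        trainer.test(model, datamodule=datamodule)   # evaluate on all classes seen so far

In practice the trainer, model, and data settings all come from the YAML files passed on the command line, as in the commands above.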
