UCL COMP0087 Statistical Natural Language Processing - NanoGlot Team
Updated Apr 17, 2026 - Jupyter Notebook
A curated showcase of small, on-premise models with great utility, enabling privacy-centered deployment and easy adaptation to bio-inspired use cases.
A 36M-parameter goldfish language model with a 10-second memory + pixel-art PWA desk pet. Runs in your browser, fully offline. Adopt it at den-sec.github.io/glublm/desk-pet/
Local-first inference control layer for small LLMs. Inline plan-then-answer scaffolding (lite/two_pass/direct) for more reliable structured outputs at lower latency on Ollama and similar.
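The three scaffolding modes named above (lite/two_pass/direct) can be sketched as a small control-flow wrapper around any local generation function. This is an illustrative sketch, not the project's actual API: the names `plan_then_answer` and `echo_model` are hypothetical, and the prompt wording is an assumption.

```python
from typing import Callable

def plan_then_answer(question: str,
                     generate: Callable[[str], str],
                     mode: str = "two_pass") -> str:
    """Route a question through direct, lite, or two-pass scaffolding.

    Hypothetical sketch: mode names mirror the blurb above, but the
    prompt templates are assumptions, not the repo's.
    """
    if mode == "direct":
        # Single call: answer immediately, no explicit planning.
        return generate(question)
    if mode == "lite":
        # One call that asks the model to plan inline before answering.
        return generate(f"First outline brief steps, then answer.\nQ: {question}")
    # two_pass: a separate planning call, then an answering call that
    # conditions on the plan.
    plan = generate(f"List the steps needed to answer: {question}")
    return generate(f"Q: {question}\nPlan:\n{plan}\n"
                    f"Follow the plan and give the final answer.")

# Stub generator standing in for a local model endpoint (e.g. one served
# by Ollama); swap in a real client to use it for inference.
def echo_model(prompt: str) -> str:
    return f"[model output for: {prompt[:30]}...]"

print(plan_then_answer("What is 2+2?", echo_model, mode="direct"))
```

The design point is that two_pass trades an extra round trip for a more constrained answering prompt, while lite keeps latency to a single call.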
Experiments with in-context learning using small functions
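The core of this kind of experiment is building a few-shot prompt from input/output pairs of a small function and asking the model to continue the pattern. A minimal sketch, assuming a simple `Input: x -> Output: y` template (the repo's actual format may differ):

```python
def icl_prompt(examples, query):
    """Build a few-shot in-context-learning prompt from (input, output)
    pairs, ending with an unanswered query for the model to complete."""
    lines = [f"Input: {x} -> Output: {y}" for x, y in examples]
    lines.append(f"Input: {query} -> Output:")
    return "\n".join(lines)

# Demonstrations of the small function f(x) = 2x + 1, never named in the
# prompt; the model must infer the rule from the pairs alone.
demos = [(1, 3), (2, 5), (4, 9)]
print(icl_prompt(demos, 5))
```

A model that has learned the pattern in context should complete the final line with 11.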
Create a small LLM using EleutherAI/gpt-neo-2.7B, fine-tune it for a specialized purpose, and leverage it as a co-pilot
Small-model-friendly OpenClaw skills — simple, local-only tools optimized for reliable single-tool-call execution by 4B-parameter LLMs (e.g. Qwen3-4B-AWQ)
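Reliable single-tool-call execution usually means the model emits exactly one JSON object naming a tool, which the harness validates and dispatches. A hedged sketch of that loop; the tool names, JSON schema, and `run_single_tool_call` helper are illustrative assumptions, not OpenClaw's actual format:

```python
import json

# Hypothetical local-only tool registry; real skills would wrap actual
# filesystem or shell operations.
TOOLS = {
    "read_file": lambda path: f"<contents of {path}>",
    "add": lambda a, b: a + b,
}

def run_single_tool_call(model_output: str):
    """Parse one JSON tool call and execute it, rejecting anything else.

    Keeping the contract to a single call per turn is what makes it
    tractable for small (e.g. 4B-parameter) models to follow reliably.
    """
    call = json.loads(model_output)  # must be a single valid JSON object
    name, args = call["tool"], call.get("args", {})
    if name not in TOOLS:
        raise ValueError(f"unknown tool: {name}")
    return TOOLS[name](**args)

print(run_single_tool_call('{"tool": "add", "args": {"a": 2, "b": 3}}'))
```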