The pretty much "official" DSPy framework for TypeScript
Updated
Mar 15, 2026 - TypeScript
NativeMind: Your fully private, open-source, on-device AI assistant
TypeScript SDK for using in-browser AI models with the Vercel AI SDK, with support for seamless fallback to server-side models
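The "seamless fallback to server-side models" pattern above can be sketched in a few lines. This is an illustrative sketch only, not the SDK's actual API: the `ChatModel` interface and `completeWithFallback` helper are hypothetical names, standing in for an in-browser WebLLM-style client and a server-backed client.

```typescript
// Hypothetical minimal interface standing in for both an in-browser
// model client and a server-side one (not the real SDK's types).
interface ChatModel {
  complete(prompt: string): Promise<string>;
}

// Try the local, in-browser model first; if it fails (e.g. no WebGPU
// support, model download failed), fall back to the server model.
async function completeWithFallback(
  local: ChatModel,
  server: ChatModel,
  prompt: string,
): Promise<string> {
  try {
    return await local.complete(prompt);
  } catch {
    return server.complete(prompt);
  }
}

// Stub models for demonstration: the local model is unavailable,
// so the server model answers instead.
const failingLocal: ChatModel = {
  complete: async () => {
    throw new Error("WebGPU unavailable");
  },
};
const serverStub: ChatModel = {
  complete: async (p) => `server: ${p}`,
};

completeWithFallback(failingLocal, serverStub, "hello").then(console.log);
// prints "server: hello"
```

The same helper returns the local result unchanged whenever the in-browser model succeeds, so callers never need to know which backend answered.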
A lightweight recreation of OS1/Samantha from the movie Her, running locally in the browser
A Webapp that uses Retrieval Augmented Generation (RAG) and Large Language Models to interact with a PDF directly in the browser.
Not-Only LLM Chat. An AI application that enhances creativity and user experience beyond just LLM chat. Note: this appears to be a beta version; if you hit a database issue, please clear the site data via your browser's developer tools.
Blueshell AI - Chat privately with AI and your documents fully offline, directly in your browser.
TalkFlow-Gun-Vue-WebLLM-DApp - E2EE Chat - E2EE Video/Voice Call - Decentralized AI - robots - drones - communication facilities
A proof of concept of what can be done with the BlockNote editor, based on ProseMirror, and an LLM running in the browser.
🚀 A blazingly fast, modern chat interface built with Rust/WASM and Leptos, featuring local AI model execution via WebLLM. Privacy-first design with no backend dependencies.
ChatGPT style interface for open-source LLMs, allowing completely free, offline, and private AI use. Connect with models downloaded from Ollama, or load models directly in the browser with WebLLM.
Local-first AI extension that turns what you read into a searchable knowledge graph with hybrid search, temporal context, and proactive recall—fully private.
KREASYS: A professional, autonomous, browser-native AI ecosystem and IDE. It features local LLM inference via WebGPU, a persistent Virtual File System (VFS), and multi-modal interaction (text, image, audio, video). Operates entirely client-side for maximum privacy and performance with integrated Telegram delegation.