Elevator pitch: ResonantBridge is an autopilot for AI. It measures coherence (σ), detects drift, and self-stabilizes generation (autopoiesis + adaptive threshold + active realignment). Everything runs locally with Ollama.
- Live metrics: σ(t), drift_rate, entropy, confidence + semantic states (STABLE / WANDERING / DRIFTING / CONFUSED / REALIGNING)
- Self-regulation: autopoiesis (adaptive ω), adaptive threshold, active realignment
- Closed-loop: hooks to adjust LLM decoding (temperature/top_p/prompt anchor) based on σ/STATE
- GUI: Matplotlib live dashboard (σ, drift, entropy, threshold, confidence + color-coded state)
- On-prem / model-agnostic: works with Ollama (Llama/Gemma/…)
- One-click scripts: Windows `.bat`, Linux/macOS `.sh`
- CI: GitHub Actions (builds on Windows + Linux)
```
Ollama (LLM) ──► σ(t) extractor ──► ResonantBridge (C++) ──► Metrics/State
   ▲                                         │
   └──── policy hook (temperature/top_p) ◄───┘
```
- **σ(t)**: simple, extensible coherence proxy over the stream (see the sketch below).
- **Bridge (C++)**: autopoiesis + adaptive threshold + realignment + state classification.
- **Policy hook**: optionally tunes sampling when the system drifts.
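
For orientation, here is a minimal Python sketch of what a sliding-window σ(t) proxy and a naive state classifier could look like. The formulas and thresholds below are illustrative assumptions, not the project's actual implementation; the real extractor is `tools/ollama_sigma_feed.py` and the real classification lives in the C++ bridge.

```python
# Illustrative sketch only (hypothetical formulas and thresholds).
from collections import deque


def sigma_from_window(tokens: list[str]) -> float:
    """Coherence proxy in [0, 1]: lexical overlap between the two halves of the window."""
    half = len(tokens) // 2
    older, newer = set(tokens[:half]), set(tokens[half:])
    if not older or not newer:
        return 1.0
    return len(older & newer) / len(older | newer)


def classify(sigma: float, drift_rate: float, threshold: float = 0.5) -> str:
    """Map σ and its drift to a semantic state (all cut-offs are assumptions)."""
    if sigma >= threshold:
        return "STABLE" if abs(drift_rate) < 0.05 else "WANDERING"
    if sigma < 0.2:
        return "CONFUSED"
    if drift_rate < -0.05:
        return "DRIFTING"
    return "REALIGNING"  # in the real bridge this state is set while it actively corrects


# Example over a toy token stream:
window = deque(maxlen=64)
prev_sigma = None
for token in "the model keeps the topic the model keeps the topic mostly".split():
    window.append(token)
    sigma = sigma_from_window(list(window))
    drift = 0.0 if prev_sigma is None else sigma - prev_sigma
    prev_sigma = sigma
print(classify(sigma, drift))
```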

---
## 🚀 Quick Start (Windows)
1. Install:
   - Ollama (local), then pull a model, e.g. `ollama pull llama3.1:8b`
   - Python ≥ 3.11 (`pip install -r requirements.txt` if provided)
   - CMake + MSVC Build Tools (for the C++ bridge)
2. Build the bridge (once):
   ```powershell
   mkdir build; cd build
   cmake .. -G "Visual Studio 17 2022" -A x64
   cmake --build . --config Release
   ```
3. Run (opens 4–5 live windows):
   ```powershell
   cd .\launchers\
   .\run_resonantbridge_all.bat
   ```
   PowerShell users: run `Set-ExecutionPolicy Bypass -Scope Process -Force`, then `.\run_resonantbridge_all.ps1`.
4. Stop: close the windows or press Ctrl+C.
### Windows (one click)
```powershell
pip install -r requirements.txt
scripts\run_live_demo.bat
```

### Linux/macOS (one command)
```bash
pip install -r requirements.txt
chmod +x scripts/run_live_demo.sh
./scripts/run_live_demo.sh
```

On first run, the scripts auto-build the C++ binaries with CMake.
## 🖥️ What you’ll see
- OLLAMA FEED: streaming text → `sigma_feed.txt` (σ updates continuously).
- BRIDGE: reads σ, computes drift/entropy/confidence, performs realignment, logs to `out.csv` (see the reading sketch below).
- LIVE VISUAL: real-time graph + STATE (color-coded background). On close, saves `docs/live_snapshot.png`.
- VOICE (Windows, optional): TTS announces state changes.
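
If you want to consume the bridge output programmatically instead of through the GUI, a sketch like the one below can read `out.csv`. The column names (`sigma`, `drift`, `entropy`, `state`) are assumptions here; check the header that your build of the bridge actually writes.

```python
# Minimal sketch: read the last row of out.csv and print the metrics.
# Column names are assumed; adapt them to the header your bridge writes.
import csv


def latest_metrics(path: str = "out.csv") -> dict:
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    return rows[-1] if rows else {}


m = latest_metrics()
if m:
    print(f"σ={m.get('sigma')} drift={m.get('drift')} "
          f"entropy={m.get('entropy')} state={m.get('state')}")
```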
## 📂 Layout
```
.
├─ src/                         # C++ core and CLIs
│  ├─ rlang_core_bridge.h
│  ├─ rlang_core_bridge.cpp
│  ├─ rlang_cli.cpp
│  └─ rlang_cli_sigma_example.cpp
├─ tools/                       # Python tools
│  ├─ ollama_sigma_feed.py      # extracts σ(t) from the Ollama stream
│  └─ live_visual.py            # GUI
├─ scripts/
│  ├─ run_live_demo.bat         # Windows: loop feed + loop bridge + GUI (+ TTS)
│  ├─ run_live_demo.sh          # Linux/macOS
│  └─ voice_watch.ps1           # Windows TTS watcher (optional)
├─ CMakeLists.txt
├─ requirements.txt
├─ LICENSE-AGPL.txt
├─ LICENSE-COMMERCIAL.md
├─ .github/workflows/ci.yml
└─ docs/
   ├─ screenshot_live_visual.png
   └─ QUICKSTART.md
```
## 🔧 Manual run (alternative)
```bash
# Build
mkdir -p build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
cmake --build . --config Release
```

```bash
# Run (3 terminals)
python tools/ollama_sigma_feed.py --model llama3.1:8b --prompt "Explain resonance as breathing of a system." --sigma-file sigma_feed.txt --window 64
./build/rlang_cli_sigma out.csv sigma_feed.txt 300
python tools/live_visual.py out.csv sigma_feed.txt
```
## 🛡️ License (Dual-License)
- Community / non-commercial: AGPL-3.0 (see `LICENSE-AGPL.txt`).
- Commercial / enterprise: commercial license (see `LICENSE-COMMERCIAL.md`); contact below.

TL;DR: free for personal and community use (the AGPL requires sharing improvements). For closed-source or enterprise use, obtain a commercial license.
Commercial contact: YOUR_NAME — YOUR_EMAIL
## 🧪 CI
GitHub Actions (`.github/workflows/ci.yml`) builds on Windows and Linux and includes a small smoke test for the CLI.
## 🧩 Extensions (ideas)
- Policy pack: map σ/STATE to temperature/top_p/prompt adjustments to close the loop over the LLM (see the sketch below).
- Memory kit: `journal.md` + `facts.json` + digest injection for session continuity.
- Evaluations: batch experiments with and without the policy; report metric deltas.
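
As a starting point for the policy pack, the sketch below maps the current STATE to Ollama sampling options via Ollama's standard `/api/generate` endpoint. The mapping values, the `POLICY` table, and how the state is obtained are illustrative assumptions, not the project's built-in policy.

```python
# Hypothetical policy hook: pick decoding options from the current state and
# pass them to Ollama's /api/generate endpoint (non-streaming for brevity).
import requests

# Illustrative mapping; tune against your own experiments.
POLICY = {
    "STABLE":     {"temperature": 0.8, "top_p": 0.95},
    "WANDERING":  {"temperature": 0.6, "top_p": 0.9},
    "DRIFTING":   {"temperature": 0.4, "top_p": 0.8},
    "CONFUSED":   {"temperature": 0.2, "top_p": 0.7},
    "REALIGNING": {"temperature": 0.3, "top_p": 0.8},
}


def generate(prompt: str, state: str, model: str = "llama3.1:8b") -> str:
    """Send one generation request with state-dependent sampling options."""
    options = POLICY.get(state, POLICY["STABLE"])
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "options": options, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]


# Example: lower the temperature and re-anchor the prompt when the bridge reports drift.
print(generate("Stay on topic: explain resonance as breathing of a system.", "DRIFTING"))
```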
## 🛟 Troubleshooting
- Ollama returns 404 → use the exact tag shown by `ollama list` (e.g., `llama3.1:8b`).
- `cmake` not found → install CMake and reopen the terminal.
- GUI “stops” → use the provided scripts (they loop), or increase `CYCLES`.
- PowerShell quoting errors → the provided scripts already use robust quoting.
## 🤝 Contributing
PRs welcome! See `CONTRIBUTING.md` and `CODE_OF_CONDUCT.md`.
