This Docker Compose setup provides an environment with the following tools:
- n8n: Workflow automation platform (available at http://localhost:5679)
- Ollama: AI model serving platform (available at http://localhost:11435)
- OpenWebUI: User interface for LLMs (available at http://localhost:3001)
- Langfuse: LLM observability platform (available at http://localhost:3002)
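Since the host ports above differ from each image's usual defaults, the compose file presumably remaps them. A hypothetical sketch of what those port mappings might look like (image names and container-side ports are assumptions based on the images' documented defaults, not taken from this repository's actual `docker-compose.yml`):

```yaml
# Hypothetical port-mapping sketch; container-side ports assume image defaults.
services:
  n8n:
    image: n8nio/n8n
    ports:
      - "5679:5678"    # host 5679 -> n8n default 5678
  ollama:
    image: ollama/ollama
    ports:
      - "11435:11434"  # host 11435 -> Ollama default 11434
  open-webui:
    image: ghcr.io/open-webui/open-webui
    ports:
      - "3001:8080"    # host 3001 -> OpenWebUI default 8080
  langfuse:
    image: langfuse/langfuse
    ports:
      - "3002:3000"    # host 3002 -> Langfuse default 3000
```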
- Make sure you have Docker and Docker Compose installed on your system.
- Start the services:

```sh
docker compose up -d
```
- n8n: http://localhost:5679
- Ollama API: http://localhost:11435
- OpenWebUI: http://localhost:3001
- Langfuse: http://localhost:3002
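Once the stack is running, a quick way to confirm each service is reachable is to probe the endpoints with `curl`. This is only a sketch and assumes the containers are up and the ports are mapped as listed above:

```sh
# Print the HTTP status code returned by each service endpoint.
for url in http://localhost:5679 http://localhost:11435 http://localhost:3001 http://localhost:3002; do
  code=$(curl -s -o /dev/null -w '%{http_code}' "$url")
  echo "$url -> HTTP $code"
done
```

A `000` code usually means the container is not up yet or the port mapping differs from the one assumed here.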
All data is persisted using Docker volumes:
- n8n_data: Stores n8n workflows and data
- ollama_data: Stores Ollama models and configurations
- langfuse_db_data: Stores Langfuse observability data
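Because the data lives in named volumes, it can be backed up independently of the containers. A hedged sketch using a throwaway Alpine container (volume name taken from the list above; the compose project prefix on the volume name may vary on your system):

```sh
# Archive the contents of the n8n_data volume into a tarball
# in the current directory, mounting the volume read-only.
docker run --rm \
  -v n8n_data:/data:ro \
  -v "$PWD":/backup \
  alpine tar czf /backup/n8n_data.tgz -C /data .
```

The same pattern works for `ollama_data` and `langfuse_db_data` by swapping the volume name.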
- Start the services and access Langfuse at http://localhost:3002
- Create your first account (this will be the admin account)
- Create a new project in Langfuse
- Get your API keys from the project settings
- Update the `.env` file with your Langfuse API keys:

```sh
LANGFUSE_PUBLIC_KEY=pk-lf-your-public-key
LANGFUSE_SECRET_KEY=sk-lf-your-secret-key
```
- Restart the services to apply the new API keys
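Restarting so the new keys are picked up can be done by recreating the containers. A minimal sketch, assuming the `.env` file sits next to the compose file:

```sh
# Recreate the containers so the updated .env values take effect.
docker compose up -d --force-recreate
```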
To stop all services:

```sh
docker compose down
```

To stop and remove all data (volumes):

```sh
docker compose down -v
```