An n8n community node that integrates OpenAI-compatible chat models with MLflow Tracing, giving you full observability into every LLM call directly from your n8n workflows.
npm package: https://www.npmjs.com/package/n8n-nodes-mlflow
- Support for OpenAI-compatible chat models (e.g., `gpt-4.1-mini`, `gpt-4o`, and any LiteLLM/LocalAI endpoint)
- Automatic MLflow tracing for every LLM request and response
- Chat session grouping — traces are linked by `sessionId` for multi-turn conversation tracking
- Custom metadata injection: `sessionId`, `userId`, and arbitrary structured JSON
- Compatible with self-hosted MLflow and Databricks-hosted MLflow
n8n is a fair-code licensed workflow automation platform.
- Installation
- Credentials
- Operations
- Compatibility
- Resources
- Version history
Follow the installation guide in the official n8n documentation for community nodes.
For n8n v0.187+, install directly from the UI:
- Go to Settings → Community Nodes
- Click Install
- Enter `n8n-nodes-mlflow` in the npm package name field
- Agree to the risks of using community nodes
- Select Install
A preconfigured Docker setup is available in the docker/ directory:
- Build the Docker image
```bash
# Run from the repository root (the -f flag points at docker/Dockerfile)
docker build --progress=plain -f docker/Dockerfile -t n8n-nodes-mlflow .
```
- Run the container
```bash
docker run -it --rm \
  --name n8n-mlflow \
  -p 5678:5678 \
  -e GENERIC_TIMEZONE="America/New_York" \
  -e TZ="America/New_York" \
  -e N8N_ENFORCE_SETTINGS_FILE_PERMISSIONS=true \
  -e N8N_RUNNERS_ENABLED=true \
  -v n8n_data:/home/node/.n8n \
  n8n-nodes-mlflow:latest
```
You can now access n8n at http://localhost:5678
```bash
# Go to your n8n installation directory
cd ~/.n8n

# Install the node
npm install n8n-nodes-mlflow

# Restart n8n to apply the node
n8n start
```

This credential authenticates your OpenAI-compatible LLM endpoint and configures where MLflow traces are sent.
| Field | Required | Description | Example |
|---|---|---|---|
| OpenAI API Key | Yes | API key for your OpenAI-compatible endpoint | sk-abc123... |
| OpenAI Base URL | No | Override the default endpoint base URL | https://api.openai.com/v1 |
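To illustrate how the Base URL override works: requests are sent to `<base URL>/chat/completions` with the API key passed as a Bearer token. The sketch below is a hypothetical helper, not the node's actual implementation — it only shows the request shape that any OpenAI-compatible endpoint (LiteLLM, LocalAI, etc.) must accept.

```typescript
// Hypothetical helper illustrating how an OpenAI-compatible request is formed.
// Not part of this node's public API — shown only to clarify the Base URL field.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  apiKey: string,
  model: string,
  messages: ChatMessage[],
  baseUrl: string = "https://api.openai.com/v1", // used when Base URL is left empty
) {
  return {
    // Trailing slashes are stripped so both ".../v1" and ".../v1/" work.
    url: `${baseUrl.replace(/\/+$/, "")}/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages }),
  };
}

const req = buildChatRequest("sk-abc123", "gpt-4o", [
  { role: "user", content: "Hello" },
]);
console.log(req.url); // https://api.openai.com/v1/chat/completions
```

Pointing the Base URL at a self-hosted gateway (for example a local LiteLLM proxy) is all that is needed to route the same request to a different model backend.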
| Field | Required | Description | Example |
|---|---|---|---|
| MLflow Tracking URI | Yes | Your MLflow tracking server URI | http://localhost:5000 or databricks |
| MLflow Experiment ID | Yes | Experiment ID where traces will be logged | 1234567890 |
Databricks users: Set the Tracking URI to `databricks` and provide your Databricks personal access token as the MLflow Tracking Token.
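For example, a Databricks-backed credential might look like this (the experiment ID is a placeholder; use the ID of the experiment you want traces logged to):

```text
MLflow Tracking URI:    databricks
MLflow Tracking Token:  <your Databricks personal access token>
MLflow Experiment ID:   1234567890
```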
The node exposes MLflow tracing fields that attach structured context to every trace span logged to your MLflow experiment.
| Field | Type | Description |
|---|---|---|
| `sessionId` | string | Groups related traces into a single chat session (`mlflow.trace.session`) |
| `userId` | string | Identifies the end user making the request (`mlflow.trace.user`) |
| `customMetadata` | object | Arbitrary JSON attached to the trace as span metadata |
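As a sketch of how these fields map onto MLflow metadata keys — `buildTraceMetadata` is a hypothetical helper, not this node's code; only the `mlflow.trace.session` and `mlflow.trace.user` keys come from the table above:

```typescript
// Hypothetical sketch of assembling the metadata attached to each trace.
interface TraceContext {
  sessionId?: string;
  userId?: string;
  customMetadata?: Record<string, unknown>;
}

function buildTraceMetadata(ctx: TraceContext): Record<string, string> {
  const meta: Record<string, string> = {};
  // Reserved MLflow keys drive session grouping and user attribution.
  if (ctx.sessionId) meta["mlflow.trace.session"] = ctx.sessionId;
  if (ctx.userId) meta["mlflow.trace.user"] = ctx.userId;
  // Custom metadata values are serialized so everything is stored as strings.
  for (const [key, value] of Object.entries(ctx.customMetadata ?? {})) {
    meta[key] = typeof value === "string" ? value : JSON.stringify(value);
  }
  return meta;
}

const meta = buildTraceMetadata({
  sessionId: "chat-123",
  userId: "alice",
  customMetadata: { plan: "pro" },
});
// meta["mlflow.trace.session"] === "chat-123"
```

Because every trace in one conversation shares the same `mlflow.trace.session` value, the MLflow UI can group them into a single session view.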
| Field | Example Value |
|---|---|
| Session ID | `{{$json.sessionId}}` |
| User ID | `test-user` |
| Custom Metadata (JSON) | Any JSON value |
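n8n's real expression engine is far richer than this, but the simplified sketch below shows the idea behind the examples above: an expression like `{{$json.sessionId}}` pulls a value from the incoming item's JSON, while a plain literal like `test-user` passes through unchanged.

```typescript
// Simplified illustration of n8n expression resolution — NOT n8n's actual
// engine. It handles only the single pattern {{$json.<key>}} from the table.
function resolveExpression(
  expr: string,
  json: Record<string, unknown>,
): unknown {
  const match = expr.match(/^\{\{\s*\$json\.(\w+)\s*\}\}$/);
  return match ? json[match[1]] : expr; // literals like "test-user" pass through
}

const item = { sessionId: "chat-123", userId: "alice" };
resolveExpression("{{$json.sessionId}}", item); // → "chat-123"
resolveExpression("test-user", item); // → "test-user"
```

Feeding a per-conversation value (such as the chat trigger's session ID) into the Session ID field is what lets MLflow group all of a conversation's traces together.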
- Node Configuration UI: Sample n8n workflow using the OpenAI MLflow node.
- Workflow Setup: A typical workflow using this node.
- n8n Chat: An example chat interaction in n8n.
- MLflow Trace: The resulting trace logged in MLflow.
- MLflow Session Grouping: Traces grouped by chat session in MLflow.
- Requires n8n version 1.0.0 or later
- Requires Node.js >= 20.15
- Compatible with:
  - OpenAI official API (`https://api.openai.com`)
  - Any OpenAI-compatible LLM (e.g. via LiteLLM, LocalAI, Azure OpenAI)
  - Databricks-hosted MLflow and self-hosted MLflow instances
- v0.1.0 — Initial release with OpenAI + MLflow tracing integration, session grouping, and custom metadata support
Apache 2.0 © 2025





