Commit 8f150ec

refactor: improve MLflow provider code based on PR feedback
- refactor: simplify MLflow provider documentation per upstream feedback
- refactor: improve import handling with module-level imports
- refactor: use proper typing instead of `type: ignore` for the mlflow client
- refactor: use `.split()` instead of the magic number `[5:]` for clarity
- chore: remove manual test script per upstream request

Changes:
- Move mlflow imports to module level with graceful fallback
- Replace `type: ignore` with proper typing
- Remove 7 duplicate imports throughout methods
- Use `.split()` for better readability
- Simplify documentation to essential information only

Signed-off-by: William Caban <[email protected]>
1 parent f2c6dd8 commit 8f150ec
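The first "Changes" item, moving mlflow imports to module level with a graceful fallback, is a common pattern for optional dependencies. A minimal sketch of that pattern, assuming nothing about the provider's actual code (the `MLFLOW_AVAILABLE` flag and `require_mlflow` helper are illustrative names):

```python
# Optional-dependency import at module level with graceful fallback.
# MLFLOW_AVAILABLE and require_mlflow are illustrative, not the provider's names.
try:
    import mlflow  # optional; only needed when the MLflow provider is used

    MLFLOW_AVAILABLE = True
except ImportError:
    mlflow = None
    MLFLOW_AVAILABLE = False


def require_mlflow() -> None:
    """Fail fast with a helpful message if the optional dependency is missing."""
    if not MLFLOW_AVAILABLE:
        raise RuntimeError(
            "mlflow is not installed; install it with: pip install 'mlflow>=3.4.0'"
        )
```

Importing once at module level also removes the duplicate per-method imports the commit message mentions.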

File tree

4 files changed: +16 −359 lines changed


docs/docs/providers/prompts/remote_mlflow.mdx

Lines changed: 1 addition & 98 deletions
````diff
@@ -3,104 +3,7 @@ description: |
   [MLflow](https://mlflow.org/) is a remote provider for centralized prompt management and versioning
   using MLflow's Prompt Registry (available in MLflow 3.4+). It allows you to store, version, and manage
   prompts in a centralized MLflow server, enabling team collaboration and prompt lifecycle management.
-
-  ## Features
-  MLflow Prompts Provider supports:
-  - Create and store prompts with automatic versioning
-  - Retrieve prompts by ID and version
-  - Update prompts (creates new immutable versions)
-  - List all prompts or all versions of a specific prompt
-  - Set default version for a prompt
-  - Automatic variable extraction from templates
-  - Metadata storage and retrieval
-  - Centralized prompt management across teams
-
-  ## Key Capabilities
-  - **Version Control**: Immutable versioning ensures prompt history is preserved
-  - **Default Version Management**: Easily switch between prompt versions
-  - **Variable Auto-Extraction**: Automatically detects `{{ variable }}` placeholders
-  - **Metadata Tags**: Stores Llama Stack metadata for seamless integration
-  - **Team Collaboration**: Centralized MLflow server enables multi-user access
-
-  ## Usage
-
-  To use MLflow Prompts Provider in your Llama Stack project:
-
-  1. Install MLflow 3.4 or later
-  2. Start an MLflow server (local or remote)
-  3. Configure your Llama Stack project to use the MLflow provider
-  4. Start creating and managing prompts
-
-  ## Installation
-
-  Install MLflow using pip or uv:
-
-  ```bash
-  pip install 'mlflow>=3.4.0'
-  # or
-  uv pip install 'mlflow>=3.4.0'
-  ```
-
-  ## Quick Start
-
-  ### 1. Start MLflow Server
-
-  **Local server** (for development):
-  ```bash
-  mlflow server --host 127.0.0.1 --port 5555
-  ```
-
-  **Remote server** (for production):
-  ```bash
-  mlflow server --host 0.0.0.0 --port 5000 --backend-store-uri postgresql://user:pass@host/db
-  ```
-
-  ### 2. Configure Llama Stack
-
-  Add to your Llama Stack configuration:
-
-  ```yaml
-  prompts:
-    - provider_id: mlflow-prompts
-      provider_type: remote::mlflow
-      config:
-        mlflow_tracking_uri: http://localhost:5555
-        experiment_name: llama-stack-prompts
-  ```
-
-  ### 3. Use the Prompts API
-
-  ```python
-  from llama_stack_client import LlamaStackClient
-
-  client = LlamaStackClient(base_url="http://localhost:5000")
-
-  # Create a prompt
-  prompt = client.prompts.create(
-      prompt="Summarize the following text in {{ num_sentences }} sentences:\n\n{{ text }}",
-      variables=["num_sentences", "text"]
-  )
-  print(f"Created prompt: {prompt.prompt_id} (v{prompt.version})")
-
-  # Retrieve prompt
-  retrieved = client.prompts.get(prompt_id=prompt.prompt_id)
-  print(f"Retrieved: {retrieved.prompt}")
-
-  # Update prompt (creates version 2)
-  updated = client.prompts.update(
-      prompt_id=prompt.prompt_id,
-      prompt="Summarize in exactly {{ num_sentences }} sentences:\n\n{{ text }}",
-      version=1,
-      set_as_default=True
-  )
-  print(f"Updated to version: {updated.version}")
-
-  # List all prompts
-  prompts = client.prompts.list()
-  print(f"Found {len(prompts.data)} prompts")
-  ```
-
-  ## Documentation
+
   See [MLflow's documentation](https://mlflow.org/docs/latest/prompts.html) for more details about MLflow Prompt Registry.

 sidebar_label: Remote - MLflow
````
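The removed docs mentioned automatic extraction of `{{ variable }}` placeholders from templates. A sketch of how such extraction could work, as a simple regex helper (this is an illustration of the feature, not the provider's implementation):

```python
import re


def extract_variables(template: str) -> list[str]:
    """Collect {{ variable }} placeholder names in order of first appearance."""
    seen: list[str] = []
    for name in re.findall(r"\{\{\s*(\w+)\s*\}\}", template):
        if name not in seen:  # deduplicate, keeping first occurrence
            seen.append(name)
    return seen
```

Applied to the template from the removed Quick Start, `"Summarize the following text in {{ num_sentences }} sentences:\n\n{{ text }}"` would yield `["num_sentences", "text"]`, matching the `variables` list passed explicitly in that example.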

scripts/test_mlflow_prompts_manual.py

Lines changed: 0 additions & 239 deletions
This file was deleted.

src/llama_stack/providers/remote/prompts/mlflow/mapping.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -60,7 +60,7 @@ def to_mlflow_name(self, prompt_id: str) -> str:
             raise ValueError(f"Invalid prompt_id format: {prompt_id}. Expected format: pmpt_<48-hex-chars>")

         # Extract hex part (after "pmpt_" prefix)
-        hex_part = prompt_id[5:]
+        hex_part = prompt_id.split("pmpt_")[1]

         # Create MLflow name
         return f"{self.MLFLOW_NAME_PREFIX}{hex_part}"
```
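The change above swaps an index-based slice for `.split()`, so the `pmpt_` prefix appears literally in the code instead of the magic number 5. A standalone sketch of the surrounding mapping logic (the `MLFLOW_NAME_PREFIX` value and the validation regex are assumptions inferred from the visible error message, not the provider's actual constants):

```python
import re

# Assumed prefix for illustration; the real constant lives on the mapper class.
MLFLOW_NAME_PREFIX = "llama_stack_"


def to_mlflow_name(prompt_id: str) -> str:
    """Map a Llama Stack prompt_id (pmpt_<48-hex-chars>) to an MLflow prompt name."""
    if not re.fullmatch(r"pmpt_[0-9a-f]{48}", prompt_id):
        raise ValueError(f"Invalid prompt_id format: {prompt_id}. Expected format: pmpt_<48-hex-chars>")

    # Extract hex part (after "pmpt_" prefix); split() names the prefix explicitly
    hex_part = prompt_id.split("pmpt_")[1]

    return f"{MLFLOW_NAME_PREFIX}{hex_part}"
```

On Python 3.9+, `prompt_id.removeprefix("pmpt_")` would be another readable alternative with the same effect for valid ids.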
