Follow this guide to learn how to quickly set up Plano and integrate it into your generative AI applications. You can:
- :ref:`Use Plano as a model proxy (Gateway) <llm_routing_quickstart>` to standardize access to multiple LLM providers.
- :ref:`Build agents <quickstart_agents>` for multi-step workflows (e.g., travel assistants with flights and hotels).
- :ref:`Call deterministic APIs via prompt targets <quickstart_prompt_targets>` to turn instructions directly into function calls.
.. note::
   This quickstart assumes basic familiarity with agents and prompt targets from the Concepts section. For background, see :ref:`Agents <agents>` and :ref:`Prompt Target <prompt_target>`.
Plano's CLI allows you to manage and interact with Plano efficiently. To install it:

.. code-block:: console

   $ pip install planoai==0.4.1
.. _llm_routing_quickstart:

Use Plano as a Model Proxy (Gateway)
------------------------------------

Step 1. Create plano config file
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Plano operates based on a configuration file where you can define LLM providers, prompt targets, guardrails, etc. Below is an example configuration that defines OpenAI and Anthropic LLM providers.

Create a ``plano_config.yaml`` file with the following content:

.. code-block:: yaml

   version: v0.3.0

   listeners:
     - type: model
       name: model_1
       address: 0.0.0.0
       port: 12000

   model_providers:
     - access_key: $OPENAI_API_KEY
       model: openai/gpt-4o
       default: true

     - access_key: $ANTHROPIC_API_KEY
       model: anthropic/claude-sonnet-4-5
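The ``access_key`` values use ``$VAR`` references that are substituted from environment variables. A minimal, illustrative sketch of this kind of substitution using Python's ``os.path.expandvars`` (Plano's own resolution logic may differ):

```python
import os

# Stand-in value for the demo; in practice the variable is set in your shell or .env file.
os.environ["OPENAI_API_KEY"] = "sk-test-123"
os.environ.pop("ANTHROPIC_API_KEY", None)  # unset variables stay unresolved

providers = [
    {"access_key": "$OPENAI_API_KEY", "model": "openai/gpt-4o", "default": True},
    {"access_key": "$ANTHROPIC_API_KEY", "model": "anthropic/claude-sonnet-4-5"},
]

# Resolve each $VAR-style access key against the environment.
resolved = [{**p, "access_key": os.path.expandvars(p["access_key"])} for p in providers]
print(resolved[0]["access_key"])  # sk-test-123
```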
Step 2. Start plano
~~~~~~~~~~~~~~~~~~~

Once the config file is created, ensure that you have environment variables set up for ``ANTHROPIC_API_KEY`` and ``OPENAI_API_KEY`` (or these are defined in a ``.env`` file).

Start Plano:

.. code-block:: console

   $ planoai up plano_config.yaml
   # Or if installed with uv tool: uvx planoai up plano_config.yaml

   2024-12-05 11:24:51,288 - planoai.main - INFO - Starting plano cli version: 0.4.1
   2024-12-05 11:24:51,825 - planoai.utils - INFO - Schema validation successful!
   2024-12-05 11:24:51,825 - planoai.main - INFO - Starting plano
   ...
   2024-12-05 11:25:16,131 - planoai.core - INFO - Container is healthy!
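Scripts that drive Plano right after ``planoai up`` may want to wait until the listener from the config (port 12000) actually accepts TCP connections. A minimal standard-library sketch (the helper name is ours, not part of the Plano CLI):

```python
import socket
import time

def wait_for_listener(host: str, port: int, timeout_s: float = 30.0) -> bool:
    """Poll host:port until a TCP connection succeeds or timeout_s elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            # Success means something is accepting connections on that port.
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(0.5)
    return False
```

For example, call ``wait_for_listener("127.0.0.1", 12000)`` before sending the first request.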

Step 3: Interact with LLM
~~~~~~~~~~~~~~~~~~~~~~~~~

Step 3.1: Using curl
^^^^^^^^^^^^^^^^^^^^

Make a request with ``curl`` (here ``"model": "none"`` lets Plano pick the default model):

.. code-block:: console

   $ curl --header 'Content-Type: application/json' \
     --data '{"messages": [{"role": "user","content": "What is the capital of France?"}], "model": "none"}' \
     http://localhost:12000/v1/chat/completions

   {
     ...
     "model": "gpt-4o-2024-08-06",
     "choices": [
       {
         ...
         "message": {
           "role": "assistant",
           "content": "The capital of France is Paris."
         }
       }
     ]
   }

.. note::

   When the requested model is not found in the configuration, Plano selects an available model from the configured providers. In this example, we pass ``"model": "none"`` and Plano selects the default model, ``openai/gpt-4o``.
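To route a request to a specific provider rather than the default, set the ``model`` field to one of the model names from ``plano_config.yaml`` (assuming Plano matches this field against the configured ``model`` values). A sketch of such a request body:

```python
import json

# Target the Anthropic provider configured in plano_config.yaml.
payload = {
    "model": "anthropic/claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "What is the capital of France?"}],
}

# This JSON body can be POSTed to http://localhost:12000/v1/chat/completions.
body = json.dumps(payload)
print(body)
```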

Step 3.2: Using OpenAI Python client
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Make outbound calls via the Plano gateway:

.. code-block:: python

   from openai import OpenAI

   # Use the OpenAI client as usual
   client = OpenAI(
       # No real OpenAI API key is needed since keys are configured in Plano's gateway
       api_key="--",
       # Point the OpenAI API base URL at the Plano gateway endpoint
       base_url="http://127.0.0.1:12000/v1",
   )

   response = client.chat.completions.create(
       # The model is selected from the plano_config file
       model="--",
       messages=[{"role": "user", "content": "What is the capital of France?"}],
   )

   print(response.choices[0].message.content)
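Because the gateway returns responses in the standard OpenAI chat-completions shape, extracting the answer is uniform across providers. A sketch using a response-shaped dict that mimics the JSON from Step 3.1 (not a live call):

```python
# Response in the OpenAI chat-completions shape returned by the gateway.
response_json = {
    "model": "gpt-4o-2024-08-06",
    "choices": [
        {"message": {"role": "assistant", "content": "The capital of France is Paris."}}
    ],
}

# The assistant's reply lives at choices[0].message.content.
answer = response_json["choices"][0]["message"]["content"]
print(answer)  # The capital of France is Paris.
```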

And to get the list of supported currencies:

"Here is a list of the currencies that are supported for conversion from USD, along with their symbols:\n\n1. AUD - Australian Dollar\n2. BGN - Bulgarian Lev\n3. BRL - Brazilian Real\n4. CAD - Canadian Dollar\n5. CHF - Swiss Franc\n6. CNY - Chinese Renminbi Yuan\n7. CZK - Czech Koruna\n8. DKK - Danish Krone\n9. EUR - Euro\n10. GBP - British Pound\n11. HKD - Hong Kong Dollar\n12. HUF - Hungarian Forint\n13. IDR - Indonesian Rupiah\n14. ILS - Israeli New Sheqel\n15. INR - Indian Rupee\n16. ISK - Icelandic Króna\n17. JPY - Japanese Yen\n18. KRW - South Korean Won\n19. MXN - Mexican Peso\n20. MYR - Malaysian Ringgit\n21. NOK - Norwegian Krone\n22. NZD - New Zealand Dollar\n23. PHP - Philippine Peso\n24. PLN - Polish Złoty\n25. RON - Romanian Leu\n26. SEK - Swedish Krona\n27. SGD - Singapore Dollar\n28. THB - Thai Baht\n29. TRY - Turkish Lira\n30. USD - United States Dollar\n31. ZAR - South African Rand\n\nIf you want to convert USD to any of these currencies, you can select the one you are interested in."