src/oss/langgraph/local-server.mdx
```
> - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024
```

For an Agent Server running on a custom host/port, update the `baseUrl` query parameter in the URL. For example, if your server is running on `http://myhost:3000`:

```
https://smith.langchain.com/studio/?baseUrl=http://myhost:3000
```
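As a sketch of what that parameter carries, here is a small Python helper (hypothetical, not part of any SDK) that builds the Studio link for an arbitrary server address, percent-encoding the value as a browser would:

```python
from urllib.parse import urlencode

def studio_url(base_url: str) -> str:
    """Build a LangGraph Studio link pointing at a server running at base_url."""
    # urlencode percent-encodes the value; Studio accepts the encoded form as well.
    return "https://smith.langchain.com/studio/?" + urlencode({"baseUrl": base_url})

print(studio_url("http://myhost:3000"))
```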

<Accordion title="Safari compatibility">
Use the `--tunnel` flag with your command to create a secure tunnel, as Safari has limitations when connecting to localhost servers:

```shell
langgraph dev --tunnel
```
</Accordion>

## 7. Test the API
<Tabs>
<Tab title="Python SDK (async)">
1. Install the LangGraph Python SDK:
```shell
pip install langgraph-sdk
```
2. Send a message to the assistant (threadless run):
```python
from langgraph_sdk import get_client
import asyncio

client = get_client(url="http://localhost:2024")

async def main():
    async for chunk in client.runs.stream(
        None,  # Threadless run
        "agent",  # Name of assistant. Defined in langgraph.json.
        input={
            "messages": [{
                "role": "human",
                "content": "What is LangGraph?",
            }],
        },
    ):
        print(f"Receiving new event of type: {chunk.event}...")
        print(chunk.data)
        print("\n\n")

asyncio.run(main())
```
</Tab>
<Tab title="Python SDK (sync)">
1. Install the LangGraph Python SDK:
```shell
pip install langgraph-sdk
```
2. Send a message to the assistant (threadless run):
```python
from langgraph_sdk import get_sync_client

client = get_sync_client(url="http://localhost:2024")

for chunk in client.runs.stream(
    None,  # Threadless run
    "agent",  # Name of assistant. Defined in langgraph.json.
    input={
        "messages": [{
            "role": "human",
            "content": "What is LangGraph?",
        }],
    },
    stream_mode="messages-tuple",
):
    print(f"Receiving new event of type: {chunk.event}...")
    print(chunk.data)
    print("\n\n")
```
</Tab>
<Tab title="REST API">
```bash
curl -s --request POST \
    --url "http://localhost:2024/runs/stream" \
    --header 'Content-Type: application/json' \
    --data "{
        \"assistant_id\": \"agent\",
        \"input\": {
            \"messages\": [
                {
                    \"role\": \"human\",
                    \"content\": \"What is LangGraph?\"
                }
            ]
        },
        \"stream_mode\": \"messages-tuple\"
    }"
```
</Tab>
</Tabs>
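Whichever client you use, the wire payload is the same JSON document. A minimal sketch of building it with the standard library (the `stream_run_body` helper is hypothetical, for illustration only):

```python
import json

def stream_run_body(assistant_id: str, text: str) -> str:
    """Build the JSON body for a threadless streaming run."""
    return json.dumps({
        "assistant_id": assistant_id,  # Name of assistant. Defined in langgraph.json.
        "input": {"messages": [{"role": "human", "content": text}]},
        "stream_mode": "messages-tuple",
    })

print(stream_run_body("agent", "What is LangGraph?"))
```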
<Tabs>
<Tab title="JavaScript SDK">
1. Install the LangGraph JS SDK:
```shell
npm install @langchain/langgraph-sdk
```
2. Send a message to the assistant (threadless run):
```js
const { Client } = await import("@langchain/langgraph-sdk");

// Only set apiUrl if you changed the default port when calling langgraph dev.
const client = new Client({ apiUrl: "http://localhost:2024" });

const streamResponse = client.runs.stream(
    null, // Threadless run
    "agent", // Name of assistant. Defined in langgraph.json.
    {
        input: {
            messages: [{ role: "human", content: "What is LangGraph?" }],
        },
        streamMode: "messages-tuple",
    }
);

for await (const chunk of streamResponse) {
    console.log(`Receiving new event of type: ${chunk.event}...`);
    console.log(JSON.stringify(chunk.data));
    console.log("\n\n");
}
```
</Tab>
</Tabs>