diff --git a/src/oss/langgraph/local-server.mdx b/src/oss/langgraph/local-server.mdx
index f9836d4570..d2a0432be9 100644
--- a/src/oss/langgraph/local-server.mdx
+++ b/src/oss/langgraph/local-server.mdx
@@ -129,14 +129,18 @@ The `langgraph dev` command starts Agent Server in an in-memory mode. This mode
 > - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024
 ```
 
-For an Agent Server running on a custom host/port, update the baseURL parameter.
+For an Agent Server running on a custom host/port, update the `baseUrl` query parameter in the URL. For example, if your server is running on `http://myhost:3000`:
+
+```
+https://smith.langchain.com/studio/?baseUrl=http://myhost:3000
+```
 
-    Use the `--tunnel` flag with your command to create a secure tunnel, as Safari has limitations when connecting to localhost servers:
+    Use the `--tunnel` flag with your command to create a secure tunnel, as Safari has limitations when connecting to localhost servers:
 
-    ```shell
-    langgraph dev --tunnel
-    ```
+    ```shell
+    langgraph dev --tunnel
+    ```
 
 ## 7. Test the API
 
@@ -145,60 +149,60 @@ For an Agent Server running on a custom host/port, update the baseURL parameter.
 1. Install the LangGraph Python SDK:
 
-    ```shell
-    pip install langgraph-sdk
-    ```
+    ```shell
+    pip install langgraph-sdk
+    ```
 
 2. Send a message to the assistant (threadless run):
 
-    ```python
-    from langgraph_sdk import get_client
-    import asyncio
-
-    client = get_client(url="http://localhost:2024")
-
-    async def main():
-        async for chunk in client.runs.stream(
-            None,  # Threadless run
-            "agent", # Name of assistant. Defined in langgraph.json.
-            input={
-                "messages": [{
-                    "role": "human",
-                    "content": "What is LangGraph?",
-                }],
-            },
-        ):
-            print(f"Receiving new event of type: {chunk.event}...")
-            print(chunk.data)
-            print("\n\n")
-
-    asyncio.run(main())
-    ```
+    ```python
+    from langgraph_sdk import get_client
+    import asyncio
+
+    client = get_client(url="http://localhost:2024")
+
+    async def main():
+        async for chunk in client.runs.stream(
+            None,  # Threadless run
+            "agent", # Name of assistant. Defined in langgraph.json.
+            input={
+                "messages": [{
+                    "role": "human",
+                    "content": "What is LangGraph?",
+                }],
+            },
+        ):
+            print(f"Receiving new event of type: {chunk.event}...")
+            print(chunk.data)
+            print("\n\n")
+
+    asyncio.run(main())
+    ```
 
 1. Install the LangGraph Python SDK:
 
-    ```shell
-    pip install langgraph-sdk
-    ```
+    ```shell
+    pip install langgraph-sdk
+    ```
 
 2. Send a message to the assistant (threadless run):
 
-    ```python
-    from langgraph_sdk import get_sync_client
-
-    client = get_sync_client(url="http://localhost:2024")
-
-    for chunk in client.runs.stream(
-        None,  # Threadless run
-        "agent", # Name of assistant. Defined in langgraph.json.
-        input={
-            "messages": [{
-                "role": "human",
-                "content": "What is LangGraph?",
-            }],
-        },
-        stream_mode="messages-tuple",
-    ):
-        print(f"Receiving new event of type: {chunk.event}...")
-        print(chunk.data)
-        print("\n\n")
-    ```
+    ```python
+    from langgraph_sdk import get_sync_client
+
+    client = get_sync_client(url="http://localhost:2024")
+
+    for chunk in client.runs.stream(
+        None,  # Threadless run
+        "agent", # Name of assistant. Defined in langgraph.json.
+        input={
+            "messages": [{
+                "role": "human",
+                "content": "What is LangGraph?",
+            }],
+        },
+        stream_mode="messages-tuple",
+    ):
+        print(f"Receiving new event of type: {chunk.event}...")
+        print(chunk.data)
+        print("\n\n")
+    ```
 
 ```bash
@@ -226,9 +230,9 @@ For an Agent Server running on a custom host/port, update the baseURL parameter.
 1. Install the LangGraph JS SDK:
 
-    ```shell
-    npm install @langchain/langgraph-sdk
-    ```
+    ```shell
+    npm install @langchain/langgraph-sdk
+    ```
 
 2. Send a message to the assistant (threadless run):
 
     ```js
     const { Client } = await import("@langchain/langgraph-sdk");
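
Note for reviewers: the last hunk cuts off after the first line of the JS example, so the rest of that snippet is not part of this patch. For reference, the JS call mirrors the Python examples above; a minimal sketch, assuming the standard JS SDK shape (`new Client({ apiUrl })`, `client.runs.stream(threadId, assistantId, payload)`) and the dev server from step 6 running on port 2024:

```js
const { Client } = await import("@langchain/langgraph-sdk");

const client = new Client({ apiUrl: "http://localhost:2024" });

// Threadless run: pass null in place of a thread ID, plus the assistant
// name from langgraph.json and the same input as the Python examples.
const streamResponse = client.runs.stream(
  null,    // Threadless run
  "agent", // Name of assistant. Defined in langgraph.json.
  {
    input: {
      messages: [{ role: "human", content: "What is LangGraph?" }],
    },
    streamMode: "messages-tuple",
  },
);

for await (const chunk of streamResponse) {
  console.log(`Receiving new event of type: ${chunk.event}...`);
  console.log(chunk.data);
  console.log("\n\n");
}
```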