
Commit 9be7ef6

Sync open source content 🐝 (from b90fd66c2b897ae8dc9bd1739f0529a607bcc5ba)
1 parent 08a487c commit 9be7ef6

27 files changed, +500 -292 lines changed

_meta.global.tsx

Lines changed: 13 additions & 5 deletions
@@ -44,13 +44,21 @@ const meta = {
     display: "children",
     items: {
       introduction: {
-        title: "Intro",
+        title: "Introduction",
       },
-      "gram-quickstart": {
-        title: "Quickstart",
+      "mcp-installation-guides": {
+        title: "MCP Installation Guides",
       },
-      guides: {
-        title: "Guides",
+      "getting-started": {
+        title: "Getting Started",
+        items: {
+          typescript: {
+            title: "Use TypeScript",
+          },
+          openapi: {
+            title: "Use an OpenAPI Spec",
+          },
+        },
       },
       concepts: {
         title: "Concepts",

docs/gram/api-clients/using-anthropic-api-with-gram-mcp-servers.mdx

Lines changed: 3 additions & 5 deletions
@@ -5,7 +5,7 @@ sidebar:
   order: 2
 ---

-Anthropic's [Messages API](https://docs.anthropic.com/en/api/messages) supports remote MCP servers through their MCP connector feature. This allows you to give Claude models direct access to your tools and infrastructure by connecting to [Gram-hosted MCP servers](/docs/gram/gram-quickstart).
+Anthropic's [Messages API](https://docs.anthropic.com/en/api/messages) supports remote MCP servers through their MCP connector feature. This allows you to give Claude models direct access to your tools and infrastructure by connecting to [Gram-hosted MCP servers](/docs/gram/quickstart).

 This guide will show you how to connect Anthropic's API to a Gram-hosted MCP server using an example [Push Advisor API](https://github.com/ritza-co/gram-examples/blob/main/push-advisor-api/openapi.yaml). You'll learn how to create an MCP server from an OpenAPI document, set up the connection, configure authentication, and use natural language to query the example API.

@@ -60,7 +60,6 @@ Click **Toolsets** in the sidebar to view the Push Advisor toolset.

 ![Screenshot of the Gram dashboard showing the Push Advisor toolset](/assets/docs/gram/img/guides/mcp-installing-ide/toolset-created.png)

-
 ### Publishing an MCP server

 Let's make the toolset available as an MCP server.

@@ -310,13 +309,13 @@ try:
         ],
         betas=["mcp-client-2025-04-04"]
     )
-
+
     # Check for MCP tool errors in the response
     for content in response.content:
         if hasattr(content, 'type') and content.type == 'mcp_tool_result':
             if hasattr(content, 'is_error') and content.is_error:
                 print(f"Tool error occurred: {content}")
-
+
     print(response.content[0].text)
 except Exception as error:
     print(f"API call failed: {error}")

@@ -384,7 +383,6 @@ Click **Connect** to establish a connection to your MCP server.

 Use the Inspector to verify that your MCP server responds correctly before integrating it with your Anthropic API calls.

-
 ## What's next

 You now have Anthropic's Claude models connected to your Gram-hosted MCP server, giving them access to your custom APIs and tools.
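
For readers skimming this diff: the guide above covers Anthropic's MCP connector beta, which is what the `betas=["mcp-client-2025-04-04"]` line in the hunk refers to. As a minimal sketch of that pattern (not part of this commit), assuming the `canipushtoprod` slug used elsewhere in these guides and an `ANTHROPIC_API_KEY` in the environment:

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Ask a question Claude can only answer by calling the MCP server's tools.
response = client.beta.messages.create(
    model="claude-sonnet-4-20250514",  # illustrative model choice
    max_tokens=1024,
    messages=[{"role": "user", "content": "Is it safe to deploy today?"}],
    mcp_servers=[
        {
            "type": "url",
            "url": "https://app.getgram.ai/mcp/canipushtoprod",  # placeholder slug
            "name": "gram-pushadvisor",
        }
    ],
    betas=["mcp-client-2025-04-04"],  # MCP connector beta flag seen in the diff
)

print(response.content[0].text)
```

For private Gram servers, each `mcp_servers` entry can also carry an `authorization_token` field.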

docs/gram/api-clients/using-langchain-with-gram-mcp-servers.mdx

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ sidebar:
   order: 4
 ---

-[LangChain](https://www.langchain.com/) and [LangGraph](https://www.langchain.com/langgraph) support MCP servers through the `langchain-mcp-adapters` library, which allows you to give your LangChain agents and LangGraph workflows direct access to your tools and infrastructure by connecting to [Gram-hosted MCP servers](/docs/gram/gram-quickstart).
+[LangChain](https://www.langchain.com/) and [LangGraph](https://www.langchain.com/langgraph) support MCP servers through the `langchain-mcp-adapters` library, which allows you to give your LangChain agents and LangGraph workflows direct access to your tools and infrastructure by connecting to [Gram-hosted MCP servers](/docs/gram/quickstart).

 This guide demonstrates how to connect LangChain to a Gram-hosted MCP server using an example [Push Advisor API](https://github.com/ritza-co/gram-examples/blob/main/push-advisor-api/openapi.yaml). You'll learn how to create an MCP server from an OpenAPI document, set up the connection, configure authentication, and use natural language to query the example API.
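
The single-line change above sits in the LangChain guide, which uses `langchain-mcp-adapters`. A rough sketch of the connection pattern that guide describes (not part of this commit; names follow langchain-mcp-adapters 0.1.x and langgraph's prebuilt agent, and the server slug is a placeholder):

```python
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    # Point the adapter at the Gram-hosted MCP server over streamable HTTP.
    client = MultiServerMCPClient(
        {
            "pushadvisor": {
                "url": "https://app.getgram.ai/mcp/canipushtoprod",  # placeholder slug
                "transport": "streamable_http",
            }
        }
    )
    tools = await client.get_tools()

    # Hand the MCP tools to a LangGraph ReAct agent backed by any chat model.
    agent = create_react_agent("openai:gpt-4o-mini", tools)
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "Is it safe to deploy today?"}]}
    )
    print(result["messages"][-1].content)


asyncio.run(main())
```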

docs/gram/api-clients/using-openai-agents-sdk-with-gram-mcp-servers.mdx

Lines changed: 8 additions & 5 deletions
@@ -7,7 +7,7 @@ import { Callout } from "@/mdx/components";

 The OpenAI [Agents SDK](https://openai.github.io/openai-agents-python/) is a production-ready framework for building agentic AI applications. The Agents SDK provides advanced features like persistent sessions, agent handoffs, guardrails, and comprehensive tracing for complex workflows.

-When combined with [Gram-hosted MCP servers](/docs/gram/gram-quickstart), the Agents SDK enables you to build sophisticated agents that can interact with your APIs, databases, and other services through natural language conversations with persistent context.
+When combined with [Gram-hosted MCP servers](/docs/gram/quickstart), the Agents SDK enables you to build sophisticated agents that can interact with your APIs, databases, and other services through natural language conversations with persistent context.

 This guide shows you how to connect the OpenAI Agents SDK to a Gram-hosted MCP server using Taskmaster, a full-stack CRUD application for task and project management. Taskmaster includes a web UI for managing projects and tasks, a built-in HTTP API, OAuth 2.0 authentication, and a Neon PostgreSQL database for storing data. Try the [demo app](https://taskmaster-speakeasyapi.vercel.app/) to see it in action.

@@ -34,7 +34,7 @@ To follow this tutorial, you need:

 ## Creating a Taskmaster MCP server

-Before connecting the OpenAI Agents SDK to a Taskmaster MCP server, you first need to create one. 
+Before connecting the OpenAI Agents SDK to a Taskmaster MCP server, you first need to create one.

 Follow the guide to [creating a Taskmaster MCP server](/docs/gram/examples/creating-taskmaster-mcp-server), which walks you through:

@@ -66,7 +66,10 @@ export GRAM_KEY=your-gram-api-key # Optional: only needed for private MCP serve
 ```

 <Callout title="Code Examples" type="info">
-Throughout this guide, replace `your-taskmaster-slug` with your actual MCP server URL and update the header names to match your server configuration from the guide to [creating a Taskmaster MCP server](/docs/gram/examples/creating-taskmaster-mcp-server).
+Throughout this guide, replace `your-taskmaster-slug` with your actual MCP
+server URL and update the header names to match your server configuration from
+the guide to [creating a Taskmaster MCP
+server](/docs/gram/examples/creating-taskmaster-mcp-server).
 </Callout>

 ### Basic connection (public server)

@@ -397,8 +400,8 @@ When the browser opens:
 - In the **Transport Type** field, select **Streamable HTTP** (not the default stdio).
 - Enter your server URL: `https://app.getgram.ai/mcp/your-taskmaster-slug`.
 - For authentication, add **API Token Authentication**:
-  - **Header name:** `MCP-TASKMASTER-API-KEY`
-  - **Bearer token:** Your Taskmaster API key
+  - **Header name:** `MCP-TASKMASTER-API-KEY`
+  - **Bearer token:** Your Taskmaster API key
 - Click **Connect** to test the connection.

 **Note:** Taskmaster servers use custom authentication headers that may not be fully supported by the standard MCP Inspector interface. For guaranteed testing, use the Gram Playground or the code examples in this guide.
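
The Taskmaster hunks above mention the custom `MCP-TASKMASTER-API-KEY` header. A hedged sketch of the basic connection the guide walks through (not part of this commit): `your-taskmaster-slug` and the `TASKMASTER_API_KEY` environment variable are placeholders, and `MCPServerStreamableHttp` is the openai-agents-python helper for streamable HTTP servers.

```python
import asyncio
import os

from agents import Agent, Runner
from agents.mcp import MCPServerStreamableHttp


async def main() -> None:
    # Streamable HTTP connection to the Gram-hosted Taskmaster server.
    async with MCPServerStreamableHttp(
        name="taskmaster",
        params={
            "url": "https://app.getgram.ai/mcp/your-taskmaster-slug",  # placeholder
            "headers": {
                # Header name taken from the Taskmaster guide referenced above.
                "MCP-TASKMASTER-API-KEY": os.environ["TASKMASTER_API_KEY"],
            },
        },
    ) as server:
        agent = Agent(
            name="Taskmaster assistant",
            instructions="Use the Taskmaster tools to manage projects and tasks.",
            mcp_servers=[server],
        )
        result = await Runner.run(agent, "List my open projects.")
        print(result.final_output)


asyncio.run(main())
```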

docs/gram/api-clients/using-openai-api-with-gram-mcp-servers.mdx

Lines changed: 13 additions & 13 deletions
@@ -5,7 +5,7 @@ sidebar:
   order: 1
 ---

-The OpenAI [Responses API](https://platform.openai.com/docs/api-reference/responses) supports remote MCP servers through its MCP tool feature. This allows you to give GPT models direct access to your tools and infrastructure by connecting to [Gram-hosted MCP servers](/docs/gram/gram-quickstart).
+The OpenAI [Responses API](https://platform.openai.com/docs/api-reference/responses) supports remote MCP servers through its MCP tool feature. This allows you to give GPT models direct access to your tools and infrastructure by connecting to [Gram-hosted MCP servers](/docs/gram/getting-started).

 This guide shows you how to connect the OpenAI Responses API to a Gram-hosted MCP server using an example [Push Advisor API](https://github.com/ritza-co/gram-examples/blob/main/push-advisor-api/openapi.yaml). You'll learn how to create an MCP server from an OpenAPI document, set up the connection, configure authentication, and use natural language to query the example API.

@@ -190,17 +190,17 @@ If your Gram MCP server has multiple tools but you only want to expose certain o

 ```javascript
 const response = await client.responses.create({
-  model: "gpt-4.1",
-  tools: [
-    {
-      type: "mcp",
-      server_label: "gram-pushadvisor",
-      server_url: "https://app.getgram.ai/mcp/canipushtoprod",
-      allowed_tools: ["can_i_push_to_prod"],
-      require_approval: "never",
-    },
-  ],
-  input: "Is it safe to deploy today?",
+  model: "gpt-4.1",
+  tools: [
+    {
+      type: "mcp",
+      server_label: "gram-pushadvisor",
+      server_url: "https://app.getgram.ai/mcp/canipushtoprod",
+      allowed_tools: ["can_i_push_to_prod"],
+      require_approval: "never",
+    },
+  ],
+  input: "Is it safe to deploy today?",
 });
 ```

@@ -389,4 +389,4 @@ Use the Inspector to verify that your MCP server responds correctly before integ

 You now have OpenAI's GPT models connected to your Gram-hosted MCP server, giving them access to your custom APIs and tools.

-Ready to build your own MCP server? [Try Gram today](/product/gram) and see how easy it is to turn any API into agent-ready tools that work with both OpenAI and Anthropic models. 
+Ready to build your own MCP server? [Try Gram today](/product/gram) and see how easy it is to turn any API into agent-ready tools that work with both OpenAI and Anthropic models.

docs/gram/api-clients/using-pydantic-ai-with-gram-mcp-servers.mdx

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ sidebar:
   order: 5
 ---

-[Pydantic AI](https://ai.pydantic.dev/) supports MCP servers through the `pydantic-ai-mcp-client` library. This allows you to give your Pydantic AI agents direct access to your tools and infrastructure by connecting to [Gram-hosted MCP servers](/docs/gram/gram-quickstart).
+[Pydantic AI](https://ai.pydantic.dev/) supports MCP servers through the `pydantic-ai-mcp-client` library. This allows you to give your Pydantic AI agents direct access to your tools and infrastructure by connecting to [Gram-hosted MCP servers](/docs/gram/quickstart).

 This guide shows you how to connect Pydantic AI to a Gram-hosted MCP server using an example [Push Advisor API](https://github.com/ritza-co/gram-examples/blob/main/push-advisor-api/openapi.yaml). You'll learn how to create an MCP server from an OpenAPI document, set up the connection, configure authentication, and use natural language to query the example API.
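
One more single-line link fix, this time in the Pydantic AI guide. A very rough sketch of the pattern that guide describes (not part of this commit): Pydantic AI's MCP client classes have moved around between releases, so treat `MCPServerStreamableHTTP` and `run_mcp_servers()` as assumptions to verify against your installed version; the slug is a placeholder.

```python
import asyncio

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

# Streamable HTTP connection to the Gram-hosted server (placeholder slug).
server = MCPServerStreamableHTTP("https://app.getgram.ai/mcp/canipushtoprod")

agent = Agent("openai:gpt-4o-mini", mcp_servers=[server])


async def main() -> None:
    # The MCP session stays open for the duration of the context manager.
    async with agent.run_mcp_servers():
        result = await agent.run("Is it safe to deploy today?")
    print(result.output)


asyncio.run(main())
```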
