42 changes: 28 additions & 14 deletions README.md

@@ -2,31 +2,30 @@
 
 The [OpenRouter](https://openrouter.ai/) provider for the [Vercel AI SDK](https://sdk.vercel.ai/docs) gives access to over 300 large language models on the OpenRouter chat and completion APIs.
 
-## Setup for AI SDK v5
+## Overview
 
-```bash
-# For pnpm
-pnpm add @openrouter/ai-sdk-provider
+This provider allows you to use the Vercel AI SDK with the OpenRouter API. It provides a seamless integration, allowing you to leverage the power of OpenRouter's extensive model catalog with the convenience of the AI SDK.
 
-# For npm
-npm install @openrouter/ai-sdk-provider
+## Features
 
-# For yarn
-yarn add @openrouter/ai-sdk-provider
-```
+- **Access to over 300 models**: Use any of the models available on OpenRouter, including the latest open-source and proprietary models.
+- **Chat and completion APIs**: Use both the chat and completion APIs, with support for streaming and non-streaming responses.
+- **Tool support**: Use tools with supported models to build powerful applications.
+- **Usage accounting**: Track your token usage and costs with OpenRouter's usage accounting feature.
+- **Anthropic prompt caching**: Leverage Anthropic's prompt caching for faster and cheaper responses.
+- **Provider routing**: Control how your requests are routed to different providers.
 
-## (LEGACY) Setup for AI SDK v4
+## Setup
 
 ```bash
 # For pnpm
-pnpm add @openrouter/ai-sdk-provider@ai-sdk-v4
+pnpm add @openrouter/ai-sdk-provider
 
 # For npm
-npm install @openrouter/ai-sdk-provider@ai-sdk-v4
+npm install @openrouter/ai-sdk-provider
 
 # For yarn
-yarn add @openrouter/ai-sdk-provider@ai-sdk-v4
-
+yarn add @openrouter/ai-sdk-provider
 ```
 
 ## Provider Instance
@@ -37,6 +36,17 @@ You can import the default provider instance `openrouter` from `@openrouter/ai-sdk-provider`:
 import { openrouter } from '@openrouter/ai-sdk-provider';
 ```
 
+You can also create your own provider instance with custom settings:
+
+```ts
+import { createOpenRouter } from '@openrouter/ai-sdk-provider';
+
+const openrouter = createOpenRouter({
+  apiKey: 'YOUR_API_KEY',
+  baseURL: 'https://my-proxy.com/api/v1',
+});
+```
+
 ## Example
 
 ```ts
@@ -197,3 +207,7 @@ if (result.providerMetadata?.openrouter?.usage) {
   );
 }
 ```
+
+## API Reference
+
+The full API reference is available in the [generated documentation](<LINK_TO_GENERATED_DOCS>).
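
The new Overview, Setup, and Provider Instance sections come together in a short end-to-end flow. A minimal sketch, assuming `OPENROUTER_API_KEY` is set in the environment; the model ID and prompt are illustrative, not part of this diff:

```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { generateText } from 'ai';

// Pass apiKey/baseURL explicitly if you are behind a proxy,
// as the new README section shows.
const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

const result = await generateText({
  model: openrouter('openai/gpt-4o'), // illustrative model ID
  prompt: 'Say hello from OpenRouter.',
});

console.log(result.text);

// The README's usage-accounting snippet reads token usage the same way,
// guarded because the field is only present when usage data is returned:
if (result.providerMetadata?.openrouter?.usage) {
  console.log(result.providerMetadata.openrouter.usage);
}
```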
9 changes: 9 additions & 0 deletions e2e/tools.ts

@@ -9,6 +9,9 @@ const openrouter = createOpenRouter({
   baseUrl: `${process.env.OPENROUTER_API_BASE}/api/v1`,
 });
 
+/**
+ * A tool for sending an SMS message.
+ */
 export const sendSMSTool = tool({
   description: 'Send an SMS to any phone number',
   inputSchema: z.object({
@@ -24,6 +27,9 @@ export const sendSMSTool = tool({
   },
 });
 
+/**
+ * A tool for reading an SMS message.
+ */
 export const readSMSTool = tool({
   description: 'Read the nth SMS from a phone number',
   inputSchema: z.object({
@@ -39,6 +45,9 @@ export const readSMSTool = tool({
   },
 });
 
+/**
+ * A tool for executing a command in the terminal.
+ */
 export const executeCommandInTerminalTool = tool({
   description: 'Execute a command in the terminal',
   inputSchema: z.object({
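
For context on how these newly documented tools are consumed, a hedged sketch of wiring them into `streamText`. The import path and model ID are assumptions for illustration; the e2e suite's actual harness is not shown in this diff:

```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText } from 'ai';

// Assumed relative import; adjust to wherever e2e/tools.ts lives in your setup.
import { readSMSTool, sendSMSTool } from './tools';

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

const result = streamText({
  model: openrouter('anthropic/claude-3.5-sonnet'), // any tool-capable model
  prompt: 'Send an SMS to +15555550100 saying the e2e run passed.',
  tools: { sendSMS: sendSMSTool, readSMS: readSMSTool },
  toolChoice: 'auto',
});

// Print the assistant's streamed text as it arrives.
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```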
5 changes: 5 additions & 0 deletions examples/next-chat/.env.local.example

@@ -0,0 +1,5 @@
+# Required: obtain an API key from https://openrouter.ai/keys
+OPENROUTER_API_KEY=sk-or-...
+
+# Optional: override the base URL if you are pointing at a proxy.
+# OPENROUTER_BASE_URL=https://openrouter.ai/api/v1
48 changes: 48 additions & 0 deletions examples/next-chat/README.md

@@ -0,0 +1,48 @@
+# OpenRouter Next.js Chat Example
+
+This example demonstrates how to build a streaming chat experience in Next.js using the
+[`@openrouter/ai-sdk-provider`](https://www.npmjs.com/package/@openrouter/ai-sdk-provider)
+and the Vercel AI SDK. The UI lets you:
+
+- pick an OpenRouter model
+- toggle tool usage on or off
+- watch streaming assistant replies
+- inspect tool invocations and their inputs/outputs in real time
+
+## Getting Started
+
+1. Install dependencies from the repository root:
+
+   ```bash
+   pnpm install
+   ```
+
+   > **Note:** the example is part of the monorepo. You can also `cd examples/next-chat`
+   > and run `pnpm install` followed by `pnpm dev`.
+
+2. Copy the example environment file and add your OpenRouter key:
+
+   ```bash
+   cp examples/next-chat/.env.local.example examples/next-chat/.env.local
+   ```
+
+   At minimum you need `OPENROUTER_API_KEY`. Set `OPENROUTER_BASE_URL` if you proxy requests.
+
+3. Start the development server:
+
+   ```bash
+   pnpm --filter @openrouter/examples-next-chat dev
+   ```
+
+   Visit `http://localhost:3000` to try the chat experience.
+
+## How It Works
+
+- `app/api/chat/route.ts` configures the OpenRouter provider, streams responses with tools, and
+  returns AI SDK UI message streams.
+- `app/page.tsx` implements a small client-side state machine that consumes the stream, renders
+  messages, and keeps track of tool invocations.
+- `lib/tools.ts` defines two sample tools (`getCurrentWeather` and `getCurrentTime`). You can add
+  your own tools or wire in real data sources.
+
+This example is intentionally lightweight so you can adapt it for your own projects.
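
The README's closing note invites you to add your own tools. A sketch of what an additional entry in `lib/tools.ts` might look like, following the `tool`/`inputSchema` pattern used elsewhere in this PR; the tool itself is hypothetical, not part of the diff:

```ts
import { tool } from 'ai';
import { z } from 'zod';

// Hypothetical extra tool in the style of getCurrentWeather/getCurrentTime.
export const getRandomNumber = tool({
  description: 'Return a random integer between min and max (inclusive)',
  inputSchema: z.object({
    min: z.number().int(),
    max: z.number().int(),
  }),
  execute: async ({ min, max }) => {
    const value = Math.floor(Math.random() * (max - min + 1)) + min;
    return { value };
  },
});
```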
65 changes: 65 additions & 0 deletions examples/next-chat/app/api/chat/route.ts

@@ -0,0 +1,65 @@
+import { createOpenRouter } from '@openrouter/ai-sdk-provider';
+import type { ModelMessage } from 'ai';
+import { streamText } from 'ai';
+
+import { BASIC_TOOLS } from '../../../lib/tools';
+import { DEFAULT_SYSTEM_PROMPT } from '../../../lib/models';
+
+interface ChatRequestBody {
+  modelId: string;
+  toolMode?: 'auto' | 'disabled';
+  messages: ModelMessage[];
+}
+
+const openrouter = createOpenRouter({
+  compatibility: 'strict',
+  baseURL: process.env.OPENROUTER_BASE_URL ?? process.env.OPENROUTER_API_BASE,
+});
+
+function normalizeToolMode(toolMode: ChatRequestBody['toolMode']) {
+  return toolMode === 'disabled' ? 'disabled' : 'auto';
+}
+
+export async function POST(request: Request) {
+  const apiKey = process.env.OPENROUTER_API_KEY;
+  if (!apiKey) {
+    return Response.json(
+      { error: 'Missing OPENROUTER_API_KEY environment variable.' },
+      { status: 500 },
+    );
+  }
+
+  let body: ChatRequestBody;
+  try {
+    body = (await request.json()) as ChatRequestBody;
+  } catch (_error) {
+    return Response.json({ error: 'Invalid JSON payload.' }, { status: 400 });
+  }
+
+  if (!body || typeof body.modelId !== 'string') {
+    return Response.json({ error: 'Request must include a modelId string.' }, { status: 400 });
+  }
+
+  if (!Array.isArray(body.messages) || body.messages.some((message) => typeof message !== 'object')) {
+    return Response.json({ error: 'Messages must be an array of chat messages.' }, { status: 400 });
+  }
+
+  const toolMode = normalizeToolMode(body.toolMode);
+  const shouldExposeTools = toolMode !== 'disabled';
+
+  try {
+    const result = streamText({
+      model: openrouter(body.modelId),
+      system: DEFAULT_SYSTEM_PROMPT,
+      messages: body.messages,
+      tools: shouldExposeTools ? BASIC_TOOLS : undefined,
+      toolChoice: shouldExposeTools ? 'auto' : 'none',
+    });
+
+    return result.toUIMessageStreamResponse();
+  } catch (error) {
+    const errorMessage =
+      error instanceof Error ? error.message : 'Unknown error while contacting OpenRouter.';
+    return Response.json({ error: errorMessage }, { status: 500 });
+  }
+}
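
Because the route validates a custom `ChatRequestBody`, a plain `fetch` client makes the contract easy to exercise. A minimal sketch, assuming the dev server runs on localhost; the model ID is illustrative, and the response is the AI SDK UI message stream dumped as raw chunks:

```ts
// Hypothetical manual client for the example's /api/chat route.
const response = await fetch('http://localhost:3000/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    modelId: 'openai/gpt-4o-mini', // illustrative model ID
    toolMode: 'disabled',
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});

if (!response.ok || !response.body) {
  throw new Error(`Chat request failed with status ${response.status}`);
}

// Dump the raw stream protocol chunks for inspection.
const reader = response.body.getReader();
const decoder = new TextDecoder();
for (;;) {
  const { done, value } = await reader.read();
  if (done) break;
  process.stdout.write(decoder.decode(value, { stream: true }));
}
```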