154 changes: 106 additions & 48 deletions content/docs/02-getting-started/04-svelte.mdx
---
title: Svelte
description: Learn how to build your first agent with the AI SDK and Svelte.
---

# Svelte Quickstart

The AI SDK is a powerful TypeScript library designed to help developers build AI-powered applications.

In this quickstart tutorial, you'll build a simple agent with a streaming chat user interface. Along the way, you'll learn key concepts and techniques that are fundamental to using the SDK in your own projects.

If you are unfamiliar with the concepts of [Prompt Engineering](/docs/advanced/prompt-engineering) and [HTTP Streaming](/docs/advanced/why-streaming), you can optionally read these documents first.

To follow this quickstart, you'll need:

- Node.js 18+ and pnpm installed on your local development machine.
- A [Vercel AI Gateway](https://vercel.com/ai-gateway) API key.

If you haven't obtained your Vercel AI Gateway API key, you can do so by [signing up](https://vercel.com/d?to=%2F%5Bteam%5D%2F%7E%2Fai&title=Go+to+AI+Gateway) on the Vercel website.

## Set Up Your Application


### Install Dependencies

Install `ai` and `@ai-sdk/svelte`, the core AI SDK package and its Svelte bindings. The AI SDK's [Vercel AI Gateway provider](/providers/ai-sdk-providers/ai-gateway) ships with the `ai` package. You'll also install `zod`, a schema validation library used for defining tool inputs.

<Note>
This guide uses the Vercel AI Gateway provider so you can access hundreds of
models from different providers with one API key, but you can switch to any
provider or model by installing its package. Check out available [AI SDK
providers](/providers/ai-sdk-providers) for more information.
</Note>
<div className="my-4">
<Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
<Tab>
<Snippet text="pnpm add -D ai@beta @ai-sdk/svelte@beta zod" dark />
</Tab>
<Tab>
<Snippet text="npm install -D ai@beta @ai-sdk/svelte@beta zod" dark />
</Tab>
<Tab>
<Snippet text="yarn add -D ai@beta @ai-sdk/svelte@beta zod" dark />
</Tab>
<Tab>
<Snippet text="bun add -d ai@beta @ai-sdk/svelte@beta zod" dark />
</Tab>
</Tabs>
</div>

### Configure your AI Gateway API key

Create a `.env.local` file in your project root and add your AI Gateway API key. This key authenticates your application with the Vercel AI Gateway.

<Snippet text="touch .env.local" />

Edit the `.env.local` file:

```env filename=".env.local"
AI_GATEWAY_API_KEY=xxxxxxxxx
```

Replace `xxxxxxxxx` with your actual Vercel AI Gateway API key.

<Note className="mb-4">
The AI SDK's Vercel AI Gateway Provider will default to using the
`AI_GATEWAY_API_KEY` environment variable. Vite does not automatically load
environment variables onto `process.env`, so you'll need to import
`AI_GATEWAY_API_KEY` from `$env/static/private` in your code (see below).
</Note>

## Create an API route

Create a SvelteKit endpoint at `src/routes/api/chat/+server.ts` and add the following code:

```tsx filename="src/routes/api/chat/+server.ts"
import {
  streamText,
  type UIMessage,
  convertToModelMessages,
  createGateway,
} from 'ai';

import { AI_GATEWAY_API_KEY } from '$env/static/private';

const gateway = createGateway({
  apiKey: AI_GATEWAY_API_KEY,
});

export async function POST({ request }) {
  const { messages }: { messages: UIMessage[] } = await request.json();

  const result = streamText({
    model: gateway('openai/gpt-5.1'),
    messages: convertToModelMessages(messages),
  });

  return result.toUIMessageStreamResponse();
}
```

<Note>
If you see type errors with `AI_GATEWAY_API_KEY` or your `POST` function, run
the dev server.
</Note>

Let's take a look at what is happening in this code:

1. Create a gateway provider instance with the `createGateway` function from the `ai` package.
2. Define a `POST` request handler and extract `messages` from the body of the request. The `messages` variable contains a history of the conversation between you and the chatbot and provides the chatbot with the necessary context to make the next generation. The `messages` are of UIMessage type, which are designed for use in application UI - they contain the entire message history and associated metadata like timestamps.
3. Call [`streamText`](/docs/reference/ai-sdk-core/stream-text), which is imported from the `ai` package. This function accepts a configuration object that contains a `model` provider (defined in step 1) and `messages` (defined in step 2). You can pass additional [settings](/docs/ai-sdk-core/settings) to further customise the model's behaviour. The `messages` key expects a `ModelMessage[]` array. This type is different from `UIMessage` in that it does not include metadata, such as timestamps or sender information. To convert between these types, we use the `convertToModelMessages` function, which strips the UI-specific metadata and transforms the `UIMessage[]` array into the `ModelMessage[]` format that the model expects.
4. The `streamText` function returns a [`StreamTextResult`](/docs/reference/ai-sdk-core/stream-text#result-object). This result object contains the [`toUIMessageStreamResponse`](/docs/reference/ai-sdk-core/stream-text#to-data-stream-response) function, which converts the result to a streamed response object.
5. Return the result to the client to stream the response.
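The `UIMessage` to `ModelMessage` conversion in step 3 can be pictured with a deliberately simplified sketch. This is hypothetical illustration code, not the SDK's actual implementation, which handles many more part types (tool calls, files, reasoning) and metadata:

```typescript
// Hypothetical, simplified sketch: UI-level metadata (ids, timestamps)
// is dropped and message parts are flattened into plain content.
// The real `convertToModelMessages` from 'ai' does much more.
type SimpleUIMessage = {
  id: string;
  role: 'user' | 'assistant';
  parts: { type: 'text'; text: string }[];
};

type SimpleModelMessage = {
  role: 'user' | 'assistant';
  content: string;
};

function toModelMessages(messages: SimpleUIMessage[]): SimpleModelMessage[] {
  return messages.map(({ role, parts }) => ({
    role,
    // Keep only the text parts; UI metadata like `id` is stripped.
    content: parts
      .filter(part => part.type === 'text')
      .map(part => part.text)
      .join(''),
  }));
}

console.log(
  toModelMessages([
    { id: 'm1', role: 'user', parts: [{ type: 'text', text: 'Hello!' }] },
  ]),
);
// → [{ role: 'user', content: 'Hello!' }]
```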

## Choosing a Provider

The AI SDK supports dozens of model providers through [first-party](/providers/ai-sdk-providers), [OpenAI-compatible](/providers/openai-compatible-providers), and [community](/providers/community-providers) packages.

This quickstart uses the [Vercel AI Gateway](https://vercel.com/ai-gateway) provider, which is the default [global provider](/docs/ai-sdk-core/provider-management#global-provider-configuration). This means you can access models using a simple string in the model configuration:

```ts
model: 'openai/gpt-5.1';
```

You can also explicitly import and use the gateway provider in two other equivalent ways:

```ts
// Option 1: Import from 'ai' package (included by default)
import { gateway } from 'ai';
model: gateway('openai/gpt-5.1');

// Option 2: Install and import from '@ai-sdk/gateway' package
import { gateway } from '@ai-sdk/gateway';
model: gateway('openai/gpt-5.1');
```

### Using other providers

To use a different provider, install its package and create a provider instance. For example, to use OpenAI directly:

<div className="my-4">
<Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
<Tab>
<Snippet text="pnpm add @ai-sdk/openai@beta" dark />
</Tab>
<Tab>
<Snippet text="npm install @ai-sdk/openai@beta" dark />
</Tab>
<Tab>
<Snippet text="yarn add @ai-sdk/openai@beta" dark />
</Tab>

<Tab>
<Snippet text="bun add @ai-sdk/openai@beta" dark />
</Tab>

</Tabs>
</div>

```ts
import { openai } from '@ai-sdk/openai';

model: openai('gpt-5.1');
```
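In SvelteKit, the `$env/static/private` pattern shown earlier applies to any provider, since Vite does not put secrets on `process.env`. A sketch for the OpenAI provider (this assumes you've added an `OPENAI_API_KEY` entry to your `.env.local`):

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { OPENAI_API_KEY } from '$env/static/private';

const openai = createOpenAI({
  apiKey: OPENAI_API_KEY,
});

// Later, in your streamText call:
// model: openai('gpt-5.1'),
```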

#### Updating the global provider

You can change the default global provider so string model references use your preferred provider everywhere in your application. Learn more about [provider management](/docs/ai-sdk-core/provider-management#global-provider-configuration).
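As a sketch of what that can look like (this assumes the `globalThis.AI_SDK_DEFAULT_PROVIDER` mechanism described in the provider management docs; in SvelteKit, a server entry such as `src/hooks.server.ts` is a reasonable place to run it once):

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { OPENAI_API_KEY } from '$env/static/private';

// Assumption: once set, bare model strings like 'gpt-5.1' resolve
// through this provider instead of the AI Gateway.
globalThis.AI_SDK_DEFAULT_PROVIDER = createOpenAI({
  apiKey: OPENAI_API_KEY,
});
```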

Pick the approach that best matches how you want to manage providers across your application.

## Wire up the UI

Now that you have an API route that can query an LLM, it's time to set up your frontend. The AI SDK's [UI](/docs/ai-sdk-ui) package abstracts the complexity of a chat interface into one class, `Chat`.
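The full component isn't shown in this excerpt; as a minimal sketch of what a `Chat`-based page can look like (this assumes the `@ai-sdk/svelte` `Chat` API with a `messages` array of part-based messages and a `sendMessage` method, plus Svelte 5 runes):

```svelte
<script lang="ts">
  import { Chat } from '@ai-sdk/svelte';

  const chat = new Chat({});
  let input = $state('');

  function handleSubmit(event: Event) {
    event.preventDefault();
    chat.sendMessage({ text: input });
    input = '';
  }
</script>

<ul>
  {#each chat.messages as message}
    <li>
      {message.role}:
      {#each message.parts as part}
        {#if part.type === 'text'}{part.text}{/if}
      {/each}
    </li>
  {/each}
</ul>

<form onsubmit={handleSubmit}>
  <input bind:value={input} placeholder="Say something..." />
  <button type="submit">Send</button>
</form>
```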
Let's enhance your chatbot by adding a simple weather tool.
Modify your `src/routes/api/chat/+server.ts` file to include the new weather tool:

```tsx filename="src/routes/api/chat/+server.ts" highlight="2,3,17-31"
import {
  createGateway,
  streamText,
  type UIMessage,
  convertToModelMessages,
  tool,
} from 'ai';
import { z } from 'zod';

import { AI_GATEWAY_API_KEY } from '$env/static/private';

const gateway = createGateway({
  apiKey: AI_GATEWAY_API_KEY,
});

export async function POST({ request }) {
  const { messages }: { messages: UIMessage[] } = await request.json();

  const result = streamText({
    model: gateway('openai/gpt-5.1'),
    messages: convertToModelMessages(messages),
    tools: {
      weather: tool({
        description: 'Get the weather in a location (fahrenheit)',
        inputSchema: z.object({
          location: z.string().describe('The location to get the weather for'),
        }),
        execute: async ({ location }) => {
          const temperature = Math.round(Math.random() * (90 - 32) + 32);
          return {
            location,
            temperature,
          };
        },
      }),
    },
  });

  return result.toUIMessageStreamResponse();
}
```

To solve this, you can enable multi-step tool calls using `stopWhen`.
Modify your `src/routes/api/chat/+server.ts` file to include the `stopWhen` condition:

```ts filename="src/routes/api/chat/+server.ts" highlight="15"
import {
  createGateway,
  streamText,
  type UIMessage,
  convertToModelMessages,
  stepCountIs,
  tool,
} from 'ai';
import { z } from 'zod';

import { AI_GATEWAY_API_KEY } from '$env/static/private';

const gateway = createGateway({
  apiKey: AI_GATEWAY_API_KEY,
});

export async function POST({ request }) {
  const { messages }: { messages: UIMessage[] } = await request.json();

  const result = streamText({
    model: gateway('openai/gpt-5.1'),
    messages: convertToModelMessages(messages),
    stopWhen: stepCountIs(5),
    tools: {
      // ... the weather tool from the previous section ...
    },
  });

  return result.toUIMessageStreamResponse();
}
```

By setting `stopWhen: stepCountIs(5)`, you're allowing the model to use up to 5 steps for any given generation.
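Conceptually, `stopWhen` governs an agent loop: after each step the SDK checks the stop condition before sending tool results back to the model for another step. A hypothetical, stripped-down version of that control flow (the model call is a stand-in; this is not the SDK's implementation):

```typescript
// Hypothetical sketch of a multi-step agent loop with a step-count
// stop condition. `callModel` stands in for a real model invocation.
type Step = { finished: boolean };

const stepCountIs = (max: number) => (steps: Step[]) => steps.length >= max;

function runAgent(
  callModel: (steps: Step[]) => Step,
  stopWhen: (steps: Step[]) => boolean,
): Step[] {
  const steps: Step[] = [];
  while (true) {
    const step = callModel(steps);
    steps.push(step);
    // Stop when the model finishes without a tool call, or when the
    // stop condition (e.g. a maximum step count) is met.
    if (step.finished || stopWhen(steps)) return steps;
  }
}

// A fake model that always asks for another tool call:
const alwaysToolCall = () => ({ finished: false });

console.log(runAgent(alwaysToolCall, stepCountIs(5)).length); // 5
```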
Update your `src/routes/api/chat/+server.ts` file to add a new tool to convert the temperature from Fahrenheit to Celsius:

```tsx filename="src/routes/api/chat/+server.ts" highlight="32-45"
import {
  createGateway,
  streamText,
  type UIMessage,
  convertToModelMessages,
  stepCountIs,
  tool,
} from 'ai';
import { z } from 'zod';

import { AI_GATEWAY_API_KEY } from '$env/static/private';

const gateway = createGateway({
  apiKey: AI_GATEWAY_API_KEY,
});

export async function POST({ request }) {
  const { messages }: { messages: UIMessage[] } = await request.json();

  const result = streamText({
    model: gateway('openai/gpt-5.1'),
    messages: convertToModelMessages(messages),
    stopWhen: stepCountIs(5),
    tools: {
      // ... the weather tool from the previous section ...
      convertFahrenheitToCelsius: tool({
        description: 'Convert a temperature in fahrenheit to celsius',
        inputSchema: z.object({
          temperature: z
            .number()
            .describe('The temperature in fahrenheit to convert'),
        }),
        execute: async ({ temperature }) => {
          const celsius = Math.round((temperature - 32) * (5 / 9));
          return {
            celsius,
          };
        },
      }),
    },
  });

  return result.toUIMessageStreamResponse();
}
```