Request batching allows executing multiple queries in a single HTTP request. This reduces network overhead and improves performance when a page needs to load multiple resources.
Without batching, each query makes a separate HTTP request:

```typescript
// Page needs 5 user-related queries
const [user, posts, comments, notifications, settings] = await Promise.all([
  api.users.get({ id: 1 }),
  api.posts.list({ userId: 1 }),
  api.comments.list({ userId: 1 }),
  api.notifications.list({ userId: 1 }),
  api.settings.get({ userId: 1 })
])

// This makes 5 HTTP POST requests:
// POST /api/users.get
// POST /api/posts.list
// POST /api/comments.list
// POST /api/notifications.list
// POST /api/settings.get
```

With batching, all queries are sent in one request:
```typescript
// Same queries, but batched
const results = await api.batch([
  [api.users.get, { id: 1 }],
  [api.posts.list, { userId: 1 }],
  [api.comments.list, { userId: 1 }],
  [api.notifications.list, { userId: 1 }],
  [api.settings.get, { userId: 1 }]
])

// Single HTTP POST:
// POST /api/batch
// Body: [
//   { path: "users.get", args: { id: 1 } },
//   { path: "posts.list", args: { userId: 1 } },
//   ...
// ]

const [user, posts, comments, notifications, settings] = results
```

```typescript
import { batch } from "@deessejs/server"

// Batch multiple queries
const results = await batch([
  [api.users.get, { id: 1 }],
  [api.posts.list, { userId: 1 }]
])

// Results are in the same order as the requests
const [user, posts] = results
```

```typescript
// Can mix queries and mutations (mutations run first)
const results = await batch([
  [api.notifications.markRead, { id: 1 }],  // Mutation (runs first)
  [api.users.get, { id: 1 }],               // Query
  [api.notifications.list, { userId: 1 }]   // Query
])
```

```typescript
// Default limit: 10 requests per batch
const results = await batch([
  [api.users.get, { id: 1 }],
  [api.users.get, { id: 2 }],
  // ... up to 10
])
```

```typescript
// Custom limit
const results = await batch(requests, { maxSize: 5 })
```

```typescript
// Enable automatic batching
const api = createAPI({
  router: t.router({ ... }),
  batch: {
    enabled: true,
    windowMs: 10 // Wait 10ms to collect requests
  }
})

// All queries issued in the same window are batched automatically.
// Note: start both requests before awaiting, otherwise the first
// completes before the second is ever issued.
const [user1, user2] = await Promise.all([
  api.users.get({ id: 1 }),
  api.users.get({ id: 2 })
])
// These two requests are sent together in one batch
```

```typescript
// Force batch execution
const results = await batch(requests)
```

The batch endpoint handles multiple requests:
```typescript
// POST /api/batch
// Request body:
[
  { path: "users.get", args: { id: 1 } },
  { path: "posts.list", args: { userId: 1 } }
]

// Response body:
[
  { ok: true, value: { id: 1, name: "John" } },
  { ok: true, value: [{ id: 1, title: "Hello" }] }
]
```

```typescript
// If one request fails, the others still succeed
const results = await batch([
  [api.users.get, { id: 1 }],  // Success
  [api.users.get, { id: 999 }] // Error: NOT_FOUND
])

// results[0].ok === true
// results[1].ok === false
```

```typescript
const { t, createAPI } = defineContext({
  context: { db: myDatabase },
  batch: {
    enabled: true,
    maxSize: 10,
    windowMs: 10
  }
})
```

| Option | Type | Default | Description |
|---|---|---|---|
| `enabled` | `boolean` | `false` | Enable automatic batching |
| `maxSize` | `number` | `10` | Maximum requests per batch |
| `windowMs` | `number` | `10` | Time (ms) to wait for requests |
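When a page needs more requests than `maxSize` allows, the client has to split them across several batches. A minimal sketch of that splitting; the `chunk` helper is illustrative, not part of the library:

```typescript
// Split an array of requests into chunks no larger than `size`.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = []
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size))
  }
  return out
}

// 25 requests with maxSize 10 become batches of 10, 10 and 5
const batches = chunk(Array.from({ length: 25 }, (_, i) => i), 10)
// batches.map(b => b.length) → [10, 10, 5]
```

Each chunk can then be passed to `batch(...)` in turn, or in parallel if ordering across chunks does not matter.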
| Method | Total time (100 requests) | Network calls |
|---|---|---|
| No batching | 500ms | 100 |
| Batching (10) | 120ms | 10 |
| Batching (single) | 80ms | 1 |
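The numbers in the table are illustrative, but the shape of the win follows from a simple model: total time is dominated by the number of round trips. A rough sketch, where the RTT and per-item costs are assumptions and round trips are assumed not to overlap:

```typescript
// Rough latency model: each round trip costs a fixed network RTT,
// plus a small per-request processing cost on the server.
// rttMs and perItemMs are illustrative assumptions.
function estimateMs(
  requests: number,
  batchSize: number,
  rttMs = 50,
  perItemMs = 0.3
): number {
  const roundTrips = Math.ceil(requests / batchSize)
  return roundTrips * rttMs + requests * perItemMs
}

estimateMs(100, 1)   // 100 round trips: dominated by network overhead
estimateMs(100, 10)  // 10 round trips
estimateMs(100, 100) // a single round trip
```

In practice browsers run some requests in parallel, so unbatched traffic is faster than this sequential model suggests, but the round-trip count still dominates.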
Good for:
- Loading multiple related resources on page load
- Dashboard pages with multiple widgets
- Initial data fetching
Not needed for:
- Single requests
- Sequential dependencies
- Real-time updates
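On the server side, the `/api/batch` endpoint has to run each sub-request independently so that one failure does not fail the rest. A sketch of that dispatch logic, assuming a plain map from procedure path to handler (how the real router resolves paths is not shown in this document); the result shape follows the wire format above:

```typescript
type BatchRequest = { path: string; args: unknown }
type BatchResult =
  | { ok: true; value: unknown }
  | { ok: false; error: { code: string; message: string } }

// Run every sub-request, isolating failures per item.
async function handleBatch(
  handlers: Record<string, (args: unknown) => Promise<unknown>>,
  requests: BatchRequest[],
  maxSize = 10
): Promise<BatchResult[]> {
  if (requests.length > maxSize) {
    throw new Error(`Batch of ${requests.length} exceeds limit of ${maxSize}`)
  }
  return Promise.all(
    requests.map(async ({ path, args }): Promise<BatchResult> => {
      const handler = handlers[path]
      if (!handler) {
        return { ok: false, error: { code: "NOT_FOUND", message: `Unknown path: ${path}` } }
      }
      try {
        return { ok: true, value: await handler(args) }
      } catch (err) {
        return { ok: false, error: { code: "INTERNAL_SERVER_ERROR", message: String(err) } }
      }
    })
  )
}
```

Results come back in request order because `Promise.all` preserves the order of its inputs.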
```typescript
// Client-side batching
class RequestBatcher {
  private queue: Array<{
    resolve: (value: unknown) => void
    reject: (reason: unknown) => void
    path: string
    args: unknown
  }> = []
  private timer: ReturnType<typeof setTimeout> | null = null

  async execute(path: string, args: unknown) {
    return new Promise((resolve, reject) => {
      this.queue.push({ resolve, reject, path, args })
      if (this.queue.length >= 10) {
        this.flush()
      } else if (!this.timer) {
        this.timer = setTimeout(() => this.flush(), 10)
      }
    })
  }

  private async flush() {
    if (this.timer) {
      clearTimeout(this.timer)
      this.timer = null
    }
    const requests = this.queue.splice(0)
    try {
      const response = await fetch('/api/batch', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(requests.map(r => ({ path: r.path, args: r.args })))
      })
      if (!response.ok) throw new Error(`Batch request failed: ${response.status}`)
      const results = await response.json()
      requests.forEach((req, i) => {
        req.resolve(results[i])
      })
    } catch (err) {
      // A transport failure fails every request in the batch
      requests.forEach(req => req.reject(err))
    }
  }
}
```

```typescript
// Don't batch unrelated requests
const results = await batch([
  [api.users.get, { id: 1 }],
  [api.weather.get, {}], // Unrelated
  [api.stock.get, {}]    // Unrelated
])
```

```typescript
// Good: Batch initial page data
async function loadDashboard() {
  const [user, stats, notifications] = await batch([
    [api.users.get, { id: currentUserId }],
    [api.stats.get, {}],
    [api.notifications.list, { userId: currentUserId }]
  ])
  return { user, stats, notifications }
}
```

```typescript
// If requests depend on each other, don't batch
const user = await api.users.get({ id: 1 })
const posts = await api.posts.list({ userId: user.id })
// These can't be batched: the second request needs the first result
```

```typescript
// Large batches may time out; cap the size
const results = await batch(requests, { maxSize: 10 })
```

```typescript
// Results are in request order
const results = await batch([
  [api.users.get, { id: 1 }],
  [api.users.get, { id: 2 }]
])

// results[0] corresponds to requests[0]
// results[1] corresponds to requests[1]
```

- Streaming batch responses
- Parallel batch execution
- Dependency-aware batching
- Automatic batch optimization