
Batch Request/Response Plugin

The Batch Request/Response Plugin allows you to combine multiple requests and responses into a single batch, reducing the overhead of sending each one separately.

INFO

The Batch Plugin streams responses asynchronously so that no individual request blocks another, ensuring all responses are handled independently for faster, more efficient batching.
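
For illustration, once both plugins are configured (see Setup below), concurrent calls issued through the same link can be combined into one HTTP request. A minimal sketch, assuming a `client` created from a batch-enabled link and hypothetical `planets` procedures:

```ts
// `client` is assumed to be an oRPC client created from a link that has
// the BatchLinkPlugin configured (see the Client setup below).
const [planets, planet] = await Promise.all([
  client.planets.list({ limit: 10 }), // hypothetical procedure
  client.planets.find({ id: 1 }), // hypothetical procedure
])
// Both calls are candidates for the same batch and can share one HTTP request.
```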

Setup

This plugin requires configuration on both the server and client sides.

Server

```ts
import { BatchHandlerPlugin } from '@orpc/server/plugins'

const handler = new RPCHandler(router, {
  plugins: [new BatchHandlerPlugin()],
})
```

INFO

The handler can be any supported oRPC handler, such as RPCHandler, OpenAPIHandler or custom implementations. Note that this plugin uses its own protocol for batching requests and responses, which is different from the handler’s native protocol.
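
For example, a sketch of attaching the plugin to an OpenAPIHandler instead of an RPCHandler (the fetch-adapter import path is an assumption):

```ts
import { OpenAPIHandler } from '@orpc/openapi/fetch' // assumed import path for the fetch adapter
import { BatchHandlerPlugin } from '@orpc/server/plugins'

const handler = new OpenAPIHandler(router, {
  plugins: [new BatchHandlerPlugin()],
})
```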

Client

To use the BatchLinkPlugin, define at least one group. Requests within the same group will be considered for batching together, and each group requires a context as described in client context.

```ts
import { BatchLinkPlugin } from '@orpc/client/plugins'

const link = new RPCLink({
  url: 'https://api.example.com/rpc',
  plugins: [
    new BatchLinkPlugin({
      groups: [
        {
          condition: options => true,
          context: {}, // This context will represent the batch request and persist throughout the request lifecycle
        },
      ],
    }),
  ],
})
```

INFO

The link can be any supported oRPC link, such as RPCLink, OpenAPILink, or custom implementations.
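
For example, a sketch of attaching the plugin to an OpenAPILink (the import path and the contract argument are assumptions based on a typical OpenAPILink setup):

```ts
import { OpenAPILink } from '@orpc/openapi-client/fetch' // assumed import path
import { BatchLinkPlugin } from '@orpc/client/plugins'

// `contract` is assumed to be your contract router, defined elsewhere
const link = new OpenAPILink(contract, {
  url: 'https://api.example.com/api',
  plugins: [
    new BatchLinkPlugin({
      groups: [{ condition: options => true, context: {} }],
    }),
  ],
})
```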

Limitations

The plugin does not support AsyncIteratorObject or File/Blob in responses (such requests automatically fall back to the default, non-batched behavior). To exclude unsupported procedures, use the exclude option:

```ts
const link = new RPCLink({
  url: 'https://api.example.com/rpc',
  plugins: [
    new BatchLinkPlugin({
      groups: [
        {
          condition: options => true,
          context: {},
        },
      ],
      exclude: ({ path }) => {
        return ['planets/getImage', 'planets/subscribe'].includes(path.join('/'))
      },
    }),
  ],
})
```

Request Headers

By default, oRPC uses the headers that appear in every request in the batch. To customize headers, use the headers option:

```ts
const link = new RPCLink({
  url: 'https://api.example.com/rpc',
  plugins: [
    new BatchLinkPlugin({
      groups: [
        {
          condition: options => true,
          context: {},
        },
      ],
      headers: () => ({
        authorization: 'Bearer 1234567890',
      }),
    }),
  ],
})
```

Response Headers

By default, the response headers are empty. To customize headers, use the headers option:

```ts
import { BatchHandlerPlugin } from '@orpc/server/plugins'

const handler = new RPCHandler(router, {
  plugins: [
    new BatchHandlerPlugin({
      headers: responses => ({
        'some-header': 'some-value',
      }),
    }),
  ],
})
```

Groups

Requests within the same group will be considered for batching together, and each group requires a context as described in client context.

In the example below, a group and its context are used to batch requests based on the cache control setting:

```ts
import { RPCLink } from '@orpc/client/fetch'
import { BatchLinkPlugin } from '@orpc/client/plugins'

interface ClientContext {
  cache?: RequestCache
}

const link = new RPCLink<ClientContext>({
  url: 'http://localhost:3000/rpc',
  method: ({ context }) => {
    if (context?.cache) {
      return 'GET'
    }

    return 'POST'
  },
  plugins: [
    new BatchLinkPlugin({
      groups: [
        {
          condition: ({ context }) => context?.cache === 'force-cache',
          context: { // This context will be passed to the fetch method
            cache: 'force-cache',
          },
        },
        { // Fallback for all other requests - must be placed at the end of the list
          condition: () => true,
          context: {},
        },
      ],
    }),
  ],
  fetch: (request, init, { context }) => globalThis.fetch(request, {
    ...init,
    cache: context?.cache,
  }),
})
```

Now, calls with cache=force-cache will be sent with cache=force-cache, whether they're batched or executed individually.
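
For instance, under the setup above, a call that opts into caching could look like the following sketch (the client and procedure names are hypothetical):

```ts
// Joins the 'force-cache' group: sent as a GET request, and the custom
// fetch above applies `cache: 'force-cache'`.
const planet = await client.planets.find({ id: 1 }, { context: { cache: 'force-cache' } })

// No cache context: falls into the fallback group and is sent as a POST request.
const planets = await client.planets.list({ limit: 10 })
```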
