
Backend Integration with BlockNote AI

The most common (and recommended) setup to integrate BlockNote AI with an LLM is to have BlockNote AI call your backend, which then calls an LLM of your choice using the Vercel AI SDK. This page explains the default setup, but also provides several alternative approaches.

Default setup (Vercel AI SDK)

The example below closely follows the basic example from the Vercel AI SDK for Next.js. The only difference is that we're retrieving the BlockNote tools from the request body and using the toolDefinitionsToToolSet function to convert them to AI SDK tools. The LLM will now be able to invoke these tools to make modifications to the BlockNote document as requested by the user. The tool calls are forwarded to the client application where they're handled automatically by the AI Extension.

import { openai } from "@ai-sdk/openai";
import { convertToModelMessages, streamText } from "ai";
import { toolDefinitionsToToolSet } from "@blocknote/xl-ai";

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages, toolDefinitions } = await req.json();

  const result = streamText({
    model: openai("gpt-4.1"), // see https://ai-sdk.dev/docs/foundations/providers-and-models
    messages: convertToModelMessages(messages),
    tools: toolDefinitionsToToolSet(toolDefinitions),
    toolChoice: "required",
  });

  return result.toUIMessageStreamResponse();
}
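
On the client, you then point the AI Extension at this route. A minimal sketch, assuming the AI SDK's DefaultChatTransport and an /api/chat route path (both are assumptions; check your setup for the actual route):

import { DefaultChatTransport } from "ai";
import { createAIExtension } from "@blocknote/xl-ai";

// ...
createAIExtension({
  // Sends chat requests to the backend route shown above;
  // "/api/chat" is an assumed path.
  transport: new DefaultChatTransport({ api: "/api/chat" }),
}),
// ...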

Other JavaScript frameworks use a very similar setup; for example, see our Hono example.

If your backend is written in another language, you can't use the Vercel AI SDK, or you can't set up a backend at all, there are several alternative ways to integrate BlockNote AI:

Data Stream Protocol

BlockNote AI expects your backend to respond with Server-Sent Events (SSE) data streams according to the Data Stream Protocol, as specified by the Vercel AI SDK. You can use this information to develop custom backends for your use case, for example to provide compatible API endpoints implemented in a different language such as Python.
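
To illustrate the wire format, here is a minimal, hypothetical route handler that emits a hard-coded text reply as an SSE stream. The part types and headers shown follow the AI SDK's stream protocol documentation, which remains the authoritative reference; a real backend would also stream tool-call parts so that BlockNote can apply document edits. The same shape applies regardless of implementation language:

export async function POST(_req: Request) {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      // Each protocol part is an SSE "data:" line with a JSON body.
      const send = (part: unknown) =>
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(part)}\n\n`));
      send({ type: "start" });
      send({ type: "text-start", id: "t1" });
      send({ type: "text-delta", id: "t1", delta: "Hello from a custom backend" });
      send({ type: "text-end", id: "t1" });
      send({ type: "finish" });
      // The stream is terminated with a literal [DONE] marker.
      controller.enqueue(encoder.encode("data: [DONE]\n\n"));
      controller.close();
    },
  });
  return new Response(stream, {
    headers: {
      "Content-Type": "text/event-stream",
      "x-vercel-ai-ui-message-stream": "v1",
    },
  });
}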

Custom transport

Instead of modifying your backend to support the Data Stream Protocol, you can also implement a custom transport layer in the client application. The transport layer determines how AI SDK requests are sent to your backend to retrieve an LLM response.
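
As a sketch, a custom transport implements the AI SDK's ChatTransport interface: sendMessages calls your backend in whatever request format it expects and returns the response as a stream of UIMessageChunk parts. The /api/my-llm route and the adaptToUIMessageChunks helper below are hypothetical placeholders for your own code:

import type { ChatTransport, UIMessage, UIMessageChunk } from "ai";

// Hypothetical helper you would implement: adapts your backend's
// response format into a stream of UIMessageChunk parts.
declare function adaptToUIMessageChunks(
  res: Response,
): ReadableStream<UIMessageChunk>;

class MyBackendTransport implements ChatTransport<UIMessage> {
  async sendMessages(
    options: Parameters<ChatTransport<UIMessage>["sendMessages"]>[0],
  ): Promise<ReadableStream<UIMessageChunk>> {
    // Call your backend in whatever request format it expects.
    const res = await fetch("/api/my-llm", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages: options.messages }),
      signal: options.abortSignal,
    });
    return adaptToUIMessageChunks(res);
  }

  async reconnectToStream(): Promise<ReadableStream<UIMessageChunk> | null> {
    // Resuming interrupted streams is not supported in this sketch.
    return null;
  }
}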

ClientSideTransport

BlockNote AI also provides a ClientSideTransport class that can be used to connect directly to LLMs without routing through a backend. To use this transport, create a Vercel AI SDK Provider and LanguageModel directly on the client. Then, use this to instantiate the transport and pass it to the AI Extension.

The example below uses the OpenAI Compatible provider, but you can use any provider / model you want.

import { createOpenAICompatible } from "@ai-sdk/openai-compatible";
import { ClientSideTransport } from "@blocknote/xl-ai";

const model = createOpenAICompatible({
  apiKey: "your-api-key",
  baseURL: "https://your-provider",
})("model-id");

// ...
createAIExtension({
  transport: new ClientSideTransport({
    model,
  }),
}),
// ...
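
For completeness, here is how this might be wired into the editor, assuming the useCreateBlockNote hook with an extensions array (check the BlockNote AI setup docs for the exact options in your version):

import { useCreateBlockNote } from "@blocknote/react";
import { createAIExtension, ClientSideTransport } from "@blocknote/xl-ai";

const editor = useCreateBlockNote({
  extensions: [
    createAIExtension({
      transport: new ClientSideTransport({ model }),
    }),
  ],
});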

With a proxy server

Often, you cannot call your LLM provider directly from your client application using ClientSideTransport, because you need to hide API keys or avoid CORS issues. In that case, you can route requests through a proxy server, which injects your API keys and forwards the request to your LLM provider.

BlockNote AI provides a fetchViaProxy function that can be used to create a fetch function that routes requests through a proxy server (the example below uses Groq as the LLM provider):

import { createGroq } from "@ai-sdk/groq";
import { fetchViaProxy } from "@blocknote/xl-ai";

const model = createGroq({
  fetch: fetchViaProxy(
    (url) => `${BASE_URL}/proxy?provider=groq&url=${encodeURIComponent(url)}`,
  ),
  apiKey: "fake-api-key", // the API key is not used as it's actually added in the proxy server
})("llama-3.3-70b-versatile");
  • See a full example of how to use ClientSideTransport with a proxy server.
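
The proxy endpoint itself can be small: read the target URL from the query string, inject the real API key, and forward the request. A hypothetical Next.js route handler is sketched below; the GROQ_API_KEY variable and the host check are assumptions, and you should validate the target URL so the endpoint cannot be abused as an open proxy:

export async function POST(req: Request) {
  const target = new URL(req.url).searchParams.get("url");
  // Only forward to known provider hosts, to avoid acting as an open proxy.
  if (!target || new URL(target).host !== "api.groq.com") {
    return new Response("Bad request", { status: 400 });
  }
  const upstream = await fetch(target, {
    method: "POST",
    headers: {
      "Content-Type": req.headers.get("Content-Type") ?? "application/json",
      // The real API key is injected here, server-side.
      Authorization: `Bearer ${process.env.GROQ_API_KEY}`,
    },
    body: await req.text(),
  });
  // Stream the provider's response (including SSE) back to the client.
  return new Response(upstream.body, {
    status: upstream.status,
    headers: {
      "Content-Type": upstream.headers.get("Content-Type") ?? "text/event-stream",
    },
  });
}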

Advanced patterns

You can connect BlockNote AI features with more advanced AI pipelines, integrating concepts like agents, RAG (Retrieval-Augmented Generation), or multi-step LLM calls. Make sure you always expose the BlockNote tools (as passed via toolDefinitions in the request body) to the LLM and forward their invocations to the client, as in the sketch below.
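
As one sketch of such a pipeline, the backend route from the default setup could merge its own tools (here, a retrieval tool for RAG) with the BlockNote tools and allow multi-step calls. The searchDocs tool and search function are hypothetical stand-ins for your own retrieval code; stopWhen/stepCountIs follow the AI SDK's multi-step API:

import { openai } from "@ai-sdk/openai";
import { convertToModelMessages, stepCountIs, streamText, tool } from "ai";
import { toolDefinitionsToToolSet } from "@blocknote/xl-ai";
import { z } from "zod";

// Hypothetical retrieval function; replace with your own search code.
async function search(query: string): Promise<string[]> {
  return [];
}

export async function POST(req: Request) {
  const { messages, toolDefinitions } = await req.json();

  const result = streamText({
    model: openai("gpt-4.1"),
    messages: convertToModelMessages(messages),
    tools: {
      // Always expose the BlockNote tools so document edits reach the client.
      ...toolDefinitionsToToolSet(toolDefinitions),
      searchDocs: tool({
        description: "Search the knowledge base for relevant passages",
        inputSchema: z.object({ query: z.string() }),
        execute: async ({ query }) => search(query),
      }),
    },
    // Allow the model to retrieve first, then edit the document.
    stopWhen: stepCountIs(5),
  });

  return result.toUIMessageStreamResponse();
}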

We'd love to hear about your integrations and to collaborate on advanced AI patterns. For dedicated support on integrating your pipeline and application with BlockNote AI, get in touch.

  • By default, BlockNote AI composes the LLM request (messages) based on the user's prompt and passes these to your backend. See this example for a setup where composing the LLM request (prompt building) is delegated to the server.