BlockNote AI Reference

createAIExtension

Use createAIExtension to create a new AI Extension that can be registered to an editor when calling useCreateBlockNote.

// Usage:
const aiExtension = createAIExtension(opts);

// Definitions:
function createAIExtension(options: AIExtensionOptions): (editor: BlockNoteEditor) => AIExtension;

type AIExtensionOptions = LLMRequestHelpers & {
  /**
   * The name and color of the agent cursor when the AI is writing
   * @default { name: "AI", color: "#8bc6ff" }
   */
  agentCursor?: { name: string; color: string };
};

type LLMRequestHelpers = {
  /**
   * Transport used by the AI SDK to send requests to your backend/LLM.
   * Implement to customize backend URLs or use a different transport (e.g. websockets).
   */
  transport?: ChatTransport<UIMessage>;

  /**
   * Customize which stream tools are available to the LLM.
   * Provide this in createAIExtension(options) or override it per call
   * via LLMRequestOptions.
   *
   * @default llmFormats.html.getStreamToolsProvider()
   */
  streamToolsProvider?: StreamToolsProvider<any, any>;

  /**
   * Extra options (headers/body/metadata) forwarded to the AI SDK request.
   */
  chatRequestOptions?: ChatRequestOptions;

  /**
   * Responsible for submitting a BlockNote AIRequest to the AI SDK.
   * Use to transform the messages sent to the LLM.
   *
   * @default defaultAIRequestSender(llmFormats.html.defaultPromptBuilder, llmFormats.html.defaultPromptInputDataBuilder)
   */
  aiRequestSender?: AIRequestSender;
};
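Putting this together, a minimal setup might look as follows. This is a sketch: the package names, the /api/chat route, and the use of the AI SDK's DefaultChatTransport are assumptions for illustration.

```typescript
import { useCreateBlockNote } from "@blocknote/react";
import { createAIExtension } from "@blocknote/xl-ai";
import { DefaultChatTransport } from "ai";

// Register the AI extension when creating the editor.
const editor = useCreateBlockNote({
  extensions: [
    createAIExtension({
      // Point the AI SDK at your own backend route (assumed path).
      transport: new DefaultChatTransport({ api: "/api/chat" }),
      // Optional: customize the agent cursor shown while the AI writes.
      agentCursor: { name: "Assistant", color: "#8bc6ff" },
    }),
  ],
});
```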

getAIExtension

Use getAIExtension to retrieve the AI extension instance registered to the editor:

function getAIExtension(editor: BlockNoteEditor): AIExtension;

AIExtension

The AIExtension class is the main class for the AI extension. It exposes state and methods to interact with BlockNote's AI features.

class AIExtension {
  /**
   * Execute a call to an LLM and apply the result to the editor
   */
  callLLM(opts: LLMRequestOptions): Promise<void>;

  /**
   * Returns a read-only zustand store with the state of the AI Menu
   */
  get store(): ReadonlyStoreApi<{
    aiMenuState:
      | ({
          /**
           * The ID of the block that the AI menu is opened at.
           * This changes as the AI is making changes to the document
           */
          blockId: string;
        } & (
          | {
              status: "error";
              error: any;
            }
          | {
              status:
                | "user-input"
                | "thinking"
                | "ai-writing"
                | "user-reviewing";
            }
        ))
      | "closed";
  }>;

  /**
   * Returns a zustand store with the global configuration of the AI Extension.
   * These options are used by default across all LLM calls when calling {@link callLLM}
   */
  readonly options: StoreApi<LLMRequestHelpers>;

  /** Open the AI menu at a specific block */
  openAIMenuAtBlock(blockID: string): void;
  /** Close the AI menu */
  closeAIMenu(): void;
  /** Accept the changes made by the LLM */
  acceptChanges(): void;
  /** Reject the changes made by the LLM */
  rejectChanges(): void;
  /** Retry the previous LLM call (only valid when status is "error") */
  retry(): Promise<void>;
  /** Advanced: manually update the status shown by the AI menu */
  setAIResponseStatus(
    status:
      | "user-input"
      | "thinking"
      | "ai-writing"
      | "user-reviewing"
      | { status: "error"; error: any },
  ): void;
}
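For example, you could watch the menu state to react when the user needs to review changes. This is a sketch; it assumes an existing editor with the AI extension registered and the @blocknote/xl-ai import path.

```typescript
import { getAIExtension } from "@blocknote/xl-ai";

const ai = getAIExtension(editor);

// Subscribe to AI menu state changes via the read-only zustand store.
const unsubscribe = ai.store.subscribe((state) => {
  if (
    state.aiMenuState !== "closed" &&
    state.aiMenuState.status === "user-reviewing"
  ) {
    // The LLM is done writing; the user can now accept or reject.
    console.log("Review pending at block", state.aiMenuState.blockId);
  }
});
```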

LLMRequestOptions

Requests to an LLM are made by calling callLLM on the AIExtension object. This takes an LLMRequestOptions object as an argument.

type LLMRequestOptions = {
  /** The user prompt */
  userPrompt: string;

  /** Whether to use the editor selection for the LLM call (default: true) */
  useSelection?: boolean;

  /**
   * If the user's cursor is in an empty paragraph, automatically delete it when the AI starts writing.
   * Used when typing `/ai` in an empty block. (default: true)
   */
  deleteEmptyCursorBlock?: boolean;
} & LLMRequestHelpers; // Optionally override helpers per request
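For example, a request against the current selection could look like this (a sketch; assumes an editor with the AI extension registered):

```typescript
import { getAIExtension } from "@blocknote/xl-ai";

const ai = getAIExtension(editor);

// Run an LLM request against the current selection; the AI menu state
// updates automatically as the response streams in.
await ai.callLLM({
  userPrompt: "Translate the selected text to French",
  useSelection: true,
});
```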

getStreamToolsProvider

When an LLM is called, it needs to interpret the document and invoke operations to modify it. Use a format's getStreamToolsProvider to obtain the tools the LLM may call while editing. In most cases, use llmFormats.html.getStreamToolsProvider(...).

/** Returns a provider for the stream tools available to the LLM */
type getStreamToolsProvider = (
  // Whether to add artificial delays between document edits,
  // or apply them immediately as they stream in from the LLM
  // (default: true)
  withDelays: boolean,
  // The stream tools to enable; there are separate tools for adding, updating, and deleting blocks
  // (default: { add: true, update: true, delete: true })
  defaultStreamTools?: {
    add?: boolean;
    update?: boolean;
    delete?: boolean;
  },
) => StreamToolsProvider;
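For example, to let the LLM add and update blocks but never delete them, and to apply edits immediately without the typing animation (a sketch using the positional signature above; the import path is an assumption):

```typescript
import { llmFormats } from "@blocknote/xl-ai";

const streamToolsProvider = llmFormats.html.getStreamToolsProvider(
  false, // withDelays: apply edits as they stream in
  { add: true, update: true, delete: false },
);

// Pass this to createAIExtension(options), or per call via LLMRequestOptions.
```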

AIRequest and AIRequestSender (advanced)

The AIRequest models a single AI operation against the editor (prompt, selection, tools). The AIRequestSender is responsible for submitting that request to the AI SDK layer.

type AIRequest = {
  editor: BlockNoteEditor;
  chat: Chat<UIMessage>;
  userPrompt: string;
  selectedBlocks?: Block[];
  emptyCursorBlockToDelete?: string;
  streamTools: StreamTool<any>[];
};

type AIRequestSender = {
  sendAIRequest: (
    aiRequest: AIRequest,
    options: ChatRequestOptions,
  ) => Promise<void>;
};

The default AIRequestSender is defaultAIRequestSender(llmFormats.html.defaultPromptBuilder, llmFormats.html.defaultPromptInputDataBuilder). It uses the default prompt builder (see below) to construct the updated messages array from an AIRequest and submits it to the AI SDK.

PromptBuilder (advanced)

A PromptBuilder lets you fine-tune the messages sent to the LLM. It mutates the AI SDK UIMessage[] in place based on the user prompt and format-specific input data, which is produced by a paired PromptInputDataBuilder.

We recommend forking the default PromptBuilder as a starting point.

// Mutates the messages based on format-specific input data
export type PromptBuilder<E> = (
  messages: UIMessage[],
  inputData: E,
) => Promise<void>;

// Builds the input data passed to the PromptBuilder from a BlockNote AIRequest
export type PromptInputDataBuilder<E> = (aiRequest: AIRequest) => Promise<E>;

// Create an AIRequestSender from your custom builders.
// This lets you plug your PromptBuilder into the request pipeline used by callLLM/executeAIRequest.
function defaultAIRequestSender<E>(
  promptBuilder: PromptBuilder<E>,
  promptInputDataBuilder: PromptInputDataBuilder<E>,
): AIRequestSender;
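As an illustration, here is a minimal custom PromptBuilder over a hypothetical HTML input-data shape. The UIMessage type below is a reduced stand-in for the AI SDK's type, and the input-data shape is invented for the example:

```typescript
// Reduced stand-in for the AI SDK's UIMessage type.
type UIMessage = {
  id: string;
  role: "system" | "user" | "assistant";
  parts: { type: "text"; text: string }[];
};

type PromptBuilder<E> = (
  messages: UIMessage[],
  inputData: E,
) => Promise<void>;

// Hypothetical input data a paired PromptInputDataBuilder could produce.
type HtmlPromptInputData = {
  userPrompt: string;
  htmlSelectedBlocks: string;
};

// Appends the selected document fragment and the user's request
// to the conversation as a single user message.
const customPromptBuilder: PromptBuilder<HtmlPromptInputData> = async (
  messages,
  inputData,
) => {
  messages.push({
    id: `prompt-${messages.length}`,
    role: "user",
    parts: [
      {
        type: "text",
        text:
          `Selected document fragment (HTML):\n` +
          `${inputData.htmlSelectedBlocks}\n\n` +
          `User request: ${inputData.userPrompt}`,
      },
    ],
  });
};
```

Paired with a matching PromptInputDataBuilder, such a builder can be passed to the defaultAIRequestSender factory above.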

Lower-level functions (advanced)

The callLLM function automatically passes the default options set in the AIExtension to the LLM request. It also handles the LLM response and updates the state of the AI menu accordingly.

For advanced use cases, you can use the lower-level buildAIRequest and executeAIRequest functions to issue an LLM request yourself.

buildAIRequest

Use buildAIRequest to assemble an AIRequest from editor state and configuration.

function buildAIRequest(opts: {
  editor: BlockNoteEditor;
  chat: Chat<UIMessage>;
  userPrompt: string;
  useSelection?: boolean;
  deleteEmptyCursorBlock?: boolean;
  streamToolsProvider?: StreamToolsProvider<any, any>;
  onBlockUpdated?: (blockId: string) => void;
}): AIRequest;

executeAIRequest

Use executeAIRequest to send an AIRequest with an AIRequestSender and process the streaming tool calls.

function executeAIRequest(opts: {
  aiRequest: AIRequest;
  sender: AIRequestSender;
  chatRequestOptions?: ChatRequestOptions;
  onStart?: () => void;
}): Promise<void>;
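Combining the two might look as follows. This is a sketch: the import path is an assumption, and `editor` and `chat` (an AI SDK Chat instance) are assumed to already exist.

```typescript
import {
  buildAIRequest,
  executeAIRequest,
  defaultAIRequestSender,
  llmFormats,
} from "@blocknote/xl-ai";

// Assemble the request from editor state and configuration.
const aiRequest = buildAIRequest({
  editor,
  chat,
  userPrompt: "Summarize the selected blocks",
  useSelection: true,
  streamToolsProvider: llmFormats.html.getStreamToolsProvider(),
});

// Send it and apply the streamed tool calls to the document.
await executeAIRequest({
  aiRequest,
  sender: defaultAIRequestSender(
    llmFormats.html.defaultPromptBuilder,
    llmFormats.html.defaultPromptInputDataBuilder,
  ),
  onStart: () => console.log("LLM call started"),
});
```

Note that, unlike callLLM, this path does not update the AI menu state for you.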