
Edge Runtime

Overview

The Edge Runtime is assistant-ui's native runtime with the most built-in features. It is the recommended runtime for new projects.

The Edge Runtime can connect to a variety of backends, including:

  • Edge Runtime backends
  • Vercel AI SDK backends (using Data Stream protocol)

Getting Started

Create a Next.js project

npx create-next-app@latest my-app
cd my-app

Install @assistant-ui/react

npm install @assistant-ui/react

Define a MyRuntimeProvider component

Update the MyModelAdapter below to integrate with your own custom API.

@/app/MyRuntimeProvider.tsx
"use client";
 
import type {  } from "react";
import {
  ,
  ,
  type ,
} from "@assistant-ui/react";
 
const :  = {
  async ({ ,  }) {
    // TODO replace with your own API
    const  = await ("<YOUR_API_ENDPOINT>", {
      : "POST",
      : {
        "Content-Type": "application/json",
      },
      // forward the messages in the chat to the API
      : .({
        ,
      }),
      // if the user hits the "cancel" button or escape keyboard key, cancel the request
      : ,
    });
 
    const  = await .();
    return {
      : [
        {
          : "text",
          : .text,
        },
      ],
    };
  },
};
 
export function ({
  ,
}: <{
  : ;
}>) {
  const  = ();
 
  return (
    < ={}>
      {}
    </>
  );
}
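The adapter above POSTs `{ messages }` as JSON and reads a `{ text }` field back from the response. To make that contract concrete, here is a minimal sketch of a matching Next.js route handler; the route path (`app/api/chat/route.ts`), the `Message` type, and the `generateReply` helper are illustrative placeholders, not part of assistant-ui — replace `generateReply` with a call to your actual model provider.

```typescript
// app/api/chat/route.ts — hypothetical endpoint matching MyModelAdapter
type Message = {
  role: string;
  content: { type: string; text?: string }[];
};

// Placeholder for your model call — swap in your LLM provider SDK here.
async function generateReply(messages: Message[]): Promise<string> {
  const last = messages[messages.length - 1];
  const userText = last?.content.find((p) => p.type === "text")?.text ?? "";
  return `You said: ${userText}`;
}

export async function POST(request: Request) {
  const { messages } = (await request.json()) as { messages: Message[] };
  const text = await generateReply(messages);
  // MyModelAdapter reads `data.text` from this JSON response
  return Response.json({ text });
}
```

Because the adapter forwards `abortSignal` to `fetch`, cancelling in the UI aborts this request server-side as well.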

Wrap your app in MyRuntimeProvider

@/app/layout.tsx
import type { ReactNode } from "react";
import { MyRuntimeProvider } from "@/app/MyRuntimeProvider";

export default function RootLayout({
  children,
}: Readonly<{
  children: ReactNode;
}>) {
  return (
    <MyRuntimeProvider>
      <html lang="en">
        <body>{children}</body>
      </html>
    </MyRuntimeProvider>
  );
}

Streaming

Declare the run function as an AsyncGenerator (async *run). This allows you to yield the results as they are generated.

@/app/MyRuntimeProvider.tsx
const MyModelAdapter: ChatModelAdapter = {
  async *run({ messages, abortSignal, context }) {
    const stream = await backendApi({ messages, abortSignal, context });

    let text = "";
    for await (const part of stream) {
      text += part.choices[0]?.delta?.content || "";

      yield {
        content: [{ type: "text", text }],
      };
    }
  },
};
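The `backendApi` call above is a stand-in for your own streaming backend. The loop only assumes it returns an async iterable of chunks in the OpenAI-style delta shape (`choices[0]?.delta?.content`). The following self-contained sketch shows that contract with a hard-coded stub generator — in practice you would return the stream from your provider's SDK instead — and accumulates text exactly as the adapter's `run` does:

```typescript
type StreamChunk = { choices: { delta: { content?: string } }[] };

// Stub backend: yields delta chunks in the shape the for-await loop reads.
// A real implementation would return your LLM provider's stream here.
async function* backendApi(_opts: {
  abortSignal?: AbortSignal;
}): AsyncGenerator<StreamChunk> {
  for (const piece of ["Hel", "lo ", "world"]) {
    yield { choices: [{ delta: { content: piece } }] };
  }
}

// Accumulate the deltas into the full text, as run() does before each yield.
async function collect(): Promise<string> {
  let text = "";
  for await (const chunk of backendApi({})) {
    text += chunk.choices[0]?.delta?.content || "";
  }
  return text;
}
```

Note that each `yield` in the adapter emits the full accumulated text so far, not just the latest delta.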
