Full-Stack Integration
Integrate Mastra directly into your Next.js application's API routes. This approach keeps your backend and frontend code within the same project.
Initialize Assistant UI
Start by setting up Assistant UI in your project. Run one of the following commands:
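If you're unsure of the exact commands, they typically look like the following (the package name and flags may differ slightly depending on the current Assistant UI release):

```bash
# Create a new Next.js project preconfigured with Assistant UI
npx assistant-ui@latest create

# Or add Assistant UI to an existing Next.js project
npx assistant-ui@latest init
```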
This command installs necessary dependencies and creates basic configuration files, including a default chat API route.
Need Help?
For detailed setup instructions, including adding API keys, basic configuration, and manual setup steps, please refer to the main Getting Started guide.
Review Initial API Route
The initialization command creates a basic API route at `app/api/chat/route.ts` (or `src/app/api/chat/route.ts`). It typically looks like this:
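As a rough sketch, the generated route usually resembles the following (your file may differ slightly depending on the installed versions):

```ts
// app/api/chat/route.ts
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Stream a completion from OpenAI via the Vercel AI SDK
  const result = streamText({
    model: openai("gpt-4o"),
    messages,
  });

  return result.toDataStreamResponse();
}
```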
This default route uses the Vercel AI SDK directly with OpenAI. In the following steps, we will modify this route to integrate Mastra.
Install Mastra Packages
Add the Mastra core, memory, and AI SDK OpenAI provider packages to your project:
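With npm, for example (use the equivalent command for pnpm, yarn, or bun):

```bash
npm install @mastra/core @mastra/memory @ai-sdk/openai
```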
Configure Next.js
To ensure Next.js correctly bundles your application when using Mastra directly in API routes, you need to configure `serverExternalPackages`. Update your `next.config.mjs` (or `next.config.js`) file to include `@mastra/*`:
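A minimal example, assuming a recent Next.js version where `serverExternalPackages` is a top-level option (older versions use `experimental.serverComponentsExternalPackages` instead):

```js
// next.config.mjs
/** @type {import('next').NextConfig} */
const nextConfig = {
  // Treat Mastra packages as external dependencies on the server
  serverExternalPackages: ["@mastra/*"],
};

export default nextConfig;
```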
This tells Next.js to treat Mastra packages as external dependencies on the server-side.
Create Mastra Files
Set up the basic folder structure for your Mastra configuration. Create a `mastra` folder (e.g., in your `src` or root directory) with the following structure:
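One layout that matches the file paths used in the rest of this guide:

```
mastra/
├── agents/
│   └── chefAgent.ts
└── index.ts
```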
You can create these files and folders manually or use the following commands in your terminal:
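For example, if your Mastra folder lives under `src/` (adjust the paths if you keep it in the project root):

```bash
mkdir -p src/mastra/agents
touch src/mastra/index.ts src/mastra/agents/chefAgent.ts
```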
These files will be used in the next steps to define your Mastra agent and configuration.
Define the Agent
Now, let's define the behavior of our AI agent. Open the `mastra/agents/chefAgent.ts` file and add the following code:
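A minimal sketch of the agent definition is shown below. The persona text is just an example, and the exact `Agent` import path may vary between Mastra versions:

```ts
// mastra/agents/chefAgent.ts
import { openai } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

export const chefAgent = new Agent({
  name: "chef-agent",
  instructions:
    "You are a practical and experienced home chef. " +
    "Help people cook great meals with whatever ingredients they have available.",
  // GPT-4o Mini via the AI SDK's OpenAI provider
  model: openai("gpt-4o-mini"),
});
```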
This code creates a new Mastra `Agent` named `chef-agent`.

- `instructions`: Defines the agent's persona and primary goal.
- `model`: Specifies the language model the agent will use (in this case, OpenAI's GPT-4o Mini via the AI SDK).
Make sure you have set up your OpenAI API key as described in the Getting Started guide.
Register the Agent
Next, register the agent with your Mastra instance. Open the `mastra/index.ts` file and add the following code:
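A minimal registration might look like this (the key under `agents` should match the name you look up later with `getAgent`):

```ts
// mastra/index.ts
import { Mastra } from "@mastra/core";

import { chefAgent } from "./agents/chefAgent";

// Register the agent so it can be retrieved as "chefAgent" elsewhere
export const mastra = new Mastra({
  agents: { chefAgent },
});
```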
This code initializes Mastra and makes the `chefAgent` available for use in your application's API routes.
Modify the API Route
Now, update your API route (`app/api/chat/route.ts`) to use the Mastra agent you just configured. Replace the existing content with the following:
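A sketch of the updated route, assuming the `@/mastra` path alias points at your Mastra folder (adjust the import path for your setup):

```ts
// app/api/chat/route.ts
import { mastra } from "@/mastra";

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Look up the agent registered in mastra/index.ts
  const agent = mastra.getAgent("chefAgent");

  // Let the agent process the chat history and stream the reply
  const stream = await agent.stream(messages);

  return stream.toDataStreamResponse();
}
```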
Key changes:

- We import the `mastra` instance created in `mastra/index.ts`. Make sure the import path (`@/mastra`) is correct for your project setup (you might need `~/mastra`, `../../../mastra`, etc., depending on your path aliases and project structure).
- We retrieve the `chefAgent` using `mastra.getAgent("chefAgent")`.
- Instead of calling the AI SDK's `streamText` directly, we call `agent.stream(messages)` to process the chat messages using the agent's configuration and model.
- The result is still returned in a format compatible with Assistant UI using `toDataStreamResponse()`.
Your API route is now powered by Mastra!
Run the Application
You're all set! Start your Next.js development server:
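For example, with npm:

```bash
npm run dev
```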
Open your browser to `http://localhost:3000` (or the port specified in your terminal). You should now be able to interact with your `chefAgent` through the Assistant UI chat interface. Ask it for cooking advice based on ingredients you have!
Congratulations! You have successfully integrated Mastra into your Next.js application using the full-stack approach. Your Assistant UI frontend now communicates with a Mastra agent running in your Next.js backend API route.
To explore more advanced Mastra features like memory, tools, workflows, and more, please refer to the official Mastra documentation.