Getting Started
Requirements
You need a LangGraph Cloud API server. You can start a server locally via LangGraph Studio or use LangSmith for a hosted version.
The state of the graph you are using must have a `messages` key containing a list of LangChain-compatible messages.
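For illustration, a minimal sketch of the state shape the runtime expects. The type names below are assumptions for readability, not actual SDK exports:

```typescript
// Sketch of a graph state with the required `messages` key.
// These type names are illustrative, not the SDK's actual exports.
type LangChainLikeMessage = {
  type: "human" | "ai" | "tool" | "system"; // LangChain message role
  content: string;
  id?: string;
};

type GraphState = {
  messages: LangChainLikeMessage[]; // the required `messages` key
};

const state: GraphState = {
  messages: [
    { type: "human", content: "What is the weather in Tokyo?" },
    { type: "ai", content: "Let me check that for you." },
  ],
};
```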
New project from template
Installation in existing React project
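Assuming an existing React project, the integration packages can typically be installed as shown below (verify the package names against the current docs before running):

```shell
npm install @assistant-ui/react @assistant-ui/react-langgraph @langchain/langgraph-sdk
```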
Set up a proxy backend endpoint (optional, for production)
This example forwards every request from the browser directly to the LangGraph server. For production use cases, you should restrict the API calls to the subset of endpoints you need and perform authorization checks.
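One way to do that restriction is a small allowlist check in the proxy before forwarding. The path patterns and the authentication flag below are illustrative assumptions, not part of any SDK:

```typescript
// Sketch of a proxy-side guard: only forward requests for an allowlisted
// set of endpoints, and require an authenticated session.
// The path patterns and auth check are illustrative assumptions.
const ALLOWED_PATHS = [
  /^\/threads$/, // create a thread
  /^\/threads\/[\w-]+\/state$/, // read thread state
  /^\/threads\/[\w-]+\/runs\/stream$/, // stream a run
];

function shouldForward(path: string, isAuthenticated: boolean): boolean {
  if (!isAuthenticated) return false;
  return ALLOWED_PATHS.some((pattern) => pattern.test(path));
}
```

With this guard, `shouldForward("/threads", true)` passes, while an unauthenticated request or an unlisted path is rejected.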
Set up helper functions
Define a `MyAssistant` component
Use the `MyAssistant` component
Set up UI components
Follow the UI Components guide to set up the UI components.
Advanced APIs
Message Accumulator
The `LangGraphMessageAccumulator` lets you append messages arriving from the server to replicate the message state on the client.
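To illustrate the behavior, here is a simplified sketch of message accumulation, not the library's actual implementation: incoming messages are upserted by id, so streamed updates to the same message replace the earlier partial version.

```typescript
// Simplified sketch of message accumulation: messages sent by the server
// are upserted by id, so a streamed update with the same id overwrites
// the earlier version. Not the actual LangGraphMessageAccumulator.
type Message = { id: string; type: string; content: string };

class SimpleMessageAccumulator {
  private messages: Message[] = [];

  addMessages(incoming: Message[]): Message[] {
    for (const message of incoming) {
      const index = this.messages.findIndex((m) => m.id === message.id);
      if (index >= 0) {
        this.messages[index] = message; // replace streamed update
      } else {
        this.messages.push(message); // append new message
      }
    }
    return this.messages;
  }
}
```

Appending `{ id: "1", content: "Hel" }` and then `{ id: "1", content: "Hello" }` leaves a single message with the final content.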
Message Conversion
Use `convertLangChainMessages` to transform LangChain messages into the assistant-ui format.
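As a rough sketch of the transformation: LangChain roles such as `human` and `ai` map to assistant-ui roles such as `user` and `assistant`, and string content becomes a list of content parts. The exact input and output shapes here are simplified assumptions; use the actual `convertLangChainMessages` export in practice.

```typescript
// Sketch of converting a LangChain-style message to an assistant-ui-style
// message. The shapes below are simplified assumptions, not the
// library's actual types.
type LangChainMessage = { type: "human" | "ai"; content: string };
type AssistantUIMessage = {
  role: "user" | "assistant";
  content: { type: "text"; text: string }[];
};

function convertMessage(message: LangChainMessage): AssistantUIMessage {
  return {
    role: message.type === "human" ? "user" : "assistant",
    content: [{ type: "text", text: message.content }],
  };
}
```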
Interrupt Persistence
LangGraph supports interrupting the execution flow to request user input or handle specific interactions. These interrupts can be persisted and restored when switching between threads. This means that if a user switches away from a thread during an interaction (like waiting for user approval), the interaction state will be preserved when they return to that thread.
To handle interrupts in your application:
- Make sure your thread state type includes the `interrupts` field
- Return the interrupts from `onSwitchToThread` along with the messages
- The runtime will automatically restore the interrupt state when switching threads
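The steps above can be sketched in plain TypeScript. Everything here except the `messages` and `interrupts` fields is hypothetical, including the `fetchThreadState` helper:

```typescript
// Sketch: a thread's persisted state carries both messages and any
// pending interrupts, and both are returned when switching threads.
type Interrupt = { value: unknown }; // e.g. a pending approval request
type ThreadState = {
  messages: { id: string; type: string; content: string }[];
  interrupts: Interrupt[]; // pending interrupts to restore
};

async function onSwitchToThread(threadId: string) {
  const state: ThreadState = await fetchThreadState(threadId);
  // Returning interrupts alongside messages lets the runtime restore
  // any in-progress interaction (e.g. an approval prompt).
  return { messages: state.messages, interrupts: state.interrupts };
}

// Hypothetical stand-in for a real state-fetching helper.
async function fetchThreadState(threadId: string): Promise<ThreadState> {
  return { messages: [], interrupts: [{ value: { question: "Approve?" } }] };
}
```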
This feature is particularly useful for applications that require user approval flows, multi-step forms, or any other interactive elements that might span multiple thread switches.