The Agent-User Interaction (AG-UI) protocol standardizes how frontend applications talk to AI agents – including streaming, tools, shared state, and custom events. AG2 provides a lightweight integration via autogen.ag_ui.AGUIStream, letting you connect a ConversableAgent to any AG-UI-compatible frontend.
For background on the protocol, see the AG-UI Protocol introduction.
When to use AG-UI with AG2#
Use the AG-UI integration when:
- You already have (or plan to build) a web UI based on the AG-UI protocol
- You want streaming responses and tool events that the frontend can render in real-time
- You need to mix backend tools (Python functions) with frontend tools (UI actions / Generative UI tools) in one coherent protocol
- You want to reuse existing AG-UI debugging tooling (e.g. Dojo) while keeping your agent logic in AG2
Generally, we recommend using AG-UI whenever you need to build a rich interactive UI for your agent. In less than an hour you can create your own ChatGPT-like web application.
Installation#
To use the AG-UI integration, install AG2 with the ag-ui extra, pulling in the official ag-ui-protocol package:
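For example, assuming the extra is published under the name `ag-ui`:

```shell
pip install "ag2[ag-ui]"
```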
Fast integration: Build an ASGI endpoint#
If you are using Starlette or a Starlette-based framework, like FastAPI, AGUIStream can build an ASGI endpoint class for you.
Because the class returned by build_asgi() is a standard Starlette HTTPEndpoint, you can plug it into or mount it in any ASGI application that accepts ASGI routes (for example, FastAPI, Starlette itself, or other Starlette-based frameworks).
This gives you a ready-to-use endpoint compatible with AG-UI frontends:
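For example, a minimal sketch that mounts the generated endpoint in a Starlette app, assuming build_asgi() takes no arguments (the agent name and llm_config values are placeholders):

```python
from autogen import ConversableAgent
from autogen.ag_ui import AGUIStream
from starlette.applications import Starlette
from starlette.routing import Route

# Placeholder agent configuration; use your own model settings
agent = ConversableAgent(name="assistant", llm_config={"model": "gpt-4o-mini"})
stream = AGUIStream(agent)

# build_asgi() returns a standard Starlette HTTPEndpoint class,
# so it can be routed like any other endpoint
app = Starlette(routes=[Route("/chat", stream.build_asgi())])
```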
Advanced example: Manual dispatch#
The most flexible way to integrate AG-UI is to:
- Accept an HTTP request from an AG-UI frontend
- Parse the body into a RunAgentInput
- Call AGUIStream.dispatch
- Stream the encoded SSE (Server-Sent Events) events back to the client
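Putting those steps together, a minimal FastAPI sketch might look like this (the import path for RunAgentInput, the dispatch signature, and the llm_config values are assumptions, not verified API):

```python
from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

from ag_ui.core import RunAgentInput  # assumed import path from ag-ui-protocol
from autogen import ConversableAgent
from autogen.ag_ui import AGUIStream

# Placeholder agent configuration; use your own model settings
agent = ConversableAgent(name="assistant", llm_config={"model": "gpt-4o-mini"})
stream = AGUIStream(agent)

app = FastAPI()

@app.post("/chat")
async def chat(request: Request) -> StreamingResponse:
    # Parse the request body into a RunAgentInput
    run_input = RunAgentInput.model_validate(await request.json())
    # dispatch() is assumed to yield already-encoded SSE events
    return StreamingResponse(stream.dispatch(run_input), media_type="text/event-stream")
```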
You can then run this using any ASGI server:
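For example with uvicorn, assuming the application above lives in main.py:

```shell
uvicorn main:app --host 0.0.0.0 --port 8000
```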
Your AG-UI frontend can now send RunAgentInput payloads to /chat and consume the streamed events to render messages, tools, and state updates.
This approach also lets you add extra logic to the endpoint, for example a logging, caching, or rate-limiting layer.
Authentication#
You can protect your endpoint with a simple authentication layer, such as the token-based check shown below, reusing whatever scheme you already use in your other endpoints.
Tools Context#
In some cases you may want to provide additional context to your agent, such as the user's ID, a session ID, a user profile, etc.
You can pass this information into your agent's tools in the standard ContextVariables way, by declaring it as a parameter of the tool function.
To pass this context to your agent, use the context parameter of the dispatch method.
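A sketch of the idea, assuming dispatch() accepts a context keyword argument and that ContextVariables is importable as shown (both unverified assumptions):

```python
from autogen import ConversableAgent
from autogen.agentchat.group import ContextVariables  # assumed import path
from autogen.ag_ui import AGUIStream

def get_user_profile(context_variables: ContextVariables) -> str:
    # context_variables is injected by AG2 at call time, not supplied by the LLM
    return f"Current user: {context_variables.get('user_id')}"

# Placeholder agent configuration; use your own model settings
agent = ConversableAgent(name="assistant", llm_config={"model": "gpt-4o-mini"})
agent.register_for_llm(description="Return the current user's profile")(get_user_profile)

stream = AGUIStream(agent)

# Per-request context is passed through dispatch (assumed signature):
# stream.dispatch(run_input, context=ContextVariables(data={"user_id": "u-123"}))
```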
Backend vs Frontend tools#
The tools mechanism is one of the most powerful features of AG-UI. It allows you to mix backend tools (Python functions) and frontend tools (UI actions / GenUI tools) in one coherent protocol.
Backend tools (Python functions)#
Backend tools are AG2 tools registered on your ConversableAgent:
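For example, a sketch registering a plain Python function as a tool (the agent name and llm_config values are placeholders):

```python
from autogen import ConversableAgent

# Placeholder agent configuration; use your own model settings
agent = ConversableAgent(name="assistant", llm_config={"model": "gpt-4o-mini"})

@agent.register_for_llm(description="Get the current weather for a city")
@agent.register_for_execution()
def get_weather(city: str) -> str:
    # A plain Python function exposed to the agent as a backend tool
    return f"The weather in {city} is sunny."
```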
AG-UI allows you to capture these tools' calls and render them in the UI to notify the user about backend agent activity.
Frontend tools (UI-driven actions)#
Frontend tools are defined on the AG-UI side and allow you to build flexible and appealing UIs for interacting with your agent.
The most common cases of frontend tools are:
- Generative UI - tools that are able to render UI based on LLM output: custom cards, lists, buttons and others
- HITL (Human in the Loop) - tools that allow you to ask the user for specific input from the UI: buttons, toggles, etc.
You can learn more about frontend tools in the CopilotKit documentation.