Enhanced Swarm Orchestration with AG2
Swarm Orchestration
AG2’s swarm orchestration provides a flexible and powerful method of managing a conversation with multiple agents, tools, and transitions.
In this notebook, we look at more advanced features of the swarm orchestration.
If you are new to swarm, check out this notebook, where we introduce the core features of swarms, including global context variables, handoffs, and initiating a swarm chat.
In this notebook, we're going to demonstrate these features of AG2's swarm orchestration:
- Updating an agent’s state
- Conditional handoffs
- Nested chats
This notebook has been updated as swarms can now accommodate any ConversableAgent.
Install ag2:

`pip install ag2`

Note: If you have been using `autogen` or `pyautogen`, all you need to do is upgrade it with `pip install -U autogen` or `pip install -U pyautogen`, as `pyautogen`, `autogen`, and `ag2` are aliases for the same PyPI package.

For more information, please refer to the installation guide.
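To confirm the install before running the rest of the notebook, a quick version check (assuming the package imports as `autogen`, which it does for `ag2`) is:

import autogen

print(autogen.__version__)  # prints the installed ag2 version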
Set your API Endpoint
The `config_list_from_json` function loads a list of configurations from an environment variable or a JSON file.
import autogen

config_list = autogen.config_list_from_json(
    "OAI_CONFIG_LIST",
    filter_dict={
        "model": ["gpt-4o"],
    },
)

llm_config = {
    "cache_seed": 42,  # change the cache_seed for different trials
    "temperature": 1,
    "config_list": config_list,
    "timeout": 120,
}
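For reference, a hypothetical `OAI_CONFIG_LIST` (stored either as an environment variable or a JSON file) that the `filter_dict` above would match could look like the following; the `api_key` value is a placeholder:

[
    {
        "model": "gpt-4o",
        "api_key": "sk-..."
    }
]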
Demonstration
We're building a customer service workflow for an e-commerce platform. Customers can ask about the status of their orders, but they must be authenticated to do so.
Key aspects of this swarm are:
- System messages are customised, incorporating the context of the workflow
- Handoffs are conditional, only being available when they are relevant
- A nested chat handles the order retrieval and summarisation
from typing import Any, Dict, List

from autogen import (
    AfterWork,
    AfterWorkOption,
    ConversableAgent,
    OnCondition,
    SwarmResult,
    UpdateSystemMessage,
    UserProxyAgent,
    initiate_swarm_chat,
    register_hand_off,
)
Context
workflow_context = {
    # customer details
    "customer_name": None,
    "logged_in_username": None,
    # workflow status
    "logged_in": False,
    "requires_login": True,
    # order enquiry details
    "has_order_id": False,
    "order_id": None,
}
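These keys are consumed in two ways later in the notebook: `UpdateSystemMessage` substitutes them into an agent's system message template before it replies, and the tool functions read and update them at runtime. As a rough illustration of the substitution (the `format` call below only mimics what `UpdateSystemMessage` does with its template):

# Illustration only: UpdateSystemMessage fills a template like this from the current context.
template = "Customer name: {customer_name}\nLogged in: {logged_in}\nEnquiring for Order ID: {order_id}"
print(template.format(**workflow_context))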
Databases
# Databases
USER_DATABASE = {
    "mark": {
        "full_name": "Mark Sze",
    },
    "kevin": {
        "full_name": "Yiran Wu",
    },
}

ORDER_DATABASE = {
    "TR13845": {
        "user": "mark",
        "order_number": "TR13845",
        "status": "shipped",  # order status: order_received, shipped, delivered, return_started, returned
        "return_status": "N/A",  # return status: N/A, return_started, return_shipped, return_delivered, refund_issued
        "product": "mattress",
        "link": "https://www.example.com/TR13845",
        "shipping_address": "123 Main St, State College, PA 12345",
    },
    "TR14234": {
        "user": "kevin",
        "order_number": "TR14234",
        "status": "delivered",
        "return_status": "N/A",
        "product": "pillow",
        "link": "https://www.example.com/TR14234",
        "shipping_address": "123 Main St, State College, PA 12345",
    },
    "TR29384": {
        "user": "mark",
        "order_number": "TR29384",
        "status": "delivered",
        "return_status": "N/A",
        "product": "bed frame",
        "link": "https://www.example.com/TR29384",
        "shipping_address": "123 Main St, State College, PA 12345",
    },
}
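The swarm only touches these dictionaries through the tool functions defined next, but, to make the data model concrete, here is a hypothetical helper (not part of the original workflow) that queries the mock order database for a given user:

# Hypothetical helper (not used by the swarm): list the order numbers belonging to a username.
def list_orders_for_user(username: str) -> list:
    return [order_id for order_id, order in ORDER_DATABASE.items() if order["user"] == username]

# list_orders_for_user("mark")  # -> ['TR13845', 'TR29384']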
Agent’s Functions
# ORDER FUNCTIONS
def check_order_id(order_id: str, context_variables: dict) -> SwarmResult:
    """Check if the order ID is valid"""
    # Restrict order checking to the logged-in user
    if (
        context_variables["logged_in_username"]
        and order_id in ORDER_DATABASE
        and ORDER_DATABASE[order_id]["user"] == context_variables["logged_in_username"]
    ):
        return SwarmResult(
            context_variables=context_variables, values=f"Order ID {order_id} is valid.", agent=order_triage_agent
        )
    else:
        return SwarmResult(
            context_variables=context_variables,
            values=f"Order ID {order_id} is invalid. Please ask for the correct order ID.",
            agent=order_triage_agent,
        )


def record_order_id(order_id: str, context_variables: dict) -> SwarmResult:
    """Record the order ID in the workflow context"""
    if order_id not in ORDER_DATABASE:
        return SwarmResult(
            context_variables=context_variables,
            values=f"Order ID {order_id} not found. Please ask for the correct order ID.",
            agent=order_triage_agent,
        )

    context_variables["order_id"] = order_id
    context_variables["has_order_id"] = True
    return SwarmResult(
        context_variables=context_variables, values=f"Order ID Recorded: {order_id}", agent=order_mgmt_agent
    )


# AUTHENTICATION FUNCTIONS
def login_customer_by_username(username: str, context_variables: dict) -> SwarmResult:
    """Get and log the customer in by their username"""
    if username in USER_DATABASE:
        context_variables["customer_name"] = USER_DATABASE[username]["full_name"]
        context_variables["logged_in_username"] = username
        context_variables["logged_in"] = True
        context_variables["requires_login"] = False
        return SwarmResult(
            context_variables=context_variables,
            values=f"Welcome back our customer, {context_variables['customer_name']}! Please continue helping them.",
            agent=order_triage_agent,
        )
    else:
        return SwarmResult(
            context_variables=context_variables,
            values=f"User {username} not found. Please ask for the correct username.",
            agent=authentication_agent,
        )
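Each of these functions returns a `SwarmResult`, which carries the (possibly updated) context variables, the message reported back into the conversation (`values`), and optionally the agent that should act next. As a sketch of the same pattern, a hypothetical extra tool (not part of this workflow) that updates the context without nominating a next agent might look like this, assuming the `agent` argument is optional:

# Hypothetical extra tool illustrating the SwarmResult pattern (assumes `agent` is optional).
def logout_customer(context_variables: dict) -> SwarmResult:
    """Log the current customer out and reset the login-related context."""
    context_variables["logged_in"] = False
    context_variables["requires_login"] = True
    context_variables["logged_in_username"] = None
    return SwarmResult(context_variables=context_variables, values="The customer has been logged out.")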
Agents
# AGENTS

# Human customer
user = UserProxyAgent(
    name="customer",
    code_execution_config=False,
)

order_triage_prompt = """You are an order triage agent, working with a customer and a group of agents to provide support for your e-commerce platform.
A customer needs to be logged in to be able to access their order. The authentication_agent will work with the customer to verify their identity, so transfer to them to start with.
The order_mgmt_agent will manage all order-related tasks, such as tracking orders, managing orders, etc. Be sure to check the order as one step, then if it's valid you can record it in the context.
Ask the customer for further information when necessary.
The current status of this workflow is:
Customer name: {customer_name}
Logged in: {logged_in}
Enquiring for Order ID: {order_id}
"""
order_triage_agent = ConversableAgent(
    name="order_triage_agent",
    update_agent_state_before_reply=[
        UpdateSystemMessage(order_triage_prompt),
    ],
    functions=[check_order_id, record_order_id],
    llm_config=llm_config,
)

authentication_prompt = "You are an authentication agent that verifies the identity of the customer."

authentication_agent = ConversableAgent(
    name="authentication_agent",
    system_message=authentication_prompt,
    functions=[login_customer_by_username],
    llm_config=llm_config,
)

order_management_prompt = """You are an order management agent that manages inquiries related to e-commerce orders.
The customer must be logged in to access their order.
Use your available tools to get the order details and status. Ask the customer questions as needed.
The current status of this workflow is:
Customer name: {customer_name}
Logged in: {logged_in}
Enquiring for Order ID: {order_id}
"""
order_mgmt_agent = ConversableAgent(
    name="order_mgmt_agent",
    update_agent_state_before_reply=[
        UpdateSystemMessage(order_management_prompt),
    ],
    functions=[check_order_id, record_order_id],
    llm_config=llm_config,
)
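With the agents defined, the tool functions above can be exercised directly as an optional sanity check (an aside, not part of the original notebook). The call below assumes `SwarmResult` exposes its message via the `values` attribute and uses a copy of the context so the real workflow state is untouched:

# Optional sanity check: call a tool function directly on a copy of the context.
_ctx = dict(workflow_context)
_result = login_customer_by_username("mark", _ctx)
print(_result.values)     # e.g. "Welcome back our customer, Mark Sze! ..."
print(_ctx["logged_in"])  # True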
Nested Chats
# NESTED CHAT - Delivery Status
order_retrieval_agent = ConversableAgent(
    name="order_retrieval_agent",
    system_message="You are an order retrieval agent that gets details about an order.",
    llm_config=llm_config,
)

order_summariser_agent = ConversableAgent(
    name="order_summariser_agent",
    system_message="You are an order summariser agent that provides a summary of the order details.",
    llm_config=llm_config,
)


def extract_order_summary(recipient: ConversableAgent, messages, sender: ConversableAgent, config):
    """Extracts the order summary based on the order ID in the context variables"""
    order_id = sender.get_context("order_id")
    if order_id in ORDER_DATABASE:
        order = ORDER_DATABASE[order_id]
        return f"Order {order['order_number']} for {order['product']} is currently {order['status']}. The shipping address is {order['shipping_address']}."
    else:
        return f"Order {order_id} not found."


nested_chat_one = {
    "carryover_config": {"summary_method": "last_msg"},
    "recipient": order_retrieval_agent,
    "message": extract_order_summary,  # alternatively a string, e.g. "Retrieve the status details of the order using the order id"
    "max_turns": 1,
}

nested_chat_two = {
    "recipient": order_summariser_agent,
    "message": "Summarise the order details provided in a tabulated, text-based, order sheet format",
    "max_turns": 1,
    "summary_method": "last_msg",
}

chat_queue = [nested_chat_one, nested_chat_two]
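Because `extract_order_summary` only needs a sender with a `get_context` method, it can be checked outside the swarm with a tiny stand-in object (illustrative only, so no real agent's context is touched):

# Optional check of the message callable with a minimal stand-in sender.
class _StubSender:
    def get_context(self, key):
        return {"order_id": "TR13845"}.get(key)

print(extract_order_summary(order_retrieval_agent, [], _StubSender(), None))
# -> "Order TR13845 for mattress is currently shipped. The shipping address is 123 Main St, ..."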
Handoffs (OnCondition and AfterWork)
# HANDOFFS
register_hand_off(
    agent=order_triage_agent,
    hand_to=[
        OnCondition(
            target=authentication_agent,
            condition="The customer is not logged in, authenticate the customer.",
            available="requires_login",
        ),
        OnCondition(
            target=order_mgmt_agent,
            condition="The customer is logged in, continue with the order triage.",
            available="logged_in",
        ),
        AfterWork(AfterWorkOption.REVERT_TO_USER),
    ],
)

register_hand_off(
    agent=authentication_agent,
    hand_to=[
        OnCondition(
            target=order_triage_agent,
            condition="The customer is logged in, continue with the order triage.",
            available="logged_in",
        ),
        AfterWork(AfterWorkOption.REVERT_TO_USER),
    ],
)


def has_order_in_context(agent: ConversableAgent, messages: List[Dict[str, Any]]) -> bool:
    return agent.get_context("has_order_id")


register_hand_off(
    agent=order_mgmt_agent,
    hand_to=[
        OnCondition(
            target={
                "chat_queue": chat_queue,
            },
            condition="Retrieve the status of the order",
            available=has_order_in_context,
        ),
        OnCondition(
            target=authentication_agent,
            condition="The customer is not logged in, authenticate the customer.",
            available="requires_login",
        ),
        OnCondition(target=order_triage_agent, condition="The customer has no more enquiries about this order."),
        AfterWork(AfterWorkOption.REVERT_TO_USER),
    ],
)
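The string form of `available` gates a handoff on a boolean context variable, while the callable form (as with `has_order_in_context`) allows richer checks. For comparison, a hypothetical callable equivalent of the `requires_login` gate could look like this:

# Hypothetical callable equivalent of available="requires_login" (illustrative only).
def needs_authentication(agent: ConversableAgent, messages: List[Dict[str, Any]]) -> bool:
    return agent.get_context("requires_login")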
Let’s go!
chat_history = initiate_swarm_chat(
    initial_agent=order_triage_agent,
    agents=[order_triage_agent, authentication_agent, order_mgmt_agent],
    context_variables=workflow_context,
    messages="Can you help me with my order.",
    user_agent=user,
    max_rounds=40,
    after_work=AfterWorkOption.TERMINATE,
)
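Depending on your AG2 version, `initiate_swarm_chat` may return a tuple of the chat result, the final context variables, and the last active agent; if so, the outcome can be inspected roughly as follows (the unpacking below is an assumption, adjust to your version):

# Assumption: initiate_swarm_chat returns (chat_result, final_context, last_agent).
chat_result, final_context, last_agent = chat_history
print(final_context["logged_in"], final_context["order_id"])  # final workflow state
print(last_agent.name)  # agent that was active when the swarm ended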