# Conversable Agent

The `ConversableAgent` is at the heart of all AG2 agents and functions as a fully operational agent on its own. Let's converse with `ConversableAgent` in just five simple steps.
```python
# 1. Import our agent class
from autogen import ConversableAgent, LLMConfig

# 2. Define our LLM configuration for OpenAI's GPT-4o mini
#    Put your key in the OPENAI_API_KEY environment variable
llm_config = LLMConfig(api_type="openai", model="gpt-4o-mini")

# 3. Create our agent
with llm_config:
    my_agent = ConversableAgent(
        name="helpful_agent",
        system_message="You are a poetic AI assistant, respond in rhyme.",
    )

# 4. Chat directly with our agent
response = my_agent.run(
    message="In one sentence, what's the big deal about AI?",
    max_turns=2,
    user_input=True,
)

# 5. Iterate through the chat automatically with console output
response.process()
```
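As an aside, the `with llm_config:` block attaches the configuration to any agent created inside it. A minimal alternative sketch, assuming `ConversableAgent` also accepts an `llm_config` keyword argument (not shown above), passes the configuration explicitly:

```python
# Sketch only: pass the LLM configuration explicitly instead of using the
# `with llm_config:` context manager (assumes an `llm_config` keyword argument).
from autogen import ConversableAgent, LLMConfig

llm_config = LLMConfig(api_type="openai", model="gpt-4o-mini")

my_agent = ConversableAgent(
    name="helpful_agent",
    system_message="You are a poetic AI assistant, respond in rhyme.",
    llm_config=llm_config,
)
```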
Let's break the example down:

- Import `ConversableAgent`; you'll find the most popular classes available directly from `autogen`.
- Create our LLM configuration to define the LLM that our agent will use.
- Create our `ConversableAgent`, give them a unique name, and use `system_message` to define their purpose.
- Ask the agent a question using their `run` method, passing in our starting message, then call `process` on the response to run the chat and print output like the following (a non-interactive variation is sketched after the output):

```
user (to helpful_agent):

In one sentence, what's the big deal about AI?

--------------------------------------------------------------------------------

>>>>>>>> USING AUTO REPLY...
helpful_agent (to user):

AI transforms our world with endless potential, enhancing lives and knowledge, truly essential.

--------------------------------------------------------------------------------

Replying as user. Provide feedback to helpful_agent. Press enter to skip and use auto-reply, or type 'exit' to end the conversation:
```
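With `user_input=True`, the run pauses at the prompt above and waits for your feedback. A minimal non-interactive variation, using only the parameters already shown, skips the human turn entirely:

```python
# Non-interactive sketch: no human feedback is requested; the agent replies
# once and the chat is printed to the console by process().
response = my_agent.run(
    message="In one sentence, what's the big deal about AI?",
    max_turns=1,
    user_input=False,
)
response.process()
```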
Tip: Looking for real-world usage of `ConversableAgent`? Check out these notebook examples: