autogen.SwarmAgent
SwarmAgent
Swarm agent for participating in a swarm.
SwarmAgent is a subclass of ConversableAgent.
Additional args:
- functions (List[Callable]): A list of functions to register with the agent.
- update_agent_state_before_reply (List[Callable]): A list of functions, including UPDATE_SYSTEM_MESSAGEs, called to update the agent before it replies.

A construction sketch follows the parameter table below.
Parameters:

Name | Description |
---|---|
name | Name of the agent. Type: str |
system_message | System message for the ChatCompletion inference. Type: str \| None Default: "You are a helpful AI Assistant." |
llm_config | LLM inference configuration. Please refer to OpenAIWrapper.create for available options. When using OpenAI or Azure OpenAI endpoints, specify a non-empty "model" either in llm_config or in each config of "config_list" in llm_config. To disable LLM-based auto reply, set to False. When set to None, self.DEFAULT_CONFIG is used, which defaults to False. Type: dict \| Literal[False] \| None Default: None |
functions | A list of functions to register with the agent. Type: list[typing.Callable] \| Callable Default: None |
is_termination_msg | A function that takes a message in the form of a dictionary and returns a boolean value indicating if this received message is a termination message. The dict can contain the following keys: "content", "role", "name", "function_call". Type: Callable[[dict], bool] \| None Default: None |
max_consecutive_auto_reply | The maximum number of consecutive auto replies. Defaults to None (no limit provided; the class attribute MAX_CONSECUTIVE_AUTO_REPLY will be used as the limit in this case). When set to 0, no auto reply will be generated. Type: int \| None Default: None |
human_input_mode | Whether to ask for human input every time a message is received. Possible values are "ALWAYS", "TERMINATE", "NEVER". (1) When "ALWAYS", the agent prompts for human input every time a message is received. Under this mode, the conversation stops when the human input is "exit", or when is_termination_msg is True and there is no human input. (2) When "TERMINATE", the agent only prompts for human input when a termination message is received or the number of auto replies reaches max_consecutive_auto_reply. (3) When "NEVER", the agent never prompts for human input. Under this mode, the conversation stops when the number of auto replies reaches max_consecutive_auto_reply or when is_termination_msg is True. Type: Literal['ALWAYS', 'NEVER', 'TERMINATE'] Default: 'NEVER' |
description | A short description of the agent. This description is used by other agents (e.g. the GroupChatManager) to decide when to call upon this agent. (Default: system_message) Type: str \| None Default: None |
code_execution_config | Default: False |
update_agent_state_before_reply | A list of functions, including UPDATE_SYSTEM_MESSAGEs, called to update the agent's state before it replies. Type: list[Callable \| autogen.UPDATE_SYSTEM_MESSAGE] \| Callable \| autogen.UPDATE_SYSTEM_MESSAGE \| None Default: None |
**kwargs | |
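
A minimal construction sketch, assuming an OpenAI-style llm_config (placeholder values) and a hypothetical record_order tool; SwarmResult is used here only to illustrate how a registered swarm function might return updated context variables, not as a definitive recipe:

```python
from autogen import SwarmAgent, SwarmResult

# Hypothetical tool function. The `context_variables` parameter is assumed to be
# removed from the LLM-facing tool schema and injected back by the swarm at call time.
def record_order(item: str, context_variables: dict) -> SwarmResult:
    """Record the item the customer wants to order."""
    context_variables["last_item"] = item
    return SwarmResult(values=f"Recorded order for {item}.", context_variables=context_variables)

llm_config = {"config_list": [{"model": "gpt-4o", "api_key": "YOUR_API_KEY"}]}  # placeholder

order_agent = SwarmAgent(
    name="order_agent",
    system_message="You take customer orders.",
    llm_config=llm_config,
    functions=[record_order],
)
```
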
Static Methods
process_nested_chat_carryover
Process carryover messages for a nested chat (typically for the first chat of a swarm).
The carryover_config key is a dictionary containing:
- "summary_method": The method used to summarise the messages; can be "all", "last_msg", "reflection_with_llm", or a Callable.
- "summary_args": Optional arguments for the summary method.

Supported carryover summary_method values are:
- "all" - all messages will be incorporated.
- "last_msg" - the last message will be incorporated.
- "reflection_with_llm" - an LLM will summarise all the messages and the summary will be incorporated as a single message.
- Callable - a callable with the signature: my_method(agent: ConversableAgent, messages: List[Dict[str, Any]]) -> str

See the sketch after the parameter table below for an example carryover configuration.
Parameters:

Name | Description |
---|---|
chat | The chat dictionary containing the carryover configuration. Type: dict[str, typing.Any] |
recipient | The recipient agent. Type: autogen.ConversableAgent |
messages | The messages from the parent chat. Type: list[dict[str, typing.Any]] |
sender | The sender agent. Type: autogen.ConversableAgent |
config | Type: Any |
trim_n_messages | The number of latest messages to trim from the messages list. Type: int Default: 0 |
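
A hedged sketch of the kind of chat dictionary this method processes, assuming a hypothetical writer_agent and the usual nested-chat queue keys (recipient, message, max_turns); only carryover_config is documented above, the other keys are assumptions here:

```python
from autogen import SwarmAgent

llm_config = {"config_list": [{"model": "gpt-4o", "api_key": "YOUR_API_KEY"}]}  # placeholder

# Hypothetical nested-chat target agent.
writer_agent = SwarmAgent(
    name="writer_agent",
    system_message="You draft customer-facing replies.",
    llm_config=llm_config,
)

# "carryover_config" controls how the parent swarm's messages are summarised
# into this nested chat's first message before it starts.
nested_chat = {
    "recipient": writer_agent,
    "message": "Draft a reply based on the conversation so far.",
    "max_turns": 1,
    "carryover_config": {
        "summary_method": "reflection_with_llm",
        "summary_args": {"summary_prompt": "Summarise the customer's request."},
    },
}
chat_queue = [nested_chat]  # e.g. used as a nested-chat queue in a swarm handoff
```
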
Instance Methods
add_functions
Add a list of functions to the agent.
Parameters:

Name | Description |
---|---|
func_list | The functions to add. Type: list[typing.Callable] |
add_single_function
Add a single function to the agent, removing context variables for LLM use.
Parameters:

Name | Description |
---|---|
func | Type: Callable |
name | Default: None |
description | Default: '' |
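
A brief sketch, reusing the hypothetical order_agent from the construction example above; check_stock is likewise an illustrative assumption:

```python
# Hypothetical function added after construction. If it declared a
# `context_variables` parameter, that parameter would not be exposed to the LLM.
def check_stock(sku: str) -> str:
    """Check stock for a given SKU."""
    return f"SKU {sku}: in stock"

order_agent.add_single_function(check_stock, description="Look up stock levels for a SKU")
```
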
generate_swarm_tool_reply
Pre-processes and generates tool call replies.
This function:
- Adds context_variables back to the tool call for the function, if necessary.
- Generates the tool calls reply.
- Updates context_variables and next_agent based on the tool call response.
Parameters:

Name | Description |
---|---|
messages | Type: list[dict] \| None Default: None |
sender | Type: autogen.Agent \| None Default: None |
config | Type: autogen.OpenAIWrapper \| None Default: None |
register_hand_off
Register a function to hand off to another agent.
Parameters:

Name | Description |
---|---|
hand_to | A list of ON_CONDITIONs and an optional AFTER_WORK condition. Type: list[autogen.ON_CONDITION \| autogen.AFTER_WORK] \| autogen.ON_CONDITION \| autogen.AFTER_WORK |
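
A sketch of registering handoffs, assuming the hypothetical order_agent from the construction example above and a billing_agent defined the same way; the condition string is illustrative only:

```python
from autogen import SwarmAgent, ON_CONDITION, AFTER_WORK, AfterWorkOption

llm_config = {"config_list": [{"model": "gpt-4o", "api_key": "YOUR_API_KEY"}]}  # placeholder

billing_agent = SwarmAgent(
    name="billing_agent",
    system_message="You handle invoices and payments.",
    llm_config=llm_config,
)

# Hand off to billing_agent when the condition applies; otherwise, after this
# agent finishes its work, return control to the user.
order_agent.register_hand_off(
    hand_to=[
        ON_CONDITION(target=billing_agent, condition="The customer asks about an invoice, refund, or payment."),
        AFTER_WORK(AfterWorkOption.REVERT_TO_USER),
    ]
)
```
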
register_update_agent_state_before_reply
Register functions that will be called when the agent is selected and before it speaks. You can add your own validation or precondition functions here.
Parameters:

Name | Description |
---|---|
functions | A list of functions to be registered. Each function is called when the agent is selected and before it speaks. Type: list[typing.Callable] \| Callable \| None |
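
A sketch of both commonly used forms, reusing the hypothetical order_agent from above; the template form presumes the placeholder is filled from the swarm's context variables (a made-up customer_name key), and passing UPDATE_SYSTEM_MESSAGE instances here is assumed to mirror the constructor's update_agent_state_before_reply argument:

```python
from autogen import UPDATE_SYSTEM_MESSAGE

# Template form: the placeholder is assumed to be substituted from context_variables
# (hypothetical "customer_name" key).
update_from_context = UPDATE_SYSTEM_MESSAGE("You are assisting {customer_name} with their order.")

# Callable form: receives the agent and the current messages and returns the new system message.
def focus_on_latest(agent, messages):
    last = (messages[-1].get("content") or "") if messages else ""
    return f"You take customer orders. Focus on the latest request: {last[:200]}"

order_agent.register_update_agent_state_before_reply([update_from_context, focus_on_latest])
```
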