AgentBuilder

AgentBuilder(
    config_file_or_env: str | None = 'OAI_CONFIG_LIST',
    config_file_location: str | None = '',
    builder_model: str | list | None = [],
    agent_model: str | list | None = [],
    builder_model_tags: list | None = [],
    agent_model_tags: list | None = [],
    max_agents: int | None = 5
)

AgentBuilder helps users build an automatic task-solving process powered by a multi-agent system.
Specifically, the building pipeline consists of two steps: initialize and build.
(These APIs are experimental and may change in the future.)

Parameters:

config_file_or_env (str | None, default: 'OAI_CONFIG_LIST')
config_file_location (str | None, default: '')
builder_model (str | list | None, default: [])
agent_model (str | list | None, default: [])
builder_model_tags (list | None, default: [])
agent_model_tags (list | None, default: [])
max_agents (int | None, default: 5)
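
A minimal construction sketch. The import path, config file, and model names below are assumptions/placeholders, not guaranteed by this reference:

from autogen.agentchat.contrib.agent_builder import AgentBuilder  # assumed import path

# Placeholder values: point config_file_or_env at your own LLM config list.
builder = AgentBuilder(
    config_file_or_env="OAI_CONFIG_LIST",
    builder_model="gpt-4",   # model used by the build manager
    agent_model="gpt-4",     # model used by the generated agents
    max_agents=5,
)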

Class Attributes

AGENT_DESCRIPTION_PROMPT
AGENT_NAME_PROMPT
AGENT_SEARCHING_PROMPT
AGENT_SELECTION_PROMPT
AGENT_SYS_MSG_PROMPT
CODING_AND_TASK_SKILL_INSTRUCTION
CODING_PROMPT
DEFAULT_DESCRIPTION
DEFAULT_PROXY_AUTO_REPLY
GROUP_CHAT_DESCRIPTION
online_server_name

Instance Methods

build

build(
    self,
    building_task: str,
    default_llm_config: dict[str, Any],
    coding: bool | None = None,
    code_execution_config: dict[str, Any] | None = None,
    use_oai_assistant: bool | None = False,
    user_proxy: ConversableAgent | None = None,
    max_agents: int | None = None,
    **kwargs: Any
) -> tuple[list[ConversableAgent], dict[str, Any]]

Automatically build agents based on the building task.

Parameters:

building_task (str): instruction that helps the build manager (gpt-4) decide which agents should be built.
default_llm_config (dict[str, Any]): specific configs for the LLM (e.g., config_list, seed, temperature, ...).
coding (bool | None, default: None): whether a user proxy (a code interpreter) should be added.
code_execution_config (dict[str, Any] | None, default: None): specific configs for the user proxy (e.g., last_n_messages, work_dir, ...).
use_oai_assistant (bool | None, default: False): use the OpenAI Assistant API instead of a self-constructed agent.
user_proxy (ConversableAgent | None, default: None): a user proxy instance used to replace the default user proxy.
max_agents (int | None, default: None)
**kwargs (Any)

Returns:

tuple[list[ConversableAgent], dict[str, Any]]: agent_list (a list of agents) and cached_configs (the cached configs).
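
A usage sketch; the task text and llm config values are illustrative assumptions:

# Placeholder task and LLM config.
building_task = "Find recent papers about large language models and summarize them."
default_llm_config = {"temperature": 0.0}

agent_list, cached_configs = builder.build(
    building_task=building_task,
    default_llm_config=default_llm_config,
    coding=True,  # also add a user proxy that can execute code
)

The returned agent_list can then be handed to whatever orchestration you use (e.g., a group chat).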

build_from_library

build_from_library(
    self,
    building_task: str,
    library_path_or_json: str,
    default_llm_config: dict[str, Any],
    top_k: int = 3,
    coding: bool | None = None,
    code_execution_config: dict[str, Any] | None = None,
    use_oai_assistant: bool | None = False,
    embedding_model: str | None = 'all-mpnet-base-v2',
    user_proxy: ConversableAgent | None = None,
    **kwargs: Any
) -> tuple[list[ConversableAgent], dict[str, Any]]

Build agents from a library.
The library is a list of agent configs, each containing a name and a system_message.
A build manager decides which agents in the library should be involved in the task.

Parameters:

building_task (str): instruction that helps the build manager (gpt-4) decide which agents should be built.
library_path_or_json (str): path to, or JSON string of, the agent library config.
default_llm_config (dict[str, Any]): specific configs for the LLM (e.g., config_list, seed, temperature, ...).
top_k (int, default: 3)
coding (bool | None, default: None): whether a user proxy (a code interpreter) should be added.
code_execution_config (dict[str, Any] | None, default: None): specific configs for the user proxy (e.g., last_n_messages, work_dir, ...).
use_oai_assistant (bool | None, default: False): use the OpenAI Assistant API instead of a self-constructed agent.
embedding_model (str | None, default: 'all-mpnet-base-v2'): a Sentence-Transformers model used for embedding similarity when selecting agents from the library. For reference, chromadb uses "all-mpnet-base-v2" as its default.
user_proxy (ConversableAgent | None, default: None): a user proxy instance used to replace the default user proxy.
**kwargs (Any)

Returns:

tuple[list[ConversableAgent], dict[str, Any]]: agent_list (a list of agents) and cached_configs (the cached configs).
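
A sketch of building from a library; the library file name and its contents are hypothetical, assumed to be a JSON list of configs with name and system_message fields:

# "agent_library.json" is a placeholder path to a JSON list of agent configs.
agent_list, cached_configs = builder.build_from_library(
    building_task="Analyze a CSV of sales data and report the main trends.",
    library_path_or_json="./agent_library.json",
    default_llm_config={"temperature": 0.0},
    top_k=3,  # number of candidate agents selected by embedding similarity
)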

clear_agent

clear_agent(
    self,
    agent_name: str,
    recycle_endpoint: bool | None = True
)

Clear a specific agent by name.

Parameters:

agent_name (str): the name of the agent.
recycle_endpoint (bool | None, default: True): whether to recycle the endpoint server. If True, the endpoint is recycled when no agent depends on it.

clear_all_agents

clear_all_agents(self, recycle_endpoint: bool | None = True)

Clear all cached agents.

Parameters:

recycle_endpoint (bool | None, default: True)
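
A small cleanup sketch, using an agent from a previous build (the agent name shown is a placeholder):

builder.clear_agent("Data_Analyst_Expert", recycle_endpoint=True)  # placeholder agent name
builder.clear_all_agents()  # remove every cached agent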

load

load(
    self,
    filepath: str | None = None,
    config_json: str | None = None,
    use_oai_assistant: bool | None = False,
    **kwargs: Any
) -> tuple[list[ConversableAgent], dict[str, Any]]

Load building configs and call the build function to complete building without calling the online LLM APIs.

Parameters:

filepath (str | None, default: None): filepath or JSON string of the saved config.
config_json (str | None, default: None): JSON string of the saved config.
use_oai_assistant (bool | None, default: False): use the OpenAI Assistant API instead of a self-constructed agent.
**kwargs (Any)

Returns:

tuple[list[ConversableAgent], dict[str, Any]]: agent_list (a list of agents) and cached_configs (the cached configs).
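
A sketch of restoring a previous build from disk; the filename is a placeholder and would normally come from an earlier save call:

# Rebuild the same agents without a new build-manager LLM call.
agent_list, cached_configs = builder.load("./saved_build_config.json")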

save

save(self, filepath: str | None = None) -> str

Save building configs. If filepath is not specified, this function creates a filename by hashing the building_task string with MD5, prefixed with "save_config_", and saves the config to that local path.

Parameters:

filepath (str | None, default: None): save path.
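
A save sketch; the path is a placeholder, and per the -> str signature the call returns a string, which is assumed here to be the path that was written:

saved_path = builder.save("./saved_build_config.json")  # returned str assumed to be the save path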

set_agent_model

set_agent_model(self, model: str)

Parameters:

model (str)

set_builder_model

set_builder_model(self, model: str)

Parameters:

model (str)
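
A sketch of switching models after construction (the model names are placeholders):

builder.set_builder_model("gpt-4o")     # model used by the build manager
builder.set_agent_model("gpt-4o-mini")  # model used by newly built agents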