agentchat.contrib.agent_builder
AgentBuilder
AgentBuilder can help users build an automatic task-solving process powered by a multi-agent system. Specifically, the building pipeline consists of two steps: initialize and build.
__init__
(These APIs are experimental and may change in the future.)
Arguments:
config_file_or_env
- path to, or environment variable containing, the OpenAI API configs.
builder_model
- the model used as the backbone of the build manager.
agent_model
- the model used as the backbone of the participant agents.
endpoint_building_timeout
- timeout for building up an endpoint server.
max_agents
- the maximum number of agents for each task.
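A minimal construction sketch, assuming the class is imported from the autogen package and that "OAI_CONFIG_LIST" is the name of your config file or environment variable (both are assumptions, not part of this reference):

```python
from autogen.agentchat.contrib.agent_builder import AgentBuilder

# "OAI_CONFIG_LIST" is a placeholder for your own config file or env variable.
builder = AgentBuilder(
    config_file_or_env="OAI_CONFIG_LIST",
    builder_model="gpt-4",   # backbone of the build manager
    agent_model="gpt-4",     # backbone of the participant agents
    max_agents=5,            # cap on the number of agents per task
)
```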
clear_agent
Clear a specific agent by name.
Arguments:
agent_name
- the name of the agent to clear.
recycle_endpoint
- whether to recycle the endpoint server. If true, the endpoint is recycled when no remaining agent depends on it.
clear_all_agents
Clear all cached agents.
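For illustration, assuming the builder instance from the sketch above and a previously built agent named "Data_scientist" (a hypothetical name):

```python
# Clear one agent by name and recycle its endpoint if no other agent needs it.
builder.clear_agent("Data_scientist", recycle_endpoint=True)

# Or clear every cached agent at once.
builder.clear_all_agents()
```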
build
Automatically build agents based on the building task.
Arguments:
building_task
- instruction that helps the build manager (gpt-4) decide which agents should be built.
coding
- whether to add a user proxy (a code interpreter).
code_execution_config
- specific configs for the user proxy (e.g., last_n_messages, work_dir, …).
default_llm_config
- specific configs for the LLMs (e.g., config_list, seed, temperature, …).
use_oai_assistant
- use the OpenAI Assistant API instead of a self-constructed agent.
user_proxy
- a user proxy class that can be used to replace the default user proxy.
Returns:
agent_list
- a list of agents.
cached_configs
- cached configs.
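A sketch of a build call using only the parameters documented above; the task text and LLM config values are placeholders:

```python
building_task = "Find a recent paper about GPT-4 on arXiv and discuss its potential applications."
default_llm_config = {"temperature": 0}

agent_list, cached_configs = builder.build(
    building_task=building_task,
    default_llm_config=default_llm_config,
    coding=True,  # add a user proxy that can execute code
)
```

The returned agent_list can then be placed into a group chat (or any other orchestration of your choice) to solve the task.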
build_from_library
Build agents from a library. The library is a list of agent configs, each containing a name and a system_message. A build manager decides which agents from the library should be involved in the task.
Arguments:
building_task
- instruction that helps the build manager (gpt-4) decide which agents should be built.
library_path_or_json
- path to, or JSON string of, the agent library config.
default_llm_config
- specific configs for the LLMs (e.g., config_list, seed, temperature, …).
coding
- whether to add a user proxy (a code interpreter).
code_execution_config
- specific configs for the user proxy (e.g., last_n_messages, work_dir, …).
use_oai_assistant
- use the OpenAI Assistant API instead of a self-constructed agent.
embedding_model
- a Sentence-Transformers model used to compute embedding similarity when selecting agents from the library. For reference, chromadb uses "all-mpnet-base-v2" by default.
user_proxy
- a user proxy class that can be used to replace the default user proxy.
Returns:
agent_list
- a list of agents.
cached_configs
- cached configs.
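A sketch of building from a library, reusing building_task and default_llm_config from the example above; "agent_library.json" is a placeholder file name:

```python
agent_list, cached_configs = builder.build_from_library(
    building_task=building_task,
    library_path_or_json="agent_library.json",  # placeholder path to your agent library
    default_llm_config=default_llm_config,
    embedding_model="all-mpnet-base-v2",  # Sentence-Transformers model for agent selection
)
```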
save
Save building configs. If no filepath is specified, this function creates a filename by hashing the building_task string with MD5, prefixes it with "save_config_", and saves the config to the local path.
Arguments:
filepath
- save path.
Returns:
filepath
- the path the config was saved to.
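For example (the file name is a placeholder; omit it to let the method derive one from the building_task hash):

```python
saved_path = builder.save("./save_config_example.json")
```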
load
Load building configs and call the build function to complete the build without calling an online LLM API.
Arguments:
filepath
- path of the saved config file.
config_json
- JSON string of the saved config.
use_oai_assistant
- use the OpenAI Assistant API instead of a self-constructed agent.
Returns:
agent_list
- a list of agents.
cached_configs
- cached configs.
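A sketch of restoring agents from a saved config without calling an online LLM API; saved_path comes from the save example above:

```python
new_builder = AgentBuilder(config_file_or_env="OAI_CONFIG_LIST")
agent_list, cached_configs = new_builder.load(filepath=saved_path)
```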