LLMConfig
autogen.LLMConfig
```python
LLMConfig(*configs, top_p=None, temperature=None, max_tokens=None, check_every_ms=None, allow_format_str_template=None, response_format=None, timeout=None, seed=None, cache_seed=None, parallel_tool_calls=None, tools=(), functions=(), routing_method=None)
```
Initializes the LLMConfig object.
| PARAMETER | DESCRIPTION |
|---|---|
| `*configs` | A list of LLM configuration entries or dictionaries. |
| `temperature` | The sampling temperature for LLM generation. |
| `check_every_ms` | The interval (in milliseconds) to check for updates. |
| `allow_format_str_template` | Whether to allow format string templates. |
| `response_format` | The format of the response (e.g., JSON, text). |
| `timeout` | The timeout for LLM requests in seconds. |
| `seed` | The random seed for reproducible results. |
| `cache_seed` | The seed for caching LLM responses. |
| `parallel_tool_calls` | Whether to enable parallel tool calls. |
| `tools` | A list of tools available for the LLM. |
| `functions` | A list of functions available for the LLM. |
| `max_tokens` | The maximum number of tokens to generate. |
| `top_p` | The nucleus sampling probability. |
| `routing_method` | The method used to route requests (e.g., `fixed_order`, `round_robin`). |
Examples:
```python
import os

# Example 1: create a config from one model dictionary
config = LLMConfig({
    "model": "gpt-5-mini",
    "api_key": os.environ["OPENAI_API_KEY"],
})

# Example 2: create a config from a list of dictionaries
config = LLMConfig(
    {
        "model": "gpt-5-mini",
        "api_key": os.environ["OPENAI_API_KEY"],
    },
    {
        "model": "gpt-4",
        "api_key": os.environ["OPENAI_API_KEY"],
    },
)
```
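The multi-entry form pairs with the `routing_method` parameter. The sketch below illustrates what `fixed_order` (always try entries in declared order) and `round_robin` (rotate across calls) selection could look like; `make_router` is a hypothetical helper using plain dicts, not part of autogen:

```python
from itertools import cycle


def make_router(configs, routing_method="fixed_order"):
    """Return a callable yielding the config entries to try on each request.

    Illustrative sketch only, not AG2's internal implementation:
    - fixed_order: every call returns the entries in declared order,
      so later entries act as fallbacks for the first.
    - round_robin: each call returns the next entry in rotation.
    """
    if routing_method == "fixed_order":
        return lambda: list(configs)
    if routing_method == "round_robin":
        it = cycle(configs)
        return lambda: [next(it)]
    raise ValueError(f"unknown routing_method: {routing_method}")


configs = [{"model": "gpt-5-mini"}, {"model": "gpt-4"}]
route = make_router(configs, "round_robin")
first, second = route()[0], route()[0]  # alternates between the two models
```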
Source code in autogen/llm_config/config.py
ensure_config classmethod

Transforms the passed object into an `LLMConfig` object. This is the method used for `Agent(llm_config={...})` cases.
```python
LLMConfig.ensure_config(LLMConfig(...))              # -> LLMConfig(...)
LLMConfig.ensure_config(LLMConfigEntry(...))         # -> LLMConfig(LLMConfigEntry(...))
LLMConfig.ensure_config({"model": "gpt-o3"})         # -> LLMConfig(OpenAILLMConfigEntry(model="o3"))
LLMConfig.ensure_config([{"model": "gpt-o3"}, ...])  # -> LLMConfig(OpenAILLMConfigEntry(model="o3"), ...)

# Deprecated:
LLMConfig.ensure_config({"config_list": [{"model": "gpt-o3"}, ...]})  # -> LLMConfig(OpenAILLMConfigEntry(model="o3"), ...)
```
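The dispatch rules above can be sketched in plain Python. The `LLMConfig` class below is a minimal stand-in for `autogen.LLMConfig`, and plain dicts stand in for the real `LLMConfigEntry` / `OpenAILLMConfigEntry` classes; this is an illustration of the normalization, not AG2's actual implementation:

```python
class LLMConfig:
    """Minimal stand-in for autogen.LLMConfig (illustration only)."""

    def __init__(self, *configs):
        self.config_list = list(configs)


def ensure_config(value):
    """Normalize the inputs listed above into an LLMConfig."""
    if isinstance(value, LLMConfig):
        return value                                 # already normalized
    if isinstance(value, dict):
        if "config_list" in value:                   # deprecated wrapper form
            return LLMConfig(*value["config_list"])
        return LLMConfig(value)                      # single entry dict
    if isinstance(value, list):
        return LLMConfig(*value)                     # list of entry dicts
    raise TypeError(f"cannot build LLMConfig from {type(value).__name__}")
```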