Your AG2 agents are likely to need an LLM and you can configure one, or more, for each agent.

AG2’s agents can use LLMs through OpenAI, Anthropic, Google, Amazon, Mistral AI, Cerebras, Together AI, and Groq. Locally hosted models can also be used through Ollama, LiteLLM, and LM Studio.

From version 0.8: The OpenAI package, openai, is not installed by default.

Install AG2 with your preferred model provider(s), for example:

  • pip install ag2[openai]
  • pip install ag2[gemini]
  • pip install ag2[anthropic,cohere,mistral]
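
To confirm the packages are available after installation, a quick import check like the following can help (a minimal sketch; it assumes you installed the openai extra):

import autogen
import openai  # present only if the [openai] extra (or the openai package) is installed

print(autogen.__version__)
print(openai.__version__)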

First, create the LLM configuration object with the API type, model, and, if required, the API key. There are two ways to create an LLM configuration object:

  1. By passing the config_list parameter with a list of dictionaries, each containing the API type, model, and API key.
import os
from autogen import LLMConfig

llm_config = LLMConfig(
  config_list = [
    {
      "api_type": "openai",
      "model": "gpt-4o-mini",
      "api_key": os.environ["OPENAI_API_KEY"]
    },
    {
      "api_type": "openai",
      "model": "gpt-4o",
      "api_key": os.environ["OPENAI_API_KEY"]
    }
  ],
)
  2. By passing the api_type, model, and api_key as parameters.
import os
from autogen import LLMConfig

llm_config = LLMConfig(
  api_type="openai",
  model="gpt-4o-mini",
  api_key=os.environ["OPENAI_API_KEY"],
)

Never hard-code secrets such as API keys into your code; instead, read them from environment variables, as the examples above do with the OpenAI API key.
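
If you prefer to fail fast with a clear message when the variable is missing, you can check for it before building the configuration. A minimal sketch using only the standard library:

import os

api_key = os.environ.get("OPENAI_API_KEY")
if api_key is None:
    raise RuntimeError("Set the OPENAI_API_KEY environment variable before running this example.")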

Then, when you create your agents, you can pass the LLM configuration object to them in two ways:

  1. By passing the llm_config parameter to the agent as a keyword argument.
from autogen import ConversableAgent

my_agent = ConversableAgent(
    name="helpful_agent",
    system_message="You are a poetic AI assistant",
    llm_config=llm_config,
)
  2. By using the with statement to create a context manager for the LLM configuration.
with llm_config:
    my_agent = ConversableAgent(
        name="helpful_agent",
        system_message="You are a poetic AI assistant",
    )

Either approach sets the LLM configuration for the agent.
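
A quick way to check that the configuration works end to end is to ask the agent for a reply directly. A minimal sketch, assuming OPENAI_API_KEY is set and using the generate_reply method with an illustrative prompt:

reply = my_agent.generate_reply(
    messages=[{"role": "user", "content": "Write a short poem about autumn."}]
)
print(reply)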

The default LLM provider is OpenAI, but if you would like to use a different provider, see the available providers.

AG2’s LLM configuration also lets you specify multiple LLMs for fallback support and filter which of them an agent can use; see the LLM Configuration deep-dive.
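
As one hedged illustration of filtering, the sketch below uses the filter_config helper and made-up "tags" values to select the cheaper model for an agent; adapt the tags and models to your own setup:

import os
from autogen import LLMConfig, filter_config

config_list = [
    {
        "api_type": "openai",
        "model": "gpt-4o-mini",
        "api_key": os.environ["OPENAI_API_KEY"],
        "tags": ["cheap"],
    },
    {
        "api_type": "openai",
        "model": "gpt-4o",
        "api_key": os.environ["OPENAI_API_KEY"],
        "tags": ["premium"],
    },
]

# Keep only the entries tagged "cheap" for this agent
cheap_configs = filter_config(config_list, filter_dict={"tags": ["cheap"]})
llm_config = LLMConfig(config_list=cheap_configs)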

Environment variables

The examples in these guides include an LLM configuration for OpenAI’s GPT-4o mini model and need the OPENAI_API_KEY environment variable set to your OpenAI API key.

Set it in your terminal/command prompt:

export OPENAI_API_KEY="YOUR_API_KEY"