V1/V2 Client Compatibility Demonstration#

This notebook demonstrates that AG2’s V2 client architecture (ModelClientV2) is fully compatible with V1 clients (ModelClient). Multiple client versions can work together seamlessly in the same group chat by specifying different llm_config settings.

Key Concept: Client Version Determined by llm_config#

The client version is controlled by the api_type field in llm_config:

  • V2 Client: "api_type": "openai_v2" - Returns rich UnifiedResponse with typed content blocks
  • V1 Clients: "api_type": "google" or default OpenAI config - Uses legacy response format

This example demonstrates a group chat with mixed client versions working together.

import os

# Configure LLM to use V2 client
llm_config = {
    "config_list": [
        {
            "api_type": "openai_v2",  # <-- Key: use V2 client architecture
            "model": "gpt-4o-mini",  # Vision-capable model
            "api_key": os.getenv("OPENAI_API_KEY"),
        }
    ],
    "temperature": 0.3,
}

llm_config_v1_gemini = {
    "config_list": [
        {
            "api_type": "google",  # <-- Key: use V1 client architecture
            "model": "gemini-2.5-flash",
            "api_key": os.getenv("GEMINI_API_KEY"),
        }
    ],
    "temperature": 0.3,
}

llm_config_v1_oai = {
    "config_list": [
        {
            "model": "gpt-4o-mini",
            "api_key": os.getenv("OPENAI_API_KEY"),
        }
    ],
    "temperature": 0.3,
}

Configuration: Three LLM Configs with Different Client Versions#

The three llm_config settings defined above demonstrate the V1 and V2 client versions:

  1. llm_config - V2 OpenAI client using "api_type": "openai_v2"
  2. llm_config_v1_gemini - V1 Gemini client using "api_type": "google"
  3. llm_config_v1_oai - V1 OpenAI client using default configuration (no explicit api_type)
# Group chat amongst agents to create a 4th grade lesson plan
# Flow determined by Group Chat Manager automatically, and
# should be Teacher > Planner > Reviewer > Teacher (repeats if necessary)

# 1. Import our agent and group chat classes
from autogen import ConversableAgent, GroupChat, GroupChatManager

# The llm_config objects defined earlier are reused for the agents below

# Planner agent setup
planner_message = "Create lesson plans for 4th grade. Use format: <title>, <learning_objectives>, <script>"
planner = ConversableAgent(
    name="planner_agent", llm_config=llm_config, system_message=planner_message, description="Creates lesson plans"
)

# Reviewer agent setup
reviewer_message = "Review lesson plans against 4th grade curriculum. Provide max 3 changes."
reviewer = ConversableAgent(
    name="reviewer_agent",
    llm_config=llm_config_v1_gemini,
    system_message=reviewer_message,
    description="Reviews lesson plans",
)

# Teacher agent setup
teacher_message = "Choose topics and work with planner and reviewer. Say DONE! when finished."
teacher = ConversableAgent(
    name="teacher_agent",
    llm_config=llm_config_v1_oai,
    system_message=teacher_message,
)

# Setup group chat
groupchat = GroupChat(agents=[teacher, planner, reviewer], speaker_selection_method="auto", messages=[])

# Create manager
# At each turn, the manager will check if the message contains DONE! and end the chat if so
# Otherwise, it will select the next appropriate agent using its LLM
manager = GroupChatManager(
    name="group_manager",
    groupchat=groupchat,
    llm_config=llm_config,
    is_termination_msg=lambda x: "DONE!" in (x.get("content", "") or "").upper(),
)
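The termination check passed to the manager above is a plain predicate over message dicts, so it can be sanity-checked in isolation before running the (non-deterministic) group chat:

```python
# Same predicate as in the GroupChatManager above
is_termination_msg = lambda x: "DONE!" in (x.get("content", "") or "").upper()

# The check is case-insensitive because the content is upper-cased first
assert is_termination_msg({"content": "Lesson approved. DONE!"})
assert is_termination_msg({"content": "done!"})

# Non-matching and None content are handled without raising
assert not is_termination_msg({"content": "Not finished yet"})
assert not is_termination_msg({"content": None})
```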

Group Chat with Mixed V1/V2 Clients#

The group chat created above combines agents using different client versions:

  • Planner Agent: Uses the V2 OpenAI client (llm_config)
  • Reviewer Agent: Uses the V1 Gemini client (llm_config_v1_gemini)
  • Teacher Agent: Uses the V1 OpenAI client (llm_config_v1_oai)
  • Group Manager: Uses the V2 OpenAI client (llm_config)

This demonstrates that V1 and V2 clients can work together seamlessly in the same conversation.

# Start the conversation
chat_result = teacher.initiate_chat(recipient=manager, message="Let's teach the kids about the solar system.")

# Print the chat history
print(chat_result.chat_history)
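Printing `chat_history` directly dumps a raw list of message dicts. Assuming each entry carries an optional "name"/"role" and a "content" field, a small formatting helper (our own, not an AG2 API) makes the transcript easier to read:

```python
def render_history(history):
    # Render each message dict as "speaker: content", falling back to the
    # role (or "unknown") when no agent name is present.
    lines = []
    for msg in history:
        speaker = msg.get("name") or msg.get("role", "unknown")
        lines.append(f"{speaker}: {msg.get('content', '')}")
    return "\n".join(lines)

# Illustrative sample in the assumed message-dict shape
sample = [
    {"name": "teacher_agent", "content": "Let's teach the kids about the solar system."},
    {"name": "planner_agent", "content": "Solar System, <learning_objectives>, <script>"},
]
print(render_history(sample))
```

In the notebook you would call `render_history(chat_result.chat_history)` instead of the plain `print`.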