
In this tutorial, we demonstrate how to integrate LLM tools from various frameworks, including LangChain Tools, CrewAI Tools, and PydanticAI Tools, into the AG2 framework. This enables smooth interoperability between these systems, allowing developers to leverage the unique capabilities of each toolset within AG2’s flexible agent-based architecture. By the end of this guide, you will understand how to configure agents, adapt these tools for use in AG2, and validate the integration through practical examples.

LangChain Tools Integration

LangChain is a popular framework that offers a wide range of tools for working with LLMs, many of which can be easily integrated into AG2. You can explore the available tools in the LangChain Community Tools folder. These tools, which cover tasks such as querying APIs, web scraping, and text generation, can be quickly incorporated into AG2, providing powerful functionality for your agents.

Installation

To integrate LangChain tools into the AG2 framework, install the required dependencies:

pip install ag2[interop-langchain]

Note: If you have been using autogen or pyautogen, all you need to do is upgrade it using:

pip install -U autogen[interop-langchain]

or

pip install -U pyautogen[interop-langchain]

as pyautogen, autogen, and ag2 are aliases for the same PyPI package.

Additionally, this notebook uses LangChain’s Wikipedia Tool, which requires the wikipedia package. Install it with:

pip install wikipedia

Imports

Import necessary modules and tools.

  • WikipediaQueryRun and WikipediaAPIWrapper: Tools for querying Wikipedia.
  • AssistantAgent and UserProxyAgent: Agents that facilitate communication in the AG2 framework.
  • Interoperability: This module acts as a bridge, making it easier to integrate LangChain tools with AG2’s architecture.
import os

from langchain_community.tools import WikipediaQueryRun
from langchain_community.utilities import WikipediaAPIWrapper

from autogen import AssistantAgent, UserProxyAgent
from autogen.interop import Interoperability

Agent Configuration

Configure the agents for the interaction.

  • config_list defines the LLM configurations, including the model and API key.
  • UserProxyAgent simulates user inputs without requiring actual human interaction (set to NEVER).
  • AssistantAgent represents the AI agent, configured with the LLM settings.
config_list = [{"model": "gpt-4o", "api_key": os.environ["OPENAI_API_KEY"]}]
user_proxy = UserProxyAgent(
    name="User",
    human_input_mode="NEVER",
)

chatbot = AssistantAgent(
    name="chatbot",
    llm_config={"config_list": config_list},
)

Tool Integration

Initialize and register the LangChain tool with AG2.

  • WikipediaAPIWrapper: Configured to fetch only the top result from Wikipedia, with a maximum of 1000 characters per document.
  • WikipediaQueryRun: A LangChain tool that executes Wikipedia queries.
  • Interoperability: Converts the LangChain tool into a format compatible with the AG2 framework.
  • ag2_tool.register_for_execution(user_proxy): Registers the tool for use by the user_proxy agent.
  • ag2_tool.register_for_llm(chatbot): Registers the tool for integration with the chatbot agent.
api_wrapper = WikipediaAPIWrapper(top_k_results=1, doc_content_chars_max=1000)
langchain_tool = WikipediaQueryRun(api_wrapper=api_wrapper)

interop = Interoperability()
ag2_tool = interop.convert_tool(tool=langchain_tool, type="langchain")

ag2_tool.register_for_execution(user_proxy)
ag2_tool.register_for_llm(chatbot)
message = "Tell me about the history of the United States"
user_proxy.initiate_chat(recipient=chatbot, message=message, max_turns=2)
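
The call above discards the return value of initiate_chat. If you would also like to print a summary of the conversation, as the CrewAI example below does, capture the returned ChatResult; a minimal sketch:

chat_result = user_proxy.initiate_chat(recipient=chatbot, message=message, max_turns=2)
print(chat_result.summary)  # summary of the chat produced by AG2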

CrewAI Tools Integration

CrewAI provides a variety of powerful tools designed for tasks such as web scraping, search, code interpretation, and more. These tools are easy to integrate into the AG2 framework, allowing you to enhance your agents with advanced capabilities. You can explore the full list of available tools in the CrewAI Tools repository.

Installation

Install the required packages for integrating CrewAI tools into the AG2 framework. This ensures all dependencies for both frameworks are installed.

pip install ag2[interop-crewai]

Note: If you have been using autogen or pyautogen, all you need to do is upgrade it using:

pip install -U autogen[interop-crewai]

or

pip install -U pyautogen[interop-crewai]

as pyautogen, autogen, and ag2 are aliases for the same PyPI package.

Imports

Import necessary modules and tools.

  • ScrapeWebsiteTool: The CrewAI tool for web scraping.
  • AssistantAgent and UserProxyAgent: Agents that facilitate communication in the AG2 framework.
  • Interoperability: This module acts as a bridge, making it easier to integrate CrewAI tools with AG2’s architecture.
import os

from crewai_tools import ScrapeWebsiteTool

from autogen import AssistantAgent, UserProxyAgent
from autogen.interop import Interoperability

Agent Configuration

Configure the agents for the interaction.

  • config_list defines the LLM configurations, including the model and API key.
  • UserProxyAgent simulates user inputs without requiring actual human interaction (set to NEVER).
  • AssistantAgent represents the AI agent, configured with the LLM settings.
config_list = [{"model": "gpt-4o", "api_key": os.environ["OPENAI_API_KEY"]}]
user_proxy = UserProxyAgent(
    name="User",
    human_input_mode="NEVER",
)

chatbot = AssistantAgent(
    name="chatbot",
    llm_config={"config_list": config_list},
)

Tool Integration

Initialize and register the CrewAI tool with AG2.

  • crewai_tool is an instance of the ScrapeWebsiteTool from CrewAI.
  • Interoperability converts the CrewAI tool to make it usable in AG2.
  • register_for_execution and register_for_llm allow the tool to work with the UserProxyAgent and AssistantAgent.
interop = Interoperability()
crewai_tool = ScrapeWebsiteTool()
ag2_tool = interop.convert_tool(tool=crewai_tool, type="crewai")

ag2_tool.register_for_execution(user_proxy)
ag2_tool.register_for_llm(chatbot)

message = "Scrape the website https://ag2.ai/"

chat_result = user_proxy.initiate_chat(recipient=chatbot, message=message, max_turns=2)
print(chat_result.summary)

PydanticAI Tools Integration

PydanticAI is a newer framework that offers powerful capabilities for working with LLMs. Although it does not yet have a repository of pre-built tools, it provides features such as dependency injection, which lets you inject a “Context” into a tool and use it to pass parameters or manage state during execution without exposing them to the LLM. While the framework is still growing, you can integrate its tools into AG2 to enhance agent capabilities, especially for tasks that involve structured data and context-driven logic.

Installation

To integrate PydanticAI tools into the AG2 framework, install the required dependencies:

pip install ag2[interop-pydantic-ai]

Note: If you have been using autogen or pyautogen, all you need to do is upgrade it using:

pip install -U autogen[interop-pydantic-ai]

or

pip install -U pyautogen[interop-pydantic-ai]

as pyautogen, autogen, and ag2 are aliases for the same PyPI package.

Imports

Import necessary modules and tools.

  • BaseModel: Used to define data structures for tool inputs and outputs.
  • RunContext: Provides context during the execution of tools.
  • PydanticAITool: Represents a tool in the PydanticAI framework.
  • AssistantAgent and UserProxyAgent: Agents that facilitate communication in the AG2 framework.
  • Interoperability: This module acts as a bridge, making it easier to integrate PydanticAI tools with AG2’s architecture.
import os
from typing import Optional

from pydantic import BaseModel
from pydantic_ai import RunContext
from pydantic_ai.tools import Tool as PydanticAITool

from autogen import AssistantAgent, UserProxyAgent
from autogen.interop import Interoperability

Agent Configuration

Configure the agents for the interaction.

  • config_list defines the LLM configurations, including the model and API key.
  • UserProxyAgent simulates user inputs without requiring actual human interaction (set to NEVER).
  • AssistantAgent represents the AI agent, configured with the LLM settings.
config_list = [{"model": "gpt-4o", "api_key": os.environ["OPENAI_API_KEY"]}]
user_proxy = UserProxyAgent(
    name="User",
    human_input_mode="NEVER",
)

chatbot = AssistantAgent(
    name="chatbot",
    llm_config={"config_list": config_list},
)

Tool Integration

Integrate the PydanticAI tool with AG2.

  • Define a Player model using BaseModel to structure the input data.
  • Use RunContext to securely inject dependencies (like the Player instance) into the tool function without exposing them to the LLM.
  • Implement get_player to define the tool’s functionality, accessing ctx.deps for injected data.
  • Convert the PydanticAI tool into an AG2-compatible format using convert_tool, passing the Player instance as deps.
  • Register the converted tool for execution with user_proxy and for LLM use with chatbot.
class Player(BaseModel):
    name: str
    age: int


def get_player(ctx: RunContext[Player], additional_info: Optional[str] = None) -> str:  # type: ignore[valid-type]
    """Get the player's name.

    Args:
        additional_info: Additional information which can be used.
    """
    return f"Name: {ctx.deps.name}, Age: {ctx.deps.age}, Additional info: {additional_info}"  # type: ignore[attr-defined]


interop = Interoperability()
pydantic_ai_tool = PydanticAITool(get_player, takes_ctx=True)

# player will be injected as a dependency
player = Player(name="Luka", age=25)
ag2_tool = interop.convert_tool(tool=pydantic_ai_tool, type="pydanticai", deps=player)

ag2_tool.register_for_execution(user_proxy)
ag2_tool.register_for_llm(chatbot)

Initiate a conversation between the UserProxyAgent and the AssistantAgent.

  • Use the initiate_chat method to send a message from the user_proxy to the chatbot.
  • In this example, the user requests the chatbot to retrieve player information, providing “goal keeper” as additional context.
  • The Player instance is securely injected into the tool using RunContext, ensuring the chatbot can retrieve and use this data during the interaction.
user_proxy.initiate_chat(
    recipient=chatbot, message="Get player, for additional information use 'goal keeper'", max_turns=3
)
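
Because the Player instance is passed to convert_tool through the deps argument, the injected data is fixed at conversion time. If you need the same PydanticAI tool to operate on different data, you can convert it again with another Player instance; a minimal sketch (the second player and the ag2_tool_for_ana name are illustrative, not part of the example above):

# Illustrative: bind a second AG2 tool to a different Player dependency
another_player = Player(name="Ana", age=30)
ag2_tool_for_ana = interop.convert_tool(tool=pydantic_ai_tool, type="pydanticai", deps=another_player)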