
Grok example with live search


Summary and Best Practices

This notebook demonstrates a working Grok integration with AG2. Here are the key takeaways:

✅ What Works

  1. OpenAI Compatibility: Grok works seamlessly with AG2’s OpenAI client
  2. Real-time Search: The extra_body parameter enables Grok’s search capabilities
  3. Function Calling: Works once functions are registered on the agent via the functions parameter, not in LLMConfig
  4. Standard AG2 Patterns: All existing AG2 agent patterns work with Grok
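
The core pattern behind these four points is small. Here is a condensed sketch of the examples below (it assumes XAI_API_KEY is set; see the full examples later in this notebook for runnable versions):

import os

from autogen import AssistantAgent, LLMConfig

config = LLMConfig(
    config_list=[
        {
            "model": "grok-4",
            "api_type": "openai",  # reuse AG2's OpenAI-compatible client
            "base_url": "https://api.x.ai/v1",
            "api_key": os.getenv("XAI_API_KEY"),
            "extra_body": {"search_enabled": True},  # Grok search flags ride along per request
        }
    ],
)
assistant = AssistantAgent(name="grok", llm_config=config)  # pass functions=[...] here, never in LLMConfig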

Environment Setup

Before running this notebook, ensure you have:

  1. API Key: Get your Grok API key from x.ai

  2. Environment Variable:

    export XAI_API_KEY="your-actual-grok-api-key"
    
  3. Dependencies:

    pip install "ag2[openai]"
    

Security Note: Never hardcode API keys in notebooks. Always use environment variables for authentication.
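
You can sanity-check the setup before running anything. This is a minimal sketch; it only verifies that the variable is set, not that the key is valid:

import os

if not os.getenv("XAI_API_KEY"):
    raise RuntimeError("XAI_API_KEY is not set - export it before running this notebook")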

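Example 1: Basic Conversation with Real-time Search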

import os

from autogen import AssistantAgent, LLMConfig, UserProxyAgent

# Example 1: Using Grok with default OpenAI client (OpenAI-compatible)
grok4_config = LLMConfig(
    config_list=[
        {
            "model": "grok-4",
            "api_type": "openai",  # Use existing openai type only
            "base_url": "https://api.x.ai/v1",
            "api_key": os.getenv("XAI_API_KEY"),
            "max_tokens": 1000,
            # Test if extra_body works for search parameters
            "extra_body": {
                "search_enabled": True,
                "real_time_data": True,
                "search_parameters": {
                    "max_search_results": 5,
                    "include_citations": True,
                    "search_timeout": 10,
                    "return_citations": True,
                },
            },
        }
    ],
    temperature=0.5,
)

This example demonstrates basic conversation with Grok, leveraging its real-time search capabilities through the extra_body parameter.

Key features:

  - Real-time data access: Grok can access current information through web search
  - Search parameters: Configure max results, citations, and timeout settings
  - Standard OpenAI compatibility: Works with existing AG2 patterns

# Create agents
assistant = AssistantAgent(
    name="grok_assistant",
    system_message="You are a helpful AI assistant powered by Grok. You have access to real-time information and can help with various tasks.",
    llm_config=grok4_config,
)

user_proxy = UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=3,
    is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
    code_execution_config={
        "work_dir": "coding",
        "use_docker": False,  # Set to True if you have Docker available
    },
)

user_proxy.initiate_chat(
    assistant,
    message="What's the weather like in Tokyo? Also, can you calculate 15 * 23 + 7?",
    max_turns=2,
    clear_history=True,
)

Result Analysis

Notice that the conversation completed successfully! Grok was able to:

  - Provide today’s date correctly (July 18, 2025)
  - Access real-time information through its built-in search capabilities

The search parameters passed in extra_body enabled this enhanced search functionality, and the response shows that Grok’s real-time search integration works seamlessly with AG2.
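
Because Grok is served through an OpenAI-compatible endpoint, you can also sanity-check the setup outside AG2 with the plain openai client. This is a minimal sketch, assuming the openai package is installed (it comes with ag2[openai]) and XAI_API_KEY is set:

import os

from openai import OpenAI

client = OpenAI(base_url="https://api.x.ai/v1", api_key=os.getenv("XAI_API_KEY"))
response = client.chat.completions.create(
    model="grok-4",
    messages=[{"role": "user", "content": "What's today's date?"}],
)
print(response.choices[0].message.content)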

Example 2: Function Calling with Grok

This example demonstrates how to use Grok with function calling capabilities. The part that is easy to get wrong is AG2’s function-registration pattern: functions must be passed to the agent, not placed in LLMConfig, which would fail with a JSON serialization error.

# Example 2: Using Grok with function calling capabilities
print("\n=== Grok with Function Calling ===")

from typing import Annotated

from autogen.tools import tool

@tool(description="Get current weather for a city")
def get_weather(city: Annotated[str, "The city name"]) -> str:
    """Get current weather for a city."""
    # This is a mock function - in reality, you'd call a weather API
    return f"The current weather in {city} is sunny with a temperature of 22°C."

@tool(description="Calculate a mathematical expression")
def calculate_math(expression: Annotated[str, "Mathematical expression to evaluate"]) -> str:
    """Calculate a mathematical expression safely."""
    try:
        # Simple evaluation - in production, use a safer math parser
        result = eval(expression.replace("^", "**"))
        return f"The result of {expression} is {result}"
    except Exception:
        return f"Could not evaluate the expression: {expression}"

# Configure Grok - FIXED: Remove tools from LLMConfig (causes JSON serialization error)
function_config = LLMConfig(
    config_list=[
        {
            "model": "grok-4",
            "api_key": os.getenv("XAI_API_KEY"),
            "base_url": "https://api.x.ai/v1",
            "api_type": "openai",  # Using OpenAI-compatible client for function calling
        }
    ],
    temperature=0.3,  # Temperature goes at top level, not in config_list
    max_tokens=800,  # max_tokens also at top level
    # NOTE: DO NOT use tools=[functions] here - causes "Object of type function is not JSON serializable"
)

# Create function-calling assistant - FIXED: Pass functions to ConversableAgent, not LLMConfig
function_assistant = AssistantAgent(
    name="grok_function_assistant",
    system_message="You are a helpful assistant that can call functions to get weather information and perform calculations. Use the available tools when appropriate.",
    llm_config=function_config,
    functions=[get_weather, calculate_math],  # CORRECT: Functions go here, not in LLMConfig.tools
)

# Run the function-calling assistant directly; run() returns a response object
result = function_assistant.run(
    message="What's the weather like in Tokyo? Also, can you calculate 15 * 23 + 7?",
    max_turns=2,
    tools=[get_weather, calculate_math],
)

# process() consumes the run's events and prints the conversation as it happens
result.process()
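
The same assistant can also be driven by the user proxy from Example 1, using the standard two-agent pattern:

# Alternative: let user_proxy orchestrate the chat, as in Example 1
user_proxy.initiate_chat(
    function_assistant,
    message="What's the weather like in Tokyo? Also, can you calculate 15 * 23 + 7?",
    max_turns=2,
    clear_history=True,
)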