ContextStrLLMCondition

autogen.agentchat.group.llm_condition.ContextStrLLMCondition #

ContextStrLLMCondition(context_str, **data)

Bases: LLMCondition

Context variable-based LLM condition.

This condition uses a ContextStr object whose context variable placeholders are substituted before the resulting prompt is evaluated by an LLM.

Initialize with a context string as a positional parameter.

PARAMETER DESCRIPTION
context_str

The ContextStr object with variable placeholders

TYPE: ContextStr

data

Additional data for the parent class

TYPE: Any DEFAULT: {}

Source code in autogen/agentchat/group/llm_condition.py
def __init__(self, context_str: ContextStr, **data: Any) -> None:
    """Initialize with a context string as a positional parameter.

    Args:
        context_str: The ContextStr object with variable placeholders
        data: Additional data for the parent class
    """
    super().__init__(context_str=context_str, **data)
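For illustration, the sketch below emulates the placeholder substitution that a `ContextStr` template performs. `FakeContextStr` is a hypothetical stand-in written for this example; the real `ContextStr` lives in `autogen.agentchat.group` and may differ in detail (for instance, it accepts a `ContextVariables` object rather than a plain dict).

```python
# Illustrative sketch only: FakeContextStr is a stand-in, not the real
# autogen ContextStr. It models a template with {placeholder} slots that
# are filled from a dict of context variables.

class FakeContextStr:
    """Stand-in for ContextStr: a template with {placeholder} slots."""

    def __init__(self, template: str) -> None:
        self.template = template

    def format(self, context_variables: dict) -> str:
        # Substitute each {name} placeholder from the context variables.
        return self.template.format(**context_variables)


condition_prompt = FakeContextStr(
    "Is order {order_id} ready to ship? Current status: {status}."
)
print(condition_prompt.format({"order_id": "A-17", "status": "packed"}))
# → Is order A-17 ready to ship? Current status: packed.
```

A condition built this way lets the LLM evaluate a prompt that always reflects the agent's current context variables at the moment the condition is checked.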

context_str instance-attribute #

context_str

get_prompt #

get_prompt(agent, messages)

Return the prompt with context variables substituted.

PARAMETER DESCRIPTION
agent

The agent evaluating the condition (provides context variables)

TYPE: ConversableAgent

messages

The conversation history (not used)

TYPE: list[dict[str, Any]]

RETURNS DESCRIPTION
str

The prompt with context variables substituted

Source code in autogen/agentchat/group/llm_condition.py
def get_prompt(self, agent: "ConversableAgent", messages: list[dict[str, Any]]) -> str:
    """Return the prompt with context variables substituted.

    Args:
        agent: The agent evaluating the condition (provides context variables)
        messages: The conversation history (not used)

    Returns:
        The prompt with context variables substituted
    """
    result = self.context_str.format(agent.context_variables)
    return result if result is not None else ""
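A minimal sketch of `get_prompt`'s behavior, using hypothetical stand-in classes (`FakeContextStr` and `FakeAgent` are assumptions for this example; the real `ContextStr` and `ConversableAgent` types live in autogen). It mirrors the two documented behaviors: substitution from the agent's context variables, and falling back to an empty string when substitution yields `None`.

```python
# Illustrative sketch only: stand-ins for the real autogen types.

class FakeContextStr:
    """Stand-in for ContextStr: a template with {placeholder} slots."""

    def __init__(self, template: str) -> None:
        self.template = template

    def format(self, context_variables: dict) -> str:
        return self.template.format(**context_variables)


class FakeAgent:
    """Stand-in for ConversableAgent: only carries context variables."""

    def __init__(self, context_variables: dict) -> None:
        self.context_variables = context_variables


def get_prompt_sketch(context_str, agent, messages):
    # messages is accepted but unused, mirroring the documented signature.
    result = context_str.format(agent.context_variables)
    # Guard against a None result, as the real method does.
    return result if result is not None else ""


agent = FakeAgent({"user_name": "Ada"})
prompt = get_prompt_sketch(FakeContextStr("Has {user_name} confirmed?"), agent, [])
print(prompt)
# → Has Ada confirmed?
```

The `None` guard means callers always receive a string, so downstream LLM evaluation never has to handle a missing prompt.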