ContextStrLLMCondition

autogen.agentchat.group.llm_condition.ContextStrLLMCondition #

Bases: LLMCondition

Context variable-based LLM condition.

This condition wraps a ContextStr object containing context-variable placeholders; the placeholders are substituted with the agent's current context variables before the resulting prompt is evaluated by an LLM.
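For illustration, the placeholder syntax follows Python's `str.format` convention: names in braces are replaced by the corresponding context-variable values. The template below is a hypothetical example, not taken from the library; plain `str.format` is used here as a stand-in for the substitution that ContextStr performs.

```python
# Hypothetical template illustrating {name} placeholder syntax.
# Plain str.format stands in for ContextStr's substitution step.
template = "Decide whether the {department} ticket with urgency {urgency} should escalate."

# With context variables department="billing" and urgency="high":
prompt = template.format(department="billing", urgency="high")
print(prompt)
```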

context_str instance-attribute #

context_str

get_prompt #

get_prompt(agent, messages)

Return the prompt with context variables substituted.

PARAMETER DESCRIPTION
agent

The agent evaluating the condition (provides context variables)

TYPE: ConversableAgent

messages

The conversation history (not used)

TYPE: list[dict[str, Any]]

RETURNS DESCRIPTION
str

The prompt with context variables substituted

Source code in autogen/agentchat/group/llm_condition.py
def get_prompt(self, agent: "ConversableAgent", messages: list[dict[str, Any]]) -> str:
    """Return the prompt with context variables substituted.

    Args:
        agent: The agent evaluating the condition (provides context variables)
        messages: The conversation history (not used)

    Returns:
        The prompt with context variables substituted
    """
    result = self.context_str.format(agent.context_variables)
    return result if result is not None else ""
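To show the end-to-end flow without requiring an autogen installation, the following is a minimal self-contained sketch. `ContextStr`, `ContextStrLLMCondition`, and `FakeAgent` here are simplified stand-ins (the real `ContextStr.format` accepts a `ContextVariables` object rather than a plain dict), but `get_prompt` mirrors the source above, including the fallback to an empty string when substitution returns `None`.

```python
from typing import Any, Optional


class ContextStr:
    """Stand-in: wraps a template and formats it with context variables."""

    def __init__(self, template: str) -> None:
        self.template = template

    def format(self, context_variables: dict[str, Any]) -> Optional[str]:
        # Substitute {name} placeholders from the supplied variables.
        return self.template.format(**context_variables)


class ContextStrLLMCondition:
    """Stand-in condition mirroring the get_prompt logic shown above."""

    def __init__(self, context_str: ContextStr) -> None:
        self.context_str = context_str

    def get_prompt(self, agent: Any, messages: list[dict[str, Any]]) -> str:
        # Substitute the agent's context variables; fall back to "" on None.
        result = self.context_str.format(agent.context_variables)
        return result if result is not None else ""


class FakeAgent:
    """Stand-in for ConversableAgent: only context_variables is needed."""

    def __init__(self, context_variables: dict[str, Any]) -> None:
        self.context_variables = context_variables


condition = ContextStrLLMCondition(ContextStr("Is order {order_id} overdue?"))
agent = FakeAgent({"order_id": "A-42"})
print(condition.get_prompt(agent, []))  # Is order A-42 overdue?
```

Note that `messages` is accepted but ignored, matching the documented signature: the prompt depends only on the agent's context variables, not on conversation history.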