LLMCondition

autogen.agentchat.group.llm_condition.LLMCondition

Bases: BaseModel

Protocol for conditions that are evaluated by an LLM. Subclasses supply the prompt text that the LLM uses to decide whether the condition holds.

get_prompt

get_prompt(agent, messages)

Get the prompt text for LLM evaluation.

PARAMETERS

| Name | Type | Description |
| --- | --- | --- |
| `agent` | `ConversableAgent` | The agent evaluating the condition |
| `messages` | `list[dict[str, Any]]` | The conversation history |

RETURNS

| Type | Description |
| --- | --- |
| `str` | The prompt text to be evaluated by the LLM |

Source code in autogen/agentchat/group/llm_condition.py
def get_prompt(self, agent: "ConversableAgent", messages: list[dict[str, Any]]) -> str:
    """Get the prompt text for LLM evaluation.

    Args:
        agent: The agent evaluating the condition
        messages: The conversation history

    Returns:
        The prompt text to be evaluated by the LLM
    """
    raise NotImplementedError("Requires subclasses to implement.")
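Since `get_prompt` raises `NotImplementedError`, a concrete subclass must override it. The sketch below illustrates the protocol with a stand-in base class and a hypothetical `StaticPromptCondition` subclass (both names are assumptions for illustration, not part of the library) that returns a fixed prompt string regardless of the conversation history:

```python
from typing import Any


# Minimal stand-in for the protocol shown above; the real class lives in
# autogen.agentchat.group.llm_condition and derives from pydantic's BaseModel.
class LLMCondition:
    def get_prompt(self, agent: Any, messages: list[dict[str, Any]]) -> str:
        raise NotImplementedError("Requires subclasses to implement.")


# Hypothetical subclass: the simplest useful implementation, which ignores
# the agent and the message history and returns a fixed prompt for the LLM
# to evaluate as a yes/no question.
class StaticPromptCondition(LLMCondition):
    def __init__(self, prompt: str) -> None:
        self.prompt = prompt

    def get_prompt(self, agent: Any, messages: list[dict[str, Any]]) -> str:
        return self.prompt


condition = StaticPromptCondition(
    "Has the user confirmed the order? Answer True or False."
)
print(condition.get_prompt(agent=None, messages=[{"role": "user", "content": "Yes, confirmed."}]))
```

A condition that inspects `messages` (e.g. interpolating the last user turn into the prompt) would follow the same shape, overriding only `get_prompt`.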