StringLLMCondition

autogen.agentchat.group.llm_condition.StringLLMCondition

Bases: LLMCondition

Simple string-based LLM condition.

This condition provides a static string prompt to be evaluated by an LLM.
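Example (a minimal usage sketch, not part of the generated reference): the condition is constructed with its prompt keyword and is typically attached to a group-chat handoff. OnCondition, AgentTarget, and handoffs.add_llm_conditions below come from the same autogen.agentchat.group handoff machinery in current AG2 releases; treat those names and the exact call pattern as assumptions rather than part of this class's API.

from autogen import ConversableAgent
from autogen.agentchat.group import AgentTarget, OnCondition, StringLLMCondition

# Two agents; llm_config is disabled only to keep the sketch self-contained.
triage = ConversableAgent(name="triage", llm_config=False)
billing = ConversableAgent(name="billing", llm_config=False)

# Hand off from triage to billing whenever the LLM judges that the static
# prompt below holds for the current conversation (assumed handoff API).
triage.handoffs.add_llm_conditions([
    OnCondition(
        target=AgentTarget(billing),
        condition=StringLLMCondition(prompt="The user is asking about billing or invoices."),
    )
])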

prompt instance-attribute

prompt

The static prompt string to provide to the LLM for evaluation.

get_prompt

get_prompt(agent, messages)

Return the static prompt string.

PARAMETERS

    agent (ConversableAgent): The agent evaluating the condition (not used)
    messages (list[dict[str, Any]]): The conversation history (not used)

RETURNS

    str: The static prompt string

Source code in autogen/agentchat/group/llm_condition.py
def get_prompt(self, agent: "ConversableAgent", messages: list[dict[str, Any]]) -> str:
    """Return the static prompt string.

    Args:
        agent: The agent evaluating the condition (not used)
        messages: The conversation history (not used)

    Returns:
        The static prompt string
    """
    return self.prompt
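
Quick illustration of the behaviour documented above: get_prompt accepts an agent and the message history but uses neither, returning the stored prompt verbatim. A minimal sketch, assuming the usual keyword construction of the condition; the throwaway agent is created with llm_config=False purely so the call can run.

from autogen import ConversableAgent
from autogen.agentchat.group import StringLLMCondition

condition = StringLLMCondition(prompt="Has the user confirmed their order number?")

# Both arguments are accepted but ignored; the static prompt is returned as-is.
dummy_agent = ConversableAgent(name="dummy", llm_config=False)
print(condition.get_prompt(dummy_agent, []))  # -> Has the user confirmed their order number?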