StringLLMCondition

autogen.agentchat.group.llm_condition.StringLLMCondition #

StringLLMCondition(prompt, **data)

Bases: LLMCondition

Simple string-based LLM condition.

This condition provides a static string prompt to be evaluated by an LLM.

Initialize with a prompt string as a positional parameter.

PARAMETER DESCRIPTION
prompt

The static prompt string to evaluate

TYPE: str

data

Additional data for the parent class

TYPE: Any DEFAULT: {}

Source code in autogen/agentchat/group/llm_condition.py
def __init__(self, prompt: str, **data: Any) -> None:
    """Initialize with a prompt string as a positional parameter.

    Args:
        prompt: The static prompt string to evaluate
        data: Additional data for the parent class
    """
    super().__init__(prompt=prompt, **data)
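
Construction can be illustrated with a minimal, self-contained sketch. The `LLMCondition` base class here is a simplified stand-in for illustration only; the real base class lives in `autogen.agentchat.group.llm_condition`:

```python
from typing import Any


class LLMCondition:
    """Simplified stand-in for the real autogen LLMCondition base class."""

    def __init__(self, **data: Any) -> None:
        for key, value in data.items():
            setattr(self, key, value)


class StringLLMCondition(LLMCondition):
    def __init__(self, prompt: str, **data: Any) -> None:
        # The prompt is accepted positionally and forwarded as a keyword
        # argument to the parent class, mirroring the documented __init__.
        super().__init__(prompt=prompt, **data)


condition = StringLLMCondition("Has the user provided their order number?")
print(condition.prompt)  # Has the user provided their order number?
```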

prompt instance-attribute #

prompt

get_prompt #

get_prompt(agent, messages)

Return the static prompt string.

PARAMETER DESCRIPTION
agent

The agent evaluating the condition (not used)

TYPE: ConversableAgent

messages

The conversation history (not used)

TYPE: list[dict[str, Any]]

RETURNS DESCRIPTION
str

The static prompt string

Source code in autogen/agentchat/group/llm_condition.py
def get_prompt(self, agent: "ConversableAgent", messages: list[dict[str, Any]]) -> str:
    """Return the static prompt string.

    Args:
        agent: The agent evaluating the condition (not used)
        messages: The conversation history (not used)

    Returns:
        The static prompt string
    """
    return self.prompt
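
Because the prompt is static, `get_prompt` returns the same string regardless of the agent or conversation history passed in. A minimal sketch of that behaviour, using a simplified stand-in for the real class (the genuine `agent` parameter expects a `ConversableAgent`, which is omitted here for brevity):

```python
from typing import Any


class StringLLMCondition:
    """Simplified stand-in mirroring the documented get_prompt behaviour."""

    def __init__(self, prompt: str) -> None:
        self.prompt = prompt

    def get_prompt(self, agent: Any, messages: list[dict[str, Any]]) -> str:
        # Both arguments are ignored; the stored prompt is returned as-is.
        return self.prompt


cond = StringLLMCondition("Is the task complete?")
# Neither the agent nor the message history affects the result.
print(cond.get_prompt(agent=None, messages=[{"role": "user", "content": "hi"}]))
# Is the task complete?
```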