LLMCondition
autogen.agentchat.group.llm_condition.LLMCondition #
Bases: BaseModel
Protocol for conditions evaluated by an LLM.
get_prompt #
Get the prompt text for LLM evaluation.
| PARAMETER | DESCRIPTION |
|---|---|
| `agent` | The agent evaluating the condition |
| `messages` | The conversation history |
| RETURNS | DESCRIPTION |
|---|---|
| `str` | The prompt text to be evaluated by the LLM |
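For illustration, a custom condition might implement `get_prompt` as in the sketch below. This is a minimal sketch, assuming a subclass only needs to return the prompt string; the class name `KeywordLLMCondition`, its `keyword` field, and the exact type annotations are assumptions for this example, not part of the documented API.

```python
from typing import Any

from autogen import ConversableAgent
from autogen.agentchat.group.llm_condition import LLMCondition


class KeywordLLMCondition(LLMCondition):
    """Hypothetical condition asking the LLM whether a keyword topic came up."""

    # Pydantic field (LLMCondition is a BaseModel), assumed for this sketch.
    keyword: str

    def get_prompt(self, agent: ConversableAgent, messages: list[dict[str, Any]]) -> str:
        # Return the prompt text the LLM will evaluate against the conversation.
        return f"Has the user asked about '{self.keyword}'? Answer True or False."
```

When the prompt is a fixed string, a ready-made implementation such as `StringLLMCondition` in the same module can be used instead of writing a custom subclass.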