Prompt Management#
System Prompts#
Agents can be initialized with a static system prompt. You can provide a single string or a list of strings:
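A minimal sketch of both forms, using a stand-in `Agent` class rather than the framework's real constructor (which takes additional parameters):

```python
class Agent:
    """Stand-in Agent illustrating static system prompts only."""
    def __init__(self, prompt):
        # Normalize so a single string and a list of strings behave the same.
        self.prompt = [prompt] if isinstance(prompt, str) else list(prompt)

# A single string...
agent = Agent(prompt="You are a helpful assistant.")

# ...or a list of strings.
multi = Agent(prompt=[
    "You are a helpful assistant.",
    "Always answer in English.",
])
```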
Dynamic Prompts#
On conversation startup#
System prompts can be generated dynamically when a conversation starts. This is useful when the prompt depends on external state or initial context. You can achieve this by using the `@my_agent.prompt` decorator or by passing a synchronous or asynchronous function.
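As a rough sketch of the startup flow (the `Agent` class, the `dynamic_prompt` registration method, and the `start` method here are stand-ins, not the framework's real API; an async function would be awaited at the same point):

```python
class Agent:
    """Stand-in Agent; shows how dynamic prompts join the static ones."""
    def __init__(self, prompt):
        self._static = list(prompt)
        self._dynamic = []

    def dynamic_prompt(self, fn):
        # Decorator: register a function evaluated once at conversation start.
        self._dynamic.append(fn)
        return fn

    def start(self, context):
        # Dynamic results are appended to the static prompts; the combined
        # list is then reused for all subsequent turns.
        return self._static + [fn(context) for fn in self._dynamic]

agent = Agent(prompt=["You are a helpful assistant."])

@agent.dynamic_prompt
def user_greeting(context):
    return f"The user's name is {context['user_name']}."

full_prompt = agent.start({"user_name": "Ada"})
```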
Dynamic prompt functions support the same powerful execution context capabilities as Agent Tools. For more detailed information on specific context features, see Dependency Injection, Context Variables, and Human-in-the-loop.
Dynamic prompts are evaluated only once at the beginning of the conversation, and their results are appended to the static prompts and reused for subsequent turns.
Alternatively, you can pass a callable directly to the `prompt` parameter, or mix static strings and callables in a list:
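A sketch of the mixed-list form, again with a stand-in `Agent` whose `resolve_prompt` method simulates the one-time evaluation at conversation start:

```python
import datetime

def current_date(context=None):
    # A dynamic entry: evaluated once when the conversation begins.
    return f"Today's date is {datetime.date.today().isoformat()}."

class Agent:
    """Stand-in Agent accepting a mix of strings and callables."""
    def __init__(self, prompt):
        self._prompt = list(prompt)

    def resolve_prompt(self, context=None):
        # Callables are invoked once at startup; strings pass through as-is.
        return [p(context) if callable(p) else p for p in self._prompt]

agent = Agent(prompt=[
    "You are a helpful assistant.",  # static
    current_date,                    # dynamic
])
resolved = agent.resolve_prompt()
```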
On each conversation turn#
While dynamic prompt hooks are evaluated once per conversation, you might need to update the prompt dynamically on each turn. You can do this by mutating the prompt list within the Context directly between calls to `reply.ask()`.
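A simulated turn loop showing the pattern; the `Context` class and `ask` function here are stand-ins that only record which prompt was in effect for each turn, where the real `reply.ask()` would call the model:

```python
class Context:
    """Stand-in Context holding the mutable prompt list."""
    def __init__(self, prompt):
        self.prompt = list(prompt)

def ask(context, message):
    # Stand-in for reply.ask(): snapshot the prompt in effect for this turn.
    return {"prompt": list(context.prompt), "message": message}

context = Context(prompt=["You are a helpful assistant."])

first = ask(context, "Hello")

# Mutate the prompt list between turns; the next call sees the change.
context.prompt.append("From now on, answer in one sentence.")
second = ask(context, "Explain recursion")
```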
You can also completely override the agent's default prompt for a specific run or turn by passing the `prompt` parameter directly to `ask()`:
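A sketch of the per-call override, with a stand-in `Agent.ask` that returns the prompt it would have used; the real method would send it to the model instead:

```python
class Agent:
    """Stand-in Agent whose ask() accepts a one-off prompt override."""
    def __init__(self, prompt):
        self.prompt = list(prompt)

    def ask(self, message, prompt=None):
        # An explicit `prompt` argument replaces the default for this call
        # only; the agent's stored prompt is left untouched.
        effective = list(prompt) if prompt is not None else list(self.prompt)
        return {"prompt": effective, "message": message}

agent = Agent(prompt=["You are a helpful assistant."])
default_turn = agent.ask("Hi")
pirate_turn = agent.ask("Hi", prompt=["You are a pirate."])
```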
Prompt updates#
For continuous and event-driven prompt updates, you can mutate `context.prompt` dynamically from an event subscriber. This allows you to respond to specific events in the stream and adjust the agent's behavior on the fly during an ongoing conversation. See the Stream documentation for more details on this advanced feature.
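A sketch of the event-driven pattern; the `subscribe`/`emit` mechanism and the `tool_error` event name are illustrative stand-ins for the framework's stream events:

```python
class Context:
    """Stand-in Context with a toy pub/sub mechanism."""
    def __init__(self, prompt):
        self.prompt = list(prompt)
        self._subscribers = []

    def subscribe(self, fn):
        # Decorator: register an event subscriber.
        self._subscribers.append(fn)
        return fn

    def emit(self, event):
        for fn in self._subscribers:
            fn(event)

context = Context(prompt=["You are a helpful assistant."])

@context.subscribe
def on_tool_error(event):
    # React to a stream event by adjusting the prompt mid-conversation.
    if event.get("type") == "tool_error":
        context.prompt.append("A tool just failed; apologize and retry politely.")

context.emit({"type": "tool_error"})
```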