Tools with ChatContext Dependency Injection
Introduction
In this post, we’ll build upon the concepts introduced in our previous blog on Tools with Dependency Injection. We’ll take a deeper look at how ChatContext can be used to manage the flow of conversations in a more structured and secure way.
By using ChatContext, we can track and control the sequence of function calls during a conversation. This is particularly useful in situations where one task must be completed before another, for example ensuring that a user logs in before they can check their account balance. This approach helps to prevent errors and enhances the security of the system.
Benefits of Using ChatContext:
- Flow Control: Ensures tasks are performed in the correct order, reducing the chance of mistakes.
- Enhanced Security: Prevents unauthorized actions, such as accessing sensitive data before authentication.
- Simplified Debugging: Logs the conversation history, making it easier to trace and resolve issues.
Installation
To install AG2, simply run the following command (shown here assuming the ag2 package on PyPI; adjust to your environment as needed):
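```bash
pip install ag2
```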
Imports
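A sketch of the imports the rest of the example relies on; this assumes the dependency-injection helpers, including ChatContext, live in autogen.tools.dependency_injection as in the previous post, so adjust the module paths if your AG2 version differs:

```python
import os
from typing import Annotated, Literal

from pydantic import BaseModel

from autogen import AssistantAgent, UserProxyAgent
from autogen.tools.dependency_injection import BaseContext, ChatContext, Depends
```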
Account and Helper Functions
The following Account class and helper functions are adapted from the Tools with Dependency Injection post. They define the structure for securely handling account data and operations like login and balance retrieval.
Define Account Class
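A minimal sketch of the context class, following the shape used in the previous post; the example accounts and the in-memory account_balance_dict lookup table are illustrative assumptions:

```python
class Account(BaseContext, BaseModel):
    """User account data that is injected into tools instead of being exposed to the LLM."""

    username: str
    password: str
    currency: Literal["USD", "EUR"] = "USD"


# Example accounts and a simple in-memory balance store (illustrative values).
alice_account = Account(username="alice", password="password123")
bob_account = Account(username="bob", password="password456")

account_balance_dict = {
    (alice_account.username, alice_account.password): 300,
    (bob_account.username, bob_account.password): 200,
}
```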
Helper Functions
These functions validate account credentials and retrieve account balances.
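Hedged sketches of the two helpers, assuming the same responsibilities as in the previous post: a credential check and a balance lookup against the dictionary defined above.

```python
def _verify_user(account: Account, username: str, password: str) -> None:
    """Raise if the provided credentials do not match the injected account."""
    if not (account.username == username and account.password == password):
        raise ValueError("Invalid username or password")


def _get_balance(account: Account) -> str:
    """Look up the balance for the injected account."""
    balance = account_balance_dict[(account.username, account.password)]
    return f"Your balance is {balance} {account.currency}"
```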
Agent Configuration
Configure the agents for the interaction.
- config_list defines the LLM configurations, including the model and API key.
- UserProxyAgent simulates user inputs without requiring actual human interaction (set to NEVER).
- AssistantAgent represents the AI agent, configured with the LLM settings.
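A sketch of that configuration; the model name, environment variable, and agent names are assumptions you can swap for your own (the assistant is named agent here to match the initiate_chat call later in the post):

```python
config_list = [{"model": "gpt-4o-mini", "api_key": os.environ["OPENAI_API_KEY"]}]

# The assistant suggests tool calls; the user proxy executes them without human input.
agent = AssistantAgent(
    name="agent",
    llm_config={"config_list": config_list},
)
user_proxy = UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    llm_config=False,
)
```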
Injecting a ChatContext Parameter
Now let’s upgrade the example from the previous post by introducing the ChatContext parameter. This enhancement allows us to enforce proper execution order in the workflow, ensuring that users log in before accessing sensitive data like account balances.
The following functions will be registered (a sketch of both follows the list):
- login: Verifies the user’s credentials and ensures they are logged in.
- get_balance: Retrieves the account balance, but only if the user has successfully logged in first.
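Below is a hedged sketch of both tools. The registration decorators and Depends injection follow the previous post; the ChatContext parameter is assumed to be injected automatically based on its type annotation, and the attribute used to read the history (chat_messages) may differ slightly between AG2 versions:

```python
# The account injected into both tools; credentials never pass through the LLM.
user_account = bob_account


@user_proxy.register_for_execution()
@agent.register_for_llm(description="Login with the user's stored credentials")
def login(
    account: Annotated[Account, Depends(user_account)],
) -> str:
    # Credentials come from the injected account, not from the model.
    _verify_user(account, account.username, account.password)
    return "You are logged in"


@user_proxy.register_for_execution()
@agent.register_for_llm(description="Get the balance of the account")
def get_balance(
    account: Annotated[Account, Depends(user_account)],
    chat_ctx: ChatContext,
) -> str:
    _verify_user(account, account.username, account.password)

    # Scan the conversation history: the balance is only returned if a
    # successful login message appears earlier in the chat.
    for messages in chat_ctx.chat_messages.values():
        for message in messages:
            if "You are logged in" in str(message.get("content", "")):
                return _get_balance(account)

    raise ValueError("Please login first")
```

Because the account is supplied through Depends, the username and password never appear in the messages the LLM sees; the ChatContext check is what gates get_balance on a prior successful login.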
Finally, we initiate a chat to retrieve the balance.
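Concretely, using the agent and user_proxy names from the sketch above:

```python
user_proxy.initiate_chat(agent, message="Get users balance", max_turns=4)
```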
When user_proxy.initiate_chat(agent, message="Get users balance", max_turns=4) is called, the following happens:
- User Request: The user sends the message "Get users balance", prompting the agent to suggest calling get_balance.
- First Function Call: The agent attempts to execute get_balance, but since the user isn’t logged in, it raises an error: "Please login first". The agent then suggests calling login to authenticate the user.
- Login Execution: The login function is executed, confirming the user’s login with "You are logged in".
- Second Function Call: With the user logged in, the agent successfully calls get_balance and retrieves the balance.
- Final Response: The agent responds with "Your balance is 200 USD", confirming the correct execution of the workflow.
Conclusion
In this post, we introduced ChatContext as an extension of dependency injection, showing how it simplifies and secures conversational AI workflows by ensuring that tasks like login happen before sensitive data is accessed, improving both functionality and security.
Highlights:
- Flow Control: ChatContext can ensure that functions are executed in the correct sequence, reducing the risk of logical errors.
- Enhanced Security: Sensitive actions like balance retrieval are protected by preconditions such as login verification.
- Simplified Debugging: With ChatContext, developers can trace conversations, making it easier to identify and resolve issues.
By leveraging tools like ChatContext and employing dependency injection principles, you can build robust, secure, and user-friendly conversational workflows.