# LiteLLM with WatsonX
LiteLLM is an open-source, locally run proxy server providing an OpenAI-compatible API. It supports various LLM providers, including IBM’s WatsonX, enabling seamless integration with tools like AG2.
Running LiteLLM with WatsonX requires the following installations:
- AG2 – A framework for building and orchestrating AI agents.
- LiteLLM – An OpenAI-compatible proxy that bridges APIs which don't follow the OpenAI format.
- IBM WatsonX – LLM service requiring specific session token authentication.
## Prerequisites
Before setting up, ensure Docker is installed. Refer to the Docker installation guide. Optionally, consider using Postman to easily test API requests.
## Installing WatsonX
To set up WatsonX, follow these steps:

1. Access WatsonX:
   - Sign up for WatsonX.ai.
   - Create an API_KEY and PROJECT_ID.
2. Validate WatsonX API access:
   - Verify access using the following commands.

Tip: Verify access to the WatsonX APIs before installing LiteLLM.
Get Session Token:
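A session token can be obtained by exchanging your API key with IBM Cloud's IAM service. A sketch, assuming the standard IAM token endpoint (`YOUR_API_KEY` is a placeholder for the key created above):

```shell
# Exchange an IBM Cloud API key for a short-lived bearer token.
curl -X POST "https://iam.cloud.ibm.com/identity/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=urn:ibm:params:oauth:grant-type:apikey&apikey=YOUR_API_KEY"
```

The JSON response contains an `access_token` field; use it as the bearer token in the requests below.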
Get list of LLMs:
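A sketch of a request that lists the available foundation models, assuming the `us-south` region and the `foundation_model_specs` endpoint of the watsonx.ai REST API (`YOUR_ACCESS_TOKEN` is the token from the previous step; adjust the region and `version` date for your deployment):

```shell
# List the foundation models available in this region.
curl -X GET "https://us-south.ml.cloud.ibm.com/ml/v1/foundation_model_specs?version=2024-03-14" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN"
```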
Ask the LLM a question:
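A sketch of a text-generation request against the watsonx.ai REST API; the model ID, prompt, region, and `version` date are illustrative, and `YOUR_PROJECT_ID` is the project ID created earlier:

```shell
# Send a prompt to a WatsonX-hosted model.
curl -X POST "https://us-south.ml.cloud.ibm.com/ml/v1/text/generation?version=2023-05-29" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "model_id": "ibm/granite-13b-chat-v2",
    "input": "What is the capital of France?",
    "project_id": "YOUR_PROJECT_ID"
  }'
```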
With access to the WatsonX APIs validated, you can install the Python library from https://ibm.github.io/watsonx-ai-python-sdk/install.html
## Installing LiteLLM
To install LiteLLM, follow these steps:

1. Download the LiteLLM Docker image:
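The proxy image is published on the GitHub Container Registry (the `main-latest` tag shown here is an assumption; check the LiteLLM docs for the current tag):

```shell
docker pull ghcr.io/berriai/litellm:main-latest
```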
   OR install the LiteLLM Python library:
2. Create a LiteLLM configuration file:
   - Save as `litellm_config.yaml` in a local directory.
   - Example content for WatsonX:
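A minimal sketch, assuming LiteLLM's `watsonx/` provider prefix and its `os.environ/` syntax for reading credentials from environment variables; the model name and ID are illustrative:

```yaml
model_list:
  - model_name: llama-3-8b
    litellm_params:
      # watsonx/ prefix routes the request through LiteLLM's WatsonX provider.
      model: watsonx/meta-llama/llama-3-8b-instruct
      api_key: "os.environ/WATSONX_APIKEY"
      project_id: "os.environ/WATSONX_PROJECT_ID"
```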
3. Start the LiteLLM container:
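A sketch of the `docker run` invocation, assuming the image pulled above and the environment variable names referenced in the example config; the credentials and region URL are placeholders:

```shell
# Mount the config, pass WatsonX credentials, and expose the proxy on port 4000.
docker run \
  -v $(pwd)/litellm_config.yaml:/app/config.yaml \
  -e WATSONX_APIKEY="YOUR_API_KEY" \
  -e WATSONX_URL="https://us-south.ml.cloud.ibm.com" \
  -e WATSONX_PROJECT_ID="YOUR_PROJECT_ID" \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml
```

Once the container is up, the proxy serves an OpenAI-compatible API at `http://localhost:4000`.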
## Installing AG2
AG2 simplifies orchestration and communication between agents. To install it:

1. Open a terminal with administrator rights.
2. Run the following command:
Once installed, AG2 agents can leverage WatsonX APIs via LiteLLM.