Create an OpenAI-compatible client for Gemini features.

Example:

llm_config = {"config_list": [
    {"api_type": "google", "model": "gemini-pro",
     "api_key": os.environ["GOOGLE_GEMINI_API_KEY"]}]}

GeminiClient

class GeminiClient()

Client for Google’s Gemini API.

Please visit this page for the roadmap of AutoGen's Gemini integration.

__init__

def __init__(**kwargs)

Uses either the api_key from the LLM config for authentication (setting the GOOGLE_GEMINI_API_KEY environment variable also works), or, if no api_key is specified, follows the Google authentication mechanism for VertexAI in Google Cloud, where project_id and location can also be passed as parameters. A previously created credentials object can be provided, or a service account key file can be used. If neither a service account key file nor an api_key is passed, the default credentials are used, which could be a personal account if the user is already authenticated, as in Google Cloud Shell.

Arguments:

  • api_key str - The API key for using Gemini.
  • credentials google.auth.credentials.Credentials - credentials to be used for authentication with vertexai.
  • google_application_credentials str - Path to the JSON service account key file of the service account. Alternatively, the GOOGLE_APPLICATION_CREDENTIALS environment variable can also be set instead of using this argument.
  • project_id str - Google Cloud project ID; only valid when no API key is specified.
  • location str - Compute region to be used, like ‘us-west1’; only valid when no API key is specified.
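The three authentication paths described above can be sketched as config dictionaries. This is an illustrative sketch: the model name, file path, project ID, and region below are placeholder assumptions, not requirements of the client.

```python
import os

# 1. Gemini API key taken from the LLM config (or the environment).
api_key_config = {
    "api_type": "google",
    "model": "gemini-pro",  # placeholder model name
    "api_key": os.environ.get("GOOGLE_GEMINI_API_KEY"),
}

# 2. VertexAI with a service account key file; project_id and location
#    apply because no API key is given.
service_account_config = {
    "api_type": "google",
    "model": "gemini-pro",
    "google_application_credentials": "service_account_key.json",  # placeholder path
    "project_id": "my-gcp-project",  # placeholder project ID
    "location": "us-west1",
}

# 3. VertexAI with default credentials (e.g. Google Cloud Shell):
#    omit both the api_key and the key file.
default_credentials_config = {
    "api_type": "google",
    "model": "gemini-pro",
    "project_id": "my-gcp-project",
    "location": "us-west1",
}
```

Each dictionary would be passed as an entry in `config_list` inside `llm_config`.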

message_retrieval

def message_retrieval(response) -> list

Retrieve and return a list of strings or a list of Choice.Message from the response.

NOTE: if a list of Choice.Message is returned, it currently needs to contain the fields of OpenAI’s ChatCompletion Message object, since that is expected for function or tool calling in the rest of the codebase at the moment, unless a custom agent is being used.
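To illustrate the expected message shape, here is a minimal sketch that maps a hypothetical Gemini-style response onto dictionaries mirroring OpenAI's ChatCompletion message fields. The mock response object and the helper are assumptions for illustration only, not the client's actual types.

```python
from types import SimpleNamespace

# Hypothetical stand-in for a Gemini response; real responses come from
# the Google SDKs used by the client.
mock_response = SimpleNamespace(
    candidates=[SimpleNamespace(text="The answer is 42.")]
)

def message_retrieval_sketch(response) -> list:
    # Each returned message mirrors the fields of OpenAI's ChatCompletion
    # message object, which the rest of the codebase expects for
    # function/tool calling.
    return [
        {"role": "assistant", "content": c.text, "tool_calls": None}
        for c in response.candidates
    ]

messages = message_retrieval_sketch(mock_response)
```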

get_usage

@staticmethod
def get_usage(response) -> dict

Return usage summary of the response using RESPONSE_USAGE_KEYS.
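A rough sketch of what such a usage summary could look like. The key names below are an assumption based on OpenAI-style usage reporting; the authoritative list is the RESPONSE_USAGE_KEYS constant in the client module.

```python
from types import SimpleNamespace

# Assumed key set for illustration; the real RESPONSE_USAGE_KEYS is
# defined in the Gemini client module.
RESPONSE_USAGE_KEYS = ["prompt_tokens", "completion_tokens", "total_tokens", "cost", "model"]

def get_usage_sketch(response) -> dict:
    # Collect only the usage-related attributes named in RESPONSE_USAGE_KEYS.
    return {key: getattr(response, key, None) for key in RESPONSE_USAGE_KEYS}

# Hypothetical response carrying usage attributes.
mock = SimpleNamespace(
    prompt_tokens=12, completion_tokens=30, total_tokens=42,
    cost=0.0003, model="gemini-pro",
)
usage = get_usage_sketch(mock)
```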