OllamaLLMConfigEntry
autogen.oai.ollama.OllamaLLMConfigEntry
Bases: LLMConfigEntry
num_predict (class attribute, instance attribute)

num_predict = Field(default=-1, description='Maximum number of tokens to predict, note: -1 is infinite (default), -2 is fill context.')
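A minimal construction sketch, assuming model is the only required field; the model name "llama3.2" and the 512-token cap are illustrative values, not library defaults.

from autogen.oai.ollama import OllamaLLMConfigEntry

# Illustrative values: "llama3.2" is a placeholder model name and 512 is an
# arbitrary cap; num_predict=-1 (the default) means no prediction limit.
entry = OllamaLLMConfigEntry(model="llama3.2", num_predict=512)
print(entry.num_predict)  # 512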
model_config (class attribute, instance attribute)
create_client
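A hedged sketch of the client factory; that create_client() is called with no arguments and returns a model-client wrapper is an assumption, since the signature is not shown on this page.

# Assumption: create_client() takes no arguments and returns the client
# object used to talk to the Ollama server.
entry = OllamaLLMConfigEntry(model="llama3.2")
client = entry.create_client()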
check_top_p (classmethod)

Source code in autogen/llm_config/entry.py
check_temperature (classmethod)

Source code in autogen/llm_config/entry.py
apply_application_config

Apply application-level configurations.
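For context, a usage sketch showing the entry placed inside an LLMConfig and handed to an agent; the model name, the config_list keyword, and the ConversableAgent import are assumptions based on the wider AG2 API rather than on this page, and apply_application_config is not called directly here.

from autogen import ConversableAgent, LLMConfig
from autogen.oai.ollama import OllamaLLMConfigEntry

# Placeholder model name; passing entries via config_list is assumed here.
llm_config = LLMConfig(config_list=[OllamaLLMConfigEntry(model="llama3.2")])
assistant = ConversableAgent(name="assistant", llm_config=llm_config)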