
remote::ollama

Description

Ollama inference provider for running local models through the Ollama runtime.

Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `allowed_models` | `list[str] \| None` | No | | List of models that should be registered with the model registry. If `None`, all models are allowed. |
| `refresh_models` | `bool` | No | `False` | Whether to refresh models periodically from the provider. |
| `url` | `str` | No | `http://localhost:11434` | |

Sample Configuration

```yaml
url: ${env.OLLAMA_URL:=http://localhost:11434}
```
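For context, a sketch of how this provider's configuration might appear inside a full run configuration is shown below. The `provider_id` and the model tag are illustrative assumptions, not values taken from this page; the field names under `config` match the table above.

```yaml
providers:
  inference:
    - provider_id: ollama            # illustrative identifier (assumption)
      provider_type: remote::ollama
      config:
        # Falls back to the local Ollama default when OLLAMA_URL is unset
        url: ${env.OLLAMA_URL:=http://localhost:11434}
        refresh_models: false
        # Optional allow-list; omit this key (None) to allow all models
        allowed_models:
          - llama3.2:3b              # assumed example model tag
```

Setting `allowed_models` restricts which models the registry will accept from this provider; leaving it out keeps the permissive default described in the table.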