
# remote::ollama

## Description

Ollama inference provider for running local models through the Ollama runtime.
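
For orientation, a provider of this type is normally wired into a distribution's `run.yaml` under the `inference` API. The surrounding structure below is a sketch based on the usual Llama Stack layout; the `provider_id` value is an arbitrary example:

```yaml
# Sketch of a run.yaml fragment (assumed layout; provider_id is arbitrary).
providers:
  inference:
    - provider_id: ollama
      provider_type: remote::ollama
      config:
        base_url: ${env.OLLAMA_URL:=http://localhost:11434/v1}
```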

## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `allowed_models` | `list[str] \| None` | No | | List of models that should be registered with the model registry. If `None`, all models are allowed. |
| `refresh_models` | `bool` | No | `False` | Whether to periodically refresh the list of models from the provider. |
| `base_url` | `HttpUrl \| None` | No | `http://localhost:11434/v1` | URL of the Ollama server. |
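
Taken together, a `config` block that sets all three fields might look like the following sketch; the model IDs are illustrative placeholders, not defaults:

```yaml
# All three fields set explicitly (model IDs are hypothetical examples).
base_url: http://localhost:11434/v1
refresh_models: true     # re-poll Ollama periodically for newly pulled models
allowed_models:          # restrict registration to these models;
  - llama3.2:3b          # omit or set to null to allow all models
  - nomic-embed-text
```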

## Sample Configuration

```yaml
base_url: ${env.OLLAMA_URL:=http://localhost:11434/v1}
```
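
The `${env.OLLAMA_URL:=http://localhost:11434/v1}` form substitutes the `OLLAMA_URL` environment variable when it is set and falls back to the default after `:=` otherwise, so pointing the provider at a remote Ollama host only requires exporting `OLLAMA_URL` before starting the stack.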