# remote::ollama

## Description
Ollama inference provider for running local models through the Ollama runtime.
## Configuration
| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| `allowed_models` | `list[str] \| None` | No | | List of models that should be registered with the model registry. If `None`, all models are allowed. |
| `refresh_models` | `bool` | No | `False` | Whether to refresh models periodically from the provider. |
| `base_url` | `HttpUrl \| None` | No | `http://localhost:11434/v1` | Base URL for the Ollama server. |
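
As a sketch of how these fields fit together, a provider entry in a Llama Stack `run.yaml` might look like the following. The surrounding `providers`/`inference` nesting follows the usual Llama Stack run-config layout, and the model tag `llama3.2:3b` is purely illustrative:

```yaml
providers:
  inference:
    - provider_id: ollama
      provider_type: remote::ollama
      config:
        # Read from the OLLAMA_URL environment variable, falling back to the local default
        base_url: ${env.OLLAMA_URL:=http://localhost:11434/v1}
        # Periodically re-sync the model list from the Ollama server
        refresh_models: true
        # Restrict registration to specific models (omit to allow all)
        allowed_models:
          - llama3.2:3b
```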
## Sample Configuration
```yaml
base_url: ${env.OLLAMA_URL:=http://localhost:11434/v1}
```
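
The `${env.OLLAMA_URL:=...}` syntax substitutes the `OLLAMA_URL` environment variable at startup, falling back to the default `http://localhost:11434/v1` when the variable is unset.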