
# remote::ollama

## Description

Ollama inference provider for running local models through the Ollama runtime.

## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `url` | `str` | No | `http://localhost:11434` | URL of the Ollama server |
| `refresh_models` | `bool` | No | `False` | Whether to refresh models periodically |
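
For orientation, here is a minimal sketch of how these fields could appear in a provider entry of a Llama Stack `run.yaml`. The surrounding `providers` layout follows the standard Llama Stack convention, and the `provider_id` value `ollama` is an arbitrary label chosen for this example, not something mandated by this reference.

```yaml
# Hypothetical run.yaml fragment registering remote::ollama as an
# inference provider; only url and refresh_models come from the
# configuration table above.
providers:
  inference:
    - provider_id: ollama            # arbitrary local name for this provider
      provider_type: remote::ollama
      config:
        url: http://localhost:11434  # where the Ollama server listens
        refresh_models: false        # do not periodically re-list models
```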

## Sample Configuration

```yaml
url: ${env.OLLAMA_URL:=http://localhost:11434}
```
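
The `${env.OLLAMA_URL:=http://localhost:11434}` expression resolves to the value of the `OLLAMA_URL` environment variable, falling back to `http://localhost:11434` when the variable is unset, so a deployment can point the provider at a different Ollama host without editing the file.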