# remote::ollama

## Description
Ollama inference provider for running local models through the Ollama runtime.
## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `url` | `str` | No | http://localhost:11434 | |
| `refresh_models` | `bool` | No | False | Whether to refresh models periodically |
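Both fields are optional and sit together in one config block. The sketch below assembles them with their defaults, using the same environment-variable syntax as the sample configuration further down; the `${env.VAR:=default}` form resolves to the environment variable when it is set and to the literal after `:=` otherwise.

```yaml
# Sketch of a complete config block for this provider; both fields keep their defaults.
# ${env.OLLAMA_URL:=http://localhost:11434} resolves to OLLAMA_URL if it is set,
# otherwise to the default local Ollama address.
url: ${env.OLLAMA_URL:=http://localhost:11434}
# Periodic refresh of the model list is off by default.
refresh_models: false
```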
## Sample Configuration

```yaml
url: ${env.OLLAMA_URL:=http://localhost:11434}
```
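For orientation, the following sketch shows where such a block typically sits inside a stack run configuration. Only the provider type (`remote::ollama`) and the config keys come from this page; the surrounding `providers`/`inference` nesting and the `provider_id` value are assumptions about the host configuration layout.

```yaml
# Assumed run-config layout; only provider_type and the config keys come from this page.
providers:
  inference:
  - provider_id: ollama            # user-chosen identifier (assumption)
    provider_type: remote::ollama  # the provider documented on this page
    config:
      url: ${env.OLLAMA_URL:=http://localhost:11434}
      refresh_models: false
```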