# remote::ollama

## Description
Ollama inference provider for running local models through the Ollama runtime.
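For context, a provider with this configuration is typically registered under the `inference` providers of a Llama Stack run configuration. The snippet below is a minimal sketch, assuming the usual `provider_id` / `provider_type` / `config` layout of a run file; the `ollama` identifier is an arbitrary label chosen here, and the `config` block uses the fields documented below.

```yaml
# Sketch of a run.yaml inference provider entry (assumed layout).
providers:
  inference:
  - provider_id: ollama            # arbitrary label for this provider instance
    provider_type: remote::ollama
    config:
      url: ${env.OLLAMA_URL:=http://localhost:11434}
      raise_on_connect_error: true
```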
## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `url` | `str` | No | http://localhost:11434 | |
| `raise_on_connect_error` | `bool` | No | True | |
## Sample Configuration

```yaml
url: ${env.OLLAMA_URL:=http://localhost:11434}
raise_on_connect_error: true
```
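Following the usual `${env.VAR:=default}` substitution convention, `url` resolves to the value of the `OLLAMA_URL` environment variable when it is set, and falls back to `http://localhost:11434` otherwise.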