21 changes: 21 additions & 0 deletions docs/docs/ai-presets.mdx
@@ -90,6 +90,27 @@ To connect to a local Ollama instance:

Note: The `ai:apitoken` is required but can be any value as Ollama ignores it. See [Ollama OpenAI compatibility docs](https://github.com/ollama/ollama/blob/main/docs/openai.md) for more details.

### Docker Model Runner

If you have Docker Desktop, or Docker CE with Docker Model Runner installed, you can run LLMs locally through Docker Model Runner. To connect to it, use a preset similar to the following:

```json
{
"ai@docker-gemma": {
"display:name": "Docker Model Runner",
"display:order": 3,
"ai:*": true,
"ai:baseurl": "http://localhost:12434/engines/llama.cpp/v1",
"ai:name": "ai/gemma3n:latest",
"ai:model": "ai/gemma3n:latest",
"ai:apitoken": "not_used"
}
}
```

Note: The `ai:apitoken` is required but can be any value, as Docker Model Runner ignores it.
See the [Docker Model Runner docs](https://docs.docker.com/ai/model-runner/) for details on how to pull models, configure model parameters such as context size, and more.
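
Because Docker Model Runner exposes an OpenAI-compatible API, you can sanity-check the endpoint outside of Wave with any OpenAI-compatible client. A minimal sketch using only the Python standard library, assuming the default port and model name from the preset above:

```python
import json
import urllib.request

# Assumed values from the preset above; adjust the port or model name
# if your Docker Model Runner setup differs.
BASE_URL = "http://localhost:12434/engines/llama.cpp/v1"
MODEL = "ai/gemma3n:latest"

def chat(prompt: str) -> str:
    """Send one chat completion request to the OpenAI-compatible endpoint."""
    payload = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            # Required by the API shape, but ignored by Docker Model Runner
            "Authorization": "Bearer not_used",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Usage (requires a running Docker Model Runner):
#   print(chat("Say hello in one word."))
```

If the request succeeds, the same `ai:baseurl` and `ai:model` values will work in the Wave preset.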

### Azure OpenAI

To connect to Azure AI services:
2 changes: 1 addition & 1 deletion docs/docs/faq.mdx
@@ -8,7 +8,7 @@ title: "FAQ"

### How do I configure Wave to use different AI models/providers?

Wave supports various AI providers including local LLMs (via Ollama), Azure OpenAI, Anthropic's Claude, and Perplexity. The recommended way to configure these is through AI presets, which let you set up and easily switch between different providers and models.
Wave supports various AI providers including local LLMs (via Ollama or Docker Model Runner), Azure OpenAI, Anthropic's Claude, and Perplexity. The recommended way to configure these is through AI presets, which let you set up and easily switch between different providers and models.

See our [AI Presets documentation](/ai-presets) for detailed setup instructions for each provider.
