Configuration
Configure Selora AI after installation — LLM provider setup, device discovery, and integration options.
Initial Setup
1. Go to Settings → Devices & Services → + Add integration and search for Selora AI.
2. Select your LLM provider and fill in the required fields (see the tables below).
3. Complete the Device Discovery step. Selora AI scans your local network for supported devices and integrations — this typically takes 15–30 seconds. Review the found devices, assign them to rooms, and click Finish.
Anthropic Claude
| Field | Required | Default | Description |
|---|---|---|---|
| api_key | Yes | — | Your Anthropic API key |
| model | No | claude-sonnet-4-6 | Model to use. Can be changed later in integration options. |
OpenAI
| Field | Required | Default | Description |
|---|---|---|---|
| api_key | Yes | — | Your OpenAI API key |
| model | No | gpt-4o | Model to use. Can be changed later in integration options. |
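If setup fails with an authentication error, it can help to verify the api_key outside Home Assistant first. The sketch below is not part of Selora AI — the function names are illustrative — and simply calls each cloud provider's public model-listing endpoint, treating an HTTP error as a bad key.

```python
import json
import urllib.error
import urllib.request


def anthropic_headers(api_key: str) -> dict[str, str]:
    """Headers Anthropic requires on every API request."""
    return {"x-api-key": api_key, "anthropic-version": "2023-06-01"}


def check_anthropic_key(api_key: str) -> bool:
    """Pre-flight an Anthropic key against the /v1/models endpoint."""
    req = urllib.request.Request(
        "https://api.anthropic.com/v1/models",
        headers=anthropic_headers(api_key),
    )
    return _request_succeeds(req)


def check_openai_key(api_key: str) -> bool:
    """Pre-flight an OpenAI key against the /v1/models endpoint."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    return _request_succeeds(req)


def _request_succeeds(req: urllib.request.Request) -> bool:
    """True if the request returns a 2xx; 401/403 means a bad key."""
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            json.load(resp)
        return True
    except urllib.error.HTTPError:
        return False
```

A 401 from either endpoint means the key is invalid; a network-level error instead means the Home Assistant host cannot reach the provider at all.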
Ollama (local)
| Field | Required | Default | Description |
|---|---|---|---|
| url | Yes | http://localhost:11434 | Base URL of your Ollama server, reachable from the HA host |
| model | Yes | llama4 | Name of the locally pulled Ollama model |
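Because the url must be reachable from the HA host, it is worth confirming connectivity before running the config flow. This sketch (again, not part of the integration; helper names are illustrative) queries Ollama's /api/tags endpoint, which lists the models pulled on that server:

```python
import json
import urllib.request


def normalize_base_url(url: str) -> str:
    """Add a scheme if missing and strip any trailing slash (illustrative helper)."""
    if not url.startswith(("http://", "https://")):
        url = "http://" + url
    return url.rstrip("/")


def list_ollama_models(base_url: str) -> list[str]:
    """Return the names of models pulled on the Ollama server."""
    endpoint = normalize_base_url(base_url) + "/api/tags"
    with urllib.request.urlopen(endpoint, timeout=5) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]


if __name__ == "__main__":
    try:
        print(list_ollama_models("http://localhost:11434"))
    except OSError as exc:  # connection refused, timeout, DNS failure, ...
        print(f"Ollama not reachable: {exc}")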
Adding More Devices Later
Go to Settings → Devices & Services → Selora AI and click + Add entry. The integration skips the LLM setup and runs a fresh discovery scan.
Integration Options
After setup, click Configure on the Selora AI card to adjust:
| Option | Description |
|---|---|
| LLM model | Switch models without re-entering your API key |
| Analysis frequency | How often Selora AI scans for new automation ideas |
Last modified April 1, 2026: Add Selora AI integration documentation page (9f03e2a)