# Configuration
Configure Selora AI after installation — LLM provider setup, device discovery, and integration options.
## Initial Setup

Before configuring a provider, make sure Selora AI is installed. Then go to Settings → Devices & Services → Selora AI → Configure, open the LLM Provider section, and pick one of the providers below.
### Selora AI Cloud (recommended)
Selora AI Cloud is a managed LLM backend hosted by Selora Homes. There is no API key to obtain: authentication happens through your Selora Homes account, and usage is metered in Selora AI credits. Selora Hub subscriptions include a monthly credit allotment (see the pricing page for per-tier amounts), and the Free tier gets a one-time starter allotment.
| Field | Required | Default | Description |
|---|---|---|---|
| Selora account | Yes | — | Click Authenticate with Selora Homes and complete the OAuth flow in your browser |
The model is managed by Selora Homes and cannot be selected: requests are routed to a current top-tier model maintained by our team. If you need to pin a specific model or experiment with alternatives, use one of the BYOK providers below.
Once authenticated, you can monitor your current usage and remaining credits from your Selora Homes account, on the Selora AI page of the relevant installation.
### Anthropic Claude

| Field | Required | Default | Description |
|---|---|---|---|
| `api_key` | Yes | — | Your Anthropic API key |
| `model` | No | `claude-sonnet-4-6` | Model to use. Can be changed later in integration options. |
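Before saving, you can sanity-check the key outside of Home Assistant. This uses Anthropic's public API directly and is independent of Selora AI:

```shell
# A valid key returns a JSON list of models; an invalid key returns HTTP 401
curl -s https://api.anthropic.com/v1/models \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01"
```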
### OpenAI

| Field | Required | Default | Description |
|---|---|---|---|
| `api_key` | Yes | — | Your OpenAI API key |
| `model` | No | `gpt-4o` | Model to use. Can be changed later in integration options. |
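To verify the key outside of Home Assistant, query OpenAI's public models endpoint directly:

```shell
# A valid key returns the list of models your account can access
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```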
### OpenRouter

| Field | Required | Default | Description |
|---|---|---|---|
| `api_key` | Yes | — | Your OpenRouter API key |
| `model` | No | `anthropic/claude-sonnet-4` | Any model slug supported by OpenRouter. Can be changed later in options. |
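If you are unsure which slug to enter, the full list of model slugs is available from OpenRouter's public catalog endpoint:

```shell
# No API key required for this endpoint; slugs appear in the "id" fields
curl -s https://openrouter.ai/api/v1/models
```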
### Google Gemini

| Field | Required | Default | Description |
|---|---|---|---|
| `api_key` | Yes | — | Your Google AI Studio API key |
| `model` | No | `gemini-2.5-pro` | Model to use. Can be changed later in integration options. |
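You can check the key and browse available model names against Google's Generative Language API directly:

```shell
# A valid key returns the models available to it; an invalid key returns an error
curl -s "https://generativelanguage.googleapis.com/v1beta/models?key=$GEMINI_API_KEY"
```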
### Ollama (local)

| Field | Required | Default | Description |
|---|---|---|---|
| `url` | Yes | `http://localhost:11434` | Base URL of your Ollama server, reachable from the HA host |
| `model` | Yes | `llama4` | Name of the locally pulled Ollama model |
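Before pointing Selora AI at the server, confirm it is reachable and that the model has been pulled. These are standard Ollama commands, run on or against the Ollama host:

```shell
# Lists locally pulled models; also confirms the server is up and reachable
curl -s http://localhost:11434/api/tags

# Pull the model if it does not appear in the list above
ollama pull llama4
```

If Home Assistant runs on a different machine, repeat the curl check from the HA host with the server's real address in place of `localhost`.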
## Integration Options
After setup, click Configure on the Selora AI card to adjust:
| Option | Description |
|---|---|
| LLM model | Switch models without re-entering your API key (BYOK providers only — not Selora AI Cloud) |
| Analysis frequency | How often Selora AI scans for new automation ideas |
## MCP Tokens
Selora AI tokens let you authenticate external MCP clients without sharing your Home Assistant credentials. Unlike HA long-lived access tokens, Selora AI tokens only grant access to the MCP server and support granular permission levels.
### Creating a token
In the Selora AI settings panel, under Remote Access & MCP Authentication, click Create Token. Choose a name, permission level, and optional expiration. The token is shown only once — copy it immediately.
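How the token is presented depends on your MCP client, but an HTTP-based client would typically send it as a bearer token. The URL below is a placeholder for illustration, not a documented Selora AI endpoint; substitute your installation's actual MCP address:

```shell
# Hypothetical example: replace the URL with your installation's MCP endpoint
curl -s https://your-ha-host:8123/mcp \
  -H "Authorization: Bearer $SELORA_MCP_TOKEN"
```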
### Permission levels
| Level | Access |
|---|---|
| Read-only | Read-only tools only (list automations, get snapshots, list suggestions, etc.) |
| Admin | Full access to all MCP tools including write operations |
| Custom | You choose exactly which tools are allowed |
### Expiration
Tokens can be set to expire after 7 days, 30 days, 90 days, 1 year, or never. Expired tokens are automatically rejected.
### Revoking a token
Click the Revoke button next to any token in the list. The token is invalidated immediately.
Last modified May 6, 2026: Document Selora AI Cloud and additional LLM providers (41a6d34)