MCP & Agent Skills
Connect external AI agents to Selora AI using the Model Context Protocol (MCP) — setup, client configs, Docker networking, and troubleshooting.
1. Verify Selora AI is Running
Complete the standard Selora AI setup first. Once the integration is active, the MCP endpoint is available at:
http://homeassistant.local:8123/api/selora_ai/mcp
If your HA instance uses a different hostname, IP address, or port, adjust the URL
accordingly. For remote installs accessed over HTTPS, replace http with https and use
your external hostname — ensure your long-lived token was created on the same HA instance.
SeloraBox users: If your Home Assistant runs on a SeloraBox with remote access enabled, you already have a managed HTTPS URL — no port forwarding, DDNS, or mDNS configuration required. Find your URL in Home Assistant under My Home Assistant → Remote Access. Use it directly as the host in your MCP endpoint:
https://<your-selora-connect-url>/api/selora_ai/mcp
Use this URL in place of homeassistant.local in any of the client configs below.
Docker users: If Home Assistant is running in Docker, homeassistant.local will not
resolve from outside the container network. Use the host machine’s local IP address (e.g.
http://192.168.1.100:8123/api/selora_ai/mcp) or the Docker container name if your MCP
client is also running inside Docker on the same network (e.g.
http://homeassistant:8123/api/selora_ai/mcp). To find your host IP, run ip addr or
hostname -I on the Docker host.
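The two commands above can be combined into a one-liner that builds the full MCP URL. A minimal sketch, assuming a Linux Docker host where `hostname -I` is available and the first reported address is the LAN interface (pick a different address from the list if not):

```shell
# Take the host's first reported IP and build the Selora AI MCP URL from it.
HOST_IP=$(hostname -I | awk '{print $1}')
echo "http://${HOST_IP}:8123/api/selora_ai/mcp"
```

Paste the printed URL into your MCP client config in place of the `homeassistant.local` URL.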
2. Create a Home Assistant Long-Lived Token
In Home Assistant go to Profile → Security → Long-Lived Access Tokens → Create Token.
Give it a descriptive name (e.g. selora-ai-mcp) and copy the token immediately — it is
only shown once.
3. Add the Server to Your Agent Client
There are two ways to connect a client to the Selora AI MCP endpoint. Choose based on what your client supports:
| Approach | When to use |
|---|---|
| mcp-remote via npx (recommended) | Works with any MCP client that supports stdio — including Claude Desktop and most others. Requires Node.js. |
| Native SSE | Use only if your client explicitly supports native SSE transport with custom headers (e.g. some Cursor and Windsurf versions). |
mcp-remote approach (recommended for most clients)
mcp-remote is a small Node.js proxy that bridges your MCP client’s stdio connection to the
Selora AI HTTP endpoint. It works with any client that supports the command/args config
format.
Requirements: Node.js 18+ installed on the machine running your MCP client. mcp-remote is
fetched automatically via npx — no separate install needed.
```json
{
  "mcpServers": {
    "selora-ai": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://homeassistant.local:8123/api/selora_ai/mcp",
        "--transport",
        "http-only",
        "--header",
        "Authorization: Bearer ${HA_TOKEN}"
      ],
      "env": {
        "HA_TOKEN": "<your-long-lived-token>"
      }
    }
  }
}
```
Replace <your-long-lived-token> with the token from step 2.
Config fields explained:
| Field | What it does |
|---|---|
| command | Runs npx to launch the mcp-remote proxy |
| args[2] | The full Selora AI MCP URL |
| --transport http-only | Tells mcp-remote to use the streamable HTTP transport only, with no SSE fallback |
| --header | Passes the Authorization header on every request |
| env.HA_TOKEN | Stores the token as an environment variable, referenced as ${HA_TOKEN} in the header arg |
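As the table notes, mcp-remote expands `${HA_TOKEN}` from the `env` block at launch. You can preview the exact header value it will send with ordinary shell expansion (the token value here is a placeholder for illustration):

```shell
# Preview the expanded Authorization header; "abc123" is a placeholder token.
export HA_TOKEN="abc123"
echo "Authorization: Bearer ${HA_TOKEN}"
```

This prints `Authorization: Bearer abc123`, which is the shape Home Assistant expects: the literal word Bearer, one space, then the token with nothing after it.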
Claude Desktop
Config file location:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
If the file does not exist, create it. If it already contains an mcpServers block, add the
selora-ai entry inside the existing block. Claude Desktop uses the mcp-remote approach
above.
```json
{
  "mcpServers": {
    "selora-ai": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://homeassistant.local:8123/api/selora_ai/mcp",
        "--transport",
        "http-only",
        "--header",
        "Authorization: Bearer ${HA_TOKEN}"
      ],
      "env": {
        "HA_TOKEN": "<your-long-lived-token>"
      }
    }
  }
}
```
Restart Claude Desktop after saving.
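A malformed config file is a common reason the server silently fails to appear after a restart. One way to catch JSON typos first is to run the file through `python3 -m json.tool`; the sketch below validates a sample file written to /tmp so it is self-contained — point the final command at your real claude_desktop_config.json instead:

```shell
# Write a sample config to a temp file (stand-in for claude_desktop_config.json).
cat > /tmp/sample_claude_config.json <<'EOF'
{
  "mcpServers": {
    "selora-ai": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://homeassistant.local:8123/api/selora_ai/mcp"]
    }
  }
}
EOF
# Validate: json.tool exits non-zero on any syntax error (missing comma, stray brace).
python3 -m json.tool /tmp/sample_claude_config.json > /dev/null && echo "config OK"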
Cursor
Open Cursor Settings → Features → MCP Servers and click + Add MCP Server. Cursor
supports both approaches. For the mcp-remote approach, select Command as the type and
paste the command/args block above. For native SSE (if your Cursor version supports it),
select SSE and enter:
- URL: http://homeassistant.local:8123/api/selora_ai/mcp
- Header key: Authorization
- Header value: Bearer <your-long-lived-token>
Windsurf
Windsurf stores MCP configuration in ~/.codeium/windsurf/mcp_config.json (macOS/Linux) or
%USERPROFILE%\.codeium\windsurf\mcp_config.json (Windows). If the file does not exist,
create it. Add the selora-ai entry:
```json
{
  "mcpServers": {
    "selora-ai": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://homeassistant.local:8123/api/selora_ai/mcp",
        "--transport",
        "http-only",
        "--header",
        "Authorization: Bearer ${HA_TOKEN}"
      ],
      "env": {
        "HA_TOKEN": "<your-long-lived-token>"
      }
    }
  }
}
```
Restart Windsurf after saving. You can also add it through Windsurf Settings → MCP if your version has a UI for it.
Open WebUI
Open WebUI is a self-hosted chat interface commonly paired with Ollama. If you are already using Ollama as your Selora AI backend, Open WebUI lets you interact with the same local model while also having access to Selora AI’s MCP tools.
In Open WebUI, go to Admin Panel → Settings → Tools and add a new tool server:
- URL: http://homeassistant.local:8123/api/selora_ai/mcp
- Auth type: Bearer token
- Token: <your-long-lived-token>
n8n
n8n is a workflow automation tool widely used alongside Home Assistant for multi-step automations that span multiple services. It supports MCP tool nodes via the community MCP node.
Install the MCP community node in your n8n instance, then configure a new MCP Tool node with:
- SSE URL: http://homeassistant.local:8123/api/selora_ai/mcp
- Authentication: Custom header — Authorization: Bearer <your-long-lived-token>
This lets you call Selora AI tools (e.g. selora_create_automation, selora_trigger_scan)
as steps inside n8n workflows, enabling you to chain home automation actions with external
triggers like calendars, webhooks, or third-party services.
Continue (VS Code / JetBrains extension)
Open your ~/.continue/config.json file and add Selora AI to the mcpServers array.
Continue uses a different config structure than the command/args format — it accepts an
SSE transport directly. Set HA_TOKEN as an environment variable in your shell profile first
(see the token security note above), then reference it in the config:
```json
{
  "mcpServers": [
    {
      "name": "selora-ai",
      "transport": {
        "type": "sse",
        "url": "http://homeassistant.local:8123/api/selora_ai/mcp",
        "requestOptions": {
          "headers": {
            "Authorization": "Bearer ${HA_TOKEN}"
          }
        }
      }
    }
  ]
}
```
Other MCP-compatible clients
If your client supports the command/args stdio format, use the mcp-remote block above.
If it supports native SSE with custom headers, use these values:
- Transport type: SSE
- URL: http://homeassistant.local:8123/api/selora_ai/mcp
- Auth header: Authorization: Bearer <your-long-lived-token>
Consult your client’s documentation for where to enter these values.
Docker: HA in Docker, MCP client on the same host machine
This is the most common setup — Home Assistant runs in Docker, and your MCP client (Claude Desktop, Cursor, Windsurf) runs as a normal desktop application on the same machine.
The key point: desktop apps on the host cannot use homeassistant.local to reach a Docker
container. Use localhost instead, since HA’s port is mapped to the host.
Your docker-compose.yml should map port 8123:
```yaml
services:
  homeassistant:
    image: ghcr.io/home-assistant/home-assistant:stable
    container_name: homeassistant
    network_mode: host  # simplest option — HA gets direct host network access
    volumes:
      - ./config:/config
    restart: unless-stopped
```
With network_mode: host, Home Assistant binds directly to the host’s port 8123. Your MCP
client config on that same machine then uses:
http://localhost:8123/api/selora_ai/mcp
If you prefer bridge networking instead of host mode, map the port explicitly:
```yaml
services:
  homeassistant:
    image: ghcr.io/home-assistant/home-assistant:stable
    container_name: homeassistant
    ports:
      - "8123:8123"
    volumes:
      - ./config:/config
    restart: unless-stopped
```
Same result — http://localhost:8123/api/selora_ai/mcp works from any desktop app on the
host.
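Either way, you can confirm the port is actually exposed before touching any client config. A quick check, assuming a Linux host with the `ss` utility (the else branch also fires if `ss` is unavailable):

```shell
# Check whether any process is listening on port 8123 on this host.
if ss -tln 2>/dev/null | grep -q ':8123'; then
  echo "port 8123 is listening"
else
  echo "nothing on port 8123"
fi
```

If nothing is listening, the container is not running or the port mapping was not applied; fix that before debugging the MCP client side.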
Docker: HA and MCP client both running in Docker (n8n, Open WebUI)
n8n and Open WebUI are themselves commonly run as Docker containers. When both HA and your MCP client run in Docker on the same host, they can reach each other by service name — but only if they share the same Docker network.
```yaml
services:
  homeassistant:
    image: ghcr.io/home-assistant/home-assistant:stable
    container_name: homeassistant
    networks:
      - selora-net
    volumes:
      - ./config:/config
    restart: unless-stopped

  n8n:
    image: n8nio/n8n
    container_name: n8n
    networks:
      - selora-net
    ports:
      - "5678:5678"
    environment:
      - N8N_HOST=localhost
    restart: unless-stopped

networks:
  selora-net:
    driver: bridge
```
Inside this network, n8n reaches Home Assistant at http://homeassistant:8123. Use that as
the MCP URL in your n8n MCP Tool node:
http://homeassistant:8123/api/selora_ai/mcp
The same pattern applies to Open WebUI — add it to the same selora-net network and point
its tool server URL at http://homeassistant:8123.
4. Verify Connectivity
Run a simple read flow from your client to confirm onboarding is working:
- Call selora_get_home_snapshot
- Call selora_list_automations
If both return data, the MCP connection is live. If you receive an error, go straight to Troubleshooting below.
5. Recommended First-Run Workflow
- Ask: “analyze my home”
- Ask for an automation proposal and review the YAML before confirming
- Confirm create — automations are disabled by default
- Optionally run selora_trigger_scan to refresh patterns
- Review selora_list_suggestions and selora_list_patterns
- Confirm accept or dismiss actions explicitly
6. Agent Skills (Advanced)
The repository includes skill files that guide compatible agent clients on safe call sequencing, confirmation gates, and write boundaries. Download the file for your target architecture and place it at the path shown:
| Architecture | Install path | Links |
|---|---|---|
| Claude / Claude Desktop | .claude/skills/selora-mcp/SKILL.md | View · Download |
| Codex | .codex/skills/selora-mcp/SKILL.md | View · Download |
| OpenClaw | .openclaw/skills/selora-mcp/SKILL.md | View · Download |
| Cursor | .cursor/rules/selora-ai.mdc | Coming soon |
| Windsurf | .windsurf/rules/selora-ai.md | Coming soon |
The paths above are relative to your agent client’s working directory or project root. After placing the file, restart your agent session so the skill is loaded.
Troubleshooting
401 / Unauthorized — Confirm the token is valid and copied in full. The header must be
exactly Authorization: Bearer <token> with no extra spaces or line breaks. Tokens are only
shown once at creation time — if you lost it, create a new one.
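Pasted tokens often pick up a trailing newline or space from the clipboard, which turns a valid token into a 401. A quick format check (the token value shown is a placeholder, substitute your own):

```shell
# Fail if the token contains any whitespace (spaces, tabs, or line breaks).
HA_TOKEN="eyJhbGciOiJIUzI1NiJ9.placeholder"   # placeholder - paste your real token
if printf '%s' "$HA_TOKEN" | grep -q '[[:space:]]'; then
  echo "token contains whitespace - re-copy or recreate it"
else
  echo "token format looks OK"
fi
```

`printf '%s'` is used instead of `echo` so the check does not add a newline of its own.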
403 / Admin required — Use a token from an admin Home Assistant account. Read-only tools will still work with non-admin tokens.
Connection errors / timeout — Verify the MCP URL is reachable from the network where your
agent client is running. Common causes: firewall blocking port 8123, using
homeassistant.local inside Docker (use the container IP or service name instead), or using
http for a remote install that requires https.
homeassistant.local not resolving — This mDNS hostname only works on the same local
network segment. Docker containers, remote machines, and VPNs often cannot resolve it. Use
the host’s IP address instead.
Skill not triggering — Confirm the skill files exist at the paths above and restart the agent session.
Wrong automation targeted — Always call selora_list_automations first and use the
returned automation_id. Do not rely on names from memory or previous sessions.
No patterns or suggestions returned — Run selora_trigger_scan first, wait for it to
complete, then retry selora_list_suggestions.
Last modified April 1, 2026: Add Selora AI integration documentation page (9f03e2a)