Compare commits


5 Commits

Author SHA1 Message Date
Stefan Agner
c3b9b9535c Fix RAUC D-Bus type annotations for runtime type checking (#6532)
Replace ctypes integer types (c_uint32, c_uint64) with standard Python int
in SlotStatusDataType to satisfy typeguard runtime type checking. D-Bus
returns standard Python integers, not ctypes objects.

Also fix the mark() method return type from tuple[str, str] to list[str] to
match the actual D-Bus return value, and add missing optional fields
"bundle.hash" and "installed.transaction" to SlotStatusDataType.

Co-authored-by: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-05 13:00:35 +01:00
Stefan Agner
0cd668ec77 Fix typeguard errors by explicitly converting IP addresses to strings (#6531)
* Fix environment variable type errors by converting IP addresses to strings

Environment variables must be strings, but IPv4Address and IPv4Network
objects were being passed directly to container environment dictionaries,
causing typeguard validation errors.

Changes:
- Convert IPv4Address objects to strings in homeassistant.py for
  SUPERVISOR and HASSIO environment variables
- Convert IPv4Network object to string in observer.py for
  NETWORK_MASK environment variable
- Update tests to expect string values instead of IP objects in
  environment dictionaries
- Remove unused ip_network import from test_observer.py
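The conversion pattern behind these changes can be illustrated with a small sketch (the IP values match the tests in this compare; the dictionary shape is illustrative, not the exact Supervisor code):

```python
from ipaddress import IPv4Address, IPv4Network

# Container environment values must be plain strings; passing ipaddress
# objects straight through trips runtime checkers such as typeguard.
supervisor_ip = IPv4Address("172.30.32.2")
docker_network_mask = IPv4Network("172.30.32.0/23")

environment = {
    "SUPERVISOR": str(supervisor_ip),
    "HASSIO": str(supervisor_ip),
    "NETWORK_MASK": str(docker_network_mask),
}

assert all(isinstance(v, str) for v in environment.values())
```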

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* Use explicit string conversion for extra_hosts IP addresses

Use the !s format specifier in the f-string to explicitly convert
IPv4Address objects to strings when building the ExtraHosts list.
While f-strings implicitly convert objects to strings, using !s makes
the conversion explicit and consistent with the environment variable
fixes in the previous commit.
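A minimal demonstration of the `!s` conversion (the hostname and IP here are made up for illustration):

```python
from ipaddress import IPv4Address

extra_hosts = {"homeassistant": IPv4Address("172.30.32.1")}  # hypothetical entry

# f-strings call str() implicitly, but `!s` makes the conversion explicit,
# matching the str() calls used for the environment variables.
extra_hosts_list = [f"{host}:{ip!s}" for host, ip in extra_hosts.items()]
assert extra_hosts_list == ["homeassistant:172.30.32.1"]
```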

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

---------

Co-authored-by: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-05 11:00:43 +01:00
dependabot[bot]
d1cbb57c34 Bump sentry-sdk from 2.51.0 to 2.52.0 (#6530)
Bumps [sentry-sdk](https://github.com/getsentry/sentry-python) from 2.51.0 to 2.52.0.
- [Release notes](https://github.com/getsentry/sentry-python/releases)
- [Changelog](https://github.com/getsentry/sentry-python/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-python/compare/2.51.0...2.52.0)

---
updated-dependencies:
- dependency-name: sentry-sdk
  dependency-version: 2.52.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-05 09:28:13 +01:00
Stefan Agner
3d4849a3a2 Include Docker storage driver in Sentry reports (#6529)
Add the Docker storage driver (e.g., overlay2, vfs) to the context
information sent with Sentry error reports. This helps correlate
issues with specific storage backends and improves debugging of
Docker-related problems.

The storage driver is now included in both SETUP and RUNNING state
error reports under contexts.docker.storage_driver.
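The shape of the added context can be sketched as below; `add_docker_context` is a made-up helper name for illustration, and only the `contexts.docker.storage_driver` path comes from the commit itself:

```python
# Hypothetical helper mirroring the described event shape; the real code
# builds the event dict inline in the Sentry filter function.
def add_docker_context(event: dict, storage_driver: str) -> dict:
    """Attach the Docker storage driver under contexts.docker.storage_driver."""
    contexts = event.setdefault("contexts", {})
    contexts["docker"] = {"storage_driver": storage_driver}
    return event


event = add_docker_context({}, "overlay2")
assert event["contexts"]["docker"]["storage_driver"] == "overlay2"
```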

Co-authored-by: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-05 09:27:51 +01:00
Tom Quist
4d8d44721d Fix MCP API proxy support for streaming and headers (#6461)
* Fix MCP API proxy support for streaming and headers

This commit fixes two issues with using the Core API endpoint core/api/mcp
through the API proxy:

1. **Streaming support**: The proxy now detects text/event-stream responses
   and properly streams them instead of buffering all data. This is required
   for MCP's Server-Sent Events (SSE) transport.

2. **Header forwarding**: Added MCP-required headers to the forwarded headers:
   - Accept: Required for content negotiation
   - Last-Event-ID: Required for resuming broken SSE connections
   - Mcp-Session-Id: Required for session management across requests

The proxy now also preserves MCP-related response headers (Mcp-Session-Id)
and sets X-Accel-Buffering to "no" for streaming responses to prevent
buffering by intermediate proxies.

Tests added to verify:
- MCP headers are properly forwarded to Home Assistant
- Streaming responses (text/event-stream) are handled correctly
- Response headers are preserved
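The header-forwarding policy described above can be sketched as an allow-list filter. `pick_forward_headers` is a made-up helper for illustration; the tuple contents match the `FORWARD_HEADERS` change in this compare:

```python
# Only allow-listed request headers are passed through to Home Assistant Core;
# everything else (e.g. the caller's Authorization header) is dropped.
FORWARD_HEADERS = (
    "X-Speech-Content",
    "Accept",
    "Last-Event-ID",
    "Mcp-Session-Id",
    "MCP-Protocol-Version",
)


def pick_forward_headers(request_headers: dict[str, str]) -> dict[str, str]:
    """Return only the headers that are on the forwarding allow-list."""
    return {h: request_headers[h] for h in FORWARD_HEADERS if h in request_headers}


incoming = {
    "Authorization": "Bearer abc123",
    "Accept": "text/event-stream",
    "Last-Event-ID": "5",
    "Mcp-Session-Id": "test-session-123",
}
forwarded = pick_forward_headers(incoming)
assert "Authorization" not in forwarded  # not on the allow-list
assert forwarded["Accept"] == "text/event-stream"
```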

* Refactor: reuse stream logic for SSE responses (#3)

* Fix ruff format + cover streaming payload error

* Fix merge error

* Address review comments (headers / streaming proxy) (#4)

* Address review: header handling for streaming/non-streaming

* Forward MCP-Protocol-Version and Origin headers

* Do not forward Origin header through API proxy (#5)

---------

Co-authored-by: Stefan Agner <stefan@agner.ch>
2026-02-04 17:28:11 +01:00
11 changed files with 224 additions and 31 deletions

View File

@@ -25,7 +25,7 @@ pyudev==0.24.4
 PyYAML==6.0.3
 requests==2.32.5
 securetar==2025.12.0
-sentry-sdk==2.51.0
+sentry-sdk==2.52.0
 setuptools==80.10.2
 voluptuous==0.16.0
 dbus-fast==3.1.2

View File

@@ -21,7 +21,13 @@ from ..utils.logging import AddonLoggerAdapter
 _LOGGER: logging.Logger = logging.getLogger(__name__)

-FORWARD_HEADERS = ("X-Speech-Content",)
+FORWARD_HEADERS = (
+    "X-Speech-Content",
+    "Accept",
+    "Last-Event-ID",
+    "Mcp-Session-Id",
+    "MCP-Protocol-Version",
+)
 HEADER_HA_ACCESS = "X-Ha-Access"
 # Maximum message size for websocket messages from Home Assistant.
@@ -35,6 +41,38 @@ MAX_MESSAGE_SIZE_FROM_CORE = 64 * 1024 * 1024
 class APIProxy(CoreSysAttributes):
     """API Proxy for Home Assistant."""

+    async def _stream_client_response(
+        self,
+        request: web.Request,
+        client: aiohttp.ClientResponse,
+        *,
+        content_type: str,
+        headers_to_copy: tuple[str, ...] = (),
+    ) -> web.StreamResponse:
+        """Stream an upstream aiohttp response to the caller.
+
+        Used for event streams (e.g. Home Assistant /api/stream) and for SSE endpoints
+        such as MCP (text/event-stream).
+        """
+        response = web.StreamResponse(status=client.status)
+        response.content_type = content_type
+        for header in headers_to_copy:
+            if header in client.headers:
+                response.headers[header] = client.headers[header]
+        response.headers["X-Accel-Buffering"] = "no"
+
+        try:
+            await response.prepare(request)
+            async for data in client.content:
+                await response.write(data)
+        except (aiohttp.ClientError, aiohttp.ClientPayloadError):
+            # Client disconnected or upstream closed
+            pass
+
+        return response
+
     def _check_access(self, request: web.Request):
         """Check the Supervisor token."""
         if AUTHORIZATION in request.headers:
@@ -95,16 +133,11 @@ class APIProxy(CoreSysAttributes):
         _LOGGER.info("Home Assistant EventStream start")
         async with self._api_client(request, "stream", timeout=None) as client:
-            response = web.StreamResponse()
-            response.content_type = request.headers.get(CONTENT_TYPE, "")
-            try:
-                response.headers["X-Accel-Buffering"] = "no"
-                await response.prepare(request)
-                async for data in client.content:
-                    await response.write(data)
-            except (aiohttp.ClientError, aiohttp.ClientPayloadError):
-                pass
+            response = await self._stream_client_response(
+                request,
+                client,
+                content_type=request.headers.get(CONTENT_TYPE, ""),
+            )

             _LOGGER.info("Home Assistant EventStream close")
             return response
@@ -118,10 +151,31 @@ class APIProxy(CoreSysAttributes):
         # Normal request
         path = request.match_info.get("path", "")
         async with self._api_client(request, path) as client:
+            # Check if this is a streaming response (e.g., MCP SSE endpoints)
+            if client.content_type == "text/event-stream":
+                return await self._stream_client_response(
+                    request,
+                    client,
+                    content_type=client.content_type,
+                    headers_to_copy=(
+                        "Cache-Control",
+                        "Mcp-Session-Id",
+                    ),
+                )
+
+            # Non-streaming response
             data = await client.read()
-            return web.Response(
+            response = web.Response(
                 body=data, status=client.status, content_type=client.content_type
             )
+            # Copy selected headers from the upstream response
+            for header in (
+                "Cache-Control",
+                "Mcp-Session-Id",
+            ):
+                if header in client.headers:
+                    response.headers[header] = client.headers[header]
+            return response

     async def _websocket_client(self) -> ClientWebSocketResponse:
         """Initialize a WebSocket API connection."""

View File

@@ -1,6 +1,5 @@
 """D-Bus interface for rauc."""

-from ctypes import c_uint32, c_uint64
 import logging
 from typing import Any, NotRequired, TypedDict
@@ -33,13 +32,15 @@ SlotStatusDataType = TypedDict(
         "state": str,
         "device": str,
         "bundle.compatible": NotRequired[str],
+        "bundle.hash": NotRequired[str],
         "sha256": NotRequired[str],
-        "size": NotRequired[c_uint64],
-        "installed.count": NotRequired[c_uint32],
+        "size": NotRequired[int],
+        "installed.count": NotRequired[int],
+        "installed.transaction": NotRequired[str],
         "bundle.version": NotRequired[str],
         "installed.timestamp": NotRequired[str],
         "status": NotRequired[str],
-        "activated.count": NotRequired[c_uint32],
+        "activated.count": NotRequired[int],
         "activated.timestamp": NotRequired[str],
         "boot-status": NotRequired[str],
         "bootname": NotRequired[str],
@@ -117,7 +118,7 @@ class Rauc(DBusInterfaceProxy):
         return self.connected_dbus.signal(DBUS_SIGNAL_RAUC_INSTALLER_COMPLETED)

     @dbus_connected
-    async def mark(self, state: RaucState, slot_identifier: str) -> tuple[str, str]:
+    async def mark(self, state: RaucState, slot_identifier: str) -> list[str]:
         """Get slot status."""
         return await self.connected_dbus.Installer.call("mark", state, slot_identifier)

View File

@@ -172,8 +172,8 @@ class DockerHomeAssistant(DockerInterface):
     async def run(self, *, restore_job_id: str | None = None) -> None:
         """Run Docker image."""
         environment = {
-            "SUPERVISOR": self.sys_docker.network.supervisor,
-            "HASSIO": self.sys_docker.network.supervisor,
+            "SUPERVISOR": str(self.sys_docker.network.supervisor),
+            "HASSIO": str(self.sys_docker.network.supervisor),
             ENV_TIME: self.sys_timezone,
             ENV_TOKEN: self.sys_homeassistant.supervisor_token,
             ENV_TOKEN_OLD: self.sys_homeassistant.supervisor_token,

View File

@@ -398,7 +398,7 @@ class DockerAPI(CoreSysAttributes):
         if restart_policy:
             host_config["RestartPolicy"] = restart_policy
         if extra_hosts:
-            host_config["ExtraHosts"] = [f"{k}:{v}" for k, v in extra_hosts.items()]
+            host_config["ExtraHosts"] = [f"{k}:{v!s}" for k, v in extra_hosts.items()]
         if mounts:
             host_config["Mounts"] = [mount.to_dict() for mount in mounts]
         if oom_score_adj is not None:

View File

@@ -48,7 +48,7 @@ class DockerObserver(DockerInterface, CoreSysAttributes):
             environment={
                 ENV_TIME: self.sys_timezone,
                 ENV_TOKEN: self.sys_plugins.observer.supervisor_token,
-                ENV_NETWORK_MASK: DOCKER_IPV4_NETWORK_MASK,
+                ENV_NETWORK_MASK: str(DOCKER_IPV4_NETWORK_MASK),
             },
             mounts=[MOUNT_DOCKER],
             ports={"80/tcp": OBSERVER_PORT},

View File

@@ -72,6 +72,9 @@ def filter_data(coresys: CoreSys, event: Event, hint: Hint) -> Event | None:
                 "docker": coresys.docker.info.version,
                 "supervisor": coresys.supervisor.version,
             },
+            "docker": {
+                "storage_driver": coresys.docker.info.storage,
+            },
             "host": {
                 "machine": coresys.machine,
             },
@@ -111,6 +114,9 @@ def filter_data(coresys: CoreSys, event: Event, hint: Hint) -> Event | None:
                 "docker": coresys.docker.info.version,
                 "supervisor": coresys.supervisor.version,
             },
+            "docker": {
+                "storage_driver": coresys.docker.info.storage,
+            },
             "resolution": {
                 "issues": [attr.asdict(issue) for issue in coresys.resolution.issues],
                 "suggestions": [

View File

@@ -9,7 +9,7 @@ import logging
 from typing import Any, cast
 from unittest.mock import AsyncMock, patch

-from aiohttp import ClientWebSocketResponse, WSCloseCode
+from aiohttp import ClientPayloadError, ClientWebSocketResponse, WSCloseCode
 from aiohttp.http_websocket import WSMessage, WSMsgType
 from aiohttp.test_utils import TestClient
 import pytest
@@ -326,3 +326,129 @@ async def test_api_proxy_delete_request(
     assert response.status == 200
     assert await response.text() == '{"result": "ok"}'
     assert response.content_type == "application/json"
+
+
+async def test_api_proxy_mcp_headers_forwarded(
+    api_client: TestClient,
+    install_addon_example: Addon,
+):
+    """Test that MCP headers are forwarded to Home Assistant."""
+    install_addon_example.persist[ATTR_ACCESS_TOKEN] = "abc123"
+    install_addon_example.data["homeassistant_api"] = True
+
+    with patch.object(HomeAssistantAPI, "make_request") as make_request:
+        # Mock the response from make_request
+        mock_response = AsyncMock()
+        mock_response.status = 200
+        mock_response.content_type = "application/json"
+        mock_response.read.return_value = b"mocked response"
+        mock_response.headers = {"Mcp-Session-Id": "test-session-123"}
+        make_request.return_value.__aenter__.return_value = mock_response
+
+        response = await api_client.get(
+            "/core/api/mcp",
+            headers={
+                "Authorization": "Bearer abc123",
+                "Accept": "text/event-stream",
+                "Last-Event-ID": "5",
+                "Mcp-Session-Id": "test-session-123",
+            },
+        )
+
+        # Verify headers were forwarded in the request
+        assert make_request.call_args[1]["headers"]["Accept"] == "text/event-stream"
+        assert make_request.call_args[1]["headers"]["Last-Event-ID"] == "5"
+        assert (
+            make_request.call_args[1]["headers"]["Mcp-Session-Id"] == "test-session-123"
+        )
+
+        # Verify response headers are preserved
+        assert response.status == 200
+        assert response.headers.get("Mcp-Session-Id") == "test-session-123"
+
+
+async def test_api_proxy_streaming_response(
+    api_client: TestClient,
+    install_addon_example: Addon,
+):
+    """Test that streaming responses (text/event-stream) are handled properly."""
+    install_addon_example.persist[ATTR_ACCESS_TOKEN] = "abc123"
+    install_addon_example.data["homeassistant_api"] = True
+
+    async def mock_content_iter():
+        """Mock async iterator for streaming content."""
+        yield b"data: event1\n\n"
+        yield b"data: event2\n\n"
+        yield b"data: event3\n\n"
+
+    with patch.object(HomeAssistantAPI, "make_request") as make_request:
+        # Mock the response from make_request
+        mock_response = AsyncMock()
+        mock_response.status = 200
+        mock_response.content_type = "text/event-stream"
+        mock_response.headers = {
+            "Cache-Control": "no-cache",
+            "Mcp-Session-Id": "session-456",
+        }
+        mock_response.content = mock_content_iter()
+        make_request.return_value.__aenter__.return_value = mock_response
+
+        response = await api_client.get(
+            "/core/api/mcp",
+            headers={
+                "Authorization": "Bearer abc123",
+                "Accept": "text/event-stream",
+            },
+        )
+
+        # Verify it's a streaming response
+        assert response.status == 200
+        assert response.content_type == "text/event-stream"
+        assert response.headers.get("X-Accel-Buffering") == "no"
+        assert response.headers.get("Mcp-Session-Id") == "session-456"
+
+        # Read the streamed content
+        content = await response.read()
+        assert b"data: event1\n\n" in content
+        assert b"data: event2\n\n" in content
+        assert b"data: event3\n\n" in content
+
+
+async def test_api_proxy_streaming_response_client_payload_error(
+    api_client: TestClient,
+    install_addon_example: Addon,
+):
+    """Test that client payload errors during streaming are handled gracefully."""
+    install_addon_example.persist[ATTR_ACCESS_TOKEN] = "abc123"
+    install_addon_example.data["homeassistant_api"] = True

+    async def mock_content_iter_error():
+        yield b"data: event1\n\n"
+        raise ClientPayloadError("boom")
+
+    with patch.object(HomeAssistantAPI, "make_request") as make_request:
+        mock_response = AsyncMock()
+        mock_response.status = 200
+        mock_response.content_type = "text/event-stream"
+        mock_response.headers = {
+            "Cache-Control": "no-cache",
+            "Mcp-Session-Id": "session-789",
+        }
+        mock_response.content = mock_content_iter_error()
+        make_request.return_value.__aenter__.return_value = mock_response
+
+        response = await api_client.get(
+            "/core/api/mcp",
+            headers={
+                "Authorization": "Bearer abc123",
+                "Accept": "text/event-stream",
+            },
+        )
+
+        assert response.status == 200
+        assert response.content_type == "text/event-stream"
+        assert response.headers.get("X-Accel-Buffering") == "no"
+        assert response.headers.get("Mcp-Session-Id") == "session-789"
+
+        content = await response.read()
+        assert b"data: event1\n\n" in content

View File

@@ -46,8 +46,8 @@ async def test_homeassistant_start(coresys: CoreSys, container: DockerContainer)
         "observer": IPv4Address("172.30.32.6"),
     }
     assert run.call_args.kwargs["environment"] == {
-        "SUPERVISOR": IPv4Address("172.30.32.2"),
-        "HASSIO": IPv4Address("172.30.32.2"),
+        "SUPERVISOR": "172.30.32.2",
+        "HASSIO": "172.30.32.2",
         "TZ": ANY,
         "SUPERVISOR_TOKEN": ANY,
         "HASSIO_TOKEN": ANY,
@@ -166,8 +166,8 @@ async def test_landingpage_start(coresys: CoreSys, container: DockerContainer):
         "observer": IPv4Address("172.30.32.6"),
     }
     assert run.call_args.kwargs["environment"] == {
-        "SUPERVISOR": IPv4Address("172.30.32.2"),
-        "HASSIO": IPv4Address("172.30.32.2"),
+        "SUPERVISOR": "172.30.32.2",
+        "HASSIO": "172.30.32.2",
         "TZ": ANY,
         "SUPERVISOR_TOKEN": ANY,
         "HASSIO_TOKEN": ANY,

View File

@@ -1,6 +1,6 @@
 """Test Observer plugin container."""

-from ipaddress import IPv4Address, ip_network
+from ipaddress import IPv4Address
 from unittest.mock import patch

 from aiodocker.containers import DockerContainer
@@ -26,9 +26,7 @@ async def test_start(coresys: CoreSys, container: DockerContainer):
         "supervisor": IPv4Address("172.30.32.2")
     }
     assert run.call_args.kwargs["oom_score_adj"] == -300
-    assert run.call_args.kwargs["environment"]["NETWORK_MASK"] == ip_network(
-        "172.30.32.0/23"
-    )
+    assert run.call_args.kwargs["environment"]["NETWORK_MASK"] == "172.30.32.0/23"
     assert run.call_args.kwargs["ports"] == {"80/tcp": 4357}
     assert run.call_args.kwargs["mounts"] == [
         DockerMount(

View File

@@ -121,10 +121,15 @@ async def test_not_started(coresys):
     assert "versions" in filtered["contexts"]
     assert "docker" in filtered["contexts"]["versions"]
     assert "supervisor" in filtered["contexts"]["versions"]
+    assert "docker" in filtered["contexts"]
+    assert "storage_driver" in filtered["contexts"]["docker"]
     assert "host" in filtered["contexts"]
     assert "machine" in filtered["contexts"]["host"]
     assert filtered["contexts"]["versions"]["docker"] == coresys.docker.info.version
     assert filtered["contexts"]["versions"]["supervisor"] == coresys.supervisor.version
+    assert (
+        filtered["contexts"]["docker"]["storage_driver"] == coresys.docker.info.storage
+    )
     assert filtered["contexts"]["host"]["machine"] == coresys.machine
@@ -142,6 +147,9 @@ async def test_defaults(coresys):
     assert filtered["contexts"]["versions"]["supervisor"] == AwesomeVersion(
         SUPERVISOR_VERSION
     )
+    assert (
+        filtered["contexts"]["docker"]["storage_driver"] == coresys.docker.info.storage
+    )
     assert filtered["user"]["id"] == coresys.machine_id