Compare commits

...

19 Commits

Author SHA1 Message Date
Stefan Agner
9a78c9d4d8 Drop Debian 12 from supported OS list
With the deprecation of the Home Assistant Supervised installation method,
Debian 12 is no longer supported. This change removes Debian 12
from the list of supported operating systems in the evaluation logic.
2025-11-24 09:53:43 +01:00
dependabot[bot]
bb450cad4f Bump peter-evans/create-pull-request from 7.0.8 to 7.0.9 (#6332)
Bumps [peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request) from 7.0.8 to 7.0.9.
- [Release notes](https://github.com/peter-evans/create-pull-request/releases)
- [Commits](271a8d0340...84ae59a2cd)

---
updated-dependencies:
- dependency-name: peter-evans/create-pull-request
  dependency-version: 7.0.9
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-24 08:52:46 +01:00
dependabot[bot]
10af48a65b Bump backports-zstd from 1.0.0 to 1.1.0 (#6336)
Bumps [backports-zstd](https://github.com/rogdham/backports.zstd) from 1.0.0 to 1.1.0.
- [Changelog](https://github.com/Rogdham/backports.zstd/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rogdham/backports.zstd/compare/v1.0.0...v1.1.0)

---
updated-dependencies:
- dependency-name: backports-zstd
  dependency-version: 1.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-24 08:50:56 +01:00
dependabot[bot]
2f334c48c3 Bump ruff from 0.14.5 to 0.14.6 (#6335)
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.14.5 to 0.14.6.
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.14.5...0.14.6)

---
updated-dependencies:
- dependency-name: ruff
  dependency-version: 0.14.6
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-24 08:49:55 +01:00
dependabot[bot]
6d87e8f591 Bump pre-commit from 4.4.0 to 4.5.0 (#6334) 2025-11-24 07:54:42 +01:00
dependabot[bot]
4d1dd63248 Bump time-machine from 3.0.0 to 3.1.0 (#6333) 2025-11-24 07:46:27 +01:00
Stefan Agner
0c2d0cf5c1 Fix D-Bus enum type conversions for NetworkManager (#6325)
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-22 21:52:14 +01:00
Jan Čermák
ca7a3af676 Drop codenotary options from the build config (#6330)
These options are obsolete, as all the related support has been dropped from
the builder and the Supervisor. Remove them from our build config as well.
2025-11-21 16:36:48 +01:00
Stefan Agner
93272fe4c0 Deprecate i386, armhf and armv7 Supervisor architectures (#5620)
* Deprecate i386, armhf and armv7 Supervisor architectures

* Exclude Core from architecture deprecation checks

This still allows downloading the latest available Core version, even
on deprecated systems.

* Fix pytest
2025-11-21 16:35:26 +01:00
Jan Čermák
79a99cc66d Use release-suffixed base images (pin to 2025.11.1) (#6329)
Currently we lack control over which version of the base images is used;
it depends only on when the build is launched. This doesn't allow any
(easy) rollback mechanism and is also not very transparent.

Use the newly introduced base image tags, which include the release
version suffix, so we have more control over this aspect.
2025-11-21 16:22:22 +01:00
dependabot[bot]
6af6c3157f Bump actions/checkout from 5.0.1 to 6.0.0 (#6327)
Bumps [actions/checkout](https://github.com/actions/checkout) from 5.0.1 to 6.0.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](93cb6efe18...1af3b93b68)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-21 09:29:32 +01:00
Jan Čermák
5ed0c85168 Add optional no_colors query parameter to advanced logs endpoints (#6326)
Add support for a `no_colors` query parameter on all advanced logs API endpoints,
allowing users to optionally strip ANSI color sequences from log output. This
complements the existing color stripping on /latest endpoints added in #6319.
2025-11-21 09:29:15 +01:00
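For illustration, here is how a client could request color-free logs with the new parameter. This is a minimal Python/aiohttp sketch; the base URL and the token placeholder are assumptions for the example, not part of this change.

import asyncio

import aiohttp


async def fetch_plain_logs() -> str:
    # Hypothetical Supervisor base URL and token placeholder, for illustration only.
    url = "http://supervisor/host/logs?no_colors"
    headers = {"Authorization": "Bearer <SUPERVISOR_TOKEN>", "Accept": "text/plain"}
    async with aiohttp.ClientSession() as session:
        async with session.get(url, headers=headers) as resp:
            return await resp.text()


print(asyncio.run(fetch_plain_logs()))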
Stefan Agner
63a3dff118 Handle pull events with complete progress details only (#6320)
* Handle pull events with complete progress details only

Under certain circumstances, Docker seems to send pull events with
incomplete progress details (i.e., missing 'current' or 'total' fields).
In practice, we've observed an empty dictionary for progress details
as well as a missing 'total' field (while 'current' was present).
All events were observed with Docker 28.3.3 using the old, default
Docker graph backend.

* Fix docstring/comment
2025-11-19 12:21:27 +01:00
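A minimal sketch of the defensive handling described above, simplified from the actual pull-event handler (the function name is illustrative):

def progress_from_pull_event(progress_detail: dict) -> float | None:
    """Return pull progress in percent, or None for incomplete events."""
    current = progress_detail.get("current")
    total = progress_detail.get("total")
    # Docker may send {} or omit 'total' while 'current' is present; skip those events.
    if current is None or total is None or total == 0:
        return None
    return current / total * 100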
dependabot[bot]
fc8fc171c1 Bump time-machine from 2.19.0 to 3.0.0 (#6321)
Bumps [time-machine](https://github.com/adamchainz/time-machine) from 2.19.0 to 3.0.0.
- [Changelog](https://github.com/adamchainz/time-machine/blob/main/docs/changelog.rst)
- [Commits](https://github.com/adamchainz/time-machine/compare/2.19.0...3.0.0)

---
updated-dependencies:
- dependency-name: time-machine
  dependency-version: 3.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-19 12:21:17 +01:00
Stefan Agner
72bbc50c83 Fix call_at to use event loop time base instead of Unix timestamp (#6324)
* Fix call_at to use event loop time base instead of Unix timestamp

The CoreSys.call_at method was incorrectly passing Unix timestamps
directly to asyncio.loop.call_at(), which expects times in the event
loop's monotonic time base. This caused scheduled jobs to be scheduled
approximately 55 years in the future (the difference between Unix epoch
time and monotonic time since boot).

The bug was masked by time-machine 2.19.0, which patched time.monotonic()
and caused loop.time() to return Unix timestamps. Time-machine 3.0.0
removed this patching (as it caused event loop freezes), exposing the bug.

Fix by converting the datetime to event loop time base:
- Calculate delay from current Unix time to scheduled Unix time
- Add delay to current event loop time to get scheduled loop time

Also simplify test_job_scheduled_at to avoid time-machine's async
context managers, following the pattern of test_job_scheduled_delay.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* Add comment about datetime in the past

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-11-19 11:49:05 +01:00
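A minimal sketch of the conversion described above against a plain asyncio loop (the schedule_at helper is illustrative, not the Supervisor API):

import asyncio
import time
from datetime import datetime, timedelta


def schedule_at(
    loop: asyncio.AbstractEventLoop, when: datetime, callback
) -> asyncio.TimerHandle:
    # Translate the wall-clock target into the loop's monotonic time base.
    # A datetime in the past yields a negative delay, so call_at fires ASAP.
    delay = when.timestamp() - time.time()
    return loop.call_at(loop.time() + delay, callback)


async def main() -> None:
    loop = asyncio.get_running_loop()
    done = asyncio.Event()
    schedule_at(loop, datetime.now() + timedelta(seconds=0.1), done.set)
    await done.wait()


asyncio.run(main())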
Jan Čermák
0837e05cb2 Strip ANSI escape color sequences from /latest log responses (#6319)
* Strip ANSI escape color sequences from /latest log responses

Strip ANSI sequences of CSI commands [1] used for log coloring from the
/latest log endpoints. These endpoints were primarily designed for log
downloads, where colors are mostly unwanted. Add an optional argument
for stripping the colors from the logs and enable it for the /latest
endpoints.

[1] https://en.wikipedia.org/wiki/ANSI_escape_code#CSIsection

* Refactor advanced logs' tests to use fixture factory

Introduce an `advanced_logs_tester` fixture to simplify testing of advanced logs
in the API tests, declaring all the needed fixtures in a single place.
2025-11-19 09:39:24 +01:00
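The stripping itself boils down to a small regex substitution over CSI SGR (color) sequences, mirroring the pattern added in this change; a standalone sketch:

import re

# Matches CSI SGR (color) sequences such as "\x1b[31m" or "\x1b[0m".
_RE_ANSI_CSI_COLORS = re.compile(r"\x1B\[[0-9;]*m")


def strip_ansi_colors(message: str) -> str:
    """Remove ANSI color codes from a log message."""
    return _RE_ANSI_CSI_COLORS.sub("", message)


assert strip_ansi_colors("\x1b[32mHello, world!\x1b[0m") == "Hello, world!"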
dependabot[bot]
d3d652eba5 Bump sentry-sdk from 2.44.0 to 2.45.0 (#6322)
Bumps [sentry-sdk](https://github.com/getsentry/sentry-python) from 2.44.0 to 2.45.0.
- [Release notes](https://github.com/getsentry/sentry-python/releases)
- [Changelog](https://github.com/getsentry/sentry-python/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-python/compare/2.44.0...2.45.0)

---
updated-dependencies:
- dependency-name: sentry-sdk
  dependency-version: 2.45.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-19 09:27:59 +01:00
dependabot[bot]
2eea3c70eb Bump coverage from 7.11.3 to 7.12.0 (#6323)
Bumps [coverage](https://github.com/coveragepy/coveragepy) from 7.11.3 to 7.12.0.
- [Release notes](https://github.com/coveragepy/coveragepy/releases)
- [Changelog](https://github.com/coveragepy/coveragepy/blob/main/CHANGES.rst)
- [Commits](https://github.com/coveragepy/coveragepy/compare/7.11.3...7.12.0)

---
updated-dependencies:
- dependency-name: coverage
  dependency-version: 7.12.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-19 09:27:45 +01:00
dependabot[bot]
95c106d502 Bump actions/checkout from 5.0.0 to 5.0.1 (#6318)
Bumps [actions/checkout](https://github.com/actions/checkout) from 5.0.0 to 5.0.1.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](08c6903cd8...93cb6efe18)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: 5.0.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-18 08:45:19 +01:00
41 changed files with 600 additions and 285 deletions

View File

@@ -53,7 +53,7 @@ jobs:
requirements: ${{ steps.requirements.outputs.changed }}
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
fetch-depth: 0
@@ -92,7 +92,7 @@ jobs:
arch: ${{ fromJson(needs.init.outputs.architectures) }}
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
fetch-depth: 0
@@ -178,7 +178,7 @@ jobs:
steps:
- name: Checkout the repository
if: needs.init.outputs.publish == 'true'
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Initialize git
if: needs.init.outputs.publish == 'true'
@@ -203,7 +203,7 @@ jobs:
timeout-minutes: 60
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
# home-assistant/builder doesn't support sha pinning
- name: Build the Supervisor

View File

@@ -26,7 +26,7 @@ jobs:
name: Prepare Python dependencies
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
@@ -68,7 +68,7 @@ jobs:
needs: prepare
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ needs.prepare.outputs.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
id: python
@@ -111,7 +111,7 @@ jobs:
needs: prepare
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ needs.prepare.outputs.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
id: python
@@ -154,7 +154,7 @@ jobs:
needs: prepare
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Register hadolint problem matcher
run: |
echo "::add-matcher::.github/workflows/matchers/hadolint.json"
@@ -169,7 +169,7 @@ jobs:
needs: prepare
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ needs.prepare.outputs.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
id: python
@@ -213,7 +213,7 @@ jobs:
needs: prepare
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ needs.prepare.outputs.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
id: python
@@ -257,7 +257,7 @@ jobs:
needs: prepare
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ needs.prepare.outputs.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
id: python
@@ -293,7 +293,7 @@ jobs:
needs: prepare
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ needs.prepare.outputs.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
id: python
@@ -339,7 +339,7 @@ jobs:
name: Run tests Python ${{ needs.prepare.outputs.python-version }}
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ needs.prepare.outputs.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
id: python
@@ -398,7 +398,7 @@ jobs:
needs: ["pytest", "prepare"]
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ needs.prepare.outputs.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
id: python

View File

@@ -11,7 +11,7 @@ jobs:
name: Release Drafter
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
fetch-depth: 0

View File

@@ -10,7 +10,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Sentry Release
uses: getsentry/action-release@128c5058bbbe93c8e02147fe0a9c713f166259a6 # v3.4.0
env:

View File

@@ -14,7 +14,7 @@ jobs:
latest_version: ${{ steps.latest_frontend_version.outputs.latest_tag }}
steps:
- name: Checkout code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Get latest frontend release
id: latest_frontend_version
uses: abatilo/release-info-action@32cb932219f1cee3fc4f4a298fd65ead5d35b661 # v1.3.3
@@ -49,7 +49,7 @@ jobs:
if: needs.check-version.outputs.skip != 'true'
steps:
- name: Checkout code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Clear www folder
run: |
rm -rf supervisor/api/panel/*
@@ -68,7 +68,7 @@ jobs:
run: |
rm -f supervisor/api/panel/home_assistant_frontend_supervisor-*.tar.gz
- name: Create PR
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@84ae59a2cdc2258d6fa0732dd66352dddae2a412 # v7.0.9
with:
commit-message: "Update frontend to version ${{ needs.check-version.outputs.latest_version }}"
branch: autoupdate-frontend

View File

@@ -1,13 +1,10 @@
image: ghcr.io/home-assistant/{arch}-hassio-supervisor
build_from:
aarch64: ghcr.io/home-assistant/aarch64-base-python:3.13-alpine3.22
armhf: ghcr.io/home-assistant/armhf-base-python:3.13-alpine3.22
armv7: ghcr.io/home-assistant/armv7-base-python:3.13-alpine3.22
amd64: ghcr.io/home-assistant/amd64-base-python:3.13-alpine3.22
i386: ghcr.io/home-assistant/i386-base-python:3.13-alpine3.22
codenotary:
signer: notary@home-assistant.io
base_image: notary@home-assistant.io
aarch64: ghcr.io/home-assistant/aarch64-base-python:3.13-alpine3.22-2025.11.1
armhf: ghcr.io/home-assistant/armhf-base-python:3.13-alpine3.22-2025.11.1
armv7: ghcr.io/home-assistant/armv7-base-python:3.13-alpine3.22-2025.11.1
amd64: ghcr.io/home-assistant/amd64-base-python:3.13-alpine3.22-2025.11.1
i386: ghcr.io/home-assistant/i386-base-python:3.13-alpine3.22-2025.11.1
cosign:
base_identity: https://github.com/home-assistant/docker-base/.*
identity: https://github.com/home-assistant/supervisor/.*

View File

@@ -4,7 +4,7 @@ aiohttp==3.13.2
atomicwrites-homeassistant==1.4.1
attrs==25.4.0
awesomeversion==25.8.0
backports.zstd==1.0.0
backports.zstd==1.1.0
blockbuster==1.5.25
brotli==1.2.0
ciso8601==2.3.3
@@ -25,7 +25,7 @@ pyudev==0.24.4
PyYAML==6.0.3
requests==2.32.5
securetar==2025.2.1
sentry-sdk==2.44.0
sentry-sdk==2.45.0
setuptools==80.9.0
voluptuous==0.15.2
dbus-fast==2.45.1

View File

@@ -1,15 +1,15 @@
astroid==4.0.2
coverage==7.11.3
coverage==7.12.0
mypy==1.18.2
pre-commit==4.4.0
pre-commit==4.5.0
pylint==4.0.3
pytest-aiohttp==1.1.0
pytest-asyncio==1.3.0
pytest-cov==7.0.0
pytest-timeout==2.4.0
pytest==9.0.1
ruff==0.14.5
time-machine==2.19.0
ruff==0.14.6
time-machine==3.1.0
types-docker==7.1.0.20251009
types-pyyaml==6.0.12.20250915
types-requests==2.32.4.20250913

View File

@@ -152,6 +152,7 @@ class RestAPI(CoreSysAttributes):
self._api_host.advanced_logs,
identifier=syslog_identifier,
latest=True,
no_colors=True,
),
),
web.get(
@@ -449,6 +450,7 @@ class RestAPI(CoreSysAttributes):
await async_capture_exception(err)
kwargs.pop("follow", None) # Follow is not supported for Docker logs
kwargs.pop("latest", None) # Latest is not supported for Docker logs
kwargs.pop("no_colors", None) # no_colors not supported for Docker logs
return await api_supervisor.logs(*args, **kwargs)
self.webapp.add_routes(
@@ -460,7 +462,7 @@ class RestAPI(CoreSysAttributes):
),
web.get(
"/supervisor/logs/latest",
partial(get_supervisor_logs, latest=True),
partial(get_supervisor_logs, latest=True, no_colors=True),
),
web.get("/supervisor/logs/boots/{bootid}", get_supervisor_logs),
web.get(
@@ -576,7 +578,7 @@ class RestAPI(CoreSysAttributes):
),
web.get(
"/addons/{addon}/logs/latest",
partial(get_addon_logs, latest=True),
partial(get_addon_logs, latest=True, no_colors=True),
),
web.get("/addons/{addon}/logs/boots/{bootid}", get_addon_logs),
web.get(

View File

@@ -206,6 +206,7 @@ class APIHost(CoreSysAttributes):
identifier: str | None = None,
follow: bool = False,
latest: bool = False,
no_colors: bool = False,
) -> web.StreamResponse:
"""Return systemd-journald logs."""
log_formatter = LogFormatter.PLAIN
@@ -251,6 +252,9 @@ class APIHost(CoreSysAttributes):
if "verbose" in request.query or request.headers[ACCEPT] == CONTENT_TYPE_X_LOG:
log_formatter = LogFormatter.VERBOSE
if "no_colors" in request.query:
no_colors = True
if "lines" in request.query:
lines = request.query.get("lines", DEFAULT_LINES)
try:
@@ -280,7 +284,9 @@ class APIHost(CoreSysAttributes):
response = web.StreamResponse()
response.content_type = CONTENT_TYPE_TEXT
headers_returned = False
async for cursor, line in journal_logs_reader(resp, log_formatter):
async for cursor, line in journal_logs_reader(
resp, log_formatter, no_colors
):
try:
if not headers_returned:
if cursor:
@@ -318,9 +324,12 @@ class APIHost(CoreSysAttributes):
identifier: str | None = None,
follow: bool = False,
latest: bool = False,
no_colors: bool = False,
) -> web.StreamResponse:
"""Return systemd-journald logs. Wrapped as standard API handler."""
return await self.advanced_logs_handler(request, identifier, follow, latest)
return await self.advanced_logs_handler(
request, identifier, follow, latest, no_colors
)
@api_process
async def disk_usage(self, request: web.Request) -> dict:

View File

@@ -9,6 +9,7 @@ from datetime import UTC, datetime, tzinfo
from functools import partial
import logging
import os
import time
from types import MappingProxyType
from typing import TYPE_CHECKING, Any, Self, TypeVar
@@ -655,8 +656,14 @@ class CoreSys:
if kwargs:
funct = partial(funct, **kwargs)
# Convert datetime to event loop time base
# If datetime is in the past, delay will be negative and call_at will
# schedule the call as soon as possible.
delay = when.timestamp() - time.time()
loop_time = self.loop.time() + delay
return self.loop.call_at(
when.timestamp(), funct, *args, context=self._create_context()
loop_time, funct, *args, context=self._create_context()
)

View File

@@ -306,6 +306,8 @@ class DeviceType(IntEnum):
VLAN = 11
TUN = 16
VETH = 20
WIREGUARD = 29
LOOPBACK = 32
class WirelessMethodType(IntEnum):

View File

@@ -134,9 +134,10 @@ class NetworkManager(DBusInterfaceProxy):
async def check_connectivity(self, *, force: bool = False) -> ConnectivityState:
"""Check the connectivity of the host."""
if force:
return await self.connected_dbus.call("check_connectivity")
else:
return await self.connected_dbus.get("connectivity")
return ConnectivityState(
await self.connected_dbus.call("check_connectivity")
)
return ConnectivityState(await self.connected_dbus.get("connectivity"))
async def connect(self, bus: MessageBus) -> None:
"""Connect to system's D-Bus."""

View File

@@ -69,7 +69,7 @@ class NetworkConnection(DBusInterfaceProxy):
@dbus_property
def state(self) -> ConnectionStateType:
"""Return the state of the connection."""
return self.properties[DBUS_ATTR_STATE]
return ConnectionStateType(self.properties[DBUS_ATTR_STATE])
@property
def state_flags(self) -> set[ConnectionStateFlags]:

View File

@@ -1,5 +1,6 @@
"""NetworkInterface object for Network Manager."""
import logging
from typing import Any
from dbus_fast.aio.message_bus import MessageBus
@@ -23,6 +24,8 @@ from .connection import NetworkConnection
from .setting import NetworkSetting
from .wireless import NetworkWireless
_LOGGER: logging.Logger = logging.getLogger(__name__)
class NetworkInterface(DBusInterfaceProxy):
"""NetworkInterface object represents Network Manager Device objects.
@@ -57,7 +60,15 @@ class NetworkInterface(DBusInterfaceProxy):
@dbus_property
def type(self) -> DeviceType:
"""Return interface type."""
return self.properties[DBUS_ATTR_DEVICE_TYPE]
try:
return DeviceType(self.properties[DBUS_ATTR_DEVICE_TYPE])
except ValueError:
_LOGGER.debug(
"Unknown device type %s for %s, treating as UNKNOWN",
self.properties[DBUS_ATTR_DEVICE_TYPE],
self.object_path,
)
return DeviceType.UNKNOWN
@property
@dbus_property

View File

@@ -310,6 +310,8 @@ class DockerInterface(JobGroup, ABC):
if (
stage in {PullImageLayerStage.DOWNLOADING, PullImageLayerStage.EXTRACTING}
and reference.progress_detail
and reference.progress_detail.current is not None
and reference.progress_detail.total is not None
):
job.update(
progress=progress,

View File

@@ -34,6 +34,7 @@ class JobCondition(StrEnum):
PLUGINS_UPDATED = "plugins_updated"
RUNNING = "running"
SUPERVISOR_UPDATED = "supervisor_updated"
ARCHITECTURE_SUPPORTED = "architecture_supported"
class JobConcurrency(StrEnum):

View File

@@ -441,6 +441,14 @@ class Job(CoreSysAttributes):
raise JobConditionException(
f"'{method_name}' blocked from execution, supervisor needs to be updated first"
)
if (
JobCondition.ARCHITECTURE_SUPPORTED in used_conditions
and UnsupportedReason.SYSTEM_ARCHITECTURE
in coresys.sys_resolution.unsupported
):
raise JobConditionException(
f"'{method_name}' blocked from execution, unsupported system architecture"
)
if JobCondition.PLUGINS_UPDATED in used_conditions and (
out_of_date := [

View File

@@ -161,6 +161,7 @@ class Tasks(CoreSysAttributes):
JobCondition.INTERNET_HOST,
JobCondition.OS_SUPPORTED,
JobCondition.RUNNING,
JobCondition.ARCHITECTURE_SUPPORTED,
],
concurrency=JobConcurrency.REJECT,
)

View File

@@ -23,4 +23,5 @@ PLUGIN_UPDATE_CONDITIONS = [
JobCondition.HEALTHY,
JobCondition.INTERNET_HOST,
JobCondition.SUPERVISOR_UPDATED,
JobCondition.ARCHITECTURE_SUPPORTED,
]

View File

@@ -58,6 +58,7 @@ class UnsupportedReason(StrEnum):
SYSTEMD_JOURNAL = "systemd_journal"
SYSTEMD_RESOLVED = "systemd_resolved"
VIRTUALIZATION_IMAGE = "virtualization_image"
SYSTEM_ARCHITECTURE = "system_architecture"
class UnhealthyReason(StrEnum):

View File

@@ -5,8 +5,6 @@ from ...coresys import CoreSys
from ..const import UnsupportedReason
from .base import EvaluateBase
SUPPORTED_OS = ["Debian GNU/Linux 12 (bookworm)"]
def setup(coresys: CoreSys) -> EvaluateBase:
"""Initialize evaluation-setup function."""
@@ -33,6 +31,4 @@ class EvaluateOperatingSystem(EvaluateBase):
async def evaluate(self) -> bool:
"""Run evaluation."""
if self.sys_os.available:
return False
return self.sys_host.info.operating_system not in SUPPORTED_OS
return not self.sys_os.available

View File

@@ -0,0 +1,38 @@
"""Evaluation class for system architecture support."""
from ...const import CoreState
from ...coresys import CoreSys
from ..const import UnsupportedReason
from .base import EvaluateBase
def setup(coresys: CoreSys) -> EvaluateBase:
"""Initialize evaluation-setup function."""
return EvaluateSystemArchitecture(coresys)
class EvaluateSystemArchitecture(EvaluateBase):
"""Evaluate if the current Supervisor architecture is supported."""
@property
def reason(self) -> UnsupportedReason:
"""Return a UnsupportedReason enum."""
return UnsupportedReason.SYSTEM_ARCHITECTURE
@property
def on_failure(self) -> str:
"""Return a string that is printed when self.evaluate is True."""
return "System architecture is no longer supported. Move to a supported system architecture."
@property
def states(self) -> list[CoreState]:
"""Return a list of valid states when this evaluation can run."""
return [CoreState.INITIALIZE]
async def evaluate(self):
"""Run evaluation."""
return self.sys_host.info.sys_arch.supervisor in {
"i386",
"armhf",
"armv7",
}

View File

@@ -242,9 +242,10 @@ class Updater(FileConfiguration, CoreSysAttributes):
@Job(
name="updater_fetch_data",
conditions=[
JobCondition.ARCHITECTURE_SUPPORTED,
JobCondition.INTERNET_SYSTEM,
JobCondition.OS_SUPPORTED,
JobCondition.HOME_ASSISTANT_CORE_SUPPORTED,
JobCondition.OS_SUPPORTED,
],
on_condition=UpdaterJobError,
throttle_period=timedelta(seconds=30),

View File

@@ -5,12 +5,20 @@ from collections.abc import AsyncGenerator
from datetime import UTC, datetime
from functools import wraps
import json
import re
from aiohttp import ClientResponse
from supervisor.exceptions import MalformedBinaryEntryError
from supervisor.host.const import LogFormatter
_RE_ANSI_CSI_COLORS_PATTERN = re.compile(r"\x1B\[[0-9;]*m")
def _strip_ansi_colors(message: str) -> str:
"""Remove ANSI color codes from a message string."""
return _RE_ANSI_CSI_COLORS_PATTERN.sub("", message)
def formatter(required_fields: list[str]):
"""Decorate journal entry formatters with list of required fields.
@@ -31,9 +39,9 @@ def formatter(required_fields: list[str]):
@formatter(["MESSAGE"])
def journal_plain_formatter(entries: dict[str, str]) -> str:
def journal_plain_formatter(entries: dict[str, str], no_colors: bool = False) -> str:
"""Format parsed journal entries as a plain message."""
return entries["MESSAGE"]
return _strip_ansi_colors(entries["MESSAGE"]) if no_colors else entries["MESSAGE"]
@formatter(
@@ -45,7 +53,7 @@ def journal_plain_formatter(entries: dict[str, str]) -> str:
"MESSAGE",
]
)
def journal_verbose_formatter(entries: dict[str, str]) -> str:
def journal_verbose_formatter(entries: dict[str, str], no_colors: bool = False) -> str:
"""Format parsed journal entries to a journalctl-like format."""
ts = datetime.fromtimestamp(
int(entries["__REALTIME_TIMESTAMP"]) / 1e6, UTC
@@ -58,14 +66,24 @@ def journal_verbose_formatter(entries: dict[str, str]) -> str:
else entries.get("SYSLOG_IDENTIFIER", "_UNKNOWN_")
)
return f"{ts} {entries.get('_HOSTNAME', '')} {identifier}: {entries.get('MESSAGE', '')}"
message = (
_strip_ansi_colors(entries.get("MESSAGE", ""))
if no_colors
else entries.get("MESSAGE", "")
)
return f"{ts} {entries.get('_HOSTNAME', '')} {identifier}: {message}"
async def journal_logs_reader(
journal_logs: ClientResponse, log_formatter: LogFormatter = LogFormatter.PLAIN
journal_logs: ClientResponse,
log_formatter: LogFormatter = LogFormatter.PLAIN,
no_colors: bool = False,
) -> AsyncGenerator[tuple[str | None, str]]:
"""Read logs from systemd journal line by line, formatted using the given formatter.
Optionally strip ANSI color codes from the entries' messages.
Returns a generator of (cursor, formatted_entry) tuples.
"""
match log_formatter:
@@ -84,7 +102,10 @@ async def journal_logs_reader(
# at EOF (likely race between at_eof and EOF check in readuntil)
if line == b"\n" or not line:
if entries:
yield entries.get("__CURSOR"), formatter_(entries)
yield (
entries.get("__CURSOR"),
formatter_(entries, no_colors=no_colors),
)
entries = {}
continue

View File

@@ -1,95 +1 @@
"""Test for API calls."""
from unittest.mock import AsyncMock, MagicMock
from aiohttp.test_utils import TestClient
from supervisor.coresys import CoreSys
from supervisor.host.const import LogFormat
DEFAULT_LOG_RANGE = "entries=:-99:100"
DEFAULT_LOG_RANGE_FOLLOW = "entries=:-99:18446744073709551615"
async def common_test_api_advanced_logs(
path_prefix: str,
syslog_identifier: str,
api_client: TestClient,
journald_logs: MagicMock,
coresys: CoreSys,
os_available: None,
):
"""Template for tests of endpoints using advanced logs."""
resp = await api_client.get(f"{path_prefix}/logs")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier},
range_header=DEFAULT_LOG_RANGE,
accept=LogFormat.JOURNAL,
)
journald_logs.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/follow")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier, "follow": ""},
range_header=DEFAULT_LOG_RANGE_FOLLOW,
accept=LogFormat.JOURNAL,
)
journald_logs.reset_mock()
mock_response = MagicMock()
mock_response.text = AsyncMock(
return_value='{"CONTAINER_LOG_EPOCH": "12345"}\n{"CONTAINER_LOG_EPOCH": "12345"}\n'
)
journald_logs.return_value.__aenter__.return_value = mock_response
resp = await api_client.get(f"{path_prefix}/logs/latest")
assert resp.status == 200
assert journald_logs.call_count == 2
# Check the first call for getting epoch
epoch_call = journald_logs.call_args_list[0]
assert epoch_call[1]["params"] == {"CONTAINER_NAME": syslog_identifier}
assert epoch_call[1]["range_header"] == "entries=:-1:2"
# Check the second call for getting logs with the epoch
logs_call = journald_logs.call_args_list[1]
assert logs_call[1]["params"]["SYSLOG_IDENTIFIER"] == syslog_identifier
assert logs_call[1]["params"]["CONTAINER_LOG_EPOCH"] == "12345"
assert logs_call[1]["range_header"] == "entries=:0:18446744073709551615"
journald_logs.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/boots/0")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier, "_BOOT_ID": "ccc"},
range_header=DEFAULT_LOG_RANGE,
accept=LogFormat.JOURNAL,
)
journald_logs.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/boots/0/follow")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={
"SYSLOG_IDENTIFIER": syslog_identifier,
"_BOOT_ID": "ccc",
"follow": "",
},
range_header=DEFAULT_LOG_RANGE_FOLLOW,
accept=LogFormat.JOURNAL,
)

tests/api/conftest.py (new file, 149 lines)
View File

@@ -0,0 +1,149 @@
"""Fixtures for API tests."""
from collections.abc import Awaitable, Callable
from unittest.mock import ANY, AsyncMock, MagicMock
from aiohttp.test_utils import TestClient
import pytest
from supervisor.coresys import CoreSys
from supervisor.host.const import LogFormat, LogFormatter
DEFAULT_LOG_RANGE = "entries=:-99:100"
DEFAULT_LOG_RANGE_FOLLOW = "entries=:-99:18446744073709551615"
async def _common_test_api_advanced_logs(
path_prefix: str,
syslog_identifier: str,
api_client: TestClient,
journald_logs: MagicMock,
coresys: CoreSys,
os_available: None,
journal_logs_reader: MagicMock,
):
"""Template for tests of endpoints using advanced logs."""
resp = await api_client.get(f"{path_prefix}/logs")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier},
range_header=DEFAULT_LOG_RANGE,
accept=LogFormat.JOURNAL,
)
journal_logs_reader.assert_called_with(ANY, LogFormatter.PLAIN, False)
journald_logs.reset_mock()
journal_logs_reader.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs?no_colors")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier},
range_header=DEFAULT_LOG_RANGE,
accept=LogFormat.JOURNAL,
)
journal_logs_reader.assert_called_with(ANY, LogFormatter.PLAIN, True)
journald_logs.reset_mock()
journal_logs_reader.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/follow")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier, "follow": ""},
range_header=DEFAULT_LOG_RANGE_FOLLOW,
accept=LogFormat.JOURNAL,
)
journal_logs_reader.assert_called_with(ANY, LogFormatter.PLAIN, False)
journald_logs.reset_mock()
journal_logs_reader.reset_mock()
mock_response = MagicMock()
mock_response.text = AsyncMock(
return_value='{"CONTAINER_LOG_EPOCH": "12345"}\n{"CONTAINER_LOG_EPOCH": "12345"}\n'
)
journald_logs.return_value.__aenter__.return_value = mock_response
resp = await api_client.get(f"{path_prefix}/logs/latest")
assert resp.status == 200
assert journald_logs.call_count == 2
# Check the first call for getting epoch
epoch_call = journald_logs.call_args_list[0]
assert epoch_call[1]["params"] == {"CONTAINER_NAME": syslog_identifier}
assert epoch_call[1]["range_header"] == "entries=:-1:2"
# Check the second call for getting logs with the epoch
logs_call = journald_logs.call_args_list[1]
assert logs_call[1]["params"]["SYSLOG_IDENTIFIER"] == syslog_identifier
assert logs_call[1]["params"]["CONTAINER_LOG_EPOCH"] == "12345"
assert logs_call[1]["range_header"] == "entries=:0:18446744073709551615"
journal_logs_reader.assert_called_with(ANY, LogFormatter.PLAIN, True)
journald_logs.reset_mock()
journal_logs_reader.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/boots/0")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier, "_BOOT_ID": "ccc"},
range_header=DEFAULT_LOG_RANGE,
accept=LogFormat.JOURNAL,
)
journald_logs.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/boots/0/follow")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={
"SYSLOG_IDENTIFIER": syslog_identifier,
"_BOOT_ID": "ccc",
"follow": "",
},
range_header=DEFAULT_LOG_RANGE_FOLLOW,
accept=LogFormat.JOURNAL,
)
@pytest.fixture
async def advanced_logs_tester(
api_client: TestClient,
journald_logs: MagicMock,
coresys: CoreSys,
os_available,
journal_logs_reader: MagicMock,
) -> Callable[[str, str], Awaitable[None]]:
"""Fixture that returns a function to test advanced logs endpoints.
This allows tests to avoid explicitly passing all the required fixtures.
Usage:
async def test_my_logs(advanced_logs_tester):
await advanced_logs_tester("/path/prefix", "syslog_identifier")
"""
async def test_logs(path_prefix: str, syslog_identifier: str):
await _common_test_api_advanced_logs(
path_prefix,
syslog_identifier,
api_client,
journald_logs,
coresys,
os_available,
journal_logs_reader,
)
return test_logs

View File

@@ -20,7 +20,6 @@ from supervisor.exceptions import HassioError
from supervisor.store.repository import Repository
from ..const import TEST_ADDON_SLUG
from . import common_test_api_advanced_logs
def _create_test_event(name: str, state: ContainerState) -> DockerContainerStateEvent:
@@ -72,21 +71,11 @@ async def test_addons_info_not_installed(
async def test_api_addon_logs(
api_client: TestClient,
journald_logs: MagicMock,
coresys: CoreSys,
os_available,
advanced_logs_tester,
install_addon_ssh: Addon,
):
"""Test addon logs."""
await common_test_api_advanced_logs(
"/addons/local_ssh",
"addon_local_ssh",
api_client,
journald_logs,
coresys,
os_available,
)
await advanced_logs_tester("/addons/local_ssh", "addon_local_ssh")
async def test_api_addon_logs_not_installed(api_client: TestClient):

View File

@@ -1,18 +1,6 @@
"""Test audio api."""
from unittest.mock import MagicMock
from aiohttp.test_utils import TestClient
from supervisor.coresys import CoreSys
from tests.api import common_test_api_advanced_logs
async def test_api_audio_logs(
api_client: TestClient, journald_logs: MagicMock, coresys: CoreSys, os_available
):
async def test_api_audio_logs(advanced_logs_tester) -> None:
"""Test audio logs."""
await common_test_api_advanced_logs(
"/audio", "hassio_audio", api_client, journald_logs, coresys, os_available
)
await advanced_logs_tester("/audio", "hassio_audio")

View File

@@ -1,13 +1,12 @@
"""Test DNS API."""
from unittest.mock import MagicMock, patch
from unittest.mock import patch
from aiohttp.test_utils import TestClient
from supervisor.coresys import CoreSys
from supervisor.dbus.resolved import Resolved
from tests.api import common_test_api_advanced_logs
from tests.dbus_service_mocks.base import DBusServiceMock
from tests.dbus_service_mocks.resolved import Resolved as ResolvedService
@@ -66,15 +65,6 @@ async def test_options(api_client: TestClient, coresys: CoreSys):
restart.assert_called_once()
async def test_api_dns_logs(
api_client: TestClient, journald_logs: MagicMock, coresys: CoreSys, os_available
):
async def test_api_dns_logs(advanced_logs_tester):
"""Test dns logs."""
await common_test_api_advanced_logs(
"/dns",
"hassio_dns",
api_client,
journald_logs,
coresys,
os_available,
)
await advanced_logs_tester("/dns", "hassio_dns")

View File

@@ -18,26 +18,18 @@ from supervisor.homeassistant.const import WSEvent
from supervisor.homeassistant.core import HomeAssistantCore
from supervisor.homeassistant.module import HomeAssistant
from tests.api import common_test_api_advanced_logs
from tests.common import AsyncIterator, load_json_fixture
@pytest.mark.parametrize("legacy_route", [True, False])
async def test_api_core_logs(
api_client: TestClient,
journald_logs: MagicMock,
coresys: CoreSys,
os_available,
advanced_logs_tester: AsyncMock,
legacy_route: bool,
):
"""Test core logs."""
await common_test_api_advanced_logs(
await advanced_logs_tester(
f"/{'homeassistant' if legacy_route else 'core'}",
"homeassistant",
api_client,
journald_logs,
coresys,
os_available,
)

View File

@@ -272,7 +272,7 @@ async def test_advaced_logs_query_parameters(
range_header=DEFAULT_RANGE,
accept=LogFormat.JOURNAL,
)
journal_logs_reader.assert_called_with(ANY, LogFormatter.VERBOSE)
journal_logs_reader.assert_called_with(ANY, LogFormatter.VERBOSE, False)
journal_logs_reader.reset_mock()
journald_logs.reset_mock()
@@ -290,7 +290,19 @@ async def test_advaced_logs_query_parameters(
range_header="entries=:-52:53",
accept=LogFormat.JOURNAL,
)
journal_logs_reader.assert_called_with(ANY, LogFormatter.VERBOSE)
journal_logs_reader.assert_called_with(ANY, LogFormatter.VERBOSE, False)
journal_logs_reader.reset_mock()
journald_logs.reset_mock()
# Check no_colors query parameter
await api_client.get("/host/logs?no_colors")
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": coresys.host.logs.default_identifiers},
range_header=DEFAULT_RANGE,
accept=LogFormat.JOURNAL,
)
journal_logs_reader.assert_called_with(ANY, LogFormatter.VERBOSE, True)
async def test_advanced_logs_boot_id_offset(
@@ -343,24 +355,24 @@ async def test_advanced_logs_formatters(
"""Test advanced logs formatters varying on Accept header."""
await api_client.get("/host/logs")
journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE)
journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE, False)
journal_logs_reader.reset_mock()
headers = {"Accept": "text/x-log"}
await api_client.get("/host/logs", headers=headers)
journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE)
journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE, False)
journal_logs_reader.reset_mock()
await api_client.get("/host/logs/identifiers/test")
journal_logs_reader.assert_called_once_with(ANY, LogFormatter.PLAIN)
journal_logs_reader.assert_called_once_with(ANY, LogFormatter.PLAIN, False)
journal_logs_reader.reset_mock()
headers = {"Accept": "text/x-log"}
await api_client.get("/host/logs/identifiers/test", headers=headers)
journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE)
journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE, False)
async def test_advanced_logs_errors(coresys: CoreSys, api_client: TestClient):

View File

@@ -1,23 +1,6 @@
"""Test multicast api."""
from unittest.mock import MagicMock
from aiohttp.test_utils import TestClient
from supervisor.coresys import CoreSys
from tests.api import common_test_api_advanced_logs
async def test_api_multicast_logs(
api_client: TestClient, journald_logs: MagicMock, coresys: CoreSys, os_available
):
async def test_api_multicast_logs(advanced_logs_tester):
"""Test multicast logs."""
await common_test_api_advanced_logs(
"/multicast",
"hassio_multicast",
api_client,
journald_logs,
coresys,
os_available,
)
await advanced_logs_tester("/multicast", "hassio_multicast")

View File

@@ -18,7 +18,6 @@ from supervisor.store.repository import Repository
from supervisor.supervisor import Supervisor
from supervisor.updater import Updater
from tests.api import common_test_api_advanced_logs
from tests.common import AsyncIterator, load_json_fixture
from tests.dbus_service_mocks.base import DBusServiceMock
from tests.dbus_service_mocks.os_agent import OSAgent as OSAgentService
@@ -155,18 +154,9 @@ async def test_api_supervisor_options_diagnostics(
assert coresys.dbus.agent.diagnostics is False
async def test_api_supervisor_logs(
api_client: TestClient, journald_logs: MagicMock, coresys: CoreSys, os_available
):
async def test_api_supervisor_logs(advanced_logs_tester):
"""Test supervisor logs."""
await common_test_api_advanced_logs(
"/supervisor",
"hassio_supervisor",
api_client,
journald_logs,
coresys,
os_available,
)
await advanced_logs_tester("/supervisor", "hassio_supervisor")
async def test_api_supervisor_fallback(

View File

@@ -184,3 +184,20 @@ async def test_interface_becomes_unmanaged(
assert wireless.is_connected is False
assert eth0.connection is None
assert connection.is_connected is False
async def test_unknown_device_type(
device_eth0_service: DeviceService, dbus_session_bus: MessageBus
):
"""Test unknown device types are handled gracefully."""
interface = NetworkInterface("/org/freedesktop/NetworkManager/Devices/1")
await interface.connect(dbus_session_bus)
# Emit an unknown device type (e.g., 1000 which doesn't exist in the enum)
device_eth0_service.emit_properties_changed({"DeviceType": 1000})
await device_eth0_service.ping()
# Should return UNKNOWN instead of crashing
assert interface.type == DeviceType.UNKNOWN
# Wireless should be None since it's not a wireless device
assert interface.wireless is None

View File

@@ -445,28 +445,23 @@ async def test_install_progress_rounding_does_not_cause_misses(
]
coresys.docker.images.pull.return_value = AsyncIterator(logs)
with (
patch.object(
type(coresys.supervisor), "arch", PropertyMock(return_value="i386")
),
):
# Schedule job so we can listen for the end. Then we can assert against the WS mock
event = asyncio.Event()
job, install_task = coresys.jobs.schedule_job(
test_docker_interface.install,
JobSchedulerOptions(),
AwesomeVersion("1.2.3"),
"test",
)
# Schedule job so we can listen for the end. Then we can assert against the WS mock
event = asyncio.Event()
job, install_task = coresys.jobs.schedule_job(
test_docker_interface.install,
JobSchedulerOptions(),
AwesomeVersion("1.2.3"),
"test",
)
async def listen_for_job_end(reference: SupervisorJob):
if reference.uuid != job.uuid:
return
event.set()
async def listen_for_job_end(reference: SupervisorJob):
if reference.uuid != job.uuid:
return
event.set()
coresys.bus.register_event(BusEvent.SUPERVISOR_JOB_END, listen_for_job_end)
await install_task
await event.wait()
coresys.bus.register_event(BusEvent.SUPERVISOR_JOB_END, listen_for_job_end)
await install_task
await event.wait()
capture_exception.assert_not_called()
@@ -664,3 +659,64 @@ async def test_install_progress_handles_layers_skipping_download(
assert job.done is True
assert job.progress == 100
capture_exception.assert_not_called()
async def test_missing_total_handled_gracefully(
coresys: CoreSys,
test_docker_interface: DockerInterface,
ha_ws_client: AsyncMock,
capture_exception: Mock,
):
"""Test missing 'total' fields in progress details handled gracefully."""
coresys.core.set_state(CoreState.RUNNING)
# Progress details with missing 'total' fields observed in real-world pulls
logs = [
{
"status": "Pulling from home-assistant/odroid-n2-homeassistant",
"id": "2025.7.1",
},
{"status": "Pulling fs layer", "progressDetail": {}, "id": "1e214cd6d7d0"},
{
"status": "Downloading",
"progressDetail": {"current": 436480882},
"progress": "[===================================================] 436.5MB/436.5MB",
"id": "1e214cd6d7d0",
},
{"status": "Verifying Checksum", "progressDetail": {}, "id": "1e214cd6d7d0"},
{"status": "Download complete", "progressDetail": {}, "id": "1e214cd6d7d0"},
{
"status": "Extracting",
"progressDetail": {"current": 436480882},
"progress": "[===================================================] 436.5MB/436.5MB",
"id": "1e214cd6d7d0",
},
{"status": "Pull complete", "progressDetail": {}, "id": "1e214cd6d7d0"},
{
"status": "Digest: sha256:7d97da645f232f82a768d0a537e452536719d56d484d419836e53dbe3e4ec736"
},
{
"status": "Status: Downloaded newer image for ghcr.io/home-assistant/odroid-n2-homeassistant:2025.7.1"
},
]
coresys.docker.images.pull.return_value = AsyncIterator(logs)
# Schedule job so we can listen for the end. Then we can assert against the WS mock
event = asyncio.Event()
job, install_task = coresys.jobs.schedule_job(
test_docker_interface.install,
JobSchedulerOptions(),
AwesomeVersion("1.2.3"),
"test",
)
async def listen_for_job_end(reference: SupervisorJob):
if reference.uuid != job.uuid:
return
event.set()
coresys.bus.register_event(BusEvent.SUPERVISOR_JOB_END, listen_for_job_end)
await install_task
await event.wait()
capture_exception.assert_not_called()

View File

@@ -90,6 +90,49 @@ async def test_logs_coloured(journald_gateway: MagicMock, coresys: CoreSys):
)
async def test_logs_no_colors(journald_gateway: MagicMock, coresys: CoreSys):
"""Test ANSI color codes being stripped when no_colors=True."""
journald_gateway.content.feed_data(
load_fixture("logs_export_supervisor.txt").encode("utf-8")
)
journald_gateway.content.feed_eof()
async with coresys.host.logs.journald_logs() as resp:
cursor, line = await anext(journal_logs_reader(resp, no_colors=True))
assert (
cursor
== "s=83fee99ca0c3466db5fc120d52ca7dd8;i=2049389;b=f5a5c442fa6548cf97474d2d57c920b3;m=4263828e8c;t=612dda478b01b;x=9ae12394c9326930"
)
# Colors should be stripped
assert (
line == "24-03-04 23:56:56 INFO (MainThread) [__main__] Closing Supervisor"
)
async def test_logs_verbose_no_colors(journald_gateway: MagicMock, coresys: CoreSys):
"""Test ANSI color codes being stripped from verbose formatted logs when no_colors=True."""
journald_gateway.content.feed_data(
load_fixture("logs_export_supervisor.txt").encode("utf-8")
)
journald_gateway.content.feed_eof()
async with coresys.host.logs.journald_logs() as resp:
cursor, line = await anext(
journal_logs_reader(
resp, log_formatter=LogFormatter.VERBOSE, no_colors=True
)
)
assert (
cursor
== "s=83fee99ca0c3466db5fc120d52ca7dd8;i=2049389;b=f5a5c442fa6548cf97474d2d57c920b3;m=4263828e8c;t=612dda478b01b;x=9ae12394c9326930"
)
# Colors should be stripped in verbose format too
assert (
line
== "2024-03-04 22:56:56.709 ha-hloub hassio_supervisor[466]: 24-03-04 23:56:56 INFO (MainThread) [__main__] Closing Supervisor"
)
async def test_boot_ids(
journald_gateway: MagicMock,
coresys: CoreSys,

View File

@@ -1179,7 +1179,6 @@ async def test_job_scheduled_delay(coresys: CoreSys):
async def test_job_scheduled_at(coresys: CoreSys):
"""Test job that schedules a job to start at a specified time."""
dt = datetime.now()
class TestClass:
"""Test class."""
@@ -1189,10 +1188,12 @@ async def test_job_scheduled_at(coresys: CoreSys):
self.coresys = coresys
@Job(name="test_job_scheduled_at_job_scheduler")
async def job_scheduler(self) -> tuple[SupervisorJob, asyncio.TimerHandle]:
async def job_scheduler(
self, scheduled_time: datetime
) -> tuple[SupervisorJob, asyncio.TimerHandle]:
"""Schedule a job to run at specified time."""
return self.coresys.jobs.schedule_job(
self.job_task, JobSchedulerOptions(start_at=dt + timedelta(seconds=0.1))
self.job_task, JobSchedulerOptions(start_at=scheduled_time)
)
@Job(name="test_job_scheduled_at_job_task")
@@ -1201,29 +1202,28 @@ async def test_job_scheduled_at(coresys: CoreSys):
self.coresys.jobs.current.stage = "work"
test = TestClass(coresys)
job_started = asyncio.Event()
job_ended = asyncio.Event()
# Schedule job to run 0.1 seconds from now
scheduled_time = datetime.now() + timedelta(seconds=0.1)
job, _ = await test.job_scheduler(scheduled_time)
started = False
ended = False
async def start_listener(evt_job: SupervisorJob):
if evt_job.uuid == job.uuid:
job_started.set()
nonlocal started
started = started or evt_job.uuid == job.uuid
async def end_listener(evt_job: SupervisorJob):
if evt_job.uuid == job.uuid:
job_ended.set()
nonlocal ended
ended = ended or evt_job.uuid == job.uuid
async with time_machine.travel(dt):
job, _ = await test.job_scheduler()
coresys.bus.register_event(BusEvent.SUPERVISOR_JOB_START, start_listener)
coresys.bus.register_event(BusEvent.SUPERVISOR_JOB_END, end_listener)
coresys.bus.register_event(BusEvent.SUPERVISOR_JOB_START, start_listener)
coresys.bus.register_event(BusEvent.SUPERVISOR_JOB_END, end_listener)
# Advance time to exactly when job should start and wait for completion
async with time_machine.travel(dt + timedelta(seconds=0.1)):
await asyncio.wait_for(
asyncio.gather(job_started.wait(), job_ended.wait()), timeout=1.0
)
await asyncio.sleep(0.2)
assert started
assert ended
assert job.done
assert job.name == "test_job_scheduled_at_job_task"
assert job.stage == "work"

View File

@@ -5,10 +5,7 @@ from unittest.mock import MagicMock, patch
from supervisor.const import CoreState
from supervisor.coresys import CoreSys
from supervisor.resolution.evaluations.operating_system import (
SUPPORTED_OS,
EvaluateOperatingSystem,
)
from supervisor.resolution.evaluations.operating_system import EvaluateOperatingSystem
async def test_evaluation(coresys: CoreSys):
@@ -29,12 +26,6 @@ async def test_evaluation(coresys: CoreSys):
assert operating_system.reason not in coresys.resolution.unsupported
coresys.os._available = False
coresys.host._info = MagicMock(
operating_system=SUPPORTED_OS[0], timezone=None, timezone_tzinfo=None
)
await operating_system()
assert operating_system.reason not in coresys.resolution.unsupported
async def test_did_run(coresys: CoreSys):
"""Test that the evaluation ran as expected."""

View File

@@ -0,0 +1,43 @@
"""Test evaluation supported system architectures."""
from unittest.mock import PropertyMock, patch
import pytest
from supervisor.const import CoreState
from supervisor.coresys import CoreSys
from supervisor.resolution.evaluations.system_architecture import (
EvaluateSystemArchitecture,
)
@pytest.mark.parametrize("arch", ["i386", "armhf", "armv7"])
async def test_evaluation_unsupported_architectures(
coresys: CoreSys,
arch: str,
):
"""Test evaluation of unsupported system architectures."""
system_architecture = EvaluateSystemArchitecture(coresys)
await coresys.core.set_state(CoreState.INITIALIZE)
with patch.object(
type(coresys.supervisor), "arch", PropertyMock(return_value=arch)
):
await system_architecture()
assert system_architecture.reason in coresys.resolution.unsupported
@pytest.mark.parametrize("arch", ["amd64", "aarch64"])
async def test_evaluation_supported_architectures(
coresys: CoreSys,
arch: str,
):
"""Test evaluation of supported system architectures."""
system_architecture = EvaluateSystemArchitecture(coresys)
await coresys.core.set_state(CoreState.INITIALIZE)
with patch.object(
type(coresys.supervisor), "arch", PropertyMock(return_value=arch)
):
await system_architecture()
assert system_architecture.reason not in coresys.resolution.unsupported

View File

@@ -86,6 +86,22 @@ def test_format_verbose_newlines():
)
def test_format_verbose_colors():
"""Test verbose formatter with ANSI colors in message."""
fields = {
"__REALTIME_TIMESTAMP": "1379403171000000",
"_HOSTNAME": "homeassistant",
"SYSLOG_IDENTIFIER": "python",
"_PID": "666",
"MESSAGE": "\x1b[32mHello, world!\x1b[0m",
}
assert (
journal_verbose_formatter(fields)
== "2013-09-17 07:32:51.000 homeassistant python[666]: \x1b[32mHello, world!\x1b[0m"
)
async def test_parsing_simple():
"""Test plain formatter."""
journal_logs, stream = _journal_logs_mock()
@@ -297,3 +313,54 @@ async def test_parsing_non_utf8_in_binary_message():
)
_, line = await anext(journal_logs_reader(journal_logs))
assert line == "Hello, \ufffd world!"
def test_format_plain_no_colors():
"""Test plain formatter strips ANSI color codes when no_colors=True."""
fields = {"MESSAGE": "\x1b[32mHello, world!\x1b[0m"}
assert journal_plain_formatter(fields, no_colors=True) == "Hello, world!"
def test_format_verbose_no_colors():
"""Test verbose formatter strips ANSI color codes when no_colors=True."""
fields = {
"__REALTIME_TIMESTAMP": "1379403171000000",
"_HOSTNAME": "homeassistant",
"SYSLOG_IDENTIFIER": "python",
"_PID": "666",
"MESSAGE": "\x1b[32mHello, world!\x1b[0m",
}
assert (
journal_verbose_formatter(fields, no_colors=True)
== "2013-09-17 07:32:51.000 homeassistant python[666]: Hello, world!"
)
async def test_parsing_colored_logs_verbose_no_colors():
"""Test verbose formatter strips colors from colored logs."""
journal_logs, stream = _journal_logs_mock()
stream.feed_data(
b"__REALTIME_TIMESTAMP=1379403171000000\n"
b"_HOSTNAME=homeassistant\n"
b"SYSLOG_IDENTIFIER=python\n"
b"_PID=666\n"
b"MESSAGE\n\x0e\x00\x00\x00\x00\x00\x00\x00\x1b[31mERROR\x1b[0m\n"
b"AFTER=after\n\n"
)
_, line = await anext(
journal_logs_reader(
journal_logs, log_formatter=LogFormatter.VERBOSE, no_colors=True
)
)
assert line == "2013-09-17 07:32:51.000 homeassistant python[666]: ERROR"
async def test_parsing_multiple_color_codes():
"""Test stripping multiple ANSI color codes in single message."""
journal_logs, stream = _journal_logs_mock()
stream.feed_data(
b"MESSAGE\n\x29\x00\x00\x00\x00\x00\x00\x00\x1b[31mRed\x1b[0m \x1b[32mGreen\x1b[0m \x1b[34mBlue\x1b[0m\n"
b"AFTER=after\n\n"
)
_, line = await anext(journal_logs_reader(journal_logs, no_colors=True))
assert line == "Red Green Blue"