Compare commits


19 Commits

Author SHA1 Message Date
Mike Degatano
5781b8c35c Add defined error for when build fails 2025-11-21 19:11:26 +00:00
Mike Degatano
9be1ad4fb9 Fix stats test and add more for known errors 2025-11-21 18:40:55 +00:00
Mike Degatano
9ae4322313 Fix docker ratelimit exception and tests 2025-11-21 18:40:54 +00:00
Mike Degatano
240dce1e29 Remove customized unknown error types 2025-11-21 18:40:53 +00:00
Mike Degatano
52feac110c Remove unknown errors from addons 2025-11-21 18:40:52 +00:00
Jan Čermák
ca7a3af676 Drop codenotary options from the build config (#6330)
These options are obsolete, as all the support has been dropped from the
builder and Supervisor as well. Remove them from our build config too.
2025-11-21 16:36:48 +01:00
Stefan Agner
93272fe4c0 Deprecate i386, armhf and armv7 Supervisor architectures (#5620)
* Deprecate i386, armhf and armv7 Supervisor architectures

* Exclude Core from architecture deprecation checks

This still allows downloading the latest available Core version, even
on deprecated systems.

* Fix pytest
2025-11-21 16:35:26 +01:00
Jan Čermák
79a99cc66d Use release-suffixed base images (pin to 2025.11.1) (#6329)
Currently we're lacking control over which version of the base images is
used; it only depends on when the build is launched. This doesn't allow
any (easy) rollback mechanism and it's also not very transparent.

Use the newly introduced base image tags which include the release
version suffix so we have more control over this aspect.
2025-11-21 16:22:22 +01:00
dependabot[bot]
6af6c3157f Bump actions/checkout from 5.0.1 to 6.0.0 (#6327)
Bumps [actions/checkout](https://github.com/actions/checkout) from 5.0.1 to 6.0.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](93cb6efe18...1af3b93b68)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-21 09:29:32 +01:00
Jan Čermák
5ed0c85168 Add optional no_colors query parameter to advanced logs endpoints (#6326)
Add support for `no_colors` query parameter on all advanced logs API endpoints,
allowing users to optionally strip ANSI color sequences from log output. This
complements the existing color stripping on /latest endpoints added in #6319.
2025-11-21 09:29:15 +01:00
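As a sketch of how such an optional boolean query parameter might be interpreted (the helper name and the set of accepted values are assumptions for illustration, not the actual Supervisor handler):

```python
def strip_colors_requested(query: dict[str, str]) -> bool:
    """Return True if a no_colors query parameter asks for color stripping.

    Hypothetical helper: which values count as "enabled" is an assumption;
    here anything other than an explicit "false"/"0" enables stripping.
    """
    value = query.get("no_colors")
    if value is None:
        return False
    return value.lower() not in ("false", "0")
```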
Stefan Agner
63a3dff118 Handle pull events with complete progress details only (#6320)
* Handle pull events with complete progress details only

Under certain circumstances, Docker seems to send pull events with
incomplete progress details (i.e., missing 'current' or 'total' fields).
In practice, we've observed an empty dictionary for progress details
as well as a missing 'total' field (while 'current' was present).
All events were observed with Docker 28.3.3 using the old, default
Docker graph backend.

* Fix docstring/comment
2025-11-19 12:21:27 +01:00
dependabot[bot]
fc8fc171c1 Bump time-machine from 2.19.0 to 3.0.0 (#6321)
Bumps [time-machine](https://github.com/adamchainz/time-machine) from 2.19.0 to 3.0.0.
- [Changelog](https://github.com/adamchainz/time-machine/blob/main/docs/changelog.rst)
- [Commits](https://github.com/adamchainz/time-machine/compare/2.19.0...3.0.0)

---
updated-dependencies:
- dependency-name: time-machine
  dependency-version: 3.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-19 12:21:17 +01:00
Stefan Agner
72bbc50c83 Fix call_at to use event loop time base instead of Unix timestamp (#6324)
* Fix call_at to use event loop time base instead of Unix timestamp

The CoreSys.call_at method was incorrectly passing Unix timestamps
directly to asyncio.loop.call_at(), which expects times in the event
loop's monotonic time base. This caused scheduled jobs to be scheduled
approximately 55 years in the future (the difference between Unix epoch
time and monotonic time since boot).

The bug was masked by time-machine 2.19.0, which patched time.monotonic()
and caused loop.time() to return Unix timestamps. Time-machine 3.0.0
removed this patching (as it caused event loop freezes), exposing the bug.

Fix by converting the datetime to event loop time base:
- Calculate delay from current Unix time to scheduled Unix time
- Add delay to current event loop time to get scheduled loop time

Also simplify test_job_scheduled_at to avoid time-machine's async
context managers, following the pattern of test_job_scheduled_delay.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* Add comment about datetime in the past

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-11-19 11:49:05 +01:00
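The conversion described above can be sketched as follows (hypothetical helper name, not the actual CoreSys method):

```python
import asyncio
import time
from datetime import datetime, timedelta, timezone


def call_at_datetime(loop: asyncio.AbstractEventLoop, when: datetime, callback, *args):
    """Schedule callback at a wall-clock datetime via loop.call_at().

    loop.call_at() expects the event loop's monotonic time base, not a Unix
    timestamp, so the Unix-time delay is added to loop.time() first.
    A datetime in the past yields a negative delay, firing immediately.
    """
    delay = when.timestamp() - time.time()
    return loop.call_at(loop.time() + delay, callback, *args)
```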
Jan Čermák
0837e05cb2 Strip ANSI escape color sequences from /latest log responses (#6319)
* Strip ANSI escape color sequences from /latest log responses

Strip ANSI sequences of CSI commands [1] used for log coloring from
/latest log endpoints. These endpoints were primarily designed for log
downloads, and colors are mostly not wanted there. Add an optional
argument for stripping the colors from the logs and enable it for the
/latest endpoints.

[1] https://en.wikipedia.org/wiki/ANSI_escape_code#CSIsection

* Refactor advanced logs' tests to use fixture factory

Introduce `advanced_logs_tester` fixture to simplify testing of advanced logs
in the API tests, declaring all the needed fixtures in a single place.
2025-11-19 09:39:24 +01:00
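Stripping CSI sequences as described typically comes down to a small regex; a generic sketch (not the Supervisor implementation, and covering only the common parameter bytes):

```python
import re

# CSI sequence: ESC '[' + parameter bytes (0-9 ; ?) +
# intermediate bytes (space to '/') + one final byte ('@' to '~')
CSI_RE = re.compile(r"\x1b\[[0-9;?]*[ -/]*[@-~]")


def strip_csi(text: str) -> str:
    """Remove ANSI CSI (coloring/cursor) escape sequences from log text."""
    return CSI_RE.sub("", text)
```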
dependabot[bot]
d3d652eba5 Bump sentry-sdk from 2.44.0 to 2.45.0 (#6322)
Bumps [sentry-sdk](https://github.com/getsentry/sentry-python) from 2.44.0 to 2.45.0.
- [Release notes](https://github.com/getsentry/sentry-python/releases)
- [Changelog](https://github.com/getsentry/sentry-python/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-python/compare/2.44.0...2.45.0)

---
updated-dependencies:
- dependency-name: sentry-sdk
  dependency-version: 2.45.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-19 09:27:59 +01:00
dependabot[bot]
2eea3c70eb Bump coverage from 7.11.3 to 7.12.0 (#6323)
Bumps [coverage](https://github.com/coveragepy/coveragepy) from 7.11.3 to 7.12.0.
- [Release notes](https://github.com/coveragepy/coveragepy/releases)
- [Changelog](https://github.com/coveragepy/coveragepy/blob/main/CHANGES.rst)
- [Commits](https://github.com/coveragepy/coveragepy/compare/7.11.3...7.12.0)

---
updated-dependencies:
- dependency-name: coverage
  dependency-version: 7.12.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-19 09:27:45 +01:00
dependabot[bot]
95c106d502 Bump actions/checkout from 5.0.0 to 5.0.1 (#6318)
Bumps [actions/checkout](https://github.com/actions/checkout) from 5.0.0 to 5.0.1.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](08c6903cd8...93cb6efe18)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: 5.0.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-18 08:45:19 +01:00
dependabot[bot]
74f9431519 Bump ruff from 0.14.4 to 0.14.5 (#6314)
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.14.4 to 0.14.5.
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.14.4...0.14.5)

---
updated-dependencies:
- dependency-name: ruff
  dependency-version: 0.14.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-14 09:06:58 +01:00
dependabot[bot]
0eef2169f7 Bump pylint from 4.0.2 to 4.0.3 (#6315) 2025-11-13 23:02:33 -08:00
54 changed files with 1564 additions and 539 deletions

View File

@@ -53,7 +53,7 @@ jobs:
       requirements: ${{ steps.requirements.outputs.changed }}
     steps:
       - name: Checkout the repository
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
         with:
           fetch-depth: 0
@@ -92,7 +92,7 @@ jobs:
       arch: ${{ fromJson(needs.init.outputs.architectures) }}
     steps:
       - name: Checkout the repository
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
         with:
           fetch-depth: 0
@@ -178,7 +178,7 @@ jobs:
     steps:
       - name: Checkout the repository
         if: needs.init.outputs.publish == 'true'
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
       - name: Initialize git
         if: needs.init.outputs.publish == 'true'
@@ -203,7 +203,7 @@ jobs:
     timeout-minutes: 60
     steps:
       - name: Checkout the repository
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
         # home-assistant/builder doesn't support sha pinning
       - name: Build the Supervisor

View File

@@ -26,7 +26,7 @@ jobs:
     name: Prepare Python dependencies
     steps:
      - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
      - name: Set up Python
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
@@ -68,7 +68,7 @@ jobs:
     needs: prepare
     steps:
      - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
      - name: Set up Python ${{ needs.prepare.outputs.python-version }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        id: python
@@ -111,7 +111,7 @@ jobs:
     needs: prepare
     steps:
      - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
      - name: Set up Python ${{ needs.prepare.outputs.python-version }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        id: python
@@ -154,7 +154,7 @@ jobs:
     needs: prepare
     steps:
      - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
      - name: Register hadolint problem matcher
        run: |
          echo "::add-matcher::.github/workflows/matchers/hadolint.json"
@@ -169,7 +169,7 @@ jobs:
     needs: prepare
     steps:
      - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
      - name: Set up Python ${{ needs.prepare.outputs.python-version }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        id: python
@@ -213,7 +213,7 @@ jobs:
     needs: prepare
     steps:
      - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
      - name: Set up Python ${{ needs.prepare.outputs.python-version }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        id: python
@@ -257,7 +257,7 @@ jobs:
     needs: prepare
     steps:
      - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
      - name: Set up Python ${{ needs.prepare.outputs.python-version }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        id: python
@@ -293,7 +293,7 @@ jobs:
     needs: prepare
     steps:
      - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
      - name: Set up Python ${{ needs.prepare.outputs.python-version }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        id: python
@@ -339,7 +339,7 @@ jobs:
     name: Run tests Python ${{ needs.prepare.outputs.python-version }}
     steps:
      - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
      - name: Set up Python ${{ needs.prepare.outputs.python-version }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        id: python
@@ -398,7 +398,7 @@ jobs:
     needs: ["pytest", "prepare"]
     steps:
      - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
      - name: Set up Python ${{ needs.prepare.outputs.python-version }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        id: python

View File

@@ -11,7 +11,7 @@ jobs:
     name: Release Drafter
     steps:
       - name: Checkout the repository
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
         with:
           fetch-depth: 0

View File

@@ -10,7 +10,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
       - name: Sentry Release
         uses: getsentry/action-release@128c5058bbbe93c8e02147fe0a9c713f166259a6 # v3.4.0
         env:

View File

@@ -14,7 +14,7 @@ jobs:
       latest_version: ${{ steps.latest_frontend_version.outputs.latest_tag }}
     steps:
       - name: Checkout code
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
       - name: Get latest frontend release
         id: latest_frontend_version
         uses: abatilo/release-info-action@32cb932219f1cee3fc4f4a298fd65ead5d35b661 # v1.3.3
@@ -49,7 +49,7 @@ jobs:
     if: needs.check-version.outputs.skip != 'true'
     steps:
       - name: Checkout code
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
       - name: Clear www folder
         run: |
           rm -rf supervisor/api/panel/*

View File

@@ -1,13 +1,10 @@
 image: ghcr.io/home-assistant/{arch}-hassio-supervisor
 build_from:
-  aarch64: ghcr.io/home-assistant/aarch64-base-python:3.13-alpine3.22
-  armhf: ghcr.io/home-assistant/armhf-base-python:3.13-alpine3.22
-  armv7: ghcr.io/home-assistant/armv7-base-python:3.13-alpine3.22
-  amd64: ghcr.io/home-assistant/amd64-base-python:3.13-alpine3.22
-  i386: ghcr.io/home-assistant/i386-base-python:3.13-alpine3.22
-codenotary:
-  signer: notary@home-assistant.io
-  base_image: notary@home-assistant.io
+  aarch64: ghcr.io/home-assistant/aarch64-base-python:3.13-alpine3.22-2025.11.1
+  armhf: ghcr.io/home-assistant/armhf-base-python:3.13-alpine3.22-2025.11.1
+  armv7: ghcr.io/home-assistant/armv7-base-python:3.13-alpine3.22-2025.11.1
+  amd64: ghcr.io/home-assistant/amd64-base-python:3.13-alpine3.22-2025.11.1
+  i386: ghcr.io/home-assistant/i386-base-python:3.13-alpine3.22-2025.11.1
 cosign:
   base_identity: https://github.com/home-assistant/docker-base/.*
   identity: https://github.com/home-assistant/supervisor/.*

View File

@@ -25,7 +25,7 @@ pyudev==0.24.4
 PyYAML==6.0.3
 requests==2.32.5
 securetar==2025.2.1
-sentry-sdk==2.44.0
+sentry-sdk==2.45.0
 setuptools==80.9.0
 voluptuous==0.15.2
 dbus-fast==2.45.1

View File

@@ -1,15 +1,15 @@
 astroid==4.0.2
-coverage==7.11.3
+coverage==7.12.0
 mypy==1.18.2
 pre-commit==4.4.0
-pylint==4.0.2
+pylint==4.0.3
 pytest-aiohttp==1.1.0
 pytest-asyncio==1.3.0
 pytest-cov==7.0.0
 pytest-timeout==2.4.0
 pytest==9.0.1
-ruff==0.14.4
+ruff==0.14.5
-time-machine==2.19.0
+time-machine==3.0.0
 types-docker==7.1.0.20251009
 types-pyyaml==6.0.12.20250915
 types-requests==2.32.4.20250913

View File

@@ -66,13 +66,22 @@ from ..docker.const import ContainerState
 from ..docker.monitor import DockerContainerStateEvent
 from ..docker.stats import DockerStats
 from ..exceptions import (
-    AddonConfigurationError,
+    AddonBackupMetadataInvalidError,
+    AddonBuildFailedUnknownError,
+    AddonConfigurationInvalidError,
+    AddonNotRunningError,
     AddonNotSupportedError,
+    AddonNotSupportedWriteStdinError,
+    AddonPrePostBackupCommandReturnedError,
     AddonsError,
     AddonsJobError,
+    AddonUnknownError,
+    BackupRestoreUnknownError,
     ConfigurationFileError,
+    DockerBuildError,
     DockerError,
     HostAppArmorError,
+    StoreAddonNotFoundError,
 )
 from ..hardware.data import Device
 from ..homeassistant.const import WSEvent
@@ -235,7 +244,7 @@ class Addon(AddonModel):
             await self.instance.check_image(self.version, default_image, self.arch)
         except DockerError:
             _LOGGER.info("No %s addon Docker image %s found", self.slug, self.image)
-            with suppress(DockerError):
+            with suppress(DockerError, AddonNotSupportedError):
                 await self.instance.install(self.version, default_image, arch=self.arch)

         self.persist[ATTR_IMAGE] = default_image
@@ -718,18 +727,16 @@
             options = self.schema.validate(self.options)
             await self.sys_run_in_executor(write_json_file, self.path_options, options)
         except vol.Invalid as ex:
-            _LOGGER.error(
-                "Add-on %s has invalid options: %s",
-                self.slug,
-                humanize_error(self.options, ex),
-            )
-        except ConfigurationFileError:
+            raise AddonConfigurationInvalidError(
+                _LOGGER.error,
+                addon=self.slug,
+                validation_error=humanize_error(self.options, ex),
+            ) from None
+        except ConfigurationFileError as err:
             _LOGGER.error("Add-on %s can't write options", self.slug)
-        else:
-            _LOGGER.debug("Add-on %s write options: %s", self.slug, options)
-            return
-
-        raise AddonConfigurationError()
+            raise AddonUnknownError(addon=self.slug) from err
+
+        _LOGGER.debug("Add-on %s write options: %s", self.slug, options)

     @Job(
         name="addon_unload",
@@ -772,7 +779,7 @@
     async def install(self) -> None:
         """Install and setup this addon."""
         if not self.addon_store:
-            raise AddonsError("Missing from store, cannot install!")
+            raise StoreAddonNotFoundError(addon=self.slug)

         await self.sys_addons.data.install(self.addon_store)
@@ -793,9 +800,17 @@
             await self.instance.install(
                 self.latest_version, self.addon_store.image, arch=self.arch
             )
-        except DockerError as err:
+        except AddonsError:
             await self.sys_addons.data.uninstall(self)
-            raise AddonsError() from err
+            raise
+        except DockerBuildError as err:
+            _LOGGER.error("Could not build image for addon %s: %s", self.slug, err)
+            await self.sys_addons.data.uninstall(self)
+            raise AddonBuildFailedUnknownError(addon=self.slug) from err
+        except DockerError as err:
+            _LOGGER.error("Could not pull image to update addon %s: %s", self.slug, err)
+            await self.sys_addons.data.uninstall(self)
+            raise AddonUnknownError(addon=self.slug) from err

         # Finish initialization and set up listeners
         await self.load()
@@ -819,7 +834,8 @@
         try:
             await self.instance.remove(remove_image=remove_image)
         except DockerError as err:
-            raise AddonsError() from err
+            _LOGGER.error("Could not remove image for addon %s: %s", self.slug, err)
+            raise AddonUnknownError(addon=self.slug) from err

         self.state = AddonState.UNKNOWN
@@ -884,7 +900,7 @@
         if it was running. Else nothing is returned.
         """
         if not self.addon_store:
-            raise AddonsError("Missing from store, cannot update!")
+            raise StoreAddonNotFoundError(addon=self.slug)

         old_image = self.image
         # Cache data to prevent races with other updates to global
@@ -892,8 +908,12 @@
         try:
             await self.instance.update(store.version, store.image, arch=self.arch)
+        except DockerBuildError as err:
+            _LOGGER.error("Could not build image for addon %s: %s", self.slug, err)
+            raise AddonBuildFailedUnknownError(addon=self.slug) from err
         except DockerError as err:
-            raise AddonsError() from err
+            _LOGGER.error("Could not pull image to update addon %s: %s", self.slug, err)
+            raise AddonUnknownError(addon=self.slug) from err

         # Stop the addon if running
         if (last_state := self.state) in {AddonState.STARTED, AddonState.STARTUP}:
@@ -935,12 +955,23 @@
         """
         last_state: AddonState = self.state
         try:
-            # remove docker container but not addon config
+            # remove docker container and image but not addon config
             try:
                 await self.instance.remove()
-                await self.instance.install(self.version)
             except DockerError as err:
-                raise AddonsError() from err
+                _LOGGER.error("Could not remove image for addon %s: %s", self.slug, err)
+                raise AddonUnknownError(addon=self.slug) from err
+
+            try:
+                await self.instance.install(self.version)
+            except DockerBuildError as err:
+                _LOGGER.error("Could not build image for addon %s: %s", self.slug, err)
+                raise AddonBuildFailedUnknownError(addon=self.slug) from err
+            except DockerError as err:
+                _LOGGER.error(
+                    "Could not pull image to update addon %s: %s", self.slug, err
+                )
+                raise AddonUnknownError(addon=self.slug) from err

             if self.addon_store:
                 await self.sys_addons.data.update(self.addon_store)
@@ -1111,8 +1142,9 @@
         try:
             await self.instance.run()
         except DockerError as err:
+            _LOGGER.error("Could not start container for addon %s: %s", self.slug, err)
             self.state = AddonState.ERROR
-            raise AddonsError() from err
+            raise AddonUnknownError(addon=self.slug) from err

         return self.sys_create_task(self._wait_for_startup())
@@ -1127,8 +1159,9 @@
         try:
             await self.instance.stop()
         except DockerError as err:
+            _LOGGER.error("Could not stop container for addon %s: %s", self.slug, err)
             self.state = AddonState.ERROR
-            raise AddonsError() from err
+            raise AddonUnknownError(addon=self.slug) from err

     @Job(
         name="addon_restart",
@@ -1161,9 +1194,15 @@
     async def stats(self) -> DockerStats:
         """Return stats of container."""
         try:
+            if not await self.is_running():
+                raise AddonNotRunningError(_LOGGER.warning, addon=self.slug)
+
             return await self.instance.stats()
         except DockerError as err:
-            raise AddonsError() from err
+            _LOGGER.error(
+                "Could not get stats of container for addon %s: %s", self.slug, err
+            )
+            raise AddonUnknownError(addon=self.slug) from err

     @Job(
         name="addon_write_stdin",
@@ -1173,14 +1212,18 @@
     async def write_stdin(self, data) -> None:
         """Write data to add-on stdin."""
         if not self.with_stdin:
-            raise AddonNotSupportedError(
-                f"Add-on {self.slug} does not support writing to stdin!", _LOGGER.error
-            )
+            raise AddonNotSupportedWriteStdinError(_LOGGER.error, addon=self.slug)

         try:
-            return await self.instance.write_stdin(data)
+            if not await self.is_running():
+                raise AddonNotRunningError(_LOGGER.warning, addon=self.slug)
+
+            await self.instance.write_stdin(data)
         except DockerError as err:
-            raise AddonsError() from err
+            _LOGGER.error(
+                "Could not write stdin to container for addon %s: %s", self.slug, err
+            )
+            raise AddonUnknownError(addon=self.slug) from err

     async def _backup_command(self, command: str) -> None:
         try:
@@ -1189,15 +1232,14 @@
             _LOGGER.debug(
                 "Pre-/Post backup command failed with: %s", command_return.output
             )
-            raise AddonsError(
-                f"Pre-/Post backup command returned error code: {command_return.exit_code}",
-                _LOGGER.error,
+            raise AddonPrePostBackupCommandReturnedError(
+                _LOGGER.error, addon=self.slug, exit_code=command_return.exit_code
             )
         except DockerError as err:
-            raise AddonsError(
-                f"Failed running pre-/post backup command {command}: {str(err)}",
-                _LOGGER.error,
-            ) from err
+            _LOGGER.error(
+                "Failed running pre-/post backup command %s: %s", command, err
+            )
+            raise AddonUnknownError(addon=self.slug) from err

     @Job(
         name="addon_begin_backup",
@@ -1286,15 +1328,14 @@
             try:
                 self.instance.export_image(temp_path.joinpath("image.tar"))
             except DockerError as err:
-                raise AddonsError() from err
+                raise BackupRestoreUnknownError() from err

             # Store local configs/state
             try:
                 write_json_file(temp_path.joinpath("addon.json"), metadata)
             except ConfigurationFileError as err:
-                raise AddonsError(
-                    f"Can't save meta for {self.slug}", _LOGGER.error
-                ) from err
+                _LOGGER.error("Can't save meta for %s: %s", self.slug, err)
+                raise BackupRestoreUnknownError() from err

             # Store AppArmor Profile
             if apparmor_profile:
@@ -1304,9 +1345,7 @@
                         apparmor_profile, profile_backup_file
                     )
                 except HostAppArmorError as err:
-                    raise AddonsError(
-                        "Can't backup AppArmor profile", _LOGGER.error
-                    ) from err
+                    raise BackupRestoreUnknownError() from err

             # Write tarfile
             with tar_file as backup:
@@ -1360,7 +1399,8 @@
             )
             _LOGGER.info("Finish backup for addon %s", self.slug)
         except (tarfile.TarError, OSError, AddFileError) as err:
-            raise AddonsError(f"Can't write tarfile: {err}", _LOGGER.error) from err
+            _LOGGER.error("Can't write backup tarfile for addon %s: %s", self.slug, err)
+            raise BackupRestoreUnknownError() from err
         finally:
             if was_running:
                 wait_for_start = await self.end_backup()
@@ -1402,28 +1442,24 @@ class Addon(AddonModel):
try: try:
tmp, data = await self.sys_run_in_executor(_extract_tarfile) tmp, data = await self.sys_run_in_executor(_extract_tarfile)
except tarfile.TarError as err: except tarfile.TarError as err:
raise AddonsError( _LOGGER.error("Can't extract backup tarfile for %s: %s", self.slug, err)
f"Can't read tarfile {tar_file}: {err}", _LOGGER.error raise BackupRestoreUnknownError() from err
) from err
except ConfigurationFileError as err: except ConfigurationFileError as err:
raise AddonsError() from err raise AddonUnknownError(addon=self.slug) from err
try: try:
# Validate # Validate
try: try:
data = SCHEMA_ADDON_BACKUP(data) data = SCHEMA_ADDON_BACKUP(data)
except vol.Invalid as err: except vol.Invalid as err:
raise AddonsError( raise AddonBackupMetadataInvalidError(
f"Can't validate {self.slug}, backup data: {humanize_error(data, err)}",
_LOGGER.error, _LOGGER.error,
addon=self.slug,
validation_error=humanize_error(data, err),
) from err ) from err
# If available # Validate availability. Raises if not
if not self._available(data[ATTR_SYSTEM]): self._validate_availability(data[ATTR_SYSTEM], logger=_LOGGER.error)
raise AddonNotSupportedError(
f"Add-on {self.slug} is not available for this platform",
_LOGGER.error,
)
# Restore local add-on information # Restore local add-on information
_LOGGER.info("Restore config for addon %s", self.slug) _LOGGER.info("Restore config for addon %s", self.slug)
@@ -1482,9 +1518,10 @@ class Addon(AddonModel):
try: try:
await self.sys_run_in_executor(_restore_data) await self.sys_run_in_executor(_restore_data)
except shutil.Error as err: except shutil.Error as err:
raise AddonsError( _LOGGER.error(
f"Can't restore origin data: {err}", _LOGGER.error "Can't restore origin data for %s: %s", self.slug, err
) from err )
raise BackupRestoreUnknownError() from err
# Restore AppArmor # Restore AppArmor
profile_file = Path(tmp.name, "apparmor.txt") profile_file = Path(tmp.name, "apparmor.txt")
@@ -1495,10 +1532,11 @@ class Addon(AddonModel):
) )
except HostAppArmorError as err: except HostAppArmorError as err:
_LOGGER.error( _LOGGER.error(
"Can't restore AppArmor profile for add-on %s", "Can't restore AppArmor profile for add-on %s: %s",
self.slug, self.slug,
err,
) )
raise AddonsError() from err raise BackupRestoreUnknownError() from err
finally: finally:
# Is add-on loaded # Is add-on loaded


@@ -3,6 +3,7 @@
 from __future__ import annotations

 from functools import cached_property
+import logging
 from pathlib import Path
 from typing import TYPE_CHECKING, Any
@@ -19,13 +20,20 @@ from ..const import (
 )
 from ..coresys import CoreSys, CoreSysAttributes
 from ..docker.interface import MAP_ARCH
-from ..exceptions import ConfigurationFileError, HassioArchNotFound
+from ..exceptions import (
+    AddonBuildArchitectureNotSupportedError,
+    AddonBuildDockerfileMissingError,
+    ConfigurationFileError,
+    HassioArchNotFound,
+)
 from ..utils.common import FileConfiguration, find_one_filetype
 from .validate import SCHEMA_BUILD_CONFIG

 if TYPE_CHECKING:
     from .manager import AnyAddon

+_LOGGER: logging.Logger = logging.getLogger(__name__)

 class AddonBuild(FileConfiguration, CoreSysAttributes):
     """Handle build options for add-ons."""
@@ -106,7 +114,7 @@ class AddonBuild(FileConfiguration, CoreSysAttributes):
             return self.addon.path_location.joinpath(f"Dockerfile.{self.arch}")
         return self.addon.path_location.joinpath("Dockerfile")

-    async def is_valid(self) -> bool:
+    async def is_valid(self) -> None:
         """Return true if the build env is valid."""

         def build_is_valid() -> bool:
@@ -118,9 +126,17 @@ class AddonBuild(FileConfiguration, CoreSysAttributes):
             )

         try:
-            return await self.sys_run_in_executor(build_is_valid)
+            if not await self.sys_run_in_executor(build_is_valid):
+                raise AddonBuildDockerfileMissingError(
+                    _LOGGER.error, addon=self.addon.slug
+                )
         except HassioArchNotFound:
-            return False
+            raise AddonBuildArchitectureNotSupportedError(
+                _LOGGER.error,
+                addon=self.addon.slug,
+                addon_arch_list=self.addon.supported_arch,
+                system_arch_list=self.sys_arch.supported,
+            ) from None

     def get_docker_args(
         self, version: AwesomeVersion, image_tag: str


@@ -152,6 +152,7 @@ class RestAPI(CoreSysAttributes):
                     self._api_host.advanced_logs,
                     identifier=syslog_identifier,
                     latest=True,
+                    no_colors=True,
                 ),
             ),
             web.get(
@@ -449,6 +450,7 @@ class RestAPI(CoreSysAttributes):
                 await async_capture_exception(err)
             kwargs.pop("follow", None)  # Follow is not supported for Docker logs
             kwargs.pop("latest", None)  # Latest is not supported for Docker logs
+            kwargs.pop("no_colors", None)  # no_colors not supported for Docker logs
             return await api_supervisor.logs(*args, **kwargs)

         self.webapp.add_routes(
@@ -460,7 +462,7 @@ class RestAPI(CoreSysAttributes):
                 ),
                 web.get(
                     "/supervisor/logs/latest",
-                    partial(get_supervisor_logs, latest=True),
+                    partial(get_supervisor_logs, latest=True, no_colors=True),
                 ),
                 web.get("/supervisor/logs/boots/{bootid}", get_supervisor_logs),
                 web.get(
@@ -576,7 +578,7 @@ class RestAPI(CoreSysAttributes):
                 ),
                 web.get(
                     "/addons/{addon}/logs/latest",
-                    partial(get_addon_logs, latest=True),
+                    partial(get_addon_logs, latest=True, no_colors=True),
                 ),
                 web.get("/addons/{addon}/logs/boots/{bootid}", get_addon_logs),
                 web.get(


@@ -100,6 +100,9 @@ from ..const import (
 from ..coresys import CoreSysAttributes
 from ..docker.stats import DockerStats
 from ..exceptions import (
+    AddonBootConfigCannotChangeError,
+    AddonConfigurationInvalidError,
+    AddonNotSupportedWriteStdinError,
     APIAddonNotInstalled,
     APIError,
     APIForbidden,
@@ -125,6 +128,7 @@ SCHEMA_OPTIONS = vol.Schema(
         vol.Optional(ATTR_AUDIO_INPUT): vol.Maybe(str),
         vol.Optional(ATTR_INGRESS_PANEL): vol.Boolean(),
         vol.Optional(ATTR_WATCHDOG): vol.Boolean(),
+        vol.Optional(ATTR_OPTIONS): vol.Maybe(dict),
     }
 )
@@ -300,19 +304,20 @@ class APIAddons(CoreSysAttributes):
         # Update secrets for validation
         await self.sys_homeassistant.secrets.reload()

-        # Extend schema with add-on specific validation
-        addon_schema = SCHEMA_OPTIONS.extend(
-            {vol.Optional(ATTR_OPTIONS): vol.Maybe(addon.schema)}
-        )
         # Validate/Process Body
-        body = await api_validate(addon_schema, request)
+        body = await api_validate(SCHEMA_OPTIONS, request)
         if ATTR_OPTIONS in body:
-            addon.options = body[ATTR_OPTIONS]
+            try:
+                addon.options = addon.schema(body[ATTR_OPTIONS])
+            except vol.Invalid as ex:
+                raise AddonConfigurationInvalidError(
+                    addon=addon.slug,
+                    validation_error=humanize_error(body[ATTR_OPTIONS], ex),
+                ) from None
         if ATTR_BOOT in body:
             if addon.boot_config == AddonBootConfig.MANUAL_ONLY:
-                raise APIError(
-                    f"Addon {addon.slug} boot option is set to {addon.boot_config} so it cannot be changed"
+                raise AddonBootConfigCannotChangeError(
+                    addon=addon.slug, boot_config=addon.boot_config.value
                 )
             addon.boot = body[ATTR_BOOT]
         if ATTR_AUTO_UPDATE in body:
@@ -476,7 +481,7 @@ class APIAddons(CoreSysAttributes):
         """Write to stdin of add-on."""
         addon = self.get_addon_for_request(request)
         if not addon.with_stdin:
-            raise APIError(f"STDIN not supported the {addon.slug} add-on")
+            raise AddonNotSupportedWriteStdinError(_LOGGER.error, addon=addon.slug)

         data = await request.read()
         await asyncio.shield(addon.write_stdin(data))


@@ -15,7 +15,7 @@ import voluptuous as vol
 from ..addons.addon import Addon
 from ..const import ATTR_NAME, ATTR_PASSWORD, ATTR_USERNAME, REQUEST_FROM
 from ..coresys import CoreSysAttributes
-from ..exceptions import APIForbidden
+from ..exceptions import APIForbidden, AuthInvalidNonStringValueError
 from .const import (
     ATTR_GROUP_IDS,
     ATTR_IS_ACTIVE,
@@ -69,7 +69,9 @@ class APIAuth(CoreSysAttributes):
         try:
             _ = username.encode and password.encode  # type: ignore
         except AttributeError:
-            raise HTTPUnauthorized(headers=REALM_HEADER) from None
+            raise AuthInvalidNonStringValueError(
+                _LOGGER.error, headers=REALM_HEADER
+            ) from None

         return self.sys_auth.check_login(
             addon, cast(str, username), cast(str, password)


@@ -206,6 +206,7 @@ class APIHost(CoreSysAttributes):
         identifier: str | None = None,
         follow: bool = False,
         latest: bool = False,
+        no_colors: bool = False,
     ) -> web.StreamResponse:
         """Return systemd-journald logs."""
         log_formatter = LogFormatter.PLAIN
@@ -251,6 +252,9 @@ class APIHost(CoreSysAttributes):
         if "verbose" in request.query or request.headers[ACCEPT] == CONTENT_TYPE_X_LOG:
             log_formatter = LogFormatter.VERBOSE

+        if "no_colors" in request.query:
+            no_colors = True

         if "lines" in request.query:
             lines = request.query.get("lines", DEFAULT_LINES)
             try:
@@ -280,7 +284,9 @@ class APIHost(CoreSysAttributes):
         response = web.StreamResponse()
         response.content_type = CONTENT_TYPE_TEXT
         headers_returned = False
-        async for cursor, line in journal_logs_reader(resp, log_formatter):
+        async for cursor, line in journal_logs_reader(
+            resp, log_formatter, no_colors
+        ):
             try:
                 if not headers_returned:
                     if cursor:
@@ -318,9 +324,12 @@ class APIHost(CoreSysAttributes):
         identifier: str | None = None,
         follow: bool = False,
         latest: bool = False,
+        no_colors: bool = False,
     ) -> web.StreamResponse:
         """Return systemd-journald logs. Wrapped as standard API handler."""
-        return await self.advanced_logs_handler(request, identifier, follow, latest)
+        return await self.advanced_logs_handler(
+            request, identifier, follow, latest, no_colors
+        )

     @api_process
     async def disk_usage(self, request: web.Request) -> dict:


@@ -53,7 +53,7 @@ from ..const import (
     REQUEST_FROM,
 )
 from ..coresys import CoreSysAttributes
-from ..exceptions import APIError, APIForbidden, APINotFound
+from ..exceptions import APIError, APIForbidden, APINotFound, StoreAddonNotFoundError
 from ..store.addon import AddonStore
 from ..store.repository import Repository
 from ..store.validate import validate_repository
@@ -104,7 +104,7 @@ class APIStore(CoreSysAttributes):
         addon_slug: str = request.match_info["addon"]

         if not (addon := self.sys_addons.get(addon_slug)):
-            raise APINotFound(f"Addon {addon_slug} does not exist")
+            raise StoreAddonNotFoundError(addon=addon_slug)

         if installed and not addon.is_installed:
             raise APIError(f"Addon {addon_slug} is not installed")
@@ -112,7 +112,7 @@ class APIStore(CoreSysAttributes):
         if not installed and addon.is_installed:
             addon = cast(Addon, addon)
             if not addon.addon_store:
-                raise APINotFound(f"Addon {addon_slug} does not exist in the store")
+                raise StoreAddonNotFoundError(addon=addon_slug)
             return addon.addon_store

         return addon


@@ -1,7 +1,7 @@
 """Init file for Supervisor util for RESTful API."""

 import asyncio
-from collections.abc import Callable
+from collections.abc import Callable, Mapping
 import json
 from typing import Any, cast
@@ -26,7 +26,7 @@ from ..const import (
     RESULT_OK,
 )
 from ..coresys import CoreSys, CoreSysAttributes
-from ..exceptions import APIError, BackupFileNotFoundError, DockerAPIError, HassioError
+from ..exceptions import APIError, DockerAPIError, HassioError
 from ..jobs import JobSchedulerOptions, SupervisorJob
 from ..utils import check_exception_chain, get_message_from_exception_chain
 from ..utils.json import json_dumps, json_loads as json_loads_util
@@ -69,10 +69,10 @@ def api_process(method):
         """Return API information."""
         try:
             answer = await method(api, *args, **kwargs)
-        except BackupFileNotFoundError as err:
-            return api_return_error(err, status=404)
         except APIError as err:
-            return api_return_error(err, status=err.status, job_id=err.job_id)
+            return api_return_error(
+                err, status=err.status, job_id=err.job_id, headers=err.headers
+            )
         except HassioError as err:
             return api_return_error(err)
@@ -143,6 +143,7 @@ def api_return_error(
     error_type: str | None = None,
     status: int = 400,
     *,
+    headers: Mapping[str, str] | None = None,
     job_id: str | None = None,
 ) -> web.Response:
     """Return an API error message."""
@@ -155,10 +156,15 @@ def api_return_error(
     match error_type:
         case const.CONTENT_TYPE_TEXT:
-            return web.Response(body=message, content_type=error_type, status=status)
+            return web.Response(
+                body=message, content_type=error_type, status=status, headers=headers
+            )
         case const.CONTENT_TYPE_BINARY:
             return web.Response(
-                body=message.encode(), content_type=error_type, status=status
+                body=message.encode(),
+                content_type=error_type,
+                status=status,
+                headers=headers,
             )
         case _:
             result: dict[str, Any] = {
@@ -176,6 +182,7 @@ def api_return_error(
         result,
         status=status,
         dumps=json_dumps,
+        headers=headers,
     )


@@ -9,8 +9,10 @@ from .addons.addon import Addon
 from .const import ATTR_PASSWORD, ATTR_TYPE, ATTR_USERNAME, FILE_HASSIO_AUTH
 from .coresys import CoreSys, CoreSysAttributes
 from .exceptions import (
-    AuthError,
+    AuthHomeAssistantAPIValidationError,
+    AuthInvalidNonStringValueError,
     AuthListUsersError,
+    AuthListUsersNoneResponseError,
     AuthPasswordResetError,
     HomeAssistantAPIError,
     HomeAssistantWSError,
@@ -83,10 +85,8 @@ class Auth(FileConfiguration, CoreSysAttributes):
         self, addon: Addon, username: str | None, password: str | None
     ) -> bool:
         """Check username login."""
-        if password is None:
-            raise AuthError("None as password is not supported!", _LOGGER.error)
-        if username is None:
-            raise AuthError("None as username is not supported!", _LOGGER.error)
+        if username is None or password is None:
+            raise AuthInvalidNonStringValueError(_LOGGER.error)

         _LOGGER.info("Auth request from '%s' for '%s'", addon.slug, username)
@@ -137,7 +137,7 @@ class Auth(FileConfiguration, CoreSysAttributes):
         finally:
             self._running.pop(username, None)

-        raise AuthError()
+        raise AuthHomeAssistantAPIValidationError()

     async def change_password(self, username: str, password: str) -> None:
         """Change user password login."""
@@ -155,7 +155,7 @@ class Auth(FileConfiguration, CoreSysAttributes):
         except HomeAssistantAPIError as err:
             _LOGGER.error("Can't request password reset on Home Assistant: %s", err)

-        raise AuthPasswordResetError()
+        raise AuthPasswordResetError(user=username)

     async def list_users(self) -> list[dict[str, Any]]:
         """List users on the Home Assistant instance."""
@@ -166,15 +166,12 @@ class Auth(FileConfiguration, CoreSysAttributes):
                 {ATTR_TYPE: "config/auth/list"}
             )
         except HomeAssistantWSError as err:
-            raise AuthListUsersError(
-                f"Can't request listing users on Home Assistant: {err}", _LOGGER.error
-            ) from err
+            _LOGGER.error("Can't request listing users on Home Assistant: %s", err)
+            raise AuthListUsersError() from err

         if users is not None:
             return users

-        raise AuthListUsersError(
-            "Can't request listing users on Home Assistant!", _LOGGER.error
-        )
+        raise AuthListUsersNoneResponseError(_LOGGER.error)

     @staticmethod
     def _rehash(value: str, salt2: str = "") -> str:


@@ -628,9 +628,6 @@ class Backup(JobGroup):
                 if start_task := await self._addon_save(addon):
                     start_tasks.append(start_task)
             except BackupError as err:
-                err = BackupError(
-                    f"Can't backup add-on {addon.slug}: {str(err)}", _LOGGER.error
-                )
                 self.sys_jobs.current.capture_error(err)

         return start_tasks


@@ -9,6 +9,7 @@ from datetime import UTC, datetime, tzinfo
 from functools import partial
 import logging
 import os
+import time
 from types import MappingProxyType
 from typing import TYPE_CHECKING, Any, Self, TypeVar
@@ -655,8 +656,14 @@ class CoreSys:
         if kwargs:
             funct = partial(funct, **kwargs)

+        # Convert datetime to event loop time base
+        # If datetime is in the past, delay will be negative and call_at will
+        # schedule the call as soon as possible.
+        delay = when.timestamp() - time.time()
+        loop_time = self.loop.time() + delay
         return self.loop.call_at(
-            when.timestamp(), funct, *args, context=self._create_context()
+            loop_time, funct, *args, context=self._create_context()
         )


@@ -2,6 +2,7 @@
 from __future__ import annotations

+from collections.abc import Awaitable
 from contextlib import suppress
 from ipaddress import IPv4Address
 import logging
@@ -33,6 +34,7 @@ from ..coresys import CoreSys
 from ..exceptions import (
     CoreDNSError,
     DBusError,
+    DockerBuildError,
     DockerError,
     DockerJobError,
     DockerNotFound,
@@ -680,9 +682,8 @@ class DockerAddon(DockerInterface):
     async def _build(self, version: AwesomeVersion, image: str | None = None) -> None:
         """Build a Docker container."""
         build_env = await AddonBuild(self.coresys, self.addon).load_config()
-        if not await build_env.is_valid():
-            _LOGGER.error("Invalid build environment, can't build this add-on!")
-            raise DockerError()
+        # Check if the build environment is valid, raises if not
+        await build_env.is_valid()

         _LOGGER.info("Starting build for %s:%s", self.image, version)
@@ -733,8 +734,9 @@ class DockerAddon(DockerInterface):
             requests.RequestException,
             aiodocker.DockerError,
         ) as err:
-            _LOGGER.error("Can't build %s:%s: %s", self.image, version, err)
-            raise DockerError() from err
+            raise DockerBuildError(
+                f"Can't build {self.image}:{version}: {err!s}", _LOGGER.error
+            ) from err

         _LOGGER.info("Build %s:%s done", self.image, version)
@@ -792,12 +794,9 @@ class DockerAddon(DockerInterface):
         on_condition=DockerJobError,
         concurrency=JobConcurrency.GROUP_REJECT,
     )
-    async def write_stdin(self, data: bytes) -> None:
+    def write_stdin(self, data: bytes) -> Awaitable[None]:
         """Write to add-on stdin."""
-        if not await self.is_running():
-            raise DockerError()
-        await self.sys_run_in_executor(self._write_stdin, data)
+        return self.sys_run_in_executor(self._write_stdin, data)

     def _write_stdin(self, data: bytes) -> None:
         """Write to add-on stdin.


@@ -310,6 +310,8 @@ class DockerInterface(JobGroup, ABC):
             if (
                 stage in {PullImageLayerStage.DOWNLOADING, PullImageLayerStage.EXTRACTING}
                 and reference.progress_detail
+                and reference.progress_detail.current is not None
+                and reference.progress_detail.total is not None
             ):
                 job.update(
                     progress=progress,
@@ -480,35 +482,34 @@ class DockerInterface(JobGroup, ABC):
             return True
         return False

-    async def is_running(self) -> bool:
-        """Return True if Docker is running."""
+    async def _get_container(self) -> Container | None:
+        """Get docker container, returns None if not found."""
         try:
-            docker_container = await self.sys_run_in_executor(
+            return await self.sys_run_in_executor(
                 self.sys_docker.containers.get, self.name
             )
         except docker.errors.NotFound:
-            return False
+            return None
         except docker.errors.DockerException as err:
-            raise DockerAPIError() from err
+            raise DockerAPIError(
+                f"Docker API error occurred while getting container information: {err!s}"
+            ) from err
         except requests.RequestException as err:
-            raise DockerRequestError() from err
+            raise DockerRequestError(
+                f"Error communicating with Docker to get container information: {err!s}"
+            ) from err
+
+    async def is_running(self) -> bool:
+        """Return True if Docker is running."""
+        if docker_container := await self._get_container():
             return docker_container.status == "running"
+        return False

     async def current_state(self) -> ContainerState:
         """Return current state of container."""
-        try:
-            docker_container = await self.sys_run_in_executor(
-                self.sys_docker.containers.get, self.name
-            )
-        except docker.errors.NotFound:
-            return ContainerState.UNKNOWN
-        except docker.errors.DockerException as err:
-            raise DockerAPIError() from err
-        except requests.RequestException as err:
-            raise DockerRequestError() from err
+        if docker_container := await self._get_container():
             return _container_state_from_model(docker_container)
+        return ContainerState.UNKNOWN

     @Job(name="docker_interface_attach", concurrency=JobConcurrency.GROUP_QUEUE)
     async def attach(
@@ -543,7 +544,9 @@ class DockerInterface(JobGroup, ABC):
         # Successful?
         if not self._meta:
-            raise DockerError()
+            raise DockerError(
+                f"Could not get metadata on container or image for {self.name}"
+            )

         _LOGGER.info("Attaching to %s with version %s", self.image, self.version)

     @Job(
@@ -748,14 +751,8 @@ class DockerInterface(JobGroup, ABC):
     async def is_failed(self) -> bool:
         """Return True if Docker is failing state."""
-        try:
-            docker_container = await self.sys_run_in_executor(
-                self.sys_docker.containers.get, self.name
-            )
-        except docker.errors.NotFound:
+        if not (docker_container := await self._get_container()):
             return False
-        except (docker.errors.DockerException, requests.RequestException) as err:
-            raise DockerError() from err

         # container is not running
         if docker_container.status != "exited":


@@ -578,9 +578,15 @@ class DockerAPI(CoreSysAttributes):
         except aiodocker.DockerError as err:
             if err.status == HTTPStatus.NOT_FOUND:
                 return False
-            raise DockerError() from err
+            raise DockerError(
+                f"Could not get container {name} or image {image}:{version} to check state: {err!s}",
+                _LOGGER.error,
+            ) from err
         except (docker_errors.DockerException, requests.RequestException) as err:
-            raise DockerError() from err
+            raise DockerError(
+                f"Could not get container {name} or image {image}:{version} to check state: {err!s}",
+                _LOGGER.error,
+            ) from err

         # Check the image is correct and state is good
         return (
@@ -596,9 +602,13 @@ class DockerAPI(CoreSysAttributes):
         try:
             docker_container: Container = self.containers.get(name)
         except docker_errors.NotFound:
+            # Generally suppressed so we don't log this
             raise DockerNotFound() from None
         except (docker_errors.DockerException, requests.RequestException) as err:
-            raise DockerError() from err
+            raise DockerError(
+                f"Could not get container {name} for stopping: {err!s}",
+                _LOGGER.error,
+            ) from err

         if docker_container.status == "running":
             _LOGGER.info("Stopping %s application", name)
@@ -638,9 +648,13 @@ class DockerAPI(CoreSysAttributes):
         try:
             container: Container = self.containers.get(name)
         except docker_errors.NotFound:
-            raise DockerNotFound() from None
+            raise DockerNotFound(
+                f"Container {name} not found for restarting", _LOGGER.warning
+            ) from None
         except (docker_errors.DockerException, requests.RequestException) as err:
-            raise DockerError() from err
+            raise DockerError(
+                f"Could not get container {name} for restarting: {err!s}", _LOGGER.error
+            ) from err

         _LOGGER.info("Restarting %s", name)
         try:
@@ -653,9 +667,13 @@ class DockerAPI(CoreSysAttributes):
         try:
             docker_container: Container = self.containers.get(name)
         except docker_errors.NotFound:
-            raise DockerNotFound() from None
+            raise DockerNotFound(
+                f"Container {name} not found for logs", _LOGGER.warning
+            ) from None
         except (docker_errors.DockerException, requests.RequestException) as err:
-            raise DockerError() from err
+            raise DockerError(
+                f"Could not get container {name} for logs: {err!s}", _LOGGER.error
+            ) from err

         try:
             return docker_container.logs(tail=tail, stdout=True, stderr=True)
@@ -669,9 +687,13 @@ class DockerAPI(CoreSysAttributes):
         try:
             docker_container: Container = self.containers.get(name)
         except docker_errors.NotFound:
-            raise DockerNotFound() from None
+            raise DockerNotFound(
+                f"Container {name} not found for stats", _LOGGER.warning
+            ) from None
         except (docker_errors.DockerException, requests.RequestException) as err:
-            raise DockerError() from err
+            raise DockerError(
+                f"Could not inspect container '{name}': {err!s}", _LOGGER.error
+            ) from err

         # container is not running
         if docker_container.status != "running":
@@ -689,15 +711,21 @@ class DockerAPI(CoreSysAttributes):
         try:
             docker_container: Container = self.containers.get(name)
         except docker_errors.NotFound:
-            raise DockerNotFound() from None
+            raise DockerNotFound(
+                f"Container {name} not found for running command", _LOGGER.warning
+            ) from None
         except (docker_errors.DockerException, requests.RequestException) as err:
-            raise DockerError() from err
+            raise DockerError(
+                f"Can't get container {name} to run command: {err!s}"
+            ) from err

         # Execute
         try:
             code, output = docker_container.exec_run(command)
         except (docker_errors.DockerException, requests.RequestException) as err:
-            raise DockerError() from err
+            raise DockerError(
+                f"Can't run command in container {name}: {err!s}"
+            ) from err

         return CommandReturn(code, output)

View File

@@ -1,25 +1,25 @@
 """Core Exceptions."""

-from collections.abc import Callable
+from collections.abc import Callable, Mapping
 from typing import Any

+MESSAGE_CHECK_SUPERVISOR_LOGS = (
+    "Check supervisor logs for details (check with '{logs_command}')"
+)
+EXTRA_FIELDS_LOGS_COMMAND = {"logs_command": "ha supervisor logs"}
+

 class HassioError(Exception):
     """Root exception."""

     error_key: str | None = None
     message_template: str | None = None
+    extra_fields: dict[str, Any] | None = None

     def __init__(
-        self,
-        message: str | None = None,
-        logger: Callable[..., None] | None = None,
-        *,
-        extra_fields: dict[str, Any] | None = None,
+        self, message: str | None = None, logger: Callable[..., None] | None = None
     ) -> None:
         """Raise & log."""
-        self.extra_fields = extra_fields or {}
         if not message and self.message_template:
             message = (
                 self.message_template.format(**self.extra_fields)
@@ -41,6 +41,94 @@ class HassioNotSupportedError(HassioError):
     """Function is not supported."""


+# API
+class APIError(HassioError, RuntimeError):
+    """API errors."""
+
+    status = 400
+    headers: Mapping[str, str] | None = None
+
+    def __init__(
+        self,
+        message: str | None = None,
+        logger: Callable[..., None] | None = None,
+        *,
+        headers: Mapping[str, str] | None = None,
+        job_id: str | None = None,
+    ) -> None:
+        """Raise & log, optionally with job."""
+        super().__init__(message, logger)
+        self.headers = headers
+        self.job_id = job_id
+
+
+class APIUnauthorized(APIError):
+    """API unauthorized error."""
+
+    status = 401
+
+
+class APIForbidden(APIError):
+    """API forbidden error."""
+
+    status = 403
+
+
+class APINotFound(APIError):
+    """API not found error."""
+
+    status = 404
+
+
+class APIGone(APIError):
+    """API is no longer available."""
+
+    status = 410
+
+
+class APITooManyRequests(APIError):
+    """API too many requests error."""
+
+    status = 429
+
+
+class APIInternalServerError(APIError):
+    """API internal server error."""
+
+    status = 500
+
+
+class APIAddonNotInstalled(APIError):
+    """Not installed addon requested at addons API."""
+
+
+class APIDBMigrationInProgress(APIError):
+    """Service is unavailable because an offline DB migration is in progress."""
+
+    status = 503
+
+
+class APIUnknownSupervisorError(APIError):
+    """Unknown error occurred within supervisor. Adds the check-logs rider to the message template."""
+
+    status = 500
+
+    def __init__(
+        self,
+        logger: Callable[..., None] | None = None,
+        *,
+        job_id: str | None = None,
+    ) -> None:
+        """Initialize exception."""
+        self.message_template = (
+            f"{self.message_template}. {MESSAGE_CHECK_SUPERVISOR_LOGS}"
+        )
+        self.extra_fields = (self.extra_fields or {}) | EXTRA_FIELDS_LOGS_COMMAND
+        super().__init__(None, logger, job_id=job_id)
+
+
 # JobManager
# JobManager # JobManager
@@ -122,6 +210,13 @@ class SupervisorAppArmorError(SupervisorError):
     """Supervisor AppArmor error."""


+class SupervisorUnknownError(SupervisorError, APIUnknownSupervisorError):
+    """Raise when an unknown error occurs interacting with Supervisor or its container."""
+
+    error_key = "supervisor_unknown_error"
+    message_template = "An unknown error occurred with Supervisor"
+
+
 class SupervisorJobError(SupervisorError, JobException):
     """Raise on job errors."""
@@ -250,6 +345,54 @@ class AddonConfigurationError(AddonsError):
     """Error with add-on configuration."""


+class AddonConfigurationInvalidError(AddonConfigurationError, APIError):
+    """Raise if invalid configuration provided for addon."""
+
+    error_key = "addon_configuration_invalid_error"
+    message_template = "Add-on {addon} has invalid options: {validation_error}"
+
+    def __init__(
+        self,
+        logger: Callable[..., None] | None = None,
+        *,
+        addon: str,
+        validation_error: str,
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {"addon": addon, "validation_error": validation_error}
+        super().__init__(None, logger)
+
+
+class AddonBootConfigCannotChangeError(AddonsError, APIError):
+    """Raise if user attempts to change addon boot config when it can't be changed."""
+
+    error_key = "addon_boot_config_cannot_change_error"
+    message_template = (
+        "Addon {addon} boot option is set to {boot_config} so it cannot be changed"
+    )
+
+    def __init__(
+        self, logger: Callable[..., None] | None = None, *, addon: str, boot_config: str
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {"addon": addon, "boot_config": boot_config}
+        super().__init__(None, logger)
+
+
+class AddonNotRunningError(AddonsError, APIError):
+    """Raise when an addon is not running."""
+
+    error_key = "addon_not_running_error"
+    message_template = "Add-on {addon} is not running"
+
+    def __init__(
+        self, logger: Callable[..., None] | None = None, *, addon: str
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {"addon": addon}
+        super().__init__(None, logger)
+
+
 class AddonNotSupportedError(HassioNotSupportedError):
     """Addon doesn't support a function."""
@@ -268,11 +411,8 @@ class AddonNotSupportedArchitectureError(AddonNotSupportedError):
         architectures: list[str],
     ) -> None:
         """Initialize exception."""
-        super().__init__(
-            None,
-            logger,
-            extra_fields={"slug": slug, "architectures": ", ".join(architectures)},
-        )
+        self.extra_fields = {"slug": slug, "architectures": ", ".join(architectures)}
+        super().__init__(None, logger)


 class AddonNotSupportedMachineTypeError(AddonNotSupportedError):
@@ -289,11 +429,8 @@ class AddonNotSupportedMachineTypeError(AddonNotSupportedError):
         machine_types: list[str],
     ) -> None:
         """Initialize exception."""
-        super().__init__(
-            None,
-            logger,
-            extra_fields={"slug": slug, "machine_types": ", ".join(machine_types)},
-        )
+        self.extra_fields = {"slug": slug, "machine_types": ", ".join(machine_types)}
+        super().__init__(None, logger)


 class AddonNotSupportedHomeAssistantVersionError(AddonNotSupportedError):
@@ -310,12 +447,97 @@ class AddonNotSupportedHomeAssistantVersionError(AddonNotSupportedError):
         version: str,
     ) -> None:
         """Initialize exception."""
-        super().__init__(
-            None,
-            logger,
-            extra_fields={"slug": slug, "version": version},
-        )
+        self.extra_fields = {"slug": slug, "version": version}
+        super().__init__(None, logger)
+
+
+class AddonNotSupportedWriteStdinError(AddonNotSupportedError, APIError):
+    """Addon does not support writing to stdin."""
+
+    error_key = "addon_not_supported_write_stdin_error"
+    message_template = "Add-on {addon} does not support writing to stdin"
+
+    def __init__(
+        self, logger: Callable[..., None] | None = None, *, addon: str
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {"addon": addon}
+        super().__init__(None, logger)
+
+
+class AddonBuildDockerfileMissingError(AddonNotSupportedError, APIError):
+    """Raise when addon build invalid because dockerfile is missing."""
+
+    error_key = "addon_build_dockerfile_missing_error"
+    message_template = (
+        "Cannot build addon '{addon}' because dockerfile is missing. A repair "
+        "using '{repair_command}' will fix this if the cause is data "
+        "corruption. Otherwise please report this to the addon developer."
+    )
+
+    def __init__(
+        self, logger: Callable[..., None] | None = None, *, addon: str
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {"addon": addon, "repair_command": "ha supervisor repair"}
+        super().__init__(None, logger)
+
+
+class AddonBuildArchitectureNotSupportedError(AddonNotSupportedError, APIError):
+    """Raise when addon cannot be built on system because it doesn't support its architecture."""
+
+    error_key = "addon_build_architecture_not_supported_error"
+    message_template = (
+        "Cannot build addon '{addon}' because its supported architectures "
+        "({addon_arches}) do not match the system supported architectures ({system_arches})"
+    )
+
+    def __init__(
+        self,
+        logger: Callable[..., None] | None = None,
+        *,
+        addon: str,
+        addon_arch_list: list[str],
+        system_arch_list: list[str],
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {
+            "addon": addon,
+            "addon_arches": ", ".join(addon_arch_list),
+            "system_arches": ", ".join(system_arch_list),
+        }
+        super().__init__(None, logger)
+
+
+class AddonUnknownError(AddonsError, APIUnknownSupervisorError):
+    """Raise when unknown error occurs taking an action for an addon."""
+
+    error_key = "addon_unknown_error"
+    message_template = "An unknown error occurred with addon {addon}"
+
+    def __init__(
+        self, logger: Callable[..., None] | None = None, *, addon: str
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {"addon": addon}
+        super().__init__(logger)
+
+
+class AddonBuildFailedUnknownError(AddonsError, APIUnknownSupervisorError):
+    """Raise when the build failed for an addon due to an unknown error."""
+
+    error_key = "addon_build_failed_unknown_error"
+    message_template = (
+        "An unknown error occurred while trying to build the image for addon {addon}"
+    )
+
+    def __init__(
+        self, logger: Callable[..., None] | None = None, *, addon: str
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {"addon": addon}
+        super().__init__(logger)
+
+
 class AddonsJobError(AddonsError, JobException):
     """Raise on job errors."""
@@ -346,13 +568,68 @@ class AuthError(HassioError):
     """Auth errors."""


-class AuthPasswordResetError(HassioError):
+# This one uses the check logs rider even though it's not a 500 error because it
+# is bad practice to return error specifics from a password reset API.
+class AuthPasswordResetError(AuthError, APIError):
     """Auth error if password reset failed."""

+    error_key = "auth_password_reset_error"
+    message_template = (
+        f"Unable to reset password for '{{user}}'. {MESSAGE_CHECK_SUPERVISOR_LOGS}"
+    )
+
+    def __init__(
+        self,
+        logger: Callable[..., None] | None = None,
+        *,
+        user: str,
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {"user": user} | EXTRA_FIELDS_LOGS_COMMAND
+        super().__init__(None, logger)
+

-class AuthListUsersError(HassioError):
+class AuthListUsersError(AuthError, APIUnknownSupervisorError):
     """Auth error if listing users failed."""

+    error_key = "auth_list_users_error"
+    message_template = "Can't request listing users on Home Assistant"
+
+
+class AuthListUsersNoneResponseError(AuthError, APIInternalServerError):
+    """Auth error if listing users returned invalid None response."""
+
+    error_key = "auth_list_users_none_response_error"
+    message_template = "Home Assistant returned invalid response of `{none}` instead of a list of users. Check Home Assistant logs for details (check with `{logs_command}`)"
+    extra_fields = {"none": "None", "logs_command": "ha core logs"}
+
+    def __init__(self, logger: Callable[..., None] | None = None) -> None:
+        """Initialize exception."""
+        super().__init__(None, logger)
+
+
+class AuthInvalidNonStringValueError(AuthError, APIUnauthorized):
+    """Auth error if something besides a string provided as username or password."""
+
+    error_key = "auth_invalid_non_string_value_error"
+    message_template = "Username and password must be strings"
+
+    def __init__(
+        self,
+        logger: Callable[..., None] | None = None,
+        *,
+        headers: Mapping[str, str] | None = None,
+    ) -> None:
+        """Initialize exception."""
+        super().__init__(None, logger, headers=headers)
+
+
+class AuthHomeAssistantAPIValidationError(AuthError, APIUnknownSupervisorError):
+    """Error encountered trying to validate auth details via Home Assistant API."""
+
+    error_key = "auth_home_assistant_api_validation_error"
+    message_template = "Unable to validate authentication details with Home Assistant"
+

 # Host
@@ -385,60 +662,6 @@ class HostLogError(HostError):
     """Internal error with host log."""


-# API
-class APIError(HassioError, RuntimeError):
-    """API errors."""
-
-    status = 400
-
-    def __init__(
-        self,
-        message: str | None = None,
-        logger: Callable[..., None] | None = None,
-        *,
-        job_id: str | None = None,
-        error: HassioError | None = None,
-    ) -> None:
-        """Raise & log, optionally with job."""
-        # Allow these to be set from another error here since APIErrors essentially wrap others to add a status
-        self.error_key = error.error_key if error else None
-        self.message_template = error.message_template if error else None
-        super().__init__(
-            message, logger, extra_fields=error.extra_fields if error else None
-        )
-        self.job_id = job_id
-
-
-class APIForbidden(APIError):
-    """API forbidden error."""
-
-    status = 403
-
-
-class APINotFound(APIError):
-    """API not found error."""
-
-    status = 404
-
-
-class APIGone(APIError):
-    """API is no longer available."""
-
-    status = 410
-
-
-class APIAddonNotInstalled(APIError):
-    """Not installed addon requested at addons API."""
-
-
-class APIDBMigrationInProgress(APIError):
-    """Service is unavailable due to an offline DB migration is in progress."""
-
-    status = 503
-
-
 # Service / Discovery
@@ -616,6 +839,10 @@ class DockerError(HassioError):
     """Docker API/Transport errors."""


+class DockerBuildError(DockerError):
+    """Docker error during build."""
+
+
 class DockerAPIError(DockerError):
     """Docker API error."""
@@ -647,7 +874,7 @@ class DockerNoSpaceOnDevice(DockerError):
         super().__init__(None, logger=logger)


-class DockerHubRateLimitExceeded(DockerError):
+class DockerHubRateLimitExceeded(DockerError, APITooManyRequests):
     """Raise for docker hub rate limit exceeded error."""

     error_key = "dockerhub_rate_limit_exceeded"
@@ -655,16 +882,13 @@ class DockerHubRateLimitExceeded(DockerError):
         "Your IP address has made too many requests to Docker Hub which activated a rate limit. "
         "For more details see {dockerhub_rate_limit_url}"
     )
+    extra_fields = {
+        "dockerhub_rate_limit_url": "https://www.home-assistant.io/more-info/dockerhub-rate-limit"
+    }

     def __init__(self, logger: Callable[..., None] | None = None) -> None:
         """Raise & log."""
-        super().__init__(
-            None,
-            logger=logger,
-            extra_fields={
-                "dockerhub_rate_limit_url": "https://www.home-assistant.io/more-info/dockerhub-rate-limit"
-            },
-        )
+        super().__init__(None, logger=logger)


 class DockerJobError(DockerError, JobException):
@@ -735,6 +959,20 @@ class StoreNotFound(StoreError):
     """Raise if slug is not known."""


+class StoreAddonNotFoundError(StoreError, APINotFound):
+    """Raise if a requested addon is not in the store."""
+
+    error_key = "store_addon_not_found_error"
+    message_template = "Addon {addon} does not exist in the store"
+
+    def __init__(
+        self, logger: Callable[..., None] | None = None, *, addon: str
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {"addon": addon}
+        super().__init__(None, logger)
+
+
 class StoreJobError(StoreError, JobException):
     """Raise on job error with git."""
"""Raise on job error with git.""" """Raise on job error with git."""
@@ -770,7 +1008,7 @@ class BackupJobError(BackupError, JobException):
     """Raise on Backup job error."""


-class BackupFileNotFoundError(BackupError):
+class BackupFileNotFoundError(BackupError, APINotFound):
     """Raise if the backup file hasn't been found."""
@@ -782,6 +1020,55 @@ class BackupFileExistError(BackupError):
     """Raise if the backup file already exists."""


+class AddonBackupMetadataInvalidError(BackupError, APIError):
+    """Raise if invalid metadata file provided for addon in backup."""
+
+    error_key = "addon_backup_metadata_invalid_error"
+    message_template = (
+        "Metadata file for add-on {addon} in backup is invalid: {validation_error}"
+    )
+
+    def __init__(
+        self,
+        logger: Callable[..., None] | None = None,
+        *,
+        addon: str,
+        validation_error: str,
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {"addon": addon, "validation_error": validation_error}
+        super().__init__(None, logger)
+
+
+class AddonPrePostBackupCommandReturnedError(BackupError, APIError):
+    """Raise when addon's pre/post backup command returns an error."""
+
+    error_key = "addon_pre_post_backup_command_returned_error"
+    message_template = (
+        "Pre-/Post backup command for add-on {addon} returned error code: "
+        "{exit_code}. Please report this to the addon developer. Enable debug "
+        "logging to capture complete command output using {debug_logging_command}"
+    )
+
+    def __init__(
+        self, logger: Callable[..., None] | None = None, *, addon: str, exit_code: int
+    ) -> None:
+        """Initialize exception."""
+        self.extra_fields = {
+            "addon": addon,
+            "exit_code": exit_code,
+            "debug_logging_command": "ha supervisor options --logging debug",
+        }
+        super().__init__(None, logger)
+
+
+class BackupRestoreUnknownError(BackupError, APIUnknownSupervisorError):
+    """Raise when an unknown error occurs during backup or restore."""
+
+    error_key = "backup_restore_unknown_error"
+    message_template = "An unknown error occurred during backup/restore"
+
+
 # Security

View File

@@ -102,13 +102,17 @@ class SupervisorJobError:
         "Unknown error, see Supervisor logs (check with 'ha supervisor logs')"
     )
     stage: str | None = None
+    error_key: str | None = None
+    extra_fields: dict[str, Any] | None = None

-    def as_dict(self) -> dict[str, str | None]:
+    def as_dict(self) -> dict[str, Any]:
         """Return dictionary representation."""
         return {
             "type": self.type_.__name__,
             "message": self.message,
             "stage": self.stage,
+            "error_key": self.error_key,
+            "extra_fields": self.extra_fields,
         }
@@ -158,7 +162,9 @@ class SupervisorJob:
     def capture_error(self, err: HassioError | None = None) -> None:
         """Capture an error or record that an unknown error has occurred."""
         if err:
-            new_error = SupervisorJobError(type(err), str(err), self.stage)
+            new_error = SupervisorJobError(
+                type(err), str(err), self.stage, err.error_key, err.extra_fields
+            )
         else:
             new_error = SupervisorJobError(stage=self.stage)
         self.errors += [new_error]

View File

@@ -34,6 +34,7 @@ class JobCondition(StrEnum):
     PLUGINS_UPDATED = "plugins_updated"
     RUNNING = "running"
     SUPERVISOR_UPDATED = "supervisor_updated"
+    ARCHITECTURE_SUPPORTED = "architecture_supported"


 class JobConcurrency(StrEnum):

View File

@@ -441,6 +441,14 @@ class Job(CoreSysAttributes):
                     raise JobConditionException(
                         f"'{method_name}' blocked from execution, supervisor needs to be updated first"
                     )

+                if (
+                    JobCondition.ARCHITECTURE_SUPPORTED in used_conditions
+                    and UnsupportedReason.SYSTEM_ARCHITECTURE
+                    in coresys.sys_resolution.unsupported
+                ):
+                    raise JobConditionException(
+                        f"'{method_name}' blocked from execution, unsupported system architecture"
+                    )

                 if JobCondition.PLUGINS_UPDATED in used_conditions and (
                     out_of_date := [
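The condition gate added in this hunk can be reduced to a small standalone check — a sketch under assumed plain-string condition names, not the actual Job decorator machinery:

```python
# Hypothetical simplification of the ARCHITECTURE_SUPPORTED job condition:
# a job is rejected when it requires a supported architecture but the
# resolution subsystem has flagged the system architecture as unsupported.
class JobConditionException(Exception):
    """Raised when a job condition blocks execution."""


def check_architecture_condition(
    method_name: str, used_conditions: set[str], unsupported: set[str]
) -> None:
    """Raise if the job needs a supported architecture and the system lacks one."""
    if (
        "architecture_supported" in used_conditions
        and "system_architecture" in unsupported
    ):
        raise JobConditionException(
            f"'{method_name}' blocked from execution, unsupported system architecture"
        )


# A job that does not declare the condition is unaffected:
check_architecture_condition("updater_fetch_data", set(), {"system_architecture"})
```

This mirrors how the other conditions in this method work: each one inspects system state and raises `JobConditionException` before the wrapped method runs.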

View File

@@ -161,6 +161,7 @@ class Tasks(CoreSysAttributes):
                 JobCondition.INTERNET_HOST,
                 JobCondition.OS_SUPPORTED,
                 JobCondition.RUNNING,
+                JobCondition.ARCHITECTURE_SUPPORTED,
             ],
             concurrency=JobConcurrency.REJECT,
         )

View File

@@ -23,4 +23,5 @@ PLUGIN_UPDATE_CONDITIONS = [
     JobCondition.HEALTHY,
     JobCondition.INTERNET_HOST,
     JobCondition.SUPERVISOR_UPDATED,
+    JobCondition.ARCHITECTURE_SUPPORTED,
 ]

View File

@@ -58,6 +58,7 @@ class UnsupportedReason(StrEnum):
     SYSTEMD_JOURNAL = "systemd_journal"
     SYSTEMD_RESOLVED = "systemd_resolved"
     VIRTUALIZATION_IMAGE = "virtualization_image"
+    SYSTEM_ARCHITECTURE = "system_architecture"


 class UnhealthyReason(StrEnum):

View File

@@ -0,0 +1,38 @@
+"""Evaluation class for system architecture support."""
+
+from ...const import CoreState
+from ...coresys import CoreSys
+from ..const import UnsupportedReason
+from .base import EvaluateBase
+
+
+def setup(coresys: CoreSys) -> EvaluateBase:
+    """Initialize evaluation-setup function."""
+    return EvaluateSystemArchitecture(coresys)
+
+
+class EvaluateSystemArchitecture(EvaluateBase):
+    """Evaluate if the current Supervisor architecture is supported."""
+
+    @property
+    def reason(self) -> UnsupportedReason:
+        """Return an UnsupportedReason enum."""
+        return UnsupportedReason.SYSTEM_ARCHITECTURE
+
+    @property
+    def on_failure(self) -> str:
+        """Return a string that is printed when self.evaluate is True."""
+        return "System architecture is no longer supported. Move to a supported system architecture."
+
+    @property
+    def states(self) -> list[CoreState]:
+        """Return a list of valid states when this evaluation can run."""
+        return [CoreState.INITIALIZE]
+
+    async def evaluate(self):
+        """Run evaluation."""
+        return self.sys_host.info.sys_arch.supervisor in {
+            "i386",
+            "armhf",
+            "armv7",
+        }
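The core of this new evaluation is a simple set-membership test on the Supervisor architecture. A standalone sketch (hypothetical helper name, assuming the three architectures deprecated in this PR):

```python
# Simplified form of the deprecation check above: i386, armhf and armv7
# Supervisor architectures are now flagged as unsupported.
DEPRECATED_ARCHITECTURES = {"i386", "armhf", "armv7"}


def is_architecture_deprecated(supervisor_arch: str) -> bool:
    """Return True when the given Supervisor architecture is deprecated."""
    return supervisor_arch in DEPRECATED_ARCHITECTURES


print(is_architecture_deprecated("armv7"))    # True
print(is_architecture_deprecated("aarch64"))  # False
```

When the evaluation returns True, the resolution subsystem records `UnsupportedReason.SYSTEM_ARCHITECTURE`, which in turn trips the new `ARCHITECTURE_SUPPORTED` job condition.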

View File

@@ -28,8 +28,8 @@ from .exceptions import (
     DockerError,
     HostAppArmorError,
     SupervisorAppArmorError,
-    SupervisorError,
     SupervisorJobError,
+    SupervisorUnknownError,
     SupervisorUpdateError,
 )
 from .jobs.const import JobCondition, JobThrottle
@@ -261,7 +261,7 @@ class Supervisor(CoreSysAttributes):
         try:
             return await self.instance.stats()
         except DockerError as err:
-            raise SupervisorError() from err
+            raise SupervisorUnknownError() from err

     async def repair(self):
         """Repair local Supervisor data."""

View File

@@ -242,9 +242,10 @@ class Updater(FileConfiguration, CoreSysAttributes):
     @Job(
         name="updater_fetch_data",
         conditions=[
+            JobCondition.ARCHITECTURE_SUPPORTED,
             JobCondition.INTERNET_SYSTEM,
-            JobCondition.OS_SUPPORTED,
             JobCondition.HOME_ASSISTANT_CORE_SUPPORTED,
+            JobCondition.OS_SUPPORTED,
         ],
         on_condition=UpdaterJobError,
         throttle_period=timedelta(seconds=30),

View File

@@ -5,12 +5,20 @@ from collections.abc import AsyncGenerator
 from datetime import UTC, datetime
 from functools import wraps
 import json
+import re

 from aiohttp import ClientResponse

 from supervisor.exceptions import MalformedBinaryEntryError
 from supervisor.host.const import LogFormatter

+_RE_ANSI_CSI_COLORS_PATTERN = re.compile(r"\x1B\[[0-9;]*m")
+
+
+def _strip_ansi_colors(message: str) -> str:
+    """Remove ANSI color codes from a message string."""
+    return _RE_ANSI_CSI_COLORS_PATTERN.sub("", message)
+

 def formatter(required_fields: list[str]):
     """Decorate journal entry formatters with list of required fields.
@@ -31,9 +39,9 @@ def formatter(required_fields: list[str]):
 @formatter(["MESSAGE"])
-def journal_plain_formatter(entries: dict[str, str]) -> str:
+def journal_plain_formatter(entries: dict[str, str], no_colors: bool = False) -> str:
     """Format parsed journal entries as a plain message."""
-    return entries["MESSAGE"]
+    return _strip_ansi_colors(entries["MESSAGE"]) if no_colors else entries["MESSAGE"]


 @formatter(
@@ -45,7 +53,7 @@ def journal_plain_formatter(entries: dict[str, str]) -> str:
         "MESSAGE",
     ]
 )
-def journal_verbose_formatter(entries: dict[str, str]) -> str:
+def journal_verbose_formatter(entries: dict[str, str], no_colors: bool = False) -> str:
     """Format parsed journal entries to a journalctl-like format."""
     ts = datetime.fromtimestamp(
         int(entries["__REALTIME_TIMESTAMP"]) / 1e6, UTC
@@ -58,14 +66,24 @@ def journal_verbose_formatter(entries: dict[str, str]) -> str:
         else entries.get("SYSLOG_IDENTIFIER", "_UNKNOWN_")
     )

-    return f"{ts} {entries.get('_HOSTNAME', '')} {identifier}: {entries.get('MESSAGE', '')}"
+    message = (
+        _strip_ansi_colors(entries.get("MESSAGE", ""))
+        if no_colors
+        else entries.get("MESSAGE", "")
+    )
+    return f"{ts} {entries.get('_HOSTNAME', '')} {identifier}: {message}"


 async def journal_logs_reader(
-    journal_logs: ClientResponse, log_formatter: LogFormatter = LogFormatter.PLAIN
+    journal_logs: ClientResponse,
+    log_formatter: LogFormatter = LogFormatter.PLAIN,
+    no_colors: bool = False,
 ) -> AsyncGenerator[tuple[str | None, str]]:
     """Read logs from systemd journal line by line, formatted using the given formatter.

+    Optionally strip ANSI color codes from the entries' messages.
+
     Returns a generator of (cursor, formatted_entry) tuples.
     """
     match log_formatter:
@@ -84,7 +102,10 @@ async def journal_logs_reader(
             # at EOF (likely race between at_eof and EOF check in readuntil)
             if line == b"\n" or not line:
                 if entries:
-                    yield entries.get("__CURSOR"), formatter_(entries)
+                    yield (
+                        entries.get("__CURSOR"),
+                        formatter_(entries, no_colors=no_colors),
+                    )
                 entries = {}
                 continue

View File

@@ -5,6 +5,7 @@ from datetime import timedelta
 import errno
 from http import HTTPStatus
 from pathlib import Path
+from typing import Any
 from unittest.mock import MagicMock, PropertyMock, call, patch

 import aiodocker
@@ -23,7 +24,13 @@ from supervisor.docker.addon import DockerAddon
 from supervisor.docker.const import ContainerState
 from supervisor.docker.manager import CommandReturn, DockerAPI
 from supervisor.docker.monitor import DockerContainerStateEvent
-from supervisor.exceptions import AddonsError, AddonsJobError, AudioUpdateError
+from supervisor.exceptions import (
+    AddonPrePostBackupCommandReturnedError,
+    AddonsJobError,
+    AddonUnknownError,
+    AudioUpdateError,
+    HassioError,
+)
 from supervisor.hardware.helper import HwHelper
 from supervisor.ingress import Ingress
 from supervisor.store.repository import Repository
@@ -502,31 +509,26 @@ async def test_backup_with_pre_post_command(
 @pytest.mark.parametrize(
-    "get_error,exception_on_exec",
+    ("container_get_side_effect", "exec_run_side_effect", "exc_type_raised"),
     [
-        (NotFound("missing"), False),
-        (DockerException(), False),
-        (None, True),
-        (None, False),
+        (NotFound("missing"), [(1, None)], AddonUnknownError),
+        (DockerException(), [(1, None)], AddonUnknownError),
+        (None, DockerException(), AddonUnknownError),
+        (None, [(1, None)], AddonPrePostBackupCommandReturnedError),
     ],
 )
+@pytest.mark.usefixtures("tmp_supervisor_data", "path_extern")
 async def test_backup_with_pre_command_error(
     coresys: CoreSys,
     install_addon_ssh: Addon,
     container: MagicMock,
-    get_error: DockerException | None,
-    exception_on_exec: bool,
-    tmp_supervisor_data,
-    path_extern,
+    container_get_side_effect: DockerException | None,
+    exec_run_side_effect: DockerException | list[tuple[int, Any]],
+    exc_type_raised: type[HassioError],
 ) -> None:
     """Test backing up an addon with error running pre command."""
-    if get_error:
-        coresys.docker.containers.get.side_effect = get_error
-    if exception_on_exec:
-        container.exec_run.side_effect = DockerException()
-    else:
-        container.exec_run.return_value = (1, None)
+    coresys.docker.containers.get.side_effect = container_get_side_effect
+    container.exec_run.side_effect = exec_run_side_effect

     install_addon_ssh.path_data.mkdir()
     await install_addon_ssh.load()
@@ -535,7 +537,7 @@ async def test_backup_with_pre_command_error(
     with (
         patch.object(DockerAddon, "is_running", return_value=True),
         patch.object(Addon, "backup_pre", new=PropertyMock(return_value="backup_pre")),
-        pytest.raises(AddonsError),
+        pytest.raises(exc_type_raised),
     ):
         assert await install_addon_ssh.backup(tarfile) is None
@@ -947,7 +949,7 @@ async def test_addon_load_succeeds_with_docker_errors(
     )
     caplog.clear()
     await install_addon_ssh.load()
-    assert "Invalid build environment" in caplog.text
+    assert "Cannot build addon 'local_ssh' because dockerfile is missing" in caplog.text

     # Image build failure
     caplog.clear()
View File

@@ -3,10 +3,12 @@
 from unittest.mock import PropertyMock, patch

 from awesomeversion import AwesomeVersion
+import pytest

 from supervisor.addons.addon import Addon
 from supervisor.addons.build import AddonBuild
 from supervisor.coresys import CoreSys
+from supervisor.exceptions import AddonBuildDockerfileMissingError

 from tests.common import is_in_list
@@ -102,11 +104,11 @@ async def test_build_valid(coresys: CoreSys, install_addon_ssh: Addon):
type(coresys.arch), "default", new=PropertyMock(return_value="aarch64")
),
):
-assert await build.is_valid()
+assert (await build.is_valid()) is None
async def test_build_invalid(coresys: CoreSys, install_addon_ssh: Addon):
-"""Test platform set in docker args."""
+"""Test build not supported because Dockerfile missing for specified architecture."""
build = await AddonBuild(coresys, install_addon_ssh).load_config()
with (
patch.object(
@@ -115,5 +117,6 @@ async def test_build_invalid(coresys: CoreSys, install_addon_ssh: Addon):
patch.object(
type(coresys.arch), "default", new=PropertyMock(return_value="amd64")
),
+pytest.raises(AddonBuildDockerfileMissingError),
):
-assert not await build.is_valid()
+await build.is_valid()


@@ -1,95 +1 @@
"""Test for API calls."""
from unittest.mock import AsyncMock, MagicMock
from aiohttp.test_utils import TestClient
from supervisor.coresys import CoreSys
from supervisor.host.const import LogFormat
DEFAULT_LOG_RANGE = "entries=:-99:100"
DEFAULT_LOG_RANGE_FOLLOW = "entries=:-99:18446744073709551615"
async def common_test_api_advanced_logs(
path_prefix: str,
syslog_identifier: str,
api_client: TestClient,
journald_logs: MagicMock,
coresys: CoreSys,
os_available: None,
):
"""Template for tests of endpoints using advanced logs."""
resp = await api_client.get(f"{path_prefix}/logs")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier},
range_header=DEFAULT_LOG_RANGE,
accept=LogFormat.JOURNAL,
)
journald_logs.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/follow")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier, "follow": ""},
range_header=DEFAULT_LOG_RANGE_FOLLOW,
accept=LogFormat.JOURNAL,
)
journald_logs.reset_mock()
mock_response = MagicMock()
mock_response.text = AsyncMock(
return_value='{"CONTAINER_LOG_EPOCH": "12345"}\n{"CONTAINER_LOG_EPOCH": "12345"}\n'
)
journald_logs.return_value.__aenter__.return_value = mock_response
resp = await api_client.get(f"{path_prefix}/logs/latest")
assert resp.status == 200
assert journald_logs.call_count == 2
# Check the first call for getting epoch
epoch_call = journald_logs.call_args_list[0]
assert epoch_call[1]["params"] == {"CONTAINER_NAME": syslog_identifier}
assert epoch_call[1]["range_header"] == "entries=:-1:2"
# Check the second call for getting logs with the epoch
logs_call = journald_logs.call_args_list[1]
assert logs_call[1]["params"]["SYSLOG_IDENTIFIER"] == syslog_identifier
assert logs_call[1]["params"]["CONTAINER_LOG_EPOCH"] == "12345"
assert logs_call[1]["range_header"] == "entries=:0:18446744073709551615"
journald_logs.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/boots/0")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier, "_BOOT_ID": "ccc"},
range_header=DEFAULT_LOG_RANGE,
accept=LogFormat.JOURNAL,
)
journald_logs.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/boots/0/follow")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={
"SYSLOG_IDENTIFIER": syslog_identifier,
"_BOOT_ID": "ccc",
"follow": "",
},
range_header=DEFAULT_LOG_RANGE_FOLLOW,
accept=LogFormat.JOURNAL,
)

tests/api/conftest.py

@@ -0,0 +1,149 @@
"""Fixtures for API tests."""
from collections.abc import Awaitable, Callable
from unittest.mock import ANY, AsyncMock, MagicMock
from aiohttp.test_utils import TestClient
import pytest
from supervisor.coresys import CoreSys
from supervisor.host.const import LogFormat, LogFormatter
DEFAULT_LOG_RANGE = "entries=:-99:100"
DEFAULT_LOG_RANGE_FOLLOW = "entries=:-99:18446744073709551615"
async def _common_test_api_advanced_logs(
path_prefix: str,
syslog_identifier: str,
api_client: TestClient,
journald_logs: MagicMock,
coresys: CoreSys,
os_available: None,
journal_logs_reader: MagicMock,
):
"""Template for tests of endpoints using advanced logs."""
resp = await api_client.get(f"{path_prefix}/logs")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier},
range_header=DEFAULT_LOG_RANGE,
accept=LogFormat.JOURNAL,
)
journal_logs_reader.assert_called_with(ANY, LogFormatter.PLAIN, False)
journald_logs.reset_mock()
journal_logs_reader.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs?no_colors")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier},
range_header=DEFAULT_LOG_RANGE,
accept=LogFormat.JOURNAL,
)
journal_logs_reader.assert_called_with(ANY, LogFormatter.PLAIN, True)
journald_logs.reset_mock()
journal_logs_reader.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/follow")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier, "follow": ""},
range_header=DEFAULT_LOG_RANGE_FOLLOW,
accept=LogFormat.JOURNAL,
)
journal_logs_reader.assert_called_with(ANY, LogFormatter.PLAIN, False)
journald_logs.reset_mock()
journal_logs_reader.reset_mock()
mock_response = MagicMock()
mock_response.text = AsyncMock(
return_value='{"CONTAINER_LOG_EPOCH": "12345"}\n{"CONTAINER_LOG_EPOCH": "12345"}\n'
)
journald_logs.return_value.__aenter__.return_value = mock_response
resp = await api_client.get(f"{path_prefix}/logs/latest")
assert resp.status == 200
assert journald_logs.call_count == 2
# Check the first call for getting epoch
epoch_call = journald_logs.call_args_list[0]
assert epoch_call[1]["params"] == {"CONTAINER_NAME": syslog_identifier}
assert epoch_call[1]["range_header"] == "entries=:-1:2"
# Check the second call for getting logs with the epoch
logs_call = journald_logs.call_args_list[1]
assert logs_call[1]["params"]["SYSLOG_IDENTIFIER"] == syslog_identifier
assert logs_call[1]["params"]["CONTAINER_LOG_EPOCH"] == "12345"
assert logs_call[1]["range_header"] == "entries=:0:18446744073709551615"
journal_logs_reader.assert_called_with(ANY, LogFormatter.PLAIN, True)
journald_logs.reset_mock()
journal_logs_reader.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/boots/0")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": syslog_identifier, "_BOOT_ID": "ccc"},
range_header=DEFAULT_LOG_RANGE,
accept=LogFormat.JOURNAL,
)
journald_logs.reset_mock()
resp = await api_client.get(f"{path_prefix}/logs/boots/0/follow")
assert resp.status == 200
assert resp.content_type == "text/plain"
journald_logs.assert_called_once_with(
params={
"SYSLOG_IDENTIFIER": syslog_identifier,
"_BOOT_ID": "ccc",
"follow": "",
},
range_header=DEFAULT_LOG_RANGE_FOLLOW,
accept=LogFormat.JOURNAL,
)
@pytest.fixture
async def advanced_logs_tester(
api_client: TestClient,
journald_logs: MagicMock,
coresys: CoreSys,
os_available,
journal_logs_reader: MagicMock,
) -> Callable[[str, str], Awaitable[None]]:
"""Fixture that returns a function to test advanced logs endpoints.
This allows tests to avoid explicitly passing all the required fixtures.
Usage:
async def test_my_logs(advanced_logs_tester):
await advanced_logs_tester("/path/prefix", "syslog_identifier")
"""
async def test_logs(path_prefix: str, syslog_identifier: str):
await _common_test_api_advanced_logs(
path_prefix,
syslog_identifier,
api_client,
journald_logs,
coresys,
os_available,
journal_logs_reader,
)
return test_logs


@@ -5,6 +5,7 @@ from unittest.mock import MagicMock, PropertyMock, patch
from aiohttp import ClientResponse
from aiohttp.test_utils import TestClient
+from docker.errors import DockerException
import pytest
from supervisor.addons.addon import Addon
@@ -20,7 +21,6 @@ from supervisor.exceptions import HassioError
from supervisor.store.repository import Repository
from ..const import TEST_ADDON_SLUG
-from . import common_test_api_advanced_logs
def _create_test_event(name: str, state: ContainerState) -> DockerContainerStateEvent:
@@ -72,21 +72,11 @@ async def test_addons_info_not_installed(
async def test_api_addon_logs(
-api_client: TestClient,
-journald_logs: MagicMock,
-coresys: CoreSys,
-os_available,
+advanced_logs_tester,
install_addon_ssh: Addon,
):
"""Test addon logs."""
-await common_test_api_advanced_logs(
-"/addons/local_ssh",
-"addon_local_ssh",
-api_client,
-journald_logs,
-coresys,
-os_available,
-)
+await advanced_logs_tester("/addons/local_ssh", "addon_local_ssh")
@@ -482,6 +472,11 @@ async def test_addon_options_boot_mode_manual_only_invalid(
body["message"]
== "Addon local_example boot option is set to manual_only so it cannot be changed"
)
assert body["error_key"] == "addon_boot_config_cannot_change_error"
assert body["extra_fields"] == {
"addon": "local_example",
"boot_config": "manual_only",
}
async def get_message(resp: ClientResponse, json_expected: bool) -> str:
@@ -550,3 +545,131 @@ async def test_addon_not_installed(
resp = await api_client.request(method, url)
assert resp.status == 400
assert await get_message(resp, json_expected) == "Addon is not installed"
async def test_addon_set_options(api_client: TestClient, install_addon_example: Addon):
"""Test setting options for an addon."""
resp = await api_client.post(
"/addons/local_example/options", json={"options": {"message": "test"}}
)
assert resp.status == 200
assert install_addon_example.options == {"message": "test"}
async def test_addon_set_options_error(
api_client: TestClient, install_addon_example: Addon
):
"""Test setting options for an addon."""
resp = await api_client.post(
"/addons/local_example/options", json={"options": {"message": True}}
)
assert resp.status == 400
body = await resp.json()
assert (
body["message"]
== "Add-on local_example has invalid options: not a valid value. Got {'message': True}"
)
assert body["error_key"] == "addon_configuration_invalid_error"
assert body["extra_fields"] == {
"addon": "local_example",
"validation_error": "not a valid value. Got {'message': True}",
}
async def test_addon_start_options_error(
api_client: TestClient,
install_addon_example: Addon,
caplog: pytest.LogCaptureFixture,
):
"""Test error writing options when trying to start addon."""
install_addon_example.options = {"message": "hello"}
# Simulate OS error trying to write the file
with patch("supervisor.utils.json.atomic_write", side_effect=OSError("fail")):
resp = await api_client.post("/addons/local_example/start")
assert resp.status == 500
body = await resp.json()
assert (
body["message"]
== "An unknown error occurred with addon local_example. Check supervisor logs for details (check with 'ha supervisor logs')"
)
assert body["error_key"] == "addon_unknown_error"
assert body["extra_fields"] == {
"addon": "local_example",
"logs_command": "ha supervisor logs",
}
assert "Add-on local_example can't write options" in caplog.text
# Simulate an update with a breaking change for options schema creating failure on start
caplog.clear()
install_addon_example.data["schema"] = {"message": "bool"}
resp = await api_client.post("/addons/local_example/start")
assert resp.status == 400
body = await resp.json()
assert (
body["message"]
== "Add-on local_example has invalid options: expected boolean. Got {'message': 'hello'}"
)
assert body["error_key"] == "addon_configuration_invalid_error"
assert body["extra_fields"] == {
"addon": "local_example",
"validation_error": "expected boolean. Got {'message': 'hello'}",
}
assert (
"Add-on local_example has invalid options: expected boolean. Got {'message': 'hello'}"
in caplog.text
)
@pytest.mark.parametrize(("method", "action"), [("get", "stats"), ("post", "stdin")])
@pytest.mark.usefixtures("install_addon_example")
async def test_addon_not_running_error(
api_client: TestClient, method: str, action: str
):
"""Test addon not running error for endpoints that require that."""
with patch.object(
Addon, "with_stdin", return_value=PropertyMock(return_value=True)
):
resp = await api_client.request(method, f"/addons/local_example/{action}")
assert resp.status == 400
body = await resp.json()
assert body["message"] == "Add-on local_example is not running"
assert body["error_key"] == "addon_not_running_error"
assert body["extra_fields"] == {"addon": "local_example"}
@pytest.mark.usefixtures("install_addon_example")
async def test_addon_write_stdin_not_supported_error(api_client: TestClient):
"""Test error when trying to write stdin to addon that does not support it."""
resp = await api_client.post("/addons/local_example/stdin")
assert resp.status == 400
body = await resp.json()
assert body["message"] == "Add-on local_example does not support writing to stdin"
assert body["error_key"] == "addon_not_supported_write_stdin_error"
assert body["extra_fields"] == {"addon": "local_example"}
@pytest.mark.usefixtures("install_addon_ssh")
async def test_addon_rebuild_fails_error(api_client: TestClient, coresys: CoreSys):
"""Test error when build fails during rebuild for addon."""
coresys.hardware.disk.get_disk_free_space = lambda x: 5000
coresys.docker.containers.run.side_effect = DockerException("fail")
with (
patch.object(CpuArch, "supported", new=PropertyMock(return_value=["aarch64"])),
patch.object(CpuArch, "default", new=PropertyMock(return_value="aarch64")),
patch.object(AddonBuild, "get_docker_args", return_value={}),
):
resp = await api_client.post("/addons/local_ssh/rebuild")
assert resp.status == 500
body = await resp.json()
assert (
body["message"]
== "An unknown error occurred while trying to build the image for addon local_ssh. Check supervisor logs for details (check with 'ha supervisor logs')"
)
assert body["error_key"] == "addon_build_failed_unknown_error"
assert body["extra_fields"] == {
"addon": "local_ssh",
"logs_command": "ha supervisor logs",
}


@@ -1,18 +1,6 @@
"""Test audio api."""
-from unittest.mock import MagicMock
-from aiohttp.test_utils import TestClient
-from supervisor.coresys import CoreSys
-from tests.api import common_test_api_advanced_logs
-async def test_api_audio_logs(
-api_client: TestClient, journald_logs: MagicMock, coresys: CoreSys, os_available
-):
+async def test_api_audio_logs(advanced_logs_tester) -> None:
"""Test audio logs."""
-await common_test_api_advanced_logs(
-"/audio", "hassio_audio", api_client, journald_logs, coresys, os_available
-)
+await advanced_logs_tester("/audio", "hassio_audio")


@@ -6,9 +6,12 @@ from unittest.mock import AsyncMock, MagicMock, patch
from aiohttp.hdrs import WWW_AUTHENTICATE
from aiohttp.test_utils import TestClient
import pytest
from securetar import Any
from supervisor.addons.addon import Addon
from supervisor.coresys import CoreSys
from supervisor.exceptions import HomeAssistantAPIError, HomeAssistantWSError
from supervisor.homeassistant.api import HomeAssistantAPI
from tests.common import MockResponse
from tests.const import TEST_ADDON_SLUG
@@ -100,6 +103,52 @@ async def test_password_reset(
assert "Successful password reset for 'john'" in caplog.text
@pytest.mark.parametrize(
("post_mock", "expected_log"),
[
(
MagicMock(return_value=MockResponse(status=400)),
"The user 'john' is not registered",
),
(
MagicMock(side_effect=HomeAssistantAPIError("fail")),
"Can't request password reset on Home Assistant: fail",
),
],
)
async def test_failed_password_reset(
api_client: TestClient,
coresys: CoreSys,
caplog: pytest.LogCaptureFixture,
websession: MagicMock,
post_mock: MagicMock,
expected_log: str,
):
"""Test failed password reset."""
coresys.homeassistant.api.access_token = "abc123"
# pylint: disable-next=protected-access
coresys.homeassistant.api._access_token_expires = datetime.now(tz=UTC) + timedelta(
days=1
)
websession.post = post_mock
resp = await api_client.post(
"/auth/reset", json={"username": "john", "password": "doe"}
)
assert resp.status == 400
body = await resp.json()
assert (
body["message"]
== "Unable to reset password for 'john'. Check supervisor logs for details (check with 'ha supervisor logs')"
)
assert body["error_key"] == "auth_password_reset_error"
assert body["extra_fields"] == {
"user": "john",
"logs_command": "ha supervisor logs",
}
assert expected_log in caplog.text
async def test_list_users(
api_client: TestClient, coresys: CoreSys, ha_ws_client: AsyncMock
):
@@ -120,6 +169,48 @@ async def test_list_users(
]
@pytest.mark.parametrize(
("send_command_mock", "error_response", "expected_log"),
[
(
AsyncMock(return_value=None),
{
"result": "error",
"message": "Home Assistant returned invalid response of `None` instead of a list of users. Check Home Assistant logs for details (check with `ha core logs`)",
"error_key": "auth_list_users_none_response_error",
"extra_fields": {"none": "None", "logs_command": "ha core logs"},
},
"Home Assistant returned invalid response of `None` instead of a list of users. Check Home Assistant logs for details (check with `ha core logs`)",
),
(
AsyncMock(side_effect=HomeAssistantWSError("fail")),
{
"result": "error",
"message": "Can't request listing users on Home Assistant. Check supervisor logs for details (check with 'ha supervisor logs')",
"error_key": "auth_list_users_error",
"extra_fields": {"logs_command": "ha supervisor logs"},
},
"Can't request listing users on Home Assistant: fail",
),
],
)
async def test_list_users_failure(
api_client: TestClient,
ha_ws_client: AsyncMock,
caplog: pytest.LogCaptureFixture,
send_command_mock: AsyncMock,
error_response: dict[str, Any],
expected_log: str,
):
"""Test failure listing users via API."""
ha_ws_client.async_send_command = send_command_mock
resp = await api_client.get("/auth/list")
assert resp.status == 500
result = await resp.json()
assert result == error_response
assert expected_log in caplog.text
@pytest.mark.parametrize(
("field", "api_client"),
[("username", TEST_ADDON_SLUG), ("user", TEST_ADDON_SLUG)],
@@ -156,6 +247,13 @@ async def test_auth_json_failure_none(
mock_check_login.return_value = True
resp = await api_client.post("/auth", json={"username": user, "password": password})
assert resp.status == 401
assert (
resp.headers["WWW-Authenticate"]
== 'Basic realm="Home Assistant Authentication"'
)
body = await resp.json()
assert body["message"] == "Username and password must be strings"
assert body["error_key"] == "auth_invalid_non_string_value_error"
@pytest.mark.parametrize("api_client", [TEST_ADDON_SLUG], indirect=True)
@@ -267,3 +365,26 @@ async def test_non_addon_token_no_auth_access(api_client: TestClient):
"""Test auth where add-on is not allowed to access auth API."""
resp = await api_client.post("/auth", json={"username": "test", "password": "pass"})
assert resp.status == 403
@pytest.mark.parametrize("api_client", [TEST_ADDON_SLUG], indirect=True)
@pytest.mark.usefixtures("install_addon_ssh")
async def test_auth_backend_login_failure(api_client: TestClient):
"""Test backend login failure on auth."""
with (
patch.object(HomeAssistantAPI, "check_api_state", return_value=True),
patch.object(
HomeAssistantAPI, "make_request", side_effect=HomeAssistantAPIError("fail")
),
):
resp = await api_client.post(
"/auth", json={"username": "test", "password": "pass"}
)
assert resp.status == 500
body = await resp.json()
assert (
body["message"]
== "Unable to validate authentication details with Home Assistant. Check supervisor logs for details (check with 'ha supervisor logs')"
)
assert body["error_key"] == "auth_home_assistant_api_validation_error"
assert body["extra_fields"] == {"logs_command": "ha supervisor logs"}


@@ -17,6 +17,7 @@ from supervisor.const import CoreState
from supervisor.coresys import CoreSys
from supervisor.docker.manager import DockerAPI
from supervisor.exceptions import (
AddonPrePostBackupCommandReturnedError,
AddonsError,
BackupInvalidError,
HomeAssistantBackupError,
@@ -24,6 +25,7 @@ from supervisor.exceptions import (
from supervisor.homeassistant.core import HomeAssistantCore
from supervisor.homeassistant.module import HomeAssistant
from supervisor.homeassistant.websocket import HomeAssistantWebSocket
from supervisor.jobs import SupervisorJob
from supervisor.mounts.mount import Mount
from supervisor.supervisor import Supervisor
@@ -401,6 +403,8 @@ async def test_api_backup_errors(
"type": "BackupError",
"message": str(err),
"stage": None,
"error_key": None,
"extra_fields": None,
}
]
assert job["child_jobs"][2]["name"] == "backup_store_folders"
@@ -437,6 +441,8 @@ async def test_api_backup_errors(
"type": "HomeAssistantBackupError",
"message": "Backup error",
"stage": "home_assistant",
"error_key": None,
"extra_fields": None,
}
]
assert job["child_jobs"][0]["name"] == "backup_store_homeassistant"
@@ -445,6 +451,8 @@ async def test_api_backup_errors(
"type": "HomeAssistantBackupError",
"message": "Backup error",
"stage": None,
"error_key": None,
"extra_fields": None,
}
]
assert len(job["child_jobs"]) == 1
@@ -749,6 +757,8 @@ async def test_backup_to_multiple_locations_error_on_copy(
"type": "BackupError",
"message": "Could not copy backup to .cloud_backup due to: ",
"stage": None,
"error_key": None,
"extra_fields": None,
}
]
@@ -1483,3 +1493,44 @@ async def test_immediate_list_after_missing_file_restore(
result = await resp.json()
assert len(result["data"]["backups"]) == 1
assert result["data"]["backups"][0]["slug"] == "93b462f8"
@pytest.mark.parametrize("command", ["backup_pre", "backup_post"])
@pytest.mark.usefixtures("install_addon_example", "tmp_supervisor_data")
async def test_pre_post_backup_command_error(
api_client: TestClient, coresys: CoreSys, container: MagicMock, command: str
):
"""Test pre/post backup command error."""
await coresys.core.set_state(CoreState.RUNNING)
coresys.hardware.disk.get_disk_free_space = lambda x: 5000
container.status = "running"
container.exec_run.return_value = (1, b"")
with patch.object(Addon, command, return_value=PropertyMock(return_value="test")):
resp = await api_client.post(
"/backups/new/partial", json={"addons": ["local_example"]}
)
assert resp.status == 200
body = await resp.json()
job_id = body["data"]["job_id"]
job: SupervisorJob | None = None
for j in coresys.jobs.jobs:
if j.name == "backup_store_addons" and j.parent_id == job_id:
job = j
break
assert job
assert job.done is True
assert job.errors[0].type_ == AddonPrePostBackupCommandReturnedError
assert job.errors[0].message == (
"Pre-/Post backup command for add-on local_example returned error code: "
"1. Please report this to the addon developer. Enable debug "
"logging to capture complete command output using ha supervisor options --logging debug"
)
assert job.errors[0].error_key == "addon_pre_post_backup_command_returned_error"
assert job.errors[0].extra_fields == {
"addon": "local_example",
"exit_code": 1,
"debug_logging_command": "ha supervisor options --logging debug",
}


@@ -1,13 +1,12 @@
"""Test DNS API."""
-from unittest.mock import MagicMock, patch
+from unittest.mock import patch
from aiohttp.test_utils import TestClient
from supervisor.coresys import CoreSys
from supervisor.dbus.resolved import Resolved
-from tests.api import common_test_api_advanced_logs
from tests.dbus_service_mocks.base import DBusServiceMock
from tests.dbus_service_mocks.resolved import Resolved as ResolvedService
@@ -66,15 +65,6 @@ async def test_options(api_client: TestClient, coresys: CoreSys):
restart.assert_called_once()
-async def test_api_dns_logs(
-api_client: TestClient, journald_logs: MagicMock, coresys: CoreSys, os_available
-):
+async def test_api_dns_logs(advanced_logs_tester):
"""Test dns logs."""
-await common_test_api_advanced_logs(
-"/dns",
-"hassio_dns",
-api_client,
-journald_logs,
-coresys,
-os_available,
-)
+await advanced_logs_tester("/dns", "hassio_dns")


@@ -18,26 +18,18 @@ from supervisor.homeassistant.const import WSEvent
from supervisor.homeassistant.core import HomeAssistantCore
from supervisor.homeassistant.module import HomeAssistant
-from tests.api import common_test_api_advanced_logs
from tests.common import AsyncIterator, load_json_fixture
@pytest.mark.parametrize("legacy_route", [True, False])
async def test_api_core_logs(
-api_client: TestClient,
-journald_logs: MagicMock,
-coresys: CoreSys,
-os_available,
+advanced_logs_tester: AsyncMock,
legacy_route: bool,
):
"""Test core logs."""
-await common_test_api_advanced_logs(
+await advanced_logs_tester(
f"/{'homeassistant' if legacy_route else 'core'}",
"homeassistant",
-api_client,
-journald_logs,
-coresys,
-os_available,
)


@@ -272,7 +272,7 @@ async def test_advaced_logs_query_parameters(
range_header=DEFAULT_RANGE,
accept=LogFormat.JOURNAL,
)
-journal_logs_reader.assert_called_with(ANY, LogFormatter.VERBOSE)
+journal_logs_reader.assert_called_with(ANY, LogFormatter.VERBOSE, False)
journal_logs_reader.reset_mock()
journald_logs.reset_mock()
@@ -290,7 +290,19 @@ async def test_advaced_logs_query_parameters(
range_header="entries=:-52:53",
accept=LogFormat.JOURNAL,
)
-journal_logs_reader.assert_called_with(ANY, LogFormatter.VERBOSE)
+journal_logs_reader.assert_called_with(ANY, LogFormatter.VERBOSE, False)
journal_logs_reader.reset_mock()
journald_logs.reset_mock()
# Check no_colors query parameter
await api_client.get("/host/logs?no_colors")
journald_logs.assert_called_once_with(
params={"SYSLOG_IDENTIFIER": coresys.host.logs.default_identifiers},
range_header=DEFAULT_RANGE,
accept=LogFormat.JOURNAL,
)
journal_logs_reader.assert_called_with(ANY, LogFormatter.VERBOSE, True)
async def test_advanced_logs_boot_id_offset( async def test_advanced_logs_boot_id_offset(
@@ -343,24 +355,24 @@ async def test_advanced_logs_formatters(
"""Test advanced logs formatters varying on Accept header."""
await api_client.get("/host/logs")
-journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE)
+journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE, False)
journal_logs_reader.reset_mock()
headers = {"Accept": "text/x-log"}
await api_client.get("/host/logs", headers=headers)
-journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE)
+journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE, False)
journal_logs_reader.reset_mock()
await api_client.get("/host/logs/identifiers/test")
-journal_logs_reader.assert_called_once_with(ANY, LogFormatter.PLAIN)
+journal_logs_reader.assert_called_once_with(ANY, LogFormatter.PLAIN, False)
journal_logs_reader.reset_mock()
headers = {"Accept": "text/x-log"}
await api_client.get("/host/logs/identifiers/test", headers=headers)
-journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE)
+journal_logs_reader.assert_called_once_with(ANY, LogFormatter.VERBOSE, False)
async def test_advanced_logs_errors(coresys: CoreSys, api_client: TestClient):


@@ -374,6 +374,8 @@ async def test_job_with_error(
"type": "SupervisorError",
"message": "bad",
"stage": "test",
"error_key": None,
"extra_fields": None,
}
],
"child_jobs": [
@@ -391,6 +393,8 @@ async def test_job_with_error(
"type": "SupervisorError", "type": "SupervisorError",
"message": "bad", "message": "bad",
"stage": None, "stage": None,
"error_key": None,
"extra_fields": None,
} }
], ],
"child_jobs": [], "child_jobs": [],

View File

@@ -1,23 +1,6 @@
 """Test multicast api."""
-from unittest.mock import MagicMock
-
-from aiohttp.test_utils import TestClient
-
-from supervisor.coresys import CoreSys
-
-from tests.api import common_test_api_advanced_logs
-
-
-async def test_api_multicast_logs(
-    api_client: TestClient, journald_logs: MagicMock, coresys: CoreSys, os_available
-):
+
+async def test_api_multicast_logs(advanced_logs_tester):
     """Test multicast logs."""
-    await common_test_api_advanced_logs(
-        "/multicast",
-        "hassio_multicast",
-        api_client,
-        journald_logs,
-        coresys,
-        os_available,
-    )
+    await advanced_logs_tester("/multicast", "hassio_multicast")

View File

@@ -4,7 +4,6 @@ import asyncio
 from pathlib import Path
 from unittest.mock import AsyncMock, MagicMock, PropertyMock, patch

-from aiohttp import ClientResponse
 from aiohttp.test_utils import TestClient
 from awesomeversion import AwesomeVersion
 import pytest
@@ -290,14 +289,6 @@ async def test_api_detached_addon_documentation(
     assert result == "Addon local_ssh does not exist in the store"


-async def get_message(resp: ClientResponse, json_expected: bool) -> str:
-    """Get message from response based on response type."""
-    if json_expected:
-        body = await resp.json()
-        return body["message"]
-    return await resp.text()
-
-
 @pytest.mark.parametrize(
     ("method", "url", "json_expected"),
     [
@@ -323,7 +314,13 @@ async def test_store_addon_not_found(
     """Test store addon not found error."""
     resp = await api_client.request(method, url)
     assert resp.status == 404
-    assert await get_message(resp, json_expected) == "Addon bad does not exist"
+    if json_expected:
+        body = await resp.json()
+        assert body["message"] == "Addon bad does not exist in the store"
+        assert body["error_key"] == "store_addon_not_found_error"
+        assert body["extra_fields"] == {"addon": "bad"}
+    else:
+        assert await resp.text() == "Addon bad does not exist in the store"


 @pytest.mark.parametrize(

View File

@@ -7,6 +7,7 @@ from unittest.mock import AsyncMock, MagicMock, PropertyMock, patch
 from aiohttp.test_utils import TestClient
 from awesomeversion import AwesomeVersion
 from blockbuster import BlockingError
+from docker.errors import DockerException
 import pytest

 from supervisor.const import CoreState
@@ -18,7 +19,6 @@ from supervisor.store.repository import Repository
 from supervisor.supervisor import Supervisor
 from supervisor.updater import Updater

-from tests.api import common_test_api_advanced_logs
 from tests.common import AsyncIterator, load_json_fixture
 from tests.dbus_service_mocks.base import DBusServiceMock
 from tests.dbus_service_mocks.os_agent import OSAgent as OSAgentService
@@ -155,18 +155,9 @@ async def test_api_supervisor_options_diagnostics(
     assert coresys.dbus.agent.diagnostics is False


-async def test_api_supervisor_logs(
-    api_client: TestClient, journald_logs: MagicMock, coresys: CoreSys, os_available
-):
+async def test_api_supervisor_logs(advanced_logs_tester):
     """Test supervisor logs."""
-    await common_test_api_advanced_logs(
-        "/supervisor",
-        "hassio_supervisor",
-        api_client,
-        journald_logs,
-        coresys,
-        os_available,
-    )
+    await advanced_logs_tester("/supervisor", "hassio_supervisor")


 async def test_api_supervisor_fallback(
@@ -417,3 +408,37 @@ async def test_api_progress_updates_supervisor_update(
             "done": True,
         },
     ]
+
+
+async def test_api_supervisor_stats(api_client: TestClient, coresys: CoreSys):
+    """Test supervisor stats."""
+    coresys.docker.containers.get.return_value.status = "running"
+    coresys.docker.containers.get.return_value.stats.return_value = load_json_fixture(
+        "container_stats.json"
+    )
+
+    resp = await api_client.get("/supervisor/stats")
+    assert resp.status == 200
+    result = await resp.json()
+    assert result["data"]["cpu_percent"] == 90.0
+    assert result["data"]["memory_usage"] == 59700000
+    assert result["data"]["memory_limit"] == 4000000000
+    assert result["data"]["memory_percent"] == 1.49
+
+
+async def test_supervisor_api_stats_failure(
+    api_client: TestClient, coresys: CoreSys, caplog: pytest.LogCaptureFixture
+):
+    """Test supervisor stats failure."""
+    coresys.docker.containers.get.side_effect = DockerException("fail")
+
+    resp = await api_client.get("/supervisor/stats")
+    assert resp.status == 500
+    body = await resp.json()
+    assert (
+        body["message"]
+        == "An unknown error occurred with Supervisor. Check supervisor logs for details (check with 'ha supervisor logs')"
+    )
+    assert body["error_key"] == "supervisor_unknown_error"
+    assert body["extra_fields"] == {"logs_command": "ha supervisor logs"}
+    assert "Could not inspect container 'hassio_supervisor': fail" in caplog.text

View File

@@ -445,11 +445,6 @@ async def test_install_progress_rounding_does_not_cause_misses(
     ]
     coresys.docker.images.pull.return_value = AsyncIterator(logs)

-    with (
-        patch.object(
-            type(coresys.supervisor), "arch", PropertyMock(return_value="i386")
-        ),
-    ):
     # Schedule job so we can listen for the end. Then we can assert against the WS mock
     event = asyncio.Event()
     job, install_task = coresys.jobs.schedule_job(
@@ -664,3 +659,64 @@ async def test_install_progress_handles_layers_skipping_download(
     assert job.done is True
     assert job.progress == 100
     capture_exception.assert_not_called()
+
+
+async def test_missing_total_handled_gracefully(
+    coresys: CoreSys,
+    test_docker_interface: DockerInterface,
+    ha_ws_client: AsyncMock,
+    capture_exception: Mock,
+):
+    """Test missing 'total' fields in progress details handled gracefully."""
+    coresys.core.set_state(CoreState.RUNNING)
+
+    # Progress details with missing 'total' fields observed in real-world pulls
+    logs = [
+        {
+            "status": "Pulling from home-assistant/odroid-n2-homeassistant",
+            "id": "2025.7.1",
+        },
+        {"status": "Pulling fs layer", "progressDetail": {}, "id": "1e214cd6d7d0"},
+        {
+            "status": "Downloading",
+            "progressDetail": {"current": 436480882},
+            "progress": "[===================================================] 436.5MB/436.5MB",
+            "id": "1e214cd6d7d0",
+        },
+        {"status": "Verifying Checksum", "progressDetail": {}, "id": "1e214cd6d7d0"},
+        {"status": "Download complete", "progressDetail": {}, "id": "1e214cd6d7d0"},
+        {
+            "status": "Extracting",
+            "progressDetail": {"current": 436480882},
+            "progress": "[===================================================] 436.5MB/436.5MB",
+            "id": "1e214cd6d7d0",
+        },
+        {"status": "Pull complete", "progressDetail": {}, "id": "1e214cd6d7d0"},
+        {
+            "status": "Digest: sha256:7d97da645f232f82a768d0a537e452536719d56d484d419836e53dbe3e4ec736"
+        },
+        {
+            "status": "Status: Downloaded newer image for ghcr.io/home-assistant/odroid-n2-homeassistant:2025.7.1"
+        },
+    ]
+    coresys.docker.images.pull.return_value = AsyncIterator(logs)
+
+    # Schedule job so we can listen for the end. Then we can assert against the WS mock
+    event = asyncio.Event()
+    job, install_task = coresys.jobs.schedule_job(
+        test_docker_interface.install,
+        JobSchedulerOptions(),
+        AwesomeVersion("1.2.3"),
+        "test",
+    )
+
+    async def listen_for_job_end(reference: SupervisorJob):
+        if reference.uuid != job.uuid:
+            return
+        event.set()
+
+    coresys.bus.register_event(BusEvent.SUPERVISOR_JOB_END, listen_for_job_end)
+    await install_task
+    await event.wait()
+
+    capture_exception.assert_not_called()

View File

@@ -90,6 +90,49 @@ async def test_logs_coloured(journald_gateway: MagicMock, coresys: CoreSys):
         )


+async def test_logs_no_colors(journald_gateway: MagicMock, coresys: CoreSys):
+    """Test ANSI color codes being stripped when no_colors=True."""
+    journald_gateway.content.feed_data(
+        load_fixture("logs_export_supervisor.txt").encode("utf-8")
+    )
+    journald_gateway.content.feed_eof()
+
+    async with coresys.host.logs.journald_logs() as resp:
+        cursor, line = await anext(journal_logs_reader(resp, no_colors=True))
+        assert (
+            cursor
+            == "s=83fee99ca0c3466db5fc120d52ca7dd8;i=2049389;b=f5a5c442fa6548cf97474d2d57c920b3;m=4263828e8c;t=612dda478b01b;x=9ae12394c9326930"
+        )
+        # Colors should be stripped
+        assert (
+            line == "24-03-04 23:56:56 INFO (MainThread) [__main__] Closing Supervisor"
+        )
+
+
+async def test_logs_verbose_no_colors(journald_gateway: MagicMock, coresys: CoreSys):
+    """Test ANSI color codes being stripped from verbose formatted logs when no_colors=True."""
+    journald_gateway.content.feed_data(
+        load_fixture("logs_export_supervisor.txt").encode("utf-8")
+    )
+    journald_gateway.content.feed_eof()
+
+    async with coresys.host.logs.journald_logs() as resp:
+        cursor, line = await anext(
+            journal_logs_reader(
+                resp, log_formatter=LogFormatter.VERBOSE, no_colors=True
+            )
+        )
+        assert (
+            cursor
+            == "s=83fee99ca0c3466db5fc120d52ca7dd8;i=2049389;b=f5a5c442fa6548cf97474d2d57c920b3;m=4263828e8c;t=612dda478b01b;x=9ae12394c9326930"
+        )
+        # Colors should be stripped in verbose format too
+        assert (
+            line
+            == "2024-03-04 22:56:56.709 ha-hloub hassio_supervisor[466]: 24-03-04 23:56:56 INFO (MainThread) [__main__] Closing Supervisor"
+        )
+
+
 async def test_boot_ids(
     journald_gateway: MagicMock,
     coresys: CoreSys,

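The `no_colors=True` behavior tested above boils down to stripping ANSI escape sequences from each formatted line. A plausible sketch of that stripping — the regex here covers SGR-style CSI sequences, which matches the fixtures in these tests, though the real implementation may use a broader pattern:

```python
import re

# CSI escape sequences of the form ESC [ <params> <final byte>,
# e.g. "\x1b[32m" (green) and "\x1b[0m" (reset).
ANSI_CSI_RE = re.compile(r"\x1b\[[0-9;]*[A-Za-z]")


def strip_ansi(line: str) -> str:
    """Remove ANSI CSI escape sequences from a log line."""
    return ANSI_CSI_RE.sub("", line)


print(strip_ansi("\x1b[32mHello, world!\x1b[0m"))  # Hello, world!
print(strip_ansi("\x1b[31mRed\x1b[0m \x1b[32mGreen\x1b[0m \x1b[34mBlue\x1b[0m"))  # Red Green Blue
```

Stripping after formatting (rather than in the journald query) keeps the raw colored output available for clients that do want colors, which is why the tests toggle the behavior per request.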
View File

@@ -1179,7 +1179,6 @@ async def test_job_scheduled_delay(coresys: CoreSys):

 async def test_job_scheduled_at(coresys: CoreSys):
     """Test job that schedules a job to start at a specified time."""
-    dt = datetime.now()

     class TestClass:
         """Test class."""
@@ -1189,10 +1188,12 @@ async def test_job_scheduled_at(coresys: CoreSys):
             self.coresys = coresys

         @Job(name="test_job_scheduled_at_job_scheduler")
-        async def job_scheduler(self) -> tuple[SupervisorJob, asyncio.TimerHandle]:
+        async def job_scheduler(
+            self, scheduled_time: datetime
+        ) -> tuple[SupervisorJob, asyncio.TimerHandle]:
             """Schedule a job to run at specified time."""
             return self.coresys.jobs.schedule_job(
-                self.job_task, JobSchedulerOptions(start_at=dt + timedelta(seconds=0.1))
+                self.job_task, JobSchedulerOptions(start_at=scheduled_time)
             )

         @Job(name="test_job_scheduled_at_job_task")
@@ -1201,29 +1202,28 @@ async def test_job_scheduled_at(coresys: CoreSys):
             self.coresys.jobs.current.stage = "work"

     test = TestClass(coresys)
-    job_started = asyncio.Event()
-    job_ended = asyncio.Event()
+
+    # Schedule job to run 0.1 seconds from now
+    scheduled_time = datetime.now() + timedelta(seconds=0.1)
+    job, _ = await test.job_scheduler(scheduled_time)
+
+    started = False
+    ended = False

     async def start_listener(evt_job: SupervisorJob):
-        if evt_job.uuid == job.uuid:
-            job_started.set()
+        nonlocal started
+        started = started or evt_job.uuid == job.uuid

     async def end_listener(evt_job: SupervisorJob):
-        if evt_job.uuid == job.uuid:
-            job_ended.set()
+        nonlocal ended
+        ended = ended or evt_job.uuid == job.uuid

-    async with time_machine.travel(dt):
-        job, _ = await test.job_scheduler()
     coresys.bus.register_event(BusEvent.SUPERVISOR_JOB_START, start_listener)
     coresys.bus.register_event(BusEvent.SUPERVISOR_JOB_END, end_listener)

-    # Advance time to exactly when job should start and wait for completion
-    async with time_machine.travel(dt + timedelta(seconds=0.1)):
-        await asyncio.wait_for(
-            asyncio.gather(job_started.wait(), job_ended.wait()), timeout=1.0
-        )
+    await asyncio.sleep(0.2)

+    assert started
+    assert ended
     assert job.done
     assert job.name == "test_job_scheduled_at_job_task"
     assert job.stage == "work"

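The rewritten test above drives a real 0.1-second delay instead of `time_machine`, because the scheduler arms a wall-clock timer on the event loop. What `schedule_job(..., JobSchedulerOptions(start_at=...))` amounts to can be sketched generically — this is an illustrative asyncio pattern, not Supervisor's actual scheduler:

```python
import asyncio
from datetime import datetime, timedelta


# Generic sketch: convert a wall-clock start_at datetime into a relative
# delay on the running event loop, then run the task.
async def run_at(when: datetime, coro_factory):
    """Run coro_factory() once the wall clock reaches `when`."""
    delay = max(0.0, (when - datetime.now()).total_seconds())
    await asyncio.sleep(delay)
    return await coro_factory()


async def job_task() -> str:
    return "work"


result = asyncio.run(run_at(datetime.now() + timedelta(seconds=0.05), job_task))
print(result)  # work
```

Because the delay is computed from `datetime.now()` at scheduling time, freezing the clock with `time_machine` after the timer is armed has no effect on the pending `call_later`/`sleep`, which is why the test switched to a short real sleep.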
View File

@@ -200,6 +200,8 @@ async def test_notify_on_change(coresys: CoreSys, ha_ws_client: AsyncMock):
                     "type": "HassioError",
                     "message": "Unknown error, see Supervisor logs (check with 'ha supervisor logs')",
                     "stage": "test",
+                    "error_key": None,
+                    "extra_fields": None,
                 }
             ],
             "created": ANY,
@@ -228,6 +230,8 @@ async def test_notify_on_change(coresys: CoreSys, ha_ws_client: AsyncMock):
                     "type": "HassioError",
                     "message": "Unknown error, see Supervisor logs (check with 'ha supervisor logs')",
                     "stage": "test",
+                    "error_key": None,
+                    "extra_fields": None,
                 }
             ],
             "created": ANY,

View File

@@ -0,0 +1,43 @@
+"""Test evaluation supported system architectures."""
+
+from unittest.mock import PropertyMock, patch
+
+import pytest
+
+from supervisor.const import CoreState
+from supervisor.coresys import CoreSys
+from supervisor.resolution.evaluations.system_architecture import (
+    EvaluateSystemArchitecture,
+)
+
+
+@pytest.mark.parametrize("arch", ["i386", "armhf", "armv7"])
+async def test_evaluation_unsupported_architectures(
+    coresys: CoreSys,
+    arch: str,
+):
+    """Test evaluation of unsupported system architectures."""
+    system_architecture = EvaluateSystemArchitecture(coresys)
+    await coresys.core.set_state(CoreState.INITIALIZE)
+
+    with patch.object(
+        type(coresys.supervisor), "arch", PropertyMock(return_value=arch)
+    ):
+        await system_architecture()
+    assert system_architecture.reason in coresys.resolution.unsupported
+
+
+@pytest.mark.parametrize("arch", ["amd64", "aarch64"])
+async def test_evaluation_supported_architectures(
+    coresys: CoreSys,
+    arch: str,
+):
+    """Test evaluation of supported system architectures."""
+    system_architecture = EvaluateSystemArchitecture(coresys)
+    await coresys.core.set_state(CoreState.INITIALIZE)
+
+    with patch.object(
+        type(coresys.supervisor), "arch", PropertyMock(return_value=arch)
+    ):
+        await system_architecture()
+    assert system_architecture.reason not in coresys.resolution.unsupported

View File

@@ -86,6 +86,22 @@ def test_format_verbose_newlines():
     )


+def test_format_verbose_colors():
+    """Test verbose formatter with ANSI colors in message."""
+    fields = {
+        "__REALTIME_TIMESTAMP": "1379403171000000",
+        "_HOSTNAME": "homeassistant",
+        "SYSLOG_IDENTIFIER": "python",
+        "_PID": "666",
+        "MESSAGE": "\x1b[32mHello, world!\x1b[0m",
+    }
+
+    assert (
+        journal_verbose_formatter(fields)
+        == "2013-09-17 07:32:51.000 homeassistant python[666]: \x1b[32mHello, world!\x1b[0m"
+    )
+
+
 async def test_parsing_simple():
     """Test plain formatter."""
     journal_logs, stream = _journal_logs_mock()
@@ -297,3 +313,54 @@ async def test_parsing_non_utf8_in_binary_message():
     )
     _, line = await anext(journal_logs_reader(journal_logs))
     assert line == "Hello, \ufffd world!"
+
+
+def test_format_plain_no_colors():
+    """Test plain formatter strips ANSI color codes when no_colors=True."""
+    fields = {"MESSAGE": "\x1b[32mHello, world!\x1b[0m"}
+    assert journal_plain_formatter(fields, no_colors=True) == "Hello, world!"
+
+
+def test_format_verbose_no_colors():
+    """Test verbose formatter strips ANSI color codes when no_colors=True."""
+    fields = {
+        "__REALTIME_TIMESTAMP": "1379403171000000",
+        "_HOSTNAME": "homeassistant",
+        "SYSLOG_IDENTIFIER": "python",
+        "_PID": "666",
+        "MESSAGE": "\x1b[32mHello, world!\x1b[0m",
+    }
+
+    assert (
+        journal_verbose_formatter(fields, no_colors=True)
+        == "2013-09-17 07:32:51.000 homeassistant python[666]: Hello, world!"
+    )
+
+
+async def test_parsing_colored_logs_verbose_no_colors():
+    """Test verbose formatter strips colors from colored logs."""
+    journal_logs, stream = _journal_logs_mock()
+    stream.feed_data(
+        b"__REALTIME_TIMESTAMP=1379403171000000\n"
+        b"_HOSTNAME=homeassistant\n"
+        b"SYSLOG_IDENTIFIER=python\n"
+        b"_PID=666\n"
+        b"MESSAGE\n\x0e\x00\x00\x00\x00\x00\x00\x00\x1b[31mERROR\x1b[0m\n"
+        b"AFTER=after\n\n"
+    )
+    _, line = await anext(
+        journal_logs_reader(
+            journal_logs, log_formatter=LogFormatter.VERBOSE, no_colors=True
+        )
+    )
+    assert line == "2013-09-17 07:32:51.000 homeassistant python[666]: ERROR"
+
+
+async def test_parsing_multiple_color_codes():
+    """Test stripping multiple ANSI color codes in single message."""
+    journal_logs, stream = _journal_logs_mock()
+    stream.feed_data(
+        b"MESSAGE\n\x29\x00\x00\x00\x00\x00\x00\x00\x1b[31mRed\x1b[0m \x1b[32mGreen\x1b[0m \x1b[34mBlue\x1b[0m\n"
+        b"AFTER=after\n\n"
+    )
+    _, line = await anext(journal_logs_reader(journal_logs, no_colors=True))
+    assert line == "Red Green Blue"