Compare commits


83 Commits
179 ... 191

Author SHA1 Message Date
Pascal Vizeli
8cdc769ec8 Merge pull request #1343 from home-assistant/dev
Release 191
2019-10-23 15:59:42 +02:00
Pascal Vizeli
76e1304241 Downgrade aiohttp to 3.6.1 to fix lost connections (#1342) 2019-10-23 15:58:54 +02:00
Pascal Vizeli
eb9b1ff03d Bump version 191 2019-10-22 15:04:04 +02:00
Pascal Vizeli
b3b12d35fd Merge pull request #1341 from home-assistant/dev
Release 190
2019-10-22 14:57:25 +02:00
Pascal Vizeli
74485262e7 Prune network/interface on repair (#1340)
* Prune network/interface on repair

* Force disconnect
2019-10-22 14:30:14 +02:00
Pascal Vizeli
615e68b29b Add discovery support for Almond (#1339)
* Add discovery support for Almond

* Fix docstring
2019-10-22 13:39:46 +02:00
dependabot-preview[bot]
927b4695c9 Bump gitpython from 3.0.3 to 3.0.4 (#1338)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.3 to 3.0.4.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.3...3.0.4)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-22 13:31:14 +02:00
Pascal Vizeli
11811701d0 Add snapshot_exclude option (#1337)
* Add snapshot tar filter

* Add filter to add-on

* Fix bug

* Fix
2019-10-21 14:48:24 +02:00
Pascal Vizeli
05c8022db3 Check path on extractall (#1336)
* Check path on extractall

* code cleanup

* Add logger

* Fix issue

* Add tests
2019-10-21 12:23:00 +02:00
dependabot-preview[bot]
a9ebb147c5 Bump cryptography from 2.7 to 2.8 (#1332)
Bumps [cryptography](https://github.com/pyca/cryptography) from 2.7 to 2.8.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/2.7...2.8)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-18 14:54:27 +02:00
dependabot-preview[bot]
ba8ca4d9ee Bump pylint from 2.4.2 to 2.4.3 (#1334)
Bumps [pylint](https://github.com/PyCQA/pylint) from 2.4.2 to 2.4.3.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.4.2...pylint-2.4.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-18 14:53:35 +02:00
dependabot-preview[bot]
3574df1385 Bump attrs from 19.1.0 to 19.3.0 (#1329)
* Bump attrs from 19.1.0 to 19.3.0

Bumps [attrs](https://github.com/python-attrs/attrs) from 19.1.0 to 19.3.0.
- [Release notes](https://github.com/python-attrs/attrs/releases)
- [Changelog](https://github.com/python-attrs/attrs/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/python-attrs/attrs/compare/19.1.0...19.3.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

* Fix attr Deprecations
2019-10-17 16:48:06 +02:00
dependabot-preview[bot]
b4497d231b Bump pytz from 2019.2 to 2019.3 (#1323)
Bumps [pytz](https://github.com/stub42/pytz) from 2019.2 to 2019.3.
- [Release notes](https://github.com/stub42/pytz/releases)
- [Commits](https://github.com/stub42/pytz/compare/release_2019.2...release_2019.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-14 11:40:50 +02:00
dependabot-preview[bot]
5aa9b0245a Bump pytest from 5.2.0 to 5.2.1 (#1324)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.2.0 to 5.2.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.0...5.2.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-14 11:33:59 +02:00
dependabot-preview[bot]
4c72c3aafc Bump aiohttp from 3.6.1 to 3.6.2 (#1325)
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.6.1 to 3.6.2.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.6.1...v3.6.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-14 11:32:45 +02:00
dependabot-preview[bot]
bf4f40f991 Bump docker from 4.0.2 to 4.1.0 (#1321)
Bumps [docker](https://github.com/docker/docker-py) from 4.0.2 to 4.1.0.
- [Release notes](https://github.com/docker/docker-py/releases)
- [Commits](https://github.com/docker/docker-py/compare/4.0.2...4.1.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-14 11:31:03 +02:00
Timmo
603334f4f3 Add support for Home Panel discovery (#1327) 2019-10-14 11:30:18 +02:00
dependabot-preview[bot]
46548af165 Bump gitpython from 3.0.2 to 3.0.3 (#1319)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.2 to 3.0.3.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.2...3.0.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-03 13:06:20 +02:00
dependabot-preview[bot]
8ef32b40c8 Bump pylint from 2.4.1 to 2.4.2 (#1314)
Bumps [pylint](https://github.com/PyCQA/pylint) from 2.4.1 to 2.4.2.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.4.1...pylint-2.4.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-30 22:21:00 +02:00
dependabot-preview[bot]
fb25377087 Bump pytest from 5.1.3 to 5.2.0 (#1315)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.1.3 to 5.2.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.1.3...5.2.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-30 22:15:12 +02:00
Pascal Vizeli
a75fd2d07e Update devcontainer.json 2019-09-30 11:01:59 +02:00
Pascal Vizeli
e30f39e97e Update devcontainer.json 2019-09-30 11:01:35 +02:00
dependabot-preview[bot]
4818ad7465 Bump pylint from 2.4.0 to 2.4.1 (#1308)
Bumps [pylint](https://github.com/PyCQA/pylint) from 2.4.0 to 2.4.1.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.4.0...pylint-2.4.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-25 18:08:33 +02:00
dependabot-preview[bot]
5e4e9740c7 Bump pylint from 2.3.1 to 2.4.0 (#1307)
* Bump pylint from 2.3.1 to 2.4.0

Bumps [pylint](https://github.com/PyCQA/pylint) from 2.3.1 to 2.4.0.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.3.1...pylint-2.4.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

* Update __main__.py

* Update bootstrap.py

* Update homeassistant.py

* Update __init__.py
2019-09-25 09:41:16 +02:00
Pascal Vizeli
d4e41dbf80 Bump version 190 2019-09-24 15:25:28 +02:00
Pascal Vizeli
cea1a1a15f Merge pull request #1306 from home-assistant/dev
Release 189
2019-09-24 15:24:27 +02:00
dependabot-preview[bot]
c2700b14dc Bump packaging from 19.1 to 19.2 (#1305)
Bumps [packaging](https://github.com/pypa/packaging) from 19.1 to 19.2.
- [Release notes](https://github.com/pypa/packaging/releases)
- [Changelog](https://github.com/pypa/packaging/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pypa/packaging/compare/19.1...19.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-24 10:31:56 +02:00
dependabot-preview[bot]
07d27170db Bump pytest from 5.1.2 to 5.1.3 (#1303)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.1.2 to 5.1.3.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.1.2...5.1.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-24 10:31:13 +02:00
Pascal Vizeli
8eb8c07df6 Update uvloop 0.13.0 (#1302) 2019-09-23 23:00:57 +02:00
Pascal Vizeli
7bee6f884c Update aiohttp 3.6.1 (#1301) 2019-09-23 22:45:02 +02:00
Franck Nijhof
78dd20e314 Fixes accidental string concatenation in classifiers list (#1300) 2019-09-23 12:23:57 +02:00
Pascal Vizeli
2a011b6448 Fix typo to validate list (#1298)
* Fix typo to validate list

* Fix lint

* Add Typo
2019-09-20 17:28:23 +02:00
Pascal Vizeli
5c90370ec8 Bump version 189 2019-09-15 15:08:12 +02:00
Pascal Vizeli
120465b88d Merge pull request #1294 from home-assistant/dev
Release 188
2019-09-15 15:07:39 +02:00
Pascal Vizeli
c77292439a Fix invalid secrets (#1293)
* Fix invalid secrets format

* Fix style
2019-09-15 15:06:22 +02:00
Pascal Vizeli
0a0209f81a Bump version 188 2019-09-12 23:32:20 +02:00
Pascal Vizeli
69a7ed8a5c Merge pull request #1291 from home-assistant/dev
Release 187
2019-09-12 23:30:53 +02:00
Pascal Vizeli
8df35ab488 Fix detection of HA container / image (#1290) 2019-09-12 23:28:55 +02:00
Pascal Vizeli
a12567d0a8 Update secrets handling (#1289)
* Update secrets handling

* Remove start pre_check

* fix lint

* remove tasker
2019-09-12 23:16:56 +02:00
Pascal Vizeli
64fe190119 Bump version 187 2019-09-11 18:29:24 +02:00
Pascal Vizeli
e3ede66943 Merge pull request #1287 from home-assistant/dev
Release 186
2019-09-11 18:26:22 +02:00
Pascal Vizeli
2672b800d4 DNS fallback to docker internal one (#1286)
* DNS fallback to docker internal one

* Fix log

* Fix style

* Fix startup handling
2019-09-11 17:54:16 +02:00
Pascal Vizeli
c60d4bda92 Check supervisor docker permission (#1285)
* Check supervisor docker permission

* Update log message
2019-09-11 17:47:49 +02:00
Pascal Vizeli
db9d0f2639 Fix lint (#1284) 2019-09-11 16:37:49 +02:00
Pascal Vizeli
02d4045ec3 Add secrets support for options (#1283)
* Add secrets API

* Don't expose secrets
2019-09-11 16:29:34 +02:00
Pascal Vizeli
a308ea6927 Update Dockerfile 2019-09-05 14:20:35 +02:00
Pascal Vizeli
edc5e5e812 Update Dockerfile 2019-09-05 12:41:42 +02:00
dependabot-preview[bot]
23b65cb479 Bump pytest from 5.1.1 to 5.1.2 (#1278)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.1.1 to 5.1.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.1.1...5.1.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-02 23:07:11 +02:00
Pascal Vizeli
e5eabd2143 Fix typing warning / hardware (#1276) 2019-09-02 15:54:37 +02:00
Pascal Vizeli
b0dd043975 Fix typing warning / hardware (#1277) 2019-09-02 15:45:31 +02:00
Pascal Vizeli
435a1096ed Cleanup debug gdbus output (#1275) 2019-09-02 15:08:26 +02:00
Pascal Vizeli
21a9084ca0 Bump version 186 2019-09-02 14:39:56 +02:00
Pascal Vizeli
10d9135d86 Merge pull request #1274 from home-assistant/dev
Release 185
2019-09-02 14:39:17 +02:00
Pascal Vizeli
272d8b29f3 Fix version handling with nightly (#1273)
* Fix version handling with nightly

* fix lint
2019-09-02 14:37:59 +02:00
Pascal Vizeli
3d665b9eec Support for udev device trigger (#1272) 2019-09-02 14:07:09 +02:00
Pascal Vizeli
c563f484c9 Add support for udev trigger 2019-09-02 11:28:49 +00:00
Pascal Vizeli
38268ea4ea Bump version to 185 2019-08-26 10:04:36 +02:00
Pascal Vizeli
c1ad64cddf Merge pull request #1264 from home-assistant/dev
Release 184
2019-08-26 10:03:53 +02:00
Pascal Vizeli
b898cd2a3a Preserve ordering of locals (#1263)
* Preserve ordering of locals

* fix lint
2019-08-26 09:45:10 +02:00
Pascal Vizeli
937b31d845 Bump version 184 2019-08-23 14:22:47 +02:00
Pascal Vizeli
e4e655493b Merge pull request #1259 from home-assistant/dev
Release 183
2019-08-23 14:21:58 +02:00
Pascal Vizeli
387d2dcc2e Add support for gvariant annotations (#1258) 2019-08-23 13:41:17 +02:00
Pascal Vizeli
8abe33d48a Bump version 183 2019-08-22 18:58:02 +02:00
Pascal Vizeli
860442d5c4 Merge pull request #1256 from home-assistant/dev
Release 182
2019-08-22 18:57:38 +02:00
Pascal Vizeli
ce5183ce16 Add support to read Host DNS (#1255)
* Add support to read Host DNS

* Include properties

* Improve host info handling

* Add API

* Better abstraction

* Change prio list

* Address lint

* fix get properties

* Fix nameserver list

* Small cleanups

* Bit more stability

* cleanup
2019-08-22 18:01:49 +02:00
dependabot-preview[bot]
3e69b04b86 Bump gitpython from 3.0.1 to 3.0.2 (#1254)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.1 to 3.0.2.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.1...3.0.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-22 16:20:47 +02:00
Pascal Vizeli
8b9cd4f122 Improve handling with nested objects (#1253) 2019-08-22 14:24:50 +02:00
Pascal Vizeli
c0e3ccdb83 Improve gdbus error handling (#1252)
* Improve gdbus error handling

* Fix logging type

* Detect no dbus

* Fix issue with complex

* Update hassio/dbus/__init__.py

Co-Authored-By: Franck Nijhof <frenck@frenck.nl>

* Update hassio/dbus/hostname.py

Co-Authored-By: Franck Nijhof <frenck@frenck.nl>

* Update hassio/dbus/rauc.py

Co-Authored-By: Franck Nijhof <frenck@frenck.nl>

* Update hassio/dbus/systemd.py

Co-Authored-By: Franck Nijhof <frenck@frenck.nl>

* Fix black
2019-08-22 12:48:02 +02:00
dependabot-preview[bot]
e8cc85c487 Bump pytest from 5.1.0 to 5.1.1 (#1250)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.1.0 to 5.1.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.1.0...5.1.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-21 23:41:16 +02:00
Pascal Vizeli
b3eff41692 Update devcontainer.json 2019-08-20 17:51:27 +02:00
Pascal Vizeli
1ea63f185c Bump version 182 2019-08-18 21:08:01 +02:00
Pascal Vizeli
a513d5c09a Merge pull request #1247 from home-assistant/dev
Release 181
2019-08-18 21:07:21 +02:00
Pascal Vizeli
fb8216c102 Fix AAAA resolv for nginx (#1246) 2019-08-18 21:05:42 +02:00
Pascal Vizeli
4f381d01df Log coredns errors (#1245) 2019-08-18 17:21:46 +02:00
Pascal Vizeli
de3382226e Update setting on startup (#1244)
* Update setting on startup

* Fix

* fix exception

* Cleanup handling
2019-08-18 17:11:42 +02:00
Pascal Vizeli
77be830b72 Bump version 181 2019-08-18 12:00:31 +02:00
Pascal Vizeli
09c0e1320f Merge pull request #1243 from home-assistant/dev
Release 180
2019-08-18 12:00:01 +02:00
Franck Nijhof
cc4ee59542 Replace Google DNS by Quad9, prefer CloudFlare (#1235) 2019-08-18 11:48:29 +02:00
Pascal Vizeli
1f448744f3 Add restart function / options change (#1242) 2019-08-18 11:46:23 +02:00
Franck Nijhof
ee2c257057 Adjust coredns do not forward local.hass.io (#1237) 2019-08-18 11:08:34 +02:00
Franck Nijhof
be8439d4ac Add localhost to hosts file (#1240) 2019-08-18 11:07:23 +02:00
Franck Nijhof
981f2b193c Adjust coredns to use upstream fowarding server in order (#1238) 2019-08-18 11:06:17 +02:00
Pascal Vizeli
39087e09ce Bump version 180 2019-08-16 20:33:56 +02:00
91 changed files with 1292 additions and 299 deletions

View File

@@ -34,10 +34,10 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
&& rm -rf /var/lib/apt/lists/*
# Install Python dependencies from requirements.txt if it exists
COPY requirements.txt requirements_tests.txt /workspaces/
RUN pip install -r requirements.txt \
&& pip3 install -r requirements_tests.txt \
&& pip install black tox
COPY requirements.txt requirements_tests.txt ./
RUN pip3 install -r requirements.txt -r requirements_tests.txt \
&& pip3 install black tox \
&& rm -f requirements.txt requirements_tests.txt
# Set the default shell to bash instead of sh
ENV SHELL /bin/bash

View File

@@ -6,11 +6,13 @@
"appPort": "9123:8123",
"runArgs": [
"-e",
"GIT_EDITOR='code --wait'",
"GIT_EDITOR=code --wait",
"--privileged"
],
"extensions": [
"ms-python.python"
"ms-python.python",
"visualstudioexptteam.vscodeintellicode",
"esbenp.prettier-vscode"
],
"settings": {
"python.pythonPath": "/usr/local/bin/python",

9
API.md
View File

@@ -350,6 +350,10 @@ Load host configs from a USB stick.
}
```
- POST `/hardware/trigger`
Trigger an udev reload
### Home Assistant
- GET `/homeassistant/info`
@@ -753,7 +757,8 @@ return:
"host": "ip-address",
"version": "1",
"latest_version": "2",
"servers": ["dns://8.8.8.8"]
"servers": ["dns://8.8.8.8"],
"locals": ["dns://xy"]
}
```
@@ -771,6 +776,8 @@ return:
}
```
- POST `/dns/restart`
- GET `/dns/logs`
- GET `/dns/stats`
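
A hedged sketch of calling the endpoints added above from inside an add-on. The http://hassio host, the HASSIO_TOKEN environment variable and the X-Hassio-Key header are assumptions about the add-on environment and are not part of this diff:

    import os
    import requests

    # Assumed add-on environment: Supervisor reachable as http://hassio,
    # authenticated with the HASSIO_TOKEN env var via the X-Hassio-Key header.
    headers = {"X-Hassio-Key": os.environ["HASSIO_TOKEN"]}

    # New endpoint: ask the host to reload udev devices
    requests.post("http://hassio/hardware/trigger", headers=headers)

    # New endpoint: restart the CoreDNS plugin
    requests.post("http://hassio/dns/restart", headers=headers)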

View File

@@ -6,12 +6,13 @@ import sys
from hassio import bootstrap
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
def initialize_event_loop():
"""Attempt to use uvloop."""
try:
# pylint: disable=import-outside-toplevel
import uvloop
uvloop.install()

View File

@@ -19,7 +19,7 @@ from ..store.addon import AddonStore
from .addon import Addon
from .data import AddonsData
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
AnyAddon = Union[Addon, AddonStore]
@@ -285,6 +285,9 @@ class AddonManager(CoreSysAttributes):
for addon in needs_repair:
_LOGGER.info("Start repair for add-on: %s", addon.slug)
await self.sys_run_in_executor(
self.sys_docker.network.stale_cleanup, addon.instance.name
)
with suppress(DockerAPIError, KeyError):
# Need pull a image again
@@ -293,7 +296,7 @@ class AddonManager(CoreSysAttributes):
continue
# Need local lookup
elif addon.need_build and not addon.is_detached:
if addon.need_build and not addon.is_detached:
store = self.store[addon.slug]
# If this add-on is available for rebuild
if addon.version == store.version:

View File

@@ -51,11 +51,12 @@ from ..exceptions import (
)
from ..utils.apparmor import adjust_profile
from ..utils.json import read_json_file, write_json_file
from ..utils.tar import exclude_filter, secure_path
from .model import AddonModel, Data
from .utils import remove_data
from .validate import SCHEMA_ADDON_SNAPSHOT, validate_options
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
RE_WEBUI = re.compile(
r"^(?:(?P<s_prefix>https?)|\[PROTO:(?P<t_proto>\w+)\])"
@@ -345,13 +346,16 @@ class Addon(AddonModel):
"""Save data of add-on."""
self.sys_addons.data.save_data()
def write_options(self):
async def write_options(self):
"""Return True if add-on options is written to data."""
schema = self.schema
options = self.options
# Update secrets for validation
await self.sys_secrets.reload()
try:
schema(options)
options = schema(options)
write_json_file(self.path_options, options)
except vol.Invalid as ex:
_LOGGER.error(
@@ -438,7 +442,9 @@ class Addon(AddonModel):
options = {**self.persist[ATTR_OPTIONS], **default_options}
# create voluptuous
new_schema = vol.Schema(vol.All(dict, validate_options(new_raw_schema)))
new_schema = vol.Schema(
vol.All(dict, validate_options(self.coresys, new_raw_schema))
)
# validate
try:
@@ -465,12 +471,13 @@ class Addon(AddonModel):
self.save_persist()
# Options
self.write_options()
await self.write_options()
# Sound
if self.with_audio:
self.write_asound()
# Start Add-on
try:
await self.instance.run()
except DockerAPIError:
@@ -519,7 +526,7 @@ class Addon(AddonModel):
async def snapshot(self, tar_file: tarfile.TarFile) -> None:
"""Snapshot state of an add-on."""
with TemporaryDirectory(dir=str(self.sys_config.path_tmp)) as temp:
with TemporaryDirectory(dir=self.sys_config.path_tmp) as temp:
# store local image
if self.need_build:
try:
@@ -554,8 +561,15 @@ class Addon(AddonModel):
def _write_tarfile():
"""Write tar inside loop."""
with tar_file as snapshot:
# Snapshot system
snapshot.add(temp, arcname=".")
snapshot.add(self.path_data, arcname="data")
# Snapshot data
snapshot.add(
self.path_data,
arcname="data",
filter=exclude_filter(self.snapshot_exclude),
)
try:
_LOGGER.info("Build snapshot for add-on %s", self.slug)
@@ -568,12 +582,12 @@ class Addon(AddonModel):
async def restore(self, tar_file: tarfile.TarFile) -> None:
"""Restore state of an add-on."""
with TemporaryDirectory(dir=str(self.sys_config.path_tmp)) as temp:
with TemporaryDirectory(dir=self.sys_config.path_tmp) as temp:
# extract snapshot
def _extract_tarfile():
"""Extract tar snapshot."""
with tar_file as snapshot:
snapshot.extractall(path=Path(temp))
snapshot.extractall(path=Path(temp), members=secure_path(snapshot))
try:
await self.sys_run_in_executor(_extract_tarfile)
@@ -634,7 +648,7 @@ class Addon(AddonModel):
# Restore data
def _restore_data():
"""Restore data."""
shutil.copytree(str(Path(temp, "data")), str(self.path_data))
shutil.copytree(Path(temp, "data"), self.path_data)
_LOGGER.info("Restore data for addon %s", self.slug)
if self.path_data.is_dir():
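
The snapshot hunk above now passes filter=exclude_filter(self.snapshot_exclude) when adding the data directory to the tar. exclude_filter itself is imported from hassio/utils/tar.py, which is not shown in this compare; a minimal sketch of what such a tarfile add() filter looks like, with fnmatch-style matching as an assumption:

    import fnmatch
    import tarfile
    from typing import List, Optional

    def exclude_filter(patterns: List[str]):
        """Return a tarfile add() filter that drops members matching any pattern (sketch)."""

        def _filter(member: tarfile.TarInfo) -> Optional[tarfile.TarInfo]:
            for pattern in patterns:
                if fnmatch.fnmatch(member.name, pattern):
                    return None  # member is left out of the snapshot
            return member

        return _filter

The same import also brings in secure_path, used above to guard extractall against path traversal when restoring a snapshot.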

View File

@@ -17,7 +17,7 @@ from ..store.addon import AddonStore
from .addon import Addon
from .validate import SCHEMA_ADDONS_FILE
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
Config = Dict[str, Any]

View File

@@ -1,8 +1,8 @@
"""Init file for Hass.io add-ons."""
from distutils.version import StrictVersion
from pathlib import Path
from typing import Any, Awaitable, Dict, List, Optional
from packaging import version as pkg_version
import voluptuous as vol
from ..const import (
@@ -47,6 +47,7 @@ from ..const import (
ATTR_SCHEMA,
ATTR_SERVICES,
ATTR_SLUG,
ATTR_SNAPSHOT_EXCLUDE,
ATTR_STARTUP,
ATTR_STDIN,
ATTR_TIMEOUT,
@@ -324,6 +325,11 @@ class AddonModel(CoreSysAttributes):
"""Return Hass.io role for API."""
return self.data[ATTR_HASSIO_ROLE]
@property
def snapshot_exclude(self) -> List[str]:
"""Return Exclude list for snapshot."""
return self.data.get(ATTR_SNAPSHOT_EXCLUDE, [])
@property
def with_stdin(self) -> bool:
"""Return True if the add-on access use stdin input."""
@@ -461,7 +467,7 @@ class AddonModel(CoreSysAttributes):
if isinstance(raw_schema, bool):
return vol.Schema(dict)
return vol.Schema(vol.All(dict, validate_options(raw_schema)))
return vol.Schema(vol.All(dict, validate_options(self.coresys, raw_schema)))
def __eq__(self, other):
"""Compaired add-on objects."""
@@ -482,7 +488,9 @@ class AddonModel(CoreSysAttributes):
# Home Assistant
version = config.get(ATTR_HOMEASSISTANT) or self.sys_homeassistant.version
if StrictVersion(self.sys_homeassistant.version) < StrictVersion(version):
if pkg_version.parse(self.sys_homeassistant.version) < pkg_version.parse(
version
):
return False
return True
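
The switch above from distutils.StrictVersion to packaging.version for the minimum Home Assistant version check matters for pre-release and nightly version strings; a quick illustration (the version numbers are examples only):

    from packaging import version as pkg_version

    # distutils.version.StrictVersion("0.100.0.dev20190922") raises ValueError,
    # while packaging understands PEP 440 dev/beta suffixes:
    assert pkg_version.parse("0.100.0.dev20190922") < pkg_version.parse("0.100.0")
    assert pkg_version.parse("0.99.3") < pkg_version.parse("0.100.0b1")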

View File

@@ -22,7 +22,7 @@ from ..const import (
if TYPE_CHECKING:
from .model import AddonModel
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
def rating_security(addon: AddonModel) -> int:

View File

@@ -2,6 +2,7 @@
import logging
import re
import secrets
from typing import Any, Dict
import uuid
import voluptuous as vol
@@ -61,6 +62,7 @@ from ..const import (
ATTR_SCHEMA,
ATTR_SERVICES,
ATTR_SLUG,
ATTR_SNAPSHOT_EXCLUDE,
ATTR_SQUASH,
ATTR_STARTUP,
ATTR_STATE,
@@ -85,6 +87,7 @@ from ..const import (
STATE_STARTED,
STATE_STOPPED,
)
from ..coresys import CoreSys
from ..discovery.validate import valid_discovery_service
from ..validate import (
ALSA_DEVICE,
@@ -95,7 +98,7 @@ from ..validate import (
UUID_MATCH,
)
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
RE_VOLUME = re.compile(r"^(config|ssl|addons|backup|share)(?::(rw|ro))?$")
@@ -109,16 +112,21 @@ V_EMAIL = "email"
V_URL = "url"
V_PORT = "port"
V_MATCH = "match"
V_LIST = "list"
RE_SCHEMA_ELEMENT = re.compile(
r"^(?:"
r"|str|bool|email|url|port"
r"|bool|email|url|port"
r"|str(?:\((?P<s_min>\d+)?,(?P<s_max>\d+)?\))?"
r"|int(?:\((?P<i_min>\d+)?,(?P<i_max>\d+)?\))?"
r"|float(?:\((?P<f_min>[\d\.]+)?,(?P<f_max>[\d\.]+)?\))?"
r"|match\((?P<match>.*)\)"
r"|list\((?P<list>.+)\)"
r")\??$"
)
_SCHEMA_LENGTH_PARTS = ("i_min", "i_max", "f_min", "f_max", "s_min", "s_max")
RE_DOCKER_IMAGE = re.compile(r"^([a-zA-Z\-\.:\d{}]+/)*?([\-\w{}]+)/([\-\w{}]+)$")
RE_DOCKER_IMAGE_BUILD = re.compile(
r"^([a-zA-Z\-\.:\d{}]+/)*?([\-\w{}]+)/([\-\w{}]+)(:[\.\-\w{}]+)?$"
@@ -207,6 +215,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema(
vol.Optional(ATTR_AUTH_API, default=False): vol.Boolean(),
vol.Optional(ATTR_SERVICES): [vol.Match(RE_SERVICE)],
vol.Optional(ATTR_DISCOVERY): [valid_discovery_service],
vol.Optional(ATTR_SNAPSHOT_EXCLUDE): [vol.Coerce(str)],
vol.Required(ATTR_OPTIONS): dict,
vol.Required(ATTR_SCHEMA): vol.Any(
vol.Schema(
@@ -305,7 +314,7 @@ SCHEMA_ADDON_SNAPSHOT = vol.Schema(
)
def validate_options(raw_schema):
def validate_options(coresys: CoreSys, raw_schema: Dict[str, Any]):
"""Validate schema."""
def validate(struct):
@@ -323,13 +332,13 @@ def validate_options(raw_schema):
try:
if isinstance(typ, list):
# nested value list
options[key] = _nested_validate_list(typ[0], value, key)
options[key] = _nested_validate_list(coresys, typ[0], value, key)
elif isinstance(typ, dict):
# nested value dict
options[key] = _nested_validate_dict(typ, value, key)
options[key] = _nested_validate_dict(coresys, typ, value, key)
else:
# normal value
options[key] = _single_validate(typ, value, key)
options[key] = _single_validate(coresys, typ, value, key)
except (IndexError, KeyError):
raise vol.Invalid(f"Type error for {key}") from None
@@ -341,24 +350,31 @@ def validate_options(raw_schema):
# pylint: disable=no-value-for-parameter
# pylint: disable=inconsistent-return-statements
def _single_validate(typ, value, key):
def _single_validate(coresys: CoreSys, typ: str, value: Any, key: str):
"""Validate a single element."""
# if required argument
if value is None:
raise vol.Invalid(f"Missing required option '{key}'")
# Lookup secret
if str(value).startswith("!secret "):
secret: str = value.partition(" ")[2]
value = coresys.secrets.get(secret)
if value is None:
raise vol.Invalid(f"Unknown secret {secret}")
# parse extend data from type
match = RE_SCHEMA_ELEMENT.match(typ)
# prepare range
range_args = {}
for group_name in ("i_min", "i_max", "f_min", "f_max"):
for group_name in _SCHEMA_LENGTH_PARTS:
group_value = match.group(group_name)
if group_value:
range_args[group_name[2:]] = float(group_value)
if typ.startswith(V_STR):
return str(value)
return vol.All(str(value), vol.Range(**range_args))(value)
elif typ.startswith(V_INT):
return vol.All(vol.Coerce(int), vol.Range(**range_args))(value)
elif typ.startswith(V_FLOAT):
@@ -373,26 +389,28 @@ def _single_validate(typ, value, key):
return NETWORK_PORT(value)
elif typ.startswith(V_MATCH):
return vol.Match(match.group("match"))(str(value))
elif typ.startswith(V_LIST):
return vol.In(match.group("list").split("|"))(str(value))
raise vol.Invalid(f"Fatal error for {key} type {typ}")
def _nested_validate_list(typ, data_list, key):
def _nested_validate_list(coresys, typ, data_list, key):
"""Validate nested items."""
options = []
for element in data_list:
# Nested?
if isinstance(typ, dict):
c_options = _nested_validate_dict(typ, element, key)
c_options = _nested_validate_dict(coresys, typ, element, key)
options.append(c_options)
else:
options.append(_single_validate(typ, element, key))
options.append(_single_validate(coresys, typ, element, key))
return options
def _nested_validate_dict(typ, data_dict, key):
def _nested_validate_dict(coresys, typ, data_dict, key):
"""Validate nested items."""
options = {}
@@ -404,9 +422,11 @@ def _nested_validate_dict(typ, data_dict, key):
# Nested?
if isinstance(typ[c_key], list):
options[c_key] = _nested_validate_list(typ[c_key][0], c_value, c_key)
options[c_key] = _nested_validate_list(
coresys, typ[c_key][0], c_value, c_key
)
else:
options[c_key] = _single_validate(typ[c_key], c_value, c_key)
options[c_key] = _single_validate(coresys, typ[c_key], c_value, c_key)
_check_missing_options(typ, options, key)
return options
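
With the hunk above, an add-on option value can reference a stored secret; a minimal sketch of the new lookup in _single_validate, with a plain dict standing in for coresys.secrets:

    import voluptuous as vol

    def resolve_secret(value, secrets):
        """Sketch of the "!secret" handling added to _single_validate."""
        if str(value).startswith("!secret "):
            name = value.partition(" ")[2]
            value = secrets.get(name)
            if value is None:
                raise vol.Invalid(f"Unknown secret {name}")
        return value

    # resolve_secret("!secret mqtt_password", {"mqtt_password": "hunter2"}) -> "hunter2"

The same hunk also introduces a list(a|b) option type, validated with vol.In over the pipe-separated values.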

View File

@@ -22,7 +22,7 @@ from .services import APIServices
from .snapshots import APISnapshots
from .supervisor import APISupervisor
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class RestAPI(CoreSysAttributes):
@@ -101,6 +101,7 @@ class RestAPI(CoreSysAttributes):
[
web.get("/hardware/info", api_hardware.info),
web.get("/hardware/audio", api_hardware.audio),
web.post("/hardware/trigger", api_hardware.trigger),
]
)
@@ -278,6 +279,7 @@ class RestAPI(CoreSysAttributes):
web.get("/dns/logs", api_dns.logs),
web.post("/dns/update", api_dns.update),
web.post("/dns/options", api_dns.options),
web.post("/dns/restart", api_dns.restart),
]
)

View File

@@ -5,7 +5,6 @@ from typing import Any, Awaitable, Dict, List
from aiohttp import web
import voluptuous as vol
from voluptuous.humanize import humanize_error
from ..addons import AnyAddon
from ..docker.stats import DockerStats
@@ -94,7 +93,7 @@ from ..exceptions import APIError
from ..validate import ALSA_DEVICE, DOCKER_PORTS
from .utils import api_process, api_process_raw, api_validate
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
@@ -266,11 +265,16 @@ class APIAddons(CoreSysAttributes):
"""Store user options for add-on."""
addon: AnyAddon = self._extract_addon(request)
# Update secrets for validation
await self.sys_secrets.reload()
# Extend schema with add-on specific validation
addon_schema = SCHEMA_OPTIONS.extend(
{vol.Optional(ATTR_OPTIONS): vol.Any(None, addon.schema)}
)
body: Dict[str, Any] = await api_validate(addon_schema, request)
# Validate/Process Body
body = await api_validate(addon_schema, request, origin=[ATTR_OPTIONS])
if ATTR_OPTIONS in body:
addon.options = body[ATTR_OPTIONS]
if ATTR_BOOT in body:
@@ -334,14 +338,6 @@ class APIAddons(CoreSysAttributes):
def start(self, request: web.Request) -> Awaitable[None]:
"""Start add-on."""
addon: AnyAddon = self._extract_addon(request)
# check options
options = addon.options
try:
addon.schema(options)
except vol.Invalid as ex:
raise APIError(humanize_error(options, ex)) from None
return asyncio.shield(addon.start())
@api_process

View File

@@ -10,7 +10,7 @@ from ..const import REQUEST_FROM, CONTENT_TYPE_JSON, CONTENT_TYPE_URL
from ..coresys import CoreSysAttributes
from ..exceptions import APIForbidden
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class APIAuth(CoreSysAttributes):

View File

@@ -12,9 +12,10 @@ from ..const import (
ATTR_CPU_PERCENT,
ATTR_HOST,
ATTR_LATEST_VERSION,
ATTR_LOCALS,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_USAGE,
ATTR_MEMORY_PERCENT,
ATTR_MEMORY_USAGE,
ATTR_NETWORK_RX,
ATTR_NETWORK_TX,
ATTR_SERVERS,
@@ -26,7 +27,7 @@ from ..exceptions import APIError
from ..validate import DNS_SERVER_LIST
from .utils import api_process, api_process_raw, api_validate
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
# pylint: disable=no-value-for-parameter
SCHEMA_OPTIONS = vol.Schema({vol.Optional(ATTR_SERVERS): DNS_SERVER_LIST})
@@ -45,6 +46,7 @@ class APICoreDNS(CoreSysAttributes):
ATTR_LATEST_VERSION: self.sys_dns.latest_version,
ATTR_HOST: str(self.sys_docker.network.dns),
ATTR_SERVERS: self.sys_dns.servers,
ATTR_LOCALS: self.sys_host.network.dns_servers,
}
@api_process
@@ -54,6 +56,7 @@ class APICoreDNS(CoreSysAttributes):
if ATTR_SERVERS in body:
self.sys_dns.servers = body[ATTR_SERVERS]
self.sys_create_task(self.sys_dns.restart())
self.sys_dns.save_data()
@@ -87,3 +90,8 @@ class APICoreDNS(CoreSysAttributes):
def logs(self, request: web.Request) -> Awaitable[bytes]:
"""Return DNS Docker logs."""
return self.sys_dns.logs()
@api_process
def restart(self, request: web.Request) -> Awaitable[None]:
"""Restart CoreDNS plugin."""
return asyncio.shield(self.sys_dns.restart())

View File

@@ -1,5 +1,9 @@
"""Init file for Hass.io hardware RESTful API."""
import asyncio
import logging
from typing import Any, Dict
from aiohttp import web
from .utils import api_process
from ..const import (
@@ -12,14 +16,14 @@ from ..const import (
)
from ..coresys import CoreSysAttributes
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class APIHardware(CoreSysAttributes):
"""Handle RESTful API for hardware functions."""
@api_process
async def info(self, request):
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Show hardware info."""
return {
ATTR_SERIAL: list(
@@ -32,7 +36,7 @@ class APIHardware(CoreSysAttributes):
}
@api_process
async def audio(self, request):
async def audio(self, request: web.Request) -> Dict[str, Any]:
"""Show ALSA audio devices."""
return {
ATTR_AUDIO: {
@@ -40,3 +44,8 @@ class APIHardware(CoreSysAttributes):
ATTR_OUTPUT: self.sys_host.alsa.output_devices,
}
}
@api_process
def trigger(self, request: web.Request) -> None:
"""Trigger a udev device reload."""
return asyncio.shield(self.sys_hardware.udev_trigger())

View File

@@ -16,7 +16,7 @@ from ..const import (
from ..coresys import CoreSysAttributes
from .utils import api_process, api_validate
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})

View File

@@ -36,7 +36,7 @@ from ..exceptions import APIError
from ..validate import DOCKER_IMAGE, NETWORK_PORT
from .utils import api_process, api_process_raw, api_validate
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
# pylint: disable=no-value-for-parameter
SCHEMA_OPTIONS = vol.Schema(

View File

@@ -20,7 +20,7 @@ from ..const import (
)
from ..coresys import CoreSysAttributes
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
SERVICE = "service"

View File

@@ -19,7 +19,7 @@ from ..const import (
from ..coresys import CoreSysAttributes
from .utils import api_process
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class APIInfo(CoreSysAttributes):

View File

@@ -28,7 +28,7 @@ from ..const import (
from ..coresys import CoreSysAttributes
from .utils import api_process
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class APIIngress(CoreSysAttributes):

View File

@@ -14,7 +14,7 @@ from ..const import HEADER_HA_ACCESS
from ..coresys import CoreSysAttributes
from ..exceptions import HomeAssistantAuthError, HomeAssistantAPIError, APIError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class APIProxy(CoreSysAttributes):

View File

@@ -16,7 +16,7 @@ from ..const import (
)
from ..coresys import CoreSysAttributes
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
# fmt: off
@@ -40,7 +40,9 @@ NO_SECURITY_CHECK = re.compile(
ADDONS_API_BYPASS = re.compile(
r"^(?:"
r"|/addons/self/(?!security|update)[^/]+"
r"|/secrets/.+"
r"|/info"
r"|/hardware/trigger"
r"|/services.*"
r"|/discovery.*"
r"|/auth"

View File

@@ -28,7 +28,7 @@ from ..const import (
from ..coresys import CoreSysAttributes
from ..exceptions import APIError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
# pylint: disable=no-value-for-parameter

View File

@@ -44,7 +44,7 @@ from ..utils.validate import validate_timezone
from ..validate import CHANNELS, LOG_LEVEL, REPOSITORIES, WAIT_BOOT
from .utils import api_process, api_process_raw, api_validate
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
# pylint: disable=no-value-for-parameter
SCHEMA_OPTIONS = vol.Schema(
@@ -161,7 +161,9 @@ class APISupervisor(CoreSysAttributes):
@api_process
def reload(self, request: web.Request) -> Awaitable[None]:
"""Reload add-ons, configuration, etc."""
return asyncio.shield(self.sys_updater.reload())
return asyncio.shield(
asyncio.wait([self.sys_updater.reload(), self.sys_secrets.reload()])
)
@api_process
def repair(self, request: web.Request) -> Awaitable[None]:

View File

@@ -1,25 +1,26 @@
"""Init file for Hass.io util for RESTful API."""
import json
import logging
from typing import Any, Dict, List, Optional
from aiohttp import web
import voluptuous as vol
from voluptuous.humanize import humanize_error
from ..const import (
JSON_RESULT,
CONTENT_TYPE_BINARY,
JSON_DATA,
JSON_MESSAGE,
RESULT_OK,
JSON_RESULT,
RESULT_ERROR,
CONTENT_TYPE_BINARY,
RESULT_OK,
)
from ..exceptions import HassioError, APIError, APIForbidden
from ..exceptions import APIError, APIForbidden, HassioError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
def json_loads(data):
def json_loads(data: Any) -> Dict[str, Any]:
"""Extract json from string with support for '' and None."""
if not data:
return {}
@@ -77,24 +78,34 @@ def api_process_raw(content):
return wrap_method
def api_return_error(message=None):
def api_return_error(message: Optional[str] = None) -> web.Response:
"""Return an API error message."""
return web.json_response(
{JSON_RESULT: RESULT_ERROR, JSON_MESSAGE: message}, status=400
)
def api_return_ok(data=None):
def api_return_ok(data: Optional[Dict[str, Any]] = None) -> web.Response:
"""Return an API ok answer."""
return web.json_response({JSON_RESULT: RESULT_OK, JSON_DATA: data or {}})
async def api_validate(schema, request):
async def api_validate(
schema: vol.Schema, request: web.Request, origin: Optional[List[str]] = None
) -> Dict[str, Any]:
"""Validate request data with schema."""
data = await request.json(loads=json_loads)
data: Dict[str, Any] = await request.json(loads=json_loads)
try:
data = schema(data)
data_validated = schema(data)
except vol.Invalid as ex:
raise APIError(humanize_error(data, ex)) from None
return data
if not origin:
return data_validated
for origin_value in origin:
if origin_value not in data_validated:
continue
data_validated[origin_value] = data[origin_value]
return data_validated
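
The new origin argument lets a handler keep the raw submitted value for selected keys while the body is still validated as a whole; the add-ons options endpoint uses it with origin=[ATTR_OPTIONS], presumably so stored options keep their "!secret ..." references rather than the resolved values. A toy illustration of the semantics:

    import voluptuous as vol

    # Toy schema: coerces the port to int during validation.
    schema = vol.Schema({vol.Optional("options"): {"port": vol.Coerce(int)}})

    raw_body = {"options": {"port": "8123"}}
    validated = schema(raw_body)             # -> {"options": {"port": 8123}}

    # With origin=["options"], api_validate puts the raw value back afterwards:
    for key in ["options"]:
        if key in validated:
            validated[key] = raw_body[key]   # -> {"options": {"port": "8123"}}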

View File

@@ -8,7 +8,7 @@ from .coresys import CoreSys, CoreSysAttributes
from .exceptions import HassioArchNotFound, JsonFileError
from .utils.json import read_json_file
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
ARCH_JSON: Path = Path(__file__).parent.joinpath("data/arch.json")

View File

@@ -8,7 +8,7 @@ from .utils.json import JsonConfig
from .validate import SCHEMA_AUTH_CONFIG
from .exceptions import AuthError, HomeAssistantAPIError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class Auth(JsonConfig, CoreSysAttributes):

View File

@@ -27,9 +27,10 @@ from .store import StoreManager
from .supervisor import Supervisor
from .tasks import Tasks
from .updater import Updater
from .secrets import SecretsManager
from .utils.dt import fetch_timezone
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
ENV_SHARE = "SUPERVISOR_SHARE"
ENV_NAME = "SUPERVISOR_NAME"
@@ -61,6 +62,7 @@ async def initialize_coresys():
coresys.discovery = Discovery(coresys)
coresys.dbus = DBusManager(coresys)
coresys.hassos = HassOS(coresys)
coresys.secrets = SecretsManager(coresys)
# bootstrap config
initialize_system_data(coresys)
@@ -234,6 +236,7 @@ def supervisor_debugger(coresys: CoreSys) -> None:
"""Setup debugger if needed."""
if not coresys.config.debug:
return
# pylint: disable=import-outside-toplevel
import ptvsd
_LOGGER.info("Initialize Hass.io debugger")

View File

@@ -19,7 +19,7 @@ from .utils.dt import parse_datetime
from .utils.json import JsonConfig
from .validate import SCHEMA_HASSIO_CONFIG
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
HOMEASSISTANT_CONFIG = PurePath("homeassistant")

View File

@@ -2,8 +2,8 @@
from pathlib import Path
from ipaddress import ip_network
HASSIO_VERSION = "191"
HASSIO_VERSION = "179"
URL_HASSIO_ADDONS = "https://github.com/home-assistant/hassio-addons"
URL_HASSIO_VERSION = "https://version.home-assistant.io/{channel}.json"
@@ -32,7 +32,7 @@ DOCKER_NETWORK = "hassio"
DOCKER_NETWORK_MASK = ip_network("172.30.32.0/23")
DOCKER_NETWORK_RANGE = ip_network("172.30.33.0/24")
DNS_SERVERS = ["dns://8.8.8.8", "dns://1.1.1.1"]
DNS_SERVERS = ["dns://1.1.1.1", "dns://9.9.9.9"]
DNS_SUFFIX = "local.hass.io"
LABEL_VERSION = "io.hass.version"
@@ -218,7 +218,10 @@ ATTR_DEBUG = "debug"
ATTR_DEBUG_BLOCK = "debug_block"
ATTR_DNS = "dns"
ATTR_SERVERS = "servers"
ATTR_LOCALS = "locals"
ATTR_UDEV = "udev"
ATTR_VALUE = "value"
ATTR_SNAPSHOT_EXCLUDE = "snapshot_exclude"
PROVIDE_SERVICE = "provide"
NEED_SERVICE = "need"

View File

@@ -14,7 +14,7 @@ from .const import (
)
from .exceptions import HassioError, HomeAssistantError, SupervisorUpdateError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class HassIO(CoreSysAttributes):
@@ -30,15 +30,15 @@ class HassIO(CoreSysAttributes):
async def setup(self):
"""Setup HassIO orchestration."""
# Load CoreDNS
await self.sys_dns.load()
# Load DBus
await self.sys_dbus.load()
# Load Host
await self.sys_host.load()
# Load CoreDNS
await self.sys_dns.load()
# Load Home Assistant
await self.sys_homeassistant.load()
@@ -72,6 +72,9 @@ class HassIO(CoreSysAttributes):
# Load ingress
await self.sys_ingress.load()
# Load secrets
await self.sys_secrets.load()
async def start(self):
"""Start Hass.io orchestration."""
await self.sys_api.start()

View File

@@ -24,6 +24,7 @@ if TYPE_CHECKING:
from .homeassistant import HomeAssistant
from .host import HostManager
from .ingress import Ingress
from .secrets import SecretsManager
from .services import ServiceManager
from .snapshots import SnapshotManager
from .supervisor import Supervisor
@@ -70,6 +71,7 @@ class CoreSys:
self._dbus: Optional[DBusManager] = None
self._hassos: Optional[HassOS] = None
self._services: Optional[ServiceManager] = None
self._secrets: Optional[SecretsManager] = None
self._store: Optional[StoreManager] = None
self._discovery: Optional[Discovery] = None
@@ -209,6 +211,18 @@ class CoreSys:
raise RuntimeError("Updater already set!")
self._updater = value
@property
def secrets(self) -> SecretsManager:
"""Return SecretsManager object."""
return self._secrets
@secrets.setter
def secrets(self, value: SecretsManager):
"""Set a Updater object."""
if self._secrets:
raise RuntimeError("SecretsManager already set!")
self._secrets = value
@property
def addons(self) -> AddonManager:
"""Return AddonManager object."""
@@ -437,6 +451,11 @@ class CoreSysAttributes:
"""Return Updater object."""
return self.coresys.updater
@property
def sys_secrets(self) -> SecretsManager:
"""Return SecretsManager object."""
return self.coresys.secrets
@property
def sys_addons(self) -> AddonManager:
"""Return AddonManager object."""

View File

@@ -1,9 +1,15 @@
.:53 {
log
errors
hosts /config/hosts {
fallthrough
}
template ANY AAAA local.hass.io hassio {
rcode NOERROR
}
forward . $servers {
except local.hass.io
policy sequential
health_check 10s
}
}
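
The $servers placeholder above is filled in by the Supervisor using string.Template, as shown in the _write_corefile hunk further down in this compare; a minimal sketch of that substitution (the template path and server list here are illustrative):

    from pathlib import Path
    from string import Template

    corefile_tmpl = Template(Path("hassio/data/coredns.tmpl").read_text())

    # Prio 1: host-local servers, Prio 2: user configured, Prio 3: fallback defaults
    dns_servers = ["dns://192.168.1.1", "dns://1.1.1.1", "dns://9.9.9.9"]

    corefile = corefile_tmpl.safe_substitute(servers=" ".join(dns_servers))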

View File

@@ -1,39 +1,57 @@
"""D-Bus interface objects."""
import logging
from .systemd import Systemd
from .hostname import Hostname
from .rauc import Rauc
from ..coresys import CoreSysAttributes
from .nmi_dns import NMIDnsManager
from ..coresys import CoreSysAttributes, CoreSys
from ..exceptions import DBusNotConnectedError
_LOGGER: logging.Logger = logging.getLogger(__name__)
class DBusManager(CoreSysAttributes):
"""A DBus Interface handler."""
def __init__(self, coresys):
def __init__(self, coresys: CoreSys) -> None:
"""Initialize D-Bus interface."""
self.coresys = coresys
self.coresys: CoreSys = coresys
self._systemd = Systemd()
self._hostname = Hostname()
self._rauc = Rauc()
self._systemd: Systemd = Systemd()
self._hostname: Hostname = Hostname()
self._rauc: Rauc = Rauc()
self._nmi_dns: NMIDnsManager = NMIDnsManager()
@property
def systemd(self):
def systemd(self) -> Systemd:
"""Return the systemd interface."""
return self._systemd
@property
def hostname(self):
def hostname(self) -> Hostname:
"""Return the hostname interface."""
return self._hostname
@property
def rauc(self):
def rauc(self) -> Rauc:
"""Return the rauc interface."""
return self._rauc
async def load(self):
@property
def nmi_dns(self) -> NMIDnsManager:
"""Return NetworkManager DNS interface."""
return self._nmi_dns
async def load(self) -> None:
"""Connect interfaces to D-Bus."""
await self.systemd.connect()
await self.hostname.connect()
await self.rauc.connect()
try:
await self.systemd.connect()
await self.hostname.connect()
await self.rauc.connect()
await self.nmi_dns.connect()
except DBusNotConnectedError:
_LOGGER.error(
"No DBus support from Host. Disabled any kind of host control!"
)

View File

@@ -1,12 +1,13 @@
"""D-Bus interface for hostname."""
import logging
from typing import Optional
from .interface import DBusInterface
from .utils import dbus_connected
from ..exceptions import DBusError
from ..exceptions import DBusError, DBusInterfaceError
from ..utils.gdbus import DBus
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
DBUS_NAME = "org.freedesktop.hostname1"
DBUS_OBJECT = "/org/freedesktop/hostname1"
@@ -15,12 +16,55 @@ DBUS_OBJECT = "/org/freedesktop/hostname1"
class Hostname(DBusInterface):
"""Handle D-Bus interface for hostname/system."""
def __init__(self):
"""Initialize Properties."""
self._hostname: Optional[str] = None
self._chassis: Optional[str] = None
self._deployment: Optional[str] = None
self._kernel: Optional[str] = None
self._operating_system: Optional[str] = None
self._cpe: Optional[str] = None
async def connect(self):
"""Connect to system's D-Bus."""
try:
self.dbus = await DBus.connect(DBUS_NAME, DBUS_OBJECT)
except DBusError:
_LOGGER.warning("Can't connect to hostname")
except DBusInterfaceError:
_LOGGER.warning(
"No hostname support on the host. Hostname functions have been disabled."
)
@property
def hostname(self) -> Optional[str]:
"""Return local hostname."""
return self._hostname
@property
def chassis(self) -> Optional[str]:
"""Return local chassis type."""
return self._chassis
@property
def deployment(self) -> Optional[str]:
"""Return local deployment type."""
return self._deployment
@property
def kernel(self) -> Optional[str]:
"""Return local kernel version."""
return self._kernel
@property
def operating_system(self) -> Optional[str]:
"""Return local operating system."""
return self._operating_system
@property
def cpe(self) -> Optional[str]:
"""Return local CPE."""
return self._cpe
@dbus_connected
def set_static_hostname(self, hostname):
@@ -31,9 +75,16 @@ class Hostname(DBusInterface):
return self.dbus.SetStaticHostname(hostname, False)
@dbus_connected
def get_properties(self):
"""Return local host informations.
async def update(self):
"""Update Properties."""
data = await self.dbus.get_properties(DBUS_NAME)
if not data:
_LOGGER.warning("Can't get properties for Hostname")
return
Return a coroutine.
"""
return self.dbus.get_properties(DBUS_NAME)
self._hostname = data.get("StaticHostname")
self._chassis = data.get("Chassis")
self._deployment = data.get("Deployment")
self._kernel = data.get("KernelRelease")
self._operating_system = data.get("OperatingSystemPrettyName")
self._cpe = data.get("OperatingSystemCPEName")

View File

@@ -1,12 +1,13 @@
"""Interface class for D-Bus wrappers."""
from typing import Optional
from ..utils.gdbus import DBus
class DBusInterface:
"""Handle D-Bus interface for hostname/system."""
def __init__(self):
"""Initialize systemd."""
self.dbus = None
dbus: Optional[DBus] = None
@property
def is_connected(self):

85
hassio/dbus/nmi_dns.py Normal file
View File

@@ -0,0 +1,85 @@
"""D-Bus interface for hostname."""
import logging
from typing import Optional, List
import attr
from .interface import DBusInterface
from .utils import dbus_connected
from ..exceptions import DBusError, DBusInterfaceError
from ..utils.gdbus import DBus
_LOGGER: logging.Logger = logging.getLogger(__name__)
DBUS_NAME = "org.freedesktop.NetworkManager"
DBUS_OBJECT = "/org/freedesktop/NetworkManager/DnsManager"
@attr.s
class DNSConfiguration:
"""NMI DnsManager configuration Object."""
nameservers: List[str] = attr.ib()
domains: List[str] = attr.ib()
interface: str = attr.ib()
priority: int = attr.ib()
vpn: bool = attr.ib()
class NMIDnsManager(DBusInterface):
"""Handle D-Bus interface for NMI DnsManager."""
def __init__(self) -> None:
"""Initialize Properties."""
self._mode: Optional[str] = None
self._rc_manager: Optional[str] = None
self._configuration: List[DNSConfiguration] = []
@property
def mode(self) -> Optional[str]:
"""Return Propertie mode."""
return self._mode
@property
def rc_manager(self) -> Optional[str]:
"""Return Propertie RcManager."""
return self._rc_manager
@property
def configuration(self) -> List[DNSConfiguration]:
"""Return Propertie configuraton."""
return self._configuration
async def connect(self) -> None:
"""Connect to system's D-Bus."""
try:
self.dbus = await DBus.connect(DBUS_NAME, DBUS_OBJECT)
except DBusError:
_LOGGER.warning("Can't connect to DnsManager")
except DBusInterfaceError:
_LOGGER.warning(
"No DnsManager support on the host. Local DNS functions have been disabled."
)
@dbus_connected
async def update(self):
"""Update Properties."""
data = await self.dbus.get_properties(f"{DBUS_NAME}.DnsManager")
if not data:
_LOGGER.warning("Can't get properties for NMI DnsManager")
return
self._mode = data.get("Mode")
self._rc_manager = data.get("RcManager")
# Parse configuraton
self._configuration.clear()
for config in data.get("Configuration", []):
dns = DNSConfiguration(
config.get("nameservers"),
config.get("domains"),
config.get("interface"),
config.get("priority"),
config.get("vpn"),
)
self._configuration.append(dns)
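
A short sketch of reading the new NMIDnsManager through DBusManager, mirroring the connect/update API above (coresys is assumed to be an initialized CoreSys; the Supervisor code that actually consumes this is not shown in this compare):

    from hassio.dbus import DBusManager

    async def show_local_dns(coresys):
        dbus = DBusManager(coresys)
        await dbus.load()              # connects systemd, hostname, rauc and nmi_dns
        await dbus.nmi_dns.update()    # reads Mode, RcManager and Configuration
        for cfg in dbus.nmi_dns.configuration:
            print(cfg.interface, cfg.nameservers, cfg.priority)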

View File

@@ -3,10 +3,10 @@ import logging
from .interface import DBusInterface
from .utils import dbus_connected
from ..exceptions import DBusError
from ..exceptions import DBusError, DBusInterfaceError
from ..utils.gdbus import DBus
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
DBUS_NAME = "de.pengutronix.rauc"
DBUS_OBJECT = "/"
@@ -21,6 +21,8 @@ class Rauc(DBusInterface):
self.dbus = await DBus.connect(DBUS_NAME, DBUS_OBJECT)
except DBusError:
_LOGGER.warning("Can't connect to rauc")
except DBusInterfaceError:
_LOGGER.warning("Host has no rauc support. OTA updates have been disabled.")
@dbus_connected
def install(self, raucb_file):

View File

@@ -3,10 +3,10 @@ import logging
from .interface import DBusInterface
from .utils import dbus_connected
from ..exceptions import DBusError
from ..exceptions import DBusError, DBusInterfaceError
from ..utils.gdbus import DBus
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
DBUS_NAME = "org.freedesktop.systemd1"
DBUS_OBJECT = "/org/freedesktop/systemd1"
@@ -21,6 +21,10 @@ class Systemd(DBusInterface):
self.dbus = await DBus.connect(DBUS_NAME, DBUS_OBJECT)
except DBusError:
_LOGGER.warning("Can't connect to systemd")
except DBusInterfaceError:
_LOGGER.warning(
"No systemd support on the host. Host control has been disabled."
)
@dbus_connected
def reboot(self):

View File

@@ -19,7 +19,7 @@ from .validate import SCHEMA_DISCOVERY_CONFIG, valid_discovery_config
if TYPE_CHECKING:
from ..addons.addon import Addon
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
CMD_NEW = "post"
CMD_DEL = "delete"
@@ -31,8 +31,8 @@ class Message:
addon: str = attr.ib()
service: str = attr.ib()
config: Dict[str, Any] = attr.ib(cmp=False)
uuid: UUID = attr.ib(factory=lambda: uuid4().hex, cmp=False)
config: Dict[str, Any] = attr.ib(eq=False)
uuid: UUID = attr.ib(factory=lambda: uuid4().hex, eq=False)
class Discovery(CoreSysAttributes, JsonConfig):

View File

@@ -0,0 +1,11 @@
"""Discovery service for Almond."""
import voluptuous as vol
from hassio.validate import NETWORK_PORT
from ..const import ATTR_HOST, ATTR_PORT
SCHEMA = vol.Schema(
{vol.Required(ATTR_HOST): vol.Coerce(str), vol.Required(ATTR_PORT): NETWORK_PORT}
)

View File

@@ -0,0 +1,11 @@
"""Discovery service for Home Panel."""
import voluptuous as vol
from hassio.validate import NETWORK_PORT
from ..const import ATTR_HOST, ATTR_PORT
SCHEMA = vol.Schema(
{vol.Required(ATTR_HOST): vol.Coerce(str), vol.Required(ATTR_PORT): NETWORK_PORT}
)

View File

@@ -1,24 +1,25 @@
"""Home Assistant control object."""
import asyncio
import logging
from contextlib import suppress
from ipaddress import IPv4Address
import logging
from pathlib import Path
from string import Template
from typing import Awaitable, List, Optional
import attr
import voluptuous as vol
from .const import ATTR_SERVERS, ATTR_VERSION, DNS_SERVERS, FILE_HASSIO_DNS, DNS_SUFFIX
from .const import ATTR_SERVERS, ATTR_VERSION, DNS_SERVERS, DNS_SUFFIX, FILE_HASSIO_DNS
from .coresys import CoreSys, CoreSysAttributes
from .docker.dns import DockerDNS
from .docker.stats import DockerStats
from .exceptions import CoreDNSError, CoreDNSUpdateError, DockerAPIError
from .misc.forwarder import DNSForward
from .utils.json import JsonConfig
from .validate import SCHEMA_DNS_CONFIG
from .validate import DNS_URL, SCHEMA_DNS_CONFIG
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
COREDNS_TMPL: Path = Path(__file__).parents[0].joinpath("data/coredns.tmpl")
RESOLV_CONF: Path = Path("/etc/resolv.conf")
@@ -114,14 +115,16 @@ class CoreDNS(JsonConfig, CoreSysAttributes):
# Start DNS forwarder
self.sys_create_task(self.forwarder.start(self.sys_docker.network.dns))
self._update_local_resolv()
with suppress(CoreDNSError):
self._update_local_resolv()
# Start is not Running
# Reset container configuration
if await self.instance.is_running():
return
await self.start()
with suppress(DockerAPIError):
await self.instance.stop()
# Run CoreDNS
with suppress(CoreDNSError):
await self.start()
async def unload(self) -> None:
"""Unload DNS forwarder."""
@@ -146,7 +149,8 @@ class CoreDNS(JsonConfig, CoreSysAttributes):
self.version = self.instance.version
self.save_data()
await self.start()
# Init Hosts
self.write_hosts()
async def update(self, version: Optional[str] = None) -> None:
"""Update CoreDNS plugin."""
@@ -174,14 +178,13 @@ class CoreDNS(JsonConfig, CoreSysAttributes):
async def restart(self) -> None:
"""Restart CoreDNS plugin."""
self._write_corefile()
with suppress(DockerAPIError):
await self.instance.stop()
await self.start()
await self.instance.restart()
async def start(self) -> None:
"""Run CoreDNS."""
self._write_corefile()
self.write_hosts()
# Start Instance
_LOGGER.info("Start CoreDNS plugin")
@@ -195,6 +198,7 @@ class CoreDNS(JsonConfig, CoreSysAttributes):
"""Reset Config / Hosts."""
self.servers = DNS_SERVERS
# Resets hosts
with suppress(OSError):
self.hosts.unlink()
self._init_hosts()
@@ -203,14 +207,26 @@ class CoreDNS(JsonConfig, CoreSysAttributes):
def _write_corefile(self) -> None:
"""Write CoreDNS config."""
dns_servers: List[str] = []
# Load Template
try:
corefile_template: Template = Template(COREDNS_TMPL.read_text())
except OSError as err:
_LOGGER.error("Can't read coredns template file: %s", err)
raise CoreDNSError() from None
# Prepare DNS serverlist: Prio 1 Local, Prio 2 Manual, Prio 3 Fallback
local_dns: List[str] = self.sys_host.network.dns_servers or ["dns://127.0.0.11"]
for server in local_dns + self.servers + DNS_SERVERS:
try:
DNS_URL(server)
if server not in dns_servers:
dns_servers.append(server)
except vol.Invalid:
_LOGGER.warning("Ignore invalid DNS Server: %s", server)
# Generate config file
dns_servers = self.servers + list(set(DNS_SERVERS) - set(self.servers))
data = corefile_template.safe_substitute(servers=" ".join(dns_servers))
try:
@@ -222,6 +238,7 @@ class CoreDNS(JsonConfig, CoreSysAttributes):
def _init_hosts(self) -> None:
"""Import hosts entry."""
# Generate Default
self.add_host(IPv4Address("127.0.0.1"), ["localhost"], write=False)
self.add_host(
self.sys_docker.network.supervisor, ["hassio", "supervisor"], write=False
)
@@ -230,6 +247,7 @@ class CoreDNS(JsonConfig, CoreSysAttributes):
["homeassistant", "home-assistant"],
write=False,
)
self.add_host(self.sys_docker.network.dns, ["dns"], write=False)
def write_hosts(self) -> None:
"""Write hosts from memory to file."""
@@ -342,8 +360,8 @@ class CoreDNS(JsonConfig, CoreSysAttributes):
continue
resolv_lines.append(line.strip())
except OSError as err:
_LOGGER.error("Can't read local resolv: %s", err)
raise CoreDNSError() from None
_LOGGER.warning("Can't read local resolv: %s", err)
return
if nameserver in resolv_lines:
return
@@ -356,5 +374,5 @@ class CoreDNS(JsonConfig, CoreSysAttributes):
for line in resolv_lines:
resolv.write(f"{line}\n")
except OSError as err:
_LOGGER.error("Can't write local resolv: %s", err)
raise CoreDNSError() from None
_LOGGER.warning("Can't write local resolv: %s", err)
return
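The _write_corefile() hunk above replaces the fixed server list with a prioritized merge: host-local resolvers first, then manually configured servers, then the built-in fallbacks, with every candidate validated against DNS_URL and duplicates dropped. A minimal sketch of that assembly, assuming a stand-in voluptuous matcher for DNS_URL and an illustrative fallback list (the real validator and fallback constant live elsewhere in hassio):

import voluptuous as vol

# Stand-ins for the real DNS_URL validator and fallback DNS_SERVERS list
DNS_URL = vol.Match(r"^dns://\d{1,3}(\.\d{1,3}){3}$")  # assumption: IPv4 only
DNS_SERVERS = ["dns://8.8.8.8", "dns://1.1.1.1"]

def build_server_list(local_dns, manual_servers):
    """Merge servers by priority (local, manual, fallback), skipping invalid or duplicate entries."""
    servers = []
    for server in (local_dns or ["dns://127.0.0.11"]) + manual_servers + DNS_SERVERS:
        try:
            DNS_URL(server)
        except vol.Invalid:
            continue  # the diff logs a warning here and skips the entry
        if server not in servers:
            servers.append(server)
    return servers

print(build_server_list(["dns://192.168.23.30"], ["dns://8.8.8.8"]))
# ['dns://192.168.23.30', 'dns://8.8.8.8', 'dns://1.1.1.1']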

View File

@@ -11,7 +11,7 @@ from ..const import SOCKET_DOCKER, DNS_SUFFIX
from ..exceptions import DockerAPIError
from .network import DockerNetwork
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
@attr.s(frozen=True)
@@ -54,6 +54,7 @@ class DockerAPI:
self,
image: str,
version: str = "latest",
dns: bool = True,
ipv4: Optional[IPv4Address] = None,
**kwargs: Dict[str, Any],
) -> docker.models.containers.Container:
@@ -61,14 +62,15 @@ class DockerAPI:
Need run inside executor.
"""
name: str = kwargs.get("name", image)
name: str = kwargs.get("name")
network_mode: str = kwargs.get("network_mode")
hostname: str = kwargs.get("hostname")
# Setup DNS
kwargs["dns"] = [str(self.network.dns)]
kwargs["dns_search"] = [DNS_SUFFIX]
kwargs["domainname"] = DNS_SUFFIX
if dns:
kwargs["dns"] = [str(self.network.dns)]
kwargs["dns_search"] = [DNS_SUFFIX]
kwargs["domainname"] = DNS_SUFFIX
# Setup network
if not network_mode:
@@ -176,3 +178,10 @@ class DockerAPI:
_LOGGER.debug("Volumes prune: %s", output)
except docker.errors.APIError as err:
_LOGGER.warning("Error for volumes prune: %s", err)
_LOGGER.info("Prune stale networks")
try:
output = self.docker.api.prune_networks()
_LOGGER.debug("Networks prune: %s", output)
except docker.errors.APIError as err:
_LOGGER.warning("Error for networks prune: %s", err)

View File

@@ -32,7 +32,7 @@ if TYPE_CHECKING:
from ..addons.addon import Addon
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
AUDIO_DEVICE = "/dev/snd:/dev/snd:rwm"
NO_ADDDRESS = ip_address("0.0.0.0")

View File

@@ -7,7 +7,7 @@ from ..coresys import CoreSysAttributes
from ..exceptions import DockerAPIError
from .interface import DockerInterface
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
DNS_DOCKER_NAME: str = "hassio_dns"
@@ -41,6 +41,7 @@ class DockerDNS(DockerInterface, CoreSysAttributes):
docker_container = self.sys_docker.run(
self.image,
version=self.sys_dns.version,
dns=False,
ipv4=self.sys_docker.network.dns,
name=self.name,
hostname=self.name.replace("_", "-"),

View File

@@ -6,7 +6,7 @@ import docker
from ..coresys import CoreSysAttributes
from .interface import DockerInterface
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class DockerHassOSCli(DockerInterface, CoreSysAttributes):

View File

@@ -10,7 +10,7 @@ from ..const import ENV_TIME, ENV_TOKEN, LABEL_MACHINE
from ..exceptions import DockerAPIError
from .interface import CommandReturn, DockerInterface
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
HASS_DOCKER_NAME = "homeassistant"
@@ -127,7 +127,9 @@ class DockerHomeAssistant(DockerInterface):
"""
try:
docker_container = self.sys_docker.containers.get(self.name)
docker_image = self.sys_docker.images.get(self.image)
docker_image = self.sys_docker.images.get(
f"{self.image}:{self.sys_homeassistant.version}"
)
except docker.errors.DockerException:
return False
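The hunk above changes the image lookup to use the exact version tag instead of the bare repository name, so a leftover image under a different tag no longer masks a missing pinned version. A small standalone sketch of that lookup, with illustrative names:

import docker

def has_version_image(client: docker.DockerClient, image: str, version: str) -> bool:
    """Return True only if the exact image:tag is present locally."""
    try:
        client.images.get(f"{image}:{version}")
    except docker.errors.DockerException:
        return False
    return True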

View File

@@ -13,7 +13,7 @@ from ..exceptions import DockerAPIError
from ..utils import process_lock
from .stats import DockerStats
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class DockerInterface(CoreSysAttributes):
@@ -42,6 +42,13 @@ class DockerInterface(CoreSysAttributes):
return {}
return self._meta.get("Config", {})
@property
def meta_host(self) -> Dict[str, Any]:
"""Return meta data of configuration for host."""
if not self._meta:
return {}
return self._meta.get("HostConfig", {})
@property
def meta_labels(self) -> Dict[str, str]:
"""Return meta data of labels for container/image."""

View File

@@ -1,4 +1,5 @@
"""Internal network manager for Hass.io."""
from contextlib import suppress
from ipaddress import IPv4Address
import logging
from typing import List, Optional
@@ -8,7 +9,7 @@ import docker
from ..const import DOCKER_NETWORK, DOCKER_NETWORK_MASK, DOCKER_NETWORK_RANGE
from ..exceptions import DockerAPIError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class DockerNetwork:
@@ -107,3 +108,11 @@ class DockerNetwork:
except docker.errors.APIError as err:
_LOGGER.warning("Can't disconnect container from default: %s", err)
raise DockerAPIError() from None
def stale_cleanup(self, container_name: str):
"""Remove force a container from Network.
Fix: https://github.com/moby/moby/issues/23302
"""
with suppress(docker.errors.APIError):
self.network.disconnect(container_name, force=True)
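stale_cleanup() force-disconnects a container from the internal network while ignoring API errors, which works around containers that linger in the network's endpoint list after removal (moby/moby#23302). A minimal standalone sketch, assuming the network is fetched by name:

from contextlib import suppress
import docker

def stale_cleanup(client: docker.DockerClient, container_name: str) -> None:
    """Force-remove a possibly stale container from the hassio network."""
    network = client.networks.get("hassio")
    with suppress(docker.errors.APIError):
        network.disconnect(container_name, force=True)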

View File

@@ -10,7 +10,7 @@ from ..coresys import CoreSysAttributes
from ..exceptions import DockerAPIError
from .interface import DockerInterface
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class DockerSupervisor(DockerInterface, CoreSysAttributes):
@@ -26,6 +26,11 @@ class DockerSupervisor(DockerInterface, CoreSysAttributes):
"""Return IP address of this container."""
return self.sys_docker.network.supervisor
@property
def privileged(self) -> bool:
"""Return True if the container run with Privileged."""
return self.meta_host.get("Privileged", False)
def _attach(self, tag: str) -> None:
"""Attach to running docker container.

View File

@@ -149,6 +149,10 @@ class DBusNotConnectedError(HostNotSupportedError):
"""DBus is not connected and call a method."""
class DBusInterfaceError(HassioNotSupportedError):
"""DBus interface not connected."""
class DBusFatalError(DBusError):
"""DBus call going wrong."""
@@ -184,3 +188,10 @@ class JsonFileError(HassioError):
class DockerAPIError(HassioError):
"""Docker API error."""
# Hardware
class HardwareNotSupportedError(HassioNotSupportedError):
"""Raise if hardware function is not supported."""

View File

@@ -18,7 +18,7 @@ from .exceptions import (
DockerAPIError,
)
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class HassOS(CoreSysAttributes):

View File

@@ -2,7 +2,6 @@
import asyncio
from contextlib import asynccontextmanager, suppress
from datetime import datetime, timedelta
from distutils.version import StrictVersion
from ipaddress import IPv4Address
import logging
import os
@@ -16,6 +15,7 @@ from uuid import UUID
import aiohttp
from aiohttp import hdrs
import attr
from packaging import version as pkg_version
from .const import (
ATTR_ACCESS_TOKEN,
@@ -47,7 +47,7 @@ from .utils import check_port, convert_to_ascii, process_lock
from .utils.json import JsonConfig
from .validate import SCHEMA_HASS_CONFIG
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
RE_YAML_ERROR = re.compile(r"homeassistant\.util\.yaml")
@@ -80,7 +80,9 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
try:
# Evaluate Version if we lost this information
if not self.version:
self.version = await self.instance.get_latest_version(key=StrictVersion)
self.version = await self.instance.get_latest_version(
key=pkg_version.parse
)
await self.instance.attach(tag=self.version)
except DockerAPIError:
@@ -573,7 +575,7 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
migration_progress = True
_LOGGER.info("Home Assistant record migration in progress")
continue
elif migration_progress:
if migration_progress:
migration_progress = False # Reset start time
start_time = time.monotonic()
_LOGGER.info("Home Assistant record migration done")
@@ -584,7 +586,7 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
pip_progress = True
_LOGGER.info("Home Assistant pip installation in progress")
continue
elif pip_progress:
if pip_progress:
pip_progress = False # Reset start time
start_time = time.monotonic()
_LOGGER.info("Home Assistant pip installation done")
@@ -603,6 +605,11 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
return
_LOGGER.info("Repair Home Assistant %s", self.version)
await self.sys_run_in_executor(
self.sys_docker.network.stale_cleanup, self.instance.name
)
# Pull image
try:
await self.instance.install(self.version)
except DockerAPIError:
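Two things happen in the Home Assistant hunks above: the deprecated distutils.version.StrictVersion is swapped for packaging.version.parse as the sort key when recovering a lost version, and repair() now runs stale_cleanup() before pulling the image. The parser change matters because packaging understands tag formats, such as dev builds, that StrictVersion cannot parse. A tiny illustration with made-up tags:

from packaging import version as pkg_version

tags = ["0.99.3", "0.100.0.dev20191015", "0.100.0b3", "0.100.0"]
latest = max(tags, key=pkg_version.parse)
print(latest)  # 0.100.0 (dev and beta builds sort below the final release)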

View File

@@ -7,6 +7,7 @@ from .apparmor import AppArmorControl
from .control import SystemControl
from .info import InfoCenter
from .services import ServiceManager
from .network import NetworkManager
from ..const import (
FEATURES_REBOOT,
FEATURES_SHUTDOWN,
@@ -14,49 +15,56 @@ from ..const import (
FEATURES_SERVICES,
FEATURES_HASSOS,
)
from ..coresys import CoreSysAttributes
from ..coresys import CoreSysAttributes, CoreSys
from ..exceptions import HassioError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class HostManager(CoreSysAttributes):
"""Manage supported function from host."""
def __init__(self, coresys):
def __init__(self, coresys: CoreSys):
"""Initialize Host manager."""
self.coresys = coresys
self._alsa = AlsaAudio(coresys)
self._apparmor = AppArmorControl(coresys)
self._control = SystemControl(coresys)
self._info = InfoCenter(coresys)
self._services = ServiceManager(coresys)
self.coresys: CoreSys = coresys
self._alsa: AlsaAudio = AlsaAudio(coresys)
self._apparmor: AppArmorControl = AppArmorControl(coresys)
self._control: SystemControl = SystemControl(coresys)
self._info: InfoCenter = InfoCenter(coresys)
self._services: ServiceManager = ServiceManager(coresys)
self._network: NetworkManager = NetworkManager(coresys)
@property
def alsa(self):
def alsa(self) -> AlsaAudio:
"""Return host ALSA handler."""
return self._alsa
@property
def apparmor(self):
def apparmor(self) -> AppArmorControl:
"""Return host AppArmor handler."""
return self._apparmor
@property
def control(self):
def control(self) -> SystemControl:
"""Return host control handler."""
return self._control
@property
def info(self):
def info(self) -> InfoCenter:
"""Return host info handler."""
return self._info
@property
def services(self):
def services(self) -> ServiceManager:
"""Return host services handler."""
return self._services
@property
def network(self) -> NetworkManager:
"""Return host NetworkManager handler."""
return self._network
@property
def supperted_features(self):
"""Return a list of supported host features."""
@@ -81,6 +89,9 @@ class HostManager(CoreSysAttributes):
if self.sys_dbus.systemd.is_connected:
await self.services.update()
if self.sys_dbus.nmi_dns.is_connected:
await self.network.update()
async def load(self):
"""Load host information."""
with suppress(HassioError):

View File

@@ -9,10 +9,15 @@ import attr
from ..const import ATTR_INPUT, ATTR_OUTPUT, ATTR_DEVICES, ATTR_NAME, CHAN_ID, CHAN_TYPE
from ..coresys import CoreSysAttributes
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
# pylint: disable=invalid-name
DefaultConfig = attr.make_class("DefaultConfig", ["input", "output"])
@attr.s()
class DefaultConfig:
"""Default config input/output ALSA channel."""
input: str = attr.ib()
output: str = attr.ib()
AUDIODB_JSON: Path = Path(__file__).parents[1].joinpath("data/audiodb.json")
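The ALSA DefaultConfig container moves from attr.make_class to an explicit @attr.s class with annotated attr.ib() fields, which keeps the behaviour but gives type checkers and IDEs something to read. A quick sketch of both forms side by side:

import attr

# before: class generated at runtime, invisible to static typing
DefaultConfigOld = attr.make_class("DefaultConfig", ["input", "output"])

# after: explicit, annotated declaration
@attr.s()
class DefaultConfig:
    """Default config input/output ALSA channel."""
    input: str = attr.ib()
    output: str = attr.ib()

print(DefaultConfig(input="0,0", output="0,0"))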

View File

@@ -7,7 +7,7 @@ from ..coresys import CoreSysAttributes
from ..exceptions import DBusError, HostAppArmorError
from ..utils.apparmor import validate_profile
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
SYSTEMD_SERVICES = {"hassos-apparmor.service", "hassio-apparmor.service"}

View File

@@ -4,7 +4,7 @@ import logging
from ..coresys import CoreSysAttributes
from ..exceptions import HostNotSupportedError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
MANAGER = "manager"
HOSTNAME = "hostname"

View File

@@ -1,10 +1,11 @@
"""Info control for host."""
import logging
from typing import Optional
from ..coresys import CoreSysAttributes
from ..exceptions import HassioError, HostNotSupportedError
from ..exceptions import HostNotSupportedError, DBusNotConnectedError, DBusError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class InfoCenter(CoreSysAttributes):
@@ -13,46 +14,44 @@ class InfoCenter(CoreSysAttributes):
def __init__(self, coresys):
"""Initialize system center handling."""
self.coresys = coresys
self._data = {}
@property
def hostname(self):
def hostname(self) -> Optional[str]:
"""Return local hostname."""
return self._data.get("StaticHostname") or None
return self.sys_dbus.hostname.hostname
@property
def chassis(self):
def chassis(self) -> Optional[str]:
"""Return local chassis type."""
return self._data.get("Chassis") or None
return self.sys_dbus.hostname.chassis
@property
def deployment(self):
def deployment(self) -> Optional[str]:
"""Return local deployment type."""
return self._data.get("Deployment") or None
return self.sys_dbus.hostname.deployment
@property
def kernel(self):
def kernel(self) -> Optional[str]:
"""Return local kernel version."""
return self._data.get("KernelRelease") or None
return self.sys_dbus.hostname.kernel
@property
def operating_system(self):
def operating_system(self) -> Optional[str]:
"""Return local operating system."""
return self._data.get("OperatingSystemPrettyName") or None
return self.sys_dbus.hostname.operating_system
@property
def cpe(self):
def cpe(self) -> Optional[str]:
"""Return local CPE."""
return self._data.get("OperatingSystemCPEName") or None
return self.sys_dbus.hostname.cpe
async def update(self):
"""Update properties over dbus."""
if not self.sys_dbus.hostname.is_connected:
_LOGGER.error("No hostname D-Bus connection available")
raise HostNotSupportedError()
_LOGGER.info("Update local host information")
try:
self._data = await self.sys_dbus.hostname.get_properties()
except HassioError:
await self.sys_dbus.hostname.update()
except DBusError:
_LOGGER.warning("Can't update host system information!")
except DBusNotConnectedError:
_LOGGER.error("No hostname D-Bus connection available")
raise HostNotSupportedError() from None

hassio/host/network.py Normal file (39 lines)
View File

@@ -0,0 +1,39 @@
"""Info control for host."""
import logging
from typing import List
from ..coresys import CoreSysAttributes, CoreSys
from ..exceptions import HostNotSupportedError, DBusNotConnectedError, DBusError
_LOGGER: logging.Logger = logging.getLogger(__name__)
class NetworkManager(CoreSysAttributes):
"""Handle local network setup."""
def __init__(self, coresys: CoreSys):
"""Initialize system center handling."""
self.coresys: CoreSys = coresys
@property
def dns_servers(self) -> List[str]:
"""Return a list of local DNS servers."""
# Read all local dns servers
servers: List[str] = []
for config in self.sys_dbus.nmi_dns.configuration:
if config.vpn or not config.nameservers:
continue
servers.extend(config.nameservers)
return [f"dns://{server}" for server in list(dict.fromkeys(servers))]
async def update(self):
"""Update properties over dbus."""
_LOGGER.info("Update local network DNS information")
try:
await self.sys_dbus.nmi_dns.update()
except DBusError:
_LOGGER.warning("Can't update host DNS system information!")
except DBusNotConnectedError:
_LOGGER.error("No hostname D-Bus connection available")
raise HostNotSupportedError() from None
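The new hassio/host/network.py exposes the DNS servers that NetworkManager reports over D-Bus: VPN links are skipped, duplicates are dropped while preserving order, and each entry is prefixed with dns:// for CoreDNS. A self-contained sketch of that property logic, using a stand-in for the D-Bus configuration objects:

from typing import List, NamedTuple

class DnsConfig(NamedTuple):
    """Stand-in for the parsed NetworkManager DNS configuration entries."""
    nameservers: List[str]
    vpn: bool

def dns_servers(configs: List[DnsConfig]) -> List[str]:
    servers: List[str] = []
    for config in configs:
        if config.vpn or not config.nameservers:
            continue
        servers.extend(config.nameservers)
    return [f"dns://{server}" for server in dict.fromkeys(servers)]

print(dns_servers([
    DnsConfig(["192.168.23.30", "1.1.1.1"], vpn=False),
    DnsConfig(["192.168.23.30"], vpn=False),
    DnsConfig(["10.8.0.1"], vpn=True),
]))
# ['dns://192.168.23.30', 'dns://1.1.1.1']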

View File

@@ -6,7 +6,7 @@ import attr
from ..coresys import CoreSysAttributes
from ..exceptions import HassioError, HostNotSupportedError, HostServiceError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
MOD_REPLACE = "replace"
@@ -91,9 +91,9 @@ class ServiceManager(CoreSysAttributes):
class ServiceInfo:
"""Represent a single Service."""
name = attr.ib(type=str)
description = attr.ib(type=str)
state = attr.ib(type=str)
name: str = attr.ib()
description: str = attr.ib()
state: str = attr.ib()
@staticmethod
def read_from(unit):

View File

@@ -12,7 +12,7 @@ from .utils.dt import utc_from_timestamp, utcnow
from .utils.json import JsonConfig
from .validate import SCHEMA_INGRESS_CONFIG
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class Ingress(JsonConfig, CoreSysAttributes):

View File

@@ -7,7 +7,7 @@ from typing import Optional
import async_timeout
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
COMMAND = "socat UDP-RECVFROM:53,fork UDP-SENDTO:{!s}:53"

View File

@@ -1,4 +1,5 @@
"""Read hardware info from system."""
import asyncio
from datetime import datetime
import logging
from pathlib import Path
@@ -8,8 +9,9 @@ from typing import Any, Dict, Optional, Set
import pyudev
from ..const import ATTR_DEVICES, ATTR_NAME, ATTR_TYPE, CHAN_ID, CHAN_TYPE
from ..exceptions import HardwareNotSupportedError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
ASOUND_CARDS: Path = Path("/proc/asound/cards")
RE_CARDS: re.Pattern = re.compile(r"(\d+) \[(\w*) *\]: (.*\w)")
@@ -148,3 +150,14 @@ class Hardware:
return None
return datetime.utcfromtimestamp(int(found.group(1)))
async def udev_trigger(self) -> None:
"""Trigger a udev reload."""
proc = await asyncio.create_subprocess_exec("udevadm", "trigger")
await proc.wait()
if proc.returncode == 0:
return
_LOGGER.warning("udevadm device triggering fails!")
raise HardwareNotSupportedError()

View File

@@ -3,7 +3,7 @@ import asyncio
from datetime import date, datetime, time, timedelta
import logging
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
INTERVAL = "interval"
REPEAT = "repeat"

hassio/secrets.py Normal file (64 lines)
View File

@@ -0,0 +1,64 @@
"""Handle Home Assistant secrets to add-ons."""
from datetime import timedelta
import logging
from pathlib import Path
from typing import Dict
from ruamel.yaml import YAML, YAMLError
import voluptuous as vol
from .coresys import CoreSys, CoreSysAttributes
from .utils import AsyncThrottle
_LOGGER: logging.Logger = logging.getLogger(__name__)
SECRETS_SCHEMA = vol.Schema({str: vol.Any(str, int, None, float)})
class SecretsManager(CoreSysAttributes):
"""Manage Home Assistant secrets."""
def __init__(self, coresys: CoreSys):
"""Initialize secret manager."""
self.coresys: CoreSys = coresys
self.secrets: Dict[str, str] = {}
@property
def path_secrets(self) -> Path:
"""Return path to secret file."""
return Path(self.sys_config.path_homeassistant, "secrets.yaml")
def get(self, secret: str) -> str:
"""Get secret from store."""
_LOGGER.info("Request secret %s", secret)
return self.secrets.get(secret)
async def load(self) -> None:
"""Load secrets on start."""
await self._read_secrets()
_LOGGER.info("Load Home Assistant secrets: %s", len(self.secrets))
async def reload(self) -> None:
"""Reload secrets."""
await self._read_secrets()
@AsyncThrottle(timedelta(seconds=60))
async def _read_secrets(self):
"""Read secrets.yaml into memory."""
if not self.path_secrets.exists():
_LOGGER.debug("Home Assistant secrets not exists")
return
# Read secrets
try:
yaml = YAML()
data = await self.sys_run_in_executor(yaml.load, self.path_secrets) or {}
self.secrets = SECRETS_SCHEMA(data)
except YAMLError as err:
_LOGGER.error("Can't process Home Assistant secrets: %s", err)
except vol.Invalid:
_LOGGER.warning("Home Assistant secrets have a invalid format")
else:
_LOGGER.debug("Reload Home Assistant secrets: %s", len(self.secrets))

View File

@@ -19,7 +19,7 @@ from ..const import (
)
from ..interface import ServiceInterface
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
# pylint: disable=no-value-for-parameter

View File

@@ -9,7 +9,7 @@ from ..const import FOLDER_HOMEASSISTANT, SNAPSHOT_FULL, SNAPSHOT_PARTIAL
from ..coresys import CoreSysAttributes
from ..utils.dt import utcnow
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class SnapshotManager(CoreSysAttributes):

View File

@@ -41,11 +41,11 @@ from ..const import (
from ..coresys import CoreSys, CoreSysAttributes
from ..exceptions import AddonsError
from ..utils.json import write_json_file
from ..utils.tar import SecureTarFile
from ..utils.tar import SecureTarFile, secure_path
from .utils import key_to_iv, password_for_validating, password_to_key, remove_folder
from .validate import ALL_FOLDERS, SCHEMA_SNAPSHOT
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class Snapshot(CoreSysAttributes):
@@ -248,7 +248,7 @@ class Snapshot(CoreSysAttributes):
def _extract_snapshot():
"""Extract a snapshot."""
with tarfile.open(self.tarfile, "r:") as tar:
tar.extractall(path=self._tmp.name)
tar.extractall(path=self._tmp.name, members=secure_path(tar))
await self.sys_run_in_executor(_extract_snapshot)
@@ -396,7 +396,7 @@ class Snapshot(CoreSysAttributes):
try:
_LOGGER.info("Restore folder %s", name)
with SecureTarFile(tar_name, "r", key=self._key) as tar_file:
tar_file.extractall(path=origin_dir)
tar_file.extractall(path=origin_dir, members=tar_file)
_LOGGER.info("Restore folder %s done", name)
except (tarfile.TarError, OSError) as err:
_LOGGER.warning("Can't restore folder %s: %s", name, err)

View File

@@ -42,7 +42,7 @@ def remove_folder(folder):
for obj in folder.iterdir():
try:
if obj.is_dir():
shutil.rmtree(str(obj), ignore_errors=True)
shutil.rmtree(obj, ignore_errors=True)
else:
obj.unlink()
except (OSError, shutil.Error):

View File

@@ -9,7 +9,7 @@ from .addon import AddonStore
from .data import StoreData
from .repository import Repository
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
BUILTIN_REPOSITORIES = set((REPOSITORY_CORE, REPOSITORY_LOCAL))

View File

@@ -4,7 +4,7 @@ import logging
from ..coresys import CoreSys
from ..addons.model import AddonModel, Data
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class AddonStore(AddonModel):

View File

@@ -20,7 +20,7 @@ from ..utils.json import read_json_file
from .utils import extract_hash_from_path
from .validate import SCHEMA_REPOSITORY_CONFIG
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class StoreData(CoreSysAttributes):

View File

@@ -12,7 +12,7 @@ from ..const import URL_HASSIO_ADDONS, ATTR_URL, ATTR_BRANCH
from ..coresys import CoreSysAttributes
from ..validate import RE_REPOSITORY
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class GitRepo(CoreSysAttributes):
@@ -137,7 +137,7 @@ class GitRepo(CoreSysAttributes):
"""Log error."""
_LOGGER.warning("Can't remove %s", path)
shutil.rmtree(str(self.path), onerror=log_err)
shutil.rmtree(self.path, onerror=log_err)
class GitRepoHassIO(GitRepo):

View File

@@ -5,7 +5,7 @@ from pathlib import Path
import re
RE_SHA1 = re.compile(r"[a-f0-9]{8}")
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
def get_hash_from_repository(name: str) -> str:

View File

@@ -20,7 +20,7 @@ from .exceptions import (
SupervisorUpdateError,
)
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class Supervisor(CoreSysAttributes):
@@ -41,6 +41,12 @@ class Supervisor(CoreSysAttributes):
with suppress(DockerAPIError):
await self.instance.cleanup()
# Check privileged mode
if not self.instance.privileged:
_LOGGER.error(
"Supervisor does not run in Privileged mode. Hassio runs with limited functionality!"
)
@property
def ip_address(self) -> IPv4Address:
"""Return IP of Supervisor instance."""

View File

@@ -5,7 +5,7 @@ import logging
from .coresys import CoreSysAttributes
from .exceptions import HomeAssistantError, CoreDNSError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
HASS_WATCHDOG_API = "HASS_WATCHDOG_API"
@@ -16,7 +16,7 @@ RUN_UPDATE_DNS = 30100
RUN_RELOAD_ADDONS = 10800
RUN_RELOAD_SNAPSHOTS = 72000
RUN_RELOAD_HOST = 72000
RUN_RELOAD_HOST = 7600
RUN_RELOAD_UPDATER = 7200
RUN_RELOAD_INGRESS = 930

View File

@@ -24,7 +24,7 @@ from .utils import AsyncThrottle
from .utils.json import JsonConfig
from .validate import SCHEMA_UPDATER_CONFIG
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
class Updater(JsonConfig, CoreSysAttributes):

View File

@@ -5,7 +5,7 @@ import logging
import re
import socket
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
RE_STRING = re.compile(r"\x1b(\[.*?[@-~]|\].*?(\x07|\x1b\\))")

View File

@@ -4,7 +4,7 @@ import re
from ..exceptions import AppArmorFileError, AppArmorInvalidError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
RE_PROFILE = re.compile(r"^profile ([^ ]+).*$")

View File

@@ -12,7 +12,7 @@ UTC = pytz.utc
GEOIP_URL = "http://ip-api.com/json/"
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
# Copyright (c) Django Software Foundation and individual contributors.

View File

@@ -1,79 +1,95 @@
"""DBus implementation with glib."""
from __future__ import annotations
import asyncio
import logging
import json
import shlex
import re
from signal import SIGINT
from typing import Any, Dict, List, Optional, Set
import xml.etree.ElementTree as ET
from ..exceptions import DBusFatalError, DBusParseError
from ..exceptions import (
DBusFatalError,
DBusParseError,
DBusInterfaceError,
DBusNotConnectedError,
)
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
# Use to convert GVariant into json
RE_GVARIANT_TYPE = re.compile(
r"(?:boolean|byte|int16|uint16|int32|uint32|handle|int64|uint64|double|"
r"string|objectpath|signature) "
RE_GVARIANT_TYPE: re.Match = re.compile(
r"\"[^\"\\]*(?:\\.[^\"\\]*)*\"|(boolean|byte|int16|uint16|int32|uint32|handle|int64|uint64|double|"
r"string|objectpath|signature|@[asviumodf\{\}]+) "
)
RE_GVARIANT_VARIANT = re.compile(
r"(?<=(?: |{|\[))<((?:'|\").*?(?:'|\")|\d+(?:\.\d+)?)>(?=(?:|]|}|,))"
RE_GVARIANT_VARIANT: re.Match = re.compile(r"\"[^\"\\]*(?:\\.[^\"\\]*)*\"|(<|>)")
RE_GVARIANT_STRING_ESC: re.Match = re.compile(
r"(?<=(?: |{|\[|\(|<))'[^']*?\"[^']*?'(?=(?:|]|}|,|\)|>))"
)
RE_GVARIANT_STRING = re.compile(r"(?<=(?: |{|\[|\())'(.*?)'(?=(?:|]|}|,|\)))")
RE_GVARIANT_TUPLE_O = re.compile(r"\"[^\"]*?\"|(\()")
RE_GVARIANT_TUPLE_C = re.compile(r"\"[^\"]*?\"|(,?\))")
RE_GVARIANT_STRING: re.Match = re.compile(
r"(?<=(?: |{|\[|\(|<))'(.*?)'(?=(?:|]|}|,|\)|>))"
)
RE_GVARIANT_TUPLE_O: re.Match = re.compile(r"\"[^\"\\]*(?:\\.[^\"\\]*)*\"|(\()")
RE_GVARIANT_TUPLE_C: re.Match = re.compile(r"\"[^\"\\]*(?:\\.[^\"\\]*)*\"|(,?\))")
RE_MONITOR_OUTPUT = re.compile(r".+?: (?P<signal>[^ ].+) (?P<data>.*)")
RE_MONITOR_OUTPUT: re.Match = re.compile(r".+?: (?P<signal>[^ ].+) (?P<data>.*)")
# Map GDBus to errors
MAP_GDBUS_ERROR: Dict[str, Any] = {
"GDBus.Error:org.freedesktop.DBus.Error.ServiceUnknown": DBusInterfaceError,
"No such file or directory": DBusNotConnectedError,
}
# Commands for dbus
INTROSPECT = "gdbus introspect --system --dest {bus} " "--object-path {object} --xml"
CALL = (
INTROSPECT: str = "gdbus introspect --system --dest {bus} " "--object-path {object} --xml"
CALL: str = (
"gdbus call --system --dest {bus} --object-path {object} "
"--method {method} {args}"
)
MONITOR = "gdbus monitor --system --dest {bus}"
MONITOR: str = "gdbus monitor --system --dest {bus}"
DBUS_METHOD_GETALL = "org.freedesktop.DBus.Properties.GetAll"
DBUS_METHOD_GETALL: str = "org.freedesktop.DBus.Properties.GetAll"
class DBus:
"""DBus handler."""
def __init__(self, bus_name, object_path):
def __init__(self, bus_name: str, object_path: str) -> None:
"""Initialize dbus object."""
self.bus_name = bus_name
self.object_path = object_path
self.methods = set()
self.signals = set()
self.bus_name: str = bus_name
self.object_path: str = object_path
self.methods: Set[str] = set()
self.signals: Set[str] = set()
@staticmethod
async def connect(bus_name, object_path):
async def connect(bus_name: str, object_path: str) -> DBus:
"""Read object data."""
self = DBus(bus_name, object_path)
await self._init_proxy() # pylint: disable=protected-access
# pylint: disable=protected-access
await self._init_proxy()
_LOGGER.info("Connect to dbus: %s - %s", bus_name, object_path)
return self
async def _init_proxy(self):
async def _init_proxy(self) -> None:
"""Read interface data."""
command = shlex.split(
INTROSPECT.format(bus=self.bus_name, object=self.object_path)
)
# Ask data
_LOGGER.info("Introspect %s on %s", self.bus_name, self.object_path)
data = await self._send(command)
# Parse XML
data = await self._send(command)
try:
xml = ET.fromstring(data)
except ET.ParseError as err:
_LOGGER.error("Can't parse introspect data: %s", err)
_LOGGER.debug("Introspect %s on %s", self.bus_name, self.object_path)
raise DBusParseError() from None
# Read available methods
_LOGGER.debug("data: %s", data)
for interface in xml.findall("./interface"):
interface_name = interface.get("name")
@@ -88,30 +104,41 @@ class DBus:
self.signals.add(f"{interface_name}.{signal_name}")
@staticmethod
def parse_gvariant(raw):
def parse_gvariant(raw: str) -> Any:
"""Parse GVariant input to python."""
raw = RE_GVARIANT_TYPE.sub("", raw)
raw = RE_GVARIANT_VARIANT.sub(r"\1", raw)
raw = RE_GVARIANT_STRING.sub(r'"\1"', raw)
raw = RE_GVARIANT_TUPLE_O.sub(
lambda x: x.group(0) if not x.group(1) else "[", raw
# Process first string
json_raw = RE_GVARIANT_STRING_ESC.sub(
lambda x: x.group(0).replace('"', '\\"'), raw
)
raw = RE_GVARIANT_TUPLE_C.sub(
lambda x: x.group(0) if not x.group(1) else "]", raw
json_raw = RE_GVARIANT_STRING.sub(r'"\1"', json_raw)
# Remove complex type handling
json_raw: str = RE_GVARIANT_TYPE.sub(
lambda x: x.group(0) if not x.group(1) else "", json_raw
)
json_raw = RE_GVARIANT_VARIANT.sub(
lambda x: x.group(0) if not x.group(1) else "", json_raw
)
json_raw = RE_GVARIANT_TUPLE_O.sub(
lambda x: x.group(0) if not x.group(1) else "[", json_raw
)
json_raw = RE_GVARIANT_TUPLE_C.sub(
lambda x: x.group(0) if not x.group(1) else "]", json_raw
)
# No data
if raw.startswith("[]"):
if json_raw.startswith("[]"):
return []
try:
return json.loads(raw)
return json.loads(json_raw)
except json.JSONDecodeError as err:
_LOGGER.error("Can't parse '%s': %s", raw, err)
_LOGGER.error("Can't parse '%s': %s", json_raw, err)
_LOGGER.debug("GVariant data: '%s'", raw)
raise DBusParseError() from None
@staticmethod
def gvariant_args(args):
def gvariant_args(args: List[Any]) -> str:
"""Convert args into gvariant."""
gvariant = ""
for arg in args:
@@ -122,11 +149,11 @@ class DBus:
elif isinstance(arg, str):
gvariant += f' "{arg}"'
else:
gvariant += " {}".format(str(arg))
gvariant += f" {arg!s}"
return gvariant.lstrip()
async def call_dbus(self, method, *args):
async def call_dbus(self, method: str, *args: List[Any]) -> str:
"""Call a dbus method."""
command = shlex.split(
CALL.format(
@@ -142,10 +169,9 @@ class DBus:
data = await self._send(command)
# Parse and return data
_LOGGER.debug("Receive from %s: %s", method, data)
return self.parse_gvariant(data)
async def get_properties(self, interface):
async def get_properties(self, interface: str) -> Dict[str, Any]:
"""Read all properties from interface."""
try:
return (await self.call_dbus(DBUS_METHOD_GETALL, interface))[0]
@@ -153,7 +179,7 @@ class DBus:
_LOGGER.error("No attributes returned for %s", interface)
raise DBusFatalError from None
async def _send(self, command):
async def _send(self, command: List[str]) -> str:
"""Send command over dbus."""
# Run command
_LOGGER.debug("Send dbus command: %s", command)
@@ -171,12 +197,19 @@ class DBus:
raise DBusFatalError() from None
# Success?
if proc.returncode != 0:
_LOGGER.error("DBus return error: %s", error)
raise DBusFatalError()
if proc.returncode == 0:
return data.decode()
# End
return data.decode()
# Filter error
error = error.decode()
for msg, exception in MAP_GDBUS_ERROR.items():
if msg not in error:
continue
raise exception()
# General
_LOGGER.error("DBus return error: %s", error)
raise DBusFatalError()
def attach_signals(self, filters=None):
"""Generate a signals wrapper."""
@@ -189,7 +222,7 @@ class DBus:
async for signal in signals:
return signal
def __getattr__(self, name):
def __getattr__(self, name: str) -> DBusCallWrapper:
"""Mapping to dbus method."""
return getattr(DBusCallWrapper(self, self.bus_name), name)
@@ -197,17 +230,17 @@ class DBus:
class DBusCallWrapper:
"""Wrapper a DBus interface for a call."""
def __init__(self, dbus, interface):
def __init__(self, dbus: DBus, interface: str) -> None:
"""Initialize wrapper."""
self.dbus = dbus
self.interface = interface
self.dbus: DBus = dbus
self.interface: str = interface
def __call__(self):
def __call__(self) -> None:
"""Should never be called."""
_LOGGER.error("DBus method %s not exists!", self.interface)
raise DBusFatalError()
def __getattr__(self, name):
def __getattr__(self, name: str):
"""Mapping to dbus method."""
interface = f"{self.interface}.{name}"
@@ -227,11 +260,11 @@ class DBusCallWrapper:
class DBusSignalWrapper:
"""Process Signals."""
def __init__(self, dbus, signals=None):
def __init__(self, dbus: DBus, signals: Optional[str] = None):
"""Initialize dbus signal wrapper."""
self.dbus = dbus
self._signals = signals
self._proc = None
self.dbus: DBus = dbus
self._signals: Optional[str] = signals
self._proc: Optional[asyncio.Process] = None
async def __aenter__(self):
"""Start monitor events."""

View File

@@ -9,7 +9,7 @@ from voluptuous.humanize import humanize_error
from ..exceptions import JsonFileError
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
def write_json_file(jsonfile: Path, data: Any) -> None:

View File

@@ -1,19 +1,22 @@
"""Tarfile fileobject handler for encrypted files."""
import hashlib
import logging
import os
from pathlib import Path
import tarfile
from typing import IO, Optional
from typing import IO, Callable, Generator, List, Optional
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import (
CipherContext,
Cipher,
CipherContext,
algorithms,
modes,
)
_LOGGER: logging.Logger = logging.getLogger(__name__)
BLOCK_SIZE = 16
BLOCK_SIZE_BITS = 128
@@ -111,3 +114,39 @@ def _generate_iv(key: bytes, salt: bytes) -> bytes:
for _ in range(100):
temp_iv = hashlib.sha256(temp_iv).digest()
return temp_iv[:16]
def secure_path(tar: tarfile.TarFile) -> Generator[tarfile.TarInfo, None, None]:
"""Security safe check of path.
Prevent ../ or absolute paths
"""
for member in tar:
file_path = Path(member.name)
try:
assert not file_path.is_absolute()
Path("/fake", file_path).resolve().relative_to("/fake")
except (ValueError, RuntimeError, AssertionError):
_LOGGER.warning("Issue with file %s", file_path)
continue
else:
yield member
def exclude_filter(
exclude_list: List[str]
) -> Callable[[tarfile.TarInfo], Optional[tarfile.TarInfo]]:
"""Create callable filter function to check TarInfo for add."""
def my_filter(tar: tarfile.TarInfo) -> Optional[tarfile.TarInfo]:
"""Custom exclude filter."""
file_path = Path(tar.name)
for exclude in exclude_list:
if not file_path.match(exclude):
continue
_LOGGER.debug("Ignore %s because of %s", file_path, exclude)
return None
return tar
return my_filter
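secure_path() and exclude_filter() are the two helpers behind the extractall hardening and the new snapshot_exclude option. A short usage sketch under assumed paths and patterns:

import tarfile

from hassio.utils.tar import exclude_filter, secure_path

# Build an archive while skipping members that match the add-on's snapshot_exclude patterns
with tarfile.open("addon.tar", "w:") as tar:
    tar.add("data", filter=exclude_filter(["*.db-wal", "cache/*"]))

# Extract it again, silently dropping members with absolute or ../ paths
with tarfile.open("addon.tar", "r:") as tar:
    tar.extractall(path="restore", members=secure_path(tar))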

View File

@@ -1,14 +1,16 @@
aiohttp==3.5.4
aiohttp==3.6.1
async_timeout==3.0.1
attrs==19.1.0
attrs==19.3.0
cchardet==2.1.4
colorlog==4.0.2
cpe==1.2.1
cryptography==2.7
docker==4.0.2
gitpython==3.0.1
pytz==2019.2
cryptography==2.8
docker==4.1.0
gitpython==3.0.4
packaging==19.2
pytz==2019.3
pyudev==0.21.0
uvloop==0.12.2
ruamel.yaml==0.15.100
uvloop==0.13.0
voluptuous==0.11.7
ptvsd==4.3.2

View File

@@ -1,5 +1,5 @@
flake8==3.7.8
pylint==2.3.1
pytest==5.1.0
pylint==2.4.3
pytest==5.2.1
pytest-timeout==1.3.3
pytest-aiohttp==0.3.0

View File

@@ -19,7 +19,7 @@ setup(
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent",
"Topic :: Home Automation"
"Topic :: Home Automation",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Scientific/Engineering :: Atmospheric Science",
"Development Status :: 5 - Production/Stable",

View File

@@ -0,0 +1,19 @@
"""Test adguard discovery."""
import voluptuous as vol
import pytest
from hassio.discovery.validate import valid_discovery_config
def test_good_config():
"""Test good deconz config."""
valid_discovery_config("almond", {"host": "test", "port": 3812})
def test_bad_config():
"""Test good adguard config."""
with pytest.raises(vol.Invalid):
valid_discovery_config("almond", {"host": "test"})

View File

@@ -0,0 +1,19 @@
"""Test adguard discovery."""
import voluptuous as vol
import pytest
from hassio.discovery.validate import valid_discovery_config
def test_good_config():
"""Test good deconz config."""
valid_discovery_config("home_panel", {"host": "test", "port": 3812})
def test_bad_config():
"""Test good adguard config."""
with pytest.raises(vol.Invalid):
valid_discovery_config("home_panel", {"host": "test"})

View File

@@ -0,0 +1,312 @@
"""Test gdbus gvariant parser."""
from hassio.utils.gdbus import DBus
def test_simple_return():
"""Test Simple return value."""
raw = "(objectpath '/org/freedesktop/systemd1/job/35383',)"
# parse data
data = DBus.parse_gvariant(raw)
assert data == ["/org/freedesktop/systemd1/job/35383"]
def test_get_property():
"""Test Property parsing."""
raw = "({'Hostname': <'hassio'>, 'StaticHostname': <'hassio'>, 'PrettyHostname': <''>, 'IconName': <'computer-embedded'>, 'Chassis': <'embedded'>, 'Deployment': <'production'>, 'Location': <''>, 'KernelName': <'Linux'>, 'KernelRelease': <'4.14.98-v7'>, 'KernelVersion': <'#1 SMP Sat May 11 02:17:06 UTC 2019'>, 'OperatingSystemPrettyName': <'HassOS 2.12'>, 'OperatingSystemCPEName': <'cpe:2.3:o:home_assistant:hassos:2.12:*:production:*:*:*:rpi3:*'>, 'HomeURL': <'https://hass.io/'>},)"
# parse data
data = DBus.parse_gvariant(raw)
assert data[0] == {
"Hostname": "hassio",
"StaticHostname": "hassio",
"PrettyHostname": "",
"IconName": "computer-embedded",
"Chassis": "embedded",
"Deployment": "production",
"Location": "",
"KernelName": "Linux",
"KernelRelease": "4.14.98-v7",
"KernelVersion": "#1 SMP Sat May 11 02:17:06 UTC 2019",
"OperatingSystemPrettyName": "HassOS 2.12",
"OperatingSystemCPEName": "cpe:2.3:o:home_assistant:hassos:2.12:*:production:*:*:*:rpi3:*",
"HomeURL": "https://hass.io/",
}
def test_systemd_unitlist_simple():
"""Test Systemd Unit list simple."""
raw = "([('systemd-remount-fs.service', 'Remount Root and Kernel File Systems', 'loaded', 'active', 'exited', '', objectpath '/org/freedesktop/systemd1/unit/systemd_2dremount_2dfs_2eservice', uint32 0, '', objectpath '/'), ('sys-subsystem-net-devices-veth5714b4e.device', '/sys/subsystem/net/devices/veth5714b4e', 'loaded', 'active', 'plugged', '', '/org/freedesktop/systemd1/unit/sys_2dsubsystem_2dnet_2ddevices_2dveth5714b4e_2edevice', 0, '', '/'), ('rauc.service', 'Rauc Update Service', 'loaded', 'active', 'running', '', '/org/freedesktop/systemd1/unit/rauc_2eservice', 0, '', '/'), ('mnt-data-docker-overlay2-7493c48dd99ab0e68420e3317d93711630dd55a76d4f2a21863a220031203ac2-merged.mount', '/mnt/data/docker/overlay2/7493c48dd99ab0e68420e3317d93711630dd55a76d4f2a21863a220031203ac2/merged', 'loaded', 'active', 'mounted', '', '/org/freedesktop/systemd1/unit/mnt_2ddata_2ddocker_2doverlay2_2d7493c48dd99ab0e68420e3317d93711630dd55a76d4f2a21863a220031203ac2_2dmerged_2emount', 0, '', '/'), ('hassos-hardware.target', 'HassOS hardware targets', 'loaded', 'active', 'active', '', '/org/freedesktop/systemd1/unit/hassos_2dhardware_2etarget', 0, '', '/'), ('dev-zram1.device', '/dev/zram1', 'loaded', 'active', 'plugged', 'sys-devices-virtual-block-zram1.device', '/org/freedesktop/systemd1/unit/dev_2dzram1_2edevice', 0, '', '/'), ('sys-subsystem-net-devices-hassio.device', '/sys/subsystem/net/devices/hassio', 'loaded', 'active', 'plugged', '', '/org/freedesktop/systemd1/unit/sys_2dsubsystem_2dnet_2ddevices_2dhassio_2edevice', 0, '', '/'), ('cryptsetup.target', 'cryptsetup.target', 'not-found', 'inactive', 'dead', '', '/org/freedesktop/systemd1/unit/cryptsetup_2etarget', 0, '', '/'), ('sys-devices-virtual-net-vethd256dfa.device', '/sys/devices/virtual/net/vethd256dfa', 'loaded', 'active', 'plugged', '', '/org/freedesktop/systemd1/unit/sys_2ddevices_2dvirtual_2dnet_2dvethd256dfa_2edevice', 0, '', '/'), ('network-pre.target', 'Network (Pre)', 'loaded', 'inactive', 'dead', '', '/org/freedesktop/systemd1/unit/network_2dpre_2etarget', 0, '', '/'), ('sys-devices-virtual-net-veth5714b4e.device', '/sys/devices/virtual/net/veth5714b4e', 'loaded', 'active', 'plugged', '', '/org/freedesktop/systemd1/unit/sys_2ddevices_2dvirtual_2dnet_2dveth5714b4e_2edevice', 0, '', '/'), ('sys-kernel-debug.mount', 'Kernel Debug File System', 'loaded', 'active', 'mounted', '', '/org/freedesktop/systemd1/unit/sys_2dkernel_2ddebug_2emount', 0, '', '/'), ('slices.target', 'Slices', 'loaded', 'active', 'active', '', '/org/freedesktop/systemd1/unit/slices_2etarget', 0, '', '/'), ('etc-NetworkManager-system\x2dconnections.mount', 'NetworkManager persistent system connections', 'loaded', 'active', 'mounted', '', '/org/freedesktop/systemd1/unit/etc_2dNetworkManager_2dsystem_5cx2dconnections_2emount', 0, '', '/'), ('run-docker-netns-26ede3178729.mount', '/run/docker/netns/26ede3178729', 'loaded', 'active', 'mounted', '', '/org/freedesktop/systemd1/unit/run_2ddocker_2dnetns_2d26ede3178729_2emount', 0, '', '/'), ('dev-disk-by\x2dpath-platform\x2d3f202000.mmc\x2dpart2.device', '/dev/disk/by-path/platform-3f202000.mmc-part2', 'loaded', 'active', 'plugged', 'sys-devices-platform-soc-3f202000.mmc-mmc_host-mmc0-mmc0:e624-block-mmcblk0-mmcblk0p2.device', '/org/freedesktop/systemd1/unit/dev_2ddisk_2dby_5cx2dpath_2dplatform_5cx2d3f202000_2emmc_5cx2dpart2_2edevice', 0, '', '/')],)"
# parse data
data = DBus.parse_gvariant(raw)
assert data == [
[
[
"systemd-remount-fs.service",
"Remount Root and Kernel File Systems",
"loaded",
"active",
"exited",
"",
"/org/freedesktop/systemd1/unit/systemd_2dremount_2dfs_2eservice",
0,
"",
"/",
],
[
"sys-subsystem-net-devices-veth5714b4e.device",
"/sys/subsystem/net/devices/veth5714b4e",
"loaded",
"active",
"plugged",
"",
"/org/freedesktop/systemd1/unit/sys_2dsubsystem_2dnet_2ddevices_2dveth5714b4e_2edevice",
0,
"",
"/",
],
[
"rauc.service",
"Rauc Update Service",
"loaded",
"active",
"running",
"",
"/org/freedesktop/systemd1/unit/rauc_2eservice",
0,
"",
"/",
],
[
"mnt-data-docker-overlay2-7493c48dd99ab0e68420e3317d93711630dd55a76d4f2a21863a220031203ac2-merged.mount",
"/mnt/data/docker/overlay2/7493c48dd99ab0e68420e3317d93711630dd55a76d4f2a21863a220031203ac2/merged",
"loaded",
"active",
"mounted",
"",
"/org/freedesktop/systemd1/unit/mnt_2ddata_2ddocker_2doverlay2_2d7493c48dd99ab0e68420e3317d93711630dd55a76d4f2a21863a220031203ac2_2dmerged_2emount",
0,
"",
"/",
],
[
"hassos-hardware.target",
"HassOS hardware targets",
"loaded",
"active",
"active",
"",
"/org/freedesktop/systemd1/unit/hassos_2dhardware_2etarget",
0,
"",
"/",
],
[
"dev-zram1.device",
"/dev/zram1",
"loaded",
"active",
"plugged",
"sys-devices-virtual-block-zram1.device",
"/org/freedesktop/systemd1/unit/dev_2dzram1_2edevice",
0,
"",
"/",
],
[
"sys-subsystem-net-devices-hassio.device",
"/sys/subsystem/net/devices/hassio",
"loaded",
"active",
"plugged",
"",
"/org/freedesktop/systemd1/unit/sys_2dsubsystem_2dnet_2ddevices_2dhassio_2edevice",
0,
"",
"/",
],
[
"cryptsetup.target",
"cryptsetup.target",
"not-found",
"inactive",
"dead",
"",
"/org/freedesktop/systemd1/unit/cryptsetup_2etarget",
0,
"",
"/",
],
[
"sys-devices-virtual-net-vethd256dfa.device",
"/sys/devices/virtual/net/vethd256dfa",
"loaded",
"active",
"plugged",
"",
"/org/freedesktop/systemd1/unit/sys_2ddevices_2dvirtual_2dnet_2dvethd256dfa_2edevice",
0,
"",
"/",
],
[
"network-pre.target",
"Network (Pre)",
"loaded",
"inactive",
"dead",
"",
"/org/freedesktop/systemd1/unit/network_2dpre_2etarget",
0,
"",
"/",
],
[
"sys-devices-virtual-net-veth5714b4e.device",
"/sys/devices/virtual/net/veth5714b4e",
"loaded",
"active",
"plugged",
"",
"/org/freedesktop/systemd1/unit/sys_2ddevices_2dvirtual_2dnet_2dveth5714b4e_2edevice",
0,
"",
"/",
],
[
"sys-kernel-debug.mount",
"Kernel Debug File System",
"loaded",
"active",
"mounted",
"",
"/org/freedesktop/systemd1/unit/sys_2dkernel_2ddebug_2emount",
0,
"",
"/",
],
[
"slices.target",
"Slices",
"loaded",
"active",
"active",
"",
"/org/freedesktop/systemd1/unit/slices_2etarget",
0,
"",
"/",
],
[
"etc-NetworkManager-system-connections.mount",
"NetworkManager persistent system connections",
"loaded",
"active",
"mounted",
"",
"/org/freedesktop/systemd1/unit/etc_2dNetworkManager_2dsystem_5cx2dconnections_2emount",
0,
"",
"/",
],
[
"run-docker-netns-26ede3178729.mount",
"/run/docker/netns/26ede3178729",
"loaded",
"active",
"mounted",
"",
"/org/freedesktop/systemd1/unit/run_2ddocker_2dnetns_2d26ede3178729_2emount",
0,
"",
"/",
],
[
"dev-disk-by-path-platform-3f202000.mmc-part2.device",
"/dev/disk/by-path/platform-3f202000.mmc-part2",
"loaded",
"active",
"plugged",
"sys-devices-platform-soc-3f202000.mmc-mmc_host-mmc0-mmc0:e624-block-mmcblk0-mmcblk0p2.device",
"/org/freedesktop/systemd1/unit/dev_2ddisk_2dby_5cx2dpath_2dplatform_5cx2d3f202000_2emmc_5cx2dpart2_2edevice",
0,
"",
"/",
],
]
]
def test_systemd_unitlist_complex():
"""Test Systemd Unit list simple."""
raw = "([('systemd-remount-fs.service', 'Remount Root and \"Kernel File Systems\"', 'loaded', 'active', 'exited', '', objectpath '/org/freedesktop/systemd1/unit/systemd_2dremount_2dfs_2eservice', uint32 0, '', objectpath '/'), ('sys-subsystem-net-devices-veth5714b4e.device', '/sys/subsystem/net/devices/veth5714b4e for \" is', 'loaded', 'active', 'plugged', '', '/org/freedesktop/systemd1/unit/sys_2dsubsystem_2dnet_2ddevices_2dveth5714b4e_2edevice', 0, '', '/')],)"
# parse data
data = DBus.parse_gvariant(raw)
assert data == [
[
[
"systemd-remount-fs.service",
'Remount Root and "Kernel File Systems"',
"loaded",
"active",
"exited",
"",
"/org/freedesktop/systemd1/unit/systemd_2dremount_2dfs_2eservice",
0,
"",
"/",
],
[
"sys-subsystem-net-devices-veth5714b4e.device",
'/sys/subsystem/net/devices/veth5714b4e for " is',
"loaded",
"active",
"plugged",
"",
"/org/freedesktop/systemd1/unit/sys_2dsubsystem_2dnet_2ddevices_2dveth5714b4e_2edevice",
0,
"",
"/",
],
]
]
def test_networkmanager_dns_properties():
"""Test NetworkManager DNS properties."""
raw = "({'Mode': <'default'>, 'RcManager': <'file'>, 'Configuration': <[{'nameservers': <['192.168.23.30']>, 'domains': <['syshack.local']>, 'interface': <'eth0'>, 'priority': <100>, 'vpn': <false>}]>},)"
# parse data
data = DBus.parse_gvariant(raw)
assert data == [
{
"Mode": "default",
"RcManager": "file",
"Configuration": [
{
"nameservers": ["192.168.23.30"],
"domains": ["syshack.local"],
"interface": "eth0",
"priority": 100,
"vpn": False,
}
],
}
]
def test_networkmanager_dns_properties_empty():
"""Test NetworkManager DNS properties."""
raw = "({'Mode': <'default'>, 'RcManager': <'resolvconf'>, 'Configuration': <@aa{sv} []>},)"
# parse data
data = DBus.parse_gvariant(raw)
assert data == [{"Mode": "default", "RcManager": "resolvconf", "Configuration": []}]

View File

@@ -0,0 +1,61 @@
"""Test Tarfile functions."""
import attr
import pytest
from hassio.utils.tar import secure_path, exclude_filter
@attr.s
class TarInfo:
"""Fake TarInfo"""
name: str = attr.ib()
def test_secure_path():
"""Test Secure Path."""
test_list = [
TarInfo("test.txt"),
TarInfo("data/xy.blob"),
TarInfo("bla/blu/ble"),
TarInfo("data/../xy.blob"),
]
assert test_list == list(secure_path(test_list))
def test_not_secure_path():
"""Test Not secure path."""
test_list = [
TarInfo("/test.txt"),
TarInfo("data/../../xy.blob"),
TarInfo("/bla/blu/ble"),
]
assert [] == list(secure_path(test_list))
def test_exclude_filter_good():
"""Test exclude filter."""
filter_funct = exclude_filter(["not/match", "/dev/xy"])
test_list = [
TarInfo("test.txt"),
TarInfo("data/xy.blob"),
TarInfo("bla/blu/ble"),
TarInfo("data/../xy.blob"),
]
assert test_list == [filter_funct(result) for result in test_list]
def test_exclude_filter_bad():
"""Test exclude filter."""
filter_funct = exclude_filter(["*.txt", "data/*", "bla/blu/ble"])
test_list = [
TarInfo("test.txt"),
TarInfo("data/xy.blob"),
TarInfo("bla/blu/ble"),
TarInfo("data/test_files/kk.txt"),
]
for info in [filter_funct(result) for result in test_list]:
assert info is None