Compare commits

45 Commits
146 ... 153

Author SHA1 Message Date
Pascal Vizeli
82b2f66920 Merge pull request #1007 from home-assistant/dev
Release 153
2019-04-07 00:21:54 +02:00
Pascal Vizeli
01da42e1b6 Add add-ons reload for manager (#1006) 2019-04-07 00:15:33 +02:00
Pascal Vizeli
d652d22547 Panel ingress fix (#1004) 2019-04-06 23:03:54 +02:00
Pascal Vizeli
baea84abe6 Panel ingress support (#999) 2019-04-06 12:03:26 +02:00
Pascal Vizeli
c2d705a42a Fix ingress_url with not installed add-ons (#998)
Fix ingress_url with not installed add-ons
2019-04-06 11:24:04 +02:00
Pascal Vizeli
f10b433e1f Fix token handling with new secrets (#996)
* Fix token handling with new secrets

* add schema also to ingress
2019-04-05 17:49:43 +02:00
Pascal Vizeli
67f562a846 Init ingress session boarder / lookup (#995)
* Init ingress session boarder / lookup

* Add session to API

* Add cokkie validate

* Do it without event bus

* Add logger

* Fix validation

* Add tests

* Update tests

* Mock json storage
2019-04-05 17:36:07 +02:00
Pascal Vizeli
1edec61133 Add Ingress support (#991)
* Add Ingress support to supervisor

* Update security

* cleanup add-on extraction

* update description

* fix header part

* fix

* Fix header check

* fix tox

* Migrate docker interface typing

* Update home assistant to new docker

* Migrate supervisor

* Fix host add-on problem

* Update hassos

* Update API

* Expose data to API

* Check on API ingress support

* Add ingress URL

* Some cleanups

* debug

* disable uvloop

* Fix issue

* test

* Fix bug

* Fix flow

* Fix interface

* Fix network

* Fix metadata

* cleanups

* Fix exception

* Migrate to token system

* Fix webui

* Fix update

* Fix relaod

* Update log messages

* Attach ingress url only if enabled

* Cleanup ingress url handling

* Ingress update

* Support check version

* Fix raise error

* Migrate default port

* Fix junks

* search error

* Fix content filter

* Add debug

* Update log

* Update flags

* Update documentation

* Cleanup debugs

* Fix lint

* change default port to 8099

* Fix lint

* fix lint
2019-04-05 12:13:44 +02:00
Stephen Beechen
c13a33bf71 Downgrade add-on API access logging to debug (#992)
resolves home-assistant/hassio#865
2019-04-05 11:42:31 +02:00
Pascal Vizeli
2ae93ae7b1 Update API data for deconz (#989)
* Update API data for deconz

* Fix tests
2019-04-01 16:58:58 +02:00
Pascal Vizeli
8451020afe Update uvloop 0.12.2 (#988) 2019-04-01 14:02:28 +02:00
Pascal Vizeli
a48e568efc Update azure-pipelines.yml for Azure Pipelines [skip ci] 2019-04-01 14:01:59 +02:00
Pascal Vizeli
dee2808cb5 Delete main.workflow 2019-04-01 13:57:42 +02:00
Pascal Vizeli
06a2ab26a2 Update azure-pipelines.yml for Azure Pipelines [skip ci] 2019-04-01 13:56:02 +02:00
Pascal Vizeli
45de0f2f39 Update azure-pipelines.yml for Azure Pipelines [skip ci] 2019-04-01 13:05:23 +02:00
Pascal Vizeli
bac5f704dc Update azure-pipelines.yml for Azure Pipelines [skip ci] 2019-04-01 13:01:37 +02:00
Pascal Vizeli
79669a5d04 Set up CI with Azure Pipelines [skip ci] 2019-04-01 12:49:59 +02:00
Pascal Vizeli
a6e712c9ea Bump version 153 2019-03-30 23:37:11 +01:00
Pascal Vizeli
069fe99699 Merge pull request #984 from home-assistant/dev
Release 152
2019-03-30 23:35:54 +01:00
Pascal Vizeli
4754f067ad Update panel for 0.91.0 (#983) 2019-03-30 23:30:46 +01:00
Pascal Vizeli
dce9818812 Bump version 152 2019-03-28 15:01:53 +01:00
Pascal Vizeli
d054b6dbb7 Merge pull request #979 from home-assistant/dev
Release 151
2019-03-28 15:01:23 +01:00
Pascal Vizeli
3093165325 Update cryptography (#981) 2019-03-28 14:37:06 +01:00
Pascal Vizeli
fd9c5bd412 Make arch required (#978) 2019-03-28 14:23:46 +01:00
Pascal Vizeli
9a8850fecd Remove unused pylint 2019-03-28 14:13:36 +01:00
Pascal Vizeli
b12175ab9a Support for deconz discovery & cleanup (#974)
* Support for deconz discovery & cleanup

* Split discovery

* Fix lint

* Fix lint / import
2019-03-28 14:11:18 +01:00
Pascal Vizeli
b52f90187b Make homeassistant container constant (#808)
* Make homeassistant container constant

* Update homeassistant.py

* Update homeassistant.py

* Update interface.py

* Update homeassistant.py

* Fix handling

* add start function

* Add typing

* Fix lint

* Add API call

* Update logs

* Fix some issue with watchdog

* Fix lint
2019-03-27 17:20:05 +01:00
Pascal Vizeli
4eb02f474d Bump version 151 2019-03-20 22:09:42 +01:00
Pascal Vizeli
dfdcddfd0b Merge pull request #968 from home-assistant/dev
Release 150
2019-03-20 22:08:17 +01:00
Pascal Vizeli
0391277bad Fix panel for 0.90.0 (#967) 2019-03-20 22:03:31 +01:00
Pascal Vizeli
73643b9bfe Bump version 150 2019-03-19 21:29:47 +01:00
Pascal Vizeli
93a52b8382 Merge pull request #965 from home-assistant/dev
Release 149
2019-03-19 21:26:38 +01:00
Pascal Vizeli
7a91bb1f6c Update panel for 0.90.0 v6 (#963) 2019-03-19 19:01:52 +01:00
Pascal Vizeli
26efa998a1 Revert dev link (#956) 2019-03-18 09:42:31 +01:00
Pascal Vizeli
fc9f3fee0a Fix 2019-03-18 09:20:48 +01:00
cadwal
ec19bd570b Include serial device node links in container device mapping to allow for persistent names in the HA serial config (#944) 2019-03-18 09:05:04 +01:00
David McNett
3335bad9e1 Correct typo: 'ignore' -> 'ignored' (#947) 2019-03-18 09:02:29 +01:00
Pascal Vizeli
71ae334e24 Update pylint (#945) 2019-03-11 14:03:28 +01:00
Pascal Vizeli
0807651fbd Bump version 149 2019-03-08 11:59:41 +01:00
Pascal Vizeli
7026d42d77 Merge pull request #942 from home-assistant/dev
Release 148
2019-03-08 11:41:58 +01:00
Pascal Vizeli
31047b9ec2 Down or upgrade exists image on restore (#941) 2019-03-08 11:36:36 +01:00
Pascal Vizeli
714791de8f Bump version 148 2019-03-07 21:12:54 +01:00
Pascal Vizeli
c544fff2b2 Merge pull request #939 from home-assistant/dev
Release 147
2019-03-07 21:12:20 +01:00
Pascal Vizeli
fc45670686 Fix bug with update (#938) 2019-03-07 21:09:43 +01:00
Pascal Vizeli
5cefa0a2ee Bump version 147 2019-03-07 16:28:39 +01:00
105 changed files with 2673 additions and 2475 deletions

.github/main.workflow (vendored, 16 lines deleted)

@@ -1,16 +0,0 @@
workflow "tox" {
on = "push"
resolves = [
"Python 3.7",
"Json Files",
]
}
action "Python 3.7" {
uses = "home-assistant/actions/py37-tox@master"
}
action "Json Files" {
uses = "home-assistant/actions/jq@master"
args = "**/*.json"
}

API.md (27 lines changed)

@@ -41,6 +41,7 @@ The addons from `addons` are only installed one.
"arch": "armhf|aarch64|i386|amd64",
"channel": "stable|beta|dev",
"timezone": "TIMEZONE",
"ip_address": "ip address",
"wait_boot": "int",
"addons": [
{
@@ -348,6 +349,7 @@ Load host configs from a USB stick.
"last_version": "LAST_VERSION",
"arch": "arch",
"machine": "Image machine type",
"ip_address": "ip address",
"image": "str",
"custom": "bool -> if custom image",
"boot": "bool",
@@ -376,6 +378,7 @@ Output is the raw Docker log.
- POST `/homeassistant/check`
- POST `/homeassistant/start`
- POST `/homeassistant/stop`
- POST `/homeassistant/rebuild`
- POST `/homeassistant/options`
@@ -468,6 +471,7 @@ Get all available addons.
"available": "bool",
"arch": ["armhf", "aarch64", "i386", "amd64"],
"machine": "[raspberrypi2, tinker]",
"homeassistant": "null|min Home Assistant version",
"repository": "12345678|null",
"version": "null|VERSION_INSTALLED",
"last_version": "LAST_VERSION",
@@ -504,7 +508,11 @@ Get all available addons.
"audio_input": "null|0,0",
"audio_output": "null|0,0",
"services_role": "['service:access']",
"discovery": "['service']"
"discovery": "['service']",
"ip_address": "ip address",
"ingress": "bool",
"ingress_entry": "null|/api/hassio_ingress/slug",
"ingress_url": "null|/api/hassio_ingress/slug/entry.html"
}
```
@@ -578,6 +586,23 @@ Write data to add-on stdin
}
```
### ingress
- POST `/ingress/session`
Create a new session for access to the ingress service.
```json
{
"session": "token"
}
```
- VIEW `/ingress/{token}`
Ingress WebUI for this add-on. The add-on needs to support HASS Auth!
Requires the ingress session as a cookie.
### discovery
- GET `/discovery`
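
For orientation, the two ingress endpoints documented above can be exercised roughly as follows. This is a minimal sketch, not part of the change: it assumes the Supervisor API is reachable at `http://hassio`, that the caller is authorized as Home Assistant (both `create_session` and the ingress view reject other callers), that the session cookie is named `ingress_session`, and that the add-on's ingress token is already known. Those names are assumptions for illustration only.

```python
"""Minimal client sketch for the new ingress endpoints (illustrative only)."""
import asyncio
import os

import aiohttp

SUPERVISOR = "http://hassio"  # assumed Supervisor base URL
TOKEN = os.environ.get("HASSIO_TOKEN", "")  # assumed way to authorize the caller


async def open_ingress(ingress_token: str) -> str:
    """Create an ingress session, then fetch the add-on's ingress entry page."""
    headers = {"Authorization": f"Bearer {TOKEN}"}
    async with aiohttp.ClientSession(headers=headers) as http:
        # POST /ingress/session answers with {"session": "..."} (possibly wrapped
        # in the usual {"result": ..., "data": ...} envelope).
        async with http.post(f"{SUPERVISOR}/ingress/session") as resp:
            body = await resp.json()
            session = body.get("data", body)["session"]

        # The ingress view validates the session from a cookie (COOKIE_INGRESS in
        # the diff); the cookie name "ingress_session" is assumed here.
        cookies = {"ingress_session": session}
        async with http.get(
            f"{SUPERVISOR}/ingress/{ingress_token}/", cookies=cookies
        ) as resp:
            return await resp.text()


if __name__ == "__main__":
    # Placeholder ingress token of an installed add-on.
    print(asyncio.run(open_ingress("ADDON_INGRESS_TOKEN")))
```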

azure-pipelines.yml (new file, 45 lines)

@@ -0,0 +1,45 @@
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/python
trigger:
- master
- dev
pr:
- dev
jobs:
- job: "Tox"
pool:
vmImage: 'ubuntu-16.04'
steps:
- task: UsePythonVersion@0
displayName: 'Use Python $(python.version)'
inputs:
versionSpec: '3.7'
- script: pip install tox
displayName: 'Install Tox'
- script: tox
displayName: 'Run Tox'
- job: "JQ"
pool:
vmImage: 'ubuntu-16.04'
steps:
- script: sudo apt-get install -y jq
displayName: 'Install JQ'
- bash: |
shopt -s globstar
cat **/*.json | jq '.'
displayName: 'Run JQ'

@@ -13,7 +13,8 @@ def initialize_event_loop():
"""Attempt to use uvloop."""
try:
import uvloop
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
uvloop.install()
except ImportError:
pass

@@ -1,41 +1,105 @@
"""Init file for Hass.io add-ons."""
from contextlib import suppress
from copy import deepcopy
from distutils.version import StrictVersion
from ipaddress import IPv4Address, ip_address
import logging
from pathlib import Path, PurePath
import re
import secrets
import shutil
import tarfile
from tempfile import TemporaryDirectory
from typing import Dict, Any
from typing import Any, Awaitable, Dict, Optional
import voluptuous as vol
from voluptuous.humanize import humanize_error
from ..const import (
ATTR_ACCESS_TOKEN, ATTR_APPARMOR, ATTR_ARCH, ATTR_AUDIO, ATTR_AUDIO_INPUT,
ATTR_AUDIO_OUTPUT, ATTR_AUTH_API, ATTR_AUTO_UART, ATTR_AUTO_UPDATE,
ATTR_BOOT, ATTR_DESCRIPTON, ATTR_DEVICES, ATTR_DEVICETREE, ATTR_DISCOVERY,
ATTR_DOCKER_API, ATTR_ENVIRONMENT, ATTR_FULL_ACCESS, ATTR_GPIO,
ATTR_HASSIO_API, ATTR_HASSIO_ROLE, ATTR_HOMEASSISTANT_API, ATTR_HOST_DBUS,
ATTR_HOST_IPC, ATTR_HOST_NETWORK, ATTR_HOST_PID, ATTR_IMAGE,
ATTR_KERNEL_MODULES, ATTR_LEGACY, ATTR_LOCATON, ATTR_MACHINE, ATTR_MAP,
ATTR_NAME, ATTR_NETWORK, ATTR_OPTIONS, ATTR_PORTS, ATTR_PRIVILEGED,
ATTR_PROTECTED, ATTR_REPOSITORY, ATTR_SCHEMA, ATTR_SERVICES, ATTR_SLUG,
ATTR_STARTUP, ATTR_STATE, ATTR_STDIN, ATTR_SYSTEM, ATTR_TIMEOUT,
ATTR_TMPFS, ATTR_URL, ATTR_USER, ATTR_UUID, ATTR_VERSION, ATTR_WEBUI,
SECURITY_DEFAULT, SECURITY_DISABLE, SECURITY_PROFILE, STATE_NONE,
STATE_STARTED, STATE_STOPPED)
from ..coresys import CoreSysAttributes
ATTR_ACCESS_TOKEN,
ATTR_APPARMOR,
ATTR_ARCH,
ATTR_AUDIO,
ATTR_AUDIO_INPUT,
ATTR_AUDIO_OUTPUT,
ATTR_AUTH_API,
ATTR_AUTO_UART,
ATTR_AUTO_UPDATE,
ATTR_BOOT,
ATTR_DESCRIPTON,
ATTR_DEVICES,
ATTR_DEVICETREE,
ATTR_DISCOVERY,
ATTR_DOCKER_API,
ATTR_ENVIRONMENT,
ATTR_FULL_ACCESS,
ATTR_GPIO,
ATTR_HASSIO_API,
ATTR_HASSIO_ROLE,
ATTR_HOMEASSISTANT,
ATTR_HOMEASSISTANT_API,
ATTR_HOST_DBUS,
ATTR_HOST_IPC,
ATTR_HOST_NETWORK,
ATTR_HOST_PID,
ATTR_IMAGE,
ATTR_INGRESS,
ATTR_INGRESS_ENTRY,
ATTR_INGRESS_PORT,
ATTR_INGRESS_TOKEN,
ATTR_KERNEL_MODULES,
ATTR_LEGACY,
ATTR_LOCATON,
ATTR_MACHINE,
ATTR_MAP,
ATTR_NAME,
ATTR_NETWORK,
ATTR_OPTIONS,
ATTR_PORTS,
ATTR_PRIVILEGED,
ATTR_PROTECTED,
ATTR_REPOSITORY,
ATTR_SCHEMA,
ATTR_SERVICES,
ATTR_SLUG,
ATTR_STARTUP,
ATTR_STATE,
ATTR_STDIN,
ATTR_SYSTEM,
ATTR_TIMEOUT,
ATTR_TMPFS,
ATTR_URL,
ATTR_USER,
ATTR_UUID,
ATTR_VERSION,
ATTR_WEBUI,
SECURITY_DEFAULT,
SECURITY_DISABLE,
SECURITY_PROFILE,
STATE_NONE,
STATE_STARTED,
STATE_STOPPED,
)
from ..coresys import CoreSys, CoreSysAttributes
from ..docker.addon import DockerAddon
from ..exceptions import HostAppArmorError, JsonFileError
from ..utils import create_token
from ..docker.stats import DockerStats
from ..exceptions import (
AddonsError,
AddonsNotSupportedError,
DockerAPIError,
HostAppArmorError,
JsonFileError,
)
from ..utils.apparmor import adjust_profile
from ..utils.json import read_json_file, write_json_file
from .utils import check_installed, remove_data
from .validate import (
MACHINE_ALL, RE_SERVICE, RE_VOLUME, SCHEMA_ADDON_SNAPSHOT,
validate_options)
MACHINE_ALL,
RE_SERVICE,
RE_VOLUME,
SCHEMA_ADDON_SNAPSHOT,
validate_options,
)
_LOGGER = logging.getLogger(__name__)
@@ -47,21 +111,28 @@ RE_WEBUI = re.compile(
class Addon(CoreSysAttributes):
"""Hold data for add-on inside Hass.io."""
def __init__(self, coresys, slug):
def __init__(self, coresys: CoreSys, slug: str):
"""Initialize data holder."""
self.coresys = coresys
self.instance = DockerAddon(coresys, slug)
self.coresys: CoreSys = coresys
self.instance: DockerAddon = DockerAddon(coresys, slug)
self._id: str = slug
self._id = slug
async def load(self):
async def load(self) -> None:
"""Async initialize of object."""
if not self.is_installed:
return
with suppress(DockerAPIError):
await self.instance.attach()
@property
def slug(self):
def ip_address(self) -> IPv4Address:
"""Return IP of Add-on instance."""
if not self.is_installed:
return ip_address("0.0.0.0")
return self.instance.ip_address
@property
def slug(self) -> str:
"""Return slug/id of add-on."""
return self._id
@@ -76,30 +147,41 @@ class Addon(CoreSysAttributes):
return self.sys_addons.data
@property
def is_installed(self):
def is_installed(self) -> bool:
"""Return True if an add-on is installed."""
return self._id in self._data.system
@property
def is_detached(self):
def is_detached(self) -> bool:
"""Return True if add-on is detached."""
return self._id not in self._data.cache
@property
def available(self):
def available(self) -> bool:
"""Return True if this add-on is available on this platform."""
if self.is_detached:
addon_data = self._data.system.get(self._id)
else:
addon_data = self._data.cache.get(self._id)
# Architecture
if not self.sys_arch.is_supported(self.supported_arch):
if not self.sys_arch.is_supported(addon_data[ATTR_ARCH]):
return False
# Machine / Hardware
if self.sys_machine not in self.supported_machine:
machine = addon_data.get(ATTR_MACHINE) or MACHINE_ALL
if self.sys_machine not in machine:
return False
# Home Assistant
version = addon_data.get(ATTR_HOMEASSISTANT) or self.sys_homeassistant.version
if StrictVersion(self.sys_homeassistant.version) < StrictVersion(version):
return False
return True
@property
def version_installed(self):
def version_installed(self) -> Optional[str]:
"""Return installed version."""
return self._data.user.get(self._id, {}).get(ATTR_VERSION)
@@ -122,7 +204,10 @@ class Addon(CoreSysAttributes):
def _set_update(self, image: str, version: str) -> None:
"""Update version of add-on."""
self._data.system[self._id] = deepcopy(self._data.cache[self._id])
self._data.user[self._id][ATTR_VERSION] = version
self._data.user[self._id].update({
ATTR_VERSION: version,
ATTR_IMAGE: image,
})
self.save_data()
def _restore_data(self, user: Dict[str, Any], system: Dict[str, Any], image: str) -> None:
@@ -199,6 +284,20 @@ class Addon(CoreSysAttributes):
return self._data.user[self._id].get(ATTR_ACCESS_TOKEN)
return None
@property
def ingress_token(self):
"""Return access token for Hass.io API."""
if self.is_installed:
return self._data.user[self._id].get(ATTR_INGRESS_TOKEN)
return None
@property
def ingress_entry(self):
"""Return ingress external URL."""
if self.is_installed and self.with_ingress:
return f"/api/hassio_ingress/{self.ingress_token}"
return None
@property
def description(self):
"""Return description of add-on."""
@@ -289,6 +388,17 @@ class Addon(CoreSysAttributes):
self._data.user[self._id][ATTR_NETWORK] = new_ports
@property
def ingress_url(self):
"""Return URL to ingress url."""
if not self.is_installed or not self.with_ingress:
return None
webui = f"/api/hassio_ingress/{self.ingress_token}/"
if ATTR_INGRESS_ENTRY in self._mesh:
return f"{webui}{self._mesh[ATTR_INGRESS_ENTRY]}"
return webui
@property
def webui(self):
"""Return URL to webui or None."""
@@ -320,6 +430,11 @@ class Addon(CoreSysAttributes):
return f"{proto}://[HOST]:{port}{s_suffix}"
@property
def ingress_internal(self):
"""Return Ingress host URL."""
return f"http://{self.ip_address}:{self._mesh[ATTR_INGRESS_PORT]}"
@property
def host_network(self):
"""Return True if add-on run on host network."""
@@ -404,6 +519,11 @@ class Addon(CoreSysAttributes):
"""Return True if the add-on access use stdin input."""
return self._mesh[ATTR_STDIN]
@property
def with_ingress(self):
"""Return True if the add-on access support ingress."""
return self._mesh[ATTR_INGRESS]
@property
def with_gpio(self):
"""Return True if the add-on access to GPIO interface."""
@@ -434,6 +554,11 @@ class Addon(CoreSysAttributes):
"""Return True if the add-on access to audio."""
return self._mesh[ATTR_AUDIO]
@property
def homeassistant_version(self) -> Optional[str]:
"""Return min Home Assistant version they needed by Add-on."""
return self._mesh.get(ATTR_HOMEASSISTANT)
@property
def audio_output(self):
"""Return ALSA config for output or None."""
@@ -639,7 +764,7 @@ class Addon(CoreSysAttributes):
return True
async def _install_apparmor(self):
async def _install_apparmor(self) -> None:
"""Install or Update AppArmor profile for Add-on."""
exists_local = self.sys_host.apparmor.exists(self.slug)
exists_addon = self.path_apparmor.exists()
@@ -661,7 +786,7 @@ class Addon(CoreSysAttributes):
await self.sys_host.apparmor.load_profile(self.slug, profile_file)
@property
def schema(self):
def schema(self) -> vol.Schema:
"""Create a schema for add-on options."""
raw_schema = self._mesh[ATTR_SCHEMA]
@@ -669,7 +794,7 @@ class Addon(CoreSysAttributes):
return vol.Schema(dict)
return vol.Schema(vol.All(dict, validate_options(raw_schema)))
def test_update_schema(self):
def test_update_schema(self) -> bool:
"""Check if the existing configuration is valid after update."""
if not self.is_installed or self.is_detached:
return True
@@ -699,17 +824,17 @@ class Addon(CoreSysAttributes):
return False
return True
async def install(self):
async def install(self) -> None:
"""Install an add-on."""
if not self.available:
_LOGGER.error(
"Add-on %s not supported on %s with %s architecture",
self._id, self.sys_machine, self.sys_arch.supported)
return False
raise AddonsNotSupportedError()
if self.is_installed:
_LOGGER.error("Add-on %s is already installed", self._id)
return False
_LOGGER.warning("Add-on %s is already installed", self._id)
return
if not self.path_data.is_dir():
_LOGGER.info(
@@ -719,18 +844,20 @@ class Addon(CoreSysAttributes):
# Setup/Fix AppArmor profile
await self._install_apparmor()
if not await self.instance.install(
self.last_version, self.image_next):
return False
try:
await self.instance.install(self.last_version, self.image_next)
except DockerAPIError:
raise AddonsError() from None
else:
self._set_install(self.image_next, self.last_version)
return True
@check_installed
async def uninstall(self):
async def uninstall(self) -> None:
"""Remove an add-on."""
if not await self.instance.remove():
return False
try:
await self.instance.remove()
except DockerAPIError:
raise AddonsError() from None
if self.path_data.is_dir():
_LOGGER.info(
@@ -747,13 +874,11 @@ class Addon(CoreSysAttributes):
with suppress(HostAppArmorError):
await self.sys_host.apparmor.remove_profile(self.slug)
# Remove discovery messages
# Cleanup internal data
self.remove_discovery()
self._set_uninstall()
return True
async def state(self):
async def state(self) -> str:
"""Return running state of add-on."""
if not self.is_installed:
return STATE_NONE
@@ -763,46 +888,57 @@ class Addon(CoreSysAttributes):
return STATE_STOPPED
@check_installed
async def start(self):
async def start(self) -> None:
"""Set options and start add-on."""
if await self.instance.is_running():
_LOGGER.warning("%s already running!", self.slug)
return
# Access Token
self._data.user[self._id][ATTR_ACCESS_TOKEN] = create_token()
self._data.user[self._id][ATTR_ACCESS_TOKEN] = secrets.token_hex(56)
self.save_data()
# Options
if not self.write_options():
return False
raise AddonsError()
# Sound
if self.with_audio and not self.write_asound():
return False
raise AddonsError()
return await self.instance.run()
try:
await self.instance.run()
except DockerAPIError:
raise AddonsError() from None
@check_installed
def stop(self):
"""Stop add-on.
Return a coroutine.
"""
return self.instance.stop()
async def stop(self) -> None:
"""Stop add-on."""
try:
return await self.instance.stop()
except DockerAPIError:
raise AddonsError() from None
@check_installed
async def update(self):
async def update(self) -> None:
"""Update add-on."""
last_state = await self.state()
if self.last_version == self.version_installed:
_LOGGER.warning("No update available for add-on %s", self._id)
return False
return
if not await self.instance.update(
self.last_version, self.image_next):
return False
# Check if available, Maybe something have changed
if not self.available:
_LOGGER.error(
"Add-on %s not supported on %s with %s architecture",
self._id, self.sys_machine, self.sys_arch.supported)
raise AddonsNotSupportedError()
# Update instance
last_state = await self.state()
try:
await self.instance.update(self.last_version, self.image_next)
except DockerAPIError:
raise AddonsError() from None
self._set_update(self.image_next, self.last_version)
# Setup/Fix AppArmor profile
@@ -811,16 +947,16 @@ class Addon(CoreSysAttributes):
# restore state
if last_state == STATE_STARTED:
await self.start()
return True
@check_installed
async def restart(self):
async def restart(self) -> None:
"""Restart add-on."""
with suppress(AddonsError):
await self.stop()
return await self.start()
await self.start()
@check_installed
def logs(self):
def logs(self) -> Awaitable[bytes]:
"""Return add-ons log output.
Return a coroutine.
@@ -828,33 +964,32 @@ class Addon(CoreSysAttributes):
return self.instance.logs()
@check_installed
def stats(self):
"""Return stats of container.
Return a coroutine.
"""
return self.instance.stats()
async def stats(self) -> DockerStats:
"""Return stats of container."""
try:
return await self.instance.stats()
except DockerAPIError:
raise AddonsError() from None
@check_installed
async def rebuild(self):
async def rebuild(self) -> None:
"""Perform a rebuild of local build add-on."""
last_state = await self.state()
if not self.need_build:
_LOGGER.error("Can't rebuild a none local build add-on!")
return False
raise AddonsNotSupportedError()
# remove docker container but not addon config
if not await self.instance.remove():
return False
if not await self.instance.install(self.version_installed):
return False
try:
await self.instance.remove()
await self.instance.install(self.version_installed)
except DockerAPIError:
raise AddonsError() from None
# restore state
if last_state == STATE_STARTED:
await self.start()
return True
@check_installed
async def write_stdin(self, data):
@@ -864,18 +999,23 @@ class Addon(CoreSysAttributes):
"""
if not self.with_stdin:
_LOGGER.error("Add-on don't support write to stdin!")
return False
raise AddonsNotSupportedError()
try:
return await self.instance.write_stdin(data)
except DockerAPIError:
raise AddonsError() from None
@check_installed
async def snapshot(self, tar_file):
async def snapshot(self, tar_file: tarfile.TarFile) -> None:
"""Snapshot state of an add-on."""
with TemporaryDirectory(dir=str(self.sys_config.path_tmp)) as temp:
# store local image
if self.need_build and not await \
self.instance.export_image(Path(temp, 'image.tar')):
return False
if self.need_build:
try:
await self.instance.export_image(Path(temp, 'image.tar'))
except DockerAPIError:
raise AddonsError() from None
data = {
ATTR_USER: self._data.user.get(self._id, {}),
@@ -889,7 +1029,7 @@ class Addon(CoreSysAttributes):
write_json_file(Path(temp, 'addon.json'), data)
except JsonFileError:
_LOGGER.error("Can't save meta for %s", self._id)
return False
raise AddonsError() from None
# Store AppArmor Profile
if self.sys_host.apparmor.exists(self.slug):
@@ -898,7 +1038,7 @@ class Addon(CoreSysAttributes):
self.sys_host.apparmor.backup_profile(self.slug, profile)
except HostAppArmorError:
_LOGGER.error("Can't backup AppArmor profile")
return False
raise AddonsError() from None
# write into tarfile
def _write_tarfile():
@@ -912,12 +1052,11 @@ class Addon(CoreSysAttributes):
await self.sys_run_in_executor(_write_tarfile)
except (tarfile.TarError, OSError) as err:
_LOGGER.error("Can't write tarfile %s: %s", tar_file, err)
return False
raise AddonsError() from None
_LOGGER.info("Finish snapshot for addon %s", self._id)
return True
async def restore(self, tar_file):
async def restore(self, tar_file: tarfile.TarFile) -> None:
"""Restore state of an add-on."""
with TemporaryDirectory(dir=str(self.sys_config.path_tmp)) as temp:
# extract snapshot
@@ -930,13 +1069,13 @@ class Addon(CoreSysAttributes):
await self.sys_run_in_executor(_extract_tarfile)
except tarfile.TarError as err:
_LOGGER.error("Can't read tarfile %s: %s", tar_file, err)
return False
raise AddonsError() from None
# Read snapshot data
try:
data = read_json_file(Path(temp, 'addon.json'))
except JsonFileError:
return False
raise AddonsError() from None
# Validate
try:
@@ -944,9 +1083,9 @@ class Addon(CoreSysAttributes):
except vol.Invalid as err:
_LOGGER.error("Can't validate %s, snapshot data: %s",
self._id, humanize_error(data, err))
return False
raise AddonsError() from None
# Restore data or reload add-on
# Restore local add-on informations
_LOGGER.info("Restore config for addon %s", self._id)
restore_image = self._get_image(data[ATTR_SYSTEM])
self._restore_data(data[ATTR_USER], data[ATTR_SYSTEM], restore_image)
@@ -954,15 +1093,22 @@ class Addon(CoreSysAttributes):
# Check version / restore image
version = data[ATTR_VERSION]
if not await self.instance.exists():
_LOGGER.info("Restore image for addon %s", self._id)
_LOGGER.info("Restore/Install image for addon %s", self._id)
image_file = Path(temp, 'image.tar')
if image_file.is_file():
with suppress(DockerAPIError):
await self.instance.import_image(image_file, version)
else:
if await self.instance.install(version, restore_image):
with suppress(DockerAPIError):
await self.instance.install(version, restore_image)
await self.instance.cleanup()
elif self.instance.version != version or self.legacy:
_LOGGER.info("Restore/Update image for addon %s", self._id)
with suppress(DockerAPIError):
await self.instance.update(version, restore_image)
else:
with suppress(DockerAPIError):
await self.instance.stop()
# Restore data
@@ -977,7 +1123,7 @@ class Addon(CoreSysAttributes):
await self.sys_run_in_executor(_restore_data)
except shutil.Error as err:
_LOGGER.error("Can't restore origin data: %s", err)
return False
raise AddonsError() from None
# Restore AppArmor
profile_file = Path(temp, 'apparmor.txt')
@@ -987,11 +1133,10 @@ class Addon(CoreSysAttributes):
self.slug, profile_file)
except HostAppArmorError:
_LOGGER.error("Can't restore AppArmor profile")
return False
raise AddonsError() from None
# Run add-on
if data[ATTR_STATE] == STATE_STARTED:
return await self.start()
_LOGGER.info("Finish restore for add-on %s", self._id)
return True

@@ -20,6 +20,7 @@ from ..const import (
SECURITY_DISABLE,
SECURITY_PROFILE,
)
from ..exceptions import AddonsNotSupportedError
if TYPE_CHECKING:
from .addon import Addon
@@ -107,7 +108,7 @@ def check_installed(method):
"""Return False if not installed or the function."""
if not addon.is_installed:
_LOGGER.error("Addon %s is not installed", addon.slug)
return False
raise AddonsNotSupportedError()
return await method(addon, *args, **kwargs)
return wrap_check

@@ -1,29 +1,87 @@
"""Validate add-ons options schema."""
import logging
import re
import secrets
import uuid
import voluptuous as vol
from ..const import (
ARCH_ALL, ATTR_ACCESS_TOKEN, ATTR_APPARMOR, ATTR_ARCH, ATTR_ARGS,
ATTR_AUDIO, ATTR_AUDIO_INPUT, ATTR_AUDIO_OUTPUT, ATTR_AUTH_API,
ATTR_AUTO_UART, ATTR_AUTO_UPDATE, ATTR_BOOT, ATTR_BUILD_FROM,
ATTR_DESCRIPTON, ATTR_DEVICES, ATTR_DEVICETREE, ATTR_DISCOVERY,
ATTR_DOCKER_API, ATTR_ENVIRONMENT, ATTR_FULL_ACCESS, ATTR_GPIO,
ATTR_HASSIO_API, ATTR_HASSIO_ROLE, ATTR_HOMEASSISTANT_API, ATTR_HOST_DBUS,
ATTR_HOST_IPC, ATTR_HOST_NETWORK, ATTR_HOST_PID, ATTR_IMAGE,
ATTR_KERNEL_MODULES, ATTR_LEGACY, ATTR_LOCATON, ATTR_MACHINE,
ATTR_MAINTAINER, ATTR_MAP, ATTR_NAME, ATTR_NETWORK, ATTR_OPTIONS,
ATTR_PORTS, ATTR_PRIVILEGED, ATTR_PROTECTED, ATTR_REPOSITORY, ATTR_SCHEMA,
ATTR_SERVICES, ATTR_SLUG, ATTR_SQUASH, ATTR_STARTUP, ATTR_STATE,
ATTR_STDIN, ATTR_SYSTEM, ATTR_TIMEOUT, ATTR_TMPFS, ATTR_URL, ATTR_USER,
ATTR_UUID, ATTR_VERSION, ATTR_WEBUI, BOOT_AUTO, BOOT_MANUAL,
PRIVILEGED_ALL, ROLE_ALL, ROLE_DEFAULT, STARTUP_ALL, STARTUP_APPLICATION,
STARTUP_SERVICES, STATE_STARTED, STATE_STOPPED)
from ..services.validate import DISCOVERY_SERVICES
from ..validate import (
ALSA_DEVICE, DOCKER_PORTS, NETWORK_PORT, SHA256, UUID_MATCH)
ARCH_ALL,
ATTR_ACCESS_TOKEN,
ATTR_APPARMOR,
ATTR_ARCH,
ATTR_ARGS,
ATTR_AUDIO,
ATTR_AUDIO_INPUT,
ATTR_AUDIO_OUTPUT,
ATTR_AUTH_API,
ATTR_AUTO_UART,
ATTR_AUTO_UPDATE,
ATTR_BOOT,
ATTR_BUILD_FROM,
ATTR_DESCRIPTON,
ATTR_DEVICES,
ATTR_DEVICETREE,
ATTR_DISCOVERY,
ATTR_DOCKER_API,
ATTR_ENVIRONMENT,
ATTR_FULL_ACCESS,
ATTR_GPIO,
ATTR_HASSIO_API,
ATTR_HASSIO_ROLE,
ATTR_HOMEASSISTANT_API,
ATTR_HOMEASSISTANT,
ATTR_HOST_DBUS,
ATTR_HOST_IPC,
ATTR_HOST_NETWORK,
ATTR_HOST_PID,
ATTR_IMAGE,
ATTR_INGRESS,
ATTR_INGRESS_ENTRY,
ATTR_INGRESS_PORT,
ATTR_INGRESS_TOKEN,
ATTR_KERNEL_MODULES,
ATTR_LEGACY,
ATTR_LOCATON,
ATTR_MACHINE,
ATTR_MAINTAINER,
ATTR_MAP,
ATTR_NAME,
ATTR_NETWORK,
ATTR_OPTIONS,
ATTR_PORTS,
ATTR_PRIVILEGED,
ATTR_PROTECTED,
ATTR_REPOSITORY,
ATTR_SCHEMA,
ATTR_SERVICES,
ATTR_SLUG,
ATTR_SQUASH,
ATTR_STARTUP,
ATTR_STATE,
ATTR_STDIN,
ATTR_SYSTEM,
ATTR_TIMEOUT,
ATTR_TMPFS,
ATTR_URL,
ATTR_USER,
ATTR_UUID,
ATTR_VERSION,
ATTR_WEBUI,
BOOT_AUTO,
BOOT_MANUAL,
PRIVILEGED_ALL,
ROLE_ALL,
ROLE_DEFAULT,
STARTUP_ALL,
STARTUP_APPLICATION,
STARTUP_SERVICES,
STATE_STARTED,
STATE_STOPPED,
)
from ..discovery.validate import valid_discovery_service
from ..validate import ALSA_DEVICE, DOCKER_PORTS, NETWORK_PORT, TOKEN, UUID_MATCH
_LOGGER = logging.getLogger(__name__)
@@ -79,9 +137,9 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Required(ATTR_VERSION): vol.Coerce(str),
vol.Required(ATTR_SLUG): vol.Coerce(str),
vol.Required(ATTR_DESCRIPTON): vol.Coerce(str),
vol.Optional(ATTR_URL): vol.Url(),
vol.Optional(ATTR_ARCH, default=ARCH_ALL): [vol.In(ARCH_ALL)],
vol.Required(ATTR_ARCH): [vol.In(ARCH_ALL)],
vol.Optional(ATTR_MACHINE): [vol.In(MACHINE_ALL)],
vol.Optional(ATTR_URL): vol.Url(),
vol.Required(ATTR_STARTUP):
vol.All(_simple_startup, vol.In(STARTUP_ALL)),
vol.Required(ATTR_BOOT):
@@ -89,6 +147,10 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Optional(ATTR_PORTS): DOCKER_PORTS,
vol.Optional(ATTR_WEBUI):
vol.Match(r"^(?:https?|\[PROTO:\w+\]):\/\/\[HOST\]:\[PORT:\d+\].*$"),
vol.Optional(ATTR_INGRESS, default=False): vol.Boolean(),
vol.Optional(ATTR_INGRESS_PORT, default=8099): NETWORK_PORT,
vol.Optional(ATTR_INGRESS_ENTRY): vol.Coerce(str),
vol.Optional(ATTR_HOMEASSISTANT): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_HOST_NETWORK, default=False): vol.Boolean(),
vol.Optional(ATTR_HOST_PID, default=False): vol.Boolean(),
vol.Optional(ATTR_HOST_IPC, default=False): vol.Boolean(),
@@ -114,7 +176,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Optional(ATTR_DOCKER_API, default=False): vol.Boolean(),
vol.Optional(ATTR_AUTH_API, default=False): vol.Boolean(),
vol.Optional(ATTR_SERVICES): [vol.Match(RE_SERVICE)],
vol.Optional(ATTR_DISCOVERY): [vol.In(DISCOVERY_SERVICES)],
vol.Optional(ATTR_DISCOVERY): [valid_discovery_service],
vol.Required(ATTR_OPTIONS): dict,
vol.Required(ATTR_SCHEMA): vol.Any(vol.Schema({
vol.Coerce(str): vol.Any(SCHEMA_ELEMENT, [
@@ -158,7 +220,8 @@ SCHEMA_ADDON_USER = vol.Schema({
vol.Required(ATTR_VERSION): vol.Coerce(str),
vol.Optional(ATTR_IMAGE): vol.Coerce(str),
vol.Optional(ATTR_UUID, default=lambda: uuid.uuid4().hex): UUID_MATCH,
vol.Optional(ATTR_ACCESS_TOKEN): SHA256,
vol.Optional(ATTR_ACCESS_TOKEN): TOKEN,
vol.Optional(ATTR_INGRESS_TOKEN, default=secrets.token_urlsafe): vol.Coerce(str),
vol.Optional(ATTR_OPTIONS, default=dict): dict,
vol.Optional(ATTR_AUTO_UPDATE, default=False): vol.Boolean(),
vol.Optional(ATTR_BOOT):
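
To make the `SCHEMA_ADDON_CONFIG` additions above concrete, here is a minimal sketch of the new ingress-related keys as an add-on author would supply them in the add-on configuration; the values (and the key name `ingress_port`) are illustrative placeholders, not taken from this diff.

```python
# Hypothetical fragment of an add-on configuration covering the new options;
# the remaining required fields (name, version, slug, ...) are omitted.
ingress_config_fragment = {
    "ingress": True,                # ATTR_INGRESS, defaults to False
    "ingress_port": 8099,           # ATTR_INGRESS_PORT, defaults to 8099
    "ingress_entry": "entry.html",  # ATTR_INGRESS_ENTRY, optional entry page
    "homeassistant": "0.91.0",      # ATTR_HOMEASSISTANT, minimum HA version
}
```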

@@ -1,23 +1,25 @@
"""Init file for Hass.io RESTful API."""
import logging
from pathlib import Path
from typing import Optional
from aiohttp import web
from ..coresys import CoreSys, CoreSysAttributes
from .addons import APIAddons
from .auth import APIAuth
from .discovery import APIDiscovery
from .homeassistant import APIHomeAssistant
from .hardware import APIHardware
from .host import APIHost
from .hassos import APIHassOS
from .homeassistant import APIHomeAssistant
from .host import APIHost
from .info import APIInfo
from .ingress import APIIngress
from .proxy import APIProxy
from .supervisor import APISupervisor
from .snapshots import APISnapshots
from .services import APIServices
from .security import SecurityMiddleware
from ..coresys import CoreSysAttributes
from .services import APIServices
from .snapshots import APISnapshots
from .supervisor import APISupervisor
_LOGGER = logging.getLogger(__name__)
@@ -25,18 +27,18 @@ _LOGGER = logging.getLogger(__name__)
class RestAPI(CoreSysAttributes):
"""Handle RESTful API for Hass.io."""
def __init__(self, coresys):
def __init__(self, coresys: CoreSys):
"""Initialize Docker base wrapper."""
self.coresys = coresys
self.security = SecurityMiddleware(coresys)
self.webapp = web.Application(
self.coresys: CoreSys = coresys
self.security: SecurityMiddleware = SecurityMiddleware(coresys)
self.webapp: web.Application = web.Application(
middlewares=[self.security.token_validation])
# service stuff
self._runner = web.AppRunner(self.webapp)
self._site = None
self._runner: web.AppRunner = web.AppRunner(self.webapp)
self._site: Optional[web.TCPSite] = None
async def load(self):
async def load(self) -> None:
"""Register REST API Calls."""
self._register_supervisor()
self._register_host()
@@ -46,13 +48,14 @@ class RestAPI(CoreSysAttributes):
self._register_proxy()
self._register_panel()
self._register_addons()
self._register_ingress()
self._register_snapshots()
self._register_discovery()
self._register_services()
self._register_info()
self._register_auth()
def _register_host(self):
def _register_host(self) -> None:
"""Register hostcontrol functions."""
api_host = APIHost()
api_host.coresys = self.coresys
@@ -72,7 +75,7 @@ class RestAPI(CoreSysAttributes):
api_host.service_reload),
])
def _register_hassos(self):
def _register_hassos(self) -> None:
"""Register HassOS functions."""
api_hassos = APIHassOS()
api_hassos.coresys = self.coresys
@@ -84,7 +87,7 @@ class RestAPI(CoreSysAttributes):
web.post('/hassos/config/sync', api_hassos.config_sync),
])
def _register_hardware(self):
def _register_hardware(self) -> None:
"""Register hardware functions."""
api_hardware = APIHardware()
api_hardware.coresys = self.coresys
@@ -94,7 +97,7 @@ class RestAPI(CoreSysAttributes):
web.get('/hardware/audio', api_hardware.audio),
])
def _register_info(self):
def _register_info(self) -> None:
"""Register info functions."""
api_info = APIInfo()
api_info.coresys = self.coresys
@@ -103,7 +106,7 @@ class RestAPI(CoreSysAttributes):
web.get('/info', api_info.info),
])
def _register_auth(self):
def _register_auth(self) -> None:
"""Register auth functions."""
api_auth = APIAuth()
api_auth.coresys = self.coresys
@@ -112,7 +115,7 @@ class RestAPI(CoreSysAttributes):
web.post('/auth', api_auth.auth),
])
def _register_supervisor(self):
def _register_supervisor(self) -> None:
"""Register Supervisor functions."""
api_supervisor = APISupervisor()
api_supervisor.coresys = self.coresys
@@ -127,7 +130,7 @@ class RestAPI(CoreSysAttributes):
web.post('/supervisor/options', api_supervisor.options),
])
def _register_homeassistant(self):
def _register_homeassistant(self) -> None:
"""Register Home Assistant functions."""
api_hass = APIHomeAssistant()
api_hass.coresys = self.coresys
@@ -142,9 +145,10 @@ class RestAPI(CoreSysAttributes):
web.post('/homeassistant/stop', api_hass.stop),
web.post('/homeassistant/start', api_hass.start),
web.post('/homeassistant/check', api_hass.check),
web.post('/homeassistant/rebuild', api_hass.rebuild),
])
def _register_proxy(self):
def _register_proxy(self) -> None:
"""Register Home Assistant API Proxy."""
api_proxy = APIProxy()
api_proxy.coresys = self.coresys
@@ -158,7 +162,7 @@ class RestAPI(CoreSysAttributes):
web.get('/homeassistant/api/', api_proxy.api),
])
def _register_addons(self):
def _register_addons(self) -> None:
"""Register Add-on functions."""
api_addons = APIAddons()
api_addons.coresys = self.coresys
@@ -184,7 +188,17 @@ class RestAPI(CoreSysAttributes):
web.get('/addons/{addon}/stats', api_addons.stats),
])
def _register_snapshots(self):
def _register_ingress(self) -> None:
"""Register Ingress functions."""
api_ingress = APIIngress()
api_ingress.coresys = self.coresys
self.webapp.add_routes([
web.post('/ingress/session', api_ingress.create_session),
web.view('/ingress/{token}/{path:.*}', api_ingress.handler),
])
def _register_snapshots(self) -> None:
"""Register snapshots functions."""
api_snapshots = APISnapshots()
api_snapshots.coresys = self.coresys
@@ -204,7 +218,7 @@ class RestAPI(CoreSysAttributes):
web.get('/snapshots/{snapshot}/download', api_snapshots.download),
])
def _register_services(self):
def _register_services(self) -> None:
"""Register services functions."""
api_services = APIServices()
api_services.coresys = self.coresys
@@ -216,7 +230,7 @@ class RestAPI(CoreSysAttributes):
web.delete('/services/{service}', api_services.del_service),
])
def _register_discovery(self):
def _register_discovery(self) -> None:
"""Register discovery functions."""
api_discovery = APIDiscovery()
api_discovery.coresys = self.coresys
@@ -228,7 +242,7 @@ class RestAPI(CoreSysAttributes):
web.post('/discovery', api_discovery.set_discovery),
])
def _register_panel(self):
def _register_panel(self) -> None:
"""Register panel for Home Assistant."""
panel_dir = Path(__file__).parent.joinpath("panel")
@@ -256,7 +270,7 @@ class RestAPI(CoreSysAttributes):
# This route is for HA > 0.70
self.webapp.add_routes([web.static('/app', panel_dir)])
async def start(self):
async def start(self) -> None:
"""Run RESTful API webserver."""
await self._runner.setup()
self._site = web.TCPSite(
@@ -270,7 +284,7 @@ class RestAPI(CoreSysAttributes):
else:
_LOGGER.info("Start API on %s", self.sys_docker.network.supervisor)
async def stop(self):
async def stop(self) -> None:
"""Stop RESTful API webserver."""
if not self._site:
return

@@ -1,31 +1,89 @@
"""Init file for Hass.io Home Assistant RESTful API."""
import asyncio
import logging
from typing import Any, Awaitable, Dict, List
from aiohttp import web
import voluptuous as vol
from voluptuous.humanize import humanize_error
from .utils import api_process, api_process_raw, api_validate
from ..addons.addon import Addon
from ..addons.utils import rating_security
from ..const import (
ATTR_VERSION, ATTR_LAST_VERSION, ATTR_STATE, ATTR_BOOT, ATTR_OPTIONS,
ATTR_URL, ATTR_DESCRIPTON, ATTR_DETACHED, ATTR_NAME, ATTR_REPOSITORY,
ATTR_BUILD, ATTR_AUTO_UPDATE, ATTR_NETWORK, ATTR_HOST_NETWORK, ATTR_SLUG,
ATTR_SOURCE, ATTR_REPOSITORIES, ATTR_ADDONS, ATTR_ARCH, ATTR_MAINTAINER,
ATTR_INSTALLED, ATTR_LOGO, ATTR_WEBUI, ATTR_DEVICES, ATTR_PRIVILEGED,
ATTR_AUDIO, ATTR_AUDIO_INPUT, ATTR_AUDIO_OUTPUT, ATTR_HASSIO_API,
ATTR_GPIO, ATTR_HOMEASSISTANT_API, ATTR_STDIN, BOOT_AUTO, BOOT_MANUAL,
ATTR_CHANGELOG, ATTR_HOST_IPC, ATTR_HOST_DBUS, ATTR_LONG_DESCRIPTION,
ATTR_CPU_PERCENT, ATTR_MEMORY_LIMIT, ATTR_MEMORY_USAGE, ATTR_NETWORK_TX,
ATTR_NETWORK_RX, ATTR_BLK_READ, ATTR_BLK_WRITE, ATTR_ICON, ATTR_SERVICES,
ATTR_DISCOVERY, ATTR_APPARMOR, ATTR_DEVICETREE, ATTR_DOCKER_API,
ATTR_FULL_ACCESS, ATTR_PROTECTED, ATTR_RATING, ATTR_HOST_PID,
ATTR_HASSIO_ROLE, ATTR_MACHINE, ATTR_AVAILABLE, ATTR_AUTH_API,
ATTR_ADDONS,
ATTR_APPARMOR,
ATTR_ARCH,
ATTR_AUDIO,
ATTR_AUDIO_INPUT,
ATTR_AUDIO_OUTPUT,
ATTR_AUTH_API,
ATTR_AUTO_UPDATE,
ATTR_AVAILABLE,
ATTR_BLK_READ,
ATTR_BLK_WRITE,
ATTR_BOOT,
ATTR_BUILD,
ATTR_CHANGELOG,
ATTR_CPU_PERCENT,
ATTR_DESCRIPTON,
ATTR_DETACHED,
ATTR_DEVICES,
ATTR_DEVICETREE,
ATTR_DISCOVERY,
ATTR_DOCKER_API,
ATTR_FULL_ACCESS,
ATTR_GPIO,
ATTR_HASSIO_API,
ATTR_HASSIO_ROLE,
ATTR_HOMEASSISTANT,
ATTR_HOMEASSISTANT_API,
ATTR_HOST_DBUS,
ATTR_HOST_IPC,
ATTR_HOST_NETWORK,
ATTR_HOST_PID,
ATTR_ICON,
ATTR_INGRESS,
ATTR_INGRESS_ENTRY,
ATTR_INGRESS_URL,
ATTR_INSTALLED,
ATTR_IP_ADDRESS,
ATTR_KERNEL_MODULES,
CONTENT_TYPE_PNG, CONTENT_TYPE_BINARY, CONTENT_TYPE_TEXT, REQUEST_FROM)
ATTR_LAST_VERSION,
ATTR_LOGO,
ATTR_LONG_DESCRIPTION,
ATTR_MACHINE,
ATTR_MAINTAINER,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_USAGE,
ATTR_NAME,
ATTR_NETWORK,
ATTR_NETWORK_RX,
ATTR_NETWORK_TX,
ATTR_OPTIONS,
ATTR_PRIVILEGED,
ATTR_PROTECTED,
ATTR_RATING,
ATTR_REPOSITORIES,
ATTR_REPOSITORY,
ATTR_SERVICES,
ATTR_SLUG,
ATTR_SOURCE,
ATTR_STATE,
ATTR_STDIN,
ATTR_URL,
ATTR_VERSION,
ATTR_WEBUI,
BOOT_AUTO,
BOOT_MANUAL,
CONTENT_TYPE_BINARY,
CONTENT_TYPE_PNG,
CONTENT_TYPE_TEXT,
REQUEST_FROM,
)
from ..coresys import CoreSysAttributes
from ..validate import DOCKER_PORTS, ALSA_DEVICE
from ..exceptions import APIError
from ..validate import ALSA_DEVICE, DOCKER_PORTS
from .utils import api_process, api_process_raw, api_validate
_LOGGER = logging.getLogger(__name__)
@@ -51,7 +109,7 @@ SCHEMA_SECURITY = vol.Schema({
class APIAddons(CoreSysAttributes):
"""Handle RESTful API for add-on functions."""
def _extract_addon(self, request, check_installed=True):
def _extract_addon(self, request: web.Request, check_installed: bool = True) -> Addon:
"""Return addon, throw an exception it it doesn't exist."""
addon_slug = request.match_info.get('addon')
@@ -69,7 +127,7 @@ class APIAddons(CoreSysAttributes):
return addon
@api_process
async def list(self, request):
async def list(self, request: web.Request) -> Dict[str, Any]:
"""Return all add-ons or repositories."""
data_addons = []
for addon in self.sys_addons.list_addons:
@@ -104,13 +162,12 @@ class APIAddons(CoreSysAttributes):
}
@api_process
async def reload(self, request):
async def reload(self, request: web.Request) -> None:
"""Reload all add-on data."""
await asyncio.shield(self.sys_addons.reload())
return True
@api_process
async def info(self, request):
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return add-on information."""
addon = self._extract_addon(request, check_installed=False)
@@ -130,6 +187,7 @@ class APIAddons(CoreSysAttributes):
ATTR_OPTIONS: addon.options,
ATTR_ARCH: addon.supported_arch,
ATTR_MACHINE: addon.supported_machine,
ATTR_HOMEASSISTANT: addon.homeassistant_version,
ATTR_URL: addon.url,
ATTR_DETACHED: addon.is_detached,
ATTR_AVAILABLE: addon.available,
@@ -161,17 +219,20 @@ class APIAddons(CoreSysAttributes):
ATTR_AUDIO_OUTPUT: addon.audio_output,
ATTR_SERVICES: _pretty_services(addon),
ATTR_DISCOVERY: addon.discovery,
ATTR_IP_ADDRESS: str(addon.ip_address),
ATTR_INGRESS: addon.with_ingress,
ATTR_INGRESS_ENTRY: addon.ingress_entry,
ATTR_INGRESS_URL: addon.ingress_url,
}
@api_process
async def options(self, request):
async def options(self, request: web.Request) -> None:
"""Store user options for add-on."""
addon = self._extract_addon(request)
addon_schema = SCHEMA_OPTIONS.extend({
vol.Optional(ATTR_OPTIONS): vol.Any(None, addon.schema),
})
body = await api_validate(addon_schema, request)
if ATTR_OPTIONS in body:
@@ -188,10 +249,9 @@ class APIAddons(CoreSysAttributes):
addon.audio_output = body[ATTR_AUDIO_OUTPUT]
addon.save_data()
return True
@api_process
async def security(self, request):
async def security(self, request: web.Request) -> None:
"""Store security options for add-on."""
addon = self._extract_addon(request)
body = await api_validate(SCHEMA_SECURITY, request)
@@ -201,17 +261,13 @@ class APIAddons(CoreSysAttributes):
addon.protected = body[ATTR_PROTECTED]
addon.save_data()
return True
@api_process
async def stats(self, request):
async def stats(self, request: web.Request) -> Dict[str, Any]:
"""Return resource information."""
addon = self._extract_addon(request)
stats = await addon.stats()
if not stats:
raise APIError("No stats available")
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
ATTR_MEMORY_USAGE: stats.memory_usage,
@@ -223,19 +279,19 @@ class APIAddons(CoreSysAttributes):
}
@api_process
def install(self, request):
def install(self, request: web.Request) -> Awaitable[None]:
"""Install add-on."""
addon = self._extract_addon(request, check_installed=False)
return asyncio.shield(addon.install())
@api_process
def uninstall(self, request):
def uninstall(self, request: web.Request) -> Awaitable[None]:
"""Uninstall add-on."""
addon = self._extract_addon(request)
return asyncio.shield(addon.uninstall())
@api_process
def start(self, request):
def start(self, request: web.Request) -> Awaitable[None]:
"""Start add-on."""
addon = self._extract_addon(request)
@@ -249,13 +305,13 @@ class APIAddons(CoreSysAttributes):
return asyncio.shield(addon.start())
@api_process
def stop(self, request):
def stop(self, request: web.Request) -> Awaitable[None]:
"""Stop add-on."""
addon = self._extract_addon(request)
return asyncio.shield(addon.stop())
@api_process
def update(self, request):
def update(self, request: web.Request) -> Awaitable[None]:
"""Update add-on."""
addon = self._extract_addon(request)
@@ -265,13 +321,13 @@ class APIAddons(CoreSysAttributes):
return asyncio.shield(addon.update())
@api_process
def restart(self, request):
def restart(self, request: web.Request) -> Awaitable[None]:
"""Restart add-on."""
addon = self._extract_addon(request)
return asyncio.shield(addon.restart())
@api_process
def rebuild(self, request):
def rebuild(self, request: web.Request) -> Awaitable[None]:
"""Rebuild local build add-on."""
addon = self._extract_addon(request)
if not addon.need_build:
@@ -280,13 +336,13 @@ class APIAddons(CoreSysAttributes):
return asyncio.shield(addon.rebuild())
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request):
def logs(self, request: web.Request) -> Awaitable[bytes]:
"""Return logs from add-on."""
addon = self._extract_addon(request)
return addon.logs()
@api_process_raw(CONTENT_TYPE_PNG)
async def icon(self, request):
async def icon(self, request: web.Request) -> bytes:
"""Return icon from add-on."""
addon = self._extract_addon(request, check_installed=False)
if not addon.with_icon:
@@ -296,7 +352,7 @@ class APIAddons(CoreSysAttributes):
return png.read()
@api_process_raw(CONTENT_TYPE_PNG)
async def logo(self, request):
async def logo(self, request: web.Request) -> bytes:
"""Return logo from add-on."""
addon = self._extract_addon(request, check_installed=False)
if not addon.with_logo:
@@ -306,7 +362,7 @@ class APIAddons(CoreSysAttributes):
return png.read()
@api_process_raw(CONTENT_TYPE_TEXT)
async def changelog(self, request):
async def changelog(self, request: web.Request) -> str:
"""Return changelog from add-on."""
addon = self._extract_addon(request, check_installed=False)
if not addon.with_changelog:
@@ -316,17 +372,17 @@ class APIAddons(CoreSysAttributes):
return changelog.read()
@api_process
async def stdin(self, request):
async def stdin(self, request: web.Request) -> None:
"""Write to stdin of add-on."""
addon = self._extract_addon(request)
if not addon.with_stdin:
raise APIError("STDIN not supported by add-on")
data = await request.read()
return await asyncio.shield(addon.write_stdin(data))
await asyncio.shield(addon.write_stdin(data))
def _pretty_devices(addon):
def _pretty_devices(addon: Addon) -> List[str]:
"""Return a simplified device list."""
dev_list = addon.devices
if not dev_list:
@@ -334,7 +390,7 @@ def _pretty_devices(addon):
return [row.split(':')[0] for row in dev_list]
def _pretty_services(addon):
def _pretty_services(addon: Addon) -> List[str]:
"""Return a simplified services role list."""
services = []
for name, access in addon.services_role.items():

@@ -3,17 +3,24 @@ import voluptuous as vol
from .utils import api_process, api_validate
from ..const import (
ATTR_ADDON, ATTR_UUID, ATTR_CONFIG, ATTR_DISCOVERY, ATTR_SERVICE,
REQUEST_FROM)
ATTR_ADDON,
ATTR_UUID,
ATTR_CONFIG,
ATTR_DISCOVERY,
ATTR_SERVICE,
REQUEST_FROM,
)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError, APIForbidden
from ..validate import SERVICE_ALL
from ..discovery.validate import valid_discovery_service
SCHEMA_DISCOVERY = vol.Schema({
vol.Required(ATTR_SERVICE): SERVICE_ALL,
SCHEMA_DISCOVERY = vol.Schema(
{
vol.Required(ATTR_SERVICE): valid_discovery_service,
vol.Optional(ATTR_CONFIG): vol.Maybe(dict),
})
}
)
class APIDiscovery(CoreSysAttributes):
@@ -21,7 +28,7 @@ class APIDiscovery(CoreSysAttributes):
def _extract_message(self, request):
"""Extract discovery message from URL."""
message = self.sys_discovery.get(request.match_info.get('uuid'))
message = self.sys_discovery.get(request.match_info.get("uuid"))
if not message:
raise APIError("Discovery message not found")
return message
@@ -38,12 +45,14 @@ class APIDiscovery(CoreSysAttributes):
discovery = []
for message in self.sys_discovery.list_messages:
discovery.append({
discovery.append(
{
ATTR_ADDON: message.addon,
ATTR_SERVICE: message.service,
ATTR_UUID: message.uuid,
ATTR_CONFIG: message.config,
})
}
)
return {ATTR_DISCOVERY: discovery}

@@ -1,27 +1,31 @@
"""Init file for Hass.io HassOS RESTful API."""
import asyncio
import logging
from typing import Any, Awaitable, Dict
import voluptuous as vol
from aiohttp import web
from .utils import api_process, api_validate
from ..const import (
ATTR_VERSION, ATTR_BOARD, ATTR_VERSION_LATEST, ATTR_VERSION_CLI,
ATTR_VERSION_CLI_LATEST)
ATTR_BOARD,
ATTR_VERSION,
ATTR_VERSION_CLI,
ATTR_VERSION_CLI_LATEST,
ATTR_VERSION_LATEST,
)
from ..coresys import CoreSysAttributes
from .utils import api_process, api_validate
_LOGGER = logging.getLogger(__name__)
SCHEMA_VERSION = vol.Schema({
vol.Optional(ATTR_VERSION): vol.Coerce(str),
})
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
class APIHassOS(CoreSysAttributes):
"""Handle RESTful API for HassOS functions."""
@api_process
async def info(self, request):
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return HassOS information."""
return {
ATTR_VERSION: self.sys_hassos.version,
@@ -32,7 +36,7 @@ class APIHassOS(CoreSysAttributes):
}
@api_process
async def update(self, request):
async def update(self, request: web.Request) -> None:
"""Update HassOS."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_hassos.version_latest)
@@ -40,7 +44,7 @@ class APIHassOS(CoreSysAttributes):
await asyncio.shield(self.sys_hassos.update(version))
@api_process
async def update_cli(self, request):
async def update_cli(self, request: web.Request) -> None:
"""Update HassOS CLI."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_hassos.version_cli_latest)
@@ -48,6 +52,6 @@ class APIHassOS(CoreSysAttributes):
await asyncio.shield(self.sys_hassos.update_cli(version))
@api_process
def config_sync(self, request):
def config_sync(self, request: web.Request) -> Awaitable[None]:
"""Trigger config reload on HassOS."""
return asyncio.shield(self.sys_hassos.config_sync())

@@ -1,15 +1,35 @@
"""Init file for Hass.io Home Assistant RESTful API."""
import asyncio
import logging
from typing import Coroutine, Dict, Any
import voluptuous as vol
from aiohttp import web
from ..const import (
ATTR_ARCH, ATTR_BLK_READ, ATTR_BLK_WRITE, ATTR_BOOT, ATTR_CPU_PERCENT,
ATTR_CUSTOM, ATTR_IMAGE, ATTR_LAST_VERSION, ATTR_MACHINE, ATTR_MEMORY_LIMIT,
ATTR_MEMORY_USAGE, ATTR_NETWORK_RX, ATTR_NETWORK_TX, ATTR_PASSWORD,
ATTR_PORT, ATTR_REFRESH_TOKEN, ATTR_SSL, ATTR_VERSION, ATTR_WAIT_BOOT,
ATTR_WATCHDOG, CONTENT_TYPE_BINARY)
ATTR_ARCH,
ATTR_BLK_READ,
ATTR_BLK_WRITE,
ATTR_BOOT,
ATTR_CPU_PERCENT,
ATTR_CUSTOM,
ATTR_IMAGE,
ATTR_LAST_VERSION,
ATTR_MACHINE,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_USAGE,
ATTR_NETWORK_RX,
ATTR_NETWORK_TX,
ATTR_PASSWORD,
ATTR_PORT,
ATTR_REFRESH_TOKEN,
ATTR_SSL,
ATTR_VERSION,
ATTR_WAIT_BOOT,
ATTR_WATCHDOG,
ATTR_IP_ADDRESS,
CONTENT_TYPE_BINARY,
)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError
from ..validate import DOCKER_IMAGE, NETWORK_PORT
@@ -18,42 +38,34 @@ from .utils import api_process, api_process_raw, api_validate
_LOGGER = logging.getLogger(__name__)
# pylint: disable=no-value-for-parameter
SCHEMA_OPTIONS = vol.Schema({
vol.Optional(ATTR_BOOT):
vol.Boolean(),
vol.Inclusive(ATTR_IMAGE, 'custom_hass'):
vol.Maybe(vol.Coerce(str)),
vol.Inclusive(ATTR_LAST_VERSION, 'custom_hass'):
vol.Any(None, DOCKER_IMAGE),
vol.Optional(ATTR_PORT):
NETWORK_PORT,
vol.Optional(ATTR_PASSWORD):
vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_SSL):
vol.Boolean(),
vol.Optional(ATTR_WATCHDOG):
vol.Boolean(),
vol.Optional(ATTR_WAIT_BOOT):
vol.All(vol.Coerce(int), vol.Range(min=60)),
vol.Optional(ATTR_REFRESH_TOKEN):
vol.Maybe(vol.Coerce(str)),
})
SCHEMA_OPTIONS = vol.Schema(
{
vol.Optional(ATTR_BOOT): vol.Boolean(),
vol.Inclusive(ATTR_IMAGE, "custom_hass"): vol.Maybe(vol.Coerce(str)),
vol.Inclusive(ATTR_LAST_VERSION, "custom_hass"): vol.Any(None, DOCKER_IMAGE),
vol.Optional(ATTR_PORT): NETWORK_PORT,
vol.Optional(ATTR_PASSWORD): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_SSL): vol.Boolean(),
vol.Optional(ATTR_WATCHDOG): vol.Boolean(),
vol.Optional(ATTR_WAIT_BOOT): vol.All(vol.Coerce(int), vol.Range(min=60)),
vol.Optional(ATTR_REFRESH_TOKEN): vol.Maybe(vol.Coerce(str)),
}
)
SCHEMA_VERSION = vol.Schema({
vol.Optional(ATTR_VERSION): vol.Coerce(str),
})
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
class APIHomeAssistant(CoreSysAttributes):
"""Handle RESTful API for Home Assistant functions."""
@api_process
async def info(self, request):
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return host information."""
return {
ATTR_VERSION: self.sys_homeassistant.version,
ATTR_LAST_VERSION: self.sys_homeassistant.last_version,
ATTR_MACHINE: self.sys_homeassistant.machine,
ATTR_IP_ADDRESS: str(self.sys_homeassistant.ip_address),
ATTR_ARCH: self.sys_homeassistant.arch,
ATTR_IMAGE: self.sys_homeassistant.image,
ATTR_CUSTOM: self.sys_homeassistant.is_custom_image,
@@ -65,7 +77,7 @@ class APIHomeAssistant(CoreSysAttributes):
}
@api_process
async def options(self, request):
async def options(self, request: web.Request) -> None:
"""Set Home Assistant options."""
body = await api_validate(SCHEMA_OPTIONS, request)
@@ -81,6 +93,7 @@ class APIHomeAssistant(CoreSysAttributes):
if ATTR_PASSWORD in body:
self.sys_homeassistant.api_password = body[ATTR_PASSWORD]
self.sys_homeassistant.refresh_token = None
if ATTR_SSL in body:
self.sys_homeassistant.api_ssl = body[ATTR_SSL]
@@ -97,7 +110,7 @@ class APIHomeAssistant(CoreSysAttributes):
self.sys_homeassistant.save_data()
@api_process
async def stats(self, request):
async def stats(self, request: web.Request) -> Dict[Any, str]:
"""Return resource information."""
stats = await self.sys_homeassistant.stats()
if not stats:
@@ -114,7 +127,7 @@ class APIHomeAssistant(CoreSysAttributes):
}
@api_process
async def update(self, request):
async def update(self, request: web.Request) -> None:
"""Update Home Assistant."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_homeassistant.last_version)
@@ -122,27 +135,32 @@ class APIHomeAssistant(CoreSysAttributes):
await asyncio.shield(self.sys_homeassistant.update(version))
@api_process
def stop(self, request):
def stop(self, request: web.Request) -> Coroutine:
"""Stop Home Assistant."""
return asyncio.shield(self.sys_homeassistant.stop())
@api_process
def start(self, request):
def start(self, request: web.Request) -> Coroutine:
"""Start Home Assistant."""
return asyncio.shield(self.sys_homeassistant.start())
@api_process
def restart(self, request):
def restart(self, request: web.Request) -> Coroutine:
"""Restart Home Assistant."""
return asyncio.shield(self.sys_homeassistant.restart())
@api_process
def rebuild(self, request: web.Request) -> Coroutine:
"""Rebuild Home Assistant."""
return asyncio.shield(self.sys_homeassistant.rebuild())
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request):
def logs(self, request: web.Request) -> Coroutine:
"""Return Home Assistant Docker logs."""
return self.sys_homeassistant.logs()
@api_process
async def check(self, request):
async def check(self, request: web.Request) -> None:
"""Check configuration of Home Assistant."""
result = await self.sys_homeassistant.check_config()
if not result.valid:

hassio/api/ingress.py (new file, 217 lines)

@@ -0,0 +1,217 @@
"""Hass.io Add-on ingress service."""
import asyncio
from ipaddress import ip_address
import logging
from typing import Any, Dict, Union
import aiohttp
from aiohttp import hdrs, web
from aiohttp.web_exceptions import (
HTTPBadGateway,
HTTPServiceUnavailable,
HTTPUnauthorized,
)
from multidict import CIMultiDict, istr
from ..addons.addon import Addon
from ..const import ATTR_SESSION, HEADER_TOKEN, REQUEST_FROM, COOKIE_INGRESS
from ..coresys import CoreSysAttributes
from .utils import api_process
_LOGGER = logging.getLogger(__name__)
class APIIngress(CoreSysAttributes):
"""Ingress view to handle add-on webui routing."""
def _extract_addon(self, request: web.Request) -> Addon:
"""Return addon, throw an exception it it doesn't exist."""
token = request.match_info.get("token")
# Find correct add-on
addon = self.sys_ingress.get(token)
if not addon:
_LOGGER.warning("Ingress for %s not available", token)
raise HTTPServiceUnavailable()
return addon
def _check_ha_access(self, request: web.Request) -> None:
if request[REQUEST_FROM] != self.sys_homeassistant:
_LOGGER.warning("Ingress is only available behind Home Assistant")
raise HTTPUnauthorized()
def _create_url(self, addon: Addon, path: str) -> str:
"""Create URL to container."""
return f"{addon.ingress_internal}/{path}"
@api_process
async def create_session(self, request: web.Request) -> Dict[str, Any]:
"""Create a new session."""
self._check_ha_access(request)
session = self.sys_ingress.create_session()
return {ATTR_SESSION: session}
async def handler(
self, request: web.Request
) -> Union[web.Response, web.StreamResponse, web.WebSocketResponse]:
"""Route data to Hass.io ingress service."""
self._check_ha_access(request)
# Check Ingress Session
session = request.cookies.get(COOKIE_INGRESS)
if not self.sys_ingress.validate_session(session):
_LOGGER.warning("No valid ingress session %s", session)
raise HTTPUnauthorized()
# Process requests
addon = self._extract_addon(request)
path = request.match_info.get("path")
try:
# Websocket
if _is_websocket(request):
return await self._handle_websocket(request, addon, path)
# Request
return await self._handle_request(request, addon, path)
except aiohttp.ClientError as err:
_LOGGER.error("Ingress error: %s", err)
raise HTTPBadGateway() from None
async def _handle_websocket(
self, request: web.Request, addon: Addon, path: str
) -> web.WebSocketResponse:
"""Ingress route for websocket."""
ws_server = web.WebSocketResponse()
await ws_server.prepare(request)
# Preparing
url = self._create_url(addon, path)
source_header = _init_header(request, addon)
# Support GET query
if request.query_string:
url = "{}?{}".format(url, request.query_string)
# Start proxy
async with self.sys_websession.ws_connect(
url, headers=source_header
) as ws_client:
# Proxy requests
await asyncio.wait(
[
_websocket_forward(ws_server, ws_client),
_websocket_forward(ws_client, ws_server),
],
return_when=asyncio.FIRST_COMPLETED,
)
return ws_server
async def _handle_request(
self, request: web.Request, addon: Addon, path: str
) -> Union[web.Response, web.StreamResponse]:
"""Ingress route for request."""
url = self._create_url(addon, path)
data = await request.read()
source_header = _init_header(request, addon)
async with self.sys_websession.request(
request.method, url, headers=source_header, params=request.query, data=data
) as result:
headers = _response_header(result)
# Simple request
if (
hdrs.CONTENT_LENGTH in result.headers
and int(result.headers.get(hdrs.CONTENT_LENGTH, 0)) < 4_194_000
):
# Return Response
body = await result.read()
return web.Response(headers=headers, status=result.status, body=body)
# Stream response
response = web.StreamResponse(status=result.status, headers=headers)
response.content_type = result.content_type
try:
await response.prepare(request)
async for data in result.content.iter_chunked(4096):
await response.write(data)
except (aiohttp.ClientError, aiohttp.ClientPayloadError) as err:
_LOGGER.error("Stream error with %s: %s", url, err)
return response
def _init_header(
request: web.Request, addon: str
) -> Union[CIMultiDict, Dict[str, str]]:
"""Create initial header."""
headers = {}
# filter flags
for name, value in request.headers.items():
if name in (
hdrs.CONTENT_LENGTH,
hdrs.CONTENT_TYPE,
hdrs.CONTENT_ENCODING,
istr(HEADER_TOKEN),
):
continue
headers[name] = value
# Update X-Forwarded-For
forward_for = request.headers.get(hdrs.X_FORWARDED_FOR)
connected_ip = ip_address(request.transport.get_extra_info("peername")[0])
headers[hdrs.X_FORWARDED_FOR] = f"{forward_for}, {connected_ip!s}"
return headers
def _response_header(response: aiohttp.ClientResponse) -> Dict[str, str]:
"""Create response header."""
headers = {}
for name, value in response.headers.items():
if name in (
hdrs.TRANSFER_ENCODING,
hdrs.CONTENT_LENGTH,
hdrs.CONTENT_TYPE,
hdrs.CONTENT_ENCODING,
):
continue
headers[name] = value
return headers
def _is_websocket(request: web.Request) -> bool:
"""Return True if request is a websocket."""
headers = request.headers
if (
headers.get(hdrs.CONNECTION) == "Upgrade"
and headers.get(hdrs.UPGRADE) == "websocket"
):
return True
return False
async def _websocket_forward(ws_from, ws_to):
"""Handle websocket message directly."""
async for msg in ws_from:
if msg.type == aiohttp.WSMsgType.TEXT:
await ws_to.send_str(msg.data)
elif msg.type == aiohttp.WSMsgType.BINARY:
await ws_to.send_bytes(msg.data)
elif msg.type == aiohttp.WSMsgType.PING:
await ws_to.ping()
elif msg.type == aiohttp.WSMsgType.PONG:
await ws_to.pong()
elif ws_to.closed:
await ws_to.close(code=ws_to.close_code, message=msg.extra)
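Putting the new view together: a caller first obtains an ingress session from create_session(), then routes every browser request for an add-on UI through handler(), sending the session back as a cookie and the add-on's ingress token in the URL. A rough sketch of that wire flow follows; the /ingress/session and /ingress/{token}/{path} route paths, the ingress_session cookie name and the response envelope are assumptions not shown in this file, and in the real system _check_ha_access() only lets Home Assistant itself call these routes.

# Illustrative sketch only: route paths and cookie name are assumptions, and
# these endpoints reject callers other than Home Assistant (_check_ha_access).
import asyncio
import os

import aiohttp

SUPERVISOR = "http://hassio"  # assumed base URL


async def open_addon_ui(ingress_token: str) -> None:
    headers = {"X-Hassio-Key": os.environ.get("HASSIO_TOKEN", "")}
    async with aiohttp.ClientSession(headers=headers) as session:
        # 1. Create an ingress session (APIIngress.create_session).
        async with session.post(f"{SUPERVISOR}/ingress/session") as resp:
            ingress_session = (await resp.json())["data"]["session"]

        # 2. Proxy a request to the add-on's web UI (APIIngress.handler),
        #    presenting the session as a cookie so validate_session() passes.
        cookies = {"ingress_session": ingress_session}
        async with session.get(
            f"{SUPERVISOR}/ingress/{ingress_token}/", cookies=cookies
        ) as resp:
            print(resp.status, resp.content_type)


asyncio.run(open_addon_ui("<ingress-token>"))

WebSocket upgrades follow the same pattern, except handler() opens a second WebSocket to the add-on and _websocket_forward() pumps frames in both directions until one side closes; plain responses under roughly 4 MB are returned in one piece, while larger ones are streamed in 4 KiB chunks.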

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,32 @@
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
* @fileoverview
* @suppress {checkPrototypalTypes}
* @license Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt The complete set of authors may be found
* at http://polymer.github.io/AUTHORS.txt The complete set of contributors may
* be found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by
* Google as part of the polymer project is also subject to an additional IP
* rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/

Binary file not shown.

View File

@@ -0,0 +1 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.1ac383635811d6c2cb4b.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,180 @@
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright 2018 Google Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
/*! *****************************************************************************
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use
this file except in compliance with the License. You may obtain a copy of the
License at http://www.apache.org/licenses/LICENSE-2.0
THIS CODE IS PROVIDED ON AN *AS IS* BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, EITHER EXPRESS OR IMPLIED, INCLUDING WITHOUT LIMITATION ANY IMPLIED
WARRANTIES OR CONDITIONS OF TITLE, FITNESS FOR A PARTICULAR PURPOSE,
MERCHANTABLITY OR NON-INFRINGEMENT.
See the Apache Version 2.0 License for specific language governing permissions
and limitations under the License.
***************************************************************************** */
/**
* @license
* Copyright 2016 Google Inc.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
/**
* @license
* Copyright 2018 Google Inc.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
/**
@license
Copyright (c) 2019 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2018 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2014 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/

Binary file not shown.

View File

@@ -0,0 +1 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.31b41b04602ce627ad98.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

Binary file not shown.

File diff suppressed because one or more lines are too long

Binary file not shown.

View File

@@ -0,0 +1 @@
(window.webpackJsonp=window.webpackJsonp||[]).push([[4],{114:function(n,r,t){"use strict";t.r(r),t.d(r,"marked",function(){return a}),t.d(r,"filterXSS",function(){return c});var e=t(104),i=t.n(e),o=t(106),u=t.n(o),a=i.a,c=u.a}}]);

Binary file not shown.

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -1,2 +0,0 @@
(window.webpackJsonp=window.webpackJsonp||[]).push([[4],{100:function(n,r,t){"use strict";t.r(r),t.d(r,"marked",function(){return a}),t.d(r,"filterXSS",function(){return c});var e=t(89),i=t.n(e),o=t(91),u=t.n(o),a=i.a,c=u.a}}]);
//# sourceMappingURL=chunk.9e3883f96f68b3ce89f5.js.map

View File

@@ -1 +0,0 @@
{"version":3,"sources":["webpack:///../src/resources/load_markdown.js"],"names":["__webpack_require__","r","__webpack_exports__","d","marked","filterXSS","marked__WEBPACK_IMPORTED_MODULE_0__","marked__WEBPACK_IMPORTED_MODULE_0___default","n","xss__WEBPACK_IMPORTED_MODULE_1__","xss__WEBPACK_IMPORTED_MODULE_1___default","marked_","filterXSS_"],"mappings":"0FAAAA,EAAAC,EAAAC,GAAAF,EAAAG,EAAAD,EAAA,2BAAAE,IAAAJ,EAAAG,EAAAD,EAAA,8BAAAG,IAAA,IAAAC,EAAAN,EAAA,IAAAO,EAAAP,EAAAQ,EAAAF,GAAAG,EAAAT,EAAA,IAAAU,EAAAV,EAAAQ,EAAAC,GAGaL,EAASO,IACTN,EAAYO","file":"chunk.9e3883f96f68b3ce89f5.js","sourcesContent":["import marked_ from \"marked\";\nimport filterXSS_ from \"xss\";\n\nexport const marked = marked_;\nexport const filterXSS = filterXSS_;\n"],"sourceRoot":""}

File diff suppressed because one or more lines are too long

Binary file not shown.

File diff suppressed because one or more lines are too long

View File

@@ -1,820 +0,0 @@
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2014 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -1,471 +0,0 @@
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
(The remaining ~45 blocks of this deleted 471-line file are the same Polymer Project license header, repeated in its 2017 and 2015 variants; the verbatim duplicates are omitted.)

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

Binary file not shown.

View File

@@ -1,2 +1 @@
!function(e){function n(n){for(var t,o,i=n[0],u=n[1],c=0,f=[];c<i.length;c++)o=i[c],r[o]&&f.push(r[o][0]),r[o]=0;for(t in u)Object.prototype.hasOwnProperty.call(u,t)&&(e[t]=u[t]);for(a&&a(n);f.length;)f.shift()()}var t={},r={1:0};function o(n){if(t[n])return t[n].exports;var r=t[n]={i:n,l:!1,exports:{}};return e[n].call(r.exports,r,r.exports,o),r.l=!0,r.exports}o.e=function(e){var n=[],t=r[e];if(0!==t)if(t)n.push(t[2]);else{var i=new Promise(function(n,o){t=r[e]=[n,o]});n.push(t[2]=i);var u,c=document.getElementsByTagName("head")[0],a=document.createElement("script");a.charset="utf-8",a.timeout=120,o.nc&&a.setAttribute("nonce",o.nc),a.src=function(e){return o.p+"chunk."+{0:"f32f3c841cc3e1d081f7",2:"8c049a124b9397e54c16",3:"d0eb7b86b775838caf5e",4:"9e3883f96f68b3ce89f5",5:"0cb8b788b03dcc48da14",6:"c1ac97370d72bce0a835",7:"0853908528652fbc5d4f"}[e]+".js"}(e),u=function(n){a.onerror=a.onload=null,clearTimeout(f);var t=r[e];if(0!==t){if(t){var o=n&&("load"===n.type?"missing":n.type),i=n&&n.target&&n.target.src,u=new Error("Loading chunk "+e+" failed.\n("+o+": "+i+")");u.type=o,u.request=i,t[1](u)}r[e]=void 0}};var f=setTimeout(function(){u({type:"timeout",target:a})},12e4);a.onerror=a.onload=u,c.appendChild(a)}return Promise.all(n)},o.m=e,o.c=t,o.d=function(e,n,t){o.o(e,n)||Object.defineProperty(e,n,{enumerable:!0,get:t})},o.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},o.t=function(e,n){if(1&n&&(e=o(e)),8&n)return e;if(4&n&&"object"==typeof e&&e&&e.__esModule)return e;var t=Object.create(null);if(o.r(t),Object.defineProperty(t,"default",{enumerable:!0,value:e}),2&n&&"string"!=typeof e)for(var r in e)o.d(t,r,function(n){return e[n]}.bind(null,r));return t},o.n=function(e){var n=e&&e.__esModule?function(){return e.default}:function(){return e};return o.d(n,"a",n),n},o.o=function(e,n){return Object.prototype.hasOwnProperty.call(e,n)},o.p="/api/hassio/app/",o.oe=function(e){throw console.error(e),e};var i=window.webpackJsonp=window.webpackJsonp||[],u=i.push.bind(i);i.push=n,i=i.slice();for(var c=0;c<i.length;c++)n(i[c]);var a=u;o(o.s=0)}([function(e,n,t){window.loadES5Adapter().then(function(){Promise.all([t.e(0),t.e(2)]).then(t.bind(null,2)),Promise.all([t.e(0),t.e(6),t.e(3)]).then(t.bind(null,1))})}]);
//# sourceMappingURL=entrypoint.js.map
!function(e){function n(n){for(var t,o,i=n[0],a=n[1],u=0,f=[];u<i.length;u++)o=i[u],r[o]&&f.push(r[o][0]),r[o]=0;for(t in a)Object.prototype.hasOwnProperty.call(a,t)&&(e[t]=a[t]);for(c&&c(n);f.length;)f.shift()()}var t={},r={1:0};function o(n){if(t[n])return t[n].exports;var r=t[n]={i:n,l:!1,exports:{}};return e[n].call(r.exports,r,r.exports,o),r.l=!0,r.exports}o.e=function(e){var n=[],t=r[e];if(0!==t)if(t)n.push(t[2]);else{var i=new Promise(function(n,o){t=r[e]=[n,o]});n.push(t[2]=i);var a,u=document.createElement("script");u.charset="utf-8",u.timeout=120,o.nc&&u.setAttribute("nonce",o.nc),u.src=function(e){return o.p+"chunk."+{0:"1ac383635811d6c2cb4b",2:"381b1e7d41316cfb583c",3:"a6e3bc73416702354e6d",4:"8a4a3a3274af0f09d86b",5:"7589a9f39a552ee63688",6:"31b41b04602ce627ad98",7:"ff45557361d5d6bd46af"}[e]+".js"}(e),a=function(n){u.onerror=u.onload=null,clearTimeout(c);var t=r[e];if(0!==t){if(t){var o=n&&("load"===n.type?"missing":n.type),i=n&&n.target&&n.target.src,a=new Error("Loading chunk "+e+" failed.\n("+o+": "+i+")");a.type=o,a.request=i,t[1](a)}r[e]=void 0}};var c=setTimeout(function(){a({type:"timeout",target:u})},12e4);u.onerror=u.onload=a,document.head.appendChild(u)}return Promise.all(n)},o.m=e,o.c=t,o.d=function(e,n,t){o.o(e,n)||Object.defineProperty(e,n,{enumerable:!0,get:t})},o.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},o.t=function(e,n){if(1&n&&(e=o(e)),8&n)return e;if(4&n&&"object"==typeof e&&e&&e.__esModule)return e;var t=Object.create(null);if(o.r(t),Object.defineProperty(t,"default",{enumerable:!0,value:e}),2&n&&"string"!=typeof e)for(var r in e)o.d(t,r,function(n){return e[n]}.bind(null,r));return t},o.n=function(e){var n=e&&e.__esModule?function(){return e.default}:function(){return e};return o.d(n,"a",n),n},o.o=function(e,n){return Object.prototype.hasOwnProperty.call(e,n)},o.p="/api/hassio/app/",o.oe=function(e){throw console.error(e),e};var i=window.webpackJsonp=window.webpackJsonp||[],a=i.push.bind(i);i.push=n,i=i.slice();for(var u=0;u<i.length;u++)n(i[u]);var c=a;o(o.s=0)}([function(e,n,t){window.loadES5Adapter().then(function(){Promise.all([t.e(0),t.e(2)]).then(t.bind(null,2)),Promise.all([t.e(0),t.e(6),t.e(3)]).then(t.bind(null,1))});var r=document.createElement("style");r.innerHTML="\nbody {\n font-family: Roboto, sans-serif;\n -moz-osx-font-smoothing: grayscale;\n -webkit-font-smoothing: antialiased;\n font-weight: 400;\n margin: 0;\n padding: 0;\n height: 100vh;\n}\n",document.head.appendChild(r)}]);

Binary file not shown.

View File

@@ -35,7 +35,7 @@ class APIProxy(CoreSysAttributes):
elif not addon.access_homeassistant_api:
_LOGGER.warning("Not permitted API access: %s", addon.slug)
else:
_LOGGER.info("%s access from %s", request.path, addon.slug)
_LOGGER.debug("%s access from %s", request.path, addon.slug)
return
raise HTTPUnauthorized()
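The only change in this hunk is the log level: routine add-on API access now goes to DEBUG instead of INFO, so it no longer floods the default log but can still be switched on for troubleshooting. A minimal illustration of that gating, with a hypothetical logger name and handler setup standing in for the Supervisor's real logging configuration:

    import logging

    _LOGGER = logging.getLogger("proxy_example")  # hypothetical logger name
    logging.basicConfig(level=logging.INFO)       # default level: INFO and above are emitted

    def log_access(path: str, slug: str) -> None:
        # Silent at the default INFO level; raise basicConfig to DEBUG to see it again.
        _LOGGER.debug("%s access from %s", path, slug)

    log_access("/homeassistant/api/states", "core_mosquitto")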

View File

@@ -6,12 +6,19 @@ from aiohttp.web import middleware
from aiohttp.web_exceptions import HTTPUnauthorized, HTTPForbidden
from ..const import (
HEADER_TOKEN, REQUEST_FROM, ROLE_ADMIN, ROLE_DEFAULT, ROLE_HOMEASSISTANT,
ROLE_MANAGER, ROLE_BACKUP)
HEADER_TOKEN,
REQUEST_FROM,
ROLE_ADMIN,
ROLE_DEFAULT,
ROLE_HOMEASSISTANT,
ROLE_MANAGER,
ROLE_BACKUP,
)
from ..coresys import CoreSysAttributes
_LOGGER = logging.getLogger(__name__)
# fmt: off
# Block Anytime
BLACKLIST = re.compile(
@@ -65,7 +72,7 @@ ADDONS_ROLE_ACCESS = {
r"|/hardware/.+"
r"|/hassos/.+"
r"|/supervisor/.+"
r"|/addons(?:/[^/]+/(?!security).+)?"
r"|/addons(?:/[^/]+/(?!security).+|/reload)?"
r"|/snapshots.*"
r")$"
),
@@ -74,6 +81,8 @@ ADDONS_ROLE_ACCESS = {
),
}
# fmt: off
class SecurityMiddleware(CoreSysAttributes):
"""Security middleware functions."""
@@ -104,9 +113,7 @@ class SecurityMiddleware(CoreSysAttributes):
raise HTTPUnauthorized()
# Home-Assistant
# UUID check need removed with 131
if hassio_token in (self.sys_homeassistant.uuid,
self.sys_homeassistant.hassio_token):
if hassio_token == self.sys_homeassistant.hassio_token:
_LOGGER.debug("%s access from Home Assistant", request.path)
request_from = self.sys_homeassistant
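Two behavioural changes sit inside the reformatted middleware above: the manager role's allow-list gains /addons/reload, and the temporary UUID fallback is dropped so only the current Home Assistant token is accepted. A rough, standalone sketch of the role-to-regex check; the table below is an abbreviated illustration, not the Supervisor's full access map:

    import re
    from typing import Dict

    # Illustrative subset of a role -> allowed-path table.
    ROLE_ACCESS: Dict[str, re.Pattern] = {
        "manager": re.compile(
            r"^(?:"
            r"|/homeassistant/.+"
            r"|/addons(?:/[^/]+/(?!security).+|/reload)?"
            r"|/snapshots.*"
            r")$"
        ),
    }

    def allowed(role: str, path: str) -> bool:
        """Return True if the role may call the given API path."""
        matcher = ROLE_ACCESS.get(role)
        return bool(matcher and matcher.match(path))

    assert allowed("manager", "/addons/reload")             # newly permitted
    assert not allowed("manager", "/addons/self/security")  # still blocked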

View File

@@ -1,34 +1,57 @@
"""Init file for Hass.io Supervisor RESTful API."""
import asyncio
import logging
from typing import Any, Awaitable, Dict
from aiohttp import web
import voluptuous as vol
from .utils import api_process, api_process_raw, api_validate
from ..const import (
ATTR_ADDONS, ATTR_VERSION, ATTR_LAST_VERSION, ATTR_CHANNEL, ATTR_ARCH,
HASSIO_VERSION, ATTR_ADDONS_REPOSITORIES, ATTR_LOGO, ATTR_REPOSITORY,
ATTR_DESCRIPTON, ATTR_NAME, ATTR_SLUG, ATTR_INSTALLED, ATTR_TIMEZONE,
ATTR_STATE, ATTR_WAIT_BOOT, ATTR_CPU_PERCENT, ATTR_MEMORY_USAGE,
ATTR_MEMORY_LIMIT, ATTR_NETWORK_RX, ATTR_NETWORK_TX, ATTR_BLK_READ,
ATTR_BLK_WRITE, CONTENT_TYPE_BINARY, ATTR_ICON)
ATTR_ADDONS,
ATTR_ADDONS_REPOSITORIES,
ATTR_ARCH,
ATTR_BLK_READ,
ATTR_BLK_WRITE,
ATTR_CHANNEL,
ATTR_CPU_PERCENT,
ATTR_DESCRIPTON,
ATTR_ICON,
ATTR_INSTALLED,
ATTR_LAST_VERSION,
ATTR_LOGO,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_USAGE,
ATTR_NAME,
ATTR_NETWORK_RX,
ATTR_NETWORK_TX,
ATTR_REPOSITORY,
ATTR_SLUG,
ATTR_STATE,
ATTR_TIMEZONE,
ATTR_VERSION,
ATTR_WAIT_BOOT,
ATTR_IP_ADDRESS,
CONTENT_TYPE_BINARY,
HASSIO_VERSION,
)
from ..coresys import CoreSysAttributes
from ..validate import WAIT_BOOT, REPOSITORIES, CHANNELS
from ..exceptions import APIError
from ..utils.validate import validate_timezone
from ..validate import CHANNELS, REPOSITORIES, WAIT_BOOT
from .utils import api_process, api_process_raw, api_validate
_LOGGER = logging.getLogger(__name__)
SCHEMA_OPTIONS = vol.Schema({
SCHEMA_OPTIONS = vol.Schema(
{
vol.Optional(ATTR_CHANNEL): CHANNELS,
vol.Optional(ATTR_ADDONS_REPOSITORIES): REPOSITORIES,
vol.Optional(ATTR_TIMEZONE): validate_timezone,
vol.Optional(ATTR_WAIT_BOOT): WAIT_BOOT,
})
}
)
SCHEMA_VERSION = vol.Schema({
vol.Optional(ATTR_VERSION): vol.Coerce(str),
})
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
class APISupervisor(CoreSysAttributes):
@@ -40,12 +63,13 @@ class APISupervisor(CoreSysAttributes):
return True
@api_process
async def info(self, request):
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return host information."""
list_addons = []
for addon in self.sys_addons.list_addons:
if addon.is_installed:
list_addons.append({
list_addons.append(
{
ATTR_NAME: addon.name,
ATTR_SLUG: addon.slug,
ATTR_DESCRIPTON: addon.description,
@@ -55,13 +79,15 @@ class APISupervisor(CoreSysAttributes):
ATTR_REPOSITORY: addon.repository,
ATTR_ICON: addon.with_icon,
ATTR_LOGO: addon.with_logo,
})
}
)
return {
ATTR_VERSION: HASSIO_VERSION,
ATTR_LAST_VERSION: self.sys_updater.version_hassio,
ATTR_CHANNEL: self.sys_updater.channel,
ATTR_ARCH: self.sys_supervisor.arch,
ATTR_IP_ADDRESS: str(self.sys_supervisor.ip_address),
ATTR_WAIT_BOOT: self.sys_config.wait_boot,
ATTR_TIMEZONE: self.sys_config.timezone,
ATTR_ADDONS: list_addons,
@@ -69,7 +95,7 @@ class APISupervisor(CoreSysAttributes):
}
@api_process
async def options(self, request):
async def options(self, request: web.Request) -> None:
"""Set Supervisor options."""
body = await api_validate(SCHEMA_OPTIONS, request)
@@ -88,14 +114,11 @@ class APISupervisor(CoreSysAttributes):
self.sys_updater.save_data()
self.sys_config.save_data()
return True
@api_process
async def stats(self, request):
async def stats(self, request: web.Request) -> Dict[str, Any]:
"""Return resource information."""
stats = await self.sys_supervisor.stats()
if not stats:
raise APIError("No stats available")
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
@@ -108,31 +131,21 @@ class APISupervisor(CoreSysAttributes):
}
@api_process
async def update(self, request):
async def update(self, request: web.Request) -> None:
"""Update Supervisor OS."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_updater.version_hassio)
if version == self.sys_supervisor.version:
raise APIError("Version {} is already in use".format(version))
return await asyncio.shield(self.sys_supervisor.update(version))
await asyncio.shield(self.sys_supervisor.update(version))
@api_process
async def reload(self, request):
def reload(self, request: web.Request) -> Awaitable[None]:
"""Reload add-ons, configuration, etc."""
tasks = [
self.sys_updater.reload(),
]
results, _ = await asyncio.shield(asyncio.wait(tasks))
for result in results:
if result.exception() is not None:
raise APIError("Some reload task fails!")
return True
return asyncio.shield(self.sys_updater.reload())
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request):
def logs(self, request: web.Request) -> Awaitable[bytes]:
"""Return supervisor Docker logs."""
return self.sys_supervisor.logs()
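Besides type annotations, the supervisor API above stops returning booleans: update and reload now either succeed or raise APIError, and reload simply hands the caller a shielded awaitable instead of awaiting a task group itself. A simplified sketch of that return-the-awaitable pattern; UpdaterStub and APIStub are hypothetical stand-ins for the real Supervisor objects:

    import asyncio
    from typing import Awaitable

    class UpdaterStub:
        """Hypothetical stand-in for the Supervisor's updater."""

        async def reload(self) -> None:
            await asyncio.sleep(0)  # pretend to refresh version information

    class APIStub:
        def __init__(self) -> None:
            self.updater = UpdaterStub()

        def reload(self) -> Awaitable[None]:
            # Not a coroutine itself: the caller awaits the result, and shielding
            # keeps the reload running even if the HTTP request is cancelled.
            return asyncio.shield(self.updater.reload())

    async def main() -> None:
        await APIStub().reload()

    asyncio.run(main())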

View File

@@ -19,6 +19,7 @@ from .discovery import Discovery
from .hassos import HassOS
from .homeassistant import HomeAssistant
from .host import HostManager
from .ingress import Ingress
from .services import ServiceManager
from .snapshots import SnapshotManager
from .supervisor import Supervisor
@@ -49,6 +50,7 @@ async def initialize_coresys():
coresys.addons = AddonManager(coresys)
coresys.snapshots = SnapshotManager(coresys)
coresys.host = HostManager(coresys)
coresys.ingress = Ingress(coresys)
coresys.tasks = Tasks(coresys)
coresys.services = ServiceManager(coresys)
coresys.discovery = Discovery(coresys)
@@ -71,8 +73,9 @@ def initialize_system_data(coresys):
# Home Assistant configuration folder
if not config.path_homeassistant.is_dir():
_LOGGER.info("Create Home Assistant configuration folder %s",
config.path_homeassistant)
_LOGGER.info(
"Create Home Assistant configuration folder %s", config.path_homeassistant
)
config.path_homeassistant.mkdir()
# hassio ssl folder
@@ -82,18 +85,19 @@ def initialize_system_data(coresys):
# hassio addon data folder
if not config.path_addons_data.is_dir():
_LOGGER.info("Create Hass.io Add-on data folder %s",
config.path_addons_data)
_LOGGER.info("Create Hass.io Add-on data folder %s", config.path_addons_data)
config.path_addons_data.mkdir(parents=True)
if not config.path_addons_local.is_dir():
_LOGGER.info("Create Hass.io Add-on local repository folder %s",
config.path_addons_local)
_LOGGER.info(
"Create Hass.io Add-on local repository folder %s", config.path_addons_local
)
config.path_addons_local.mkdir(parents=True)
if not config.path_addons_git.is_dir():
_LOGGER.info("Create Hass.io Add-on git repositories folder %s",
config.path_addons_git)
_LOGGER.info(
"Create Hass.io Add-on git repositories folder %s", config.path_addons_git
)
config.path_addons_git.mkdir(parents=True)
# hassio tmp folder
@@ -154,7 +158,8 @@ def initialize_logging():
"ERROR": "red",
"CRITICAL": "red",
},
))
)
)
def check_environment():
@@ -188,19 +193,16 @@ def check_environment():
def reg_signal(loop):
"""Register SIGTERM and SIGKILL to stop system."""
try:
loop.add_signal_handler(signal.SIGTERM,
lambda: loop.call_soon(loop.stop))
loop.add_signal_handler(signal.SIGTERM, lambda: loop.call_soon(loop.stop))
except (ValueError, RuntimeError):
_LOGGER.warning("Could not bind to SIGTERM")
try:
loop.add_signal_handler(signal.SIGHUP,
lambda: loop.call_soon(loop.stop))
loop.add_signal_handler(signal.SIGHUP, lambda: loop.call_soon(loop.stop))
except (ValueError, RuntimeError):
_LOGGER.warning("Could not bind to SIGHUP")
try:
loop.add_signal_handler(signal.SIGINT,
lambda: loop.call_soon(loop.stop))
loop.add_signal_handler(signal.SIGINT, lambda: loop.call_soon(loop.stop))
except (ValueError, RuntimeError):
_LOGGER.warning("Could not bind to SIGINT")

View File

@@ -2,14 +2,17 @@
from pathlib import Path
from ipaddress import ip_network
HASSIO_VERSION = "146"
HASSIO_VERSION = "153"
URL_HASSIO_ADDONS = "https://github.com/home-assistant/hassio-addons"
URL_HASSIO_VERSION = "https://s3.amazonaws.com/hassio-version/{channel}.json"
URL_HASSIO_APPARMOR = "https://s3.amazonaws.com/hassio-version/apparmor.txt"
URL_HASSOS_OTA = ("https://github.com/home-assistant/hassos/releases/download/"
"{version}/hassos_{board}-{version}.raucb")
URL_HASSOS_OTA = (
"https://github.com/home-assistant/hassos/releases/download/"
"{version}/hassos_{board}-{version}.raucb"
)
HASSIO_DATA = Path("/data")
@@ -20,6 +23,7 @@ FILE_HASSIO_HOMEASSISTANT = Path(HASSIO_DATA, "homeassistant.json")
FILE_HASSIO_UPDATER = Path(HASSIO_DATA, "updater.json")
FILE_HASSIO_SERVICES = Path(HASSIO_DATA, "services.json")
FILE_HASSIO_DISCOVERY = Path(HASSIO_DATA, "discovery.json")
FILE_HASSIO_INGRESS = Path(HASSIO_DATA, "ingress.json")
SOCKET_DOCKER = Path("/var/run/docker.sock")
@@ -49,8 +53,9 @@ CONTENT_TYPE_JSON = "application/json"
CONTENT_TYPE_TEXT = "text/plain"
CONTENT_TYPE_TAR = "application/tar"
CONTENT_TYPE_URL = "application/x-www-form-urlencoded"
HEADER_HA_ACCESS = "x-ha-access"
HEADER_TOKEN = "x-hassio-key"
HEADER_HA_ACCESS = "X-Ha-Access"
HEADER_TOKEN = "X-Hassio-Key"
COOKIE_INGRESS = "ingress_session"
ENV_TOKEN = "HASSIO_TOKEN"
ENV_TIME = "TZ"
@@ -157,7 +162,6 @@ ATTR_ADDON = "addon"
ATTR_AVAILABLE = "available"
ATTR_HOST = "host"
ATTR_USERNAME = "username"
ATTR_PROTOCOL = "protocol"
ATTR_DISCOVERY = "discovery"
ATTR_CONFIG = "config"
ATTR_SERVICES = "services"
@@ -186,8 +190,14 @@ ATTR_SUPERVISOR = "supervisor"
ATTR_AUTH_API = "auth_api"
ATTR_KERNEL_MODULES = "kernel_modules"
ATTR_SUPPORTED_ARCH = "supported_arch"
ATTR_INGRESS = "ingress"
ATTR_INGRESS_PORT = "ingress_port"
ATTR_INGRESS_ENTRY = "ingress_entry"
ATTR_INGRESS_TOKEN = "ingress_token"
ATTR_INGRESS_URL = "ingress_url"
ATTR_IP_ADDRESS = "ip_address"
ATTR_SESSION = "session"
SERVICE_MQTT = "mqtt"
PROVIDE_SERVICE = "provide"
NEED_SERVICE = "need"
WANT_SERVICE = "want"
@@ -199,8 +209,11 @@ STARTUP_APPLICATION = "application"
STARTUP_ONCE = "once"
STARTUP_ALL = [
STARTUP_ONCE, STARTUP_INITIALIZE, STARTUP_SYSTEM, STARTUP_SERVICES,
STARTUP_APPLICATION
STARTUP_ONCE,
STARTUP_INITIALIZE,
STARTUP_SYSTEM,
STARTUP_SERVICES,
STARTUP_APPLICATION,
]
BOOT_AUTO = "auto"
@@ -281,13 +294,7 @@ ROLE_BACKUP = "backup"
ROLE_MANAGER = "manager"
ROLE_ADMIN = "admin"
ROLE_ALL = [
ROLE_DEFAULT,
ROLE_HOMEASSISTANT,
ROLE_BACKUP,
ROLE_MANAGER,
ROLE_ADMIN,
]
ROLE_ALL = [ROLE_DEFAULT, ROLE_HOMEASSISTANT, ROLE_BACKUP, ROLE_MANAGER, ROLE_ADMIN]
CHAN_ID = "chan_id"
CHAN_TYPE = "chan_type"
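Note that the header constants only change their canonical casing (x-hassio-key becomes X-Hassio-Key); aiohttp stores request headers in a case-insensitive multidict, so existing clients keep working. A tiny demonstration using multidict, which aiohttp depends on (the token value is made up):

    from multidict import CIMultiDict

    HEADER_TOKEN = "X-Hassio-Key"

    # aiohttp's request.headers behaves like this case-insensitive multidict,
    # so the old lower-case spelling still matches the new constant.
    headers = CIMultiDict({"x-hassio-key": "abc123"})
    assert headers.get(HEADER_TOKEN) == "abc123"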

View File

@@ -6,8 +6,12 @@ import logging
import async_timeout
from .coresys import CoreSysAttributes
from .const import (STARTUP_SYSTEM, STARTUP_SERVICES, STARTUP_APPLICATION,
STARTUP_INITIALIZE)
from .const import (
STARTUP_SYSTEM,
STARTUP_SERVICES,
STARTUP_APPLICATION,
STARTUP_INITIALIZE,
)
from .exceptions import HassioError, HomeAssistantError
_LOGGER = logging.getLogger(__name__)
@@ -58,6 +62,9 @@ class HassIO(CoreSysAttributes):
# Load discovery
await self.sys_discovery.load()
# Load ingress
await self.sys_ingress.load()
# start dns forwarding
self.sys_create_task(self.sys_dns.start())
@@ -108,7 +115,7 @@ class HassIO(CoreSysAttributes):
await self.sys_tasks.load()
# If landingpage / run upgrade in background
if self.sys_homeassistant.version == 'landingpage':
if self.sys_homeassistant.version == "landingpage":
self.sys_create_task(self.sys_homeassistant.install())
_LOGGER.info("Hass.io is up and running")
@@ -121,12 +128,15 @@ class HassIO(CoreSysAttributes):
# process async stop tasks
try:
with async_timeout.timeout(10):
await asyncio.wait([
await asyncio.wait(
[
self.sys_api.stop(),
self.sys_dns.stop(),
self.sys_websession.close(),
self.sys_websession_ssl.close()
])
self.sys_websession_ssl.close(),
self.sys_ingress.unload(),
]
)
except asyncio.TimeoutError:
_LOGGER.warning("Force Shutdown!")

View File

@@ -23,6 +23,7 @@ if TYPE_CHECKING:
from .hassos import HassOS
from .homeassistant import HomeAssistant
from .host import HostManager
from .ingress import Ingress
from .services import ServiceManager
from .snapshots import SnapshotManager
from .supervisor import Supervisor
@@ -63,6 +64,7 @@ class CoreSys:
self._snapshots: SnapshotManager = None
self._tasks: Tasks = None
self._host: HostManager = None
self._ingress: Ingress = None
self._dbus: DBusManager = None
self._hassos: HassOS = None
self._services: ServiceManager = None
@@ -293,6 +295,18 @@ class CoreSys:
raise RuntimeError("HostManager already set!")
self._host = value
@property
def ingress(self) -> Ingress:
"""Return Ingress object."""
return self._ingress
@ingress.setter
def ingress(self, value: Ingress):
"""Set a Ingress object."""
if self._ingress:
raise RuntimeError("Ingress already set!")
self._ingress = value
@property
def hassos(self) -> HassOS:
"""Return HassOS object."""
@@ -441,6 +455,11 @@ class CoreSysAttributes:
"""Return HostManager object."""
return self.coresys.host
@property
def sys_ingress(self) -> Ingress:
"""Return Ingress object."""
return self.coresys.ingress
@property
def sys_hassos(self) -> HassOS:
"""Return HassOS object."""

View File

@@ -1,35 +1,50 @@
"""Handle discover message for Home Assistant."""
import logging
from __future__ import annotations
from contextlib import suppress
from uuid import uuid4
import logging
from typing import Any, Dict, List, Optional, TYPE_CHECKING
from uuid import uuid4, UUID
import attr
import voluptuous as vol
from voluptuous.humanize import humanize_error
from .const import FILE_HASSIO_DISCOVERY, ATTR_CONFIG, ATTR_DISCOVERY
from .coresys import CoreSysAttributes
from .exceptions import DiscoveryError, HomeAssistantAPIError
from .validate import SCHEMA_DISCOVERY_CONFIG
from .utils.json import JsonConfig
from .services.validate import DISCOVERY_SERVICES
from ..const import ATTR_CONFIG, ATTR_DISCOVERY, FILE_HASSIO_DISCOVERY
from ..coresys import CoreSys, CoreSysAttributes
from ..exceptions import DiscoveryError, HomeAssistantAPIError
from ..utils.json import JsonConfig
from .validate import SCHEMA_DISCOVERY_CONFIG, valid_discovery_config
if TYPE_CHECKING:
from ..addons.addon import Addon
_LOGGER = logging.getLogger(__name__)
CMD_NEW = 'post'
CMD_DEL = 'delete'
CMD_NEW = "post"
CMD_DEL = "delete"
@attr.s
class Message:
"""Represent a single Discovery message."""
addon: str = attr.ib()
service: str = attr.ib()
config: Dict[str, Any] = attr.ib(cmp=False)
uuid: UUID = attr.ib(factory=lambda: uuid4().hex, cmp=False)
class Discovery(CoreSysAttributes, JsonConfig):
"""Home Assistant Discovery handler."""
def __init__(self, coresys):
def __init__(self, coresys: CoreSys):
"""Initialize discovery handler."""
super().__init__(FILE_HASSIO_DISCOVERY, SCHEMA_DISCOVERY_CONFIG)
self.coresys = coresys
self.message_obj = {}
self.coresys: CoreSys = coresys
self.message_obj: Dict[str, Message] = {}
async def load(self):
async def load(self) -> None:
"""Load exists discovery message into storage."""
messages = {}
for message in self._data[ATTR_DISCOVERY]:
@@ -39,9 +54,9 @@ class Discovery(CoreSysAttributes, JsonConfig):
_LOGGER.info("Load %d messages", len(messages))
self.message_obj = messages
def save(self):
def save(self) -> None:
"""Write discovery message into data file."""
messages = []
messages: List[Dict[str, Any]] = []
for message in self.list_messages:
messages.append(attr.asdict(message))
@@ -49,22 +64,21 @@ class Discovery(CoreSysAttributes, JsonConfig):
self._data[ATTR_DISCOVERY].extend(messages)
self.save_data()
def get(self, uuid):
def get(self, uuid: str) -> Optional[Message]:
"""Return discovery message."""
return self.message_obj.get(uuid)
@property
def list_messages(self):
def list_messages(self) -> List[Message]:
"""Return list of available discovery messages."""
return list(self.message_obj.values())
def send(self, addon, service, config):
def send(self, addon: Addon, service: str, config: Dict[str, Any]) -> Message:
"""Send a discovery message to Home Assistant."""
try:
config = DISCOVERY_SERVICES[service](config)
config = valid_discovery_config(service, config)
except vol.Invalid as err:
_LOGGER.error(
"Invalid discovery %s config", humanize_error(config, err))
_LOGGER.error("Invalid discovery %s config", humanize_error(config, err))
raise DiscoveryError() from None
# Create message
@@ -77,24 +91,26 @@ class Discovery(CoreSysAttributes, JsonConfig):
_LOGGER.info("Duplicate discovery message from %s", addon.slug)
return old_message
_LOGGER.info("Send discovery to Home Assistant %s from %s",
service, addon.slug)
_LOGGER.info("Send discovery to Home Assistant %s from %s", service, addon.slug)
self.message_obj[message.uuid] = message
self.save()
self.sys_create_task(self._push_discovery(message, CMD_NEW))
return message
def remove(self, message):
def remove(self, message: Message) -> None:
"""Remove a discovery message from Home Assistant."""
self.message_obj.pop(message.uuid, None)
self.save()
_LOGGER.info("Delete discovery to Home Assistant %s from %s",
message.service, message.addon)
_LOGGER.info(
"Delete discovery to Home Assistant %s from %s",
message.service,
message.addon,
)
self.sys_create_task(self._push_discovery(message, CMD_DEL))
async def _push_discovery(self, message, command):
async def _push_discovery(self, message: Message, command: str) -> None:
"""Send a discovery request."""
if not await self.sys_homeassistant.check_api_state():
_LOGGER.info("Discovery %s mesage ignore", message.uuid)
@@ -105,18 +121,12 @@ class Discovery(CoreSysAttributes, JsonConfig):
with suppress(HomeAssistantAPIError):
async with self.sys_homeassistant.make_request(
command, f"api/hassio_push/discovery/{message.uuid}",
json=data, timeout=10):
command,
f"api/hassio_push/discovery/{message.uuid}",
json=data,
timeout=10,
):
_LOGGER.info("Discovery %s message send", message.uuid)
return
_LOGGER.warning("Discovery %s message fail", message.uuid)
@attr.s
class Message:
"""Represent a single Discovery message."""
addon = attr.ib()
service = attr.ib()
config = attr.ib(cmp=False)
uuid = attr.ib(factory=lambda: uuid4().hex, cmp=False)
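The discovery refactor moves the Message record above the handler and types its fields. Because config and uuid are excluded from comparisons, a repeated announcement from the same add-on and service is detected as a duplicate even when the payload differs. A small self-contained sketch of that behaviour; it uses eq=False, the current attrs spelling of the diff's cmp=False, and types uuid as str since the factory produces a hex string:

    from typing import Any, Dict
    from uuid import uuid4

    import attr

    @attr.s
    class Message:
        """Represent a single discovery message."""

        addon: str = attr.ib()
        service: str = attr.ib()
        config: Dict[str, Any] = attr.ib(eq=False)                  # ignored for ==
        uuid: str = attr.ib(factory=lambda: uuid4().hex, eq=False)  # ignored for ==

    first = Message("core_deconz", "deconz", {"host": "172.30.33.1"})
    second = Message("core_deconz", "deconz", {"host": "172.30.33.2"})
    assert first == second  # only addon and service take part in the comparison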

hassio/discovery/const.py Normal file
View File

@@ -0,0 +1,10 @@
"""Discovery static data."""
ATTR_HOST = "host"
ATTR_PASSWORD = "password"
ATTR_PORT = "port"
ATTR_PROTOCOL = "protocol"
ATTR_SSL = "ssl"
ATTR_USERNAME = "username"
ATTR_API_KEY = "api_key"
ATTR_SERIAL = "serial"

View File

@@ -0,0 +1 @@
"""Discovery service modules."""

View File

@@ -0,0 +1,16 @@
"""Discovery service for MQTT."""
import voluptuous as vol
from hassio.validate import NETWORK_PORT
from ..const import ATTR_HOST, ATTR_PORT, ATTR_API_KEY, ATTR_SERIAL
SCHEMA = vol.Schema(
{
vol.Required(ATTR_HOST): vol.Coerce(str),
vol.Required(ATTR_PORT): NETWORK_PORT,
vol.Required(ATTR_SERIAL): vol.Coerce(str),
vol.Required(ATTR_API_KEY): vol.Coerce(str),
}
)

View File

@@ -0,0 +1,27 @@
"""Discovery service for MQTT."""
import voluptuous as vol
from hassio.validate import NETWORK_PORT
from ..const import (
ATTR_HOST,
ATTR_PASSWORD,
ATTR_PORT,
ATTR_PROTOCOL,
ATTR_SSL,
ATTR_USERNAME,
)
# pylint: disable=no-value-for-parameter
SCHEMA = vol.Schema(
{
vol.Required(ATTR_HOST): vol.Coerce(str),
vol.Required(ATTR_PORT): NETWORK_PORT,
vol.Optional(ATTR_USERNAME): vol.Coerce(str),
vol.Optional(ATTR_PASSWORD): vol.Coerce(str),
vol.Optional(ATTR_SSL, default=False): vol.Boolean(),
vol.Optional(ATTR_PROTOCOL, default="3.1.1"): vol.All(
vol.Coerce(str), vol.In(["3.1", "3.1.1"])
),
}
)
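Each discovery service now ships its schema as a standalone module. For reference, this is how such a voluptuous schema behaves when it validates a payload; NETWORK_PORT from hassio.validate is replaced by a plain range check so the snippet stands alone, and the field names mirror the MQTT schema above:

    import voluptuous as vol

    NETWORK_PORT = vol.All(vol.Coerce(int), vol.Range(min=1, max=65535))  # simplified stand-in

    SCHEMA = vol.Schema(
        {
            vol.Required("host"): vol.Coerce(str),
            vol.Required("port"): NETWORK_PORT,
            vol.Optional("ssl", default=False): vol.Boolean(),
            vol.Optional("protocol", default="3.1.1"): vol.In(["3.1", "3.1.1"]),
        }
    )

    config = SCHEMA({"host": "172.30.32.1", "port": "1883"})
    assert config["port"] == 1883  # Coerce turned the string into an int
    assert config["ssl"] is False  # defaults were filled in

    try:
        SCHEMA({"host": "only-a-host"})  # missing the required port
    except vol.Invalid as err:
        print("rejected:", err)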

View File

@@ -0,0 +1,47 @@
"""Validate services schema."""
from pathlib import Path
from importlib import import_module
import voluptuous as vol
from ..const import ATTR_ADDON, ATTR_CONFIG, ATTR_DISCOVERY, ATTR_SERVICE, ATTR_UUID
from ..utils.validate import schema_or
from ..validate import UUID_MATCH
def valid_discovery_service(service):
"""Validate service name."""
service_file = Path(__file__).parent.joinpath(f"services/{service}.py")
if not service_file.exists():
raise vol.Invalid(f"Service {service} not found")
return service
def valid_discovery_config(service, config):
"""Validate service name."""
try:
service_mod = import_module(f".services.{service}", "hassio.discovery")
except ImportError:
raise vol.Invalid(f"Service {service} not found")
return service_mod.SCHEMA(config)
SCHEMA_DISCOVERY = vol.Schema(
[
vol.Schema(
{
vol.Required(ATTR_UUID): UUID_MATCH,
vol.Required(ATTR_ADDON): vol.Coerce(str),
vol.Required(ATTR_SERVICE): valid_discovery_service,
vol.Required(ATTR_CONFIG): vol.Maybe(dict),
},
extra=vol.REMOVE_EXTRA,
)
]
)
SCHEMA_DISCOVERY_CONFIG = vol.Schema(
{vol.Optional(ATTR_DISCOVERY, default=list): schema_or(SCHEMA_DISCOVERY)},
extra=vol.REMOVE_EXTRA,
)
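valid_discovery_config above resolves the schema by importing hassio.discovery.services.<service> at validation time, so adding a new service only means dropping a module with a SCHEMA constant into that package. A stripped-down, package-agnostic sketch of the same idea (the package name is a placeholder):

    from importlib import import_module
    from typing import Any, Dict

    import voluptuous as vol

    def valid_discovery_config(
        service: str, config: Dict[str, Any], package: str = "myapp.discovery"
    ) -> Dict[str, Any]:
        """Validate a payload against <package>.services.<service>.SCHEMA."""
        try:
            service_mod = import_module(f".services.{service}", package)
        except ImportError:
            raise vol.Invalid(f"Service {service} not found") from None
        return service_mod.SCHEMA(config)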

View File

@@ -1,12 +1,14 @@
"""Init file for Hass.io Docker object."""
from contextlib import suppress
import logging
from contextlib import suppress
from typing import Any, Dict, Optional
import attr
import docker
from .network import DockerNetwork
from ..const import SOCKET_DOCKER
from ..exceptions import DockerAPIError
from .network import DockerNetwork
_LOGGER = logging.getLogger(__name__)
@@ -14,8 +16,9 @@ _LOGGER = logging.getLogger(__name__)
@attr.s(frozen=True)
class CommandReturn:
"""Return object from command run."""
exit_code = attr.ib()
output = attr.ib()
exit_code: int = attr.ib()
output: bytes = attr.ib()
class DockerAPI:
@@ -26,75 +29,87 @@ class DockerAPI:
def __init__(self):
"""Initialize Docker base wrapper."""
self.docker = docker.DockerClient(
base_url="unix:/{}".format(str(SOCKET_DOCKER)),
version='auto', timeout=900)
self.network = DockerNetwork(self.docker)
self.docker: docker.DockerClient = docker.DockerClient(
base_url="unix:/{}".format(str(SOCKET_DOCKER)), version="auto", timeout=900
)
self.network: DockerNetwork = DockerNetwork(self.docker)
@property
def images(self):
def images(self) -> docker.models.images.ImageCollection:
"""Return API images."""
return self.docker.images
@property
def containers(self):
def containers(self) -> docker.models.containers.ContainerCollection:
"""Return API containers."""
return self.docker.containers
@property
def api(self):
def api(self) -> docker.APIClient:
"""Return API containers."""
return self.docker.api
def run(self, image, **kwargs):
def run(
self, image: str, **kwargs: Dict[str, Any]
) -> docker.models.containers.Container:
""""Create a Docker container and run it.
Need run inside executor.
"""
name = kwargs.get('name', image)
network_mode = kwargs.get('network_mode')
hostname = kwargs.get('hostname')
name = kwargs.get("name", image)
network_mode = kwargs.get("network_mode")
hostname = kwargs.get("hostname")
# Setup network
kwargs['dns_search'] = ["."]
kwargs["dns_search"] = ["."]
if network_mode:
kwargs['dns'] = [str(self.network.supervisor)]
kwargs['dns_opt'] = ["ndots:0"]
kwargs["dns"] = [str(self.network.supervisor)]
kwargs["dns_opt"] = ["ndots:0"]
else:
kwargs['network'] = None
kwargs["network"] = None
# Create container
try:
container = self.docker.containers.create(
image, use_config_proxy=False, **kwargs)
image, use_config_proxy=False, **kwargs
)
except docker.errors.DockerException as err:
_LOGGER.error("Can't create container from %s: %s", name, err)
return False
raise DockerAPIError() from None
# attach network
# Attach network
if not network_mode:
alias = [hostname] if hostname else None
if self.network.attach_container(container, alias=alias):
self.network.detach_default_bridge(container)
else:
try:
self.network.attach_container(container, alias=alias)
except DockerAPIError:
_LOGGER.warning("Can't attach %s to hassio-net!", name)
else:
with suppress(DockerAPIError):
self.network.detach_default_bridge(container)
# run container
# Run container
try:
container.start()
except docker.errors.DockerException as err:
_LOGGER.error("Can't start %s: %s", name, err)
return False
raise DockerAPIError() from None
return True
# Update metadata
with suppress(docker.errors.DockerException):
container.reload()
def run_command(self, image, command=None, **kwargs):
return container
def run_command(
self, image: str, command: Optional[str] = None, **kwargs: Dict[str, Any]
) -> CommandReturn:
"""Create a temporary container and run command.
Need run inside executor.
"""
stdout = kwargs.get('stdout', True)
stderr = kwargs.get('stderr', True)
stdout = kwargs.get("stdout", True)
stderr = kwargs.get("stderr", True)
_LOGGER.info("Run command '%s' on %s", command, image)
try:
@@ -112,11 +127,11 @@ class DockerAPI:
except docker.errors.DockerException as err:
_LOGGER.error("Can't execute command: %s", err)
return CommandReturn(None, b"")
raise DockerAPIError() from None
finally:
# cleanup container
with suppress(docker.errors.DockerException):
container.remove(force=True)
return CommandReturn(result.get('StatusCode'), output)
return CommandReturn(result.get("StatusCode"), output)
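The DockerAPI changes replace return-False error handling with a DockerAPIError exception and use contextlib.suppress for best-effort cleanup. A reduced sketch of that error-translation pattern; DockerAPIError and the exact calls are stand-ins rather than the Supervisor's classes:

    import logging
    from contextlib import suppress

    import docker

    _LOGGER = logging.getLogger(__name__)

    class DockerAPIError(Exception):
        """Raised when a Docker operation fails (stand-in for hassio.exceptions)."""

    def start_container(
        client: docker.DockerClient, image: str
    ) -> docker.models.containers.Container:
        """Create and start a container, translating Docker errors into one exception type."""
        try:
            container = client.containers.create(image)
            container.start()
        except docker.errors.DockerException as err:
            _LOGGER.error("Can't start %s: %s", image, err)
            raise DockerAPIError() from None

        # Best effort: refresh the container metadata, but never fail because of it.
        with suppress(docker.errors.DockerException):
            container.reload()
        return container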

View File

@@ -1,15 +1,35 @@
"""Init file for Hass.io add-on Docker object."""
from __future__ import annotations
from contextlib import suppress
from ipaddress import IPv4Address, ip_address
import logging
import os
from pathlib import Path
from typing import TYPE_CHECKING, Dict, List, Optional, Union, Awaitable
import docker
import requests
from .interface import DockerInterface
from ..addons.build import AddonBuild
from ..const import (MAP_CONFIG, MAP_SSL, MAP_ADDONS, MAP_BACKUP, MAP_SHARE,
ENV_TOKEN, ENV_TIME, SECURITY_PROFILE, SECURITY_DISABLE)
from ..const import (
ENV_TIME,
ENV_TOKEN,
MAP_ADDONS,
MAP_BACKUP,
MAP_CONFIG,
MAP_SHARE,
MAP_SSL,
SECURITY_DISABLE,
SECURITY_PROFILE,
)
from ..coresys import CoreSys
from ..exceptions import DockerAPIError
from ..utils import process_lock
from .interface import DockerInterface
if TYPE_CHECKING:
from ..addons.addon import Addon
_LOGGER = logging.getLogger(__name__)
@@ -19,64 +39,77 @@ AUDIO_DEVICE = "/dev/snd:/dev/snd:rwm"
class DockerAddon(DockerInterface):
"""Docker Hass.io wrapper for Home Assistant."""
def __init__(self, coresys, slug):
def __init__(self, coresys: CoreSys, slug: str):
"""Initialize Docker Home Assistant wrapper."""
super().__init__(coresys)
self._id = slug
self._id: str = slug
@property
def addon(self):
def addon(self) -> Addon:
"""Return add-on of Docker image."""
return self.sys_addons.get(self._id)
@property
def image(self):
def image(self) -> str:
"""Return name of Docker image."""
return self.addon.image
@property
def timeout(self):
def ip_address(self) -> IPv4Address:
"""Return IP address of this container."""
if self.addon.host_network:
return self.sys_docker.network.gateway
# Extract IP-Address
try:
return ip_address(
self._meta["NetworkSettings"]["Networks"]["hassio"]["IPAddress"])
except (KeyError, TypeError, ValueError):
return ip_address("0.0.0.0")
@property
def timeout(self) -> int:
"""Return timeout for Docker actions."""
return self.addon.timeout
@property
def version(self):
def version(self) -> str:
"""Return version of Docker image."""
if self.addon.legacy:
return self.addon.version_installed
return super().version
@property
def arch(self):
def arch(self) -> str:
"""Return arch of Docker image."""
if self.addon.legacy:
return self.sys_arch.default
return super().arch
@property
def name(self):
def name(self) -> str:
"""Return name of Docker container."""
return "addon_{}".format(self.addon.slug)
return f"addon_{self.addon.slug}"
@property
def ipc(self):
def ipc(self) -> Optional[str]:
"""Return the IPC namespace."""
if self.addon.host_ipc:
return 'host'
return "host"
return None
@property
def full_access(self):
def full_access(self) -> bool:
"""Return True if full access is enabled."""
return not self.addon.protected and self.addon.with_full_access
@property
def hostname(self):
def hostname(self) -> str:
"""Return slug/id of add-on."""
return self.addon.slug.replace('_', '-')
return self.addon.slug.replace("_", "-")
@property
def environment(self):
def environment(self) -> Dict[str, str]:
"""Return environment for Docker add-on."""
addon_env = self.addon.environment or {}
@@ -86,8 +119,7 @@ class DockerAddon(DockerInterface):
if isinstance(value, (int, str)):
addon_env[key] = value
else:
_LOGGER.warning(
"Can not set nested option %s as Docker env", key)
_LOGGER.warning("Can not set nested option %s as Docker env", key)
return {
**addon_env,
@@ -96,7 +128,7 @@ class DockerAddon(DockerInterface):
}
@property
def devices(self):
def devices(self) -> List[str]:
"""Return needed devices."""
devices = self.addon.devices or []
@@ -113,7 +145,7 @@ class DockerAddon(DockerInterface):
return devices or None
@property
def ports(self):
def ports(self) -> Optional[Dict[str, Union[str, int, None]]]:
"""Filter None from add-on ports."""
if not self.addon.ports:
return None
@@ -125,7 +157,7 @@ class DockerAddon(DockerInterface):
}
@property
def security_opt(self):
def security_opt(self) -> List[str]:
"""Controlling security options."""
security = []
@@ -143,7 +175,7 @@ class DockerAddon(DockerInterface):
return security
@property
def tmpfs(self):
def tmpfs(self) -> Optional[Dict[str, str]]:
"""Return tmpfs for Docker add-on."""
options = self.addon.tmpfs
if options:
@@ -151,156 +183,148 @@ class DockerAddon(DockerInterface):
return None
@property
def network_mapping(self):
def network_mapping(self) -> Dict[str, str]:
"""Return hosts mapping."""
return {
'homeassistant': self.sys_docker.network.gateway,
'hassio': self.sys_docker.network.supervisor,
"homeassistant": self.sys_docker.network.gateway,
"hassio": self.sys_docker.network.supervisor,
}
@property
def network_mode(self):
def network_mode(self) -> Optional[str]:
"""Return network mode for add-on."""
if self.addon.host_network:
return 'host'
return "host"
return None
@property
def pid_mode(self):
def pid_mode(self) -> Optional[str]:
"""Return PID mode for add-on."""
if not self.addon.protected and self.addon.host_pid:
return 'host'
return "host"
return None
@property
def volumes(self):
def volumes(self) -> Dict[str, Dict[str, str]]:
"""Generate volumes for mappings."""
volumes = {
str(self.addon.path_extern_data): {
'bind': "/data",
'mode': 'rw'
}
}
volumes = {str(self.addon.path_extern_data): {"bind": "/data", "mode": "rw"}}
addon_mapping = self.addon.map_volumes
# setup config mappings
if MAP_CONFIG in addon_mapping:
volumes.update({
volumes.update(
{
str(self.sys_config.path_extern_homeassistant): {
'bind': "/config",
'mode': addon_mapping[MAP_CONFIG]
"bind": "/config",
"mode": addon_mapping[MAP_CONFIG],
}
})
}
)
if MAP_SSL in addon_mapping:
volumes.update({
volumes.update(
{
str(self.sys_config.path_extern_ssl): {
'bind': "/ssl",
'mode': addon_mapping[MAP_SSL]
"bind": "/ssl",
"mode": addon_mapping[MAP_SSL],
}
})
}
)
if MAP_ADDONS in addon_mapping:
volumes.update({
volumes.update(
{
str(self.sys_config.path_extern_addons_local): {
'bind': "/addons",
'mode': addon_mapping[MAP_ADDONS]
"bind": "/addons",
"mode": addon_mapping[MAP_ADDONS],
}
})
}
)
if MAP_BACKUP in addon_mapping:
volumes.update({
volumes.update(
{
str(self.sys_config.path_extern_backup): {
'bind': "/backup",
'mode': addon_mapping[MAP_BACKUP]
"bind": "/backup",
"mode": addon_mapping[MAP_BACKUP],
}
})
}
)
if MAP_SHARE in addon_mapping:
volumes.update({
volumes.update(
{
str(self.sys_config.path_extern_share): {
'bind': "/share",
'mode': addon_mapping[MAP_SHARE]
"bind": "/share",
"mode": addon_mapping[MAP_SHARE],
}
})
}
)
# Init other hardware mappings
# GPIO support
if self.addon.with_gpio and self.sys_hardware.support_gpio:
for gpio_path in ("/sys/class/gpio", "/sys/devices/platform/soc"):
volumes.update({
gpio_path: {
'bind': gpio_path,
'mode': 'rw'
},
})
volumes.update({gpio_path: {"bind": gpio_path, "mode": "rw"}})
# DeviceTree support
if self.addon.with_devicetree:
volumes.update({
volumes.update(
{
"/sys/firmware/devicetree/base": {
'bind': "/device-tree",
'mode': 'ro'
},
})
"bind": "/device-tree",
"mode": "ro",
}
}
)
# Kernel Modules support
if self.addon.with_kernel_modules:
volumes.update({
"/lib/modules": {
'bind': "/lib/modules",
'mode': 'ro'
},
})
volumes.update({"/lib/modules": {"bind": "/lib/modules", "mode": "ro"}})
# Docker API support
if not self.addon.protected and self.addon.access_docker_api:
volumes.update({
"/var/run/docker.sock": {
'bind': "/var/run/docker.sock",
'mode': 'ro'
},
})
volumes.update(
{"/var/run/docker.sock": {"bind": "/var/run/docker.sock", "mode": "ro"}}
)
# Host D-Bus system
if self.addon.host_dbus:
volumes.update({
"/var/run/dbus": {
'bind': "/var/run/dbus",
'mode': 'rw'
}
})
volumes.update({"/var/run/dbus": {"bind": "/var/run/dbus", "mode": "rw"}})
# ALSA configuration
if self.addon.with_audio:
volumes.update({
volumes.update(
{
str(self.addon.path_extern_asound): {
'bind': "/etc/asound.conf",
'mode': 'ro'
"bind": "/etc/asound.conf",
"mode": "ro",
}
})
}
)
return volumes
def _run(self):
def _run(self) -> None:
"""Run Docker image.
Need run inside executor.
"""
if self._is_running():
return True
return
# Security check
if not self.addon.protected:
_LOGGER.warning("%s run with disabled protected mode!",
self.addon.name)
_LOGGER.warning("%s run with disabled protected mode!", self.addon.name)
# cleanup
# Cleanup
with suppress(DockerAPIError):
self._stop()
ret = self.sys_docker.run(
# Create & Run container
docker_container = self.sys_docker.run(
self.image,
name=self.name,
hostname=self.hostname,
@@ -318,25 +342,23 @@ class DockerAddon(DockerInterface):
security_opt=self.security_opt,
environment=self.environment,
volumes=self.volumes,
tmpfs=self.tmpfs)
tmpfs=self.tmpfs,
)
if ret:
_LOGGER.info("Start Docker add-on %s with version %s", self.image,
self.version)
_LOGGER.info("Start Docker add-on %s with version %s", self.image, self.version)
self._meta = docker_container.attrs
return ret
def _install(self, tag, image=None):
def _install(self, tag: str, image: Optional[str] = None) -> None:
"""Pull Docker image or build it.
Need run inside executor.
"""
if self.addon.need_build:
return self._build(tag)
self._build(tag)
return super()._install(tag, image)
super()._install(tag, image)
def _build(self, tag):
def _build(self, tag: str) -> None:
"""Build a Docker container.
Need run inside executor.
@@ -346,27 +368,27 @@ class DockerAddon(DockerInterface):
_LOGGER.info("Start build %s:%s", self.image, tag)
try:
image, log = self.sys_docker.images.build(
use_config_proxy=False, **build_env.get_docker_args(tag))
use_config_proxy=False, **build_env.get_docker_args(tag)
)
_LOGGER.debug("Build %s:%s done: %s", self.image, tag, log)
image.tag(self.image, tag='latest')
image.tag(self.image, tag="latest")
# Update meta data
self._meta = image.attrs
except docker.errors.DockerException as err:
_LOGGER.error("Can't build %s:%s: %s", self.image, tag, err)
return False
raise DockerAPIError() from None
_LOGGER.info("Build %s:%s done", self.image, tag)
return True
@process_lock
def export_image(self, path):
def export_image(self, tar_file: Path) -> Awaitable[None]:
"""Export current images into a tar file."""
return self.sys_run_in_executor(self._export_image, path)
return self.sys_run_in_executor(self._export_image, tar_file)
def _export_image(self, tar_file):
def _export_image(self, tar_file: Path) -> None:
"""Export current images into a tar file.
Need run inside executor.
@@ -375,7 +397,7 @@ class DockerAddon(DockerInterface):
image = self.sys_docker.api.get_image(self.image)
except docker.errors.DockerException as err:
_LOGGER.error("Can't fetch image %s: %s", self.image, err)
return False
raise DockerAPIError() from None
_LOGGER.info("Export image %s to %s", self.image, tar_file)
try:
@@ -384,17 +406,16 @@ class DockerAddon(DockerInterface):
write_tar.write(chunk)
except (OSError, requests.exceptions.ReadTimeout) as err:
_LOGGER.error("Can't write tar file %s: %s", tar_file, err)
return False
raise DockerAPIError() from None
_LOGGER.info("Export image %s done", self.image)
return True
@process_lock
def import_image(self, path, tag):
def import_image(self, tar_file: Path, tag: str) -> Awaitable[None]:
"""Import a tar file as image."""
return self.sys_run_in_executor(self._import_image, path, tag)
return self.sys_run_in_executor(self._import_image, tar_file, tag)
def _import_image(self, tar_file, tag):
def _import_image(self, tar_file: Path, tag: str) -> None:
"""Import a tar file as image.
Need run inside executor.
@@ -403,37 +424,38 @@ class DockerAddon(DockerInterface):
with tar_file.open("rb") as read_tar:
self.sys_docker.api.load_image(read_tar, quiet=True)
image = self.sys_docker.images.get(self.image)
image.tag(self.image, tag=tag)
docker_image = self.sys_docker.images.get(self.image)
docker_image.tag(self.image, tag=tag)
except (docker.errors.DockerException, OSError) as err:
_LOGGER.error("Can't import image %s: %s", self.image, err)
return False
raise DockerAPIError() from None
_LOGGER.info("Import image %s and tag %s", tar_file, tag)
self._meta = image.attrs
self._meta = docker_image.attrs
with suppress(DockerAPIError):
self._cleanup()
return True
@process_lock
def write_stdin(self, data):
def write_stdin(self, data: bytes) -> Awaitable[None]:
"""Write to add-on stdin."""
return self.sys_run_in_executor(self._write_stdin, data)
def _write_stdin(self, data):
def _write_stdin(self, data: bytes) -> None:
"""Write to add-on stdin.
Need run inside executor.
"""
if not self._is_running():
return False
raise DockerAPIError() from None
try:
# Load needed docker objects
container = self.sys_docker.containers.get(self.name)
socket = container.attach_socket(params={'stdin': 1, 'stream': 1})
socket = container.attach_socket(params={"stdin": 1, "stream": 1})
except docker.errors.DockerException as err:
_LOGGER.error("Can't attach to %s stdin: %s", self.name, err)
return False
raise DockerAPIError() from None
try:
# Write to stdin
@@ -442,6 +464,4 @@ class DockerAddon(DockerInterface):
socket.close()
except OSError as err:
_LOGGER.error("Can't write to %s stdin: %s", self.name, err)
return False
return True
raise DockerAPIError() from None
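The volumes property above builds the mapping format the Docker SDK for Python expects: a host path keyed to a {"bind": ..., "mode": ...} dictionary. A minimal illustrative sketch of that pattern, with invented paths rather than the real add-on configuration:

# Illustrative sketch -- paths are placeholders, not values from the diff.
from typing import Dict


def build_volumes(data_path: str, config_path: str, config_mode: str = "rw") -> Dict[str, Dict[str, str]]:
    """Build a Docker SDK style volume mapping, as DockerAddon.volumes does."""
    volumes = {data_path: {"bind": "/data", "mode": "rw"}}
    # Optional mappings are merged the same way the add-on code calls volumes.update().
    volumes.update({config_path: {"bind": "/config", "mode": config_mode}})
    return volumes


# Passed to the SDK as containers.run(..., volumes=build_volumes("/mnt/data/addon_x", "/mnt/data/homeassistant"))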


@@ -17,7 +17,7 @@ class DockerHassOSCli(DockerInterface, CoreSysAttributes):
"""Return name of HassOS CLI image."""
return f"homeassistant/{self.sys_arch.supervisor}-hassio-cli"
def _stop(self):
def _stop(self, remove_container=True):
"""Don't need stop."""
return True
@@ -33,5 +33,6 @@ class DockerHassOSCli(DockerInterface, CoreSysAttributes):
else:
self._meta = image.attrs
_LOGGER.info("Found HassOS CLI %s with version %s", self.image,
self.version)
_LOGGER.info(
"Found HassOS CLI %s with version %s", self.image, self.version
)


@@ -1,14 +1,18 @@
"""Init file for Hass.io Docker object."""
from contextlib import suppress
from ipaddress import IPv4Address
import logging
from typing import Awaitable
import docker
from .interface import DockerInterface
from ..const import ENV_TOKEN, ENV_TIME, LABEL_MACHINE
from ..const import ENV_TIME, ENV_TOKEN, LABEL_MACHINE
from ..exceptions import DockerAPIError
from .interface import CommandReturn, DockerInterface
_LOGGER = logging.getLogger(__name__)
HASS_DOCKER_NAME = 'homeassistant'
HASS_DOCKER_NAME = "homeassistant"
class DockerHomeAssistant(DockerInterface):
@@ -17,8 +21,8 @@ class DockerHomeAssistant(DockerInterface):
@property
def machine(self):
"""Return machine of Home Assistant Docker image."""
if self._meta and LABEL_MACHINE in self._meta['Config']['Labels']:
return self._meta['Config']['Labels'][LABEL_MACHINE]
if self._meta and LABEL_MACHINE in self._meta["Config"]["Labels"]:
return self._meta["Config"]["Labels"][LABEL_MACHINE]
return None
@property
@@ -39,18 +43,25 @@ class DockerHomeAssistant(DockerInterface):
devices.append(f"{device}:{device}:rwm")
return devices or None
def _run(self):
@property
def ip_address(self) -> IPv4Address:
"""Return IP address of this container."""
return self.sys_docker.network.gateway
def _run(self) -> None:
"""Run Docker image.
Need run inside executor.
"""
if self._is_running():
return False
return
# cleanup
# Cleanup
with suppress(DockerAPIError):
self._stop()
ret = self.sys_docker.run(
# Create & Run container
docker_container = self.sys_docker.run(
self.image,
name=self.name,
hostname=self.name,
@@ -58,29 +69,29 @@ class DockerHomeAssistant(DockerInterface):
privileged=True,
init=True,
devices=self.devices,
network_mode='host',
network_mode="host",
environment={
'HASSIO': self.sys_docker.network.supervisor,
"HASSIO": self.sys_docker.network.supervisor,
ENV_TIME: self.sys_timezone,
ENV_TOKEN: self.sys_homeassistant.hassio_token,
},
volumes={
str(self.sys_config.path_extern_homeassistant):
{'bind': '/config', 'mode': 'rw'},
str(self.sys_config.path_extern_ssl):
{'bind': '/ssl', 'mode': 'ro'},
str(self.sys_config.path_extern_share):
{'bind': '/share', 'mode': 'rw'},
}
str(self.sys_config.path_extern_homeassistant): {
"bind": "/config",
"mode": "rw",
},
str(self.sys_config.path_extern_ssl): {"bind": "/ssl", "mode": "ro"},
str(self.sys_config.path_extern_share): {
"bind": "/share",
"mode": "rw",
},
},
)
if ret:
_LOGGER.info("Start homeassistant %s with version %s",
self.image, self.version)
_LOGGER.info("Start homeassistant %s with version %s", self.image, self.version)
self._meta = docker_container.attrs
return ret
def _execute_command(self, command):
def _execute_command(self, command: str) -> CommandReturn:
"""Create a temporary container and run command.
Need run inside executor.
@@ -94,31 +105,37 @@ class DockerHomeAssistant(DockerInterface):
detach=True,
stdout=True,
stderr=True,
environment={
ENV_TIME: self.sys_timezone,
},
environment={ENV_TIME: self.sys_timezone},
volumes={
str(self.sys_config.path_extern_homeassistant):
{'bind': '/config', 'mode': 'rw'},
str(self.sys_config.path_extern_ssl):
{'bind': '/ssl', 'mode': 'ro'},
str(self.sys_config.path_extern_share):
{'bind': '/share', 'mode': 'ro'},
}
str(self.sys_config.path_extern_homeassistant): {
"bind": "/config",
"mode": "rw",
},
str(self.sys_config.path_extern_ssl): {"bind": "/ssl", "mode": "ro"},
str(self.sys_config.path_extern_share): {
"bind": "/share",
"mode": "ro",
},
},
)
def is_initialize(self):
def is_initialize(self) -> Awaitable[bool]:
"""Return True if Docker container exists."""
return self.sys_run_in_executor(self._is_initialize)
def _is_initialize(self):
def _is_initialize(self) -> bool:
"""Return True if docker container exists.
Need run inside executor.
"""
try:
self.sys_docker.containers.get(self.name)
docker_container = self.sys_docker.containers.get(self.name)
docker_image = self.sys_docker.images.get(self.image)
except docker.errors.DockerException:
return False
# we run on an old image, stop and start it
if docker_container.image.id != docker_image.id:
return False
return True
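The rewritten _is_initialize compares the image ID the existing container was created from with the ID of the locally tagged image, so a container built from an outdated image is reported as not initialized. A hedged sketch of that check with the Docker SDK (container and image names are placeholders):

# Illustrative sketch -- container and image names are placeholders.
import docker


def runs_current_image(name: str, image: str) -> bool:
    """Return True if container `name` exists and was created from the local `image`."""
    client = docker.from_env()
    try:
        container = client.containers.get(name)
        current = client.images.get(image)
    except docker.errors.DockerException:
        return False
    # A differing ID means the container still uses an older image build.
    return container.image.id == current.id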


@@ -2,13 +2,16 @@
import asyncio
from contextlib import suppress
import logging
from typing import Any, Dict, Optional, Awaitable
import docker
from .stats import DockerStats
from ..const import LABEL_VERSION, LABEL_ARCH
from ..coresys import CoreSysAttributes
from ..const import LABEL_ARCH, LABEL_VERSION
from ..coresys import CoreSys, CoreSysAttributes
from ..exceptions import DockerAPIError
from ..utils import process_lock
from .stats import DockerStats
from . import CommandReturn
_LOGGER = logging.getLogger(__name__)
@@ -16,60 +19,60 @@ _LOGGER = logging.getLogger(__name__)
class DockerInterface(CoreSysAttributes):
"""Docker Hass.io interface."""
def __init__(self, coresys):
def __init__(self, coresys: CoreSys):
"""Initialize Docker base wrapper."""
self.coresys = coresys
self._meta = None
self.lock = asyncio.Lock(loop=coresys.loop)
self.coresys: CoreSys = coresys
self._meta: Optional[Dict[str, Any]] = None
self.lock: asyncio.Lock = asyncio.Lock(loop=coresys.loop)
@property
def timeout(self):
def timeout(self) -> str:
"""Return timeout for Docker actions."""
return 30
@property
def name(self):
def name(self) -> Optional[str]:
"""Return name of Docker container."""
return None
@property
def meta_config(self):
def meta_config(self) -> Dict[str, Any]:
"""Return meta data of configuration for container/image."""
if not self._meta:
return {}
return self._meta.get('Config', {})
return self._meta.get("Config", {})
@property
def meta_labels(self):
def meta_labels(self) -> Dict[str, str]:
"""Return meta data of labels for container/image."""
return self.meta_config.get('Labels') or {}
return self.meta_config.get("Labels") or {}
@property
def image(self):
def image(self) -> Optional[str]:
"""Return name of Docker image."""
return self.meta_config.get('Image')
return self.meta_config.get("Image")
@property
def version(self):
def version(self) -> Optional[str]:
"""Return version of Docker image."""
return self.meta_labels.get(LABEL_VERSION)
@property
def arch(self):
def arch(self) -> Optional[str]:
"""Return arch of Docker image."""
return self.meta_labels.get(LABEL_ARCH)
@property
def in_progress(self):
def in_progress(self) -> bool:
"""Return True if a task is in progress."""
return self.lock.locked()
@process_lock
def install(self, tag, image=None):
def install(self, tag: str, image: Optional[str] = None):
"""Pull docker image."""
return self.sys_run_in_executor(self._install, tag, image)
def _install(self, tag, image=None):
def _install(self, tag: str, image: Optional[str] = None) -> None:
"""Pull Docker image.
Need run inside executor.
@@ -80,20 +83,19 @@ class DockerInterface(CoreSysAttributes):
_LOGGER.info("Pull image %s tag %s.", image, tag)
docker_image = self.sys_docker.images.pull(f"{image}:{tag}")
docker_image.tag(image, tag='latest')
self._meta = docker_image.attrs
_LOGGER.info("Tag image %s with version %s as latest", image, tag)
docker_image.tag(image, tag="latest")
except docker.errors.APIError as err:
_LOGGER.error("Can't install %s:%s -> %s.", image, tag, err)
return False
raise DockerAPIError() from None
else:
self._meta = docker_image.attrs
_LOGGER.info("Tag image %s with version %s as latest", image, tag)
return True
def exists(self):
def exists(self) -> Awaitable[bool]:
"""Return True if Docker image exists in local repository."""
return self.sys_run_in_executor(self._exists)
def _exists(self):
def _exists(self) -> bool:
"""Return True if Docker image exists in local repository.
Need run inside executor.
@@ -106,14 +108,14 @@ class DockerInterface(CoreSysAttributes):
return True
def is_running(self):
def is_running(self) -> Awaitable[bool]:
"""Return True if Docker is running.
Return a Future.
"""
return self.sys_run_in_executor(self._is_running)
def _is_running(self):
def _is_running(self) -> bool:
"""Return True if Docker is running.
Need run inside executor.
@@ -125,7 +127,7 @@ class DockerInterface(CoreSysAttributes):
return False
# container is not running
if docker_container.status != 'running':
if docker_container.status != "running":
return False
# we run on an old image, stop and start it
@@ -139,7 +141,7 @@ class DockerInterface(CoreSysAttributes):
"""Attach to running Docker container."""
return self.sys_run_in_executor(self._attach)
def _attach(self):
def _attach(self) -> None:
"""Attach to running docker container.
Need run inside executor.
@@ -147,22 +149,21 @@ class DockerInterface(CoreSysAttributes):
try:
if self.image:
self._meta = self.sys_docker.images.get(self.image).attrs
else:
self._meta = self.sys_docker.containers.get(self.name).attrs
except docker.errors.DockerException:
return False
pass
_LOGGER.info("Attach to image %s with version %s", self.image,
self.version)
return True
# Successful?
if not self._meta:
raise DockerAPIError() from None
_LOGGER.info("Attach to %s with version %s", self.image, self.version)
@process_lock
def run(self):
def run(self) -> Awaitable[None]:
"""Run Docker image."""
return self.sys_run_in_executor(self._run)
def _run(self):
def _run(self) -> None:
"""Run Docker image.
Need run inside executor.
@@ -170,96 +171,117 @@ class DockerInterface(CoreSysAttributes):
raise NotImplementedError()
@process_lock
def stop(self):
def stop(self, remove_container=True) -> Awaitable[None]:
"""Stop/remove Docker container."""
return self.sys_run_in_executor(self._stop)
return self.sys_run_in_executor(self._stop, remove_container)
def _stop(self):
"""Stop/remove and remove docker container.
def _stop(self, remove_container=True) -> None:
"""Stop/remove Docker container.
Need run inside executor.
"""
try:
docker_container = self.sys_docker.containers.get(self.name)
except docker.errors.DockerException:
return False
raise DockerAPIError() from None
if docker_container.status == 'running':
_LOGGER.info("Stop %s Docker application", self.image)
if docker_container.status == "running":
_LOGGER.info("Stop %s application", self.name)
with suppress(docker.errors.DockerException):
docker_container.stop(timeout=self.timeout)
if remove_container:
with suppress(docker.errors.DockerException):
_LOGGER.info("Clean %s Docker application", self.image)
_LOGGER.info("Clean %s application", self.name)
docker_container.remove(force=True)
return True
@process_lock
def start(self) -> Awaitable[None]:
"""Start Docker container."""
return self.sys_run_in_executor(self._start)
def _start(self) -> None:
"""Start docker container.
Need run inside executor.
"""
try:
docker_container = self.sys_docker.containers.get(self.name)
except docker.errors.DockerException:
raise DockerAPIError() from None
_LOGGER.info("Start %s", self.image)
try:
docker_container.start()
except docker.errors.DockerException as err:
_LOGGER.error("Can't start %s: %s", self.image, err)
raise DockerAPIError() from None
@process_lock
def remove(self):
def remove(self) -> Awaitable[None]:
"""Remove Docker images."""
return self.sys_run_in_executor(self._remove)
def _remove(self):
def _remove(self) -> None:
"""remove docker images.
Need run inside executor.
"""
# Cleanup container
with suppress(DockerAPIError):
self._stop()
_LOGGER.info("Remove Docker %s with latest and %s", self.image,
self.version)
_LOGGER.info("Remove image %s with latest and %s", self.image, self.version)
try:
with suppress(docker.errors.ImageNotFound):
self.sys_docker.images.remove(
image=f"{self.image}:latest", force=True)
self.sys_docker.images.remove(image=f"{self.image}:latest", force=True)
with suppress(docker.errors.ImageNotFound):
self.sys_docker.images.remove(
image=f"{self.image}:{self.version}", force=True)
image=f"{self.image}:{self.version}", force=True
)
except docker.errors.DockerException as err:
_LOGGER.warning("Can't remove image %s: %s", self.image, err)
return False
raise DockerAPIError() from None
self._meta = None
return True
@process_lock
def update(self, tag, image=None):
def update(self, tag: str, image: Optional[str] = None) -> Awaitable[None]:
"""Update a Docker image."""
return self.sys_run_in_executor(self._update, tag, image)
def _update(self, tag, image=None):
def _update(self, tag: str, image: Optional[str] = None) -> None:
"""Update a docker image.
Need run inside executor.
"""
image = image or self.image
_LOGGER.info("Update Docker %s:%s to %s:%s", self.image, self.version,
image, tag)
_LOGGER.info(
"Update image %s:%s to %s:%s", self.image, self.version, image, tag
)
# Update docker image
if not self._install(tag, image):
return False
self._install(tag, image)
# Stop container & cleanup
with suppress(DockerAPIError):
try:
self._stop()
finally:
self._cleanup()
return True
def logs(self):
def logs(self) -> Awaitable[bytes]:
"""Return Docker logs of container.
Return a Future.
"""
return self.sys_run_in_executor(self._logs)
def _logs(self):
def _logs(self) -> bytes:
"""Return Docker logs of container.
Need run inside executor.
@@ -275,11 +297,11 @@ class DockerInterface(CoreSysAttributes):
_LOGGER.warning("Can't grep logs from %s: %s", self.image, err)
@process_lock
def cleanup(self):
def cleanup(self) -> Awaitable[None]:
"""Check if old version exists and cleanup."""
return self.sys_run_in_executor(self._cleanup)
def _cleanup(self):
def _cleanup(self) -> None:
"""Check if old version exists and cleanup.
Need run inside executor.
@@ -288,35 +310,55 @@ class DockerInterface(CoreSysAttributes):
latest = self.sys_docker.images.get(self.image)
except docker.errors.DockerException:
_LOGGER.warning("Can't find %s for cleanup", self.image)
return False
raise DockerAPIError() from None
for image in self.sys_docker.images.list(name=self.image):
if latest.id == image.id:
continue
with suppress(docker.errors.DockerException):
_LOGGER.info("Cleanup Docker images: %s", image.tags)
_LOGGER.info("Cleanup images: %s", image.tags)
self.sys_docker.images.remove(image.id, force=True)
return True
@process_lock
def restart(self) -> Awaitable[None]:
"""Restart docker container."""
return self.sys_loop.run_in_executor(None, self._restart)
def _restart(self) -> None:
"""Restart docker container.
Need run inside executor.
"""
try:
container = self.sys_docker.containers.get(self.name)
except docker.errors.DockerException:
raise DockerAPIError() from None
_LOGGER.info("Restart %s", self.image)
try:
container.restart(timeout=self.timeout)
except docker.errors.DockerException as err:
_LOGGER.warning("Can't restart %s: %s", self.image, err)
raise DockerAPIError() from None
@process_lock
def execute_command(self, command):
def execute_command(self, command: str) -> Awaitable[CommandReturn]:
"""Create a temporary container and run command."""
return self.sys_run_in_executor(self._execute_command, command)
def _execute_command(self, command):
def _execute_command(self, command: str) -> CommandReturn:
"""Create a temporary container and run command.
Need run inside executor.
"""
raise NotImplementedError()
def stats(self):
def stats(self) -> Awaitable[DockerStats]:
"""Read and return stats from container."""
return self.sys_run_in_executor(self._stats)
def _stats(self):
def _stats(self) -> DockerStats:
"""Create a temporary container and run command.
Need run inside executor.
@@ -324,11 +366,38 @@ class DockerInterface(CoreSysAttributes):
try:
docker_container = self.sys_docker.containers.get(self.name)
except docker.errors.DockerException:
return None
raise DockerAPIError() from None
try:
stats = docker_container.stats(stream=False)
return DockerStats(stats)
except docker.errors.DockerException as err:
_LOGGER.error("Can't read stats from %s: %s", self.name, err)
return None
raise DockerAPIError() from None
def is_fails(self) -> Awaitable[bool]:
"""Return True if Docker is failing state.
Return a Future.
"""
return self.sys_run_in_executor(self._is_fails)
def _is_fails(self) -> bool:
"""Return True if Docker is failing state.
Need run inside executor.
"""
try:
docker_container = self.sys_docker.containers.get(self.name)
except docker.errors.DockerException:
return False
# container is not running
if docker_container.status != "exited":
return False
# Check return value
if int(docker_container.attrs["State"]["ExitCode"]) != 0:
return True
return False
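The new _is_fails helper only reports a failure when the container has exited with a non-zero exit code. A small stand-alone sketch of the same inspection (the container name is a placeholder):

# Illustrative sketch -- not part of the diff above.
import docker


def has_failed(name: str) -> bool:
    """Return True if the named container exited with a non-zero exit code."""
    client = docker.from_env()
    try:
        container = client.containers.get(name)
    except docker.errors.DockerException:
        return False
    if container.status != "exited":
        return False
    return int(container.attrs["State"]["ExitCode"]) != 0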


@@ -1,9 +1,12 @@
"""Internal network manager for Hass.io."""
from ipaddress import IPv4Address
import logging
from typing import List, Optional
import docker
from ..const import DOCKER_NETWORK_MASK, DOCKER_NETWORK, DOCKER_NETWORK_RANGE
from ..const import DOCKER_NETWORK, DOCKER_NETWORK_MASK, DOCKER_NETWORK_RANGE
from ..exceptions import DockerAPIError
_LOGGER = logging.getLogger(__name__)
@@ -14,32 +17,32 @@ class DockerNetwork:
This class is not AsyncIO safe!
"""
def __init__(self, dock):
def __init__(self, docker_client: docker.DockerClient):
"""Initialize internal Hass.io network."""
self.docker = dock
self.network = self._get_network()
self.docker: docker.DockerClient = docker_client
self.network: docker.models.networks.Network = self._get_network()
@property
def name(self):
def name(self) -> str:
"""Return name of network."""
return DOCKER_NETWORK
@property
def containers(self):
def containers(self) -> List[docker.models.containers.Container]:
"""Return of connected containers from network."""
return self.network.containers
@property
def gateway(self):
def gateway(self) -> IPv4Address:
"""Return gateway of the network."""
return DOCKER_NETWORK_MASK[1]
@property
def supervisor(self):
def supervisor(self) -> IPv4Address:
"""Return supervisor of the network."""
return DOCKER_NETWORK_MASK[2]
def _get_network(self):
def _get_network(self) -> docker.models.networks.Network:
"""Get HassIO network."""
try:
return self.docker.networks.get(DOCKER_NETWORK)
@@ -49,18 +52,25 @@ class DockerNetwork:
ipam_pool = docker.types.IPAMPool(
subnet=str(DOCKER_NETWORK_MASK),
gateway=str(self.gateway),
iprange=str(DOCKER_NETWORK_RANGE)
iprange=str(DOCKER_NETWORK_RANGE),
)
ipam_config = docker.types.IPAMConfig(pool_configs=[ipam_pool])
return self.docker.networks.create(
DOCKER_NETWORK, driver='bridge', ipam=ipam_config,
enable_ipv6=False, options={
"com.docker.network.bridge.name": DOCKER_NETWORK,
})
DOCKER_NETWORK,
driver="bridge",
ipam=ipam_config,
enable_ipv6=False,
options={"com.docker.network.bridge.name": DOCKER_NETWORK},
)
def attach_container(self, container, alias=None, ipv4=None):
def attach_container(
self,
container: docker.models.containers.Container,
alias: Optional[List[str]] = None,
ipv4: Optional[IPv4Address] = None,
) -> None:
"""Attach container to Hass.io network.
Need run inside executor.
@@ -71,23 +81,24 @@ class DockerNetwork:
self.network.connect(container, aliases=alias, ipv4_address=ipv4)
except docker.errors.APIError as err:
_LOGGER.error("Can't link container to hassio-net: %s", err)
return False
raise DockerAPIError() from None
self.network.reload()
return True
def detach_default_bridge(self, container):
def detach_default_bridge(
self, container: docker.models.containers.Container
) -> None:
"""Detach default Docker bridge.
Need run inside executor.
"""
try:
default_network = self.docker.networks.get('bridge')
default_network = self.docker.networks.get("bridge")
default_network.disconnect(container)
except docker.errors.NotFound:
return
except docker.errors.APIError as err:
_LOGGER.warning(
"Can't disconnect container from default: %s", err)
_LOGGER.warning("Can't disconnect container from default: %s", err)
raise DockerAPIError() from None
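DockerNetwork derives its well-known addresses by indexing into DOCKER_NETWORK_MASK, which is an ipaddress network object defined in hassio.const. A minimal sketch of that idea; the 172.30.32.0/23 subnet below is an assumed example value, not taken from the diff:

# Illustrative sketch -- the subnet is an assumption for the example.
from ipaddress import IPv4Address, ip_network

HASSIO_NETWORK = ip_network("172.30.32.0/23")

gateway: IPv4Address = HASSIO_NETWORK[1]     # first host address, used as gateway
supervisor: IPv4Address = HASSIO_NETWORK[2]  # second host address, reserved for the supervisor

print(gateway, supervisor)  # -> 172.30.32.1 172.30.32.2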


@@ -1,11 +1,13 @@
"""Init file for Hass.io Docker object."""
from ipaddress import IPv4Address
import logging
import os
import docker
from .interface import DockerInterface
from ..coresys import CoreSysAttributes
from ..exceptions import DockerAPIError
from .interface import DockerInterface
_LOGGER = logging.getLogger(__name__)
@@ -14,29 +16,36 @@ class DockerSupervisor(DockerInterface, CoreSysAttributes):
"""Docker Hass.io wrapper for Supervisor."""
@property
def name(self):
def name(self) -> str:
"""Return name of Docker container."""
return os.environ['SUPERVISOR_NAME']
return os.environ["SUPERVISOR_NAME"]
def _attach(self):
@property
def ip_address(self) -> IPv4Address:
"""Return IP address of this container."""
return self.sys_docker.network.supervisor
def _attach(self) -> None:
"""Attach to running docker container.
Need run inside executor.
"""
try:
container = self.sys_docker.containers.get(self.name)
docker_container = self.sys_docker.containers.get(self.name)
except docker.errors.DockerException:
return False
raise DockerAPIError() from None
self._meta = container.attrs
_LOGGER.info("Attach to Supervisor %s with version %s",
self.image, self.version)
self._meta = docker_container.attrs
_LOGGER.info(
"Attach to Supervisor %s with version %s", self.image, self.version
)
# If already attach
if container in self.sys_docker.network.containers:
return True
if docker_container in self.sys_docker.network.containers:
return
# Attach to network
return self.sys_docker.network.attach_container(
container, alias=['hassio'],
ipv4=self.sys_docker.network.supervisor)
_LOGGER.info("Connect Supervisor to Hass.io Network")
self.sys_docker.network.attach_container(
docker_container, alias=["hassio"], ipv4=self.sys_docker.network.supervisor
)


@@ -28,6 +28,17 @@ class HomeAssistantAuthError(HomeAssistantAPIError):
"""Home Assistant Auth API exception."""
# Supervisor
class SupervisorError(HassioError):
"""Supervisor error."""
class SupervisorUpdateError(SupervisorError):
"""Supervisor update error."""
# HassOS
@@ -43,6 +54,17 @@ class HassOSNotSupportedError(HassioNotSupportedError):
"""Function not supported by HassOS."""
# Addons
class AddonsError(HassioError):
"""Addons exception."""
class AddonsNotSupportedError(HassioNotSupportedError):
"""Addons don't support a function."""
# Arch
@@ -144,3 +166,10 @@ class AppArmorInvalidError(AppArmorError):
class JsonFileError(HassioError):
"""Invalid json file."""
# docker/api
class DockerAPIError(HassioError):
"""Docker API error."""


@@ -1,15 +1,22 @@
"""HassOS support on supervisor."""
import asyncio
from contextlib import suppress
import logging
from pathlib import Path
from typing import Awaitable, Optional
import aiohttp
from cpe import CPE
from .coresys import CoreSysAttributes
from .const import URL_HASSOS_OTA
from .coresys import CoreSysAttributes, CoreSys
from .docker.hassos_cli import DockerHassOSCli
from .exceptions import HassOSNotSupportedError, HassOSUpdateError, DBusError
from .exceptions import (
DBusError,
HassOSNotSupportedError,
HassOSUpdateError,
DockerAPIError,
)
_LOGGER = logging.getLogger(__name__)
@@ -17,61 +24,61 @@ _LOGGER = logging.getLogger(__name__)
class HassOS(CoreSysAttributes):
"""HassOS interface inside HassIO."""
def __init__(self, coresys):
def __init__(self, coresys: CoreSys):
"""Initialize HassOS handler."""
self.coresys = coresys
self.instance = DockerHassOSCli(coresys)
self._available = False
self._version = None
self._board = None
self.coresys: CoreSys = coresys
self.instance: DockerHassOSCli = DockerHassOSCli(coresys)
self._available: bool = False
self._version: Optional[str] = None
self._board: Optional[str] = None
@property
def available(self):
def available(self) -> bool:
"""Return True, if HassOS on host."""
return self._available
@property
def version(self):
def version(self) -> Optional[str]:
"""Return version of HassOS."""
return self._version
@property
def version_cli(self):
def version_cli(self) -> Optional[str]:
"""Return version of HassOS cli."""
return self.instance.version
@property
def version_latest(self):
def version_latest(self) -> str:
"""Return version of HassOS."""
return self.sys_updater.version_hassos
@property
def version_cli_latest(self):
def version_cli_latest(self) -> str:
"""Return version of HassOS."""
return self.sys_updater.version_hassos_cli
@property
def need_update(self):
def need_update(self) -> bool:
"""Return true if a HassOS update is available."""
return self.version != self.version_latest
@property
def need_cli_update(self):
def need_cli_update(self) -> bool:
"""Return true if a HassOS cli update is available."""
return self.version_cli != self.version_cli_latest
@property
def board(self):
def board(self) -> Optional[str]:
"""Return board name."""
return self._board
def _check_host(self):
def _check_host(self) -> None:
"""Check if HassOS is available."""
if not self.available:
_LOGGER.error("No HassOS available")
raise HassOSNotSupportedError()
async def _download_raucb(self, version):
async def _download_raucb(self, version: str) -> None:
"""Download rauc bundle (OTA) from github."""
url = URL_HASSOS_OTA.format(version=version, board=self.board)
raucb = Path(self.sys_config.path_tmp, f"hassos-{version}.raucb")
@@ -83,9 +90,9 @@ class HassOS(CoreSysAttributes):
raise HassOSUpdateError()
# Download RAUCB file
with raucb.open('wb') as ota_file:
with raucb.open("wb") as ota_file:
while True:
chunk = await request.content.read(1048576)
chunk = await request.content.read(1_048_576)
if not chunk:
break
ota_file.write(chunk)
@@ -101,7 +108,7 @@ class HassOS(CoreSysAttributes):
raise HassOSUpdateError()
async def load(self):
async def load(self) -> None:
"""Load HassOS data."""
try:
# Check needed host functions
@@ -111,7 +118,7 @@ class HassOS(CoreSysAttributes):
assert self.sys_host.info.cpe is not None
cpe = CPE(self.sys_host.info.cpe)
assert cpe.get_product()[0] == 'hassos'
assert cpe.get_product()[0] == "hassos"
except (AssertionError, NotImplementedError):
_LOGGER.debug("Found no HassOS")
return
@@ -122,9 +129,10 @@ class HassOS(CoreSysAttributes):
self._board = cpe.get_target_hardware()[0]
_LOGGER.info("Detect HassOS %s on host system", self.version)
with suppress(DockerAPIError):
await self.instance.attach()
def config_sync(self):
def config_sync(self) -> Awaitable[None]:
"""Trigger a host config reload from usb.
Return a coroutine.
@@ -132,9 +140,9 @@ class HassOS(CoreSysAttributes):
self._check_host()
_LOGGER.info("Syncing configuration from USB with HassOS.")
return self.sys_host.services.restart('hassos-config.service')
return self.sys_host.services.restart("hassos-config.service")
async def update(self, version=None):
async def update(self, version: Optional[str] = None) -> None:
"""Update HassOS system."""
version = version or self.version_latest
@@ -167,20 +175,19 @@ class HassOS(CoreSysAttributes):
# Update fails
rauc_status = await self.sys_dbus.get_properties()
_LOGGER.error(
"HassOS update fails with: %s", rauc_status.get('LastError'))
_LOGGER.error("HassOS update fails with: %s", rauc_status.get("LastError"))
raise HassOSUpdateError()
async def update_cli(self, version=None):
async def update_cli(self, version: Optional[str] = None) -> None:
"""Update local HassOS cli."""
version = version or self.version_cli_latest
if version == self.version_cli:
_LOGGER.warning("Version %s is already installed for CLI", version)
raise HassOSUpdateError()
if await self.instance.update(version):
return
try:
await self.instance.update(version)
except DockerAPIError:
_LOGGER.error("HassOS CLI update fails")
raise HassOSUpdateError()
raise HassOSUpdateError() from None
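The RAUC bundle download streams the OTA file to disk in 1 MiB chunks rather than buffering it in memory. A hedged sketch of the same streaming pattern with aiohttp (URL and target path are placeholders):

# Illustrative sketch -- URL/path are placeholders and error handling is trimmed.
from pathlib import Path

import aiohttp


async def download_ota(url: str, target: Path) -> None:
    """Stream a large OTA file to disk in 1 MiB chunks."""
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            with target.open("wb") as ota_file:
                while True:
                    chunk = await response.content.read(1_048_576)
                    if not chunk:
                        break
                    ota_file.write(chunk)


# To run: asyncio.run(download_ota("https://example.com/hassos.raucb", Path("/tmp/hassos.raucb")))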


@@ -2,26 +2,47 @@
import asyncio
from contextlib import asynccontextmanager, suppress
from datetime import datetime, timedelta
from ipaddress import IPv4Address
import logging
import os
import re
from pathlib import Path
import re
import secrets
import socket
import time
from typing import Any, AsyncContextManager, Awaitable, Dict, Optional
from uuid import UUID
import aiohttp
from aiohttp import hdrs
import attr
from .const import (FILE_HASSIO_HOMEASSISTANT, ATTR_IMAGE, ATTR_LAST_VERSION,
ATTR_UUID, ATTR_BOOT, ATTR_PASSWORD, ATTR_PORT, ATTR_SSL,
ATTR_WATCHDOG, ATTR_WAIT_BOOT, ATTR_REFRESH_TOKEN,
ATTR_ACCESS_TOKEN, HEADER_HA_ACCESS)
from .coresys import CoreSysAttributes
from .const import (
ATTR_ACCESS_TOKEN,
ATTR_BOOT,
ATTR_IMAGE,
ATTR_LAST_VERSION,
ATTR_PASSWORD,
ATTR_PORT,
ATTR_REFRESH_TOKEN,
ATTR_SSL,
ATTR_UUID,
ATTR_WAIT_BOOT,
ATTR_WATCHDOG,
FILE_HASSIO_HOMEASSISTANT,
HEADER_HA_ACCESS,
)
from .coresys import CoreSys, CoreSysAttributes
from .docker.homeassistant import DockerHomeAssistant
from .exceptions import (HomeAssistantUpdateError, HomeAssistantError,
HomeAssistantAPIError, HomeAssistantAuthError)
from .utils import convert_to_ascii, process_lock, create_token
from .docker.stats import DockerStats
from .exceptions import (
DockerAPIError,
HomeAssistantAPIError,
HomeAssistantAuthError,
HomeAssistantError,
HomeAssistantUpdateError,
)
from .utils import convert_to_ascii, process_lock
from .utils.json import JsonConfig
from .validate import SCHEMA_HASS_CONFIG
@@ -40,115 +61,117 @@ class ConfigResult:
class HomeAssistant(JsonConfig, CoreSysAttributes):
"""Home Assistant core object for handle it."""
def __init__(self, coresys):
def __init__(self, coresys: CoreSys):
"""Initialize Home Assistant object."""
super().__init__(FILE_HASSIO_HOMEASSISTANT, SCHEMA_HASS_CONFIG)
self.coresys = coresys
self.instance = DockerHomeAssistant(coresys)
self.lock = asyncio.Lock(loop=coresys.loop)
self._error_state = False
# We don't persist access tokens. Instead we fetch new ones when needed
self.access_token = None
self._access_token_expires = None
self.coresys: CoreSys = coresys
self.instance: DockerHomeAssistant = DockerHomeAssistant(coresys)
self.lock: asyncio.Lock = asyncio.Lock(loop=coresys.loop)
self._error_state: bool = False
async def load(self):
# We don't persist access tokens. Instead we fetch new ones when needed
self.access_token: Optional[str] = None
self._access_token_expires: Optional[datetime] = None
async def load(self) -> None:
"""Prepare Home Assistant object."""
if await self.instance.attach():
with suppress(DockerAPIError):
await self.instance.attach()
return
_LOGGER.info("No Home Assistant Docker image %s found.", self.image)
await self.install_landingpage()
@property
def machine(self):
def machine(self) -> str:
"""Return the system machines."""
return self.instance.machine
@property
def arch(self):
def arch(self) -> str:
"""Return arch of running Home Assistant."""
return self.instance.arch
@property
def error_state(self):
def error_state(self) -> bool:
"""Return True if system is in error."""
return self._error_state
@property
def api_ip(self):
def ip_address(self) -> IPv4Address:
"""Return IP of Home Assistant instance."""
return self.sys_docker.network.gateway
return self.instance.ip_address
@property
def api_port(self):
def api_port(self) -> int:
"""Return network port to Home Assistant instance."""
return self._data[ATTR_PORT]
@api_port.setter
def api_port(self, value):
def api_port(self, value: int) -> None:
"""Set network port for Home Assistant instance."""
self._data[ATTR_PORT] = value
@property
def api_password(self):
def api_password(self) -> str:
"""Return password for Home Assistant instance."""
return self._data.get(ATTR_PASSWORD)
@api_password.setter
def api_password(self, value):
def api_password(self, value: str):
"""Set password for Home Assistant instance."""
self._data[ATTR_PASSWORD] = value
@property
def api_ssl(self):
def api_ssl(self) -> bool:
"""Return if we need ssl to Home Assistant instance."""
return self._data[ATTR_SSL]
@api_ssl.setter
def api_ssl(self, value):
def api_ssl(self, value: bool):
"""Set SSL for Home Assistant instance."""
self._data[ATTR_SSL] = value
@property
def api_url(self):
def api_url(self) -> str:
"""Return API url to Home Assistant."""
return "{}://{}:{}".format('https' if self.api_ssl else 'http',
self.api_ip, self.api_port)
self.ip_address, self.api_port)
@property
def watchdog(self):
def watchdog(self) -> bool:
"""Return True if the watchdog should protect Home Assistant."""
return self._data[ATTR_WATCHDOG]
@watchdog.setter
def watchdog(self, value):
def watchdog(self, value: bool):
"""Return True if the watchdog should protect Home Assistant."""
self._data[ATTR_WATCHDOG] = value
@property
def wait_boot(self):
def wait_boot(self) -> int:
"""Return time to wait for Home Assistant startup."""
return self._data[ATTR_WAIT_BOOT]
@wait_boot.setter
def wait_boot(self, value):
def wait_boot(self, value: int):
"""Set time to wait for Home Assistant startup."""
self._data[ATTR_WAIT_BOOT] = value
@property
def version(self):
def version(self) -> str:
"""Return version of running Home Assistant."""
return self.instance.version
@property
def last_version(self):
def last_version(self) -> str:
"""Return last available version of Home Assistant."""
if self.is_custom_image:
return self._data.get(ATTR_LAST_VERSION)
return self.sys_updater.version_homeassistant
@last_version.setter
def last_version(self, value):
def last_version(self, value: str):
"""Set last available version of Home Assistant."""
if value:
self._data[ATTR_LAST_VERSION] = value
@@ -156,14 +179,14 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
self._data.pop(ATTR_LAST_VERSION, None)
@property
def image(self):
def image(self) -> str:
"""Return image name of the Home Assistant container."""
if self._data.get(ATTR_IMAGE):
return self._data[ATTR_IMAGE]
return os.environ['HOMEASSISTANT_REPOSITORY']
@image.setter
def image(self, value):
def image(self, value: str):
"""Set image name of Home Assistant container."""
if value:
self._data[ATTR_IMAGE] = value
@@ -171,53 +194,54 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
self._data.pop(ATTR_IMAGE, None)
@property
def is_custom_image(self):
def is_custom_image(self) -> bool:
"""Return True if a custom image is used."""
return all(
attr in self._data for attr in (ATTR_IMAGE, ATTR_LAST_VERSION))
@property
def boot(self):
def boot(self) -> bool:
"""Return True if Home Assistant boot is enabled."""
return self._data[ATTR_BOOT]
@boot.setter
def boot(self, value):
def boot(self, value: bool):
"""Set Home Assistant boot options."""
self._data[ATTR_BOOT] = value
@property
def uuid(self):
def uuid(self) -> UUID:
"""Return a UUID of this Home Assistant instance."""
return self._data[ATTR_UUID]
@property
def hassio_token(self):
def hassio_token(self) -> str:
"""Return an access token for the Hass.io API."""
return self._data.get(ATTR_ACCESS_TOKEN)
@property
def refresh_token(self):
def refresh_token(self) -> str:
"""Return the refresh token to authenticate with Home Assistant."""
return self._data.get(ATTR_REFRESH_TOKEN)
@refresh_token.setter
def refresh_token(self, value):
def refresh_token(self, value: str):
"""Set Home Assistant refresh_token."""
self._data[ATTR_REFRESH_TOKEN] = value
@process_lock
async def install_landingpage(self):
async def install_landingpage(self) -> None:
"""Install a landing page."""
_LOGGER.info("Setup HomeAssistant landingpage")
while True:
if await self.instance.install('landingpage'):
break
with suppress(DockerAPIError):
await self.instance.install('landingpage')
return
_LOGGER.warning("Fails install landingpage, retry after 30sec")
await asyncio.sleep(30)
@process_lock
async def install(self):
async def install(self) -> None:
"""Install a landing page."""
_LOGGER.info("Setup Home Assistant")
while True:
@@ -226,7 +250,9 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
await self.sys_updater.reload()
tag = self.last_version
if tag and await self.instance.install(tag):
if tag:
with suppress(DockerAPIError):
await self.instance.install(tag)
break
_LOGGER.warning("Error on install Home Assistant. Retry in 30sec")
await asyncio.sleep(30)
@@ -241,10 +267,11 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
except HomeAssistantError:
_LOGGER.error("Can't start Home Assistant!")
finally:
with suppress(DockerAPIError):
await self.instance.cleanup()
@process_lock
async def update(self, version=None):
async def update(self, version=None) -> None:
"""Update HomeAssistant version."""
version = version or self.last_version
rollback = self.version if not self.error_state else None
@@ -253,16 +280,18 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
if exists and version == self.instance.version:
_LOGGER.warning("Version %s is already installed", version)
return HomeAssistantUpdateError()
return
# process an update
async def _update(to_version):
"""Run Home Assistant update."""
try:
_LOGGER.info("Update Home Assistant to version %s", to_version)
if not await self.instance.update(to_version):
raise HomeAssistantUpdateError()
finally:
try:
await self.instance.update(to_version)
except DockerAPIError:
_LOGGER.warning("Update Home Assistant image fails")
raise HomeAssistantUpdateError() from None
if running:
await self._start()
_LOGGER.info("Successful run Home Assistant %s", to_version)
@@ -279,76 +308,103 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
else:
raise HomeAssistantUpdateError()
async def _start(self):
async def _start(self) -> None:
"""Start Home Assistant Docker & wait."""
if await self.instance.is_running():
_LOGGER.warning("Home Assistant is already running!")
return
# Create new API token
self._data[ATTR_ACCESS_TOKEN] = create_token()
self._data[ATTR_ACCESS_TOKEN] = secrets.token_hex(56)
self.save_data()
if not await self.instance.run():
raise HomeAssistantError()
try:
await self.instance.run()
except DockerAPIError:
raise HomeAssistantError() from None
await self._block_till_run()
@process_lock
def start(self):
"""Run Home Assistant docker.
async def start(self) -> None:
"""Run Home Assistant docker."""
try:
if await self.instance.is_running():
await self.instance.restart()
elif await self.instance.is_initialize():
await self.instance.start()
else:
await self._start()
return
Return a coroutine.
"""
return self._start()
await self._block_till_run()
except DockerAPIError:
raise HomeAssistantError() from None
@process_lock
def stop(self):
async def stop(self) -> None:
"""Stop Home Assistant Docker.
Return a coroutine.
"""
return self.instance.stop()
try:
return await self.instance.stop(remove_container=False)
except DockerAPIError:
raise HomeAssistantError() from None
@process_lock
async def restart(self):
async def restart(self) -> None:
"""Restart Home Assistant Docker."""
try:
await self.instance.restart()
except DockerAPIError:
raise HomeAssistantError() from None
await self._block_till_run()
@process_lock
async def rebuild(self) -> None:
"""Rebuild Home Assistant Docker container."""
with suppress(DockerAPIError):
await self.instance.stop()
await self._start()
def logs(self):
def logs(self) -> Awaitable[bytes]:
"""Get HomeAssistant docker logs.
Return a coroutine.
"""
return self.instance.logs()
def stats(self):
async def stats(self) -> DockerStats:
"""Return stats of Home Assistant.
Return a coroutine.
"""
return self.instance.stats()
try:
return await self.instance.stats()
except DockerAPIError:
raise HomeAssistantError() from None
def is_running(self):
def is_running(self) -> Awaitable[bool]:
"""Return True if Docker container is running.
Return a coroutine.
"""
return self.instance.is_running()
def is_initialize(self):
"""Return True if a Docker container is exists.
def is_fails(self) -> Awaitable[bool]:
"""Return True if a Docker container is fails state.
Return a coroutine.
"""
return self.instance.is_initialize()
return self.instance.is_fails()
@property
def in_progress(self):
def in_progress(self) -> bool:
"""Return True if a task is in progress."""
return self.instance.in_progress or self.lock.locked()
async def check_config(self):
async def check_config(self) -> ConfigResult:
"""Run Home Assistant config check."""
result = await self.instance.execute_command(
"python3 -m homeassistant -c /config --script check_config")
@@ -367,7 +423,7 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
_LOGGER.info("Home Assistant config is valid")
return ConfigResult(True, log)
async def ensure_access_token(self):
async def ensure_access_token(self) -> None:
"""Ensures there is an access token."""
if self.access_token is not None and self._access_token_expires > datetime.utcnow():
return
@@ -392,12 +448,12 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
@asynccontextmanager
async def make_request(self,
method,
path,
json=None,
content_type=None,
data=None,
timeout=30):
method: str,
path: str,
json: Optional[Dict[str, Any]] = None,
content_type: Optional[str] = None,
data: Optional[bytes] = None,
timeout=30) -> AsyncContextManager[aiohttp.ClientResponse]:
"""Async context manager to make a request with right auth."""
url = f"{self.api_url}/{path}"
headers = {}
@@ -432,7 +488,7 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
raise HomeAssistantAPIError()
async def check_api_state(self):
async def check_api_state(self) -> bool:
"""Return True if Home Assistant up and running."""
with suppress(HomeAssistantAPIError):
async with self.make_request('get', 'api/') as resp:
@@ -443,7 +499,7 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
return False
async def _block_till_run(self):
async def _block_till_run(self) -> None:
"""Block until Home-Assistant is booting up or startup timeout."""
start_time = time.monotonic()
migration_progress = False
@@ -454,7 +510,7 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
"""Check if port is mapped."""
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
result = sock.connect_ex((str(self.api_ip), self.api_port))
result = sock.connect_ex((str(self.ip_address), self.api_port))
sock.close()
# Check if the port is available
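The startup wait loop above probes the Home Assistant port with connect_ex to decide whether the API is reachable yet. A small sketch of that probe (host and port are placeholders):

# Illustrative sketch -- host/port are placeholders.
import socket


def port_is_open(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(timeout)
    try:
        # connect_ex returns 0 on success instead of raising an exception.
        return sock.connect_ex((host, port)) == 0
    finally:
        sock.close()


# port_is_open("172.30.32.1", 8123)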

hassio/ingress.py (new file, 103 lines)

@@ -0,0 +1,103 @@
"""Fetch last versions from webserver."""
from datetime import timedelta
import logging
from typing import Dict, Optional
import secrets
from .addons.addon import Addon
from .const import ATTR_SESSION, FILE_HASSIO_INGRESS
from .coresys import CoreSys, CoreSysAttributes
from .utils.json import JsonConfig
from .utils.dt import utcnow, utc_from_timestamp
from .validate import SCHEMA_INGRESS_CONFIG
_LOGGER = logging.getLogger(__name__)
class Ingress(JsonConfig, CoreSysAttributes):
"""Fetch last versions from version.json."""
def __init__(self, coresys: CoreSys):
"""Initialize updater."""
super().__init__(FILE_HASSIO_INGRESS, SCHEMA_INGRESS_CONFIG)
self.coresys: CoreSys = coresys
self.tokens: Dict[str, str] = {}
def get(self, token: str) -> Optional[Addon]:
"""Return addon they have this ingress token."""
if token not in self.tokens:
self._update_token_list()
return self.sys_addons.get(self.tokens.get(token))
@property
def sessions(self) -> Dict[str, float]:
"""Return sessions."""
return self._data[ATTR_SESSION]
async def load(self) -> None:
"""Update internal data."""
self._update_token_list()
self._cleanup_sessions()
_LOGGER.info("Load %d ingress session", len(self.sessions))
async def reload(self) -> None:
"""Reload/Validate sessions."""
self._cleanup_sessions()
async def unload(self) -> None:
"""Shutdown sessions."""
self.save_data()
def _cleanup_sessions(self) -> None:
"""Remove not used sessions."""
now = utcnow()
sessions = {}
for session, valid in self.sessions.items():
valid_dt = utc_from_timestamp(valid)
if valid_dt < now:
continue
# Is valid
sessions[session] = valid
# Write back
self.sessions.clear()
self.sessions.update(sessions)
def _update_token_list(self) -> None:
"""Regenerate token <-> Add-on map."""
self.tokens.clear()
# Read all ingress token and build a map
for addon in self.sys_addons.list_installed:
if not addon.with_ingress:
continue
self.tokens[addon.ingress_token] = addon.slug
def create_session(self) -> str:
"""Create new session."""
session = secrets.token_hex(64)
valid = utcnow() + timedelta(minutes=15)
self.sessions[session] = valid.timestamp()
self.save_data()
return session
def validate_session(self, session: str) -> bool:
"""Return True if session valid and make it longer valid."""
if session not in self.sessions:
return False
valid_until = utc_from_timestamp(self.sessions[session])
# Is still valid?
if valid_until < utcnow():
return False
# Update time
valid_until = valid_until + timedelta(minutes=15)
self.sessions[session] = valid_until.timestamp()
return True
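The ingress session handling boils down to: issue a random 64-byte hex token, stamp it with a 15 minute expiry, and slide the expiry forward on every successful validation. A stand-alone sketch of that lifecycle without the Hass.io persistence layer:

# Illustrative sketch -- in-memory only, no JsonConfig persistence.
from datetime import datetime, timedelta, timezone
from typing import Dict
import secrets

SESSION_TTL = timedelta(minutes=15)
_sessions: Dict[str, datetime] = {}


def create_session() -> str:
    """Create a new ingress session token with a 15 minute lifetime."""
    session = secrets.token_hex(64)
    _sessions[session] = datetime.now(timezone.utc) + SESSION_TTL
    return session


def validate_session(session: str) -> bool:
    """Return True if the session is known and not expired; extend it if so."""
    valid_until = _sessions.get(session)
    if valid_until is None or valid_until < datetime.now(timezone.utc):
        return False
    _sessions[session] = valid_until + SESSION_TTL
    return True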


@@ -1,38 +1,38 @@
"""Handle internal services discovery."""
from .mqtt import MQTTService
from typing import Dict, List
from ..coresys import CoreSys, CoreSysAttributes
from .const import SERVICE_MQTT
from .data import ServicesData
from ..const import SERVICE_MQTT
from ..coresys import CoreSysAttributes
from .interface import ServiceInterface
from .modules.mqtt import MQTTService
AVAILABLE_SERVICES = {
SERVICE_MQTT: MQTTService
}
AVAILABLE_SERVICES = {SERVICE_MQTT: MQTTService}
class ServiceManager(CoreSysAttributes):
"""Handle internal services discovery."""
def __init__(self, coresys):
def __init__(self, coresys: CoreSys):
"""Initialize Services handler."""
self.coresys = coresys
self.data = ServicesData()
self.services_obj = {}
self.coresys: CoreSys = coresys
self.data: ServicesData = ServicesData()
self.services_obj: Dict[str, ServiceInterface] = {}
@property
def list_services(self):
def list_services(self) -> List[ServiceInterface]:
"""Return a list of services."""
return list(self.services_obj.values())
def get(self, slug):
def get(self, slug: str) -> ServiceInterface:
"""Return service object from slug."""
return self.services_obj.get(slug)
async def load(self):
async def load(self) -> None:
"""Load available services."""
for slug, service in AVAILABLE_SERVICES.items():
self.services_obj[slug] = service(self.coresys)
def reset(self):
def reset(self) -> None:
"""Reset available data."""
self.data.reset_data()
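ServiceManager instantiates each entry of AVAILABLE_SERVICES once during load() and afterwards serves the objects by slug. A reduced sketch of that registry pattern; DummyService is a placeholder, not the real MQTTService:

# Illustrative sketch -- DummyService stands in for real service implementations.
from typing import Dict, List, Optional


class DummyService:
    def __init__(self, coresys: object) -> None:
        self.coresys = coresys


AVAILABLE_SERVICES = {"mqtt": DummyService}


class ServiceRegistry:
    def __init__(self, coresys: object) -> None:
        self.coresys = coresys
        self.services: Dict[str, DummyService] = {}

    def load(self) -> None:
        """Instantiate every known service exactly once."""
        for slug, service_cls in AVAILABLE_SERVICES.items():
            self.services[slug] = service_cls(self.coresys)

    def get(self, slug: str) -> Optional[DummyService]:
        """Return the service for a slug, or None if unknown."""
        return self.services.get(slug)

    def list_services(self) -> List[DummyService]:
        return list(self.services.values())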

hassio/services/const.py (new file, 11 lines)

@@ -0,0 +1,11 @@
"""Service API static data."""
ATTR_ADDON = "addon"
ATTR_HOST = "host"
ATTR_PASSWORD = "password"
ATTR_PORT = "port"
ATTR_PROTOCOL = "protocol"
ATTR_SSL = "ssl"
ATTR_USERNAME = "username"
SERVICE_MQTT = "mqtt"


@@ -1,8 +1,10 @@
"""Handle service data for persistent supervisor reboot."""
from typing import Any, Dict
from .validate import SCHEMA_SERVICES_CONFIG
from ..const import FILE_HASSIO_SERVICES, SERVICE_MQTT
from ..const import FILE_HASSIO_SERVICES
from ..utils.json import JsonConfig
from .const import SERVICE_MQTT
from .validate import SCHEMA_SERVICES_CONFIG
class ServicesData(JsonConfig):
@@ -13,6 +15,6 @@ class ServicesData(JsonConfig):
super().__init__(FILE_HASSIO_SERVICES, SCHEMA_SERVICES_CONFIG)
@property
def mqtt(self):
def mqtt(self) -> Dict[str, Any]:
"""Return settings for MQTT service."""
return self._data[SERVICE_MQTT]


@@ -1,33 +1,37 @@
"""Interface for single service."""
from typing import Any, Dict, List, Optional
from ..coresys import CoreSysAttributes
import voluptuous as vol
from ..addons.addon import Addon
from ..const import PROVIDE_SERVICE
from ..coresys import CoreSys, CoreSysAttributes
class ServiceInterface(CoreSysAttributes):
"""Interface class for service integration."""
def __init__(self, coresys):
def __init__(self, coresys: CoreSys):
"""Initialize service interface."""
self.coresys = coresys
self.coresys: CoreSys = coresys
@property
def slug(self):
def slug(self) -> str:
"""Return slug of this service."""
return None
raise NotImplementedError()
@property
def _data(self):
def _data(self) -> Dict[str, Any]:
"""Return data of this service."""
return None
raise NotImplementedError()
@property
def schema(self):
def schema(self) -> vol.Schema:
"""Return data schema of this service."""
return None
raise NotImplementedError()
@property
def providers(self):
def providers(self) -> List[str]:
"""Return name of service providers addon."""
addons = []
for addon in self.sys_addons.list_installed:
@@ -36,24 +40,24 @@ class ServiceInterface(CoreSysAttributes):
return addons
@property
def enabled(self):
def enabled(self) -> bool:
"""Return True if the service is in use."""
return bool(self._data)
def save(self):
def save(self) -> None:
"""Save changes."""
self.sys_services.data.save_data()
def get_service_data(self):
def get_service_data(self) -> Optional[Dict[str, Any]]:
"""Return the requested service data."""
if self.enabled:
return self._data
return None
def set_service_data(self, addon, data):
def set_service_data(self, addon: Addon, data: Dict[str, Any]) -> None:
"""Write the data into service object."""
raise NotImplementedError()
def del_service_data(self, addon):
def del_service_data(self, addon: Addon) -> None:
"""Remove the data from service object."""
raise NotImplementedError()
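
Because the base properties above now raise NotImplementedError instead of returning None, every concrete service has to override slug, _data, and schema. A hypothetical subclass, sketched only to illustrate that contract (the class name and its empty schema are invented):

from typing import Any, Dict

import voluptuous as vol

from hassio.services.interface import ServiceInterface

class ExampleService(ServiceInterface):
    """Hypothetical service, used only to show the required overrides."""

    @property
    def slug(self) -> str:
        return "example"

    @property
    def _data(self) -> Dict[str, Any]:
        # Real services return a slice of self.sys_services.data here.
        return {}

    @property
    def schema(self) -> vol.Schema:
        return vol.Schema({})

    # set_service_data()/del_service_data() still raise NotImplementedError in
    # the base class, so a real service would also implement those.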


@@ -0,0 +1 @@
"""Services modules."""


@@ -0,0 +1,81 @@
"""Provide the MQTT Service."""
import logging
from typing import Any, Dict
from hassio.addons.addon import Addon
from hassio.exceptions import ServicesError
from hassio.validate import NETWORK_PORT
import voluptuous as vol
from ..const import (
ATTR_ADDON,
ATTR_HOST,
ATTR_PASSWORD,
ATTR_PORT,
ATTR_PROTOCOL,
ATTR_SSL,
ATTR_USERNAME,
SERVICE_MQTT,
)
from ..interface import ServiceInterface
_LOGGER = logging.getLogger(__name__)
# pylint: disable=no-value-for-parameter
SCHEMA_SERVICE_MQTT = vol.Schema(
{
vol.Required(ATTR_HOST): vol.Coerce(str),
vol.Required(ATTR_PORT): NETWORK_PORT,
vol.Optional(ATTR_USERNAME): vol.Coerce(str),
vol.Optional(ATTR_PASSWORD): vol.Coerce(str),
vol.Optional(ATTR_SSL, default=False): vol.Boolean(),
vol.Optional(ATTR_PROTOCOL, default="3.1.1"): vol.All(
vol.Coerce(str), vol.In(["3.1", "3.1.1"])
),
}
)
SCHEMA_CONFIG_MQTT = SCHEMA_SERVICE_MQTT.extend(
{vol.Required(ATTR_ADDON): vol.Coerce(str)}
)
class MQTTService(ServiceInterface):
"""Provide MQTT services."""
@property
def slug(self) -> str:
"""Return slug of this service."""
return SERVICE_MQTT
@property
def _data(self) -> Dict[str, Any]:
"""Return data of this service."""
return self.sys_services.data.mqtt
@property
def schema(self) -> vol.Schema:
"""Return data schema of this service."""
return SCHEMA_SERVICE_MQTT
def set_service_data(self, addon: Addon, data: Dict[str, Any]) -> None:
"""Write the data into service object."""
if self.enabled:
_LOGGER.error("It is already a MQTT in use from %s", self._data[ATTR_ADDON])
raise ServicesError()
self._data.update(data)
self._data[ATTR_ADDON] = addon.slug
_LOGGER.info("Set %s as service provider for mqtt", addon.slug)
self.save()
def del_service_data(self, addon: Addon) -> None:
"""Remove the data from service object."""
if not self.enabled:
_LOGGER.warning("Can't remove not exists services")
raise ServicesError()
self._data.clear()
self.save()


@@ -1,50 +0,0 @@
"""Provide the MQTT Service."""
import logging
from .interface import ServiceInterface
from .validate import SCHEMA_SERVICE_MQTT
from ..const import ATTR_ADDON, SERVICE_MQTT
from ..exceptions import ServicesError
_LOGGER = logging.getLogger(__name__)
class MQTTService(ServiceInterface):
"""Provide MQTT services."""
@property
def slug(self):
"""Return slug of this service."""
return SERVICE_MQTT
@property
def _data(self):
"""Return data of this service."""
return self.sys_services.data.mqtt
@property
def schema(self):
"""Return data schema of this service."""
return SCHEMA_SERVICE_MQTT
def set_service_data(self, addon, data):
"""Write the data into service object."""
if self.enabled:
_LOGGER.error(
"It is already a MQTT in use from %s", self._data[ATTR_ADDON])
raise ServicesError()
self._data.update(data)
self._data[ATTR_ADDON] = addon.slug
_LOGGER.info("Set %s as service provider for mqtt", addon.slug)
self.save()
def del_service_data(self, addon):
"""Remove the data from service object."""
if not self.enabled:
_LOGGER.warning("Can't remove not exists services")
raise ServicesError()
self._data.clear()
self.save()
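
A rough sketch of driving the relocated MQTTService; the add-on object and the broker values are invented, and the payload simply mirrors the keys of SCHEMA_SERVICE_MQTT:

from hassio.services.modules.mqtt import MQTTService

def register_example_broker(coresys, addon):
    """Store hypothetical broker settings and read them back."""
    service = MQTTService(coresys)
    if service.enabled:
        return service.get_service_data()  # another add-on already provides MQTT

    service.set_service_data(addon, {      # raises ServicesError if already enabled
        "host": "172.30.32.1",             # example values only
        "port": 1883,
        "ssl": False,
    })
    return service.get_service_data()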


@@ -1,35 +1,12 @@
"""Validate services schema."""
import voluptuous as vol
from ..const import (
SERVICE_MQTT, ATTR_HOST, ATTR_PORT, ATTR_PASSWORD, ATTR_USERNAME, ATTR_SSL,
ATTR_ADDON, ATTR_PROTOCOL)
from ..validate import NETWORK_PORT
from ..utils.validate import schema_or
from .const import SERVICE_MQTT
from .modules.mqtt import SCHEMA_CONFIG_MQTT
# pylint: disable=no-value-for-parameter
SCHEMA_SERVICE_MQTT = vol.Schema({
vol.Required(ATTR_HOST): vol.Coerce(str),
vol.Required(ATTR_PORT): NETWORK_PORT,
vol.Optional(ATTR_USERNAME): vol.Coerce(str),
vol.Optional(ATTR_PASSWORD): vol.Coerce(str),
vol.Optional(ATTR_SSL, default=False): vol.Boolean(),
vol.Optional(ATTR_PROTOCOL, default='3.1.1'):
vol.All(vol.Coerce(str), vol.In(['3.1', '3.1.1'])),
})
SCHEMA_CONFIG_MQTT = SCHEMA_SERVICE_MQTT.extend({
vol.Required(ATTR_ADDON): vol.Coerce(str),
})
SCHEMA_SERVICES_CONFIG = vol.Schema({
vol.Optional(SERVICE_MQTT, default=dict): schema_or(SCHEMA_CONFIG_MQTT),
}, extra=vol.REMOVE_EXTRA)
DISCOVERY_SERVICES = {
SERVICE_MQTT: SCHEMA_SERVICE_MQTT,
}
SCHEMA_SERVICES_CONFIG = vol.Schema(
{vol.Optional(SERVICE_MQTT, default=dict): schema_or(SCHEMA_CONFIG_MQTT)},
extra=vol.REMOVE_EXTRA,
)
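
A hedged example of feeding the slimmed-down services schema; the payload is invented and the add-on slug is hypothetical:

from hassio.services.validate import SCHEMA_SERVICES_CONFIG

# Invented persisted payload; SCHEMA_CONFIG_MQTT additionally requires "addon".
raw = {
    "mqtt": {
        "addon": "core_mosquitto",  # hypothetical add-on slug
        "host": "172.30.32.1",
        "port": 1883,
    }
}
config = SCHEMA_SERVICES_CONFIG(raw)  # unknown top-level keys are dropped (REMOVE_EXTRA)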


@@ -39,6 +39,7 @@ from ..const import (
CRYPTO_AES128,
)
from ..coresys import CoreSys, CoreSysAttributes
from ..exceptions import AddonsError
from ..utils.json import write_json_file
from ..utils.tar import SecureTarFile
from .utils import key_to_iv, password_for_validating, password_to_key, remove_folder
@@ -289,7 +290,9 @@ class Snapshot(CoreSysAttributes):
'w', key=self._key)
# Take snapshot
if not await addon.snapshot(addon_file):
try:
await addon.snapshot(addon_file)
except AddonsError:
_LOGGER.error("Can't make snapshot from %s", addon.slug)
return
@@ -326,10 +329,11 @@ class Snapshot(CoreSysAttributes):
_LOGGER.error("Can't find snapshot for %s", addon.slug)
return
# Performe a restore
if not await addon.restore(addon_file):
# Perform a restore
try:
await addon.restore(addon_file)
except AddonsError:
_LOGGER.error("Can't restore snapshot for %s", addon.slug)
return
# Run tasks
tasks = [_addon_restore(addon) for addon in addon_list]
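
The hunk above moves addon.snapshot()/addon.restore() from boolean returns to raising AddonsError; a minimal sketch of the new calling convention (the helper name and logger setup are made up):

import logging

from hassio.exceptions import AddonsError

_LOGGER = logging.getLogger(__name__)

async def _try_restore(addon, tar_file) -> bool:
    """Return True on success, mirroring the new exception-based API."""
    try:
        await addon.restore(tar_file)
    except AddonsError:
        _LOGGER.error("Can't restore snapshot for %s", addon.slug)
        return False
    return True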


@@ -1,15 +1,24 @@
"""Home Assistant control object."""
import asyncio
from contextlib import suppress
from ipaddress import IPv4Address
import logging
from pathlib import Path
from tempfile import TemporaryDirectory
from typing import Awaitable, Optional
import aiohttp
from .coresys import CoreSysAttributes
from .docker.supervisor import DockerSupervisor
from .const import URL_HASSIO_APPARMOR
from .exceptions import HostAppArmorError
from .coresys import CoreSys, CoreSysAttributes
from .docker.stats import DockerStats
from .docker.supervisor import DockerSupervisor
from .exceptions import (
DockerAPIError,
HostAppArmorError,
SupervisorError,
SupervisorUpdateError,
)
_LOGGER = logging.getLogger(__name__)
@@ -17,43 +26,52 @@ _LOGGER = logging.getLogger(__name__)
class Supervisor(CoreSysAttributes):
"""Home Assistant core object for handle it."""
def __init__(self, coresys):
def __init__(self, coresys: CoreSys):
"""Initialize hass object."""
self.coresys = coresys
self.instance = DockerSupervisor(coresys)
self.coresys: CoreSys = coresys
self.instance: DockerSupervisor = DockerSupervisor(coresys)
async def load(self):
async def load(self) -> None:
"""Prepare Home Assistant object."""
if not await self.instance.attach():
try:
await self.instance.attach()
except DockerAPIError:
_LOGGER.fatal("Can't setup Supervisor Docker container!")
with suppress(DockerAPIError):
await self.instance.cleanup()
@property
def need_update(self):
def ip_address(self) -> IPv4Address:
"""Return IP of Supervisor instance."""
return self.instance.ip_address
@property
def need_update(self) -> bool:
"""Return True if an update is available."""
return self.version != self.last_version
@property
def version(self):
def version(self) -> str:
"""Return version of running Home Assistant."""
return self.instance.version
@property
def last_version(self):
def last_version(self) -> str:
"""Return last available version of Home Assistant."""
return self.sys_updater.version_hassio
@property
def image(self):
def image(self) -> str:
"""Return image name of Home Assistant container."""
return self.instance.image
@property
def arch(self):
def arch(self) -> str:
"""Return arch of the Hass.io container."""
return self.instance.arch
async def update_apparmor(self):
async def update_apparmor(self) -> None:
"""Fetch last version and update profile."""
url = URL_HASSIO_APPARMOR
try:
@@ -63,22 +81,25 @@ class Supervisor(CoreSysAttributes):
except (aiohttp.ClientError, asyncio.TimeoutError) as err:
_LOGGER.warning("Can't fetch AppArmor profile: %s", err)
return
raise SupervisorError() from None
with TemporaryDirectory(dir=self.sys_config.path_tmp) as tmp_dir:
profile_file = Path(tmp_dir, 'apparmor.txt')
profile_file = Path(tmp_dir, "apparmor.txt")
try:
profile_file.write_text(data)
except OSError as err:
_LOGGER.error("Can't write temporary profile: %s", err)
return
raise SupervisorError() from None
try:
await self.sys_host.apparmor.load_profile(
"hassio-supervisor", profile_file)
"hassio-supervisor", profile_file
)
except HostAppArmorError:
_LOGGER.error("Can't update AppArmor profile!")
raise SupervisorError() from None
async def update(self, version=None):
async def update(self, version: Optional[str] = None) -> None:
"""Update Home Assistant version."""
version = version or self.last_version
@@ -87,29 +108,31 @@ class Supervisor(CoreSysAttributes):
return
_LOGGER.info("Update Supervisor to version %s", version)
if await self.instance.install(version):
try:
await self.instance.install(version)
except DockerAPIError:
_LOGGER.error("Update of Hass.io fails!")
raise SupervisorUpdateError() from None
with suppress(SupervisorError):
await self.update_apparmor()
self.sys_loop.call_later(1, self.sys_loop.stop)
return True
_LOGGER.error("Update of Hass.io fails!")
return False
@property
def in_progress(self):
def in_progress(self) -> bool:
"""Return True if a task is in progress."""
return self.instance.in_progress
def logs(self):
def logs(self) -> Awaitable[bytes]:
"""Get Supervisor docker logs.
Return a coroutine.
Return Coroutine.
"""
return self.instance.logs()
def stats(self):
"""Return stats of Supervisor.
Return a coroutine.
"""
return self.instance.stats()
async def stats(self) -> DockerStats:
"""Return stats of Supervisor."""
try:
return await self.instance.stats()
except DockerAPIError:
raise SupervisorError() from None
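
Supervisor.update() and Supervisor.stats() now raise instead of returning booleans. A sketch of a caller written under that assumption; the function and its boolean return convention are invented, and it assumes the manager is reachable as coresys.supervisor (as the test fixtures further down suggest):

from hassio.coresys import CoreSys
from hassio.exceptions import SupervisorError, SupervisorUpdateError

async def refresh_supervisor(coresys: CoreSys) -> bool:
    """Update the Supervisor if needed and fetch container stats."""
    supervisor = coresys.supervisor
    if supervisor.need_update:
        try:
            await supervisor.update()  # raises SupervisorUpdateError on failure
        except SupervisorUpdateError:
            return False
    try:
        await supervisor.stats()       # DockerStats, or SupervisorError on failure
    except SupervisorError:
        return False
    return True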


@@ -7,7 +7,7 @@ from .exceptions import HomeAssistantError
_LOGGER = logging.getLogger(__name__)
HASS_WATCHDOG_API = 'HASS_WATCHDOG_API'
HASS_WATCHDOG_API = "HASS_WATCHDOG_API"
RUN_UPDATE_SUPERVISOR = 29100
RUN_UPDATE_ADDONS = 57600
@@ -17,6 +17,7 @@ RUN_RELOAD_ADDONS = 21600
RUN_RELOAD_SNAPSHOTS = 72000
RUN_RELOAD_HOST = 72000
RUN_RELOAD_UPDATER = 21600
RUN_RELOAD_INGRESS = 930
RUN_WATCHDOG_HOMEASSISTANT_DOCKER = 15
RUN_WATCHDOG_HOMEASSISTANT_API = 300
@@ -33,28 +34,55 @@ class Tasks(CoreSysAttributes):
async def load(self):
"""Add Tasks to scheduler."""
self.jobs.add(self.sys_scheduler.register_task(
self._update_addons, RUN_UPDATE_ADDONS))
self.jobs.add(self.sys_scheduler.register_task(
self._update_supervisor, RUN_UPDATE_SUPERVISOR))
self.jobs.add(self.sys_scheduler.register_task(
self._update_hassos_cli, RUN_UPDATE_HASSOSCLI))
# Update
self.jobs.add(
self.sys_scheduler.register_task(self._update_addons, RUN_UPDATE_ADDONS)
)
self.jobs.add(
self.sys_scheduler.register_task(
self._update_supervisor, RUN_UPDATE_SUPERVISOR
)
)
self.jobs.add(
self.sys_scheduler.register_task(
self._update_hassos_cli, RUN_UPDATE_HASSOSCLI
)
)
self.jobs.add(self.sys_scheduler.register_task(
self.sys_addons.reload, RUN_RELOAD_ADDONS))
self.jobs.add(self.sys_scheduler.register_task(
self.sys_updater.reload, RUN_RELOAD_UPDATER))
self.jobs.add(self.sys_scheduler.register_task(
self.sys_snapshots.reload, RUN_RELOAD_SNAPSHOTS))
self.jobs.add(self.sys_scheduler.register_task(
self.sys_host.reload, RUN_RELOAD_HOST))
# Reload
self.jobs.add(
self.sys_scheduler.register_task(self.sys_addons.reload, RUN_RELOAD_ADDONS)
)
self.jobs.add(
self.sys_scheduler.register_task(
self.sys_updater.reload, RUN_RELOAD_UPDATER
)
)
self.jobs.add(
self.sys_scheduler.register_task(
self.sys_snapshots.reload, RUN_RELOAD_SNAPSHOTS
)
)
self.jobs.add(
self.sys_scheduler.register_task(self.sys_host.reload, RUN_RELOAD_HOST)
)
self.jobs.add(
self.sys_scheduler.register_task(
self.sys_ingress.reload, RUN_RELOAD_INGRESS
)
)
self.jobs.add(self.sys_scheduler.register_task(
self._watchdog_homeassistant_docker,
RUN_WATCHDOG_HOMEASSISTANT_DOCKER))
self.jobs.add(self.sys_scheduler.register_task(
self._watchdog_homeassistant_api,
RUN_WATCHDOG_HOMEASSISTANT_API))
# Watchdog
self.jobs.add(
self.sys_scheduler.register_task(
self._watchdog_homeassistant_docker, RUN_WATCHDOG_HOMEASSISTANT_DOCKER
)
)
self.jobs.add(
self.sys_scheduler.register_task(
self._watchdog_homeassistant_api, RUN_WATCHDOG_HOMEASSISTANT_API
)
)
_LOGGER.info("All core tasks are scheduled")
@@ -72,7 +100,8 @@ class Tasks(CoreSysAttributes):
tasks.append(addon.update())
else:
_LOGGER.warning(
"Add-on %s will be ignore, schema tests fails", addon.slug)
"Add-on %s will be ignored, schema tests fails", addon.slug
)
if tasks:
_LOGGER.info("Add-on auto update process %d tasks", len(tasks))
@@ -94,14 +123,18 @@ class Tasks(CoreSysAttributes):
async def _watchdog_homeassistant_docker(self):
"""Check running state of Docker and start if they is close."""
# if Home Assistant is active
if not await self.sys_homeassistant.is_initialize() or \
not self.sys_homeassistant.watchdog or \
self.sys_homeassistant.error_state:
if (
not await self.sys_homeassistant.is_fails()
or not self.sys_homeassistant.watchdog
or self.sys_homeassistant.error_state
):
return
# if Home Assistant is running
if self.sys_homeassistant.in_progress or \
await self.sys_homeassistant.is_running():
if (
self.sys_homeassistant.in_progress
or await self.sys_homeassistant.is_running()
):
return
_LOGGER.warning("Watchdog found a problem with Home Assistant Docker!")
@@ -117,17 +150,21 @@ class Tasks(CoreSysAttributes):
a delay in our system.
"""
# If Home-Assistant is active
if not await self.sys_homeassistant.is_initialize() or \
not self.sys_homeassistant.watchdog or \
self.sys_homeassistant.error_state:
if (
not await self.sys_homeassistant.is_fails()
or not self.sys_homeassistant.watchdog
or self.sys_homeassistant.error_state
):
return
# Init cache data
retry_scan = self._cache.get(HASS_WATCHDOG_API, 0)
# If Home-Assistant API is up
if self.sys_homeassistant.in_progress or \
await self.sys_homeassistant.check_api_state():
if (
self.sys_homeassistant.in_progress
or await self.sys_homeassistant.check_api_state()
):
return
# Looks like we ran into a problem
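
All of the reformatted registrations above follow the same register_task(coroutine, interval_in_seconds) pattern. A hypothetical extra periodic job, sketched the same way; the class, interval, and callback are invented, and register_task is assumed to return a hashable job handle, as the jobs set implies:

from hassio.coresys import CoreSysAttributes

RUN_RELOAD_EXAMPLE = 3600  # hypothetical interval in seconds

class ExampleTasks(CoreSysAttributes):
    """Hypothetical task container reusing the scheduler pattern."""

    def __init__(self, coresys):
        self.coresys = coresys
        self.jobs = set()

    async def load(self):
        """Register the periodic job and keep its handle, like Tasks does."""
        self.jobs.add(
            self.sys_scheduler.register_task(self._reload_example, RUN_RELOAD_EXAMPLE)
        )

    async def _reload_example(self):
        """Hypothetical periodic work."""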


@@ -1,32 +1,26 @@
"""Tools file for Hass.io."""
from datetime import datetime
import hashlib
import logging
import re
import uuid
from datetime import datetime
_LOGGER = logging.getLogger(__name__)
RE_STRING = re.compile(r"\x1b(\[.*?[@-~]|\].*?(\x07|\x1b\\))")
def convert_to_ascii(raw):
def convert_to_ascii(raw) -> str:
"""Convert binary to ascii and remove colors."""
return RE_STRING.sub("", raw.decode())
def create_token():
"""Create token for API access."""
return hashlib.sha256(uuid.uuid4().bytes).hexdigest()
def process_lock(method):
"""Wrap function with only run once."""
async def wrap_api(api, *args, **kwargs):
"""Return api wrapper."""
if api.lock.locked():
_LOGGER.error(
"Can't execute %s while a task is in progress",
method.__name__)
"Can't execute %s while a task is in progress", method.__name__
)
return False
async with api.lock:
@@ -40,6 +34,7 @@ class AsyncThrottle:
Decorator that prevents a function from being called more than once every
time period.
"""
def __init__(self, delta):
"""Initialize async throttle."""
self.throttle_period = delta
@@ -47,6 +42,7 @@ class AsyncThrottle:
def __call__(self, method):
"""Throttle function"""
async def wrapper(*args, **kwargs):
"""Throttle function wrapper"""
now = datetime.now()
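
A hedged usage sketch for the AsyncThrottle decorator above; the consumer class is invented, the import path assumes this hunk is hassio/utils/__init__.py, and the delta argument is assumed to be a datetime.timedelta:

from datetime import timedelta

from hassio.utils import AsyncThrottle

class Example:
    """Hypothetical consumer of the throttle decorator."""

    @AsyncThrottle(timedelta(seconds=30))
    async def reload(self):
        """Body runs at most once per 30 seconds; extra calls return early."""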


@@ -58,6 +58,11 @@ def parse_datetime(dt_str):
return datetime(**kws)
def utcnow():
def utcnow() -> datetime:
"""Return the current timestamp including timezone."""
return datetime.now(UTC)
def utc_from_timestamp(timestamp: float) -> datetime:
"""Return a UTC time from a timestamp."""
return UTC.localize(datetime.utcfromtimestamp(timestamp))
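
The new utc_from_timestamp() pairs naturally with the float timestamps the ingress session schema (further down) stores. A tiny illustrative use, assuming this hunk belongs to hassio/utils/dt.py:

from hassio.utils.dt import utc_from_timestamp, utcnow

expires = utc_from_timestamp(1554600000.0)  # arbitrary epoch seconds
still_valid = expires > utcnow()            # both values are timezone-aware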


@@ -1,19 +1,35 @@
"""Validate functions."""
import uuid
import re
import uuid
import voluptuous as vol
from .const import (
ATTR_IMAGE, ATTR_LAST_VERSION, ATTR_CHANNEL, ATTR_TIMEZONE, ATTR_HASSOS,
ATTR_ADDONS_CUSTOM_LIST, ATTR_PASSWORD, ATTR_HOMEASSISTANT, ATTR_HASSIO,
ATTR_BOOT, ATTR_LAST_BOOT, ATTR_SSL, ATTR_PORT, ATTR_WATCHDOG, ATTR_CONFIG,
ATTR_WAIT_BOOT, ATTR_UUID, ATTR_REFRESH_TOKEN, ATTR_HASSOS_CLI,
ATTR_ACCESS_TOKEN, ATTR_DISCOVERY, ATTR_ADDON, ATTR_SERVICE,
SERVICE_MQTT,
CHANNEL_STABLE, CHANNEL_BETA, CHANNEL_DEV)
from .utils.validate import schema_or, validate_timezone
ATTR_ACCESS_TOKEN,
ATTR_ADDONS_CUSTOM_LIST,
ATTR_BOOT,
ATTR_CHANNEL,
ATTR_HASSIO,
ATTR_HASSOS,
ATTR_HASSOS_CLI,
ATTR_HOMEASSISTANT,
ATTR_IMAGE,
ATTR_LAST_BOOT,
ATTR_LAST_VERSION,
ATTR_PASSWORD,
ATTR_PORT,
ATTR_REFRESH_TOKEN,
ATTR_SESSION,
ATTR_SSL,
ATTR_TIMEZONE,
ATTR_UUID,
ATTR_WAIT_BOOT,
ATTR_WATCHDOG,
CHANNEL_BETA,
CHANNEL_DEV,
CHANNEL_STABLE,
)
from .utils.validate import validate_timezone
RE_REPOSITORY = re.compile(r"^(?P<url>[^#]+)(?:#(?P<branch>[\w\-]+))?$")
@@ -24,7 +40,7 @@ ALSA_DEVICE = vol.Maybe(vol.Match(r"\d+,\d+"))
CHANNELS = vol.In([CHANNEL_STABLE, CHANNEL_BETA, CHANNEL_DEV])
UUID_MATCH = vol.Match(r"^[0-9a-f]{32}$")
SHA256 = vol.Match(r"^[0-9a-f]{64}$")
SERVICE_ALL = vol.In([SERVICE_MQTT])
TOKEN = vol.Match(r"^[0-9a-f]{32,256}$")
def validate_repository(repository):
@@ -35,7 +51,7 @@ def validate_repository(repository):
# Validate URL
# pylint: disable=no-value-for-parameter
vol.Url()(data.group('url'))
vol.Url()(data.group("url"))
return repository
@@ -66,63 +82,67 @@ def convert_to_docker_ports(data):
raise vol.Invalid("Can't validate Docker host settings")
DOCKER_PORTS = vol.Schema({
vol.All(vol.Coerce(str), vol.Match(r"^\d+(?:/tcp|/udp)?$")):
convert_to_docker_ports,
})
DOCKER_PORTS = vol.Schema(
{
vol.All(
vol.Coerce(str), vol.Match(r"^\d+(?:/tcp|/udp)?$")
): convert_to_docker_ports
}
)
# pylint: disable=no-value-for-parameter
SCHEMA_HASS_CONFIG = vol.Schema({
SCHEMA_HASS_CONFIG = vol.Schema(
{
vol.Optional(ATTR_UUID, default=lambda: uuid.uuid4().hex): UUID_MATCH,
vol.Optional(ATTR_ACCESS_TOKEN): SHA256,
vol.Optional(ATTR_ACCESS_TOKEN): TOKEN,
vol.Optional(ATTR_BOOT, default=True): vol.Boolean(),
vol.Inclusive(ATTR_IMAGE, 'custom_hass'): DOCKER_IMAGE,
vol.Inclusive(ATTR_LAST_VERSION, 'custom_hass'): vol.Coerce(str),
vol.Inclusive(ATTR_IMAGE, "custom_hass"): DOCKER_IMAGE,
vol.Inclusive(ATTR_LAST_VERSION, "custom_hass"): vol.Coerce(str),
vol.Optional(ATTR_PORT, default=8123): NETWORK_PORT,
vol.Optional(ATTR_PASSWORD): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_REFRESH_TOKEN): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_SSL, default=False): vol.Boolean(),
vol.Optional(ATTR_WATCHDOG, default=True): vol.Boolean(),
vol.Optional(ATTR_WAIT_BOOT, default=600):
vol.All(vol.Coerce(int), vol.Range(min=60)),
}, extra=vol.REMOVE_EXTRA)
vol.Optional(ATTR_WAIT_BOOT, default=600): vol.All(
vol.Coerce(int), vol.Range(min=60)
),
},
extra=vol.REMOVE_EXTRA,
)
SCHEMA_UPDATER_CONFIG = vol.Schema({
SCHEMA_UPDATER_CONFIG = vol.Schema(
{
vol.Optional(ATTR_CHANNEL, default=CHANNEL_STABLE): CHANNELS,
vol.Optional(ATTR_HOMEASSISTANT): vol.Coerce(str),
vol.Optional(ATTR_HASSIO): vol.Coerce(str),
vol.Optional(ATTR_HASSOS): vol.Coerce(str),
vol.Optional(ATTR_HASSOS_CLI): vol.Coerce(str),
}, extra=vol.REMOVE_EXTRA)
},
extra=vol.REMOVE_EXTRA,
)
# pylint: disable=no-value-for-parameter
SCHEMA_HASSIO_CONFIG = vol.Schema({
vol.Optional(ATTR_TIMEZONE, default='UTC'): validate_timezone,
SCHEMA_HASSIO_CONFIG = vol.Schema(
{
vol.Optional(ATTR_TIMEZONE, default="UTC"): validate_timezone,
vol.Optional(ATTR_LAST_BOOT): vol.Coerce(str),
vol.Optional(ATTR_ADDONS_CUSTOM_LIST, default=[
"https://github.com/hassio-addons/repository",
]): REPOSITORIES,
vol.Optional(
ATTR_ADDONS_CUSTOM_LIST,
default=["https://github.com/hassio-addons/repository"],
): REPOSITORIES,
vol.Optional(ATTR_WAIT_BOOT, default=5): WAIT_BOOT,
}, extra=vol.REMOVE_EXTRA)
},
extra=vol.REMOVE_EXTRA,
)
SCHEMA_DISCOVERY = vol.Schema([
vol.Schema({
vol.Required(ATTR_UUID): UUID_MATCH,
vol.Required(ATTR_ADDON): vol.Coerce(str),
vol.Required(ATTR_SERVICE): SERVICE_ALL,
vol.Required(ATTR_CONFIG): vol.Maybe(dict),
}, extra=vol.REMOVE_EXTRA)
])
SCHEMA_DISCOVERY_CONFIG = vol.Schema({
vol.Optional(ATTR_DISCOVERY, default=list): schema_or(SCHEMA_DISCOVERY),
}, extra=vol.REMOVE_EXTRA)
SCHEMA_AUTH_CONFIG = vol.Schema({SHA256: SHA256})
SCHEMA_AUTH_CONFIG = vol.Schema({
SHA256: SHA256
})
SCHEMA_INGRESS_CONFIG = vol.Schema(
{vol.Required(ATTR_SESSION, default=dict): vol.Schema({TOKEN: vol.Coerce(float)})},
extra=vol.REMOVE_EXTRA,
)
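
An invented example of the new TOKEN and SCHEMA_INGRESS_CONFIG validators; it assumes ATTR_SESSION maps to the key "session" and uses an arbitrary hex string and timestamp:

from hassio.validate import SCHEMA_INGRESS_CONFIG, TOKEN

session = "c0ffee" * 8  # 48 lowercase hex chars, matches the TOKEN pattern
TOKEN(session)          # raises vol.Invalid for malformed tokens
config = SCHEMA_INGRESS_CONFIG({"session": {session: 1554600000.0}})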


@@ -45,3 +45,7 @@ disable=
[EXCEPTIONS]
overgeneral-exceptions=Exception
[TYPECHECK]
ignored-modules = distutils


@@ -4,10 +4,10 @@ attrs==18.2.0
cchardet==2.1.4
colorlog==4.0.2
cpe==1.2.1
cryptography==2.5
docker==3.7.0
cryptography==2.6.1
docker==3.7.2
gitpython==2.1.11
pytz==2018.9
pyudev==0.21.0
uvloop==0.11.3
uvloop==0.12.2
voluptuous==0.11.5


@@ -1,5 +1,5 @@
flake8==3.7.7
pylint==2.3.0
pylint==2.3.1
pytest==4.3.0
pytest-timeout==1.3.3
pytest-aiohttp==0.3.0


@@ -14,4 +14,4 @@ use_parentheses = true
[flake8]
max-line-length = 88
ignore = E501
ignore = E501, W503


@@ -11,32 +11,30 @@ from hassio.bootstrap import initialize_coresys
@pytest.fixture
def docker():
"""Mock Docker API."""
with patch('hassio.coresys.DockerAPI') as mock:
with patch("hassio.coresys.DockerAPI") as mock:
yield mock
@pytest.fixture
async def coresys(loop, docker):
"""Create a CoreSys Mock."""
with patch('hassio.bootstrap.initialize_system_data'):
with patch("hassio.bootstrap.initialize_system_data"):
coresys_obj = await initialize_coresys()
coresys_obj.ingress.save_data = MagicMock()
yield coresys_obj
@pytest.fixture
def sys_machine():
"""Mock sys_machine."""
with patch(
'hassio.coresys.CoreSys.machine',
new_callable=PropertyMock) as mock:
with patch("hassio.coresys.CoreSys.machine", new_callable=PropertyMock) as mock:
yield mock
@pytest.fixture
def sys_supervisor():
with patch(
'hassio.coresys.CoreSys.supervisor',
new_callable=PropertyMock) as mock:
with patch("hassio.coresys.CoreSys.supervisor", new_callable=PropertyMock) as mock:
mock.return_value = MagicMock()
yield MagicMock
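
A hypothetical test built on the coresys fixture above; pytest-aiohttp from requirements_tests.txt collects bare async test functions, and the only assertion relies on the MagicMock the fixture installs:

async def test_coresys_fixture(coresys):
    """Check that the fixture stubs out ingress persistence."""
    # save_data was replaced with a MagicMock in the coresys fixture above
    coresys.ingress.save_data.assert_not_called()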

Some files were not shown because too many files have changed in this diff.