Compare commits


149 Commits
132 ... 153

Author SHA1 Message Date
Pascal Vizeli
82b2f66920 Merge pull request #1007 from home-assistant/dev
Release 153
2019-04-07 00:21:54 +02:00
Pascal Vizeli
01da42e1b6 Add add-ons reload for manager (#1006) 2019-04-07 00:15:33 +02:00
Pascal Vizeli
d652d22547 Panel ingress fix (#1004) 2019-04-06 23:03:54 +02:00
Pascal Vizeli
baea84abe6 Panel ingress support (#999) 2019-04-06 12:03:26 +02:00
Pascal Vizeli
c2d705a42a Fix ingress_url with not installed add-ons (#998)
Fix ingress_url with not installed add-ons
2019-04-06 11:24:04 +02:00
Pascal Vizeli
f10b433e1f Fix token handling with new secrets (#996)
* Fix token handling with new secrets

* add schema also to ingress
2019-04-05 17:49:43 +02:00
Pascal Vizeli
67f562a846 Init ingress session boarder / lookup (#995)
* Init ingress session boarder / lookup

* Add session to API

* Add cokkie validate

* Do it without event bus

* Add logger

* Fix validation

* Add tests

* Update tests

* Mock json storage
2019-04-05 17:36:07 +02:00
Pascal Vizeli
1edec61133 Add Ingress support (#991)
* Add Ingress support to supervisor

* Update security

* cleanup add-on extraction

* update description

* fix header part

* fix

* Fix header check

* fix tox

* Migrate docker interface typing

* Update home assistant to new docker

* Migrate supervisor

* Fix host add-on problem

* Update hassos

* Update API

* Expose data to API

* Check on API ingress support

* Add ingress URL

* Some cleanups

* debug

* disable uvloop

* Fix issue

* test

* Fix bug

* Fix flow

* Fix interface

* Fix network

* Fix metadata

* cleanups

* Fix exception

* Migrate to token system

* Fix webui

* Fix update

* Fix relaod

* Update log messages

* Attach ingress url only if enabled

* Cleanup ingress url handling

* Ingress update

* Support check version

* Fix raise error

* Migrate default port

* Fix junks

* search error

* Fix content filter

* Add debug

* Update log

* Update flags

* Update documentation

* Cleanup debugs

* Fix lint

* change default port to 8099

* Fix lint

* fix lint
2019-04-05 12:13:44 +02:00
Stephen Beechen
c13a33bf71 Downgrade add-on API access logging to debug (#992)
resolves home-assistant/hassio#865
2019-04-05 11:42:31 +02:00
Pascal Vizeli
2ae93ae7b1 Update API data for deconz (#989)
* Update API data for deconz

* Fix tests
2019-04-01 16:58:58 +02:00
Pascal Vizeli
8451020afe Update uvloop 0.12.2 (#988) 2019-04-01 14:02:28 +02:00
Pascal Vizeli
a48e568efc Update azure-pipelines.yml for Azure Pipelines [skip ci] 2019-04-01 14:01:59 +02:00
Pascal Vizeli
dee2808cb5 Delete main.workflow 2019-04-01 13:57:42 +02:00
Pascal Vizeli
06a2ab26a2 Update azure-pipelines.yml for Azure Pipelines [skip ci] 2019-04-01 13:56:02 +02:00
Pascal Vizeli
45de0f2f39 Update azure-pipelines.yml for Azure Pipelines [skip ci] 2019-04-01 13:05:23 +02:00
Pascal Vizeli
bac5f704dc Update azure-pipelines.yml for Azure Pipelines [skip ci] 2019-04-01 13:01:37 +02:00
Pascal Vizeli
79669a5d04 Set up CI with Azure Pipelines [skip ci] 2019-04-01 12:49:59 +02:00
Pascal Vizeli
a6e712c9ea Bump version 153 2019-03-30 23:37:11 +01:00
Pascal Vizeli
069fe99699 Merge pull request #984 from home-assistant/dev
Release 152
2019-03-30 23:35:54 +01:00
Pascal Vizeli
4754f067ad Update panel for 0.91.0 (#983) 2019-03-30 23:30:46 +01:00
Pascal Vizeli
dce9818812 Bump version 152 2019-03-28 15:01:53 +01:00
Pascal Vizeli
d054b6dbb7 Merge pull request #979 from home-assistant/dev
Release 151
2019-03-28 15:01:23 +01:00
Pascal Vizeli
3093165325 Update cryptography (#981) 2019-03-28 14:37:06 +01:00
Pascal Vizeli
fd9c5bd412 Make arch required (#978) 2019-03-28 14:23:46 +01:00
Pascal Vizeli
9a8850fecd Remove unused pylint 2019-03-28 14:13:36 +01:00
Pascal Vizeli
b12175ab9a Support for deconz discovery & cleanup (#974)
* Support for deconz discovery & cleanup

* Split discovery

* Fix lint

* Fix lint / import
2019-03-28 14:11:18 +01:00
Pascal Vizeli
b52f90187b Make homeassistant container constant (#808)
* Make homeassistant container constant

* Update homeassistant.py

* Update homeassistant.py

* Update interface.py

* Update homeassistant.py

* Fix handling

* add start function

* Add typing

* Fix lint

* Add API call

* Update logs

* Fix some issue with watchdog

* Fix lint
2019-03-27 17:20:05 +01:00
Pascal Vizeli
4eb02f474d Bump version 151 2019-03-20 22:09:42 +01:00
Pascal Vizeli
dfdcddfd0b Merge pull request #968 from home-assistant/dev
Release 150
2019-03-20 22:08:17 +01:00
Pascal Vizeli
0391277bad Fix panel for 0.90.0 (#967) 2019-03-20 22:03:31 +01:00
Pascal Vizeli
73643b9bfe Bump version 150 2019-03-19 21:29:47 +01:00
Pascal Vizeli
93a52b8382 Merge pull request #965 from home-assistant/dev
Release 149
2019-03-19 21:26:38 +01:00
Pascal Vizeli
7a91bb1f6c Update panel for 0.90.0 v6 (#963) 2019-03-19 19:01:52 +01:00
Pascal Vizeli
26efa998a1 Revert dev link (#956) 2019-03-18 09:42:31 +01:00
Pascal Vizeli
fc9f3fee0a Fix 2019-03-18 09:20:48 +01:00
cadwal
ec19bd570b Include serial device node links in container device mapping to allow for persistent names in the HA serial config (#944) 2019-03-18 09:05:04 +01:00
David McNett
3335bad9e1 Correct typo: 'ignore' -> 'ignored' (#947) 2019-03-18 09:02:29 +01:00
Pascal Vizeli
71ae334e24 Update pylint (#945) 2019-03-11 14:03:28 +01:00
Pascal Vizeli
0807651fbd Bump version 149 2019-03-08 11:59:41 +01:00
Pascal Vizeli
7026d42d77 Merge pull request #942 from home-assistant/dev
Release 148
2019-03-08 11:41:58 +01:00
Pascal Vizeli
31047b9ec2 Down or upgrade exists image on restore (#941) 2019-03-08 11:36:36 +01:00
Pascal Vizeli
714791de8f Bump version 148 2019-03-07 21:12:54 +01:00
Pascal Vizeli
c544fff2b2 Merge pull request #939 from home-assistant/dev
Release 147
2019-03-07 21:12:20 +01:00
Pascal Vizeli
fc45670686 Fix bug with update (#938) 2019-03-07 21:09:43 +01:00
Pascal Vizeli
5cefa0a2ee Bump version 147 2019-03-07 16:28:39 +01:00
Pascal Vizeli
a1910d4135 Merge pull request #937 from home-assistant/dev
Release 146
2019-03-07 16:28:09 +01:00
Pascal Vizeli
f1fecdde3a Enable Armv7 for Add-ons (#936)
* Enable Armv7 for Add-ons

* Cleanups

* fix tests
2019-03-07 16:00:41 +01:00
Pascal Vizeli
9ba4ea7d18 Check json files too 2019-03-07 10:03:07 +01:00
Pascal Vizeli
58a455d639 Fix lint 2019-03-04 10:09:34 +01:00
Pascal Vizeli
3ea85f6a28 Delete .travis.yml 2019-03-04 10:04:19 +01:00
Pascal Vizeli
4e1469ada4 Replace travis 2019-03-04 10:03:54 +01:00
Curtis Gibby
5778f78f28 Fix misspelling on "environment" (#934) 2019-03-04 10:00:41 +01:00
Pascal Vizeli
227125cc0b Change json error handling (#930)
* Change json error handling

* Typing + modern way to read file

* fix lint
2019-02-26 00:19:05 +01:00
Pascal Vizeli
b36e178c45 Bump version to 146 2019-02-21 17:24:01 +01:00
Pascal Vizeli
32c9198fb2 Merge pull request #929 from home-assistant/dev
Release 145
2019-02-21 17:21:43 +01:00
Pascal Vizeli
6983dcc267 Fix image arch version on restore/update (#928) 2019-02-21 16:40:49 +01:00
Pascal Vizeli
813fcc41f0 Bump version 145 2019-02-20 17:04:41 +01:00
Pascal Vizeli
f4e9dd0f1c Merge pull request #927 from home-assistant/dev
Release 144
2019-02-20 17:04:15 +01:00
Pascal Vizeli
7f074142bf Replace pycrpytodome with cryptocraphy (#923)
* Replace pycrpytodome with cryptocraphy

* Fix typing

* fix typing

* Fix lints

* Fix build

* Add musl libc

* Fix lint

* fix lint

* Fix algo

* Add more typing fix crypto imports v2

* Fix padding
2019-02-20 10:30:22 +01:00
Pascal Vizeli
b6df37628d Merge pull request #924 from home-assistant/feat-wait-time
Increase wait time for home assistant startup
2019-02-18 16:24:21 +01:00
Pascal Vizeli
7867eded50 Increase wait time for home assistant startup 2019-02-18 09:51:21 +01:00
Pascal Vizeli
311abb8a90 Bump version 144 2019-02-02 11:48:29 +01:00
Pascal Vizeli
21303f4b05 Merge pull request #913 from home-assistant/dev
Release 143
2019-02-02 11:47:13 +01:00
Pascal Vizeli
da3270af67 Fix that need_build work like image (#912) 2019-01-31 22:08:10 +01:00
Pascal Vizeli
35aae69f23 Support armv7 and allow support of multible arch types per CPU (#892)
* Support armv7 and first abstraction

* Change layout

* Add more type hints

* Fix imports

* Update

* move forward

* add tests

* fix type

* fix lint & tests

* fix tests

* Fix unittests

* Fix create folder

* cleanup

* Fix import order

* cleanup loop parameter

* cleanup init function

* Allow changeable image name

* fix setup

* Fix load of arch

* Fix lint

* Add typing

* fix init

* fix hassos cli problem & stick on supervisor arch

* address comments

* cleanup

* Fix image selfheal

* Add comment

* update uvloop

* remove uvloop

* fix tagging

* Fix install name

* Fix validate build config

* Abstract image_name from system cache
2019-01-31 18:47:44 +01:00
Franck Nijhof
118a2e1951 Revert "Delete move.yml" (#901)
This reverts commit 07c4058a8c.
2019-01-22 12:19:38 +01:00
Pascal Vizeli
9053341581 Fix wrong UTF-8 config files (#895)
* Fix wrong UTF-8 config files

* Fix lint

* Update data.py
2019-01-18 18:57:54 +01:00
Pascal Vizeli
27532a8a00 Update aioHttp 3.5.4 (#894) 2019-01-17 21:40:52 +01:00
Pascal Vizeli
7fdfa630b5 Bump version 143 2019-01-15 12:11:56 +01:00
Pascal Vizeli
3974d5859f Merge pull request #890 from home-assistant/dev
Release 142
2019-01-15 12:10:58 +01:00
Pascal Vizeli
aa1c765c4b Add support for SYS_MODULE (#889)
* Add support for SYS_MODULE

* Update flake stuff

* Fix lint

* Fix lint

* Fix lint

* Fix lint
2019-01-15 00:56:07 +01:00
Pascal Vizeli
e78385e7ea Support to map kernel modules ro into container (#888) 2019-01-14 23:20:30 +01:00
Pascal Vizeli
9d59b56c94 Fix lint 2019-01-14 23:20:07 +01:00
Pascal Vizeli
9d72dcabfc Support to map kernel modules ro into container 2019-01-14 21:57:14 +01:00
Pascal Vizeli
a0b5d0b67e Fix error on first run because the landing page already run (#886)
* Fix error on first run because the landing page already run

* Update homeassistant.py
2019-01-14 21:25:17 +01:00
Pascal Vizeli
2b5520405f Fix log info about update on dev (#885) 2019-01-14 20:05:03 +01:00
Pascal Vizeli
ca376b3fcd Update docker-py to 3.7.0 (#882)
* Update docker-py to 3.7.0

* Update __init__.py

* Update addon.py
2019-01-14 20:04:27 +01:00
Pascal Vizeli
11e3c0c547 Update aioHttp to 3.5.2 (#881) 2019-01-13 12:22:01 +01:00
Pascal Vizeli
9da136e037 Fix API descriptions 2019-01-02 23:31:35 +01:00
Pascal Vizeli
9b3e59d876 Merge pull request #861 from casperklein/patch-1
Duplicate entry removed.
2018-12-20 16:18:29 +01:00
Casper
7a592795b5 Duplicate entry removed. 2018-12-20 13:45:04 +01:00
Pascal Vizeli
5b92137699 Bump version 142 2018-12-11 23:46:01 +01:00
Pascal Vizeli
7520cdfeb4 Merge pull request #853 from home-assistant/dev
Release 141
2018-12-11 23:45:29 +01:00
Pascal Vizeli
0ada791e3a Update Panel for Home Assistant 0.84.0 (#852) 2018-12-11 20:54:30 +01:00
Pascal Vizeli
73afced4dc Bugfix stack trace on remove (#842) 2018-11-30 00:09:33 +01:00
Pascal Vizeli
633a2e93bf Create ISSUE_TEMPLATE.md 2018-11-22 14:53:49 +01:00
Pascal Vizeli
07c4058a8c Delete move.yml 2018-11-22 14:46:58 +01:00
Alastair D'Silva
b6f3938b14 Add support for the Orange Pi Prime (#829)
Signed-off-by: Alastair D'Silva <alastair@d-silva.org>
2018-11-21 17:03:25 +01:00
Pascal Vizeli
57534fac96 Bump version 141 2018-11-20 17:39:39 +01:00
Pascal Vizeli
4a03e72983 Merge pull request #827 from home-assistant/dev
Release 140
2018-11-20 17:39:12 +01:00
Pascal Vizeli
ddb29ea9b1 Speedup build 2018-11-20 17:17:04 +01:00
Pascal Vizeli
95179c30f7 Update Panel with new security functions (#826) 2018-11-20 17:13:55 +01:00
Pascal Vizeli
f49970ce2c Update .gitmodules 2018-11-20 12:25:45 +01:00
Pascal Vizeli
790818d1aa Update README.md 2018-11-20 10:56:19 +01:00
Pascal Vizeli
62f675e613 Fix documentation 2018-11-19 22:37:46 +01:00
Pascal Vizeli
f33434fb01 Downgrade discovery duplicate logging (#824) 2018-11-19 21:05:51 +01:00
Pascal Vizeli
254d6aee32 Small code cleanups (#822)
* Small code cleanups

* Update homeassistant.py
2018-11-19 16:44:21 +01:00
Pascal Vizeli
a5ecd597ed Add tests for add-ons map (#821) 2018-11-19 16:43:24 +01:00
Pascal Vizeli
0fab3e940a Merge pull request #820 from home-assistant/master
Master
2018-11-19 14:52:45 +01:00
Pascal Vizeli
60fbebc16b Rating add-on better they implement hass auth (#819)
* Rating add-on better they implement hass auth

* Update utils.py
2018-11-19 14:51:03 +01:00
Christian
ec366d8112 Provide options for legacy add-ons (#814)
* Provide options for legacy add-ons

* Remove whitespace from blank line

* Only provide primitive data types as Docker environment variable

* Fix linting issues

* Update addon.py
2018-11-19 12:05:12 +01:00
Christian
b8818788c9 Bugfix Add-on validate correct image url (#810)
* Bugfix Add-on validate correct image path

* Add tests for different add-on image urls
2018-11-18 19:29:23 +01:00
Pascal Vizeli
e23f6f6998 Update uvloop to version 0.11.3 (#818) 2018-11-18 12:08:59 +01:00
Pascal Vizeli
05b58d76b9 Add tests for hass.io (#817)
* Add tests for hass.io

* Fix folder

* Fix test command
2018-11-18 12:08:46 +01:00
Pascal Vizeli
644d13e3fa Bugfix Add-on validate on RO (#803) 2018-11-09 23:53:41 +01:00
Pascal Vizeli
9de71472d4 Remove links they are not needed 2018-11-09 10:26:01 +01:00
Pascal Vizeli
bf28227b91 Add developer guide 2018-11-09 10:25:29 +01:00
Pascal Vizeli
4c1ee49068 Bump version 140 2018-11-05 16:20:01 +01:00
Pascal Vizeli
6e7cf5e4c9 Merge pull request #796 from home-assistant/dev
Release 139
2018-11-05 16:19:17 +01:00
Pascal Vizeli
11f8c97347 Fix discovery update (#795)
* Update discovery.py

* Update discovery.py

* Update discovery.py

* Update discovery.py

* Update discovery.py

* Update discovery.py

* Update discovery.py
2018-11-05 14:59:57 +01:00
Pascal Vizeli
a1461fd518 Update requirements.txt 2018-11-05 13:53:16 +01:00
Pascal Vizeli
fa5c2e37d3 Discovery default config (#793) 2018-11-05 07:45:28 +01:00
luca-simonetti
1f091b20ad fix: use a different convention to handle multiple devices on same card (#767)
* fix: use a different convention to handle multiple devices on same card

* fix: use a different convention to handle multiple devices on same card

* Update alsa.py

* Update alsa.py
2018-11-02 10:47:25 +01:00
Pascal Vizeli
d3b4a03851 Catch exception on watchdog for pretty log (#778)
* Catch exception on watchdog for pretty log

* Update tasks.py
2018-10-29 16:40:19 +01:00
Jorim Tielemans
fb12fee59b Expand add-on installation error message (#783)
* Expand error message

Since an add-on is only available for certain machine and architecture combination we should log both.

* Update addon.py
2018-10-27 15:24:56 +02:00
Pascal Vizeli
7a87d2334a flake8 update to 3.6.0 (#777)
* flake8 update to 3.6.0

* fix lint
2018-10-27 15:23:26 +02:00
Pascal Vizeli
9591e71138 Update auth.py (#771) 2018-10-24 14:02:16 +02:00
Ville Skyttä
cecad526a2 Grammar and spelling fixes (#772) 2018-10-24 14:01:28 +02:00
Pascal Vizeli
53dab4ee45 Bump version 139 2018-10-16 12:52:19 +02:00
Pascal Vizeli
8abbba46c7 Merge pull request #766 from home-assistant/dev
Release 138
2018-10-16 12:51:47 +02:00
Pascal Vizeli
0f01ac1b59 Fix syntax 2018-10-16 12:45:06 +02:00
Pascal Vizeli
aa8ab593c0 Rename login_backend to auth_api (#764)
* Update const.py

* Update validate.py

* Update addon.py

* Update auth.py

* Update addons.py

* Update API.md
2018-10-16 12:33:40 +02:00
Pascal Vizeli
84f791220e Don't clean cache on fake auth (#765)
* Don't clean cache on fake auth

* Update auth.py
2018-10-16 12:30:24 +02:00
Pascal Vizeli
cee2c5469f Bump version 138 2018-10-15 15:25:29 +02:00
Pascal Vizeli
6e75964a8b Merge pull request #761 from home-assistant/dev
Release 137
2018-10-15 15:25:05 +02:00
Pascal Vizeli
5ab5036504 Fix proxy handling with failing connection (#760)
* Fix proxy handling with failing connection

* fix lint

* Fix exception handling

* clenaup error handling

* Fix type error

* Fix event stream

* Fix stream handling

* Fix

* Fix lint

* Handle

* Update proxy.py

* fix lint
2018-10-15 13:01:52 +02:00
Pascal Vizeli
000a3c1f7e Bump to 137 2018-10-12 14:39:47 +02:00
Pascal Vizeli
8ea123eb94 Merge pull request #754 from home-assistant/dev
Release 136
2018-10-12 14:39:18 +02:00
Pascal Vizeli
571c42ef7d Create role for backup add-ons (#755)
* Create role for backup add-ons

* Update validate.py

* Update security.py
2018-10-12 12:48:12 +02:00
Pascal Vizeli
8443da0b9f Add-on SSO support with Home Assistant auth system (#752)
* Create auth.py

* Finish auth cache

* Add documentation

* Add valid schema

* Update auth.py

* Update auth.py

* Update security.py

* Create auth.py

* Update coresys.py

* Update bootstrap.py

* Update const.py

* Update validate.py

* Update const.py

* Update addon.py

* Update auth.py

* Update __init__.py

* Update auth.py

* Update auth.py

* Update auth.py

* Update const.py

* Update auth.py

* Update auth.py

* Update auth.py

* Update validate.py

* Update coresys.py

* Update auth.py

* Update auth.py

* more security

* Update API.md

* Update auth.py

* Update auth.py

* Update auth.py

* Update auth.py

* Update auth.py

* Update homeassistant.py

* Update homeassistant.py
2018-10-12 12:21:48 +02:00
Pascal Vizeli
7dbbcf24c8 Check exists hardware for audio/gpio devices (#753)
* Update hardware.py

* Update addon.py

* Update hardware.py

* Update addon.py
2018-10-12 10:22:58 +02:00
Pascal Vizeli
468cb0c36b Rename info (#750)
* Rename version to info

* fix security
2018-10-10 16:46:34 +02:00
Pascal Vizeli
78e093df96 Bump version 136 2018-10-09 17:10:25 +02:00
Pascal Vizeli
ec4d7dab21 Merge pull request #749 from home-assistant/dev
Release 135
2018-10-09 17:08:19 +02:00
Pascal Vizeli
d00ee0adea Add hostname into version API call (#748) 2018-10-09 15:40:44 +02:00
Pascal Vizeli
55d5ee4ed4 Merge pull request #747 from mbo18/patch-1
Add missing tinker board
2018-10-09 14:00:44 +02:00
mbo18
0e51d74265 Add missing tinker board 2018-10-09 09:29:38 +02:00
Pascal Vizeli
916f3caedd Bump version 135 2018-10-08 00:21:59 +02:00
Pascal Vizeli
ff80ccce64 Merge pull request #745 from home-assistant/dev
Release 134
2018-10-08 00:20:20 +02:00
Pascal Vizeli
23f28b38e9 small code cleanups (#740)
* small code cleanups

* Update __init__.py

* Update homeassistant.py

* Update __init__.py

* Update homeassistant.py

* Update homeassistant.py

* Update __init__.py

* fix list

* Fix api call
2018-10-07 23:50:18 +02:00
Franck Nijhof
da425a0530 Adds support for privilege DAC_READ_SEARCH (#743)
* Adds support for privilege DAC_READ_SEARCH

* 🚑 Fixes security rating regarding privileges
2018-10-07 19:17:06 +02:00
Jorim Tielemans
79dca1608e Fix machine 'odroid-c2' (#744)
Odroid-cu2 does not exist AFAIK, it needs to be c2.
2018-10-07 19:16:29 +02:00
Pascal Vizeli
33b615e40d Fix manager access to /addons (#738) 2018-10-05 13:48:29 +02:00
Pascal Vizeli
c825c40c4d Bump version 134 2018-10-01 19:07:48 +02:00
Pascal Vizeli
8beb723cc2 Merge pull request #736 from home-assistant/dev
Release 133
2018-10-01 19:07:17 +02:00
Pascal Vizeli
94fd24c251 Bugfix message handling (#735) 2018-10-01 18:57:31 +02:00
Pascal Vizeli
bf75a8a439 Cleanup discovery data (#734)
* Cleanup discovery data

* Update API.md

* Update validate.py

* Update discovery.py

* Update const.py
2018-10-01 16:17:46 +02:00
Pascal Vizeli
36cdb05387 Don't allow add-on to update itself (#733) 2018-10-01 15:22:26 +02:00
Pascal Vizeli
dccc652d42 Bump version 133 2018-09-30 20:16:42 +02:00
134 changed files with 4620 additions and 3085 deletions

.github/ISSUE_TEMPLATE.md (new file, 29 lines)

@@ -0,0 +1,29 @@
<!-- READ THIS FIRST:
- If you need additional help with this template please refer to https://www.home-assistant.io/help/reporting_issues/
- Make sure you are running the latest version of Home Assistant before reporting an issue: https://github.com/home-assistant/home-assistant/releases
- Do not report issues for components here, please refer to https://github.com/home-assistant/home-assistant/issues
- This is for bugs only. Feature and enhancement requests should go in our community forum: https://community.home-assistant.io/c/feature-requests
- Provide as many details as possible. Paste logs, configuration sample and code into the backticks. Do not delete any text from this template!
- If you have a problem with an Add-on, make an issue on its repository.
-->
**Home Assistant release with the issue:**
<!--
- Frontend -> Developer tools -> Info
- Or use this command: hass --version
-->
**Operating environment (HassOS/Generic):**
<!--
Please provide details about your environment.
-->
**Supervisor logs:**
<!--
- Frontend -> Hass.io -> System
- Or use this command: hassio su logs
-->
**Description of problem:**

.gitmodules (1 line changed)

@@ -1,3 +1,4 @@
 [submodule "home-assistant-polymer"]
 	path = home-assistant-polymer
 	url = https://github.com/home-assistant/home-assistant-polymer
+	branch = dev

.travis.yml (deleted)

@@ -1,6 +0,0 @@
sudo: true
dist: xenial
install: pip install -U tox
language: python
python: 3.7
script: tox

API.md (71 lines changed)

@@ -22,7 +22,7 @@ On success / Code 200:
 }
 ```
-For access to API you need set the `X-HASSIO-KEY` they will be available for Add-ons/HomeAssistant with envoriment `HASSIO_TOKEN`.
+For access to API you need set the `X-HASSIO-KEY` they will be available for Add-ons/HomeAssistant with environment `HASSIO_TOKEN`.
 ### Hass.io
@@ -41,6 +41,7 @@ The addons from `addons` are only installed one.
 "arch": "armhf|aarch64|i386|amd64",
 "channel": "stable|beta|dev",
 "timezone": "TIMEZONE",
+"ip_address": "ip address",
 "wait_boot": "int",
 "addons": [
 {
@@ -314,9 +315,10 @@ Load host configs from a USB stick.
 "CARD_ID": {
 "name": "xy",
 "type": "microphone",
-"devices": {
-"DEV_ID": "type of device"
-}
+"devices": [
+"chan_id": "channel ID",
+"chan_type": "type of device"
+]
 }
 }
 }
@@ -345,14 +347,16 @@ Load host configs from a USB stick.
 {
 "version": "INSTALL_VERSION",
 "last_version": "LAST_VERSION",
+"arch": "arch",
 "machine": "Image machine type",
+"ip_address": "ip address",
 "image": "str",
 "custom": "bool -> if custom image",
 "boot": "bool",
 "port": 8123,
 "ssl": "bool",
 "watchdog": "bool",
-"startup_time": 600
+"wait_boot": 600
 }
 ```
@@ -374,6 +378,7 @@ Output is the raw Docker log.
 - POST `/homeassistant/check`
 - POST `/homeassistant/start`
 - POST `/homeassistant/stop`
+- POST `/homeassistant/rebuild`
 - POST `/homeassistant/options`
@@ -386,7 +391,7 @@ Output is the raw Docker log.
 "password": "",
 "refresh_token": "",
 "watchdog": "bool",
-"startup_time": 600
+"wait_boot": 600
 }
 ```
@@ -415,7 +420,7 @@ Proxy to real websocket instance.
 ### RESTful for API addons
-If a add-on will call itself, you can use `/addons/self/...`.
+If an add-on will call itself, you can use `/addons/self/...`.
 - GET `/addons`
@@ -466,6 +471,7 @@ Get all available addons.
 "available": "bool",
 "arch": ["armhf", "aarch64", "i386", "amd64"],
 "machine": "[raspberrypi2, tinker]",
+"homeassistant": "null|min Home Assistant version",
 "repository": "12345678|null",
 "version": "null|VERSION_INSTALLED",
 "last_version": "LAST_VERSION",
@@ -488,19 +494,25 @@ Get all available addons.
 "hassio_api": "bool",
 "hassio_role": "default|homeassistant|manager|admin",
 "homeassistant_api": "bool",
+"auth_api": "bool",
 "full_access": "bool",
 "protected": "bool",
 "rating": "1-6",
 "stdin": "bool",
 "webui": "null|http(s)://[HOST]:port/xy/zx",
 "gpio": "bool",
+"kernel_modules": "bool",
 "devicetree": "bool",
 "docker_api": "bool",
 "audio": "bool",
 "audio_input": "null|0,0",
 "audio_output": "null|0,0",
 "services_role": "['service:access']",
-"discovery": "['service']"
+"discovery": "['service']",
+"ip_address": "ip address",
+"ingress": "bool",
+"ingress_entry": "null|/api/hassio_ingress/slug",
+"ingress_url": "null|/api/hassio_ingress/slug/entry.html"
 }
 ```
@@ -574,7 +586,24 @@ Write data to add-on stdin
 }
 ```
-### Service discovery
+### ingress
+- POST `/ingress/session`
+Create a new Session for access to ingress service.
+```json
+{
+  "session": "token"
+}
+```
+- VIEW `/ingress/{token}`
+Ingress WebUI for this Add-on. The addon need support HASS Auth!
+Need ingress session as cookie.
+### discovery
 - GET `/discovery`
 ```json
@@ -584,8 +613,6 @@ Write data to add-on stdin
 "addon": "slug",
 "service": "name",
 "uuid": "uuid",
-"component": "component",
-"platform": "null|platform",
 "config": {}
 }
 ]
@@ -598,8 +625,6 @@ Write data to add-on stdin
 "addon": "slug",
 "service": "name",
 "uuid": "uuid",
-"component": "component",
-"platform": "null|platform",
 "config": {}
 }
 ```
@@ -608,8 +633,6 @@ Write data to add-on stdin
 ```json
 {
 "service": "name",
-"component": "component",
-"platform": "null|platform",
 "config": {}
 }
 ```
@@ -623,6 +646,8 @@ return:
 - DEL `/discovery/{UUID}`
+### Services
 - GET `/services`
 ```json
 {
@@ -667,14 +692,28 @@ return:
 ### Misc
-- GET `/version`
+- GET `/info`
 ```json
 {
 "supervisor": "version",
 "homeassistant": "version",
 "hassos": "null|version",
+"hostname": "name",
 "machine": "type",
 "arch": "arch",
+"supported_arch": ["arch1", "arch2"],
 "channel": "stable|beta|dev"
 }
 ```
+### Auth / SSO API
+You can use the user system on homeassistant. We handle this auth system on
+supervisor.
+You can call post `/auth`
+We support:
+- Json `{ "user|name": "...", "password": "..." }`
+- application/x-www-form-urlencoded `user|name=...&password=...`
+- BasicAuth
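Taken together, the `/auth` and ingress additions above describe a flow: authenticate against the Home Assistant user system, create an ingress session, then pass the session token as a cookie. A minimal offline sketch of the payloads involved; the `username` field name and the `ingress_session` cookie name are assumptions (the doc only says `user|name` and "session as cookie"):

```python
import base64
import json
from urllib.parse import urlencode


def auth_json_body(username: str, password: str) -> bytes:
    """JSON variant of the POST /auth credentials."""
    return json.dumps({"username": username, "password": password}).encode()


def auth_form_body(username: str, password: str) -> bytes:
    """application/x-www-form-urlencoded variant of the same credentials."""
    return urlencode({"username": username, "password": password}).encode()


def auth_basic_header(username: str, password: str) -> dict:
    """BasicAuth variant: credentials travel in the Authorization header."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}


def ingress_cookie(session_token: str) -> dict:
    """Cookie header carrying the token returned by POST /ingress/session.

    The cookie name used here is an assumption, not stated in the doc.
    """
    return {"Cookie": f"ingress_session={session_token}"}
```

Any HTTP client can then send one of these bodies or headers to the supervisor, alongside the `X-HASSIO-KEY` header described at the top of API.md.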

Dockerfile

@@ -3,6 +3,9 @@ FROM $BUILD_FROM
# Install base
 RUN apk add --no-cache \
+    openssl \
+    libffi \
+    musl \
     git \
     socat \
     glib \
@@ -14,6 +17,10 @@ COPY requirements.txt /usr/src/
 RUN apk add --no-cache --virtual .build-dependencies \
     make \
     g++ \
+    openssl-dev \
+    libffi-dev \
+    musl-dev \
+    && export MAKEFLAGS="-j$(nproc)" \
     && pip3 install --no-cache-dir -r /usr/src/requirements.txt \
     && apk del .build-dependencies \
     && rm -f /usr/src/requirements.txt

README.md

@@ -10,9 +10,19 @@ and updating software.
 ![](misc/hassio.png?raw=true)
-- [Hass.io Addons](https://github.com/home-assistant/hassio-addons)
-- [Hass.io Build](https://github.com/home-assistant/hassio-build)
 ## Installation
 Installation instructions can be found at <https://home-assistant.io/hassio>.
+## Development
+The development of the supervisor is a bit tricky. Not difficult but tricky.
+- You can use the builder to build your supervisor: https://github.com/home-assistant/hassio-build/tree/master/builder
+- Go into a HassOS device or VM and pull your supervisor.
+- Set the developer modus on updater.json
+- Tag it as `homeassistant/xy-hassio-supervisor:latest`
+- Restart the service like `systemctl restart hassos-supervisor | journalctl -fu hassos-supervisor`
+- Test your changes
+Small Bugfix or improvements, make a PR. Significant change makes first an RFC.

azure-pipelines.yml (new file, 45 lines)

@@ -0,0 +1,45 @@
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/python

trigger:
- master
- dev

pr:
- dev

jobs:
- job: "Tox"
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - task: UsePythonVersion@0
    displayName: 'Use Python $(python.version)'
    inputs:
      versionSpec: '3.7'
  - script: pip install tox
    displayName: 'Install Tox'
  - script: tox
    displayName: 'Run Tox'
- job: "JQ"
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - script: sudo apt-get install -y jq
    displayName: 'Install JQ'
  - bash: |
      shopt -s globstar
      cat **/*.json | jq '.'
    displayName: 'Run JQ'
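The `JQ` job above is just a validity check over every JSON file in the repository. A rough Python equivalent of `cat **/*.json | jq '.'` (the helper name is hypothetical, not part of the pipeline):

```python
import json
from pathlib import Path


def check_json_files(root: str) -> list:
    """Return the paths under root that fail to parse as JSON."""
    bad = []
    # "**/*.json" walks the tree recursively, like bash's globstar
    for path in sorted(Path(root).glob("**/*.json")):
        try:
            json.loads(path.read_text())
        except (ValueError, OSError):
            bad.append(str(path))
    return bad
```

Unlike the pipeline's `jq` one-liner, this collects every failing path instead of stopping at the first parse error.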

hassio/__main__.py

@@ -9,21 +9,26 @@ from hassio import bootstrap
 _LOGGER = logging.getLogger(__name__)
-def attempt_use_uvloop():
+def initialize_event_loop():
     """Attempt to use uvloop."""
     try:
         import uvloop
-        asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
+        uvloop.install()
     except ImportError:
         pass
+    return asyncio.get_event_loop()
 # pylint: disable=invalid-name
 if __name__ == "__main__":
     bootstrap.initialize_logging()
-    attempt_use_uvloop()
-    loop = asyncio.get_event_loop()
+    # Init async event loop
+    loop = initialize_event_loop()
+    # Check if all information are available to setup Hass.io
     if not bootstrap.check_environment():
         sys.exit(1)
@@ -32,7 +37,7 @@ if __name__ == "__main__":
     loop.set_default_executor(executor)
     _LOGGER.info("Initialize Hass.io setup")
-    coresys = bootstrap.initialize_coresys(loop)
+    coresys = loop.run_until_complete(bootstrap.initialize_coresys())
     bootstrap.migrate_system_env(coresys)
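The new side of the hunk above stands alone: `initialize_event_loop()` prefers uvloop when it is importable and silently falls back to the stdlib loop otherwise. A runnable restatement:

```python
import asyncio


def initialize_event_loop():
    """Attempt to use uvloop, falling back to the default asyncio loop."""
    try:
        import uvloop
        uvloop.install()  # installs the uvloop event-loop policy
    except ImportError:
        pass  # uvloop not available: keep the stdlib policy
    return asyncio.get_event_loop()
```

`uvloop.install()` is shorthand for the `asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())` call it replaces in the diff; both set the policy before the loop is created.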

hassio/addons/addon.py

@@ -1,41 +1,105 @@
 """Init file for Hass.io add-ons."""
 from contextlib import suppress
 from copy import deepcopy
+from distutils.version import StrictVersion
+from ipaddress import IPv4Address, ip_address
 import logging
-import json
 from pathlib import Path, PurePath
 import re
+import secrets
 import shutil
 import tarfile
 from tempfile import TemporaryDirectory
+from typing import Any, Awaitable, Dict, Optional

 import voluptuous as vol
 from voluptuous.humanize import humanize_error

-from .validate import (
-    validate_options, SCHEMA_ADDON_SNAPSHOT, RE_VOLUME, RE_SERVICE,
-    MACHINE_ALL)
-from .utils import check_installed, remove_data
 from ..const import (
-    ATTR_NAME, ATTR_VERSION, ATTR_SLUG, ATTR_DESCRIPTON, ATTR_BOOT, ATTR_MAP,
-    ATTR_OPTIONS, ATTR_PORTS, ATTR_SCHEMA, ATTR_IMAGE, ATTR_REPOSITORY,
-    ATTR_URL, ATTR_ARCH, ATTR_LOCATON, ATTR_DEVICES, ATTR_ENVIRONMENT,
-    ATTR_HOST_NETWORK, ATTR_TMPFS, ATTR_PRIVILEGED, ATTR_STARTUP, ATTR_UUID,
-    STATE_STARTED, STATE_STOPPED, STATE_NONE, ATTR_USER, ATTR_SYSTEM,
-    ATTR_STATE, ATTR_TIMEOUT, ATTR_AUTO_UPDATE, ATTR_NETWORK, ATTR_WEBUI,
-    ATTR_HASSIO_API, ATTR_AUDIO, ATTR_AUDIO_OUTPUT, ATTR_AUDIO_INPUT,
-    ATTR_GPIO, ATTR_HOMEASSISTANT_API, ATTR_STDIN, ATTR_LEGACY, ATTR_HOST_IPC,
-    ATTR_HOST_DBUS, ATTR_AUTO_UART, ATTR_DISCOVERY, ATTR_SERVICES,
-    ATTR_APPARMOR, ATTR_DEVICETREE, ATTR_DOCKER_API, ATTR_FULL_ACCESS,
-    ATTR_PROTECTED, ATTR_ACCESS_TOKEN, ATTR_HOST_PID, ATTR_HASSIO_ROLE,
-    ATTR_MACHINE,
-    SECURITY_PROFILE, SECURITY_DISABLE, SECURITY_DEFAULT)
-from ..coresys import CoreSysAttributes
+    ATTR_ACCESS_TOKEN,
+    ATTR_APPARMOR,
+    ATTR_ARCH,
+    ATTR_AUDIO,
+    ATTR_AUDIO_INPUT,
+    ATTR_AUDIO_OUTPUT,
+    ATTR_AUTH_API,
+    ATTR_AUTO_UART,
+    ATTR_AUTO_UPDATE,
+    ATTR_BOOT,
+    ATTR_DESCRIPTON,
+    ATTR_DEVICES,
+    ATTR_DEVICETREE,
+    ATTR_DISCOVERY,
+    ATTR_DOCKER_API,
+    ATTR_ENVIRONMENT,
+    ATTR_FULL_ACCESS,
+    ATTR_GPIO,
+    ATTR_HASSIO_API,
+    ATTR_HASSIO_ROLE,
+    ATTR_HOMEASSISTANT,
+    ATTR_HOMEASSISTANT_API,
+    ATTR_HOST_DBUS,
+    ATTR_HOST_IPC,
+    ATTR_HOST_NETWORK,
+    ATTR_HOST_PID,
+    ATTR_IMAGE,
+    ATTR_INGRESS,
+    ATTR_INGRESS_ENTRY,
+    ATTR_INGRESS_PORT,
+    ATTR_INGRESS_TOKEN,
+    ATTR_KERNEL_MODULES,
+    ATTR_LEGACY,
+    ATTR_LOCATON,
+    ATTR_MACHINE,
+    ATTR_MAP,
+    ATTR_NAME,
+    ATTR_NETWORK,
+    ATTR_OPTIONS,
+    ATTR_PORTS,
+    ATTR_PRIVILEGED,
+    ATTR_PROTECTED,
+    ATTR_REPOSITORY,
+    ATTR_SCHEMA,
+    ATTR_SERVICES,
+    ATTR_SLUG,
+    ATTR_STARTUP,
+    ATTR_STATE,
+    ATTR_STDIN,
+    ATTR_SYSTEM,
+    ATTR_TIMEOUT,
+    ATTR_TMPFS,
+    ATTR_URL,
+    ATTR_USER,
+    ATTR_UUID,
+    ATTR_VERSION,
+    ATTR_WEBUI,
+    SECURITY_DEFAULT,
+    SECURITY_DISABLE,
+    SECURITY_PROFILE,
+    STATE_NONE,
+    STATE_STARTED,
+    STATE_STOPPED,
+)
+from ..coresys import CoreSys, CoreSysAttributes
 from ..docker.addon import DockerAddon
-from ..utils import create_token
-from ..utils.json import write_json_file, read_json_file
+from ..docker.stats import DockerStats
+from ..exceptions import (
+    AddonsError,
+    AddonsNotSupportedError,
+    DockerAPIError,
+    HostAppArmorError,
+    JsonFileError,
+)
 from ..utils.apparmor import adjust_profile
-from ..exceptions import HostAppArmorError
+from ..utils.json import read_json_file, write_json_file
+from .utils import check_installed, remove_data
+from .validate import (
+    MACHINE_ALL,
+    RE_SERVICE,
+    RE_VOLUME,
+    SCHEMA_ADDON_SNAPSHOT,
+    validate_options,
+)

 _LOGGER = logging.getLogger(__name__)
@@ -47,20 +111,28 @@ RE_WEBUI = re.compile(
 class Addon(CoreSysAttributes):
     """Hold data for add-on inside Hass.io."""

-    def __init__(self, coresys, slug):
+    def __init__(self, coresys: CoreSys, slug: str):
         """Initialize data holder."""
-        self.coresys = coresys
-        self.instance = DockerAddon(coresys, slug)
-        self._id = slug
+        self.coresys: CoreSys = coresys
+        self.instance: DockerAddon = DockerAddon(coresys, slug)
+        self._id: str = slug

-    async def load(self):
+    async def load(self) -> None:
         """Async initialize of object."""
-        if self.is_installed:
+        if not self.is_installed:
+            return
+
+        with suppress(DockerAPIError):
             await self.instance.attach()

     @property
-    def slug(self):
+    def ip_address(self) -> IPv4Address:
+        """Return IP of Add-on instance."""
+        if not self.is_installed:
+            return ip_address("0.0.0.0")
+        return self.instance.ip_address
+
+    @property
+    def slug(self) -> str:
         """Return slug/id of add-on."""
         return self._id
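The new `ip_address` property returns a typed `0.0.0.0` placeholder instead of `None` when the add-on is not installed. The stdlib behavior it relies on, in isolation:

```python
from ipaddress import IPv4Address, ip_address

# ip_address() parses a string into an IPv4Address/IPv6Address object,
# so uninstalled add-ons report a well-typed placeholder instead of None.
fallback = ip_address("0.0.0.0")
assert isinstance(fallback, IPv4Address)
assert fallback.is_unspecified  # 0.0.0.0 is the "unspecified" address
```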
@@ -75,55 +147,76 @@ class Addon(CoreSysAttributes):
         return self.sys_addons.data

     @property
-    def is_installed(self):
+    def is_installed(self) -> bool:
         """Return True if an add-on is installed."""
         return self._id in self._data.system

     @property
-    def is_detached(self):
+    def is_detached(self) -> bool:
         """Return True if add-on is detached."""
         return self._id not in self._data.cache

     @property
-    def available(self):
+    def available(self) -> bool:
         """Return True if this add-on is available on this platform."""
-        if self.sys_arch not in self.supported_arch:
+        if self.is_detached:
+            addon_data = self._data.system.get(self._id)
+        else:
+            addon_data = self._data.cache.get(self._id)
+
+        # Architecture
+        if not self.sys_arch.is_supported(addon_data[ATTR_ARCH]):
             return False
-        if self.sys_machine not in self.supported_machine:
+
+        # Machine / Hardware
+        machine = addon_data.get(ATTR_MACHINE) or MACHINE_ALL
+        if self.sys_machine not in machine:
             return False
+
+        # Home Assistant
+        version = addon_data.get(ATTR_HOMEASSISTANT) or self.sys_homeassistant.version
+        if StrictVersion(self.sys_homeassistant.version) < StrictVersion(version):
+            return False
+
         return True

     @property
-    def version_installed(self):
+    def version_installed(self) -> Optional[str]:
         """Return installed version."""
         return self._data.user.get(self._id, {}).get(ATTR_VERSION)

-    def _set_install(self, version):
+    def _set_install(self, image: str, version: str) -> None:
         """Set addon as installed."""
         self._data.system[self._id] = deepcopy(self._data.cache[self._id])
         self._data.user[self._id] = {
             ATTR_OPTIONS: {},
             ATTR_VERSION: version,
+            ATTR_IMAGE: image,
         }
-        self._data.save_data()
+        self.save_data()

-    def _set_uninstall(self):
+    def _set_uninstall(self) -> None:
         """Set add-on as uninstalled."""
         self._data.system.pop(self._id, None)
         self._data.user.pop(self._id, None)
-        self._data.save_data()
+        self.save_data()

-    def _set_update(self, version):
+    def _set_update(self, image: str, version: str) -> None:
         """Update version of add-on."""
         self._data.system[self._id] = deepcopy(self._data.cache[self._id])
-        self._data.user[self._id][ATTR_VERSION] = version
-        self._data.save_data()
+        self._data.user[self._id].update({
+            ATTR_VERSION: version,
+            ATTR_IMAGE: image,
+        })
+        self.save_data()

-    def _restore_data(self, user, system):
+    def _restore_data(self, user: Dict[str, Any], system: Dict[str, Any], image: str) -> None:
         """Restore data to add-on."""
         self._data.user[self._id] = deepcopy(user)
         self._data.system[self._id] = deepcopy(system)
-        self._data.save_data()
+
+        self._data.user[self._id][ATTR_IMAGE] = image
+        self.save_data()

     @property
     def options(self):
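The reworked `available` property adds a minimum Home Assistant version gate using `distutils.version.StrictVersion`. A simplified stand-in for the comparison (plain tuple parsing, without `StrictVersion`'s pre-release handling — used here only to illustrate the gate):

```python
def parse(version: str):
    """Parse a dotted release version into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))


def meets_min_version(current: str, required: str) -> bool:
    """Mirror the gate in Addon.available: reject if current < required."""
    return not parse(current) < parse(required)


# An add-on requiring Home Assistant 0.91.0 is unavailable on 0.90.2:
assert meets_min_version("0.91.1", "0.91.0")
assert not meets_min_version("0.90.2", "0.91.0")
```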
@@ -191,6 +284,20 @@ class Addon(CoreSysAttributes):
             return self._data.user[self._id].get(ATTR_ACCESS_TOKEN)
         return None

+    @property
+    def ingress_token(self):
+        """Return access token for Hass.io API."""
+        if self.is_installed:
+            return self._data.user[self._id].get(ATTR_INGRESS_TOKEN)
+        return None
+
+    @property
+    def ingress_entry(self):
+        """Return ingress external URL."""
+        if self.is_installed and self.with_ingress:
+            return f"/api/hassio_ingress/{self.ingress_token}"
+        return None
+
     @property
     def description(self):
         """Return description of add-on."""
@@ -281,6 +388,17 @@ class Addon(CoreSysAttributes):
         self._data.user[self._id][ATTR_NETWORK] = new_ports

+    @property
+    def ingress_url(self):
+        """Return URL to ingress url."""
+        if not self.is_installed or not self.with_ingress:
+            return None
+
+        webui = f"/api/hassio_ingress/{self.ingress_token}/"
+        if ATTR_INGRESS_ENTRY in self._mesh:
+            return f"{webui}{self._mesh[ATTR_INGRESS_ENTRY]}"
+        return webui
+
     @property
     def webui(self):
         """Return URL to webui or None."""
@@ -312,6 +430,11 @@ class Addon(CoreSysAttributes):
         return f"{proto}://[HOST]:{port}{s_suffix}"

+    @property
+    def ingress_internal(self):
+        """Return Ingress host URL."""
+        return f"http://{self.ip_address}:{self._mesh[ATTR_INGRESS_PORT]}"
+
     @property
     def host_network(self):
         """Return True if add-on run on host network."""
@@ -355,7 +478,7 @@ class Addon(CoreSysAttributes):
     @property
     def privileged(self):
         """Return list of privilege."""
-        return self._mesh.get(ATTR_PRIVILEGED)
+        return self._mesh.get(ATTR_PRIVILEGED, [])

     @property
     def apparmor(self):
@@ -396,11 +519,21 @@ class Addon(CoreSysAttributes):
         """Return True if the add-on access use stdin input."""
         return self._mesh[ATTR_STDIN]

+    @property
+    def with_ingress(self):
+        """Return True if the add-on access support ingress."""
+        return self._mesh[ATTR_INGRESS]
+
     @property
     def with_gpio(self):
         """Return True if the add-on access to GPIO interface."""
         return self._mesh[ATTR_GPIO]

+    @property
+    def with_kernel_modules(self):
+        """Return True if the add-on access to kernel modules."""
+        return self._mesh[ATTR_KERNEL_MODULES]
+
     @property
     def with_full_access(self):
         """Return True if the add-on want full access to hardware."""
@@ -411,11 +544,21 @@ class Addon(CoreSysAttributes):
         """Return True if the add-on read access to devicetree."""
         return self._mesh[ATTR_DEVICETREE]

+    @property
+    def access_auth_api(self):
+        """Return True if the add-on access to login/auth backend."""
+        return self._mesh[ATTR_AUTH_API]
+
     @property
     def with_audio(self):
         """Return True if the add-on access to audio."""
         return self._mesh[ATTR_AUDIO]

+    @property
+    def homeassistant_version(self) -> Optional[str]:
+        """Return min Home Assistant version they needed by Add-on."""
+        return self._mesh.get(ATTR_HOMEASSISTANT)
+
     @property
     def audio_output(self):
         """Return ALSA config for output or None."""
@@ -486,21 +629,37 @@ class Addon(CoreSysAttributes):
     @property
     def image(self):
         """Return image name of add-on."""
-        addon_data = self._mesh
+        if self.is_installed:
+            return self._data.user[self._id].get(ATTR_IMAGE)
+        return self.image_next
+
+    @property
+    def image_next(self):
+        """Return image name for install/update."""
+        if self.is_detached:
+            addon_data = self._data.system.get(self._id)
+        else:
+            addon_data = self._data.cache.get(self._id)

+        return self._get_image(addon_data)
+
+    def _get_image(self, addon_data) -> str:
+        """Generate image name from data."""
         # Repository with Dockerhub images
         if ATTR_IMAGE in addon_data:
-            return addon_data[ATTR_IMAGE].format(arch=self.sys_arch)
+            arch = self.sys_arch.match(addon_data[ATTR_ARCH])
+            return addon_data[ATTR_IMAGE].format(arch=arch)

         # local build
-        return "{}/{}-addon-{}".format(
-            addon_data[ATTR_REPOSITORY], self.sys_arch,
-            addon_data[ATTR_SLUG])
+        return (f"{addon_data[ATTR_REPOSITORY]}/"
+                f"{self.sys_arch.default}-"
+                f"addon-{addon_data[ATTR_SLUG]}")

     @property
     def need_build(self):
         """Return True if this add-on need a local build."""
-        return ATTR_IMAGE not in self._mesh
+        if self.is_detached:
+            return ATTR_IMAGE not in self._data.system.get(self._id)
+        return ATTR_IMAGE not in self._data.cache.get(self._id)

     @property
     def map_volumes(self):
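`_get_image` produces either a Docker Hub image name (with the `{arch}` placeholder substituted) or a local-build name of the form `<repository>/<arch>-addon-<slug>`. A standalone sketch of both branches — the plain string keys here stand in for the `ATTR_*` constants:

```python
def get_image(addon_data: dict, arch: str) -> str:
    """Mirror Addon._get_image: Dockerhub image or local build name."""
    if "image" in addon_data:
        # Repository with Dockerhub images: substitute the arch placeholder
        return addon_data["image"].format(arch=arch)
    # local build: <repo>/<arch>-addon-<slug>
    return f"{addon_data['repository']}/{arch}-addon-{addon_data['slug']}"


assert get_image({"image": "acme/{arch}-tool"}, "armv7") == "acme/armv7-tool"
assert get_image({"repository": "local", "slug": "mqtt"}, "amd64") == "local/amd64-addon-mqtt"
```

Storing the resolved name in user data (via the new `_set_install(image, version)`) is what lets `image` and `image_next` diverge across an update.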
@@ -577,8 +736,8 @@ class Addon(CoreSysAttributes):
         except vol.Invalid as ex:
             _LOGGER.error("Add-on %s have wrong options: %s", self._id,
                           humanize_error(options, ex))
-        except (OSError, json.JSONDecodeError) as err:
-            _LOGGER.error("Add-on %s can't write options: %s", self._id, err)
+        except JsonFileError:
+            _LOGGER.error("Add-on %s can't write options", self._id)
         else:
             return True
@@ -605,7 +764,7 @@ class Addon(CoreSysAttributes):
         return True

-    async def _install_apparmor(self):
+    async def _install_apparmor(self) -> None:
         """Install or Update AppArmor profile for Add-on."""
         exists_local = self.sys_host.apparmor.exists(self.slug)
         exists_addon = self.path_apparmor.exists()
@@ -627,7 +786,7 @@ class Addon(CoreSysAttributes):
         await self.sys_host.apparmor.load_profile(self.slug, profile_file)

     @property
-    def schema(self):
+    def schema(self) -> vol.Schema:
         """Create a schema for add-on options."""
         raw_schema = self._mesh[ATTR_SCHEMA]
@@ -635,7 +794,7 @@ class Addon(CoreSysAttributes):
             return vol.Schema(dict)
         return vol.Schema(vol.All(dict, validate_options(raw_schema)))

-    def test_update_schema(self):
+    def test_update_schema(self) -> bool:
         """Check if the existing configuration is valid after update."""
         if not self.is_installed or self.is_detached:
             return True
@@ -665,16 +824,17 @@ class Addon(CoreSysAttributes):
             return False
         return True

-    async def install(self):
+    async def install(self) -> None:
         """Install an add-on."""
         if not self.available:
             _LOGGER.error(
-                "Add-on %s not supported on %s", self._id, self.sys_arch)
-            return False
+                "Add-on %s not supported on %s with %s architecture",
+                self._id, self.sys_machine, self.sys_arch.supported)
+            raise AddonsNotSupportedError()

         if self.is_installed:
-            _LOGGER.error("Add-on %s is already installed", self._id)
-            return False
+            _LOGGER.warning("Add-on %s is already installed", self._id)
+            return

         if not self.path_data.is_dir():
             _LOGGER.info(
@@ -684,17 +844,20 @@ class Addon(CoreSysAttributes):
         # Setup/Fix AppArmor profile
         await self._install_apparmor()

-        if not await self.instance.install(self.last_version):
-            return False
-
-        self._set_install(self.last_version)
-        return True
+        try:
+            await self.instance.install(self.last_version, self.image_next)
+        except DockerAPIError:
+            raise AddonsError() from None
+        else:
+            self._set_install(self.image_next, self.last_version)

     @check_installed
-    async def uninstall(self):
+    async def uninstall(self) -> None:
         """Remove an add-on."""
-        if not await self.instance.remove():
-            return False
+        try:
+            await self.instance.remove()
+        except DockerAPIError:
+            raise AddonsError() from None

         if self.path_data.is_dir():
             _LOGGER.info(
@@ -711,13 +874,11 @@ class Addon(CoreSysAttributes):
         with suppress(HostAppArmorError):
             await self.sys_host.apparmor.remove_profile(self.slug)

-        # Remove discovery messages
+        # Cleanup internal data
         self.remove_discovery()
         self._set_uninstall()
-        return True

-    async def state(self):
+    async def state(self) -> str:
         """Return running state of add-on."""
         if not self.is_installed:
             return STATE_NONE
@@ -727,46 +888,58 @@ class Addon(CoreSysAttributes):
         return STATE_STOPPED

     @check_installed
-    async def start(self):
+    async def start(self) -> None:
         """Set options and start add-on."""
         if await self.instance.is_running():
             _LOGGER.warning("%s already running!", self.slug)
             return

         # Access Token
-        self._data.user[self._id][ATTR_ACCESS_TOKEN] = create_token()
-        self._data.save_data()
+        self._data.user[self._id][ATTR_ACCESS_TOKEN] = secrets.token_hex(56)
+        self.save_data()

         # Options
         if not self.write_options():
-            return False
+            raise AddonsError()

         # Sound
         if self.with_audio and not self.write_asound():
-            return False
+            raise AddonsError()

-        return await self.instance.run()
+        try:
+            await self.instance.run()
+        except DockerAPIError:
+            raise AddonsError() from None

     @check_installed
-    def stop(self):
-        """Stop add-on.
-
-        Return a coroutine.
-        """
-        return self.instance.stop()
+    async def stop(self) -> None:
+        """Stop add-on."""
+        try:
+            return await self.instance.stop()
+        except DockerAPIError:
+            raise AddonsError() from None

     @check_installed
-    async def update(self):
+    async def update(self) -> None:
         """Update add-on."""
-        last_state = await self.state()
-
         if self.last_version == self.version_installed:
             _LOGGER.warning("No update available for add-on %s", self._id)
-            return False
+            return

-        if not await self.instance.update(self.last_version):
-            return False
-        self._set_update(self.last_version)
+        # Check if available, Maybe something have changed
+        if not self.available:
+            _LOGGER.error(
+                "Add-on %s not supported on %s with %s architecture",
+                self._id, self.sys_machine, self.sys_arch.supported)
+            raise AddonsNotSupportedError()
+
+        # Update instance
+        last_state = await self.state()
+        try:
+            await self.instance.update(self.last_version, self.image_next)
+        except DockerAPIError:
+            raise AddonsError() from None
+        self._set_update(self.image_next, self.last_version)

         # Setup/Fix AppArmor profile
         await self._install_apparmor()
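`start()` now mints the add-on access token with the stdlib's `secrets.token_hex(56)` instead of the old `create_token()` helper. `token_hex(n)` returns a cryptographically random string of `2 * n` lowercase hex characters:

```python
import secrets

token = secrets.token_hex(56)
assert len(token) == 112  # 56 random bytes -> 112 hex characters
assert all(c in "0123456789abcdef" for c in token)
```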
@@ -774,16 +947,16 @@ class Addon(CoreSysAttributes):
         # restore state
         if last_state == STATE_STARTED:
             await self.start()
-        return True

     @check_installed
-    async def restart(self):
+    async def restart(self) -> None:
         """Restart add-on."""
-        await self.stop()
-        return await self.start()
+        with suppress(AddonsError):
+            await self.stop()
+        await self.start()

     @check_installed
-    def logs(self):
+    def logs(self) -> Awaitable[bytes]:
         """Return add-ons log output.

         Return a coroutine.
@@ -791,33 +964,32 @@ class Addon(CoreSysAttributes):
         return self.instance.logs()

     @check_installed
-    def stats(self):
-        """Return stats of container.
-
-        Return a coroutine.
-        """
-        return self.instance.stats()
+    async def stats(self) -> DockerStats:
+        """Return stats of container."""
+        try:
+            return await self.instance.stats()
+        except DockerAPIError:
+            raise AddonsError() from None

     @check_installed
-    async def rebuild(self):
+    async def rebuild(self) -> None:
         """Perform a rebuild of local build add-on."""
         last_state = await self.state()

         if not self.need_build:
             _LOGGER.error("Can't rebuild a none local build add-on!")
-            return False
+            raise AddonsNotSupportedError()

         # remove docker container but not addon config
-        if not await self.instance.remove():
-            return False
-
-        if not await self.instance.install(self.version_installed):
-            return False
+        try:
+            await self.instance.remove()
+            await self.instance.install(self.version_installed)
+        except DockerAPIError:
+            raise AddonsError() from None

         # restore state
         if last_state == STATE_STARTED:
             await self.start()
-        return True

     @check_installed
     async def write_stdin(self, data):
@@ -827,18 +999,23 @@ class Addon(CoreSysAttributes):
         """
         if not self.with_stdin:
             _LOGGER.error("Add-on don't support write to stdin!")
-            return False
+            raise AddonsNotSupportedError()

-        return await self.instance.write_stdin(data)
+        try:
+            return await self.instance.write_stdin(data)
+        except DockerAPIError:
+            raise AddonsError() from None
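Throughout this file, low-level `DockerAPIError`s are translated into the add-on layer's `AddonsError`, and `from None` suppresses implicit exception chaining so callers see a clean traceback. The pattern in isolation, with stand-in exception classes rather than the real Supervisor ones:

```python
class DockerAPIError(Exception):
    """Stand-in for the low-level Docker error."""


class AddonsError(Exception):
    """Stand-in for the add-on layer error."""


def do_docker_call():
    try:
        raise DockerAPIError("socket closed")  # simulated failure
    except DockerAPIError:
        # "from None" clears __cause__ and suppresses __context__,
        # so the DockerAPIError never appears in the traceback
        raise AddonsError() from None


try:
    do_docker_call()
except AddonsError as err:
    assert err.__cause__ is None
    assert err.__suppress_context__
```

Callers such as the API layer can then catch a single `AddonsError` type regardless of which Docker operation failed.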
     @check_installed
-    async def snapshot(self, tar_file):
+    async def snapshot(self, tar_file: tarfile.TarFile) -> None:
         """Snapshot state of an add-on."""
         with TemporaryDirectory(dir=str(self.sys_config.path_tmp)) as temp:
             # store local image
-            if self.need_build and not await \
-                    self.instance.export_image(Path(temp, 'image.tar')):
-                return False
+            if self.need_build:
+                try:
+                    await self.instance.export_image(Path(temp, 'image.tar'))
+                except DockerAPIError:
+                    raise AddonsError() from None

             data = {
                 ATTR_USER: self._data.user.get(self._id, {}),
@@ -850,9 +1027,9 @@ class Addon(CoreSysAttributes):
             # Store local configs/state
             try:
                 write_json_file(Path(temp, 'addon.json'), data)
-            except (OSError, json.JSONDecodeError) as err:
-                _LOGGER.error("Can't save meta for %s: %s", self._id, err)
-                return False
+            except JsonFileError:
+                _LOGGER.error("Can't save meta for %s", self._id)
+                raise AddonsError() from None

             # Store AppArmor Profile
             if self.sys_host.apparmor.exists(self.slug):
@@ -861,7 +1038,7 @@ class Addon(CoreSysAttributes):
                     self.sys_host.apparmor.backup_profile(self.slug, profile)
                 except HostAppArmorError:
                     _LOGGER.error("Can't backup AppArmor profile")
-                    return False
+                    raise AddonsError() from None

             # write into tarfile
             def _write_tarfile():
@@ -875,12 +1052,11 @@ class Addon(CoreSysAttributes):
                 await self.sys_run_in_executor(_write_tarfile)
             except (tarfile.TarError, OSError) as err:
                 _LOGGER.error("Can't write tarfile %s: %s", tar_file, err)
-                return False
+                raise AddonsError() from None

         _LOGGER.info("Finish snapshot for addon %s", self._id)
-        return True
-    async def restore(self, tar_file):
+    async def restore(self, tar_file: tarfile.TarFile) -> None:
         """Restore state of an add-on."""
         with TemporaryDirectory(dir=str(self.sys_config.path_tmp)) as temp:
             # extract snapshot
@@ -893,13 +1069,13 @@ class Addon(CoreSysAttributes):
                 await self.sys_run_in_executor(_extract_tarfile)
             except tarfile.TarError as err:
                 _LOGGER.error("Can't read tarfile %s: %s", tar_file, err)
-                return False
+                raise AddonsError() from None

             # Read snapshot data
             try:
                 data = read_json_file(Path(temp, 'addon.json'))
-            except (OSError, json.JSONDecodeError) as err:
-                _LOGGER.error("Can't read addon.json: %s", err)
+            except JsonFileError:
+                raise AddonsError() from None

             # Validate
             try:
@@ -907,24 +1083,32 @@ class Addon(CoreSysAttributes):
             except vol.Invalid as err:
                 _LOGGER.error("Can't validate %s, snapshot data: %s",
                               self._id, humanize_error(data, err))
-                return False
+                raise AddonsError() from None

-            # Restore data or reload add-on
+            # Restore local add-on informations
             _LOGGER.info("Restore config for addon %s", self._id)
-            self._restore_data(data[ATTR_USER], data[ATTR_SYSTEM])
+            restore_image = self._get_image(data[ATTR_SYSTEM])
+            self._restore_data(data[ATTR_USER], data[ATTR_SYSTEM], restore_image)

             # Check version / restore image
             version = data[ATTR_VERSION]
             if not await self.instance.exists():
-                _LOGGER.info("Restore image for addon %s", self._id)
+                _LOGGER.info("Restore/Install image for addon %s", self._id)

                 image_file = Path(temp, 'image.tar')
                 if image_file.is_file():
-                    await self.instance.import_image(image_file, version)
+                    with suppress(DockerAPIError):
+                        await self.instance.import_image(image_file, version)
                 else:
-                    if await self.instance.install(version):
-                        await self.instance.cleanup()
+                    with suppress(DockerAPIError):
+                        await self.instance.install(version, restore_image)
+                        await self.instance.cleanup()
+            elif self.instance.version != version or self.legacy:
+                _LOGGER.info("Restore/Update image for addon %s", self._id)
+                with suppress(DockerAPIError):
+                    await self.instance.update(version, restore_image)
             else:
-                await self.instance.stop()
+                with suppress(DockerAPIError):
+                    await self.instance.stop()

             # Restore data
@@ -939,7 +1123,7 @@ class Addon(CoreSysAttributes):
                 await self.sys_run_in_executor(_restore_data)
             except shutil.Error as err:
                 _LOGGER.error("Can't restore origin data: %s", err)
-                return False
+                raise AddonsError() from None

             # Restore AppArmor
             profile_file = Path(temp, 'apparmor.txt')
@@ -949,11 +1133,10 @@ class Addon(CoreSysAttributes):
                         self.slug, profile_file)
                 except HostAppArmorError:
                     _LOGGER.error("Can't restore AppArmor profile")
-                    return False
+                    raise AddonsError() from None

             # Run add-on
             if data[ATTR_STATE] == STATE_STARTED:
                 return await self.start()

         _LOGGER.info("Finish restore for add-on %s", self._id)
-        return True

View File

@@ -1,45 +1,50 @@
 """Hass.io add-on build environment."""
+from __future__ import annotations
+
 from pathlib import Path
+from typing import TYPE_CHECKING, Dict

-from .validate import SCHEMA_BUILD_CONFIG, BASE_IMAGE
-from ..const import ATTR_SQUASH, ATTR_BUILD_FROM, ATTR_ARGS, META_ADDON
-from ..coresys import CoreSysAttributes
+from ..const import ATTR_ARGS, ATTR_BUILD_FROM, ATTR_SQUASH, META_ADDON
+from ..coresys import CoreSys, CoreSysAttributes
 from ..utils.json import JsonConfig
+from .validate import SCHEMA_BUILD_CONFIG
+
+if TYPE_CHECKING:
+    from .addon import Addon


 class AddonBuild(JsonConfig, CoreSysAttributes):
     """Handle build options for add-ons."""

-    def __init__(self, coresys, slug):
+    def __init__(self, coresys: CoreSys, slug: str) -> None:
         """Initialize Hass.io add-on builder."""
-        self.coresys = coresys
-        self._id = slug
+        self.coresys: CoreSys = coresys
+        self._id: str = slug

         super().__init__(
             Path(self.addon.path_location, 'build.json'), SCHEMA_BUILD_CONFIG)

     def save_data(self):
         """Ignore save function."""
-        pass

     @property
-    def addon(self):
+    def addon(self) -> Addon:
         """Return add-on of build data."""
         return self.sys_addons.get(self._id)

     @property
-    def base_image(self):
+    def base_image(self) -> str:
         """Base images for this add-on."""
         return self._data[ATTR_BUILD_FROM].get(
-            self.sys_arch, BASE_IMAGE[self.sys_arch])
+            self.sys_arch.default,
+            f"homeassistant/{self.sys_arch.default}-base:latest")

     @property
-    def squash(self):
+    def squash(self) -> bool:
         """Return True or False if squash is active."""
         return self._data[ATTR_SQUASH]

     @property
-    def additional_args(self):
+    def additional_args(self) -> Dict[str, str]:
         """Return additional Docker build arguments."""
         return self._data[ATTR_ARGS]

@@ -53,7 +58,7 @@ class AddonBuild(JsonConfig, CoreSysAttributes):
             'squash': self.squash,
             'labels': {
                 'io.hass.version': version,
-                'io.hass.arch': self.sys_arch,
+                'io.hass.arch': self.sys_arch.default,
                 'io.hass.type': META_ADDON,
                 'io.hass.name': self._fix_label('name'),
                 'io.hass.description': self._fix_label('description'),
@@ -61,7 +66,7 @@ class AddonBuild(JsonConfig, CoreSysAttributes):
             'buildargs': {
                 'BUILD_FROM': self.base_image,
                 'BUILD_VERSION': version,
-                'BUILD_ARCH': self.sys_arch,
+                'BUILD_ARCH': self.sys_arch.default,
                 **self.additional_args,
             }
         }
@@ -71,7 +76,7 @@ class AddonBuild(JsonConfig, CoreSysAttributes):
         return args

-    def _fix_label(self, label_name):
+    def _fix_label(self, label_name: str) -> str:
         """Remove characters they are not supported."""
         label = getattr(self.addon, label_name, "")
         return label.replace("'", "")

View File

@@ -1,19 +1,25 @@
 """Init file for Hass.io add-on data."""
 import logging
-import json
 from pathlib import Path

 import voluptuous as vol
 from voluptuous.humanize import humanize_error

-from .utils import extract_hash_from_path
-from .validate import (
-    SCHEMA_ADDON_CONFIG, SCHEMA_ADDONS_FILE, SCHEMA_REPOSITORY_CONFIG)
 from ..const import (
-    FILE_HASSIO_ADDONS, ATTR_SLUG, ATTR_REPOSITORY, ATTR_LOCATON,
-    REPOSITORY_CORE, REPOSITORY_LOCAL, ATTR_USER, ATTR_SYSTEM)
+    ATTR_LOCATON,
+    ATTR_REPOSITORY,
+    ATTR_SLUG,
+    ATTR_SYSTEM,
+    ATTR_USER,
+    FILE_HASSIO_ADDONS,
+    REPOSITORY_CORE,
+    REPOSITORY_LOCAL,
+)
 from ..coresys import CoreSysAttributes
+from ..exceptions import JsonFileError
 from ..utils.json import JsonConfig, read_json_file
+from .utils import extract_hash_from_path
+from .validate import SCHEMA_ADDON_CONFIG, SCHEMA_ADDONS_FILE, SCHEMA_REPOSITORY_CONFIG

 _LOGGER = logging.getLogger(__name__)

@@ -54,12 +60,10 @@ class AddonsData(JsonConfig, CoreSysAttributes):
         self._repositories = {}

         # read core repository
-        self._read_addons_folder(
-            self.sys_config.path_addons_core, REPOSITORY_CORE)
+        self._read_addons_folder(self.sys_config.path_addons_core, REPOSITORY_CORE)

         # read local repository
-        self._read_addons_folder(
-            self.sys_config.path_addons_local, REPOSITORY_LOCAL)
+        self._read_addons_folder(self.sys_config.path_addons_local, REPOSITORY_LOCAL)

         # add built-in repositories information
         self._set_builtin_repositories()

@@ -76,15 +80,12 @@ class AddonsData(JsonConfig, CoreSysAttributes):
         # exists repository json
         repository_file = Path(path, "repository.json")
         try:
-            repository_info = SCHEMA_REPOSITORY_CONFIG(
-                read_json_file(repository_file)
-            )
-        except (OSError, json.JSONDecodeError, UnicodeDecodeError):
-            _LOGGER.warning("Can't read repository information from %s",
-                            repository_file)
+            repository_info = SCHEMA_REPOSITORY_CONFIG(read_json_file(repository_file))
+        except JsonFileError:
+            _LOGGER.warning(
+                "Can't read repository information from %s", repository_file
+            )
             return
         except vol.Invalid:
             _LOGGER.warning("Repository parse error %s", repository_file)
             return

@@ -98,39 +99,38 @@ class AddonsData(JsonConfig, CoreSysAttributes):
         for addon in path.glob("**/config.json"):
             try:
                 addon_config = read_json_file(addon)
except JsonFileError:
_LOGGER.warning("Can't read %s from repository %s", addon, repository)
continue
# validate # validate
try:
addon_config = SCHEMA_ADDON_CONFIG(addon_config) addon_config = SCHEMA_ADDON_CONFIG(addon_config)
except vol.Invalid as ex:
_LOGGER.warning(
"Can't read %s: %s", addon, humanize_error(addon_config, ex)
)
continue
# Generate slug # Generate slug
addon_slug = "{}_{}".format( addon_slug = "{}_{}".format(repository, addon_config[ATTR_SLUG])
repository, addon_config[ATTR_SLUG])
# store # store
addon_config[ATTR_REPOSITORY] = repository addon_config[ATTR_REPOSITORY] = repository
addon_config[ATTR_LOCATON] = str(addon.parent) addon_config[ATTR_LOCATON] = str(addon.parent)
self._cache[addon_slug] = addon_config self._cache[addon_slug] = addon_config
except (OSError, json.JSONDecodeError):
_LOGGER.warning("Can't read %s", addon)
except vol.Invalid as ex:
_LOGGER.warning("Can't read %s: %s", addon,
humanize_error(addon_config, ex))
def _set_builtin_repositories(self): def _set_builtin_repositories(self):
"""Add local built-in repository into dataset.""" """Add local built-in repository into dataset."""
try: try:
builtin_file = Path(__file__).parent.joinpath('built-in.json') builtin_file = Path(__file__).parent.joinpath("built-in.json")
builtin_data = read_json_file(builtin_file) builtin_data = read_json_file(builtin_file)
except (OSError, json.JSONDecodeError) as err: except JsonFileError:
_LOGGER.warning("Can't read built-in json: %s", err) _LOGGER.warning("Can't read built-in json")
return return
# core repository # core repository
self._repositories[REPOSITORY_CORE] = \ self._repositories[REPOSITORY_CORE] = builtin_data[REPOSITORY_CORE]
builtin_data[REPOSITORY_CORE]
# local repository # local repository
self._repositories[REPOSITORY_LOCAL] = \ self._repositories[REPOSITORY_LOCAL] = builtin_data[REPOSITORY_LOCAL]
builtin_data[REPOSITORY_LOCAL]
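The refactor above replaces the scattered `(OSError, json.JSONDecodeError, UnicodeDecodeError)` handlers with a single `JsonFileError`. A minimal sketch of that consolidation (the exception name comes from the diff's `..exceptions` import; the body of `read_json_file` is an assumption):

```python
import json
from pathlib import Path


class JsonFileError(Exception):
    """Single error type for any low-level failure while reading a JSON file."""


def read_json_file(jsonfile: Path) -> dict:
    """Read and parse a JSON file, funneling every read/parse error into JsonFileError."""
    try:
        return json.loads(jsonfile.read_text())
    except (OSError, json.JSONDecodeError, UnicodeDecodeError) as err:
        raise JsonFileError() from err
```

Callers like `_read_addons_folder` then need only one `except JsonFileError:` branch instead of re-listing the low-level exception tuple at every call site.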


@@ -1,24 +1,39 @@
"""Util add-ons functions.""" """Util add-ons functions."""
from __future__ import annotations
import asyncio import asyncio
import hashlib import hashlib
import logging import logging
from pathlib import Path
import re import re
from typing import TYPE_CHECKING
from ..const import ( from ..const import (
SECURITY_DISABLE, SECURITY_PROFILE, PRIVILEGED_NET_ADMIN, PRIVILEGED_DAC_READ_SEARCH,
PRIVILEGED_SYS_ADMIN, PRIVILEGED_SYS_RAWIO, PRIVILEGED_SYS_PTRACE, PRIVILEGED_NET_ADMIN,
ROLE_ADMIN, ROLE_MANAGER) PRIVILEGED_SYS_ADMIN,
PRIVILEGED_SYS_MODULE,
PRIVILEGED_SYS_PTRACE,
PRIVILEGED_SYS_RAWIO,
ROLE_ADMIN,
ROLE_MANAGER,
SECURITY_DISABLE,
SECURITY_PROFILE,
)
from ..exceptions import AddonsNotSupportedError
if TYPE_CHECKING:
from .addon import Addon
RE_SHA1 = re.compile(r"[a-f0-9]{8}") RE_SHA1 = re.compile(r"[a-f0-9]{8}")
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
def rating_security(addon): def rating_security(addon: Addon) -> int:
"""Return 1-5 for security rating. """Return 1-6 for security rating.
1 = not secure 1 = not secure
5 = high secure 6 = high secure
""" """
rating = 5 rating = 5
@@ -28,9 +43,22 @@ def rating_security(addon):
elif addon.apparmor == SECURITY_PROFILE: elif addon.apparmor == SECURITY_PROFILE:
rating += 1 rating += 1
# Home Assistant Login
if addon.access_auth_api:
rating += 1
# Privileged options # Privileged options
if addon.privileged in (PRIVILEGED_NET_ADMIN, PRIVILEGED_SYS_ADMIN, if any(
PRIVILEGED_SYS_RAWIO, PRIVILEGED_SYS_PTRACE): privilege in addon.privileged
for privilege in (
PRIVILEGED_NET_ADMIN,
PRIVILEGED_SYS_ADMIN,
PRIVILEGED_SYS_RAWIO,
PRIVILEGED_SYS_PTRACE,
PRIVILEGED_SYS_MODULE,
PRIVILEGED_DAC_READ_SEARCH,
)
):
rating += -1 rating += -1
# API Hass.io role # API Hass.io role
@@ -58,45 +86,46 @@ def rating_security(addon):
return max(min(6, rating), 1) return max(min(6, rating), 1)
def get_hash_from_repository(name): def get_hash_from_repository(name: str) -> str:
"""Generate a hash from repository.""" """Generate a hash from repository."""
key = name.lower().encode() key = name.lower().encode()
return hashlib.sha1(key).hexdigest()[:8] return hashlib.sha1(key).hexdigest()[:8]
def extract_hash_from_path(path): def extract_hash_from_path(path: Path) -> str:
"""Extract repo id from path.""" """Extract repo id from path."""
repo_dir = path.parts[-1] repository_dir = path.parts[-1]
if not RE_SHA1.match(repo_dir): if not RE_SHA1.match(repository_dir):
return get_hash_from_repository(repo_dir) return get_hash_from_repository(repository_dir)
return repo_dir return repository_dir
def check_installed(method): def check_installed(method):
"""Wrap function with check if add-on is installed.""" """Wrap function with check if add-on is installed."""
async def wrap_check(addon, *args, **kwargs): async def wrap_check(addon, *args, **kwargs):
"""Return False if not installed or the function.""" """Return False if not installed or the function."""
if not addon.is_installed: if not addon.is_installed:
_LOGGER.error("Addon %s is not installed", addon.slug) _LOGGER.error("Addon %s is not installed", addon.slug)
return False raise AddonsNotSupportedError()
return await method(addon, *args, **kwargs) return await method(addon, *args, **kwargs)
return wrap_check return wrap_check
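`check_installed` changes behavior here: instead of logging and returning `False`, it now raises. A stand-alone sketch of that coroutine decorator (the exception name mirrors the diff; the add-on object is a stub):

```python
import asyncio
import functools
import types


class AddonsNotSupportedError(Exception):
    """Raised when an operation targets an add-on that is not installed."""


def check_installed(method):
    """Wrap a coroutine method with an installed-state check that raises."""
    @functools.wraps(method)
    async def wrap_check(addon, *args, **kwargs):
        if not addon.is_installed:
            raise AddonsNotSupportedError()
        return await method(addon, *args, **kwargs)
    return wrap_check


@check_installed
async def start(addon):
    """Pretend to start the add-on."""
    return f"started {addon.slug}"


installed = types.SimpleNamespace(slug="a_test", is_installed=True)
missing = types.SimpleNamespace(slug="a_test", is_installed=False)
```

Raising instead of returning `False` lets the API layer translate the failure into a proper error response rather than a silent no-op.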
async def remove_data(folder): async def remove_data(folder: Path) -> None:
"""Remove folder and reset privileged.""" """Remove folder and reset privileged."""
try: try:
proc = await asyncio.create_subprocess_exec( proc = await asyncio.create_subprocess_exec(
"rm", "-rf", str(folder), "rm", "-rf", str(folder), stdout=asyncio.subprocess.DEVNULL
stdout=asyncio.subprocess.DEVNULL
) )
_, error_msg = await proc.communicate() _, error_msg = await proc.communicate()
except OSError as err: except OSError as err:
error_msg = str(err) error_msg = str(err)
else:
if proc.returncode == 0: if proc.returncode == 0:
return return
_LOGGER.error("Can't remove Add-on Data: %s", error_msg) _LOGGER.error("Can't remove Add-on Data: %s", error_msg)
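The hash helpers in this file derive an 8-character repository id either from the repository name or from the last component of an on-disk path. A self-contained sketch of both functions as they appear after the diff:

```python
import hashlib
import re
from pathlib import Path

RE_SHA1 = re.compile(r"[a-f0-9]{8}")


def get_hash_from_repository(name: str) -> str:
    """Generate the short repository id: first 8 hex chars of sha1(lowercased name)."""
    key = name.lower().encode()
    return hashlib.sha1(key).hexdigest()[:8]


def extract_hash_from_path(path: Path) -> str:
    """Return the repo id encoded in the last path component, hashing it if absent."""
    repository_dir = path.parts[-1]
    if not RE_SHA1.match(repository_dir):
        return get_hash_from_repository(repository_dir)
    return repository_dir
```

Cache directories named with the 8-hex-char id pass through unchanged; anything else (e.g. a local folder name) is hashed into the same id space.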


@@ -1,37 +1,92 @@
"""Validate add-ons options schema.""" """Validate add-ons options schema."""
import logging import logging
import re import re
import secrets
import uuid import uuid
import voluptuous as vol import voluptuous as vol
from ..const import ( from ..const import (
ATTR_NAME, ATTR_VERSION, ATTR_SLUG, ATTR_DESCRIPTON, ATTR_STARTUP, ARCH_ALL,
ATTR_BOOT, ATTR_MAP, ATTR_OPTIONS, ATTR_PORTS, STARTUP_ONCE, ATTR_ACCESS_TOKEN,
STARTUP_SYSTEM, STARTUP_SERVICES, STARTUP_APPLICATION, STARTUP_INITIALIZE, ATTR_APPARMOR,
BOOT_AUTO, BOOT_MANUAL, ATTR_SCHEMA, ATTR_IMAGE, ATTR_URL, ATTR_MAINTAINER, ATTR_ARCH,
ATTR_ARCH, ATTR_DEVICES, ATTR_ENVIRONMENT, ATTR_HOST_NETWORK, ARCH_ARMHF, ATTR_ARGS,
ARCH_AARCH64, ARCH_AMD64, ARCH_I386, ATTR_TMPFS, ATTR_PRIVILEGED, ATTR_AUDIO,
ATTR_USER, ATTR_STATE, ATTR_SYSTEM, STATE_STARTED, STATE_STOPPED, ATTR_AUDIO_INPUT,
ATTR_LOCATON, ATTR_REPOSITORY, ATTR_TIMEOUT, ATTR_NETWORK, ATTR_UUID, ATTR_AUDIO_OUTPUT,
ATTR_AUTO_UPDATE, ATTR_WEBUI, ATTR_AUDIO, ATTR_AUDIO_INPUT, ATTR_HOST_IPC, ATTR_AUTH_API,
ATTR_AUDIO_OUTPUT, ATTR_HASSIO_API, ATTR_BUILD_FROM, ATTR_SQUASH, ATTR_AUTO_UART,
ATTR_ARGS, ATTR_GPIO, ATTR_HOMEASSISTANT_API, ATTR_STDIN, ATTR_LEGACY, ATTR_AUTO_UPDATE,
ATTR_HOST_DBUS, ATTR_AUTO_UART, ATTR_SERVICES, ATTR_DISCOVERY, ATTR_BOOT,
ATTR_APPARMOR, ATTR_DEVICETREE, ATTR_DOCKER_API, ATTR_PROTECTED, ATTR_BUILD_FROM,
ATTR_FULL_ACCESS, ATTR_ACCESS_TOKEN, ATTR_HOST_PID, ATTR_HASSIO_ROLE, ATTR_DESCRIPTON,
ATTR_DEVICES,
ATTR_DEVICETREE,
ATTR_DISCOVERY,
ATTR_DOCKER_API,
ATTR_ENVIRONMENT,
ATTR_FULL_ACCESS,
ATTR_GPIO,
ATTR_HASSIO_API,
ATTR_HASSIO_ROLE,
ATTR_HOMEASSISTANT_API,
ATTR_HOMEASSISTANT,
ATTR_HOST_DBUS,
ATTR_HOST_IPC,
ATTR_HOST_NETWORK,
ATTR_HOST_PID,
ATTR_IMAGE,
ATTR_INGRESS,
ATTR_INGRESS_ENTRY,
ATTR_INGRESS_PORT,
ATTR_INGRESS_TOKEN,
ATTR_KERNEL_MODULES,
ATTR_LEGACY,
ATTR_LOCATON,
ATTR_MACHINE, ATTR_MACHINE,
PRIVILEGED_NET_ADMIN, PRIVILEGED_SYS_ADMIN, PRIVILEGED_SYS_RAWIO, ATTR_MAINTAINER,
PRIVILEGED_IPC_LOCK, PRIVILEGED_SYS_TIME, PRIVILEGED_SYS_NICE, ATTR_MAP,
PRIVILEGED_SYS_RESOURCE, PRIVILEGED_SYS_PTRACE, ATTR_NAME,
ROLE_DEFAULT, ROLE_HOMEASSISTANT, ROLE_MANAGER, ROLE_ADMIN) ATTR_NETWORK,
from ..validate import NETWORK_PORT, DOCKER_PORTS, ALSA_DEVICE, UUID_MATCH ATTR_OPTIONS,
from ..services.validate import DISCOVERY_SERVICES ATTR_PORTS,
ATTR_PRIVILEGED,
ATTR_PROTECTED,
ATTR_REPOSITORY,
ATTR_SCHEMA,
ATTR_SERVICES,
ATTR_SLUG,
ATTR_SQUASH,
ATTR_STARTUP,
ATTR_STATE,
ATTR_STDIN,
ATTR_SYSTEM,
ATTR_TIMEOUT,
ATTR_TMPFS,
ATTR_URL,
ATTR_USER,
ATTR_UUID,
ATTR_VERSION,
ATTR_WEBUI,
BOOT_AUTO,
BOOT_MANUAL,
PRIVILEGED_ALL,
ROLE_ALL,
ROLE_DEFAULT,
STARTUP_ALL,
STARTUP_APPLICATION,
STARTUP_SERVICES,
STATE_STARTED,
STATE_STOPPED,
)
from ..discovery.validate import valid_discovery_service
from ..validate import ALSA_DEVICE, DOCKER_PORTS, NETWORK_PORT, TOKEN, UUID_MATCH
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
RE_VOLUME = re.compile(r"^(config|ssl|addons|backup|share)(?::(rw|:ro))?$") RE_VOLUME = re.compile(r"^(config|ssl|addons|backup|share)(?::(rw|ro))?$")
RE_SERVICE = re.compile(r"^(?P<service>mqtt):(?P<rights>provide|want|need)$") RE_SERVICE = re.compile(r"^(?P<service>mqtt):(?P<rights>provide|want|need)$")
V_STR = 'str' V_STR = 'str'
@@ -52,48 +107,20 @@ RE_SCHEMA_ELEMENT = re.compile(
r")\??$" r")\??$"
) )
RE_DOCKER_IMAGE = re.compile(
r"^([a-zA-Z\-\.:\d{}]+/)*?([\-\w{}]+)/([\-\w{}]+)$")
RE_DOCKER_IMAGE_BUILD = re.compile(
r"^([a-zA-Z\-\.:\d{}]+/)*?([\-\w{}]+)/([\-\w{}]+)(:[\.\-\w{}]+)?$")
SCHEMA_ELEMENT = vol.Match(RE_SCHEMA_ELEMENT) SCHEMA_ELEMENT = vol.Match(RE_SCHEMA_ELEMENT)
ARCH_ALL = [
ARCH_ARMHF, ARCH_AARCH64, ARCH_AMD64, ARCH_I386
]
MACHINE_ALL = [ MACHINE_ALL = [
'intel-nuc', 'qemux86', 'qemux86-64', 'qemuarm', 'qemuarm-64', 'intel-nuc', 'odroid-c2', 'odroid-xu', 'orangepi-prime', 'qemux86',
'raspberrypi', 'raspberrypi2', 'raspberrypi3', 'raspberrypi3-64', 'qemux86-64', 'qemuarm', 'qemuarm-64', 'raspberrypi', 'raspberrypi2',
'odroid-cu2', 'odroid-xu', 'raspberrypi3', 'raspberrypi3-64', 'tinker',
] ]
STARTUP_ALL = [
STARTUP_ONCE, STARTUP_INITIALIZE, STARTUP_SYSTEM, STARTUP_SERVICES,
STARTUP_APPLICATION
]
PRIVILEGED_ALL = [
PRIVILEGED_NET_ADMIN,
PRIVILEGED_SYS_ADMIN,
PRIVILEGED_SYS_RAWIO,
PRIVILEGED_IPC_LOCK,
PRIVILEGED_SYS_TIME,
PRIVILEGED_SYS_NICE,
PRIVILEGED_SYS_RESOURCE,
PRIVILEGED_SYS_PTRACE,
]
ROLE_ALL = [
ROLE_DEFAULT,
ROLE_HOMEASSISTANT,
ROLE_MANAGER,
ROLE_ADMIN,
]
BASE_IMAGE = {
ARCH_ARMHF: "homeassistant/armhf-base:latest",
ARCH_AARCH64: "homeassistant/aarch64-base:latest",
ARCH_I386: "homeassistant/i386-base:latest",
ARCH_AMD64: "homeassistant/amd64-base:latest",
}
def _simple_startup(value): def _simple_startup(value):
"""Simple startup schema.""" """Simple startup schema."""
@@ -110,9 +137,9 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Required(ATTR_VERSION): vol.Coerce(str), vol.Required(ATTR_VERSION): vol.Coerce(str),
vol.Required(ATTR_SLUG): vol.Coerce(str), vol.Required(ATTR_SLUG): vol.Coerce(str),
vol.Required(ATTR_DESCRIPTON): vol.Coerce(str), vol.Required(ATTR_DESCRIPTON): vol.Coerce(str),
vol.Optional(ATTR_URL): vol.Url(), vol.Required(ATTR_ARCH): [vol.In(ARCH_ALL)],
vol.Optional(ATTR_ARCH, default=ARCH_ALL): [vol.In(ARCH_ALL)],
vol.Optional(ATTR_MACHINE): [vol.In(MACHINE_ALL)], vol.Optional(ATTR_MACHINE): [vol.In(MACHINE_ALL)],
vol.Optional(ATTR_URL): vol.Url(),
vol.Required(ATTR_STARTUP): vol.Required(ATTR_STARTUP):
vol.All(_simple_startup, vol.In(STARTUP_ALL)), vol.All(_simple_startup, vol.In(STARTUP_ALL)),
vol.Required(ATTR_BOOT): vol.Required(ATTR_BOOT):
@@ -120,6 +147,10 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Optional(ATTR_PORTS): DOCKER_PORTS, vol.Optional(ATTR_PORTS): DOCKER_PORTS,
vol.Optional(ATTR_WEBUI): vol.Optional(ATTR_WEBUI):
vol.Match(r"^(?:https?|\[PROTO:\w+\]):\/\/\[HOST\]:\[PORT:\d+\].*$"), vol.Match(r"^(?:https?|\[PROTO:\w+\]):\/\/\[HOST\]:\[PORT:\d+\].*$"),
vol.Optional(ATTR_INGRESS, default=False): vol.Boolean(),
vol.Optional(ATTR_INGRESS_PORT, default=8099): NETWORK_PORT,
vol.Optional(ATTR_INGRESS_ENTRY): vol.Coerce(str),
vol.Optional(ATTR_HOMEASSISTANT): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_HOST_NETWORK, default=False): vol.Boolean(), vol.Optional(ATTR_HOST_NETWORK, default=False): vol.Boolean(),
vol.Optional(ATTR_HOST_PID, default=False): vol.Boolean(), vol.Optional(ATTR_HOST_PID, default=False): vol.Boolean(),
vol.Optional(ATTR_HOST_IPC, default=False): vol.Boolean(), vol.Optional(ATTR_HOST_IPC, default=False): vol.Boolean(),
@@ -136,14 +167,16 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Optional(ATTR_AUDIO, default=False): vol.Boolean(), vol.Optional(ATTR_AUDIO, default=False): vol.Boolean(),
vol.Optional(ATTR_GPIO, default=False): vol.Boolean(), vol.Optional(ATTR_GPIO, default=False): vol.Boolean(),
vol.Optional(ATTR_DEVICETREE, default=False): vol.Boolean(), vol.Optional(ATTR_DEVICETREE, default=False): vol.Boolean(),
vol.Optional(ATTR_KERNEL_MODULES, default=False): vol.Boolean(),
vol.Optional(ATTR_HASSIO_API, default=False): vol.Boolean(), vol.Optional(ATTR_HASSIO_API, default=False): vol.Boolean(),
vol.Optional(ATTR_HASSIO_ROLE, default=ROLE_DEFAULT): vol.In(ROLE_ALL), vol.Optional(ATTR_HASSIO_ROLE, default=ROLE_DEFAULT): vol.In(ROLE_ALL),
vol.Optional(ATTR_HOMEASSISTANT_API, default=False): vol.Boolean(), vol.Optional(ATTR_HOMEASSISTANT_API, default=False): vol.Boolean(),
vol.Optional(ATTR_STDIN, default=False): vol.Boolean(), vol.Optional(ATTR_STDIN, default=False): vol.Boolean(),
vol.Optional(ATTR_LEGACY, default=False): vol.Boolean(), vol.Optional(ATTR_LEGACY, default=False): vol.Boolean(),
vol.Optional(ATTR_DOCKER_API, default=False): vol.Boolean(), vol.Optional(ATTR_DOCKER_API, default=False): vol.Boolean(),
vol.Optional(ATTR_AUTH_API, default=False): vol.Boolean(),
vol.Optional(ATTR_SERVICES): [vol.Match(RE_SERVICE)], vol.Optional(ATTR_SERVICES): [vol.Match(RE_SERVICE)],
vol.Optional(ATTR_DISCOVERY): [vol.In(DISCOVERY_SERVICES)], vol.Optional(ATTR_DISCOVERY): [valid_discovery_service],
vol.Required(ATTR_OPTIONS): dict, vol.Required(ATTR_OPTIONS): dict,
vol.Required(ATTR_SCHEMA): vol.Any(vol.Schema({ vol.Required(ATTR_SCHEMA): vol.Any(vol.Schema({
vol.Coerce(str): vol.Any(SCHEMA_ELEMENT, [ vol.Coerce(str): vol.Any(SCHEMA_ELEMENT, [
@@ -156,7 +189,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
})) }))
}), False), }), False),
vol.Optional(ATTR_IMAGE): vol.Optional(ATTR_IMAGE):
vol.Match(r"^([a-zA-Z.:\d{}]+/)*?([\w{}]+)/([\-\w{}]+)$"), vol.Match(RE_DOCKER_IMAGE),
vol.Optional(ATTR_TIMEOUT, default=10): vol.Optional(ATTR_TIMEOUT, default=10):
vol.All(vol.Coerce(int), vol.Range(min=10, max=120)), vol.All(vol.Coerce(int), vol.Range(min=10, max=120)),
}, extra=vol.REMOVE_EXTRA) }, extra=vol.REMOVE_EXTRA)
@@ -172,8 +205,8 @@ SCHEMA_REPOSITORY_CONFIG = vol.Schema({
# pylint: disable=no-value-for-parameter # pylint: disable=no-value-for-parameter
SCHEMA_BUILD_CONFIG = vol.Schema({ SCHEMA_BUILD_CONFIG = vol.Schema({
vol.Optional(ATTR_BUILD_FROM, default=BASE_IMAGE): vol.Schema({ vol.Optional(ATTR_BUILD_FROM, default=dict): vol.Schema({
vol.In(ARCH_ALL): vol.Match(r"(?:^[\w{}]+/)?[\-\w{}]+:[\.\-\w{}]+$"), vol.In(ARCH_ALL): vol.Match(RE_DOCKER_IMAGE_BUILD),
}), }),
vol.Optional(ATTR_SQUASH, default=False): vol.Boolean(), vol.Optional(ATTR_SQUASH, default=False): vol.Boolean(),
vol.Optional(ATTR_ARGS, default=dict): vol.Schema({ vol.Optional(ATTR_ARGS, default=dict): vol.Schema({
@@ -185,8 +218,10 @@ SCHEMA_BUILD_CONFIG = vol.Schema({
# pylint: disable=no-value-for-parameter # pylint: disable=no-value-for-parameter
SCHEMA_ADDON_USER = vol.Schema({ SCHEMA_ADDON_USER = vol.Schema({
vol.Required(ATTR_VERSION): vol.Coerce(str), vol.Required(ATTR_VERSION): vol.Coerce(str),
vol.Optional(ATTR_IMAGE): vol.Coerce(str),
vol.Optional(ATTR_UUID, default=lambda: uuid.uuid4().hex): UUID_MATCH, vol.Optional(ATTR_UUID, default=lambda: uuid.uuid4().hex): UUID_MATCH,
vol.Optional(ATTR_ACCESS_TOKEN): vol.Match(r"^[0-9a-f]{64}$"), vol.Optional(ATTR_ACCESS_TOKEN): TOKEN,
vol.Optional(ATTR_INGRESS_TOKEN, default=secrets.token_urlsafe): vol.Coerce(str),
vol.Optional(ATTR_OPTIONS, default=dict): dict, vol.Optional(ATTR_OPTIONS, default=dict): dict,
vol.Optional(ATTR_AUTO_UPDATE, default=False): vol.Boolean(), vol.Optional(ATTR_AUTO_UPDATE, default=False): vol.Boolean(),
vol.Optional(ATTR_BOOT): vol.Optional(ATTR_BOOT):
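One of the quieter fixes above is in `RE_VOLUME`: the old alternation `(?::(rw|:ro))?` could only match `:rw` or the malformed `::ro`, so a plain `share:ro` mapping was rejected. The corrected pattern can be checked directly:

```python
import re

OLD_RE_VOLUME = re.compile(r"^(config|ssl|addons|backup|share)(?::(rw|:ro))?$")
NEW_RE_VOLUME = re.compile(r"^(config|ssl|addons|backup|share)(?::(rw|ro))?$")

# The stray colon inside the alternation made ':ro' unreachable.
assert OLD_RE_VOLUME.match("share:ro") is None       # bug: ':ro' never matched
assert OLD_RE_VOLUME.match("share::ro") is not None  # only this malformed form did
assert NEW_RE_VOLUME.match("share:ro") is not None
assert NEW_RE_VOLUME.match("share:rw") is not None
assert NEW_RE_VOLUME.match("share") is not None      # suffix stays optional
```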


@@ -1,22 +1,25 @@
"""Init file for Hass.io RESTful API.""" """Init file for Hass.io RESTful API."""
import logging import logging
from pathlib import Path from pathlib import Path
from typing import Optional
from aiohttp import web from aiohttp import web
from ..coresys import CoreSys, CoreSysAttributes
from .addons import APIAddons from .addons import APIAddons
from .auth import APIAuth
from .discovery import APIDiscovery from .discovery import APIDiscovery
from .homeassistant import APIHomeAssistant
from .hardware import APIHardware from .hardware import APIHardware
from .host import APIHost
from .hassos import APIHassOS from .hassos import APIHassOS
from .homeassistant import APIHomeAssistant
from .host import APIHost
from .info import APIInfo
from .ingress import APIIngress
from .proxy import APIProxy from .proxy import APIProxy
from .supervisor import APISupervisor
from .snapshots import APISnapshots
from .services import APIServices
from .version import APIVersion
from .security import SecurityMiddleware from .security import SecurityMiddleware
from ..coresys import CoreSysAttributes from .services import APIServices
from .snapshots import APISnapshots
from .supervisor import APISupervisor
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
@@ -24,18 +27,18 @@ _LOGGER = logging.getLogger(__name__)
class RestAPI(CoreSysAttributes): class RestAPI(CoreSysAttributes):
"""Handle RESTful API for Hass.io.""" """Handle RESTful API for Hass.io."""
def __init__(self, coresys): def __init__(self, coresys: CoreSys):
"""Initialize Docker base wrapper.""" """Initialize Docker base wrapper."""
self.coresys = coresys self.coresys: CoreSys = coresys
self.security = SecurityMiddleware(coresys) self.security: SecurityMiddleware = SecurityMiddleware(coresys)
self.webapp = web.Application( self.webapp: web.Application = web.Application(
middlewares=[self.security.token_validation], loop=coresys.loop) middlewares=[self.security.token_validation])
# service stuff # service stuff
self._runner = web.AppRunner(self.webapp) self._runner: web.AppRunner = web.AppRunner(self.webapp)
self._site = None self._site: Optional[web.TCPSite] = None
async def load(self): async def load(self) -> None:
"""Register REST API Calls.""" """Register REST API Calls."""
self._register_supervisor() self._register_supervisor()
self._register_host() self._register_host()
@@ -45,12 +48,14 @@ class RestAPI(CoreSysAttributes):
self._register_proxy() self._register_proxy()
self._register_panel() self._register_panel()
self._register_addons() self._register_addons()
self._register_ingress()
self._register_snapshots() self._register_snapshots()
self._register_discovery() self._register_discovery()
self._register_services() self._register_services()
self._register_version() self._register_info()
self._register_auth()
def _register_host(self): def _register_host(self) -> None:
"""Register hostcontrol functions.""" """Register hostcontrol functions."""
api_host = APIHost() api_host = APIHost()
api_host.coresys = self.coresys api_host.coresys = self.coresys
@@ -64,13 +69,13 @@ class RestAPI(CoreSysAttributes):
web.get('/host/services', api_host.services), web.get('/host/services', api_host.services),
web.post('/host/services/{service}/stop', api_host.service_stop), web.post('/host/services/{service}/stop', api_host.service_stop),
web.post('/host/services/{service}/start', api_host.service_start), web.post('/host/services/{service}/start', api_host.service_start),
web.post( web.post('/host/services/{service}/restart',
'/host/services/{service}/restart', api_host.service_restart), api_host.service_restart),
web.post( web.post('/host/services/{service}/reload',
'/host/services/{service}/reload', api_host.service_reload), api_host.service_reload),
]) ])
def _register_hassos(self): def _register_hassos(self) -> None:
"""Register HassOS functions.""" """Register HassOS functions."""
api_hassos = APIHassOS() api_hassos = APIHassOS()
api_hassos.coresys = self.coresys api_hassos.coresys = self.coresys
@@ -82,7 +87,7 @@ class RestAPI(CoreSysAttributes):
web.post('/hassos/config/sync', api_hassos.config_sync), web.post('/hassos/config/sync', api_hassos.config_sync),
]) ])
def _register_hardware(self): def _register_hardware(self) -> None:
"""Register hardware functions.""" """Register hardware functions."""
api_hardware = APIHardware() api_hardware = APIHardware()
api_hardware.coresys = self.coresys api_hardware.coresys = self.coresys
@@ -92,16 +97,25 @@ class RestAPI(CoreSysAttributes):
web.get('/hardware/audio', api_hardware.audio), web.get('/hardware/audio', api_hardware.audio),
]) ])
def _register_version(self): def _register_info(self) -> None:
"""Register version functions.""" """Register info functions."""
api_version = APIVersion() api_info = APIInfo()
api_version.coresys = self.coresys api_info.coresys = self.coresys
self.webapp.add_routes([ self.webapp.add_routes([
web.get('/version', api_version.info), web.get('/info', api_info.info),
]) ])
def _register_supervisor(self): def _register_auth(self) -> None:
"""Register auth functions."""
api_auth = APIAuth()
api_auth.coresys = self.coresys
self.webapp.add_routes([
web.post('/auth', api_auth.auth),
])
def _register_supervisor(self) -> None:
"""Register Supervisor functions.""" """Register Supervisor functions."""
api_supervisor = APISupervisor() api_supervisor = APISupervisor()
api_supervisor.coresys = self.coresys api_supervisor.coresys = self.coresys
@@ -116,7 +130,7 @@ class RestAPI(CoreSysAttributes):
web.post('/supervisor/options', api_supervisor.options), web.post('/supervisor/options', api_supervisor.options),
]) ])
def _register_homeassistant(self): def _register_homeassistant(self) -> None:
"""Register Home Assistant functions.""" """Register Home Assistant functions."""
api_hass = APIHomeAssistant() api_hass = APIHomeAssistant()
api_hass.coresys = self.coresys api_hass.coresys = self.coresys
@@ -131,9 +145,10 @@ class RestAPI(CoreSysAttributes):
web.post('/homeassistant/stop', api_hass.stop), web.post('/homeassistant/stop', api_hass.stop),
web.post('/homeassistant/start', api_hass.start), web.post('/homeassistant/start', api_hass.start),
web.post('/homeassistant/check', api_hass.check), web.post('/homeassistant/check', api_hass.check),
web.post('/homeassistant/rebuild', api_hass.rebuild),
]) ])
def _register_proxy(self): def _register_proxy(self) -> None:
"""Register Home Assistant API Proxy.""" """Register Home Assistant API Proxy."""
api_proxy = APIProxy() api_proxy = APIProxy()
api_proxy.coresys = self.coresys api_proxy.coresys = self.coresys
@@ -147,7 +162,7 @@ class RestAPI(CoreSysAttributes):
web.get('/homeassistant/api/', api_proxy.api), web.get('/homeassistant/api/', api_proxy.api),
]) ])
def _register_addons(self): def _register_addons(self) -> None:
"""Register Add-on functions.""" """Register Add-on functions."""
api_addons = APIAddons() api_addons = APIAddons()
api_addons.coresys = self.coresys api_addons.coresys = self.coresys
@@ -173,7 +188,17 @@ class RestAPI(CoreSysAttributes):
web.get('/addons/{addon}/stats', api_addons.stats), web.get('/addons/{addon}/stats', api_addons.stats),
]) ])
def _register_snapshots(self): def _register_ingress(self) -> None:
"""Register Ingress functions."""
api_ingress = APIIngress()
api_ingress.coresys = self.coresys
self.webapp.add_routes([
web.post('/ingress/session', api_ingress.create_session),
web.view('/ingress/{token}/{path:.*}', api_ingress.handler),
])
def _register_snapshots(self) -> None:
"""Register snapshots functions.""" """Register snapshots functions."""
api_snapshots = APISnapshots() api_snapshots = APISnapshots()
api_snapshots.coresys = self.coresys api_snapshots.coresys = self.coresys
@@ -193,7 +218,7 @@ class RestAPI(CoreSysAttributes):
web.get('/snapshots/{snapshot}/download', api_snapshots.download), web.get('/snapshots/{snapshot}/download', api_snapshots.download),
]) ])
def _register_services(self): def _register_services(self) -> None:
"""Register services functions.""" """Register services functions."""
api_services = APIServices() api_services = APIServices()
api_services.coresys = self.coresys api_services.coresys = self.coresys
@@ -205,7 +230,7 @@ class RestAPI(CoreSysAttributes):
web.delete('/services/{service}', api_services.del_service), web.delete('/services/{service}', api_services.del_service),
]) ])
def _register_discovery(self): def _register_discovery(self) -> None:
"""Register discovery functions.""" """Register discovery functions."""
api_discovery = APIDiscovery() api_discovery = APIDiscovery()
api_discovery.coresys = self.coresys api_discovery.coresys = self.coresys
@@ -213,12 +238,11 @@ class RestAPI(CoreSysAttributes):
self.webapp.add_routes([ self.webapp.add_routes([
web.get('/discovery', api_discovery.list), web.get('/discovery', api_discovery.list),
web.get('/discovery/{uuid}', api_discovery.get_discovery), web.get('/discovery/{uuid}', api_discovery.get_discovery),
web.delete('/discovery/{uuid}', web.delete('/discovery/{uuid}', api_discovery.del_discovery),
api_discovery.del_discovery),
web.post('/discovery', api_discovery.set_discovery), web.post('/discovery', api_discovery.set_discovery),
]) ])
def _register_panel(self): def _register_panel(self) -> None:
"""Register panel for Home Assistant.""" """Register panel for Home Assistant."""
panel_dir = Path(__file__).parent.joinpath("panel") panel_dir = Path(__file__).parent.joinpath("panel")
@@ -228,8 +252,8 @@ class RestAPI(CoreSysAttributes):
return lambda request: web.FileResponse(path) return lambda request: web.FileResponse(path)
# This route is for backwards compatibility with HA < 0.58 # This route is for backwards compatibility with HA < 0.58
self.webapp.add_routes([ self.webapp.add_routes(
web.get('/panel', create_response('hassio-main-es5'))]) [web.get('/panel', create_response('hassio-main-es5'))])
# This route is for backwards compatibility with HA 0.58 - 0.61 # This route is for backwards compatibility with HA 0.58 - 0.61
self.webapp.add_routes([ self.webapp.add_routes([
@@ -246,7 +270,7 @@ class RestAPI(CoreSysAttributes):
# This route is for HA > 0.70 # This route is for HA > 0.70
self.webapp.add_routes([web.static('/app', panel_dir)]) self.webapp.add_routes([web.static('/app', panel_dir)])
async def start(self): async def start(self) -> None:
"""Run RESTful API webserver.""" """Run RESTful API webserver."""
await self._runner.setup() await self._runner.setup()
self._site = web.TCPSite( self._site = web.TCPSite(
@@ -255,12 +279,12 @@ class RestAPI(CoreSysAttributes):
try: try:
await self._site.start() await self._site.start()
except OSError as err: except OSError as err:
_LOGGER.fatal( _LOGGER.fatal("Failed to create HTTP server at 0.0.0.0:80 -> %s",
"Failed to create HTTP server at 0.0.0.0:80 -> %s", err) err)
else: else:
_LOGGER.info("Start API on %s", self.sys_docker.network.supervisor) _LOGGER.info("Start API on %s", self.sys_docker.network.supervisor)
async def stop(self): async def stop(self) -> None:
"""Stop RESTful API webserver.""" """Stop RESTful API webserver."""
if not self._site: if not self._site:
return return


@@ -1,29 +1,89 @@
"""Init file for Hass.io Home Assistant RESTful API.""" """Init file for Hass.io Home Assistant RESTful API."""
import asyncio import asyncio
import logging import logging
from typing import Any, Awaitable, Dict, List
from aiohttp import web
import voluptuous as vol import voluptuous as vol
from voluptuous.humanize import humanize_error from voluptuous.humanize import humanize_error
from .utils import api_process, api_process_raw, api_validate from ..addons.addon import Addon
from ..addons.utils import rating_security
from ..const import (
    ATTR_ADDONS,
    ATTR_APPARMOR,
    ATTR_ARCH,
    ATTR_AUDIO,
    ATTR_AUDIO_INPUT,
    ATTR_AUDIO_OUTPUT,
    ATTR_AUTH_API,
    ATTR_AUTO_UPDATE,
    ATTR_AVAILABLE,
    ATTR_BLK_READ,
    ATTR_BLK_WRITE,
    ATTR_BOOT,
    ATTR_BUILD,
    ATTR_CHANGELOG,
    ATTR_CPU_PERCENT,
    ATTR_DESCRIPTON,
    ATTR_DETACHED,
    ATTR_DEVICES,
    ATTR_DEVICETREE,
    ATTR_DISCOVERY,
    ATTR_DOCKER_API,
    ATTR_FULL_ACCESS,
    ATTR_GPIO,
    ATTR_HASSIO_API,
    ATTR_HASSIO_ROLE,
    ATTR_HOMEASSISTANT,
    ATTR_HOMEASSISTANT_API,
    ATTR_HOST_DBUS,
    ATTR_HOST_IPC,
    ATTR_HOST_NETWORK,
    ATTR_HOST_PID,
    ATTR_ICON,
    ATTR_INGRESS,
    ATTR_INGRESS_ENTRY,
    ATTR_INGRESS_URL,
    ATTR_INSTALLED,
    ATTR_IP_ADDRESS,
    ATTR_KERNEL_MODULES,
    ATTR_LAST_VERSION,
    ATTR_LOGO,
    ATTR_LONG_DESCRIPTION,
    ATTR_MACHINE,
    ATTR_MAINTAINER,
    ATTR_MEMORY_LIMIT,
    ATTR_MEMORY_USAGE,
    ATTR_NAME,
    ATTR_NETWORK,
    ATTR_NETWORK_RX,
    ATTR_NETWORK_TX,
    ATTR_OPTIONS,
    ATTR_PRIVILEGED,
    ATTR_PROTECTED,
    ATTR_RATING,
    ATTR_REPOSITORIES,
    ATTR_REPOSITORY,
    ATTR_SERVICES,
    ATTR_SLUG,
    ATTR_SOURCE,
    ATTR_STATE,
    ATTR_STDIN,
    ATTR_URL,
    ATTR_VERSION,
    ATTR_WEBUI,
    BOOT_AUTO,
    BOOT_MANUAL,
    CONTENT_TYPE_BINARY,
    CONTENT_TYPE_PNG,
    CONTENT_TYPE_TEXT,
    REQUEST_FROM,
)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError
from ..validate import ALSA_DEVICE, DOCKER_PORTS
from .utils import api_process, api_process_raw, api_validate

_LOGGER = logging.getLogger(__name__)
@@ -49,7 +109,7 @@ SCHEMA_SECURITY = vol.Schema({
class APIAddons(CoreSysAttributes):
    """Handle RESTful API for add-on functions."""

    def _extract_addon(self, request: web.Request, check_installed: bool = True) -> Addon:
        """Return addon, throw an exception if it doesn't exist."""
        addon_slug = request.match_info.get('addon')

@@ -59,15 +119,15 @@ class APIAddons(CoreSysAttributes):
        addon = self.sys_addons.get(addon_slug)
        if not addon:
            raise APIError("Addon does not exist")

        if check_installed and not addon.is_installed:
            raise APIError("Addon is not installed")

        return addon
    @api_process
    async def list(self, request: web.Request) -> Dict[str, Any]:
        """Return all add-ons or repositories."""
        data_addons = []
        for addon in self.sys_addons.list_addons:

@@ -102,13 +162,12 @@ class APIAddons(CoreSysAttributes):
        }

    @api_process
    async def reload(self, request: web.Request) -> None:
        """Reload all add-on data."""
        await asyncio.shield(self.sys_addons.reload())

    @api_process
    async def info(self, request: web.Request) -> Dict[str, Any]:
        """Return add-on information."""
        addon = self._extract_addon(request, check_installed=False)
@@ -128,6 +187,7 @@ class APIAddons(CoreSysAttributes):
            ATTR_OPTIONS: addon.options,
            ATTR_ARCH: addon.supported_arch,
            ATTR_MACHINE: addon.supported_machine,
            ATTR_HOMEASSISTANT: addon.homeassistant_version,
            ATTR_URL: addon.url,
            ATTR_DETACHED: addon.is_detached,
            ATTR_AVAILABLE: addon.available,

@@ -148,8 +208,10 @@ class APIAddons(CoreSysAttributes):
            ATTR_STDIN: addon.with_stdin,
            ATTR_HASSIO_API: addon.access_hassio_api,
            ATTR_HASSIO_ROLE: addon.hassio_role,
            ATTR_AUTH_API: addon.access_auth_api,
            ATTR_HOMEASSISTANT_API: addon.access_homeassistant_api,
            ATTR_GPIO: addon.with_gpio,
            ATTR_KERNEL_MODULES: addon.with_kernel_modules,
            ATTR_DEVICETREE: addon.with_devicetree,
            ATTR_DOCKER_API: addon.access_docker_api,
            ATTR_AUDIO: addon.with_audio,

@@ -157,17 +219,20 @@ class APIAddons(CoreSysAttributes):
            ATTR_AUDIO_OUTPUT: addon.audio_output,
            ATTR_SERVICES: _pretty_services(addon),
            ATTR_DISCOVERY: addon.discovery,
            ATTR_IP_ADDRESS: str(addon.ip_address),
            ATTR_INGRESS: addon.with_ingress,
            ATTR_INGRESS_ENTRY: addon.ingress_entry,
            ATTR_INGRESS_URL: addon.ingress_url,
        }
    @api_process
    async def options(self, request: web.Request) -> None:
        """Store user options for add-on."""
        addon = self._extract_addon(request)

        addon_schema = SCHEMA_OPTIONS.extend({
            vol.Optional(ATTR_OPTIONS): vol.Any(None, addon.schema),
        })

        body = await api_validate(addon_schema, request)
        if ATTR_OPTIONS in body:

@@ -184,10 +249,9 @@ class APIAddons(CoreSysAttributes):
            addon.audio_output = body[ATTR_AUDIO_OUTPUT]

        addon.save_data()
    @api_process
    async def security(self, request: web.Request) -> None:
        """Store security options for add-on."""
        addon = self._extract_addon(request)
        body = await api_validate(SCHEMA_SECURITY, request)

@@ -197,17 +261,13 @@ class APIAddons(CoreSysAttributes):
            addon.protected = body[ATTR_PROTECTED]

        addon.save_data()

    @api_process
    async def stats(self, request: web.Request) -> Dict[str, Any]:
        """Return resource information."""
        addon = self._extract_addon(request)
        stats = await addon.stats()

        return {
            ATTR_CPU_PERCENT: stats.cpu_percent,
            ATTR_MEMORY_USAGE: stats.memory_usage,

@@ -219,19 +279,19 @@ class APIAddons(CoreSysAttributes):
        }
    @api_process
    def install(self, request: web.Request) -> Awaitable[None]:
        """Install add-on."""
        addon = self._extract_addon(request, check_installed=False)
        return asyncio.shield(addon.install())

    @api_process
    def uninstall(self, request: web.Request) -> Awaitable[None]:
        """Uninstall add-on."""
        addon = self._extract_addon(request)
        return asyncio.shield(addon.uninstall())

    @api_process
    def start(self, request: web.Request) -> Awaitable[None]:
        """Start add-on."""
        addon = self._extract_addon(request)

@@ -240,89 +300,89 @@ class APIAddons(CoreSysAttributes):
        try:
            addon.schema(options)
        except vol.Invalid as ex:
            raise APIError(humanize_error(options, ex)) from None

        return asyncio.shield(addon.start())

    @api_process
    def stop(self, request: web.Request) -> Awaitable[None]:
        """Stop add-on."""
        addon = self._extract_addon(request)
        return asyncio.shield(addon.stop())

    @api_process
    def update(self, request: web.Request) -> Awaitable[None]:
        """Update add-on."""
        addon = self._extract_addon(request)

        if addon.last_version == addon.version_installed:
            raise APIError("No update available!")

        return asyncio.shield(addon.update())

    @api_process
    def restart(self, request: web.Request) -> Awaitable[None]:
        """Restart add-on."""
        addon = self._extract_addon(request)
        return asyncio.shield(addon.restart())

    @api_process
    def rebuild(self, request: web.Request) -> Awaitable[None]:
        """Rebuild local build add-on."""
        addon = self._extract_addon(request)
        if not addon.need_build:
            raise APIError("Only local build addons are supported")

        return asyncio.shield(addon.rebuild())
    @api_process_raw(CONTENT_TYPE_BINARY)
    def logs(self, request: web.Request) -> Awaitable[bytes]:
        """Return logs from add-on."""
        addon = self._extract_addon(request)
        return addon.logs()

    @api_process_raw(CONTENT_TYPE_PNG)
    async def icon(self, request: web.Request) -> bytes:
        """Return icon from add-on."""
        addon = self._extract_addon(request, check_installed=False)
        if not addon.with_icon:
            raise APIError("No icon found!")

        with addon.path_icon.open('rb') as png:
            return png.read()

    @api_process_raw(CONTENT_TYPE_PNG)
    async def logo(self, request: web.Request) -> bytes:
        """Return logo from add-on."""
        addon = self._extract_addon(request, check_installed=False)
        if not addon.with_logo:
            raise APIError("No logo found!")

        with addon.path_logo.open('rb') as png:
            return png.read()

    @api_process_raw(CONTENT_TYPE_TEXT)
    async def changelog(self, request: web.Request) -> str:
        """Return changelog from add-on."""
        addon = self._extract_addon(request, check_installed=False)
        if not addon.with_changelog:
            raise APIError("No changelog found!")

        with addon.path_changelog.open('r') as changelog:
            return changelog.read()

    @api_process
    async def stdin(self, request: web.Request) -> None:
        """Write to stdin of add-on."""
        addon = self._extract_addon(request)
        if not addon.with_stdin:
            raise APIError("STDIN not supported by add-on")

        data = await request.read()
        await asyncio.shield(addon.write_stdin(data))
def _pretty_devices(addon: Addon) -> List[str]:
    """Return a simplified device list."""
    dev_list = addon.devices
    if not dev_list:

@@ -330,7 +390,7 @@ def _pretty_devices(addon):
    return [row.split(':')[0] for row in dev_list]


def _pretty_services(addon: Addon) -> List[str]:
    """Return a simplified services role list."""
    services = []
    for name, access in addon.services_role.items():
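The two pretty-printing helpers at the end of `addons.py` are pure transformations of add-on metadata. A standalone sketch of their behavior, with hypothetical input data (the real functions take an `Addon` object, not raw lists/dicts):

```python
# Standalone sketch of the _pretty_devices/_pretty_services helpers.
# Input values here are illustrative, not taken from a real add-on.
from typing import Dict, List


def pretty_devices(dev_list: List[str]) -> List[str]:
    """Keep only the host device path from 'host:container:perms' mappings."""
    return [row.split(":")[0] for row in dev_list]


def pretty_services(services_role: Dict[str, str]) -> List[str]:
    """Render service roles as 'service:role' strings."""
    return [f"{name}:{access}" for name, access in services_role.items()]


devices = pretty_devices(["/dev/ttyUSB0:/dev/ttyUSB0:rwm", "/dev/mem:/dev/mem"])
roles = pretty_services({"mqtt": "want"})
```

The `services_role` rendering assumes the same `name:access` join the real helper uses; the device mapping format mirrors Docker's `host:container:permissions` convention.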

hassio/api/auth.py (new file, 61 lines)

@@ -0,0 +1,61 @@
"""Init file for Hass.io auth/SSO RESTful API."""
import logging
from aiohttp import BasicAuth
from aiohttp.web_exceptions import HTTPUnauthorized
from aiohttp.hdrs import CONTENT_TYPE, AUTHORIZATION, WWW_AUTHENTICATE
from .utils import api_process
from ..const import REQUEST_FROM, CONTENT_TYPE_JSON, CONTENT_TYPE_URL
from ..coresys import CoreSysAttributes
from ..exceptions import APIForbidden
_LOGGER = logging.getLogger(__name__)
class APIAuth(CoreSysAttributes):
"""Handle RESTful API for auth functions."""
def _process_basic(self, request, addon):
"""Process login request with basic auth.
Return a coroutine.
"""
auth = BasicAuth.decode(request.headers[AUTHORIZATION])
return self.sys_auth.check_login(addon, auth.login, auth.password)
def _process_dict(self, request, addon, data):
"""Process login with dict data.
Return a coroutine.
"""
username = data.get('username') or data.get('user')
password = data.get('password')
return self.sys_auth.check_login(addon, username, password)
@api_process
async def auth(self, request):
"""Process login request."""
addon = request[REQUEST_FROM]
if not addon.access_auth_api:
raise APIForbidden("Can't use Home Assistant auth!")
# BasicAuth
if AUTHORIZATION in request.headers:
return await self._process_basic(request, addon)
# Json
if request.headers.get(CONTENT_TYPE) == CONTENT_TYPE_JSON:
data = await request.json()
return await self._process_dict(request, addon, data)
# URL encoded
if request.headers.get(CONTENT_TYPE) == CONTENT_TYPE_URL:
data = await request.post()
return await self._process_dict(request, addon, data)
raise HTTPUnauthorized(headers={
WWW_AUTHENTICATE: "Basic realm=\"Hass.io Authentication\""
})
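The `auth` handler dispatches on request headers in a fixed order: Basic auth first, then a JSON body, then a form-encoded body, and finally a 401 challenge. A framework-free sketch of that dispatch (the strategy strings and content-type values are illustrative; the real handler awaits `sys_auth.check_login` coroutines and raises `HTTPUnauthorized`):

```python
# Simplified sketch of the header-based dispatch in APIAuth.auth.
# Returns a label instead of performing the login check.
from typing import Mapping

CONTENT_TYPE_JSON = "application/json"
CONTENT_TYPE_URL = "application/x-www-form-urlencoded"


def pick_auth_strategy(headers: Mapping[str, str]) -> str:
    """Mirror the order of checks in APIAuth.auth."""
    if "Authorization" in headers:
        return "basic"
    content_type = headers.get("Content-Type")
    if content_type == CONTENT_TYPE_JSON:
        return "json"
    if content_type == CONTENT_TYPE_URL:
        return "urlencoded"
    # Real handler raises HTTPUnauthorized with a WWW-Authenticate challenge.
    return "unauthorized"
```

Because `Authorization` wins over `Content-Type`, a request carrying both is treated as Basic auth, matching the handler's top-down ordering.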

hassio/api/discovery.py

@@ -3,19 +3,24 @@ import voluptuous as vol
from .utils import api_process, api_validate
from ..const import (
    ATTR_ADDON,
    ATTR_UUID,
    ATTR_CONFIG,
    ATTR_DISCOVERY,
    ATTR_SERVICE,
    REQUEST_FROM,
)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError, APIForbidden
from ..discovery.validate import valid_discovery_service

SCHEMA_DISCOVERY = vol.Schema(
    {
        vol.Required(ATTR_SERVICE): valid_discovery_service,
        vol.Optional(ATTR_CONFIG): vol.Maybe(dict),
    }
)


class APIDiscovery(CoreSysAttributes):

@@ -23,7 +28,7 @@ class APIDiscovery(CoreSysAttributes):
    def _extract_message(self, request):
        """Extract discovery message from URL."""
        message = self.sys_discovery.get(request.match_info.get("uuid"))
        if not message:
            raise APIError("Discovery message not found")
        return message

@@ -40,14 +45,14 @@ class APIDiscovery(CoreSysAttributes):
        discovery = []
        for message in self.sys_discovery.list_messages:
            discovery.append(
                {
                    ATTR_ADDON: message.addon,
                    ATTR_SERVICE: message.service,
                    ATTR_UUID: message.uuid,
                    ATTR_CONFIG: message.config,
                }
            )

        return {ATTR_DISCOVERY: discovery}

@@ -78,8 +83,6 @@ class APIDiscovery(CoreSysAttributes):
            ATTR_ADDON: message.addon,
            ATTR_SERVICE: message.service,
            ATTR_UUID: message.uuid,
            ATTR_CONFIG: message.config,
        }

hassio/api/hassos.py

@@ -1,27 +1,31 @@
"""Init file for Hass.io HassOS RESTful API."""
import asyncio
import logging
from typing import Any, Awaitable, Dict

import voluptuous as vol
from aiohttp import web

from ..const import (
    ATTR_BOARD,
    ATTR_VERSION,
    ATTR_VERSION_CLI,
    ATTR_VERSION_CLI_LATEST,
    ATTR_VERSION_LATEST,
)
from ..coresys import CoreSysAttributes
from .utils import api_process, api_validate

_LOGGER = logging.getLogger(__name__)

SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})


class APIHassOS(CoreSysAttributes):
    """Handle RESTful API for HassOS functions."""

    @api_process
    async def info(self, request: web.Request) -> Dict[str, Any]:
        """Return HassOS information."""
        return {
            ATTR_VERSION: self.sys_hassos.version,

@@ -32,7 +36,7 @@ class APIHassOS(CoreSysAttributes):
        }

    @api_process
    async def update(self, request: web.Request) -> None:
        """Update HassOS."""
        body = await api_validate(SCHEMA_VERSION, request)
        version = body.get(ATTR_VERSION, self.sys_hassos.version_latest)

@@ -40,7 +44,7 @@ class APIHassOS(CoreSysAttributes):
        await asyncio.shield(self.sys_hassos.update(version))

    @api_process
    async def update_cli(self, request: web.Request) -> None:
        """Update HassOS CLI."""
        body = await api_validate(SCHEMA_VERSION, request)
        version = body.get(ATTR_VERSION, self.sys_hassos.version_cli_latest)

@@ -48,6 +52,6 @@ class APIHassOS(CoreSysAttributes):
        await asyncio.shield(self.sys_hassos.update_cli(version))

    @api_process
    def config_sync(self, request: web.Request) -> Awaitable[None]:
        """Trigger config reload on HassOS."""
        return asyncio.shield(self.sys_hassos.config_sync())

hassio/api/homeassistant.py

@@ -1,54 +1,72 @@
"""Init file for Hass.io Home Assistant RESTful API."""
import asyncio
import logging
from typing import Coroutine, Dict, Any

import voluptuous as vol
from aiohttp import web

from ..const import (
    ATTR_ARCH,
    ATTR_BLK_READ,
    ATTR_BLK_WRITE,
    ATTR_BOOT,
    ATTR_CPU_PERCENT,
    ATTR_CUSTOM,
    ATTR_IMAGE,
    ATTR_LAST_VERSION,
    ATTR_MACHINE,
    ATTR_MEMORY_LIMIT,
    ATTR_MEMORY_USAGE,
    ATTR_NETWORK_RX,
    ATTR_NETWORK_TX,
    ATTR_PASSWORD,
    ATTR_PORT,
    ATTR_REFRESH_TOKEN,
    ATTR_SSL,
    ATTR_VERSION,
    ATTR_WAIT_BOOT,
    ATTR_WATCHDOG,
    ATTR_IP_ADDRESS,
    CONTENT_TYPE_BINARY,
)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError
from ..validate import DOCKER_IMAGE, NETWORK_PORT
from .utils import api_process, api_process_raw, api_validate

_LOGGER = logging.getLogger(__name__)

# pylint: disable=no-value-for-parameter
SCHEMA_OPTIONS = vol.Schema(
    {
        vol.Optional(ATTR_BOOT): vol.Boolean(),
        vol.Inclusive(ATTR_IMAGE, "custom_hass"): vol.Maybe(vol.Coerce(str)),
        vol.Inclusive(ATTR_LAST_VERSION, "custom_hass"): vol.Any(None, DOCKER_IMAGE),
        vol.Optional(ATTR_PORT): NETWORK_PORT,
        vol.Optional(ATTR_PASSWORD): vol.Maybe(vol.Coerce(str)),
        vol.Optional(ATTR_SSL): vol.Boolean(),
        vol.Optional(ATTR_WATCHDOG): vol.Boolean(),
        vol.Optional(ATTR_WAIT_BOOT): vol.All(vol.Coerce(int), vol.Range(min=60)),
        vol.Optional(ATTR_REFRESH_TOKEN): vol.Maybe(vol.Coerce(str)),
    }
)

SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})


class APIHomeAssistant(CoreSysAttributes):
    """Handle RESTful API for Home Assistant functions."""

    @api_process
    async def info(self, request: web.Request) -> Dict[str, Any]:
        """Return host information."""
        return {
            ATTR_VERSION: self.sys_homeassistant.version,
            ATTR_LAST_VERSION: self.sys_homeassistant.last_version,
            ATTR_MACHINE: self.sys_homeassistant.machine,
            ATTR_IP_ADDRESS: str(self.sys_homeassistant.ip_address),
            ATTR_ARCH: self.sys_homeassistant.arch,
            ATTR_IMAGE: self.sys_homeassistant.image,
            ATTR_CUSTOM: self.sys_homeassistant.is_custom_image,
            ATTR_BOOT: self.sys_homeassistant.boot,

@@ -59,7 +77,7 @@ class APIHomeAssistant(CoreSysAttributes):
        }

    @api_process
    async def options(self, request: web.Request) -> None:
        """Set Home Assistant options."""
        body = await api_validate(SCHEMA_OPTIONS, request)

@@ -75,6 +93,7 @@ class APIHomeAssistant(CoreSysAttributes):
        if ATTR_PASSWORD in body:
            self.sys_homeassistant.api_password = body[ATTR_PASSWORD]
            self.sys_homeassistant.refresh_token = None

        if ATTR_SSL in body:
            self.sys_homeassistant.api_ssl = body[ATTR_SSL]

@@ -91,7 +110,7 @@ class APIHomeAssistant(CoreSysAttributes):
        self.sys_homeassistant.save_data()

    @api_process
    async def stats(self, request: web.Request) -> Dict[str, Any]:
        """Return resource information."""
        stats = await self.sys_homeassistant.stats()
        if not stats:

@@ -108,7 +127,7 @@ class APIHomeAssistant(CoreSysAttributes):
        }

    @api_process
    async def update(self, request: web.Request) -> None:
        """Update Home Assistant."""
        body = await api_validate(SCHEMA_VERSION, request)
        version = body.get(ATTR_VERSION, self.sys_homeassistant.last_version)

@@ -116,30 +135,33 @@ class APIHomeAssistant(CoreSysAttributes):
        await asyncio.shield(self.sys_homeassistant.update(version))

    @api_process
    def stop(self, request: web.Request) -> Coroutine:
        """Stop Home Assistant."""
        return asyncio.shield(self.sys_homeassistant.stop())

    @api_process
    def start(self, request: web.Request) -> Coroutine:
        """Start Home Assistant."""
        return asyncio.shield(self.sys_homeassistant.start())

    @api_process
    def restart(self, request: web.Request) -> Coroutine:
        """Restart Home Assistant."""
        return asyncio.shield(self.sys_homeassistant.restart())

    @api_process
    def rebuild(self, request: web.Request) -> Coroutine:
        """Rebuild Home Assistant."""
        return asyncio.shield(self.sys_homeassistant.rebuild())

    @api_process_raw(CONTENT_TYPE_BINARY)
    def logs(self, request: web.Request) -> Coroutine:
        """Return Home Assistant Docker logs."""
        return self.sys_homeassistant.logs()

    @api_process
    async def check(self, request: web.Request) -> None:
        """Check configuration of Home Assistant."""
        result = await self.sys_homeassistant.check_config()
        if not result.valid:
            raise APIError(result.log)
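`vol.Inclusive(..., "custom_hass")` ties `image` and `last_version` together: a request may supply both keys or neither, so a custom image always arrives with its version channel. A dependency-free sketch of that constraint (illustrative only, not the voluptuous implementation):

```python
# Illustrative check mirroring vol.Inclusive(..., "custom_hass"):
# the two custom-image keys must be supplied together or not at all.
from typing import Any, Dict


def validate_custom_hass(body: Dict[str, Any]) -> bool:
    """Return True if 'image'/'last_version' are both present or both absent."""
    keys = ("image", "last_version")
    present = [key in body for key in keys]
    return all(present) or not any(present)
```

In the real schema, voluptuous raises `vol.Invalid` when only one key of an inclusive group is present; the sketch just reports the condition as a boolean.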

hassio/api/info.py (new file, 28 lines)

@@ -0,0 +1,28 @@
"""Init file for Hass.io info RESTful API."""
import logging
from ..const import (ATTR_ARCH, ATTR_CHANNEL, ATTR_HASSOS, ATTR_HOMEASSISTANT,
ATTR_HOSTNAME, ATTR_MACHINE, ATTR_SUPERVISOR,
ATTR_SUPPORTED_ARCH)
from ..coresys import CoreSysAttributes
from .utils import api_process
_LOGGER = logging.getLogger(__name__)
class APIInfo(CoreSysAttributes):
"""Handle RESTful API for info functions."""
@api_process
async def info(self, request):
"""Show system info."""
return {
ATTR_SUPERVISOR: self.sys_supervisor.version,
ATTR_HOMEASSISTANT: self.sys_homeassistant.version,
ATTR_HASSOS: self.sys_hassos.version,
ATTR_HOSTNAME: self.sys_host.info.hostname,
ATTR_MACHINE: self.sys_machine,
ATTR_ARCH: self.sys_arch.default,
ATTR_SUPPORTED_ARCH: self.sys_arch.supported,
ATTR_CHANNEL: self.sys_updater.channel,
}

hassio/api/ingress.py (new file, 217 lines)

@@ -0,0 +1,217 @@
"""Hass.io Add-on ingress service."""
import asyncio
from ipaddress import ip_address
import logging
from typing import Any, Dict, Union
import aiohttp
from aiohttp import hdrs, web
from aiohttp.web_exceptions import (
HTTPBadGateway,
HTTPServiceUnavailable,
HTTPUnauthorized,
)
from multidict import CIMultiDict, istr
from ..addons.addon import Addon
from ..const import ATTR_SESSION, HEADER_TOKEN, REQUEST_FROM, COOKIE_INGRESS
from ..coresys import CoreSysAttributes
from .utils import api_process
_LOGGER = logging.getLogger(__name__)
class APIIngress(CoreSysAttributes):
"""Ingress view to handle add-on webui routing."""
def _extract_addon(self, request: web.Request) -> Addon:
"""Return addon, throw an exception it it doesn't exist."""
token = request.match_info.get("token")
# Find correct add-on
addon = self.sys_ingress.get(token)
if not addon:
_LOGGER.warning("Ingress for %s not available", token)
raise HTTPServiceUnavailable()
return addon
def _check_ha_access(self, request: web.Request) -> None:
if request[REQUEST_FROM] != self.sys_homeassistant:
_LOGGER.warning("Ingress is only available behind Home Assistant")
raise HTTPUnauthorized()
def _create_url(self, addon: Addon, path: str) -> str:
"""Create URL to container."""
return f"{addon.ingress_internal}/{path}"
@api_process
async def create_session(self, request: web.Request) -> Dict[str, Any]:
"""Create a new session."""
self._check_ha_access(request)
session = self.sys_ingress.create_session()
return {ATTR_SESSION: session}
async def handler(
self, request: web.Request
) -> Union[web.Response, web.StreamResponse, web.WebSocketResponse]:
"""Route data to Hass.io ingress service."""
self._check_ha_access(request)
# Check Ingress Session
session = request.cookies.get(COOKIE_INGRESS)
if not self.sys_ingress.validate_session(session):
_LOGGER.warning("No valid ingress session %s", session)
raise HTTPUnauthorized()
# Process requests
addon = self._extract_addon(request)
path = request.match_info.get("path")
try:
# Websocket
if _is_websocket(request):
return await self._handle_websocket(request, addon, path)
# Request
return await self._handle_request(request, addon, path)
except aiohttp.ClientError as err:
_LOGGER.error("Ingress error: %s", err)
raise HTTPBadGateway() from None
async def _handle_websocket(
self, request: web.Request, addon: Addon, path: str
) -> web.WebSocketResponse:
"""Ingress route for websocket."""
ws_server = web.WebSocketResponse()
await ws_server.prepare(request)
# Preparing
url = self._create_url(addon, path)
source_header = _init_header(request, addon)
# Support GET query
if request.query_string:
url = "{}?{}".format(url, request.query_string)
# Start proxy
async with self.sys_websession.ws_connect(
url, headers=source_header
) as ws_client:
# Proxy requests
await asyncio.wait(
[
_websocket_forward(ws_server, ws_client),
_websocket_forward(ws_client, ws_server),
],
return_when=asyncio.FIRST_COMPLETED,
)
return ws_server
async def _handle_request(
self, request: web.Request, addon: Addon, path: str
) -> Union[web.Response, web.StreamResponse]:
"""Ingress route for request."""
url = self._create_url(addon, path)
data = await request.read()
source_header = _init_header(request, addon)
async with self.sys_websession.request(
request.method, url, headers=source_header, params=request.query, data=data
) as result:
headers = _response_header(result)
# Simple request
if (
hdrs.CONTENT_LENGTH in result.headers
and int(result.headers.get(hdrs.CONTENT_LENGTH, 0)) < 4_194_000
):
# Return Response
body = await result.read()
return web.Response(headers=headers, status=result.status, body=body)
# Stream response
response = web.StreamResponse(status=result.status, headers=headers)
response.content_type = result.content_type
try:
await response.prepare(request)
async for data in result.content.iter_chunked(4096):
await response.write(data)
except (aiohttp.ClientError, aiohttp.ClientPayloadError) as err:
_LOGGER.error("Stream error with %s: %s", url, err)
return response
def _init_header(
    request: web.Request, addon: Addon
) -> Union[CIMultiDict, Dict[str, str]]:
    """Create initial header."""
    headers = {}

    # filter flags
    for name, value in request.headers.items():
        if name in (
            hdrs.CONTENT_LENGTH,
            hdrs.CONTENT_TYPE,
            hdrs.CONTENT_ENCODING,
            istr(HEADER_TOKEN),
        ):
            continue
        headers[name] = value

    # Update X-Forwarded-For
    forward_for = request.headers.get(hdrs.X_FORWARDED_FOR)
    connected_ip = ip_address(request.transport.get_extra_info("peername")[0])
    if forward_for:
        headers[hdrs.X_FORWARDED_FOR] = f"{forward_for}, {connected_ip!s}"
    else:
        headers[hdrs.X_FORWARDED_FOR] = str(connected_ip)

    return headers
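The X-Forwarded-For update above appends the directly connected peer to any existing chain; when the client sent no such header, the chain starts with just the peer address. A pure-Python sketch of that append (the helper name is illustrative):

```python
def append_forwarded_for(existing, connected_ip: str) -> str:
    """Extend an X-Forwarded-For chain with the directly connected peer."""
    if existing:
        return f"{existing}, {connected_ip}"
    return connected_ip

chain = append_forwarded_for(None, "203.0.113.7")   # first hop: client IP only
chain = append_forwarded_for(chain, "172.30.32.2")  # ingress proxy hop
print(chain)  # 203.0.113.7, 172.30.32.2
```

Each proxy in the path appends exactly one address, so the add-on behind ingress can still recover the original client IP from the head of the chain.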
def _response_header(response: aiohttp.ClientResponse) -> Dict[str, str]:
    """Create response header."""
    headers = {}

    for name, value in response.headers.items():
        if name in (
            hdrs.TRANSFER_ENCODING,
            hdrs.CONTENT_LENGTH,
            hdrs.CONTENT_TYPE,
            hdrs.CONTENT_ENCODING,
        ):
            continue
        headers[name] = value

    return headers
def _is_websocket(request: web.Request) -> bool:
    """Return True if request is a websocket."""
    headers = request.headers

    if (
        headers.get(hdrs.CONNECTION) == "Upgrade"
        and headers.get(hdrs.UPGRADE) == "websocket"
    ):
        return True
    return False
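The upgrade detection above compares both headers verbatim. A standalone sketch with a plain dict instead of aiohttp's case-insensitive multidict; note that a client sending `Connection: keep-alive, Upgrade` (which HTTP permits) would not match this exact comparison:

```python
def looks_like_websocket(headers: dict) -> bool:
    """Mirror the exact-match upgrade detection above."""
    return (
        headers.get("Connection") == "Upgrade"
        and headers.get("Upgrade") == "websocket"
    )

print(looks_like_websocket({"Connection": "Upgrade", "Upgrade": "websocket"}))
# True: plain browser websocket handshake
print(looks_like_websocket({"Connection": "keep-alive, Upgrade", "Upgrade": "websocket"}))
# False: the Connection token list is not matched by exact comparison
```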
async def _websocket_forward(ws_from, ws_to):
    """Handle websocket message directly."""
    async for msg in ws_from:
        if msg.type == aiohttp.WSMsgType.TEXT:
            await ws_to.send_str(msg.data)
        elif msg.type == aiohttp.WSMsgType.BINARY:
            await ws_to.send_bytes(msg.data)
        elif msg.type == aiohttp.WSMsgType.PING:
            await ws_to.ping()
        elif msg.type == aiohttp.WSMsgType.PONG:
            await ws_to.pong()
        elif ws_to.closed:
            await ws_to.close(code=ws_to.close_code, message=msg.extra)

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,32 @@
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
* @fileoverview
* @suppress {checkPrototypalTypes}
* @license Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt The complete set of authors may be found
* at http://polymer.github.io/AUTHORS.txt The complete set of contributors may
* be found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by
* Google as part of the polymer project is also subject to an additional IP
* rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/

Binary file not shown.

View File

@@ -0,0 +1 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.1ac383635811d6c2cb4b.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,180 @@
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright 2018 Google Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
/*! *****************************************************************************
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use
this file except in compliance with the License. You may obtain a copy of the
License at http://www.apache.org/licenses/LICENSE-2.0
THIS CODE IS PROVIDED ON AN *AS IS* BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, EITHER EXPRESS OR IMPLIED, INCLUDING WITHOUT LIMITATION ANY IMPLIED
WARRANTIES OR CONDITIONS OF TITLE, FITNESS FOR A PARTICULAR PURPOSE,
MERCHANTABLITY OR NON-INFRINGEMENT.
See the Apache Version 2.0 License for specific language governing permissions
and limitations under the License.
***************************************************************************** */
/**
* @license
* Copyright 2016 Google Inc.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
/**
* @license
* Copyright 2018 Google Inc.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
/**
@license
Copyright (c) 2019 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2018 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2014 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/

Binary file not shown.

View File

@@ -0,0 +1 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.31b41b04602ce627ad98.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

Binary file not shown.

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -1,652 +0,0 @@
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2014 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/

File diff suppressed because one or more lines are too long

Binary file not shown.

@@ -1,2 +0,0 @@
(window.webpackJsonp=window.webpackJsonp||[]).push([[4],{102:function(n,r,t){"use strict";t.r(r),t.d(r,"marked",function(){return a}),t.d(r,"filterXSS",function(){return c});var e=t(91),i=t.n(e),o=t(93),u=t.n(o),a=i.a,c=u.a}}]);
//# sourceMappingURL=chunk.7ee37c2565bcf2d88182.js.map

@@ -1 +0,0 @@
{"version":3,"sources":["webpack:///../src/resources/load_markdown.js"],"names":["__webpack_require__","r","__webpack_exports__","d","marked","filterXSS","marked__WEBPACK_IMPORTED_MODULE_0__","marked__WEBPACK_IMPORTED_MODULE_0___default","n","xss__WEBPACK_IMPORTED_MODULE_1__","xss__WEBPACK_IMPORTED_MODULE_1___default","a"],"mappings":"0FAAAA,EAAAC,EAAAC,GAAAF,EAAAG,EAAAD,EAAA,2BAAAE,IAAAJ,EAAAG,EAAAD,EAAA,8BAAAG,IAAA,IAAAC,EAAAN,EAAA,IAAAO,EAAAP,EAAAQ,EAAAF,GAAAG,EAAAT,EAAA,IAAAU,EAAAV,EAAAQ,EAAAC,GAGaL,EAASG,EAAAI,EACTN,EAAYK,EAAAC","file":"chunk.7ee37c2565bcf2d88182.js","sourcesContent":["import marked_ from 'marked';\nimport filterXSS_ from 'xss';\n\nexport const marked = marked_;\nexport const filterXSS = filterXSS_;\n"],"sourceRoot":""}

@@ -0,0 +1 @@
(window.webpackJsonp=window.webpackJsonp||[]).push([[4],{114:function(n,r,t){"use strict";t.r(r),t.d(r,"marked",function(){return a}),t.d(r,"filterXSS",function(){return c});var e=t(104),i=t.n(e),o=t(106),u=t.n(o),a=i.a,c=u.a}}]);

@@ -1,401 +0,0 @@
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

Binary file not shown.


@@ -1,2 +1 @@
!function(e){function n(n){for(var t,o,a=n[0],i=n[1],u=0,l=[];u<a.length;u++)o=a[u],r[o]&&l.push(r[o][0]),r[o]=0;for(t in i)Object.prototype.hasOwnProperty.call(i,t)&&(e[t]=i[t]);for(c&&c(n);l.length;)l.shift()()}var t={},r={1:0};function o(n){if(t[n])return t[n].exports;var r=t[n]={i:n,l:!1,exports:{}};return e[n].call(r.exports,r,r.exports,o),r.l=!0,r.exports}o.e=function(e){var n=[],t=r[e];if(0!==t)if(t)n.push(t[2]);else{var a=new Promise(function(n,o){t=r[e]=[n,o]});n.push(t[2]=a);var i,u=document.getElementsByTagName("head")[0],c=document.createElement("script");c.charset="utf-8",c.timeout=120,o.nc&&c.setAttribute("nonce",o.nc),c.src=function(e){return o.p+"chunk."+{0:"a8fa5591357cce978816",2:"457ac71b0904d7243237",3:"57f5b43a82b988080555",4:"7ee37c2565bcf2d88182",5:"72a6da063fe4cb6308e8",6:"ad9001ac29bd3acbb520"}[e]+".js"}(e),i=function(n){c.onerror=c.onload=null,clearTimeout(l);var t=r[e];if(0!==t){if(t){var o=n&&("load"===n.type?"missing":n.type),a=n&&n.target&&n.target.src,i=new Error("Loading chunk "+e+" failed.\n("+o+": "+a+")");i.type=o,i.request=a,t[1](i)}r[e]=void 0}};var l=setTimeout(function(){i({type:"timeout",target:c})},12e4);c.onerror=c.onload=i,u.appendChild(c)}return Promise.all(n)},o.m=e,o.c=t,o.d=function(e,n,t){o.o(e,n)||Object.defineProperty(e,n,{enumerable:!0,get:t})},o.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},o.t=function(e,n){if(1&n&&(e=o(e)),8&n)return e;if(4&n&&"object"==typeof e&&e&&e.__esModule)return e;var t=Object.create(null);if(o.r(t),Object.defineProperty(t,"default",{enumerable:!0,value:e}),2&n&&"string"!=typeof e)for(var r in e)o.d(t,r,function(n){return e[n]}.bind(null,r));return t},o.n=function(e){var n=e&&e.__esModule?function(){return e.default}:function(){return e};return o.d(n,"a",n),n},o.o=function(e,n){return 
Object.prototype.hasOwnProperty.call(e,n)},o.p="/api/hassio/app/",o.oe=function(e){throw console.error(e),e};var a=window.webpackJsonp=window.webpackJsonp||[],i=a.push.bind(a);a.push=n,a=a.slice();for(var u=0;u<a.length;u++)n(a[u]);var c=i;o(o.s=0)}([function(e,n,t){window.loadES5Adapter().then(function(){Promise.all([t.e(0),t.e(2)]).then(t.bind(null,2)),Promise.all([t.e(0),t.e(5),t.e(3)]).then(t.bind(null,1))})}]); !function(e){function n(n){for(var t,o,i=n[0],a=n[1],u=0,f=[];u<i.length;u++)o=i[u],r[o]&&f.push(r[o][0]),r[o]=0;for(t in a)Object.prototype.hasOwnProperty.call(a,t)&&(e[t]=a[t]);for(c&&c(n);f.length;)f.shift()()}var t={},r={1:0};function o(n){if(t[n])return t[n].exports;var r=t[n]={i:n,l:!1,exports:{}};return e[n].call(r.exports,r,r.exports,o),r.l=!0,r.exports}o.e=function(e){var n=[],t=r[e];if(0!==t)if(t)n.push(t[2]);else{var i=new Promise(function(n,o){t=r[e]=[n,o]});n.push(t[2]=i);var a,u=document.createElement("script");u.charset="utf-8",u.timeout=120,o.nc&&u.setAttribute("nonce",o.nc),u.src=function(e){return o.p+"chunk."+{0:"1ac383635811d6c2cb4b",2:"381b1e7d41316cfb583c",3:"a6e3bc73416702354e6d",4:"8a4a3a3274af0f09d86b",5:"7589a9f39a552ee63688",6:"31b41b04602ce627ad98",7:"ff45557361d5d6bd46af"}[e]+".js"}(e),a=function(n){u.onerror=u.onload=null,clearTimeout(c);var t=r[e];if(0!==t){if(t){var o=n&&("load"===n.type?"missing":n.type),i=n&&n.target&&n.target.src,a=new Error("Loading chunk "+e+" failed.\n("+o+": "+i+")");a.type=o,a.request=i,t[1](a)}r[e]=void 0}};var c=setTimeout(function(){a({type:"timeout",target:u})},12e4);u.onerror=u.onload=a,document.head.appendChild(u)}return Promise.all(n)},o.m=e,o.c=t,o.d=function(e,n,t){o.o(e,n)||Object.defineProperty(e,n,{enumerable:!0,get:t})},o.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},o.t=function(e,n){if(1&n&&(e=o(e)),8&n)return e;if(4&n&&"object"==typeof 
e&&e&&e.__esModule)return e;var t=Object.create(null);if(o.r(t),Object.defineProperty(t,"default",{enumerable:!0,value:e}),2&n&&"string"!=typeof e)for(var r in e)o.d(t,r,function(n){return e[n]}.bind(null,r));return t},o.n=function(e){var n=e&&e.__esModule?function(){return e.default}:function(){return e};return o.d(n,"a",n),n},o.o=function(e,n){return Object.prototype.hasOwnProperty.call(e,n)},o.p="/api/hassio/app/",o.oe=function(e){throw console.error(e),e};var i=window.webpackJsonp=window.webpackJsonp||[],a=i.push.bind(i);i.push=n,i=i.slice();for(var u=0;u<i.length;u++)n(i[u]);var c=a;o(o.s=0)}([function(e,n,t){window.loadES5Adapter().then(function(){Promise.all([t.e(0),t.e(2)]).then(t.bind(null,2)),Promise.all([t.e(0),t.e(6),t.e(3)]).then(t.bind(null,1))});var r=document.createElement("style");r.innerHTML="\nbody {\n font-family: Roboto, sans-serif;\n -moz-osx-font-smoothing: grayscale;\n -webkit-font-smoothing: antialiased;\n font-weight: 400;\n margin: 0;\n padding: 0;\n height: 100vh;\n}\n",document.head.appendChild(r)}]);
//# sourceMappingURL=entrypoint.js.map

Binary file not shown.

File diff suppressed because one or more lines are too long


@@ -5,14 +5,15 @@ import logging
 import aiohttp
 from aiohttp import web
-from aiohttp.web_exceptions import (
-    HTTPBadGateway, HTTPInternalServerError, HTTPUnauthorized)
+from aiohttp.web_exceptions import HTTPBadGateway, HTTPUnauthorized
+from aiohttp.client_exceptions import ClientConnectorError
 from aiohttp.hdrs import CONTENT_TYPE, AUTHORIZATION
 import async_timeout

 from ..const import HEADER_HA_ACCESS
 from ..coresys import CoreSysAttributes
-from ..exceptions import HomeAssistantAuthError, HomeAssistantAPIError
+from ..exceptions import (
+    HomeAssistantAuthError, HomeAssistantAPIError, APIError)

 _LOGGER = logging.getLogger(__name__)
@@ -34,7 +35,7 @@ class APIProxy(CoreSysAttributes):
         elif not addon.access_homeassistant_api:
             _LOGGER.warning("Not permitted API access: %s", addon.slug)
         else:
-            _LOGGER.info("%s access from %s", request.path, addon.slug)
+            _LOGGER.debug("%s access from %s", request.path, addon.slug)
             return

         raise HTTPUnauthorized()
@@ -82,19 +83,13 @@ class APIProxy(CoreSysAttributes):
             response.content_type = request.headers.get(CONTENT_TYPE)
             try:
                 await response.prepare(request)
-                while True:
-                    data = await client.content.read(10)
-                    if not data:
-                        break
+                async for data in client.content:
                     await response.write(data)

-            except aiohttp.ClientError:
+            except (aiohttp.ClientError, aiohttp.ClientPayloadError):
                 pass
-            finally:
-                client.close()

             _LOGGER.info("Home Assistant EventStream close")
             return response

     async def api(self, request):
@@ -117,7 +112,7 @@ class APIProxy(CoreSysAttributes):
         try:
             client = await self.sys_websession_ssl.ws_connect(
-                url, heartbeat=60, verify_ssl=False)
+                url, heartbeat=30, verify_ssl=False)

             # Handle authentication
             data = await client.receive_json()
@@ -129,7 +124,7 @@ class APIProxy(CoreSysAttributes):
                 # Invalid protocol
                 _LOGGER.error(
                     "Got unexpected response from HA WebSocket: %s", data)
-                raise HTTPBadGateway()
+                raise APIError()

             if self.sys_homeassistant.refresh_token:
                 await self.sys_homeassistant.ensure_access_token()
@@ -149,26 +144,25 @@ class APIProxy(CoreSysAttributes):
                 return client

             # Renew the Token is invalid
-            if (data.get('type') == 'invalid_auth' and
-                    self.sys_homeassistant.refresh_token):
+            if data.get('type') == 'invalid_auth' and self.sys_homeassistant.refresh_token:
                 self.sys_homeassistant.access_token = None
                 return await self._websocket_client()

             raise HomeAssistantAuthError()

-        except (RuntimeError, ValueError) as err:
+        except (RuntimeError, ValueError, ClientConnectorError) as err:
             _LOGGER.error("Client error on WebSocket API %s.", err)
-        except HomeAssistantAuthError as err:
+        except HomeAssistantAuthError:
             _LOGGER.error("Failed authentication to Home Assistant WebSocket")

-        raise HTTPBadGateway()
+        raise APIError()

     async def websocket(self, request):
         """Initialize a WebSocket API connection."""
         _LOGGER.info("Home Assistant WebSocket API request initialize")

         # init server
-        server = web.WebSocketResponse(heartbeat=60)
+        server = web.WebSocketResponse(heartbeat=30)
         await server.prepare(request)

         # handle authentication
@@ -180,8 +174,7 @@ class APIProxy(CoreSysAttributes):
             # Check API access
             response = await server.receive_json()
-            hassio_token = (response.get('api_password') or
-                            response.get('access_token'))
+            hassio_token = response.get('api_password') or response.get('access_token')
             addon = self.sys_addons.from_token(hassio_token)

             if not addon or not addon.access_homeassistant_api:
@@ -200,10 +193,13 @@ class APIProxy(CoreSysAttributes):
             })
         except (RuntimeError, ValueError) as err:
             _LOGGER.error("Can't initialize handshake: %s", err)
-            raise HTTPInternalServerError() from None
+            return server

         # init connection to hass
-        client = await self._websocket_client()
+        try:
+            client = await self._websocket_client()
+        except APIError:
+            return server

         _LOGGER.info("Home Assistant WebSocket API request running")
         try:
@@ -238,7 +234,7 @@ class APIProxy(CoreSysAttributes):
         except asyncio.CancelledError:
             pass
-        except RuntimeError as err:
+        except (RuntimeError, ConnectionError, TypeError) as err:
             _LOGGER.info("Home Assistant WebSocket API error: %s", err)

         finally:
@@ -248,7 +244,9 @@ class APIProxy(CoreSysAttributes):
             server_read.cancel()

             # close connections
-            await client.close()
-            await server.close()
+            if not client.closed:
+                await client.close()
+            if not server.closed:
+                await server.close()

         _LOGGER.info("Home Assistant WebSocket API connection is closed")
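The EventStream simplification works because aiohttp's `StreamReader` is an async iterator, so `async for data in client.content` replaces the manual `read(10)`/`break` loop. A minimal sketch of the pattern, with a stand-in async generator instead of a live aiohttp response:

```python
import asyncio

# Stand-in for client.content: any async iterator of byte chunks behaves
# the same way under `async for` as aiohttp's StreamReader.
async def content():
    for chunk in (b"event: ping\n", b"data: {}\n\n"):
        yield chunk

async def stream():
    out = []
    async for data in content():  # no explicit read()/empty-check/break needed
        out.append(data)
    return b"".join(out)

print(asyncio.run(stream()))  # b'event: ping\ndata: {}\n\n'
```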


@@ -6,12 +6,19 @@ from aiohttp.web import middleware
 from aiohttp.web_exceptions import HTTPUnauthorized, HTTPForbidden

 from ..const import (
-    HEADER_TOKEN, REQUEST_FROM, ROLE_ADMIN, ROLE_DEFAULT, ROLE_HOMEASSISTANT,
-    ROLE_MANAGER)
+    HEADER_TOKEN,
+    REQUEST_FROM,
+    ROLE_ADMIN,
+    ROLE_DEFAULT,
+    ROLE_HOMEASSISTANT,
+    ROLE_MANAGER,
+    ROLE_BACKUP,
+)
 from ..coresys import CoreSysAttributes

 _LOGGER = logging.getLogger(__name__)

+# fmt: off
+
 # Block Anytime
 BLACKLIST = re.compile(
@@ -32,10 +39,11 @@ NO_SECURITY_CHECK = re.compile(
 # Can called by every add-on
 ADDONS_API_BYPASS = re.compile(
     r"^(?:"
-    r"|/addons/self/(?!security)[^/]+"
-    r"|/version"
+    r"|/addons/self/(?!security|update)[^/]+"
+    r"|/info"
     r"|/services.*"
     r"|/discovery.*"
+    r"|/auth"
     r")$"
 )
@@ -52,6 +60,11 @@ ADDONS_ROLE_ACCESS = {
         r"|/homeassistant/.+"
         r")$"
     ),
+    ROLE_BACKUP: re.compile(
+        r"^(?:"
+        r"|/snapshots.*"
+        r")$"
+    ),
     ROLE_MANAGER: re.compile(
         r"^(?:"
         r"|/homeassistant/.+"
@@ -59,7 +72,7 @@ ADDONS_ROLE_ACCESS = {
         r"|/hardware/.+"
         r"|/hassos/.+"
         r"|/supervisor/.+"
-        r"|/addons/[^/]+/(?!security).+"
+        r"|/addons(?:/[^/]+/(?!security).+|/reload)?"
         r"|/snapshots.*"
         r")$"
     ),
@@ -68,6 +81,8 @@ ADDONS_ROLE_ACCESS = {
     ),
 }

+# fmt: off
+
 class SecurityMiddleware(CoreSysAttributes):
     """Security middleware functions."""
@@ -98,9 +113,7 @@ class SecurityMiddleware(CoreSysAttributes):
             raise HTTPUnauthorized()

         # Home-Assistant
-        # UUID check need removed with 131
-        if hassio_token in (self.sys_homeassistant.uuid,
-                            self.sys_homeassistant.hassio_token):
+        if hassio_token == self.sys_homeassistant.hassio_token:
             _LOGGER.debug("%s access from Home Assistant", request.path)
             request_from = self.sys_homeassistant
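The role table added above is a mapping from role name to an anchored regex over the request path. A condensed sketch (not the full middleware) of how the new `ROLE_BACKUP` entry behaves: it admits only the snapshot endpoints, and every other path is rejected.

```python
import re

# The ROLE_BACKUP pattern from the diff: the empty alternative plus an
# anchored /snapshots prefix. Anything else fails to match.
ROLE_BACKUP_RE = re.compile(r"^(?:|/snapshots.*)$")

def backup_role_allowed(path: str) -> bool:
    # ^...$ anchoring gives full-path semantics with .match()
    return ROLE_BACKUP_RE.match(path) is not None

print(backup_role_allowed("/snapshots/new/full"))  # True
print(backup_role_allowed("/supervisor/update"))   # False
```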


@@ -1,34 +1,57 @@
 """Init file for Hass.io Supervisor RESTful API."""
 import asyncio
 import logging
+from typing import Any, Awaitable, Dict

+from aiohttp import web
 import voluptuous as vol

-from .utils import api_process, api_process_raw, api_validate
 from ..const import (
-    ATTR_ADDONS, ATTR_VERSION, ATTR_LAST_VERSION, ATTR_CHANNEL, ATTR_ARCH,
-    HASSIO_VERSION, ATTR_ADDONS_REPOSITORIES, ATTR_LOGO, ATTR_REPOSITORY,
-    ATTR_DESCRIPTON, ATTR_NAME, ATTR_SLUG, ATTR_INSTALLED, ATTR_TIMEZONE,
-    ATTR_STATE, ATTR_WAIT_BOOT, ATTR_CPU_PERCENT, ATTR_MEMORY_USAGE,
-    ATTR_MEMORY_LIMIT, ATTR_NETWORK_RX, ATTR_NETWORK_TX, ATTR_BLK_READ,
-    ATTR_BLK_WRITE, CONTENT_TYPE_BINARY, ATTR_ICON)
+    ATTR_ADDONS,
+    ATTR_ADDONS_REPOSITORIES,
+    ATTR_ARCH,
+    ATTR_BLK_READ,
+    ATTR_BLK_WRITE,
+    ATTR_CHANNEL,
+    ATTR_CPU_PERCENT,
+    ATTR_DESCRIPTON,
+    ATTR_ICON,
+    ATTR_INSTALLED,
+    ATTR_LAST_VERSION,
+    ATTR_LOGO,
+    ATTR_MEMORY_LIMIT,
+    ATTR_MEMORY_USAGE,
+    ATTR_NAME,
+    ATTR_NETWORK_RX,
+    ATTR_NETWORK_TX,
+    ATTR_REPOSITORY,
+    ATTR_SLUG,
+    ATTR_STATE,
+    ATTR_TIMEZONE,
+    ATTR_VERSION,
+    ATTR_WAIT_BOOT,
+    ATTR_IP_ADDRESS,
+    CONTENT_TYPE_BINARY,
+    HASSIO_VERSION,
+)
 from ..coresys import CoreSysAttributes
-from ..validate import WAIT_BOOT, REPOSITORIES, CHANNELS
 from ..exceptions import APIError
 from ..utils.validate import validate_timezone
+from ..validate import CHANNELS, REPOSITORIES, WAIT_BOOT
+from .utils import api_process, api_process_raw, api_validate

 _LOGGER = logging.getLogger(__name__)

-SCHEMA_OPTIONS = vol.Schema({
-    vol.Optional(ATTR_CHANNEL): CHANNELS,
-    vol.Optional(ATTR_ADDONS_REPOSITORIES): REPOSITORIES,
-    vol.Optional(ATTR_TIMEZONE): validate_timezone,
-    vol.Optional(ATTR_WAIT_BOOT): WAIT_BOOT,
-})
+SCHEMA_OPTIONS = vol.Schema(
+    {
+        vol.Optional(ATTR_CHANNEL): CHANNELS,
+        vol.Optional(ATTR_ADDONS_REPOSITORIES): REPOSITORIES,
+        vol.Optional(ATTR_TIMEZONE): validate_timezone,
+        vol.Optional(ATTR_WAIT_BOOT): WAIT_BOOT,
+    }
+)

-SCHEMA_VERSION = vol.Schema({
-    vol.Optional(ATTR_VERSION): vol.Coerce(str),
-})
+SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})


 class APISupervisor(CoreSysAttributes):
@@ -40,12 +63,13 @@ class APISupervisor(CoreSysAttributes):
         return True

     @api_process
-    async def info(self, request):
+    async def info(self, request: web.Request) -> Dict[str, Any]:
         """Return host information."""
         list_addons = []
         for addon in self.sys_addons.list_addons:
             if addon.is_installed:
-                list_addons.append({
-                    ATTR_NAME: addon.name,
-                    ATTR_SLUG: addon.slug,
-                    ATTR_DESCRIPTON: addon.description,
+                list_addons.append(
+                    {
+                        ATTR_NAME: addon.name,
+                        ATTR_SLUG: addon.slug,
+                        ATTR_DESCRIPTON: addon.description,
@@ -55,13 +79,15 @@ class APISupervisor(CoreSysAttributes):
-                    ATTR_REPOSITORY: addon.repository,
-                    ATTR_ICON: addon.with_icon,
-                    ATTR_LOGO: addon.with_logo,
-                })
+                        ATTR_REPOSITORY: addon.repository,
+                        ATTR_ICON: addon.with_icon,
+                        ATTR_LOGO: addon.with_logo,
+                    }
+                )

         return {
             ATTR_VERSION: HASSIO_VERSION,
             ATTR_LAST_VERSION: self.sys_updater.version_hassio,
             ATTR_CHANNEL: self.sys_updater.channel,
-            ATTR_ARCH: self.sys_arch,
+            ATTR_ARCH: self.sys_supervisor.arch,
+            ATTR_IP_ADDRESS: str(self.sys_supervisor.ip_address),
             ATTR_WAIT_BOOT: self.sys_config.wait_boot,
             ATTR_TIMEZONE: self.sys_config.timezone,
             ATTR_ADDONS: list_addons,
@@ -69,7 +95,7 @@ class APISupervisor(CoreSysAttributes):
         }

     @api_process
-    async def options(self, request):
+    async def options(self, request: web.Request) -> None:
         """Set Supervisor options."""
         body = await api_validate(SCHEMA_OPTIONS, request)
@@ -88,14 +114,11 @@ class APISupervisor(CoreSysAttributes):
         self.sys_updater.save_data()
         self.sys_config.save_data()

-        return True
-
     @api_process
-    async def stats(self, request):
+    async def stats(self, request: web.Request) -> Dict[str, Any]:
         """Return resource information."""
         stats = await self.sys_supervisor.stats()
-        if not stats:
-            raise APIError("No stats available")

         return {
             ATTR_CPU_PERCENT: stats.cpu_percent,
@@ -108,33 +131,21 @@ class APISupervisor(CoreSysAttributes):
         }

     @api_process
-    async def update(self, request):
+    async def update(self, request: web.Request) -> None:
         """Update Supervisor OS."""
         body = await api_validate(SCHEMA_VERSION, request)
         version = body.get(ATTR_VERSION, self.sys_updater.version_hassio)

         if version == self.sys_supervisor.version:
             raise APIError("Version {} is already in use".format(version))
-
-        return await asyncio.shield(
-            self.sys_supervisor.update(version))
+        await asyncio.shield(self.sys_supervisor.update(version))

     @api_process
-    async def reload(self, request):
+    def reload(self, request: web.Request) -> Awaitable[None]:
         """Reload add-ons, configuration, etc."""
-        tasks = [
-            self.sys_updater.reload(),
-        ]
-        results, _ = await asyncio.shield(
-            asyncio.wait(tasks))
-
-        for result in results:
-            if result.exception() is not None:
-                raise APIError("Some reload task fails!")
-
-        return True
+        return asyncio.shield(self.sys_updater.reload())

     @api_process_raw(CONTENT_TYPE_BINARY)
-    def logs(self, request):
+    def logs(self, request: web.Request) -> Awaitable[bytes]:
         """Return supervisor Docker logs."""
         return self.sys_supervisor.logs()
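The `reload()` change replaces an awaited task list plus exception scan with a plain (non-async) method that returns a shielded awaitable for the caller to await. A minimal sketch of that pattern, with `updater_reload` standing in for `self.sys_updater.reload()` and a bare `await` standing in for the `api_process` decorator:

```python
import asyncio

async def updater_reload():
    # hypothetical stand-in for self.sys_updater.reload()
    await asyncio.sleep(0)

def reload():
    # Not a coroutine function: it returns an Awaitable. shield() keeps the
    # inner reload running even if the awaiting request is cancelled.
    return asyncio.shield(updater_reload())

async def main():
    return await reload()

print(asyncio.run(main()))  # None
```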


@@ -1,26 +0,0 @@
"""Init file for Hass.io version RESTful API."""
import logging

from .utils import api_process
from ..const import (
    ATTR_HOMEASSISTANT, ATTR_SUPERVISOR, ATTR_MACHINE, ATTR_ARCH, ATTR_HASSOS,
    ATTR_CHANNEL)
from ..coresys import CoreSysAttributes

_LOGGER = logging.getLogger(__name__)


class APIVersion(CoreSysAttributes):
    """Handle RESTful API for version functions."""

    @api_process
    async def info(self, request):
        """Show version info."""
        return {
            ATTR_SUPERVISOR: self.sys_supervisor.version,
            ATTR_HOMEASSISTANT: self.sys_homeassistant.version,
            ATTR_HASSOS: self.sys_hassos.version,
            ATTR_MACHINE: self.sys_machine,
            ATTR_ARCH: self.sys_arch,
            ATTR_CHANNEL: self.sys_updater.channel,
        }

hassio/arch.json (new file, 49 lines)

@@ -0,0 +1,49 @@
{
    "raspberrypi": [
        "armhf"
    ],
    "raspberrypi2": [
        "armv7",
        "armhf"
    ],
    "raspberrypi3": [
        "armv7",
        "armhf"
    ],
    "raspberrypi3-64": [
        "aarch64",
        "armv7",
        "armhf"
    ],
    "tinker": [
        "armv7",
        "armhf"
    ],
    "odroid-c2": [
        "aarch64"
    ],
    "odroid-xu": [
        "armv7",
        "armhf"
    ],
    "orangepi-prime": [
        "aarch64"
    ],
    "qemux86": [
        "i386"
    ],
    "qemux86-64": [
        "amd64",
        "i386"
    ],
    "qemuarm": [
        "armhf"
    ],
    "qemuarm-64": [
        "aarch64"
    ],
    "intel-nuc": [
        "amd64",
        "i386"
    ]
}

hassio/arch.py (new file, 65 lines)

@@ -0,0 +1,65 @@
"""Handle Arch for underlay maschine/platforms."""
import logging
from typing import List
from pathlib import Path

from .coresys import CoreSysAttributes, CoreSys
from .exceptions import HassioArchNotFound, JsonFileError
from .utils.json import read_json_file

_LOGGER = logging.getLogger(__name__)


class CpuArch(CoreSysAttributes):
    """Manage available architectures."""

    def __init__(self, coresys: CoreSys) -> None:
        """Initialize CPU Architecture handler."""
        self.coresys = coresys
        self._supported_arch: List[str] = []
        self._default_arch: str

    @property
    def default(self) -> str:
        """Return system default arch."""
        return self._default_arch

    @property
    def supervisor(self) -> str:
        """Return supervisor arch."""
        return self.sys_supervisor.arch

    @property
    def supported(self) -> List[str]:
        """Return support arch by CPU/Machine."""
        return self._supported_arch

    async def load(self) -> None:
        """Load data and initialize default arch."""
        try:
            arch_data = read_json_file(Path(__file__).parent.joinpath("arch.json"))
        except JsonFileError:
            _LOGGER.warning("Can't read arch json")
            return

        # Evaluate current CPU/Platform
        if not self.sys_machine or self.sys_machine not in arch_data:
            _LOGGER.warning("Can't detect underlay machine type!")
            self._default_arch = self.sys_supervisor.arch
            self._supported_arch.append(self.default)
            return

        # Use configs from arch.json
        self._supported_arch.extend(arch_data[self.sys_machine])
        self._default_arch = self.supported[0]

    def is_supported(self, arch_list: List[str]) -> bool:
        """Return True if there is a supported arch by this platform."""
        return not set(self.supported).isdisjoint(set(arch_list))

    def match(self, arch_list: List[str]) -> str:
        """Return best match for this CPU/Platform."""
        for self_arch in self.supported:
            if self_arch in arch_list:
                return self_arch
        raise HassioArchNotFound()
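A minimal sketch of the selection logic in `CpuArch.match`/`is_supported`, using the `raspberrypi3-64` entry from arch.json as example data. The supported list is ordered best-first, so the first overlap with an add-on's arch list wins.

```python
# arch.json["raspberrypi3-64"]: best arch first, fallbacks after
SUPPORTED = ["aarch64", "armv7", "armhf"]

def is_supported(arch_list):
    # True if the platform shares at least one arch with the add-on
    return not set(SUPPORTED).isdisjoint(arch_list)

def match(arch_list):
    # First supported arch the add-on also offers; KeyError stands in
    # for HassioArchNotFound here.
    for arch in SUPPORTED:
        if arch in arch_list:
            return arch
    raise KeyError("no matching arch")

print(match(["armhf", "armv7"]))  # armv7
print(is_supported(["i386"]))     # False
```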

hassio/auth.py (new file, 95 lines)

@@ -0,0 +1,95 @@
"""Manage SSO for Add-ons with Home Assistant user."""
import logging
import hashlib

from .const import (
    FILE_HASSIO_AUTH, ATTR_PASSWORD, ATTR_USERNAME, ATTR_ADDON)
from .coresys import CoreSysAttributes
from .utils.json import JsonConfig
from .validate import SCHEMA_AUTH_CONFIG
from .exceptions import AuthError, HomeAssistantAPIError

_LOGGER = logging.getLogger(__name__)


class Auth(JsonConfig, CoreSysAttributes):
    """Manage SSO for Add-ons with Home Assistant user."""

    def __init__(self, coresys):
        """Initialize updater."""
        super().__init__(FILE_HASSIO_AUTH, SCHEMA_AUTH_CONFIG)
        self.coresys = coresys

    def _check_cache(self, username, password):
        """Check password in cache."""
        username_h = _rehash(username)
        password_h = _rehash(password, username)

        if self._data.get(username_h) == password_h:
            _LOGGER.info("Cache hit for %s", username)
            return True

        _LOGGER.warning("No cache hit for %s", username)
        return False

    def _update_cache(self, username, password):
        """Cache a username, password."""
        username_h = _rehash(username)
        password_h = _rehash(password, username)

        if self._data.get(username_h) == password_h:
            return

        self._data[username_h] = password_h
        self.save_data()

    def _dismatch_cache(self, username, password):
        """Remove user from cache."""
        username_h = _rehash(username)
        password_h = _rehash(password, username)

        if self._data.get(username_h) != password_h:
            return

        self._data.pop(username_h, None)
        self.save_data()

    async def check_login(self, addon, username, password):
        """Check username login."""
        if password is None:
            _LOGGER.error("None as password is not supported!")
            raise AuthError()
        _LOGGER.info("Auth request from %s for %s", addon.slug, username)

        # Check API state
        if not await self.sys_homeassistant.check_api_state():
            _LOGGER.info("Home Assistant not running, check cache")
            return self._check_cache(username, password)

        try:
            async with self.sys_homeassistant.make_request(
                    'post', 'api/hassio_auth', json={
                        ATTR_USERNAME: username,
                        ATTR_PASSWORD: password,
                        ATTR_ADDON: addon.slug,
                    }) as req:

                if req.status == 200:
                    _LOGGER.info("Success login from %s", username)
                    self._update_cache(username, password)
                    return True

                _LOGGER.warning("Wrong login from %s", username)
                self._dismatch_cache(username, password)
                return False
        except HomeAssistantAPIError:
            _LOGGER.error("Can't request auth on Home Assistant!")

        raise AuthError()


def _rehash(value, salt2=""):
    """Rehash a value."""
    for idx in range(1, 20):
        value = hashlib.sha256(f"{value}{idx}{salt2}".encode()).hexdigest()
    return value


@@ -1,43 +1,48 @@
"""Bootstrap Hass.io.""" """Bootstrap Hass.io."""
import logging import logging
import os import os
import signal
import shutil
from pathlib import Path from pathlib import Path
import shutil
import signal
from colorlog import ColoredFormatter from colorlog import ColoredFormatter
from .core import HassIO
from .addons import AddonManager from .addons import AddonManager
from .api import RestAPI from .api import RestAPI
from .arch import CpuArch
from .auth import Auth
from .const import SOCKET_DOCKER from .const import SOCKET_DOCKER
from .core import HassIO
from .coresys import CoreSys from .coresys import CoreSys
from .supervisor import Supervisor from .dbus import DBusManager
from .discovery import Discovery
from .hassos import HassOS
from .homeassistant import HomeAssistant from .homeassistant import HomeAssistant
from .host import HostManager
from .ingress import Ingress
from .services import ServiceManager
from .snapshots import SnapshotManager from .snapshots import SnapshotManager
from .supervisor import Supervisor
from .tasks import Tasks from .tasks import Tasks
from .updater import Updater from .updater import Updater
from .services import ServiceManager
from .discovery import Discovery
from .host import HostManager
from .dbus import DBusManager
from .hassos import HassOS
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
ENV_SHARE = 'SUPERVISOR_SHARE' ENV_SHARE = "SUPERVISOR_SHARE"
ENV_NAME = 'SUPERVISOR_NAME' ENV_NAME = "SUPERVISOR_NAME"
ENV_REPO = 'HOMEASSISTANT_REPOSITORY' ENV_REPO = "HOMEASSISTANT_REPOSITORY"
MACHINE_ID = Path('/etc/machine-id') MACHINE_ID = Path("/etc/machine-id")
def initialize_coresys(loop): async def initialize_coresys():
"""Initialize HassIO coresys/objects.""" """Initialize HassIO coresys/objects."""
coresys = CoreSys(loop) coresys = CoreSys()
# Initialize core objects # Initialize core objects
coresys.core = HassIO(coresys) coresys.core = HassIO(coresys)
coresys.arch = CpuArch(coresys)
coresys.auth = Auth(coresys)
coresys.updater = Updater(coresys) coresys.updater = Updater(coresys)
coresys.api = RestAPI(coresys) coresys.api = RestAPI(coresys)
coresys.supervisor = Supervisor(coresys) coresys.supervisor = Supervisor(coresys)
@@ -45,6 +50,7 @@ def initialize_coresys(loop):
coresys.addons = AddonManager(coresys) coresys.addons = AddonManager(coresys)
coresys.snapshots = SnapshotManager(coresys) coresys.snapshots = SnapshotManager(coresys)
coresys.host = HostManager(coresys) coresys.host = HostManager(coresys)
coresys.ingress = Ingress(coresys)
coresys.tasks = Tasks(coresys) coresys.tasks = Tasks(coresys)
coresys.services = ServiceManager(coresys) coresys.services = ServiceManager(coresys)
coresys.discovery = Discovery(coresys) coresys.discovery = Discovery(coresys)
@@ -68,8 +74,8 @@ def initialize_system_data(coresys):
# Home Assistant configuration folder # Home Assistant configuration folder
if not config.path_homeassistant.is_dir(): if not config.path_homeassistant.is_dir():
_LOGGER.info( _LOGGER.info(
"Create Home Assistant configuration folder %s", "Create Home Assistant configuration folder %s", config.path_homeassistant
config.path_homeassistant) )
config.path_homeassistant.mkdir() config.path_homeassistant.mkdir()
# hassio ssl folder # hassio ssl folder
@@ -79,18 +85,19 @@ def initialize_system_data(coresys):
# hassio addon data folder # hassio addon data folder
if not config.path_addons_data.is_dir(): if not config.path_addons_data.is_dir():
_LOGGER.info( _LOGGER.info("Create Hass.io Add-on data folder %s", config.path_addons_data)
"Create Hass.io Add-on data folder %s", config.path_addons_data)
config.path_addons_data.mkdir(parents=True) config.path_addons_data.mkdir(parents=True)
if not config.path_addons_local.is_dir(): if not config.path_addons_local.is_dir():
_LOGGER.info("Create Hass.io Add-on local repository folder %s", _LOGGER.info(
config.path_addons_local) "Create Hass.io Add-on local repository folder %s", config.path_addons_local
)
config.path_addons_local.mkdir(parents=True) config.path_addons_local.mkdir(parents=True)
if not config.path_addons_git.is_dir(): if not config.path_addons_git.is_dir():
_LOGGER.info("Create Hass.io Add-on git repositories folder %s", _LOGGER.info(
config.path_addons_git) "Create Hass.io Add-on git repositories folder %s", config.path_addons_git
)
config.path_addons_git.mkdir(parents=True) config.path_addons_git.mkdir(parents=True)
# hassio tmp folder # hassio tmp folder
@@ -132,26 +139,27 @@ def migrate_system_env(coresys):
 def initialize_logging():
     """Setup the logging."""
     logging.basicConfig(level=logging.INFO)
-    fmt = ("%(asctime)s %(levelname)s (%(threadName)s) "
-           "[%(name)s] %(message)s")
-    colorfmt = "%(log_color)s{}%(reset)s".format(fmt)
-    datefmt = '%y-%m-%d %H:%M:%S'
+    fmt = "%(asctime)s %(levelname)s (%(threadName)s) [%(name)s] %(message)s"
+    colorfmt = f"%(log_color)s{fmt}%(reset)s"
+    datefmt = "%y-%m-%d %H:%M:%S"

     # suppress overly verbose logs from libraries that aren't helpful
     logging.getLogger("aiohttp.access").setLevel(logging.WARNING)

-    logging.getLogger().handlers[0].setFormatter(ColoredFormatter(
-        colorfmt,
-        datefmt=datefmt,
-        reset=True,
-        log_colors={
-            'DEBUG': 'cyan',
-            'INFO': 'green',
-            'WARNING': 'yellow',
-            'ERROR': 'red',
-            'CRITICAL': 'red',
-        }
-    ))
+    logging.getLogger().handlers[0].setFormatter(
+        ColoredFormatter(
+            colorfmt,
+            datefmt=datefmt,
+            reset=True,
+            log_colors={
+                "DEBUG": "cyan",
+                "INFO": "green",
+                "WARNING": "yellow",
+                "ERROR": "red",
+                "CRITICAL": "red",
+            },
+        )
+    )
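The reflowed formatter call above is mostly mechanical, but the format strings themselves can be checked in isolation. A minimal sketch with the stdlib `logging.Formatter` standing in for colorlog's `ColoredFormatter` (which only adds the `%(log_color)s`/`%(reset)s` wrappers around the same `fmt`):

```python
import logging

# Same fmt/datefmt as initialize_logging(); plain Formatter stands in
# for colorlog.ColoredFormatter, which only wraps the line in color codes.
fmt = "%(asctime)s %(levelname)s (%(threadName)s) [%(name)s] %(message)s"
datefmt = "%y-%m-%d %H:%M:%S"

handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter(fmt, datefmt=datefmt))

# Build a record by hand so the formatted line can be inspected directly.
record = logging.LogRecord(
    name="hassio.core", level=logging.INFO, pathname=__file__, lineno=1,
    msg="Hass.io is up and running", args=None, exc_info=None,
)
line = handler.format(record)
print(line)
```

The `hassio.core` logger name and message are taken from elsewhere in this diff; any name/message pair works.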
 def check_environment():
@@ -170,12 +178,12 @@ def check_environment():
         return False

     # check socat exec
-    if not shutil.which('socat'):
+    if not shutil.which("socat"):
         _LOGGER.fatal("Can't find socat!")
         return False

     # check socat exec
-    if not shutil.which('gdbus'):
+    if not shutil.which("gdbus"):
         _LOGGER.fatal("Can't find gdbus!")
         return False
@@ -185,19 +193,16 @@ def check_environment():
 def reg_signal(loop):
     """Register SIGTERM and SIGKILL to stop system."""
     try:
-        loop.add_signal_handler(
-            signal.SIGTERM, lambda: loop.call_soon(loop.stop))
+        loop.add_signal_handler(signal.SIGTERM, lambda: loop.call_soon(loop.stop))
     except (ValueError, RuntimeError):
         _LOGGER.warning("Could not bind to SIGTERM")

     try:
-        loop.add_signal_handler(
-            signal.SIGHUP, lambda: loop.call_soon(loop.stop))
+        loop.add_signal_handler(signal.SIGHUP, lambda: loop.call_soon(loop.stop))
     except (ValueError, RuntimeError):
         _LOGGER.warning("Could not bind to SIGHUP")

     try:
-        loop.add_signal_handler(
-            signal.SIGINT, lambda: loop.call_soon(loop.stop))
+        loop.add_signal_handler(signal.SIGINT, lambda: loop.call_soon(loop.stop))
     except (ValueError, RuntimeError):
         _LOGGER.warning("Could not bind to SIGINT")
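Each handler in `reg_signal()` only schedules `loop.stop()`; nothing happens until the loop regains control. A self-contained sketch of the same pattern (the fallback timer and the self-delivered SIGTERM are additions for the demo, so it terminates even where signal handlers cannot be bound, e.g. outside the main thread):

```python
import asyncio
import signal

bound = []  # signals we actually managed to register

def reg_signal(loop):
    """Register SIGTERM/SIGHUP/SIGINT to stop the given loop."""
    for sig in (signal.SIGTERM, signal.SIGHUP, signal.SIGINT):
        try:
            loop.add_signal_handler(sig, lambda: loop.call_soon(loop.stop))
            bound.append(sig)
        except (ValueError, RuntimeError):
            print(f"Could not bind to {sig.name}")

loop = asyncio.new_event_loop()
reg_signal(loop)

if signal.SIGTERM in bound:
    # Deliver SIGTERM to ourselves shortly after the loop starts.
    loop.call_later(0.05, signal.raise_signal, signal.SIGTERM)
# Demo-only safety net: stop unconditionally if no handler fired.
loop.call_later(2, loop.stop)

loop.run_forever()  # returns once a handler calls loop.stop()
loop.close()
print("loop stopped")
```

`loop.add_signal_handler` is Unix-only and restricted to the main thread, which is exactly why the original wraps each call in `try/except (ValueError, RuntimeError)`.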

View File

@@ -2,257 +2,299 @@
 from pathlib import Path
 from ipaddress import ip_network

-HASSIO_VERSION = '132'
+HASSIO_VERSION = "153"

 URL_HASSIO_ADDONS = "https://github.com/home-assistant/hassio-addons"
-URL_HASSIO_VERSION = \
-    "https://s3.amazonaws.com/hassio-version/{channel}.json"
-URL_HASSIO_APPARMOR = \
-    "https://s3.amazonaws.com/hassio-version/apparmor.txt"
+URL_HASSIO_VERSION = "https://s3.amazonaws.com/hassio-version/{channel}.json"
+URL_HASSIO_APPARMOR = "https://s3.amazonaws.com/hassio-version/apparmor.txt"
 URL_HASSOS_OTA = (
     "https://github.com/home-assistant/hassos/releases/download/"
-    "{version}/hassos_{board}-{version}.raucb")
+    "{version}/hassos_{board}-{version}.raucb"
+)

 HASSIO_DATA = Path("/data")

+FILE_HASSIO_AUTH = Path(HASSIO_DATA, "auth.json")
 FILE_HASSIO_ADDONS = Path(HASSIO_DATA, "addons.json")
 FILE_HASSIO_CONFIG = Path(HASSIO_DATA, "config.json")
 FILE_HASSIO_HOMEASSISTANT = Path(HASSIO_DATA, "homeassistant.json")
 FILE_HASSIO_UPDATER = Path(HASSIO_DATA, "updater.json")
 FILE_HASSIO_SERVICES = Path(HASSIO_DATA, "services.json")
 FILE_HASSIO_DISCOVERY = Path(HASSIO_DATA, "discovery.json")
+FILE_HASSIO_INGRESS = Path(HASSIO_DATA, "ingress.json")

 SOCKET_DOCKER = Path("/var/run/docker.sock")

-DOCKER_NETWORK = 'hassio'
+DOCKER_NETWORK = "hassio"
-DOCKER_NETWORK_MASK = ip_network('172.30.32.0/23')
+DOCKER_NETWORK_MASK = ip_network("172.30.32.0/23")
-DOCKER_NETWORK_RANGE = ip_network('172.30.33.0/24')
+DOCKER_NETWORK_RANGE = ip_network("172.30.33.0/24")

-LABEL_VERSION = 'io.hass.version'
+LABEL_VERSION = "io.hass.version"
-LABEL_ARCH = 'io.hass.arch'
+LABEL_ARCH = "io.hass.arch"
-LABEL_TYPE = 'io.hass.type'
+LABEL_TYPE = "io.hass.type"
-LABEL_MACHINE = 'io.hass.machine'
+LABEL_MACHINE = "io.hass.machine"

-META_ADDON = 'addon'
+META_ADDON = "addon"
-META_SUPERVISOR = 'supervisor'
+META_SUPERVISOR = "supervisor"
-META_HOMEASSISTANT = 'homeassistant'
+META_HOMEASSISTANT = "homeassistant"

-JSON_RESULT = 'result'
+JSON_RESULT = "result"
-JSON_DATA = 'data'
+JSON_DATA = "data"
-JSON_MESSAGE = 'message'
+JSON_MESSAGE = "message"

-RESULT_ERROR = 'error'
+RESULT_ERROR = "error"
-RESULT_OK = 'ok'
+RESULT_OK = "ok"

-CONTENT_TYPE_BINARY = 'application/octet-stream'
+CONTENT_TYPE_BINARY = "application/octet-stream"
-CONTENT_TYPE_PNG = 'image/png'
+CONTENT_TYPE_PNG = "image/png"
-CONTENT_TYPE_JSON = 'application/json'
+CONTENT_TYPE_JSON = "application/json"
-CONTENT_TYPE_TEXT = 'text/plain'
+CONTENT_TYPE_TEXT = "text/plain"
-CONTENT_TYPE_TAR = 'application/tar'
+CONTENT_TYPE_TAR = "application/tar"
+CONTENT_TYPE_URL = "application/x-www-form-urlencoded"
-HEADER_HA_ACCESS = 'x-ha-access'
+HEADER_HA_ACCESS = "X-Ha-Access"
-HEADER_TOKEN = 'x-hassio-key'
+HEADER_TOKEN = "X-Hassio-Key"

+COOKIE_INGRESS = "ingress_session"

-ENV_TOKEN = 'HASSIO_TOKEN'
+ENV_TOKEN = "HASSIO_TOKEN"
-ENV_TIME = 'TZ'
+ENV_TIME = "TZ"

-REQUEST_FROM = 'HASSIO_FROM'
+REQUEST_FROM = "HASSIO_FROM"

-ATTR_MACHINE = 'machine'
+ATTR_MACHINE = "machine"
-ATTR_WAIT_BOOT = 'wait_boot'
+ATTR_WAIT_BOOT = "wait_boot"
-ATTR_DEPLOYMENT = 'deployment'
+ATTR_DEPLOYMENT = "deployment"
-ATTR_WATCHDOG = 'watchdog'
+ATTR_WATCHDOG = "watchdog"
-ATTR_CHANGELOG = 'changelog'
+ATTR_CHANGELOG = "changelog"
-ATTR_DATE = 'date'
+ATTR_DATE = "date"
-ATTR_ARCH = 'arch'
+ATTR_ARCH = "arch"
-ATTR_LONG_DESCRIPTION = 'long_description'
+ATTR_LONG_DESCRIPTION = "long_description"
-ATTR_HOSTNAME = 'hostname'
+ATTR_HOSTNAME = "hostname"
-ATTR_TIMEZONE = 'timezone'
+ATTR_TIMEZONE = "timezone"
-ATTR_ARGS = 'args'
+ATTR_ARGS = "args"
-ATTR_OPERATING_SYSTEM = 'operating_system'
+ATTR_OPERATING_SYSTEM = "operating_system"
-ATTR_CHASSIS = 'chassis'
+ATTR_CHASSIS = "chassis"
-ATTR_TYPE = 'type'
+ATTR_TYPE = "type"
-ATTR_SOURCE = 'source'
+ATTR_SOURCE = "source"
-ATTR_FEATURES = 'features'
+ATTR_FEATURES = "features"
-ATTR_ADDONS = 'addons'
+ATTR_ADDONS = "addons"
-ATTR_PROVIDERS = 'providers'
+ATTR_PROVIDERS = "providers"
-ATTR_VERSION = 'version'
+ATTR_VERSION = "version"
-ATTR_VERSION_LATEST = 'version_latest'
+ATTR_VERSION_LATEST = "version_latest"
-ATTR_AUTO_UART = 'auto_uart'
+ATTR_AUTO_UART = "auto_uart"
-ATTR_LAST_BOOT = 'last_boot'
+ATTR_LAST_BOOT = "last_boot"
-ATTR_LAST_VERSION = 'last_version'
+ATTR_LAST_VERSION = "last_version"
-ATTR_CHANNEL = 'channel'
+ATTR_CHANNEL = "channel"
-ATTR_NAME = 'name'
+ATTR_NAME = "name"
-ATTR_SLUG = 'slug'
+ATTR_SLUG = "slug"
-ATTR_DESCRIPTON = 'description'
+ATTR_DESCRIPTON = "description"
-ATTR_STARTUP = 'startup'
+ATTR_STARTUP = "startup"
-ATTR_BOOT = 'boot'
+ATTR_BOOT = "boot"
-ATTR_PORTS = 'ports'
+ATTR_PORTS = "ports"
-ATTR_PORT = 'port'
+ATTR_PORT = "port"
-ATTR_SSL = 'ssl'
+ATTR_SSL = "ssl"
-ATTR_MAP = 'map'
+ATTR_MAP = "map"
-ATTR_WEBUI = 'webui'
+ATTR_WEBUI = "webui"
-ATTR_OPTIONS = 'options'
+ATTR_OPTIONS = "options"
-ATTR_INSTALLED = 'installed'
+ATTR_INSTALLED = "installed"
-ATTR_DETACHED = 'detached'
+ATTR_DETACHED = "detached"
-ATTR_STATE = 'state'
+ATTR_STATE = "state"
-ATTR_SCHEMA = 'schema'
+ATTR_SCHEMA = "schema"
-ATTR_IMAGE = 'image'
+ATTR_IMAGE = "image"
-ATTR_ICON = 'icon'
+ATTR_ICON = "icon"
-ATTR_LOGO = 'logo'
+ATTR_LOGO = "logo"
-ATTR_STDIN = 'stdin'
+ATTR_STDIN = "stdin"
-ATTR_ADDONS_REPOSITORIES = 'addons_repositories'
+ATTR_ADDONS_REPOSITORIES = "addons_repositories"
-ATTR_REPOSITORY = 'repository'
+ATTR_REPOSITORY = "repository"
-ATTR_REPOSITORIES = 'repositories'
+ATTR_REPOSITORIES = "repositories"
-ATTR_URL = 'url'
+ATTR_URL = "url"
-ATTR_MAINTAINER = 'maintainer'
+ATTR_MAINTAINER = "maintainer"
-ATTR_PASSWORD = 'password'
+ATTR_PASSWORD = "password"
-ATTR_TOTP = 'totp'
+ATTR_TOTP = "totp"
-ATTR_INITIALIZE = 'initialize'
+ATTR_INITIALIZE = "initialize"
-ATTR_LOCATON = 'location'
+ATTR_LOCATON = "location"
-ATTR_BUILD = 'build'
+ATTR_BUILD = "build"
-ATTR_DEVICES = 'devices'
+ATTR_DEVICES = "devices"
-ATTR_ENVIRONMENT = 'environment'
+ATTR_ENVIRONMENT = "environment"
-ATTR_HOST_NETWORK = 'host_network'
+ATTR_HOST_NETWORK = "host_network"
-ATTR_HOST_PID = 'host_pid'
+ATTR_HOST_PID = "host_pid"
-ATTR_HOST_IPC = 'host_ipc'
+ATTR_HOST_IPC = "host_ipc"
-ATTR_HOST_DBUS = 'host_dbus'
+ATTR_HOST_DBUS = "host_dbus"
-ATTR_NETWORK = 'network'
+ATTR_NETWORK = "network"
-ATTR_TMPFS = 'tmpfs'
+ATTR_TMPFS = "tmpfs"
-ATTR_PRIVILEGED = 'privileged'
+ATTR_PRIVILEGED = "privileged"
-ATTR_USER = 'user'
+ATTR_USER = "user"
-ATTR_SYSTEM = 'system'
+ATTR_SYSTEM = "system"
-ATTR_SNAPSHOTS = 'snapshots'
+ATTR_SNAPSHOTS = "snapshots"
-ATTR_HOMEASSISTANT = 'homeassistant'
+ATTR_HOMEASSISTANT = "homeassistant"
-ATTR_HASSIO = 'hassio'
+ATTR_HASSIO = "hassio"
-ATTR_HASSIO_API = 'hassio_api'
+ATTR_HASSIO_API = "hassio_api"
-ATTR_HOMEASSISTANT_API = 'homeassistant_api'
+ATTR_HOMEASSISTANT_API = "homeassistant_api"
-ATTR_UUID = 'uuid'
+ATTR_UUID = "uuid"
-ATTR_FOLDERS = 'folders'
+ATTR_FOLDERS = "folders"
-ATTR_SIZE = 'size'
+ATTR_SIZE = "size"
-ATTR_TYPE = 'type'
+ATTR_TYPE = "type"
-ATTR_TIMEOUT = 'timeout'
+ATTR_TIMEOUT = "timeout"
-ATTR_AUTO_UPDATE = 'auto_update'
+ATTR_AUTO_UPDATE = "auto_update"
-ATTR_CUSTOM = 'custom'
+ATTR_CUSTOM = "custom"
-ATTR_AUDIO = 'audio'
+ATTR_AUDIO = "audio"
-ATTR_AUDIO_INPUT = 'audio_input'
+ATTR_AUDIO_INPUT = "audio_input"
-ATTR_AUDIO_OUTPUT = 'audio_output'
+ATTR_AUDIO_OUTPUT = "audio_output"
-ATTR_INPUT = 'input'
+ATTR_INPUT = "input"
-ATTR_OUTPUT = 'output'
+ATTR_OUTPUT = "output"
-ATTR_DISK = 'disk'
+ATTR_DISK = "disk"
-ATTR_SERIAL = 'serial'
+ATTR_SERIAL = "serial"
-ATTR_SECURITY = 'security'
+ATTR_SECURITY = "security"
-ATTR_BUILD_FROM = 'build_from'
+ATTR_BUILD_FROM = "build_from"
-ATTR_SQUASH = 'squash'
+ATTR_SQUASH = "squash"
-ATTR_GPIO = 'gpio'
+ATTR_GPIO = "gpio"
-ATTR_LEGACY = 'legacy'
+ATTR_LEGACY = "legacy"
-ATTR_ADDONS_CUSTOM_LIST = 'addons_custom_list'
+ATTR_ADDONS_CUSTOM_LIST = "addons_custom_list"
-ATTR_CPU_PERCENT = 'cpu_percent'
+ATTR_CPU_PERCENT = "cpu_percent"
-ATTR_NETWORK_RX = 'network_rx'
+ATTR_NETWORK_RX = "network_rx"
-ATTR_NETWORK_TX = 'network_tx'
+ATTR_NETWORK_TX = "network_tx"
-ATTR_MEMORY_LIMIT = 'memory_limit'
+ATTR_MEMORY_LIMIT = "memory_limit"
-ATTR_MEMORY_USAGE = 'memory_usage'
+ATTR_MEMORY_USAGE = "memory_usage"
-ATTR_BLK_READ = 'blk_read'
+ATTR_BLK_READ = "blk_read"
-ATTR_BLK_WRITE = 'blk_write'
+ATTR_BLK_WRITE = "blk_write"
-ATTR_ADDON = 'addon'
+ATTR_ADDON = "addon"
-ATTR_AVAILABLE = 'available'
+ATTR_AVAILABLE = "available"
-ATTR_HOST = 'host'
+ATTR_HOST = "host"
-ATTR_USERNAME = 'username'
+ATTR_USERNAME = "username"
-ATTR_PROTOCOL = 'protocol'
-ATTR_DISCOVERY = 'discovery'
-ATTR_PLATFORM = 'platform'
-ATTR_COMPONENT = 'component'
-ATTR_CONFIG = 'config'
-ATTR_SERVICES = 'services'
-ATTR_SERVICE = 'service'
-ATTR_DISCOVERY = 'discovery'
-ATTR_PROTECTED = 'protected'
-ATTR_CRYPTO = 'crypto'
-ATTR_BRANCH = 'branch'
-ATTR_KERNEL = 'kernel'
-ATTR_APPARMOR = 'apparmor'
-ATTR_DEVICETREE = 'devicetree'
-ATTR_CPE = 'cpe'
-ATTR_BOARD = 'board'
-ATTR_HASSOS = 'hassos'
-ATTR_HASSOS_CLI = 'hassos_cli'
-ATTR_VERSION_CLI = 'version_cli'
-ATTR_VERSION_CLI_LATEST = 'version_cli_latest'
-ATTR_REFRESH_TOKEN = 'refresh_token'
-ATTR_ACCESS_TOKEN = 'access_token'
-ATTR_DOCKER_API = 'docker_api'
-ATTR_FULL_ACCESS = 'full_access'
-ATTR_PROTECTED = 'protected'
-ATTR_RATING = 'rating'
-ATTR_HASSIO_ROLE = 'hassio_role'
-ATTR_SUPERVISOR = 'supervisor'
+ATTR_DISCOVERY = "discovery"
+ATTR_CONFIG = "config"
+ATTR_SERVICES = "services"
+ATTR_SERVICE = "service"
+ATTR_DISCOVERY = "discovery"
+ATTR_PROTECTED = "protected"
+ATTR_CRYPTO = "crypto"
+ATTR_BRANCH = "branch"
+ATTR_KERNEL = "kernel"
+ATTR_APPARMOR = "apparmor"
+ATTR_DEVICETREE = "devicetree"
+ATTR_CPE = "cpe"
+ATTR_BOARD = "board"
+ATTR_HASSOS = "hassos"
+ATTR_HASSOS_CLI = "hassos_cli"
+ATTR_VERSION_CLI = "version_cli"
+ATTR_VERSION_CLI_LATEST = "version_cli_latest"
+ATTR_REFRESH_TOKEN = "refresh_token"
+ATTR_ACCESS_TOKEN = "access_token"
+ATTR_DOCKER_API = "docker_api"
+ATTR_FULL_ACCESS = "full_access"
+ATTR_PROTECTED = "protected"
+ATTR_RATING = "rating"
+ATTR_HASSIO_ROLE = "hassio_role"
+ATTR_SUPERVISOR = "supervisor"
+ATTR_AUTH_API = "auth_api"
+ATTR_KERNEL_MODULES = "kernel_modules"
+ATTR_SUPPORTED_ARCH = "supported_arch"
+ATTR_INGRESS = "ingress"
+ATTR_INGRESS_PORT = "ingress_port"
+ATTR_INGRESS_ENTRY = "ingress_entry"
+ATTR_INGRESS_TOKEN = "ingress_token"
+ATTR_INGRESS_URL = "ingress_url"
+ATTR_IP_ADDRESS = "ip_address"
+ATTR_SESSION = "session"

-SERVICE_MQTT = 'mqtt'
-PROVIDE_SERVICE = 'provide'
+PROVIDE_SERVICE = "provide"
-NEED_SERVICE = 'need'
+NEED_SERVICE = "need"
-WANT_SERVICE = 'want'
+WANT_SERVICE = "want"

-STARTUP_INITIALIZE = 'initialize'
+STARTUP_INITIALIZE = "initialize"
-STARTUP_SYSTEM = 'system'
+STARTUP_SYSTEM = "system"
-STARTUP_SERVICES = 'services'
+STARTUP_SERVICES = "services"
-STARTUP_APPLICATION = 'application'
+STARTUP_APPLICATION = "application"
-STARTUP_ONCE = 'once'
+STARTUP_ONCE = "once"
+STARTUP_ALL = [
+    STARTUP_ONCE,
+    STARTUP_INITIALIZE,
+    STARTUP_SYSTEM,
+    STARTUP_SERVICES,
+    STARTUP_APPLICATION,
+]

-BOOT_AUTO = 'auto'
+BOOT_AUTO = "auto"
-BOOT_MANUAL = 'manual'
+BOOT_MANUAL = "manual"

-STATE_STARTED = 'started'
+STATE_STARTED = "started"
-STATE_STOPPED = 'stopped'
+STATE_STOPPED = "stopped"
-STATE_NONE = 'none'
+STATE_NONE = "none"

-MAP_CONFIG = 'config'
+MAP_CONFIG = "config"
-MAP_SSL = 'ssl'
+MAP_SSL = "ssl"
-MAP_ADDONS = 'addons'
+MAP_ADDONS = "addons"
-MAP_BACKUP = 'backup'
+MAP_BACKUP = "backup"
-MAP_SHARE = 'share'
+MAP_SHARE = "share"

-ARCH_ARMHF = 'armhf'
+ARCH_ARMHF = "armhf"
+ARCH_ARMV7 = "armv7"
-ARCH_AARCH64 = 'aarch64'
+ARCH_AARCH64 = "aarch64"
-ARCH_AMD64 = 'amd64'
+ARCH_AMD64 = "amd64"
-ARCH_I386 = 'i386'
+ARCH_I386 = "i386"
+ARCH_ALL = [ARCH_ARMHF, ARCH_ARMV7, ARCH_AARCH64, ARCH_AMD64, ARCH_I386]

-CHANNEL_STABLE = 'stable'
+CHANNEL_STABLE = "stable"
-CHANNEL_BETA = 'beta'
+CHANNEL_BETA = "beta"
-CHANNEL_DEV = 'dev'
+CHANNEL_DEV = "dev"

-REPOSITORY_CORE = 'core'
+REPOSITORY_CORE = "core"
-REPOSITORY_LOCAL = 'local'
+REPOSITORY_LOCAL = "local"

-FOLDER_HOMEASSISTANT = 'homeassistant'
+FOLDER_HOMEASSISTANT = "homeassistant"
-FOLDER_SHARE = 'share'
+FOLDER_SHARE = "share"
-FOLDER_ADDONS = 'addons/local'
+FOLDER_ADDONS = "addons/local"
-FOLDER_SSL = 'ssl'
+FOLDER_SSL = "ssl"

-SNAPSHOT_FULL = 'full'
+SNAPSHOT_FULL = "full"
-SNAPSHOT_PARTIAL = 'partial'
+SNAPSHOT_PARTIAL = "partial"

-CRYPTO_AES128 = 'aes128'
+CRYPTO_AES128 = "aes128"

-SECURITY_PROFILE = 'profile'
+SECURITY_PROFILE = "profile"
-SECURITY_DEFAULT = 'default'
+SECURITY_DEFAULT = "default"
-SECURITY_DISABLE = 'disable'
+SECURITY_DISABLE = "disable"

-PRIVILEGED_NET_ADMIN = 'NET_ADMIN'
+PRIVILEGED_NET_ADMIN = "NET_ADMIN"
-PRIVILEGED_SYS_ADMIN = 'SYS_ADMIN'
+PRIVILEGED_SYS_ADMIN = "SYS_ADMIN"
-PRIVILEGED_SYS_RAWIO = 'SYS_RAWIO'
+PRIVILEGED_SYS_RAWIO = "SYS_RAWIO"
-PRIVILEGED_IPC_LOCK = 'IPC_LOCK'
+PRIVILEGED_IPC_LOCK = "IPC_LOCK"
-PRIVILEGED_SYS_TIME = 'SYS_TIME'
+PRIVILEGED_SYS_TIME = "SYS_TIME"
-PRIVILEGED_SYS_NICE = 'SYS_NICE'
+PRIVILEGED_SYS_NICE = "SYS_NICE"
+PRIVILEGED_SYS_MODULE = "SYS_MODULE"
-PRIVILEGED_SYS_RESOURCE = 'SYS_RESOURCE'
+PRIVILEGED_SYS_RESOURCE = "SYS_RESOURCE"
-PRIVILEGED_SYS_PTRACE = 'SYS_PTRACE'
+PRIVILEGED_SYS_PTRACE = "SYS_PTRACE"
+PRIVILEGED_DAC_READ_SEARCH = "DAC_READ_SEARCH"
+PRIVILEGED_ALL = [
+    PRIVILEGED_NET_ADMIN,
+    PRIVILEGED_SYS_ADMIN,
+    PRIVILEGED_SYS_RAWIO,
+    PRIVILEGED_IPC_LOCK,
+    PRIVILEGED_SYS_TIME,
+    PRIVILEGED_SYS_NICE,
+    PRIVILEGED_SYS_RESOURCE,
+    PRIVILEGED_SYS_PTRACE,
+    PRIVILEGED_SYS_MODULE,
+    PRIVILEGED_DAC_READ_SEARCH,
+]

-FEATURES_SHUTDOWN = 'shutdown'
+FEATURES_SHUTDOWN = "shutdown"
-FEATURES_REBOOT = 'reboot'
+FEATURES_REBOOT = "reboot"
-FEATURES_HASSOS = 'hassos'
+FEATURES_HASSOS = "hassos"
-FEATURES_HOSTNAME = 'hostname'
+FEATURES_HOSTNAME = "hostname"
-FEATURES_SERVICES = 'services'
+FEATURES_SERVICES = "services"

-ROLE_DEFAULT = 'default'
+ROLE_DEFAULT = "default"
-ROLE_HOMEASSISTANT = 'homeassistant'
+ROLE_HOMEASSISTANT = "homeassistant"
+ROLE_BACKUP = "backup"
-ROLE_MANAGER = 'manager'
+ROLE_MANAGER = "manager"
-ROLE_ADMIN = 'admin'
+ROLE_ADMIN = "admin"
+ROLE_ALL = [ROLE_DEFAULT, ROLE_HOMEASSISTANT, ROLE_BACKUP, ROLE_MANAGER, ROLE_ADMIN]

+CHAN_ID = "chan_id"
+CHAN_TYPE = "chan_type"
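Several of the bare constants are now grouped into ordered lists (`STARTUP_ALL`, `ARCH_ALL`, `PRIVILEGED_ALL`, `ROLE_ALL`). One natural use of an ordered stage list like `STARTUP_ALL` is as a sort key; a sketch of the idea (the add-on names are made up, and this illustrates the pattern, not the supervisor's actual scheduler):

```python
# Ordered stage list as in the new const.py.
STARTUP_INITIALIZE = "initialize"
STARTUP_SYSTEM = "system"
STARTUP_SERVICES = "services"
STARTUP_APPLICATION = "application"
STARTUP_ONCE = "once"
STARTUP_ALL = [
    STARTUP_ONCE,
    STARTUP_INITIALIZE,
    STARTUP_SYSTEM,
    STARTUP_SERVICES,
    STARTUP_APPLICATION,
]

# Hypothetical add-ons mapped to their startup stage.
addons = {
    "node-red": STARTUP_APPLICATION,
    "mqtt": STARTUP_SERVICES,
    "dnsmasq": STARTUP_SYSTEM,
}

# list.index() turns the stage list's position into a sort priority.
boot_order = sorted(addons, key=lambda name: STARTUP_ALL.index(addons[name]))
print(boot_order)  # ['dnsmasq', 'mqtt', 'node-red']
```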

View File

@@ -7,7 +7,11 @@ import async_timeout
 from .coresys import CoreSysAttributes
 from .const import (
-    STARTUP_SYSTEM, STARTUP_SERVICES, STARTUP_APPLICATION, STARTUP_INITIALIZE)
+    STARTUP_SYSTEM,
+    STARTUP_SERVICES,
+    STARTUP_APPLICATION,
+    STARTUP_INITIALIZE,
+)
 from .exceptions import HassioError, HomeAssistantError

 _LOGGER = logging.getLogger(__name__)

@@ -31,12 +35,15 @@ class HassIO(CoreSysAttributes):
         # Load Host
         await self.sys_host.load()

-        # Load HassOS
-        await self.sys_hassos.load()
-
         # Load Home Assistant
         await self.sys_homeassistant.load()

+        # Load CPU/Arch
+        await self.sys_arch.load()
+
+        # Load HassOS
+        await self.sys_hassos.load()
+
         # Load Add-ons
         await self.sys_addons.load()

@@ -55,18 +62,20 @@ class HassIO(CoreSysAttributes):
         # Load discovery
         await self.sys_discovery.load()

+        # Load ingress
+        await self.sys_ingress.load()
+
         # start dns forwarding
         self.sys_create_task(self.sys_dns.start())

     async def start(self):
         """Start Hass.io orchestration."""
         # on release channel, try update itself
-        # on dev mode, only read new versions
-        if not self.sys_dev and self.sys_supervisor.need_update:
-            if await self.sys_supervisor.update():
+        if self.sys_supervisor.need_update:
+            if self.sys_dev:
+                _LOGGER.warning("Ignore Hass.io updates on dev!")
+            elif await self.sys_supervisor.update():
                 return
-        else:
-            _LOGGER.info("Ignore Hass.io auto updates on dev channel")

         # start api
         await self.sys_api.start()

@@ -106,7 +115,7 @@ class HassIO(CoreSysAttributes):
         await self.sys_tasks.load()

         # If landingpage / run upgrade in background
-        if self.sys_homeassistant.version == 'landingpage':
+        if self.sys_homeassistant.version == "landingpage":
             self.sys_create_task(self.sys_homeassistant.install())

         _LOGGER.info("Hass.io is up and running")

@@ -119,12 +128,15 @@ class HassIO(CoreSysAttributes):
         # process async stop tasks
         try:
             with async_timeout.timeout(10):
-                await asyncio.wait([
-                    self.sys_api.stop(),
-                    self.sys_dns.stop(),
-                    self.sys_websession.close(),
-                    self.sys_websession_ssl.close()
-                ])
+                await asyncio.wait(
+                    [
+                        self.sys_api.stop(),
+                        self.sys_dns.stop(),
+                        self.sys_websession.close(),
+                        self.sys_websession_ssl.close(),
+                        self.sys_ingress.unload(),
+                    ]
+                )
         except asyncio.TimeoutError:
             _LOGGER.warning("Force Shutdown!")
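The reworked `stop()` bounds the whole shutdown with a single 10-second cap around `asyncio.wait()`. The same pattern in a stdlib-only sketch, using `asyncio.wait_for` in place of the `async_timeout` package (the subsystem names and delays are placeholders):

```python
import asyncio

async def stop_subsystem(name: str, delay: float) -> str:
    """Placeholder for sys_api.stop(), sys_dns.stop(), sys_ingress.unload(), ..."""
    await asyncio.sleep(delay)
    return name

async def shutdown() -> list:
    # Run all stop coroutines concurrently, like the list passed to asyncio.wait().
    tasks = [
        asyncio.create_task(stop_subsystem("api", 0.01)),
        asyncio.create_task(stop_subsystem("dns", 0.02)),
        asyncio.create_task(stop_subsystem("ingress", 0.01)),
    ]
    try:
        # Cap the whole batch together, like async_timeout.timeout(10).
        done, _ = await asyncio.wait_for(asyncio.wait(tasks), timeout=10)
        return sorted(task.result() for task in done)
    except asyncio.TimeoutError:
        print("Force Shutdown!")
        return []

stopped = asyncio.run(shutdown())
print(stopped)  # ['api', 'dns', 'ingress']
```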

View File

@@ -1,287 +1,474 @@
"""Handle core shared data.""" """Handle core shared data."""
from __future__ import annotations
import asyncio
from typing import TYPE_CHECKING
import aiohttp import aiohttp
from .const import CHANNEL_DEV
from .config import CoreConfig from .config import CoreConfig
from .const import CHANNEL_DEV
from .docker import DockerAPI from .docker import DockerAPI
from .misc.dns import DNSForward from .misc.dns import DNSForward
from .misc.hardware import Hardware from .misc.hardware import Hardware
from .misc.scheduler import Scheduler from .misc.scheduler import Scheduler
if TYPE_CHECKING:
from .addons import AddonManager
from .api import RestAPI
from .arch import CpuArch
from .auth import Auth
from .core import HassIO
from .dbus import DBusManager
from .discovery import Discovery
from .hassos import HassOS
from .homeassistant import HomeAssistant
from .host import HostManager
from .ingress import Ingress
from .services import ServiceManager
from .snapshots import SnapshotManager
from .supervisor import Supervisor
from .tasks import Tasks
from .updater import Updater
class CoreSys: class CoreSys:
"""Class that handle all shared data.""" """Class that handle all shared data."""
def __init__(self, loop): def __init__(self):
"""Initialize coresys.""" """Initialize coresys."""
# Static attributes # Static attributes
self.exit_code = 0 self.machine_id: str = None
self.machine_id = None
# External objects # External objects
self._loop = loop self._loop: asyncio.BaseEventLoop = asyncio.get_running_loop()
self._websession = aiohttp.ClientSession(loop=loop) self._websession: aiohttp.ClientSession = aiohttp.ClientSession()
self._websession_ssl = aiohttp.ClientSession( self._websession_ssl: aiohttp.ClientSession = aiohttp.ClientSession(
connector=aiohttp.TCPConnector(verify_ssl=False), loop=loop) connector=aiohttp.TCPConnector(ssl=False))
# Global objects # Global objects
self._config = CoreConfig() self._config: CoreConfig = CoreConfig()
self._hardware = Hardware() self._hardware: Hardware = Hardware()
self._docker = DockerAPI() self._docker: DockerAPI = DockerAPI()
self._scheduler = Scheduler(loop=loop) self._scheduler: Scheduler = Scheduler()
self._dns = DNSForward(loop=loop) self._dns: DNSForward = DNSForward()
# Internal objects pointers # Internal objects pointers
self._core = None self._core: HassIO = None
self._homeassistant = None self._arch: CpuArch = None
self._supervisor = None self._auth: Auth = None
self._addons = None self._homeassistant: HomeAssistant = None
self._api = None self._supervisor: Supervisor = None
self._updater = None self._addons: AddonManager = None
self._snapshots = None self._api: RestAPI = None
self._tasks = None self._updater: Updater = None
self._host = None self._snapshots: SnapshotManager = None
self._dbus = None self._tasks: Tasks = None
self._hassos = None self._host: HostManager = None
self._services = None self._ingress: Ingress = None
self._discovery = None self._dbus: DBusManager = None
self._hassos: HassOS = None
self._services: ServiceManager = None
self._discovery: Discovery = None
@property @property
def arch(self): def machine(self) -> str:
"""Return running arch of the Hass.io system."""
if self._supervisor:
return self._supervisor.arch
return None
@property
def machine(self):
"""Return running machine type of the Hass.io system.""" """Return running machine type of the Hass.io system."""
if self._homeassistant: if self._homeassistant:
return self._homeassistant.machine return self._homeassistant.machine
return None return None
@property @property
def dev(self): def dev(self) -> str:
"""Return True if we run dev mode.""" """Return True if we run dev mode."""
return self._updater.channel == CHANNEL_DEV return self._updater.channel == CHANNEL_DEV
@property @property
def timezone(self): def timezone(self) -> str:
"""Return timezone.""" """Return timezone."""
return self._config.timezone return self._config.timezone
@property @property
def loop(self): def loop(self) -> asyncio.BaseEventLoop:
"""Return loop object.""" """Return loop object."""
return self._loop return self._loop
@property @property
def websession(self): def websession(self) -> aiohttp.ClientSession:
"""Return websession object.""" """Return websession object."""
return self._websession return self._websession
@property @property
def websession_ssl(self): def websession_ssl(self) -> aiohttp.ClientSession:
"""Return websession object with disabled SSL.""" """Return websession object with disabled SSL."""
return self._websession_ssl return self._websession_ssl
@property @property
def config(self): def config(self) -> CoreConfig:
"""Return CoreConfig object.""" """Return CoreConfig object."""
return self._config return self._config
@property @property
def hardware(self): def hardware(self) -> Hardware:
"""Return Hardware object.""" """Return Hardware object."""
return self._hardware return self._hardware
@property @property
def docker(self): def docker(self) -> DockerAPI:
"""Return DockerAPI object.""" """Return DockerAPI object."""
return self._docker return self._docker
@property @property
def scheduler(self): def scheduler(self) -> Scheduler:
"""Return Scheduler object.""" """Return Scheduler object."""
return self._scheduler return self._scheduler
@property @property
def dns(self): def dns(self) -> DNSForward:
"""Return DNSForward object.""" """Return DNSForward object."""
return self._dns return self._dns
@property @property
def core(self): def core(self) -> HassIO:
"""Return HassIO object.""" """Return HassIO object."""
return self._core return self._core
@core.setter @core.setter
def core(self, value): def core(self, value: HassIO):
"""Set a Hass.io object.""" """Set a Hass.io object."""
if self._core: if self._core:
raise RuntimeError("Hass.io already set!") raise RuntimeError("Hass.io already set!")
self._core = value self._core = value
@property @property
def homeassistant(self): def arch(self) -> CpuArch:
"""Return CpuArch object."""
return self._arch
@arch.setter
def arch(self, value: CpuArch):
"""Set a CpuArch object."""
if self._arch:
raise RuntimeError("CpuArch already set!")
self._arch = value
@property
def auth(self) -> Auth:
"""Return Auth object."""
return self._auth
@auth.setter
def auth(self, value: Auth):
"""Set a Auth object."""
if self._auth:
raise RuntimeError("Auth already set!")
self._auth = value
@property
def homeassistant(self) -> HomeAssistant:
"""Return Home Assistant object.""" """Return Home Assistant object."""
return self._homeassistant return self._homeassistant
@homeassistant.setter @homeassistant.setter
def homeassistant(self, value): def homeassistant(self, value: HomeAssistant):
"""Set a HomeAssistant object.""" """Set a HomeAssistant object."""
if self._homeassistant: if self._homeassistant:
raise RuntimeError("Home Assistant already set!") raise RuntimeError("Home Assistant already set!")
self._homeassistant = value self._homeassistant = value
@property @property
def supervisor(self): def supervisor(self) -> Supervisor:
"""Return Supervisor object.""" """Return Supervisor object."""
return self._supervisor return self._supervisor
@supervisor.setter @supervisor.setter
def supervisor(self, value): def supervisor(self, value: Supervisor):
"""Set a Supervisor object.""" """Set a Supervisor object."""
if self._supervisor: if self._supervisor:
raise RuntimeError("Supervisor already set!") raise RuntimeError("Supervisor already set!")
self._supervisor = value self._supervisor = value
@property @property
def api(self): def api(self) -> RestAPI:
"""Return API object.""" """Return API object."""
return self._api return self._api
@api.setter @api.setter
def api(self, value): def api(self, value: RestAPI):
"""Set an API object.""" """Set an API object."""
if self._api: if self._api:
raise RuntimeError("API already set!") raise RuntimeError("API already set!")
self._api = value self._api = value
@property @property
def updater(self): def updater(self) -> Updater:
"""Return Updater object.""" """Return Updater object."""
return self._updater return self._updater
@updater.setter @updater.setter
def updater(self, value): def updater(self, value: Updater):
"""Set a Updater object.""" """Set a Updater object."""
if self._updater: if self._updater:
raise RuntimeError("Updater already set!") raise RuntimeError("Updater already set!")
self._updater = value self._updater = value
@property @property
def addons(self): def addons(self) -> AddonManager:
"""Return AddonManager object.""" """Return AddonManager object."""
return self._addons return self._addons
@addons.setter @addons.setter
def addons(self, value): def addons(self, value: AddonManager):
"""Set a AddonManager object.""" """Set a AddonManager object."""
if self._addons: if self._addons:
raise RuntimeError("AddonManager already set!") raise RuntimeError("AddonManager already set!")
self._addons = value self._addons = value
@property @property
def snapshots(self): def snapshots(self) -> SnapshotManager:
"""Return SnapshotManager object.""" """Return SnapshotManager object."""
return self._snapshots return self._snapshots
@snapshots.setter @snapshots.setter
def snapshots(self, value): def snapshots(self, value: SnapshotManager):
"""Set a SnapshotManager object.""" """Set a SnapshotManager object."""
if self._snapshots: if self._snapshots:
raise RuntimeError("SnapshotsManager already set!") raise RuntimeError("SnapshotsManager already set!")
self._snapshots = value self._snapshots = value
@property @property
def tasks(self): def tasks(self) -> Tasks:
"""Return Tasks object.""" """Return Tasks object."""
return self._tasks return self._tasks
@tasks.setter @tasks.setter
def tasks(self, value): def tasks(self, value: Tasks):
"""Set a Tasks object.""" """Set a Tasks object."""
if self._tasks: if self._tasks:
raise RuntimeError("Tasks already set!") raise RuntimeError("Tasks already set!")
self._tasks = value self._tasks = value
@property @property
def services(self): def services(self) -> ServiceManager:
"""Return ServiceManager object.""" """Return ServiceManager object."""
return self._services return self._services
@services.setter @services.setter
def services(self, value): def services(self, value: ServiceManager):
"""Set a ServiceManager object.""" """Set a ServiceManager object."""
if self._services: if self._services:
raise RuntimeError("Services already set!") raise RuntimeError("Services already set!")
self._services = value self._services = value
@property @property
def discovery(self): def discovery(self) -> Discovery:
"""Return ServiceManager object.""" """Return ServiceManager object."""
return self._discovery return self._discovery
@discovery.setter @discovery.setter
def discovery(self, value): def discovery(self, value: Discovery):
"""Set a Discovery object.""" """Set a Discovery object."""
if self._discovery: if self._discovery:
raise RuntimeError("Discovery already set!") raise RuntimeError("Discovery already set!")
self._discovery = value self._discovery = value
@property @property
def dbus(self): def dbus(self) -> DBusManager:
"""Return DBusManager object.""" """Return DBusManager object."""
return self._dbus return self._dbus
@dbus.setter @dbus.setter
def dbus(self, value): def dbus(self, value: DBusManager):
"""Set a DBusManager object.""" """Set a DBusManager object."""
if self._dbus: if self._dbus:
raise RuntimeError("DBusManager already set!") raise RuntimeError("DBusManager already set!")
self._dbus = value self._dbus = value
@property @property
def host(self): def host(self) -> HostManager:
"""Return HostManager object.""" """Return HostManager object."""
return self._host return self._host
@host.setter @host.setter
def host(self, value): def host(self, value: HostManager):
"""Set a HostManager object.""" """Set a HostManager object."""
if self._host: if self._host:
raise RuntimeError("HostManager already set!") raise RuntimeError("HostManager already set!")
self._host = value self._host = value
@property @property
def hassos(self): def ingress(self) -> Ingress:
"""Return Ingress object."""
return self._ingress
@ingress.setter
def ingress(self, value: Ingress):
"""Set a Ingress object."""
if self._ingress:
raise RuntimeError("Ingress already set!")
self._ingress = value
@property
def hassos(self) -> HassOS:
"""Return HassOS object.""" """Return HassOS object."""
return self._hassos return self._hassos
@hassos.setter @hassos.setter
def hassos(self, value): def hassos(self, value: HassOS):
"""Set a HassOS object.""" """Set a HassOS object."""
if self._hassos: if self._hassos:
raise RuntimeError("HassOS already set!") raise RuntimeError("HassOS already set!")
self._hassos = value self._hassos = value
def run_in_executor(self, funct, *args):
"""Wrapper for executor pool."""
return self._loop.run_in_executor(None, funct, *args)
def create_task(self, coroutine):
"""Wrapper for async task."""
return self._loop.create_task(coroutine)
class CoreSysAttributes:
"""Inherit basic CoreSysAttributes."""
coresys = None
@property
def sys_machine(self) -> str:
"""Return running machine type of the Hass.io system."""
return self.coresys.machine
@property
def sys_dev(self) -> str:
"""Return True if we run dev mode."""
return self.coresys.dev
@property
def sys_timezone(self) -> str:
"""Return timezone."""
return self.coresys.timezone
@property
def sys_machine_id(self) -> str:
"""Return machine id."""
return self.coresys.machine_id
@property
def sys_loop(self) -> asyncio.BaseEventLoop:
"""Return loop object."""
return self.coresys.loop
@property
def sys_websession(self) -> aiohttp.ClientSession:
"""Return websession object."""
return self.coresys.websession
@property
def sys_websession_ssl(self) -> aiohttp.ClientSession:
"""Return websession object with disabled SSL."""
return self.coresys.websession_ssl
@property
def sys_config(self) -> CoreConfig:
"""Return CoreConfig object."""
return self.coresys.config
@property
def sys_hardware(self) -> Hardware:
"""Return Hardware object."""
return self.coresys.hardware
@property
def sys_docker(self) -> DockerAPI:
"""Return DockerAPI object."""
return self.coresys.docker
@property
def sys_scheduler(self) -> Scheduler:
"""Return Scheduler object."""
return self.coresys.scheduler
@property
def sys_dns(self) -> DNSForward:
"""Return DNSForward object."""
return self.coresys.dns
@property
def sys_core(self) -> HassIO:
"""Return HassIO object."""
return self.coresys.core
@property
def sys_arch(self) -> CpuArch:
"""Return CpuArch object."""
return self.coresys.arch
@property
def sys_auth(self) -> Auth:
"""Return Auth object."""
return self.coresys.auth
@property
def sys_homeassistant(self) -> HomeAssistant:
"""Return Home Assistant object."""
return self.coresys.homeassistant
@property
def sys_supervisor(self) -> Supervisor:
"""Return Supervisor object."""
return self.coresys.supervisor
@property
def sys_api(self) -> RestAPI:
"""Return API object."""
return self.coresys.api
@property
def sys_updater(self) -> Updater:
"""Return Updater object."""
return self.coresys.updater
@property
def sys_addons(self) -> AddonManager:
"""Return AddonManager object."""
return self.coresys.addons
@property
def sys_snapshots(self) -> SnapshotManager:
"""Return SnapshotManager object."""
return self.coresys.snapshots
@property
def sys_tasks(self) -> Tasks:
"""Return Tasks object."""
return self.coresys.tasks
@property
def sys_services(self) -> ServiceManager:
"""Return ServiceManager object."""
return self.coresys.services
@property
def sys_discovery(self) -> Discovery:
"""Return Discovery object."""
return self.coresys.discovery
@property
def sys_dbus(self) -> DBusManager:
"""Return DBusManager object."""
return self.coresys.dbus
@property
def sys_host(self) -> HostManager:
"""Return HostManager object."""
return self.coresys.host
@property
def sys_ingress(self) -> Ingress:
"""Return Ingress object."""
return self.coresys.ingress
@property
def sys_hassos(self) -> HassOS:
"""Return HassOS object."""
return self.coresys.hassos
def sys_run_in_executor(self, funct, *args) -> asyncio.Future:
"""Wrapper for executor pool."""
return self.sys_loop.run_in_executor(None, funct, *args)
def sys_create_task(self, coroutine) -> asyncio.Task:
"""Wrapper for async task."""
return self.sys_loop.create_task(coroutine)
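The diff replaces the old dynamic `__getattr__` mapping with the explicit typed `sys_*` properties listed above, so IDEs and type checkers can resolve each attribute. A reduced sketch contrasting both styles (the `Core` class and its attributes are illustrative stand-ins):

```python
class Core:
    """Stand-in for the CoreSys container."""

    machine = "qemux86-64"
    dev = False


class DynamicAttributes:
    """Old style: resolve sys_* names at runtime via __getattr__."""

    coresys = Core()

    def __getattr__(self, name):
        if name.startswith("sys_") and hasattr(self.coresys, name[4:]):
            return getattr(self.coresys, name[4:])
        raise AttributeError(f"Can't resolve {name} on {self}")


class TypedAttributes:
    """New style: explicit properties that static analyzers can follow."""

    coresys = Core()

    @property
    def sys_machine(self) -> str:
        return self.coresys.machine
```

Both resolve `sys_machine` at runtime, but only the property version carries a return type that tooling can check.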


@@ -1,46 +1,62 @@
"""Handle discover message for Home Assistant."""
from __future__ import annotations
from contextlib import suppress
import logging
from typing import Any, Dict, List, Optional, TYPE_CHECKING
from uuid import uuid4, UUID
import attr
import voluptuous as vol
from voluptuous.humanize import humanize_error
from ..const import ATTR_CONFIG, ATTR_DISCOVERY, FILE_HASSIO_DISCOVERY
from ..coresys import CoreSys, CoreSysAttributes
from ..exceptions import DiscoveryError, HomeAssistantAPIError
from ..utils.json import JsonConfig
from .validate import SCHEMA_DISCOVERY_CONFIG, valid_discovery_config
if TYPE_CHECKING:
from ..addons.addon import Addon
_LOGGER = logging.getLogger(__name__)
CMD_NEW = "post"
CMD_DEL = "delete"
@attr.s
class Message:
"""Represent a single Discovery message."""
addon: str = attr.ib()
service: str = attr.ib()
config: Dict[str, Any] = attr.ib(cmp=False)
uuid: UUID = attr.ib(factory=lambda: uuid4().hex, cmp=False)
class Discovery(CoreSysAttributes, JsonConfig):
"""Home Assistant Discovery handler."""
def __init__(self, coresys: CoreSys):
"""Initialize discovery handler."""
super().__init__(FILE_HASSIO_DISCOVERY, SCHEMA_DISCOVERY_CONFIG)
self.coresys: CoreSys = coresys
self.message_obj: Dict[str, Message] = {}
async def load(self) -> None:
"""Load existing discovery messages into storage."""
messages = {}
for message in self._data[ATTR_DISCOVERY]:
discovery = Message(**message)
messages[discovery.uuid] = discovery
_LOGGER.info("Load %d messages", len(messages))
self.message_obj = messages
def save(self) -> None:
"""Write discovery message into data file."""
messages: List[Dict[str, Any]] = []
for message in self.list_messages:
messages.append(attr.asdict(message))
@@ -48,52 +64,53 @@ class Discovery(CoreSysAttributes, JsonConfig):
self._data[ATTR_DISCOVERY].extend(messages)
self.save_data()
def get(self, uuid: str) -> Optional[Message]:
"""Return discovery message."""
return self.message_obj.get(uuid)
@property
def list_messages(self) -> List[Message]:
"""Return list of available discovery messages."""
return list(self.message_obj.values())
def send(self, addon: Addon, service: str, config: Dict[str, Any]) -> Message:
"""Send a discovery message to Home Assistant."""
try:
config = valid_discovery_config(service, config)
except vol.Invalid as err:
_LOGGER.error("Invalid discovery %s config", humanize_error(config, err))
raise DiscoveryError() from None
# Create message
message = Message(addon.slug, service, config)
# Already exists?
for old_message in self.list_messages:
if old_message != message:
continue
_LOGGER.info("Duplicate discovery message from %s", addon.slug)
return old_message
_LOGGER.info("Send discovery to Home Assistant %s from %s", service, addon.slug)
self.message_obj[message.uuid] = message
self.save()
self.sys_create_task(self._push_discovery(message, CMD_NEW))
return message
def remove(self, message: Message) -> None:
"""Remove a discovery message from Home Assistant."""
self.message_obj.pop(message.uuid, None)
self.save()
_LOGGER.info(
"Delete discovery to Home Assistant %s from %s",
message.service,
message.addon,
)
self.sys_create_task(self._push_discovery(message, CMD_DEL))
async def _push_discovery(self, message: Message, command: str) -> None:
"""Send a discovery request."""
if not await self.sys_homeassistant.check_api_state():
_LOGGER.info("Discovery %s message ignored", message.uuid)
@@ -104,20 +121,12 @@ class Discovery(CoreSysAttributes, JsonConfig):
with suppress(HomeAssistantAPIError):
async with self.sys_homeassistant.make_request(
command,
f"api/hassio_push/discovery/{message.uuid}",
json=data,
timeout=10,
):
_LOGGER.info("Discovery %s message sent", message.uuid)
return
_LOGGER.warning("Discovery %s message failed", message.uuid)
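Duplicate detection in `send()` works because `config` and `uuid` are excluded from equality (`cmp=False` on the attrs fields), so two messages with the same add-on and service compare equal even with different payloads. A stdlib sketch of the same idea with `dataclasses.field(compare=False)` (an equivalent, not the Supervisor's actual attrs class):

```python
from dataclasses import dataclass, field
from uuid import uuid4


@dataclass
class Message:
    """Discovery message; config and uuid do not take part in equality."""

    addon: str
    service: str
    config: dict = field(compare=False, default_factory=dict)
    uuid: str = field(compare=False, default_factory=lambda: uuid4().hex)
```

Two messages from the same add-on for the same service are equal, which is exactly what the "Already exists?" loop in `send()` checks for.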

hassio/discovery/const.py Normal file

@@ -0,0 +1,10 @@
"""Discovery static data."""
ATTR_HOST = "host"
ATTR_PASSWORD = "password"
ATTR_PORT = "port"
ATTR_PROTOCOL = "protocol"
ATTR_SSL = "ssl"
ATTR_USERNAME = "username"
ATTR_API_KEY = "api_key"
ATTR_SERIAL = "serial"


@@ -0,0 +1 @@
"""Discovery service modules."""


@@ -0,0 +1,16 @@
"""Discovery service for deCONZ."""
import voluptuous as vol
from hassio.validate import NETWORK_PORT
from ..const import ATTR_HOST, ATTR_PORT, ATTR_API_KEY, ATTR_SERIAL
SCHEMA = vol.Schema(
{
vol.Required(ATTR_HOST): vol.Coerce(str),
vol.Required(ATTR_PORT): NETWORK_PORT,
vol.Required(ATTR_SERIAL): vol.Coerce(str),
vol.Required(ATTR_API_KEY): vol.Coerce(str),
}
)


@@ -0,0 +1,27 @@
"""Discovery service for MQTT."""
import voluptuous as vol
from hassio.validate import NETWORK_PORT
from ..const import (
ATTR_HOST,
ATTR_PASSWORD,
ATTR_PORT,
ATTR_PROTOCOL,
ATTR_SSL,
ATTR_USERNAME,
)
# pylint: disable=no-value-for-parameter
SCHEMA = vol.Schema(
{
vol.Required(ATTR_HOST): vol.Coerce(str),
vol.Required(ATTR_PORT): NETWORK_PORT,
vol.Optional(ATTR_USERNAME): vol.Coerce(str),
vol.Optional(ATTR_PASSWORD): vol.Coerce(str),
vol.Optional(ATTR_SSL, default=False): vol.Boolean(),
vol.Optional(ATTR_PROTOCOL, default="3.1.1"): vol.All(
vol.Coerce(str), vol.In(["3.1", "3.1.1"])
),
}
)


@@ -0,0 +1,47 @@
"""Validate services schema."""
from pathlib import Path
from importlib import import_module
import voluptuous as vol
from ..const import ATTR_ADDON, ATTR_CONFIG, ATTR_DISCOVERY, ATTR_SERVICE, ATTR_UUID
from ..utils.validate import schema_or
from ..validate import UUID_MATCH
def valid_discovery_service(service):
"""Validate service name."""
service_file = Path(__file__).parent.joinpath(f"services/{service}.py")
if not service_file.exists():
raise vol.Invalid(f"Service {service} not found")
return service
def valid_discovery_config(service, config):
"""Validate service config."""
try:
service_mod = import_module(f".services.{service}", "hassio.discovery")
except ImportError:
raise vol.Invalid(f"Service {service} not found")
return service_mod.SCHEMA(config)
SCHEMA_DISCOVERY = vol.Schema(
[
vol.Schema(
{
vol.Required(ATTR_UUID): UUID_MATCH,
vol.Required(ATTR_ADDON): vol.Coerce(str),
vol.Required(ATTR_SERVICE): valid_discovery_service,
vol.Required(ATTR_CONFIG): vol.Maybe(dict),
},
extra=vol.REMOVE_EXTRA,
)
]
)
SCHEMA_DISCOVERY_CONFIG = vol.Schema(
{vol.Optional(ATTR_DISCOVERY, default=list): schema_or(SCHEMA_DISCOVERY)},
extra=vol.REMOVE_EXTRA,
)
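`valid_discovery_config` resolves the per-service schema module lazily with `import_module`, mapping an `ImportError` to a validation error. The lookup pattern in isolation, exercised against stdlib modules so it runs standalone (the function name here is illustrative):

```python
from importlib import import_module


def valid_service_module(name: str):
    """Import a schema module for a service; unknown services raise ValueError.

    Mirrors valid_discovery_config above: a failed import means the service
    has no schema module and the request is rejected.
    """
    try:
        return import_module(name)
    except ImportError:
        raise ValueError(f"Service {name} not found") from None
```

In the Supervisor code the module path is built as `f".services.{service}"` relative to the `hassio.discovery` package; the principle is the same.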


@@ -1,17 +1,24 @@
"""Init file for Hass.io Docker object."""
from contextlib import suppress
import logging
from typing import Any, Dict, Optional
import attr
import docker
from ..const import SOCKET_DOCKER
from ..exceptions import DockerAPIError
from .network import DockerNetwork
_LOGGER = logging.getLogger(__name__)
@attr.s(frozen=True)
class CommandReturn:
"""Return object from command run."""
exit_code: int = attr.ib()
output: bytes = attr.ib()
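The `attr.make_class` one-liner produced an untyped, mutable class; the explicit frozen definition above documents the fields and blocks accidental mutation. A stdlib `typing.NamedTuple` behaves the same way (an equivalent sketch, not the class the Supervisor uses):

```python
from typing import NamedTuple


class CommandReturn(NamedTuple):
    """Return object from a container command run."""

    exit_code: int
    output: bytes
```

Like the frozen attrs class, field access works by name and any assignment after construction raises.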
class DockerAPI:
@@ -22,74 +29,87 @@ class DockerAPI:
def __init__(self):
"""Initialize Docker base wrapper."""
self.docker: docker.DockerClient = docker.DockerClient(
base_url="unix:/{}".format(str(SOCKET_DOCKER)), version="auto", timeout=900
)
self.network: DockerNetwork = DockerNetwork(self.docker)
@property
def images(self) -> docker.models.images.ImageCollection:
"""Return API images."""
return self.docker.images
@property
def containers(self) -> docker.models.containers.ContainerCollection:
"""Return API containers."""
return self.docker.containers
@property
def api(self) -> docker.APIClient:
"""Return API client."""
return self.docker.api
def run(
self, image: str, **kwargs: Dict[str, Any]
) -> docker.models.containers.Container:
"""Create a Docker container and run it.
Need run inside executor.
"""
name = kwargs.get("name", image)
network_mode = kwargs.get("network_mode")
hostname = kwargs.get("hostname")
# Setup network
kwargs["dns_search"] = ["."]
if network_mode:
kwargs["dns"] = [str(self.network.supervisor)]
kwargs["dns_opt"] = ["ndots:0"]
else:
kwargs["network"] = None
# Create container
try:
container = self.docker.containers.create(
image, use_config_proxy=False, **kwargs
)
except docker.errors.DockerException as err:
_LOGGER.error("Can't create container from %s: %s", name, err)
raise DockerAPIError() from None
# Attach network
if not network_mode:
alias = [hostname] if hostname else None
try:
self.network.attach_container(container, alias=alias)
except DockerAPIError:
_LOGGER.warning("Can't attach %s to hassio-net!", name)
else:
with suppress(DockerAPIError):
self.network.detach_default_bridge(container)
# Run container
try:
container.start()
except docker.errors.DockerException as err:
_LOGGER.error("Can't start %s: %s", name, err)
raise DockerAPIError() from None
# Update metadata
with suppress(docker.errors.DockerException):
container.reload()
return container
def run_command(
self, image: str, command: Optional[str] = None, **kwargs: Dict[str, Any]
) -> CommandReturn:
"""Create a temporary container and run command.
Need run inside executor.
"""
stdout = kwargs.get("stdout", True)
stderr = kwargs.get("stderr", True)
_LOGGER.info("Run command '%s' on %s", command, image)
try:
@@ -97,6 +117,7 @@ class DockerAPI:
image,
command=command,
network=self.network.name,
use_config_proxy=False,
**kwargs
)
@@ -106,10 +127,11 @@ class DockerAPI:
except docker.errors.DockerException as err:
_LOGGER.error("Can't execute command: %s", err)
raise DockerAPIError() from None
finally:
# cleanup container
with suppress(docker.errors.DockerException):
container.remove(force=True)
return CommandReturn(result.get("StatusCode"), output)
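`run_command` now raises `DockerAPIError` on failure but still removes the temporary container in `finally`, with `suppress` guarding the cleanup itself, so the container is deleted on both the success and the error path. The control flow in isolation, with a stand-in resource instead of a real Docker container (all names here are illustrative):

```python
from contextlib import suppress

removed = []


class FakeContainer:
    """Stand-in for a temporary container."""

    def __init__(self, should_fail: bool) -> None:
        self.should_fail = should_fail

    def wait(self) -> int:
        if self.should_fail:
            raise RuntimeError("container failed")
        return 0

    def remove(self) -> None:
        removed.append(self)


def run_command(should_fail: bool) -> int:
    container = FakeContainer(should_fail)
    try:
        return container.wait()
    except RuntimeError:
        # Error path re-raises a domain error, like DockerAPIError above
        raise ValueError("command failed") from None
    finally:
        # Cleanup always runs; suppress keeps cleanup errors from masking
        # the original exception
        with suppress(RuntimeError):
            container.remove()
```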


@@ -1,17 +1,35 @@
"""Init file for Hass.io add-on Docker object."""
from __future__ import annotations
from contextlib import suppress
from ipaddress import IPv4Address, ip_address
import logging
import os
from pathlib import Path
from typing import TYPE_CHECKING, Dict, List, Optional, Union, Awaitable
import docker
import requests
from ..addons.build import AddonBuild
from ..const import (
ENV_TIME,
ENV_TOKEN,
MAP_ADDONS,
MAP_BACKUP,
MAP_CONFIG,
MAP_SHARE,
MAP_SSL,
SECURITY_DISABLE,
SECURITY_PROFILE,
)
from ..coresys import CoreSys
from ..exceptions import DockerAPIError
from ..utils import process_lock
from .interface import DockerInterface
if TYPE_CHECKING:
from ..addons.addon import Addon
_LOGGER = logging.getLogger(__name__)
@@ -21,73 +39,87 @@ AUDIO_DEVICE = "/dev/snd:/dev/snd:rwm"
class DockerAddon(DockerInterface):
"""Docker Hass.io wrapper for add-ons."""
def __init__(self, coresys: CoreSys, slug: str):
"""Initialize Docker add-on wrapper."""
super().__init__(coresys)
self._id: str = slug
@property
def addon(self) -> Addon:
"""Return add-on of Docker image."""
return self.sys_addons.get(self._id)
@property
def image(self) -> str:
"""Return name of Docker image."""
return self.addon.image
@property
def ip_address(self) -> IPv4Address:
"""Return IP address of this container."""
if self.addon.host_network:
return self.sys_docker.network.gateway
# Extract IP-Address
try:
return ip_address(
self._meta["NetworkSettings"]["Networks"]["hassio"]["IPAddress"])
except (KeyError, TypeError, ValueError):
return ip_address("0.0.0.0")
@property
def timeout(self) -> int:
"""Return timeout for Docker actions."""
return self.addon.timeout
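The new `ip_address` property digs the address out of nested Docker container metadata and falls back to `0.0.0.0` when any level is missing or malformed. The extraction pattern on its own, using the stdlib `ipaddress` module (the metadata dict mimics the Docker attrs layout referenced above):

```python
from ipaddress import IPv4Address, ip_address
from typing import Any, Dict


def container_ip(meta: Dict[str, Any]) -> IPv4Address:
    """Extract the hassio-network IP, defaulting to 0.0.0.0 on bad data."""
    try:
        # KeyError: missing level; TypeError: meta is None or wrong shape;
        # ValueError: string is not a valid IP
        return ip_address(meta["NetworkSettings"]["Networks"]["hassio"]["IPAddress"])
    except (KeyError, TypeError, ValueError):
        return ip_address("0.0.0.0")
```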
@property
def version(self) -> str:
"""Return version of Docker image."""
if self.addon.legacy:
return self.addon.version_installed
return super().version
@property
def arch(self) -> str:
"""Return arch of Docker image."""
if self.addon.legacy:
return self.sys_arch.default
return super().arch
@property
def name(self) -> str:
"""Return name of Docker container."""
return f"addon_{self.addon.slug}"
@property
def ipc(self) -> Optional[str]:
"""Return the IPC namespace."""
if self.addon.host_ipc:
return "host"
return None
@property
def full_access(self) -> bool:
"""Return True if full access is enabled."""
return not self.addon.protected and self.addon.with_full_access
@property
def hostname(self) -> str:
"""Return slug/id of add-on."""
return self.addon.slug.replace("_", "-")
@property
def environment(self) -> Dict[str, str]:
"""Return environment for Docker add-on."""
addon_env = self.addon.environment or {}
# Provide options for legacy add-ons
if self.addon.legacy:
for key, value in self.addon.options.items():
if isinstance(value, (int, str)):
addon_env[key] = value
else:
_LOGGER.warning("Can not set nested option %s as Docker env", key)
return {
**addon_env,
@@ -96,12 +128,12 @@ class DockerAddon(DockerInterface):
}
@property
def devices(self) -> List[str]:
"""Return needed devices."""
devices = self.addon.devices or []
# Use audio devices
if self.addon.with_audio and self.sys_hardware.support_audio:
devices.append(AUDIO_DEVICE)
# Auto mapping UART devices
@@ -113,7 +145,7 @@ class DockerAddon(DockerInterface):
return devices or None
@property
def ports(self) -> Optional[Dict[str, Union[str, int, None]]]:
"""Filter None from add-on ports."""
if not self.addon.ports:
return None
@@ -125,7 +157,7 @@ class DockerAddon(DockerInterface):
}
@property
def security_opt(self) -> List[str]:
"""Controlling security options."""
security = []
@@ -143,7 +175,7 @@ class DockerAddon(DockerInterface):
return security
@property
def tmpfs(self) -> Optional[Dict[str, str]]:
"""Return tmpfs for Docker add-on."""
options = self.addon.tmpfs
if options:
@@ -151,130 +183,148 @@ class DockerAddon(DockerInterface):
return None
@property
def network_mapping(self) -> Dict[str, str]:
"""Return hosts mapping."""
return {
"homeassistant": self.sys_docker.network.gateway,
"hassio": self.sys_docker.network.supervisor,
}
@property
def network_mode(self) -> Optional[str]:
"""Return network mode for add-on."""
if self.addon.host_network:
return "host"
return None
@property
def pid_mode(self) -> Optional[str]:
"""Return PID mode for add-on."""
if not self.addon.protected and self.addon.host_pid:
return "host"
return None
@property
def volumes(self) -> Dict[str, Dict[str, str]]:
"""Generate volumes for mappings."""
volumes = {str(self.addon.path_extern_data): {"bind": "/data", "mode": "rw"}}
addon_mapping = self.addon.map_volumes
# setup config mappings
if MAP_CONFIG in addon_mapping:
volumes.update(
{
str(self.sys_config.path_extern_homeassistant): {
"bind": "/config",
"mode": addon_mapping[MAP_CONFIG],
}
}
)
if MAP_SSL in addon_mapping:
volumes.update(
{
str(self.sys_config.path_extern_ssl): {
"bind": "/ssl",
"mode": addon_mapping[MAP_SSL],
}
}
)
if MAP_ADDONS in addon_mapping:
volumes.update(
{
str(self.sys_config.path_extern_addons_local): {
"bind": "/addons",
"mode": addon_mapping[MAP_ADDONS],
}
}
)
if MAP_BACKUP in addon_mapping:
volumes.update(
{
str(self.sys_config.path_extern_backup): {
"bind": "/backup",
"mode": addon_mapping[MAP_BACKUP],
}
}
)
if MAP_SHARE in addon_mapping:
volumes.update(
{
str(self.sys_config.path_extern_share): {
"bind": "/share",
"mode": addon_mapping[MAP_SHARE],
}
}
)
# Init other hardware mappings
# GPIO support
if self.addon.with_gpio and self.sys_hardware.support_gpio:
for gpio_path in ("/sys/class/gpio", "/sys/devices/platform/soc"):
volumes.update({gpio_path: {"bind": gpio_path, "mode": "rw"}})
# DeviceTree support
if self.addon.with_devicetree:
volumes.update(
{
"/sys/firmware/devicetree/base": {
"bind": "/device-tree",
"mode": "ro",
}
}
)
# Kernel Modules support
if self.addon.with_kernel_modules:
volumes.update({"/lib/modules": {"bind": "/lib/modules", "mode": "ro"}})
# Docker API support
if not self.addon.protected and self.addon.access_docker_api:
volumes.update(
{"/var/run/docker.sock": {"bind": "/var/run/docker.sock", "mode": "ro"}}
)
# Host D-Bus system
if self.addon.host_dbus:
volumes.update({"/var/run/dbus": {"bind": "/var/run/dbus", "mode": "rw"}})
# ALSA configuration
if self.addon.with_audio:
volumes.update(
{
str(self.addon.path_extern_asound): {
"bind": "/etc/asound.conf",
"mode": "ro",
}
}
)
return volumes
def _run(self) -> None:
"""Run Docker image.
Need run inside executor.
"""
if self._is_running():
return
# Security check
if not self.addon.protected:
_LOGGER.warning("%s run with disabled protected mode!", self.addon.name)
# Cleanup
with suppress(DockerAPIError):
self._stop()
# Create & Run container
docker_container = self.sys_docker.run(
self.image,
name=self.name,
hostname=self.hostname,
@@ -292,26 +342,23 @@ class DockerAddon(DockerInterface):
security_opt=self.security_opt,
environment=self.environment,
volumes=self.volumes,
tmpfs=self.tmpfs,
)
_LOGGER.info("Start Docker add-on %s with version %s", self.image, self.version)
self._meta = docker_container.attrs
def _install(self, tag: str, image: Optional[str] = None) -> None:
"""Pull Docker image or build it.
Need run inside executor.
"""
if self.addon.need_build:
self._build(tag)
super()._install(tag, image)
def _build(self, tag: str) -> None:
"""Build a Docker container.
Need run inside executor.
@@ -321,27 +368,27 @@ class DockerAddon(DockerInterface):
_LOGGER.info("Start build %s:%s", self.image, tag)
try:
image, log = self.sys_docker.images.build(
use_config_proxy=False, **build_env.get_docker_args(tag)
)
_LOGGER.debug("Build %s:%s done: %s", self.image, tag, log)
image.tag(self.image, tag="latest")
# Update meta data
self._meta = image.attrs
except docker.errors.DockerException as err:
_LOGGER.error("Can't build %s:%s: %s", self.image, tag, err)
raise DockerAPIError() from None
_LOGGER.info("Build %s:%s done", self.image, tag)
@process_lock @process_lock
def export_image(self, path): def export_image(self, tar_file: Path) -> Awaitable[None]:
"""Export current images into a tar file.""" """Export current images into a tar file."""
return self.sys_run_in_executor(self._export_image, path) return self.sys_run_in_executor(self._export_image, tar_file)
def _export_image(self, tar_file): def _export_image(self, tar_file: Path) -> None:
"""Export current images into a tar file. """Export current images into a tar file.
Need run inside executor. Need run inside executor.
@@ -350,7 +397,7 @@ class DockerAddon(DockerInterface):
image = self.sys_docker.api.get_image(self.image) image = self.sys_docker.api.get_image(self.image)
except docker.errors.DockerException as err: except docker.errors.DockerException as err:
_LOGGER.error("Can't fetch image %s: %s", self.image, err) _LOGGER.error("Can't fetch image %s: %s", self.image, err)
return False raise DockerAPIError() from None
_LOGGER.info("Export image %s to %s", self.image, tar_file) _LOGGER.info("Export image %s to %s", self.image, tar_file)
try: try:
@@ -359,17 +406,16 @@ class DockerAddon(DockerInterface):
write_tar.write(chunk) write_tar.write(chunk)
except (OSError, requests.exceptions.ReadTimeout) as err: except (OSError, requests.exceptions.ReadTimeout) as err:
_LOGGER.error("Can't write tar file %s: %s", tar_file, err) _LOGGER.error("Can't write tar file %s: %s", tar_file, err)
return False raise DockerAPIError() from None
_LOGGER.info("Export image %s done", self.image) _LOGGER.info("Export image %s done", self.image)
return True
@process_lock @process_lock
def import_image(self, path, tag): def import_image(self, tar_file: Path, tag: str) -> Awaitable[None]:
"""Import a tar file as image.""" """Import a tar file as image."""
return self.sys_run_in_executor(self._import_image, path, tag) return self.sys_run_in_executor(self._import_image, tar_file, tag)
def _import_image(self, tar_file, tag): def _import_image(self, tar_file: Path, tag: str) -> None:
"""Import a tar file as image. """Import a tar file as image.
Need run inside executor. Need run inside executor.
@@ -378,37 +424,38 @@ class DockerAddon(DockerInterface):
with tar_file.open("rb") as read_tar: with tar_file.open("rb") as read_tar:
self.sys_docker.api.load_image(read_tar, quiet=True) self.sys_docker.api.load_image(read_tar, quiet=True)
image = self.sys_docker.images.get(self.image) docker_image = self.sys_docker.images.get(self.image)
image.tag(self.image, tag=tag) docker_image.tag(self.image, tag=tag)
except (docker.errors.DockerException, OSError) as err: except (docker.errors.DockerException, OSError) as err:
_LOGGER.error("Can't import image %s: %s", self.image, err) _LOGGER.error("Can't import image %s: %s", self.image, err)
return False raise DockerAPIError() from None
_LOGGER.info("Import image %s and tag %s", tar_file, tag) _LOGGER.info("Import image %s and tag %s", tar_file, tag)
self._meta = image.attrs self._meta = docker_image.attrs
with suppress(DockerAPIError):
self._cleanup() self._cleanup()
return True
@process_lock @process_lock
def write_stdin(self, data): def write_stdin(self, data: bytes) -> Awaitable[None]:
"""Write to add-on stdin.""" """Write to add-on stdin."""
return self.sys_run_in_executor(self._write_stdin, data) return self.sys_run_in_executor(self._write_stdin, data)
def _write_stdin(self, data): def _write_stdin(self, data: bytes) -> None:
"""Write to add-on stdin. """Write to add-on stdin.
Need run inside executor. Need run inside executor.
""" """
if not self._is_running(): if not self._is_running():
return False raise DockerAPIError() from None
try: try:
# Load needed docker objects # Load needed docker objects
container = self.sys_docker.containers.get(self.name) container = self.sys_docker.containers.get(self.name)
socket = container.attach_socket(params={'stdin': 1, 'stream': 1}) socket = container.attach_socket(params={"stdin": 1, "stream": 1})
except docker.errors.DockerException as err: except docker.errors.DockerException as err:
_LOGGER.error("Can't attach to %s stdin: %s", self.name, err) _LOGGER.error("Can't attach to %s stdin: %s", self.name, err)
return False raise DockerAPIError() from None
try: try:
# Write to stdin # Write to stdin
@@ -417,6 +464,4 @@ class DockerAddon(DockerInterface):
socket.close() socket.close()
except OSError as err: except OSError as err:
_LOGGER.error("Can't write to %s stdin: %s", self.name, err) _LOGGER.error("Can't write to %s stdin: %s", self.name, err)
return False raise DockerAPIError() from None
return True


@@ -3,8 +3,8 @@ import logging

 import docker

-from .interface import DockerInterface
 from ..coresys import CoreSysAttributes
+from .interface import DockerInterface

 _LOGGER = logging.getLogger(__name__)

@@ -15,9 +15,9 @@ class DockerHassOSCli(DockerInterface, CoreSysAttributes):
     @property
     def image(self):
         """Return name of HassOS CLI image."""
-        return f"homeassistant/{self.sys_arch}-hassio-cli"
+        return f"homeassistant/{self.sys_arch.supervisor}-hassio-cli"

-    def _stop(self):
+    def _stop(self, remove_container=True):
         """Don't need stop."""
         return True

@@ -33,5 +33,6 @@ class DockerHassOSCli(DockerInterface, CoreSysAttributes):
         else:
             self._meta = image.attrs
-            _LOGGER.info("Found HassOS CLI %s with version %s",
-                         self.image, self.version)
+            _LOGGER.info(
+                "Found HassOS CLI %s with version %s", self.image, self.version
+            )


@@ -1,14 +1,18 @@
 """Init file for Hass.io Docker object."""
+from contextlib import suppress
+from ipaddress import IPv4Address
 import logging
+from typing import Awaitable

 import docker

-from .interface import DockerInterface
-from ..const import ENV_TOKEN, ENV_TIME, LABEL_MACHINE
+from ..const import ENV_TIME, ENV_TOKEN, LABEL_MACHINE
+from ..exceptions import DockerAPIError
+from .interface import CommandReturn, DockerInterface

 _LOGGER = logging.getLogger(__name__)

-HASS_DOCKER_NAME = 'homeassistant'
+HASS_DOCKER_NAME = "homeassistant"

 class DockerHomeAssistant(DockerInterface):
@@ -17,8 +21,8 @@ class DockerHomeAssistant(DockerInterface):
     @property
     def machine(self):
         """Return machine of Home Assistant Docker image."""
-        if self._meta and LABEL_MACHINE in self._meta['Config']['Labels']:
-            return self._meta['Config']['Labels'][LABEL_MACHINE]
+        if self._meta and LABEL_MACHINE in self._meta["Config"]["Labels"]:
+            return self._meta["Config"]["Labels"][LABEL_MACHINE]
         return None

     @property
@@ -39,18 +43,25 @@ class DockerHomeAssistant(DockerInterface):
             devices.append(f"{device}:{device}:rwm")
         return devices or None

-    def _run(self):
+    @property
+    def ip_address(self) -> IPv4Address:
+        """Return IP address of this container."""
+        return self.sys_docker.network.gateway
+
+    def _run(self) -> None:
         """Run Docker image.

         Need run inside executor.
         """
         if self._is_running():
-            return False
+            return

-        # cleanup
-        self._stop()
+        # Cleanup
+        with suppress(DockerAPIError):
+            self._stop()

-        ret = self.sys_docker.run(
+        # Create & Run container
+        docker_container = self.sys_docker.run(
             self.image,
             name=self.name,
             hostname=self.name,
@@ -58,29 +69,29 @@ class DockerHomeAssistant(DockerInterface):
             privileged=True,
             init=True,
             devices=self.devices,
-            network_mode='host',
+            network_mode="host",
             environment={
-                'HASSIO': self.sys_docker.network.supervisor,
+                "HASSIO": self.sys_docker.network.supervisor,
                 ENV_TIME: self.sys_timezone,
                 ENV_TOKEN: self.sys_homeassistant.hassio_token,
             },
             volumes={
-                str(self.sys_config.path_extern_homeassistant):
-                    {'bind': '/config', 'mode': 'rw'},
-                str(self.sys_config.path_extern_ssl):
-                    {'bind': '/ssl', 'mode': 'ro'},
-                str(self.sys_config.path_extern_share):
-                    {'bind': '/share', 'mode': 'rw'},
-            }
+                str(self.sys_config.path_extern_homeassistant): {
+                    "bind": "/config",
+                    "mode": "rw",
+                },
+                str(self.sys_config.path_extern_ssl): {"bind": "/ssl", "mode": "ro"},
+                str(self.sys_config.path_extern_share): {
+                    "bind": "/share",
+                    "mode": "rw",
+                },
+            },
         )

-        if ret:
-            _LOGGER.info("Start homeassistant %s with version %s",
-                         self.image, self.version)
-
-        return ret
+        _LOGGER.info("Start homeassistant %s with version %s", self.image, self.version)
+        self._meta = docker_container.attrs

-    def _execute_command(self, command):
+    def _execute_command(self, command: str) -> CommandReturn:
         """Create a temporary container and run command.

         Need run inside executor.
@@ -94,31 +105,37 @@ class DockerHomeAssistant(DockerInterface):
             detach=True,
             stdout=True,
             stderr=True,
-            environment={
-                ENV_TIME: self.sys_timezone,
-            },
+            environment={ENV_TIME: self.sys_timezone},
             volumes={
-                str(self.sys_config.path_extern_homeassistant):
-                    {'bind': '/config', 'mode': 'rw'},
-                str(self.sys_config.path_extern_ssl):
-                    {'bind': '/ssl', 'mode': 'ro'},
-                str(self.sys_config.path_extern_share):
-                    {'bind': '/share', 'mode': 'ro'},
-            }
+                str(self.sys_config.path_extern_homeassistant): {
+                    "bind": "/config",
+                    "mode": "rw",
+                },
+                str(self.sys_config.path_extern_ssl): {"bind": "/ssl", "mode": "ro"},
+                str(self.sys_config.path_extern_share): {
+                    "bind": "/share",
+                    "mode": "ro",
+                },
+            },
         )

-    def is_initialize(self):
+    def is_initialize(self) -> Awaitable[bool]:
         """Return True if Docker container exists."""
         return self.sys_run_in_executor(self._is_initialize)

-    def _is_initialize(self):
+    def _is_initialize(self) -> bool:
         """Return True if docker container exists.

         Need run inside executor.
         """
         try:
-            self.sys_docker.containers.get(self.name)
+            docker_container = self.sys_docker.containers.get(self.name)
+            docker_image = self.sys_docker.images.get(self.image)
         except docker.errors.DockerException:
             return False

+        # we run on an old image, stop and start it
+        if docker_container.image.id != docker_image.id:
+            return False
+
         return True
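The new `_is_initialize` no longer just checks that the container exists; it also compares the container's image ID against the currently pulled image so that a container left over from an older image reads as "not initialized" and gets recreated. The comparison logic, sketched with hypothetical dataclass stand-ins for the docker SDK objects:

```python
from dataclasses import dataclass


@dataclass
class Image:
    id: str  # content-addressed image ID, e.g. "sha256:..."


@dataclass
class Container:
    status: str
    image: Image  # the image the container was created from


def is_current(container: Container, local_image: Image) -> bool:
    # If the locally tagged image has been replaced since the container
    # was created, the IDs differ and the container is stale.
    return container.image.id == local_image.id
```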


@@ -2,13 +2,16 @@
 import asyncio
 from contextlib import suppress
 import logging
+from typing import Any, Awaitable, Dict, Optional

 import docker

-from .stats import DockerStats
-from ..const import LABEL_VERSION, LABEL_ARCH
-from ..coresys import CoreSysAttributes
+from ..const import LABEL_ARCH, LABEL_VERSION
+from ..coresys import CoreSys, CoreSysAttributes
+from ..exceptions import DockerAPIError
 from ..utils import process_lock
+from .stats import DockerStats
+from . import CommandReturn

 _LOGGER = logging.getLogger(__name__)

@@ -16,118 +19,119 @@ _LOGGER = logging.getLogger(__name__)
 class DockerInterface(CoreSysAttributes):
     """Docker Hass.io interface."""

-    def __init__(self, coresys):
+    def __init__(self, coresys: CoreSys):
         """Initialize Docker base wrapper."""
-        self.coresys = coresys
-        self._meta = None
-        self.lock = asyncio.Lock(loop=coresys.loop)
+        self.coresys: CoreSys = coresys
+        self._meta: Optional[Dict[str, Any]] = None
+        self.lock: asyncio.Lock = asyncio.Lock(loop=coresys.loop)

     @property
-    def timeout(self):
+    def timeout(self) -> int:
         """Return timeout for Docker actions."""
         return 30

     @property
-    def name(self):
+    def name(self) -> Optional[str]:
         """Return name of Docker container."""
         return None

     @property
-    def meta_config(self):
+    def meta_config(self) -> Dict[str, Any]:
         """Return meta data of configuration for container/image."""
         if not self._meta:
             return {}
-        return self._meta.get('Config', {})
+        return self._meta.get("Config", {})

     @property
-    def meta_labels(self):
+    def meta_labels(self) -> Dict[str, str]:
         """Return meta data of labels for container/image."""
-        return self.meta_config.get('Labels') or {}
+        return self.meta_config.get("Labels") or {}

     @property
-    def image(self):
+    def image(self) -> Optional[str]:
         """Return name of Docker image."""
-        return self.meta_config.get('Image')
+        return self.meta_config.get("Image")

     @property
-    def version(self):
+    def version(self) -> Optional[str]:
         """Return version of Docker image."""
         return self.meta_labels.get(LABEL_VERSION)

     @property
-    def arch(self):
+    def arch(self) -> Optional[str]:
         """Return arch of Docker image."""
         return self.meta_labels.get(LABEL_ARCH)

     @property
-    def in_progress(self):
+    def in_progress(self) -> bool:
         """Return True if a task is in progress."""
         return self.lock.locked()

     @process_lock
-    def install(self, tag):
+    def install(self, tag: str, image: Optional[str] = None):
         """Pull docker image."""
-        return self.sys_run_in_executor(self._install, tag)
+        return self.sys_run_in_executor(self._install, tag, image)

-    def _install(self, tag):
+    def _install(self, tag: str, image: Optional[str] = None) -> None:
         """Pull Docker image.

         Need run inside executor.
         """
+        image = image or self.image
+
         try:
-            _LOGGER.info("Pull image %s tag %s.", self.image, tag)
-            image = self.sys_docker.images.pull(f"{self.image}:{tag}")
+            _LOGGER.info("Pull image %s tag %s.", image, tag)
+            docker_image = self.sys_docker.images.pull(f"{image}:{tag}")

-            image.tag(self.image, tag='latest')
-            self._meta = image.attrs
+            _LOGGER.info("Tag image %s with version %s as latest", image, tag)
+            docker_image.tag(image, tag="latest")
         except docker.errors.APIError as err:
-            _LOGGER.error("Can't install %s:%s -> %s.", self.image, tag, err)
-            return False
+            _LOGGER.error("Can't install %s:%s -> %s.", image, tag, err)
+            raise DockerAPIError() from None
+        else:
+            self._meta = docker_image.attrs

-        _LOGGER.info("Tag image %s with version %s as latest", self.image, tag)
-        return True
-
-    def exists(self):
+    def exists(self) -> Awaitable[bool]:
         """Return True if Docker image exists in local repository."""
         return self.sys_run_in_executor(self._exists)

-    def _exists(self):
+    def _exists(self) -> bool:
         """Return True if Docker image exists in local repository.

         Need run inside executor.
         """
         try:
-            image = self.sys_docker.images.get(self.image)
-            assert f"{self.image}:{self.version}" in image.tags
+            docker_image = self.sys_docker.images.get(self.image)
+            assert f"{self.image}:{self.version}" in docker_image.tags
         except (docker.errors.DockerException, AssertionError):
             return False

         return True

-    def is_running(self):
+    def is_running(self) -> Awaitable[bool]:
         """Return True if Docker is running.

         Return a Future.
         """
         return self.sys_run_in_executor(self._is_running)

-    def _is_running(self):
+    def _is_running(self) -> bool:
         """Return True if Docker is running.

         Need run inside executor.
         """
         try:
-            container = self.sys_docker.containers.get(self.name)
-            image = self.sys_docker.images.get(self.image)
+            docker_container = self.sys_docker.containers.get(self.name)
+            docker_image = self.sys_docker.images.get(self.image)
         except docker.errors.DockerException:
             return False

         # container is not running
-        if container.status != 'running':
+        if docker_container.status != "running":
             return False

         # we run on an old image, stop and start it
-        if container.image.id != image.id:
+        if docker_container.image.id != docker_image.id:
             return False

         return True

@@ -137,7 +141,7 @@ class DockerInterface(CoreSysAttributes):
         """Attach to running Docker container."""
         return self.sys_run_in_executor(self._attach)

-    def _attach(self):
+    def _attach(self) -> None:
         """Attach to running docker container.

         Need run inside executor.
@@ -145,22 +149,21 @@ class DockerInterface(CoreSysAttributes):
         try:
             if self.image:
                 self._meta = self.sys_docker.images.get(self.image).attrs
-            self._meta = self.sys_docker.containers.get(self.name).attrs
+            else:
+                self._meta = self.sys_docker.containers.get(self.name).attrs
         except docker.errors.DockerException:
-            return False
+            pass

-        _LOGGER.info(
-            "Attach to image %s with version %s", self.image, self.version)
+        # Successful?
+        if not self._meta:
+            raise DockerAPIError() from None

-        return True
+        _LOGGER.info("Attach to %s with version %s", self.image, self.version)

     @process_lock
-    def run(self):
+    def run(self) -> Awaitable[None]:
         """Run Docker image."""
         return self.sys_run_in_executor(self._run)

-    def _run(self):
+    def _run(self) -> None:
         """Run Docker image.

         Need run inside executor.
@@ -168,114 +171,137 @@ class DockerInterface(CoreSysAttributes):
         raise NotImplementedError()

     @process_lock
-    def stop(self):
+    def stop(self, remove_container=True) -> Awaitable[None]:
         """Stop/remove Docker container."""
-        return self.sys_run_in_executor(self._stop)
+        return self.sys_run_in_executor(self._stop, remove_container)

-    def _stop(self):
-        """Stop/remove and remove docker container.
+    def _stop(self, remove_container=True) -> None:
+        """Stop/remove Docker container.

         Need run inside executor.
         """
         try:
-            container = self.sys_docker.containers.get(self.name)
+            docker_container = self.sys_docker.containers.get(self.name)
         except docker.errors.DockerException:
-            return False
+            raise DockerAPIError() from None

-        if container.status == 'running':
-            _LOGGER.info("Stop %s Docker application", self.image)
+        if docker_container.status == "running":
+            _LOGGER.info("Stop %s application", self.name)
             with suppress(docker.errors.DockerException):
-                container.stop(timeout=self.timeout)
+                docker_container.stop(timeout=self.timeout)

+        if remove_container:
             with suppress(docker.errors.DockerException):
-                _LOGGER.info("Clean %s Docker application", self.image)
-                container.remove(force=True)
+                _LOGGER.info("Clean %s application", self.name)
+                docker_container.remove(force=True)

-        return True
+    @process_lock
+    def start(self) -> Awaitable[None]:
+        """Start Docker container."""
+        return self.sys_run_in_executor(self._start)
+
+    def _start(self) -> None:
+        """Start docker container.
+
+        Need run inside executor.
+        """
+        try:
+            docker_container = self.sys_docker.containers.get(self.name)
+        except docker.errors.DockerException:
+            raise DockerAPIError() from None
+
+        _LOGGER.info("Start %s", self.image)
+        try:
+            docker_container.start()
+        except docker.errors.DockerException as err:
+            _LOGGER.error("Can't start %s: %s", self.image, err)
+            raise DockerAPIError() from None

     @process_lock
-    def remove(self):
+    def remove(self) -> Awaitable[None]:
         """Remove Docker images."""
         return self.sys_run_in_executor(self._remove)

-    def _remove(self):
+    def _remove(self) -> None:
         """remove docker images.

         Need run inside executor.
         """
         # Cleanup container
-        self._stop()
+        with suppress(DockerAPIError):
+            self._stop()

-        _LOGGER.info(
-            "Remove Docker %s with latest and %s", self.image, self.version)
+        _LOGGER.info("Remove image %s with latest and %s", self.image, self.version)

         try:
             with suppress(docker.errors.ImageNotFound):
-                self.sys_docker.images.remove(
-                    image=f"{self.image}:latest", force=True)
+                self.sys_docker.images.remove(image=f"{self.image}:latest", force=True)

             with suppress(docker.errors.ImageNotFound):
                 self.sys_docker.images.remove(
-                    image=f"{self.image}:{self.version}", force=True)
+                    image=f"{self.image}:{self.version}", force=True
+                )

         except docker.errors.DockerException as err:
             _LOGGER.warning("Can't remove image %s: %s", self.image, err)
-            return False
+            raise DockerAPIError() from None

         self._meta = None
-        return True

     @process_lock
-    def update(self, tag):
+    def update(self, tag: str, image: Optional[str] = None) -> Awaitable[None]:
         """Update a Docker image."""
-        return self.sys_run_in_executor(self._update, tag)
+        return self.sys_run_in_executor(self._update, tag, image)

-    def _update(self, tag):
+    def _update(self, tag: str, image: Optional[str] = None) -> None:
         """Update a docker image.

         Need run inside executor.
         """
+        image = image or self.image
+
         _LOGGER.info(
-            "Update Docker %s with %s:%s", self.version, self.image, tag)
+            "Update image %s:%s to %s:%s", self.image, self.version, image, tag
+        )

         # Update docker image
-        if not self._install(tag):
-            return False
+        self._install(tag, image)

         # Stop container & cleanup
-        self._stop()
-        self._cleanup()
-
-        return True
+        with suppress(DockerAPIError):
+            try:
+                self._stop()
+            finally:
+                self._cleanup()

-    def logs(self):
+    def logs(self) -> Awaitable[bytes]:
         """Return Docker logs of container.

         Return a Future.
         """
         return self.sys_run_in_executor(self._logs)

-    def _logs(self):
+    def _logs(self) -> bytes:
         """Return Docker logs of container.

         Need run inside executor.
         """
         try:
-            container = self.sys_docker.containers.get(self.name)
+            docker_container = self.sys_docker.containers.get(self.name)
         except docker.errors.DockerException:
             return b""

         try:
-            return container.logs(tail=100, stdout=True, stderr=True)
+            return docker_container.logs(tail=100, stdout=True, stderr=True)
         except docker.errors.DockerException as err:
             _LOGGER.warning("Can't grep logs from %s: %s", self.image, err)

     @process_lock
-    def cleanup(self):
+    def cleanup(self) -> Awaitable[None]:
         """Check if old version exists and cleanup."""
         return self.sys_run_in_executor(self._cleanup)

-    def _cleanup(self):
+    def _cleanup(self) -> None:
         """Check if old version exists and cleanup.

         Need run inside executor.
@@ -284,47 +310,94 @@ class DockerInterface(CoreSysAttributes):
             latest = self.sys_docker.images.get(self.image)
         except docker.errors.DockerException:
             _LOGGER.warning("Can't find %s for cleanup", self.image)
-            return False
+            raise DockerAPIError() from None

         for image in self.sys_docker.images.list(name=self.image):
             if latest.id == image.id:
                 continue

             with suppress(docker.errors.DockerException):
-                _LOGGER.info("Cleanup Docker images: %s", image.tags)
+                _LOGGER.info("Cleanup images: %s", image.tags)
                 self.sys_docker.images.remove(image.id, force=True)

-        return True
-
     @process_lock
-    def execute_command(self, command):
+    def restart(self) -> Awaitable[None]:
+        """Restart docker container."""
+        return self.sys_loop.run_in_executor(None, self._restart)
+
+    def _restart(self) -> None:
+        """Restart docker container.
+
+        Need run inside executor.
+        """
+        try:
+            container = self.sys_docker.containers.get(self.name)
+        except docker.errors.DockerException:
+            raise DockerAPIError() from None
+
+        _LOGGER.info("Restart %s", self.image)
+        try:
+            container.restart(timeout=self.timeout)
+        except docker.errors.DockerException as err:
+            _LOGGER.warning("Can't restart %s: %s", self.image, err)
+            raise DockerAPIError() from None
+
+    @process_lock
+    def execute_command(self, command: str) -> Awaitable[CommandReturn]:
         """Create a temporary container and run command."""
         return self.sys_run_in_executor(self._execute_command, command)

-    def _execute_command(self, command):
+    def _execute_command(self, command: str) -> CommandReturn:
         """Create a temporary container and run command.

         Need run inside executor.
         """
         raise NotImplementedError()

-    def stats(self):
+    def stats(self) -> Awaitable[DockerStats]:
         """Read and return stats from container."""
         return self.sys_run_in_executor(self._stats)

-    def _stats(self):
+    def _stats(self) -> DockerStats:
         """Create a temporary container and run command.

         Need run inside executor.
         """
         try:
-            container = self.sys_docker.containers.get(self.name)
+            docker_container = self.sys_docker.containers.get(self.name)
         except docker.errors.DockerException:
-            return None
+            raise DockerAPIError() from None

         try:
-            stats = container.stats(stream=False)
+            stats = docker_container.stats(stream=False)
             return DockerStats(stats)
         except docker.errors.DockerException as err:
             _LOGGER.error("Can't read stats from %s: %s", self.name, err)
-            return None
+            raise DockerAPIError() from None
+
+    def is_fails(self) -> Awaitable[bool]:
+        """Return True if Docker is failing state.
+
+        Return a Future.
+        """
+        return self.sys_run_in_executor(self._is_fails)
+
+    def _is_fails(self) -> bool:
+        """Return True if Docker is failing state.
+
+        Need run inside executor.
+        """
+        try:
+            docker_container = self.sys_docker.containers.get(self.name)
+        except docker.errors.DockerException:
+            return False
+
+        # container is not running
+        if docker_container.status != "exited":
+            return False
+
+        # Check return value
+        if int(docker_container.attrs["State"]["ExitCode"]) != 0:
+            return True
+
+        return False
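`DockerInterface` keeps a consistent split throughout: each public method (`logs`, `stop`, `stats`, ...) returns an `Awaitable` by scheduling a blocking `_`-prefixed twin on an executor, so the docker SDK's synchronous calls never block the event loop. A self-contained sketch of that split, with `time.sleep` standing in for a blocking Docker API call:

```python
import asyncio
import time
from typing import Awaitable


class Interface:
    """Minimal sketch of the public-async / private-blocking pattern."""

    def logs(self) -> Awaitable[bytes]:
        # Schedule the blocking variant on the default thread pool and
        # hand the caller an awaitable (mirrors sys_run_in_executor).
        loop = asyncio.get_running_loop()
        return loop.run_in_executor(None, self._logs)

    def _logs(self) -> bytes:
        time.sleep(0.01)  # stands in for a blocking Docker API call
        return b"log line"


async def main() -> bytes:
    # The caller simply awaits; the blocking work runs off-loop.
    return await Interface().logs()
```

The `@process_lock` decorator in the real code additionally serializes these public methods per object, which is why `in_progress` can just report `self.lock.locked()`.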


@@ -1,9 +1,12 @@
"""Internal network manager for Hass.io.""" """Internal network manager for Hass.io."""
from ipaddress import IPv4Address
import logging import logging
from typing import List, Optional
import docker import docker
from ..const import DOCKER_NETWORK_MASK, DOCKER_NETWORK, DOCKER_NETWORK_RANGE from ..const import DOCKER_NETWORK, DOCKER_NETWORK_MASK, DOCKER_NETWORK_RANGE
from ..exceptions import DockerAPIError
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
@@ -14,32 +17,32 @@ class DockerNetwork:
This class is not AsyncIO safe! This class is not AsyncIO safe!
""" """
def __init__(self, dock): def __init__(self, docker_client: docker.DockerClient):
"""Initialize internal Hass.io network.""" """Initialize internal Hass.io network."""
self.docker = dock self.docker: docker.DockerClient = docker_client
self.network = self._get_network() self.network: docker.models.networks.Network = self._get_network()
@property @property
def name(self): def name(self) -> str:
"""Return name of network.""" """Return name of network."""
return DOCKER_NETWORK return DOCKER_NETWORK
@property @property
def containers(self): def containers(self) -> List[docker.models.containers.Container]:
"""Return of connected containers from network.""" """Return of connected containers from network."""
return self.network.containers return self.network.containers
@property @property
def gateway(self): def gateway(self) -> IPv4Address:
"""Return gateway of the network.""" """Return gateway of the network."""
return DOCKER_NETWORK_MASK[1] return DOCKER_NETWORK_MASK[1]
@property @property
def supervisor(self): def supervisor(self) -> IPv4Address:
"""Return supervisor of the network.""" """Return supervisor of the network."""
return DOCKER_NETWORK_MASK[2] return DOCKER_NETWORK_MASK[2]
def _get_network(self): def _get_network(self) -> docker.models.networks.Network:
"""Get HassIO network.""" """Get HassIO network."""
try: try:
return self.docker.networks.get(DOCKER_NETWORK) return self.docker.networks.get(DOCKER_NETWORK)
@@ -49,18 +52,25 @@ class DockerNetwork:
ipam_pool = docker.types.IPAMPool( ipam_pool = docker.types.IPAMPool(
subnet=str(DOCKER_NETWORK_MASK), subnet=str(DOCKER_NETWORK_MASK),
gateway=str(self.gateway), gateway=str(self.gateway),
iprange=str(DOCKER_NETWORK_RANGE) iprange=str(DOCKER_NETWORK_RANGE),
) )
ipam_config = docker.types.IPAMConfig(pool_configs=[ipam_pool]) ipam_config = docker.types.IPAMConfig(pool_configs=[ipam_pool])
return self.docker.networks.create( return self.docker.networks.create(
DOCKER_NETWORK, driver='bridge', ipam=ipam_config, DOCKER_NETWORK,
enable_ipv6=False, options={ driver="bridge",
"com.docker.network.bridge.name": DOCKER_NETWORK, ipam=ipam_config,
}) enable_ipv6=False,
options={"com.docker.network.bridge.name": DOCKER_NETWORK},
)
def attach_container(self, container, alias=None, ipv4=None): def attach_container(
self,
container: docker.models.containers.Container,
alias: Optional[List[str]] = None,
ipv4: Optional[IPv4Address] = None,
) -> None:
"""Attach container to Hass.io network. """Attach container to Hass.io network.
Need run inside executor. Need run inside executor.
@@ -71,23 +81,24 @@ class DockerNetwork:
self.network.connect(container, aliases=alias, ipv4_address=ipv4) self.network.connect(container, aliases=alias, ipv4_address=ipv4)
except docker.errors.APIError as err: except docker.errors.APIError as err:
_LOGGER.error("Can't link container to hassio-net: %s", err) _LOGGER.error("Can't link container to hassio-net: %s", err)
return False raise DockerAPIError() from None
self.network.reload() self.network.reload()
return True
def detach_default_bridge(self, container): def detach_default_bridge(
self, container: docker.models.containers.Container
) -> None:
"""Detach default Docker bridge. """Detach default Docker bridge.
Need run inside executor. Need run inside executor.
""" """
try: try:
default_network = self.docker.networks.get('bridge') default_network = self.docker.networks.get("bridge")
default_network.disconnect(container) default_network.disconnect(container)
except docker.errors.NotFound: except docker.errors.NotFound:
return return
except docker.errors.APIError as err: except docker.errors.APIError as err:
_LOGGER.warning( _LOGGER.warning("Can't disconnect container from default: %s", err)
"Can't disconnect container from default: %s", err) raise DockerAPIError() from None

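The network diff above swaps boolean return codes for a raised `DockerAPIError`, using `raise ... from None` so docker-py's own traceback is not chained onto the log. A minimal standalone sketch of that pattern (the class names here are illustrative stand-ins, not code from the diff):

```python
class DockerAPIError(Exception):
    """Stand-in for hassio.exceptions.DockerAPIError."""


class FakeNetwork:
    """Hypothetical network object whose connect() always fails."""

    def connect(self, container):
        raise RuntimeError("no such container")


def attach_container(network, container):
    # Mirrors the new attach_container: log-and-raise instead of `return False`.
    try:
        network.connect(container)
    except RuntimeError:
        # `from None` suppresses the original exception chain, as in the diff.
        raise DockerAPIError() from None


handled = False
try:
    attach_container(FakeNetwork(), "addon_example")
except DockerAPIError:
    handled = True

print(handled)  # True
```

Callers can no longer silently ignore a failed attach, which is the point of the migration.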
View File

@@ -1,11 +1,13 @@
 """Init file for Hass.io Docker object."""
+from ipaddress import IPv4Address
 import logging
 import os

 import docker

-from .interface import DockerInterface
 from ..coresys import CoreSysAttributes
+from ..exceptions import DockerAPIError
+from .interface import DockerInterface

 _LOGGER = logging.getLogger(__name__)
@@ -14,29 +16,36 @@ class DockerSupervisor(DockerInterface, CoreSysAttributes):
     """Docker Hass.io wrapper for Supervisor."""

     @property
-    def name(self):
+    def name(self) -> str:
         """Return name of Docker container."""
-        return os.environ['SUPERVISOR_NAME']
+        return os.environ["SUPERVISOR_NAME"]

-    def _attach(self):
+    @property
+    def ip_address(self) -> IPv4Address:
+        """Return IP address of this container."""
+        return self.sys_docker.network.supervisor
+
+    def _attach(self) -> None:
         """Attach to running docker container.

         Need run inside executor.
         """
         try:
-            container = self.sys_docker.containers.get(self.name)
+            docker_container = self.sys_docker.containers.get(self.name)
         except docker.errors.DockerException:
-            return False
+            raise DockerAPIError() from None

-        self._meta = container.attrs
-        _LOGGER.info("Attach to Supervisor %s with version %s",
-                     self.image, self.version)
+        self._meta = docker_container.attrs
+        _LOGGER.info(
+            "Attach to Supervisor %s with version %s", self.image, self.version
+        )

         # If already attach
-        if container in self.sys_docker.network.containers:
-            return True
+        if docker_container in self.sys_docker.network.containers:
+            return

         # Attach to network
-        return self.sys_docker.network.attach_container(
-            container, alias=['hassio'],
-            ipv4=self.sys_docker.network.supervisor)
+        _LOGGER.info("Connect Supervisor to Hass.io Network")
+        self.sys_docker.network.attach_container(
+            docker_container, alias=["hassio"], ipv4=self.sys_docker.network.supervisor
+        )
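With docker failures now raised as exceptions, best-effort call sites throughout the diff use `contextlib.suppress` instead of checking a boolean. A minimal sketch (the `attach` function is a hypothetical stand-in for the supervisor's docker calls):

```python
from contextlib import suppress


class DockerAPIError(Exception):
    """Stand-in for hassio.exceptions.DockerAPIError."""


def attach():
    """Hypothetical attach call that fails like DockerSupervisor._attach."""
    raise DockerAPIError()


# Best-effort call site: the failure is swallowed and execution continues,
# replacing the old `if not attach(): ...` bookkeeping.
reached = False
with suppress(DockerAPIError):
    attach()
reached = True

print(reached)  # True
```

The same `with suppress(DockerAPIError):` shape appears in `HassOS.load`, `HomeAssistant.load`, and the cleanup paths below.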

View File

@@ -3,111 +3,135 @@
 class HassioError(Exception):
     """Root exception."""
-    pass


 class HassioNotSupportedError(HassioError):
     """Function is not supported."""
-    pass


 # HomeAssistant
 class HomeAssistantError(HassioError):
     """Home Assistant exception."""
-    pass


 class HomeAssistantUpdateError(HomeAssistantError):
     """Error on update of a Home Assistant."""
-    pass


 class HomeAssistantAPIError(HomeAssistantError):
     """Home Assistant API exception."""
-    pass


 class HomeAssistantAuthError(HomeAssistantAPIError):
     """Home Assistant Auth API exception."""
-    pass


+# Supervisor
+class SupervisorError(HassioError):
+    """Supervisor error."""
+
+
+class SupervisorUpdateError(SupervisorError):
+    """Supervisor update error."""
+
+
 # HassOS
 class HassOSError(HassioError):
     """HassOS exception."""
-    pass


 class HassOSUpdateError(HassOSError):
     """Error on update of a HassOS."""
-    pass


 class HassOSNotSupportedError(HassioNotSupportedError):
     """Function not supported by HassOS."""
-    pass


+# Addons
+class AddonsError(HassioError):
+    """Addons exception."""
+
+
+class AddonsNotSupportedError(HassioNotSupportedError):
+    """Addons don't support a function."""
+
+
+# Arch
+class HassioArchNotFound(HassioNotSupportedError):
+    """No matches with exists arch."""
+
+
 # Updater
 class HassioUpdaterError(HassioError):
     """Error on Updater."""
-    pass


+# Auth
+class AuthError(HassioError):
+    """Auth errors."""
+
+
 # Host
 class HostError(HassioError):
     """Internal Host error."""
-    pass


 class HostNotSupportedError(HassioNotSupportedError):
     """Host function is not supprted."""
-    pass


 class HostServiceError(HostError):
     """Host service functions fails."""
-    pass


 class HostAppArmorError(HostError):
     """Host apparmor functions fails."""
-    pass


 # API
 class APIError(HassioError, RuntimeError):
     """API errors."""
-    pass


 class APIForbidden(APIError):
     """API forbidden error."""
-    pass


 # Service / Discovery
 class DiscoveryError(HassioError):
     """Discovery Errors."""
-    pass


 class ServicesError(HassioError):
     """Services Errors."""
-    pass


 # utils/gdbus
 class DBusError(HassioError):
     """DBus generic error."""
-    pass


 class DBusNotConnectedError(HostNotSupportedError):
@@ -116,26 +140,36 @@ class DBusNotConnectedError(HostNotSupportedError):
 class DBusFatalError(DBusError):
     """DBus call going wrong."""
-    pass


 class DBusParseError(DBusError):
     """DBus parse error."""
-    pass


 # util/apparmor
 class AppArmorError(HostAppArmorError):
     """General AppArmor error."""
-    pass


 class AppArmorFileError(AppArmorError):
     """AppArmor profile file error."""
-    pass


 class AppArmorInvalidError(AppArmorError):
     """AppArmor profile validate error."""
-    pass
+
+
+# util/json
+class JsonFileError(HassioError):
+    """Invalid json file."""
+
+
+# docker/api
+class DockerAPIError(HassioError):
+    """Docker API error."""
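Every new exception above ultimately derives from `HassioError`, so one top-level handler can catch any subsystem failure while call sites still raise and catch the specific types. A small sketch of that property, using a few of the classes from the diff:

```python
class HassioError(Exception):
    """Root exception."""


class HassOSError(HassioError):
    """HassOS exception."""


class HassOSUpdateError(HassOSError):
    """Error on update of a HassOS."""


# Catching the root type handles every subsystem-specific error in one place:
caught = None
try:
    raise HassOSUpdateError("OTA flash failed")
except HassioError as err:
    caught = type(err).__name__

print(caught)  # HassOSUpdateError
```

This is why the diff can delete the `pass` bodies freely: the docstring alone is a valid class body, and only the inheritance chain carries behavior.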

View File

@@ -1,15 +1,22 @@
 """HassOS support on supervisor."""
 import asyncio
+from contextlib import suppress
 import logging
 from pathlib import Path
+from typing import Awaitable, Optional

 import aiohttp
 from cpe import CPE

-from .coresys import CoreSysAttributes
 from .const import URL_HASSOS_OTA
+from .coresys import CoreSysAttributes, CoreSys
 from .docker.hassos_cli import DockerHassOSCli
-from .exceptions import HassOSNotSupportedError, HassOSUpdateError, DBusError
+from .exceptions import (
+    DBusError,
+    HassOSNotSupportedError,
+    HassOSUpdateError,
+    DockerAPIError,
+)

 _LOGGER = logging.getLogger(__name__)
@@ -17,61 +24,61 @@ _LOGGER = logging.getLogger(__name__)
 class HassOS(CoreSysAttributes):
     """HassOS interface inside HassIO."""

-    def __init__(self, coresys):
+    def __init__(self, coresys: CoreSys):
         """Initialize HassOS handler."""
-        self.coresys = coresys
-        self.instance = DockerHassOSCli(coresys)
-        self._available = False
-        self._version = None
-        self._board = None
+        self.coresys: CoreSys = coresys
+        self.instance: DockerHassOSCli = DockerHassOSCli(coresys)
+        self._available: bool = False
+        self._version: Optional[str] = None
+        self._board: Optional[str] = None

     @property
-    def available(self):
+    def available(self) -> bool:
         """Return True, if HassOS on host."""
         return self._available

     @property
-    def version(self):
+    def version(self) -> Optional[str]:
         """Return version of HassOS."""
         return self._version

     @property
-    def version_cli(self):
+    def version_cli(self) -> Optional[str]:
         """Return version of HassOS cli."""
         return self.instance.version

     @property
-    def version_latest(self):
+    def version_latest(self) -> str:
         """Return version of HassOS."""
         return self.sys_updater.version_hassos

     @property
-    def version_cli_latest(self):
+    def version_cli_latest(self) -> str:
         """Return version of HassOS."""
         return self.sys_updater.version_hassos_cli

     @property
-    def need_update(self):
+    def need_update(self) -> bool:
         """Return true if a HassOS update is available."""
         return self.version != self.version_latest

     @property
-    def need_cli_update(self):
+    def need_cli_update(self) -> bool:
         """Return true if a HassOS cli update is available."""
         return self.version_cli != self.version_cli_latest

     @property
-    def board(self):
+    def board(self) -> Optional[str]:
         """Return board name."""
         return self._board

-    def _check_host(self):
-        """Check if HassOS is availabe."""
+    def _check_host(self) -> None:
+        """Check if HassOS is available."""
         if not self.available:
             _LOGGER.error("No HassOS available")
             raise HassOSNotSupportedError()

-    async def _download_raucb(self, version):
+    async def _download_raucb(self, version: str) -> None:
         """Download rauc bundle (OTA) from github."""
         url = URL_HASSOS_OTA.format(version=version, board=self.board)
         raucb = Path(self.sys_config.path_tmp, f"hassos-{version}.raucb")
@@ -83,9 +90,9 @@ class HassOS(CoreSysAttributes):
                 raise HassOSUpdateError()

             # Download RAUCB file
-            with raucb.open('wb') as ota_file:
+            with raucb.open("wb") as ota_file:
                 while True:
-                    chunk = await request.content.read(1048576)
+                    chunk = await request.content.read(1_048_576)
                     if not chunk:
                         break
                     ota_file.write(chunk)
@@ -101,7 +108,7 @@ class HassOS(CoreSysAttributes):
             raise HassOSUpdateError()

-    async def load(self):
+    async def load(self) -> None:
         """Load HassOS data."""
         try:
             # Check needed host functions
@@ -111,7 +118,7 @@ class HassOS(CoreSysAttributes):
             assert self.sys_host.info.cpe is not None

             cpe = CPE(self.sys_host.info.cpe)
-            assert cpe.get_product()[0] == 'hassos'
+            assert cpe.get_product()[0] == "hassos"
         except (AssertionError, NotImplementedError):
             _LOGGER.debug("Found no HassOS")
             return
@@ -122,9 +129,10 @@ class HassOS(CoreSysAttributes):
         self._board = cpe.get_target_hardware()[0]

         _LOGGER.info("Detect HassOS %s on host system", self.version)
-        await self.instance.attach()
+        with suppress(DockerAPIError):
+            await self.instance.attach()

-    def config_sync(self):
+    def config_sync(self) -> Awaitable[None]:
         """Trigger a host config reload from usb.

         Return a coroutine.
@@ -132,9 +140,9 @@ class HassOS(CoreSysAttributes):
         self._check_host()

         _LOGGER.info("Syncing configuration from USB with HassOS.")
-        return self.sys_host.services.restart('hassos-config.service')
+        return self.sys_host.services.restart("hassos-config.service")

-    async def update(self, version=None):
+    async def update(self, version: Optional[str] = None) -> None:
         """Update HassOS system."""
         version = version or self.version_latest
@@ -167,20 +175,19 @@ class HassOS(CoreSysAttributes):
         # Update fails
         rauc_status = await self.sys_dbus.get_properties()
-        _LOGGER.error(
-            "HassOS update fails with: %s", rauc_status.get('LastError'))
+        _LOGGER.error("HassOS update fails with: %s", rauc_status.get("LastError"))
         raise HassOSUpdateError()

-    async def update_cli(self, version=None):
+    async def update_cli(self, version: Optional[str] = None) -> None:
         """Update local HassOS cli."""
         version = version or self.version_cli_latest

         if version == self.version_cli:
             _LOGGER.warning("Version %s is already installed for CLI", version)
-            raise HassOSUpdateError()
-
-        if await self.instance.update(version):
             return

-        _LOGGER.error("HassOS CLI update fails")
-        raise HassOSUpdateError()
+        try:
+            await self.instance.update(version)
+        except DockerAPIError:
+            _LOGGER.error("HassOS CLI update fails")
+            raise HassOSUpdateError() from None
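The OTA download in `_download_raucb` streams the bundle in fixed 1 MiB chunks (the diff only adds digit separators to `1048576`). The loop shape can be sketched against an in-memory stream; `FakeContent` is a hypothetical stand-in for aiohttp's `response.content`, not part of the diff:

```python
import asyncio
from io import BytesIO

CHUNK_SIZE = 1_048_576  # 1 MiB, as in the diff


class FakeContent:
    """In-memory stand-in for aiohttp's response.content stream."""

    def __init__(self, data: bytes):
        self._buf = BytesIO(data)

    async def read(self, size: int) -> bytes:
        return self._buf.read(size)


async def download(content, out) -> None:
    # Same shape as the RAUCB loop: fixed-size reads until the stream drains,
    # so a large OTA image never has to fit in memory at once.
    while True:
        chunk = await content.read(CHUNK_SIZE)
        if not chunk:
            break
        out.write(chunk)


out = BytesIO()
asyncio.run(download(FakeContent(b"x" * 3_000_000), out))
print(len(out.getvalue()))  # 3000000
```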

View File

@@ -2,28 +2,47 @@
 import asyncio
 from contextlib import asynccontextmanager, suppress
 from datetime import datetime, timedelta
+from ipaddress import IPv4Address
 import logging
 import os
-import re
 from pathlib import Path
+import re
+import secrets
 import socket
 import time
+from typing import Any, AsyncContextManager, Awaitable, Dict, Optional
+from uuid import UUID

 import aiohttp
 from aiohttp import hdrs
 import attr

 from .const import (
-    FILE_HASSIO_HOMEASSISTANT, ATTR_IMAGE, ATTR_LAST_VERSION, ATTR_UUID,
-    ATTR_BOOT, ATTR_PASSWORD, ATTR_PORT, ATTR_SSL, ATTR_WATCHDOG,
-    ATTR_WAIT_BOOT, ATTR_REFRESH_TOKEN, ATTR_ACCESS_TOKEN,
-    HEADER_HA_ACCESS)
-from .coresys import CoreSysAttributes
+    ATTR_ACCESS_TOKEN,
+    ATTR_BOOT,
+    ATTR_IMAGE,
+    ATTR_LAST_VERSION,
+    ATTR_PASSWORD,
+    ATTR_PORT,
+    ATTR_REFRESH_TOKEN,
+    ATTR_SSL,
+    ATTR_UUID,
+    ATTR_WAIT_BOOT,
+    ATTR_WATCHDOG,
+    FILE_HASSIO_HOMEASSISTANT,
+    HEADER_HA_ACCESS,
+)
+from .coresys import CoreSys, CoreSysAttributes
 from .docker.homeassistant import DockerHomeAssistant
+from .docker.stats import DockerStats
 from .exceptions import (
-    HomeAssistantUpdateError, HomeAssistantError, HomeAssistantAPIError,
-    HomeAssistantAuthError)
-from .utils import convert_to_ascii, process_lock, create_token
+    DockerAPIError,
+    HomeAssistantAPIError,
+    HomeAssistantAuthError,
+    HomeAssistantError,
+    HomeAssistantUpdateError,
+)
+from .utils import convert_to_ascii, process_lock
 from .utils.json import JsonConfig
 from .validate import SCHEMA_HASS_CONFIG
@@ -31,118 +50,128 @@ _LOGGER = logging.getLogger(__name__)
 RE_YAML_ERROR = re.compile(r"homeassistant\.util\.yaml")

-# pylint: disable=invalid-name
-ConfigResult = attr.make_class('ConfigResult', ['valid', 'log'], frozen=True)
+@attr.s(frozen=True)
+class ConfigResult:
+    """Return object from config check."""
+
+    valid = attr.ib()
+    log = attr.ib()


 class HomeAssistant(JsonConfig, CoreSysAttributes):
     """Home Assistant core object for handle it."""

-    def __init__(self, coresys):
+    def __init__(self, coresys: CoreSys):
         """Initialize Home Assistant object."""
         super().__init__(FILE_HASSIO_HOMEASSISTANT, SCHEMA_HASS_CONFIG)
-        self.coresys = coresys
-        self.instance = DockerHomeAssistant(coresys)
-        self.lock = asyncio.Lock(loop=coresys.loop)
-        self._error_state = False
-        # We don't persist access tokens. Instead we fetch new ones when needed
-        self.access_token = None
-        self._access_token_expires = None
+        self.coresys: CoreSys = coresys
+        self.instance: DockerHomeAssistant = DockerHomeAssistant(coresys)
+        self.lock: asyncio.Lock = asyncio.Lock(loop=coresys.loop)
+        self._error_state: bool = False
+
+        # We don't persist access tokens. Instead we fetch new ones when needed
+        self.access_token: Optional[str] = None
+        self._access_token_expires: Optional[datetime] = None

-    async def load(self):
+    async def load(self) -> None:
         """Prepare Home Assistant object."""
-        if await self.instance.attach():
+        with suppress(DockerAPIError):
+            await self.instance.attach()
             return

         _LOGGER.info("No Home Assistant Docker image %s found.", self.image)
         await self.install_landingpage()

     @property
-    def machine(self):
+    def machine(self) -> str:
         """Return the system machines."""
         return self.instance.machine

     @property
-    def error_state(self):
+    def arch(self) -> str:
+        """Return arch of running Home Assistant."""
+        return self.instance.arch
+
+    @property
+    def error_state(self) -> bool:
         """Return True if system is in error."""
         return self._error_state

     @property
-    def api_ip(self):
+    def ip_address(self) -> IPv4Address:
         """Return IP of Home Assistant instance."""
-        return self.sys_docker.network.gateway
+        return self.instance.ip_address

     @property
-    def api_port(self):
+    def api_port(self) -> int:
         """Return network port to Home Assistant instance."""
         return self._data[ATTR_PORT]

     @api_port.setter
-    def api_port(self, value):
+    def api_port(self, value: int) -> None:
         """Set network port for Home Assistant instance."""
         self._data[ATTR_PORT] = value

     @property
-    def api_password(self):
+    def api_password(self) -> str:
         """Return password for Home Assistant instance."""
         return self._data.get(ATTR_PASSWORD)

     @api_password.setter
-    def api_password(self, value):
+    def api_password(self, value: str):
         """Set password for Home Assistant instance."""
         self._data[ATTR_PASSWORD] = value

     @property
-    def api_ssl(self):
+    def api_ssl(self) -> bool:
         """Return if we need ssl to Home Assistant instance."""
         return self._data[ATTR_SSL]

     @api_ssl.setter
-    def api_ssl(self, value):
+    def api_ssl(self, value: bool):
         """Set SSL for Home Assistant instance."""
         self._data[ATTR_SSL] = value

     @property
-    def api_url(self):
+    def api_url(self) -> str:
         """Return API url to Home Assistant."""
-        return "{}://{}:{}".format(
-            'https' if self.api_ssl else 'http', self.api_ip, self.api_port
-        )
+        return "{}://{}:{}".format('https' if self.api_ssl else 'http',
+                                   self.ip_address, self.api_port)

     @property
-    def watchdog(self):
+    def watchdog(self) -> bool:
         """Return True if the watchdog should protect Home Assistant."""
         return self._data[ATTR_WATCHDOG]

     @watchdog.setter
-    def watchdog(self, value):
+    def watchdog(self, value: bool):
         """Return True if the watchdog should protect Home Assistant."""
         self._data[ATTR_WATCHDOG] = value

     @property
-    def wait_boot(self):
+    def wait_boot(self) -> int:
         """Return time to wait for Home Assistant startup."""
         return self._data[ATTR_WAIT_BOOT]

     @wait_boot.setter
-    def wait_boot(self, value):
+    def wait_boot(self, value: int):
         """Set time to wait for Home Assistant startup."""
         self._data[ATTR_WAIT_BOOT] = value

     @property
-    def version(self):
+    def version(self) -> str:
         """Return version of running Home Assistant."""
         return self.instance.version

     @property
-    def last_version(self):
+    def last_version(self) -> str:
         """Return last available version of Home Assistant."""
         if self.is_custom_image:
             return self._data.get(ATTR_LAST_VERSION)
         return self.sys_updater.version_homeassistant

     @last_version.setter
-    def last_version(self, value):
+    def last_version(self, value: str):
         """Set last available version of Home Assistant."""
         if value:
             self._data[ATTR_LAST_VERSION] = value
@@ -150,14 +179,14 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
             self._data.pop(ATTR_LAST_VERSION, None)

     @property
-    def image(self):
+    def image(self) -> str:
         """Return image name of the Home Assistant container."""
         if self._data.get(ATTR_IMAGE):
             return self._data[ATTR_IMAGE]
         return os.environ['HOMEASSISTANT_REPOSITORY']

     @image.setter
-    def image(self, value):
+    def image(self, value: str):
         """Set image name of Home Assistant container."""
         if value:
             self._data[ATTR_IMAGE] = value
@@ -165,60 +194,54 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
             self._data.pop(ATTR_IMAGE, None)

     @property
-    def is_custom_image(self):
+    def is_custom_image(self) -> bool:
         """Return True if a custom image is used."""
-        return all(attr in self._data for attr in
-                   (ATTR_IMAGE, ATTR_LAST_VERSION))
+        return all(
+            attr in self._data for attr in (ATTR_IMAGE, ATTR_LAST_VERSION))

     @property
-    def boot(self):
+    def boot(self) -> bool:
         """Return True if Home Assistant boot is enabled."""
         return self._data[ATTR_BOOT]

     @boot.setter
-    def boot(self, value):
+    def boot(self, value: bool):
         """Set Home Assistant boot options."""
         self._data[ATTR_BOOT] = value

     @property
-    def uuid(self):
+    def uuid(self) -> UUID:
         """Return a UUID of this Home Assistant instance."""
         return self._data[ATTR_UUID]

     @property
-    def hassio_token(self):
-        """Return a access token for the Hass.io API."""
+    def hassio_token(self) -> str:
+        """Return an access token for the Hass.io API."""
         return self._data.get(ATTR_ACCESS_TOKEN)

     @property
-    def refresh_token(self):
+    def refresh_token(self) -> str:
         """Return the refresh token to authenticate with Home Assistant."""
         return self._data.get(ATTR_REFRESH_TOKEN)

     @refresh_token.setter
-    def refresh_token(self, value):
+    def refresh_token(self, value: str):
         """Set Home Assistant refresh_token."""
         self._data[ATTR_REFRESH_TOKEN] = value

     @process_lock
-    async def install_landingpage(self):
+    async def install_landingpage(self) -> None:
         """Install a landing page."""
         _LOGGER.info("Setup HomeAssistant landingpage")
         while True:
-            if await self.instance.install('landingpage'):
-                break
-            _LOGGER.warning("Fails install landingpage, retry after 60sec")
-            await asyncio.sleep(60)
+            with suppress(DockerAPIError):
+                await self.instance.install('landingpage')
+                return
+            _LOGGER.warning("Fails install landingpage, retry after 30sec")
+            await asyncio.sleep(30)
+
+        # Run landingpage after installation
+        _LOGGER.info("Start landing page")
+        try:
+            await self._start()
+        except HomeAssistantError:
+            _LOGGER.warning("Can't start landing page")

     @process_lock
-    async def install(self):
+    async def install(self) -> None:
         """Install a landing page."""
         _LOGGER.info("Setup Home Assistant")
         while True:
@@ -227,10 +250,12 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
                 await self.sys_updater.reload()

             tag = self.last_version
-            if tag and await self.instance.install(tag):
-                break
+            if tag:
+                with suppress(DockerAPIError):
+                    await self.instance.install(tag)
+                    break
-            _LOGGER.warning("Error on install Home Assistant. Retry in 60sec")
-            await asyncio.sleep(60)
+            _LOGGER.warning("Error on install Home Assistant. Retry in 30sec")
+            await asyncio.sleep(30)

         # finishing
         _LOGGER.info("Home Assistant docker now installed")
@@ -242,10 +267,11 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
         except HomeAssistantError:
             _LOGGER.error("Can't start Home Assistant!")
         finally:
-            await self.instance.cleanup()
+            with suppress(DockerAPIError):
+                await self.instance.cleanup()

     @process_lock
-    async def update(self, version=None):
+    async def update(self, version=None) -> None:
         """Update HomeAssistant version."""
         version = version or self.last_version
         rollback = self.version if not self.error_state else None
@@ -254,16 +280,18 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
         if exists and version == self.instance.version:
             _LOGGER.warning("Version %s is already installed", version)
-            return HomeAssistantUpdateError()
+            return

-        # process a update
+        # process an update
         async def _update(to_version):
             """Run Home Assistant update."""
-            try:
-                _LOGGER.info("Update Home Assistant to version %s", to_version)
-                if not await self.instance.update(to_version):
-                    raise HomeAssistantUpdateError()
-            finally:
+            _LOGGER.info("Update Home Assistant to version %s", to_version)
+            try:
+                await self.instance.update(to_version)
+            except DockerAPIError:
+                _LOGGER.warning("Update Home Assistant image fails")
+                raise HomeAssistantUpdateError() from None

             if running:
                 await self._start()
             _LOGGER.info("Successful run Home Assistant %s", to_version)
@@ -280,95 +308,124 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
         else:
             raise HomeAssistantUpdateError()

-    async def _start(self):
+    async def _start(self) -> None:
         """Start Home Assistant Docker & wait."""
         if await self.instance.is_running():
             _LOGGER.warning("Home Assistant is already running!")
             return

         # Create new API token
-        self._data[ATTR_ACCESS_TOKEN] = create_token()
+        self._data[ATTR_ACCESS_TOKEN] = secrets.token_hex(56)
         self.save_data()

-        if not await self.instance.run():
-            raise HomeAssistantError()
+        try:
+            await self.instance.run()
+        except DockerAPIError:
+            raise HomeAssistantError() from None
         await self._block_till_run()

     @process_lock
-    def start(self):
-        """Run Home Assistant docker.
-
-        Return a coroutine.
-        """
-        return self._start()
+    async def start(self) -> None:
+        """Run Home Assistant docker."""
+        try:
+            if await self.instance.is_running():
+                await self.instance.restart()
+            elif await self.instance.is_initialize():
+                await self.instance.start()
+            else:
+                await self._start()
+                return
+
+            await self._block_till_run()
+        except DockerAPIError:
+            raise HomeAssistantError() from None

     @process_lock
-    def stop(self):
+    async def stop(self) -> None:
         """Stop Home Assistant Docker.

         Return a coroutine.
         """
-        return self.instance.stop()
+        try:
+            return await self.instance.stop(remove_container=False)
+        except DockerAPIError:
+            raise HomeAssistantError() from None

     @process_lock
-    async def restart(self):
+    async def restart(self) -> None:
         """Restart Home Assistant Docker."""
-        await self.instance.stop()
+        try:
+            await self.instance.restart()
+        except DockerAPIError:
+            raise HomeAssistantError() from None
+
+        await self._block_till_run()
+
+    @process_lock
+    async def rebuild(self) -> None:
+        """Rebuild Home Assistant Docker container."""
+        with suppress(DockerAPIError):
+            await self.instance.stop()
         await self._start()

-    def logs(self):
+    def logs(self) -> Awaitable[bytes]:
         """Get HomeAssistant docker logs.

         Return a coroutine.
         """
         return self.instance.logs()

-    def stats(self):
+    async def stats(self) -> DockerStats:
         """Return stats of Home Assistant.

         Return a coroutine.
         """
-        return self.instance.stats()
+        try:
+            return await self.instance.stats()
+        except DockerAPIError:
+            raise HomeAssistantError() from None

-    def is_running(self):
+    def is_running(self) -> Awaitable[bool]:
         """Return True if Docker container is running.

         Return a coroutine.
         """
         return self.instance.is_running()

-    def is_initialize(self):
-        """Return True if a Docker container is exists.
+    def is_fails(self) -> Awaitable[bool]:
+        """Return True if a Docker container is fails state.

         Return a coroutine.
         """
-        return self.instance.is_initialize()
+        return self.instance.is_fails()

     @property
-    def in_progress(self):
+    def in_progress(self) -> bool:
         """Return True if a task is in progress."""
         return self.instance.in_progress or self.lock.locked()

-    async def check_config(self):
+    async def check_config(self) -> ConfigResult:
         """Run Home Assistant config check."""
         result = await self.instance.execute_command(
-            "python3 -m homeassistant -c /config --script check_config"
-        )
+            "python3 -m homeassistant -c /config --script check_config")

         # if not valid
         if result.exit_code is None:
_LOGGER.error("Fatal error on config check!")
raise HomeAssistantError() raise HomeAssistantError()
# parse output # parse output
log = convert_to_ascii(result.output) log = convert_to_ascii(result.output)
if result.exit_code != 0 or RE_YAML_ERROR.search(log): if result.exit_code != 0 or RE_YAML_ERROR.search(log):
_LOGGER.error("Invalid Home Assistant config found!")
return ConfigResult(False, log) return ConfigResult(False, log)
_LOGGER.info("Home Assistant config is valid")
return ConfigResult(True, log) return ConfigResult(True, log)
async def ensure_access_token(self): async def ensure_access_token(self) -> None:
"""Ensures there is an access token.""" """Ensures there is an access token."""
if (self.access_token is not None and if self.access_token is not None and self._access_token_expires > datetime.utcnow():
self._access_token_expires > datetime.utcnow()):
return return
with suppress(asyncio.TimeoutError, aiohttp.ClientError): with suppress(asyncio.TimeoutError, aiohttp.ClientError):
@@ -378,8 +435,7 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
data={ data={
"grant_type": "refresh_token", "grant_type": "refresh_token",
"refresh_token": self.refresh_token "refresh_token": self.refresh_token
} }) as resp:
) as resp:
if resp.status != 200: if resp.status != 200:
_LOGGER.error("Can't update Home Assistant access token!") _LOGGER.error("Can't update Home Assistant access token!")
raise HomeAssistantAuthError() raise HomeAssistantAuthError()
@@ -391,8 +447,13 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
datetime.utcnow() + timedelta(seconds=tokens['expires_in']) datetime.utcnow() + timedelta(seconds=tokens['expires_in'])
@asynccontextmanager @asynccontextmanager
async def make_request(self, method, path, json=None, content_type=None, async def make_request(self,
data=None, timeout=30): method: str,
path: str,
json: Optional[Dict[str, Any]] = None,
content_type: Optional[str] = None,
data: Optional[bytes] = None,
timeout=30) -> AsyncContextManager[aiohttp.ClientResponse]:
"""Async context manager to make a request with right auth.""" """Async context manager to make a request with right auth."""
url = f"{self.api_url}/{path}" url = f"{self.api_url}/{path}"
headers = {} headers = {}
@@ -414,8 +475,7 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
try: try:
async with getattr(self.sys_websession_ssl, method)( async with getattr(self.sys_websession_ssl, method)(
url, data=data, timeout=timeout, json=json, url, data=data, timeout=timeout, json=json,
headers=headers headers=headers) as resp:
) as resp:
# Access token expired # Access token expired
if resp.status == 401 and self.refresh_token: if resp.status == 401 and self.refresh_token:
self.access_token = None self.access_token = None
@@ -428,29 +488,29 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
raise HomeAssistantAPIError() raise HomeAssistantAPIError()
async def check_api_state(self): async def check_api_state(self) -> bool:
"""Return True if Home Assistant up and running.""" """Return True if Home Assistant up and running."""
with suppress(HomeAssistantAPIError): with suppress(HomeAssistantAPIError):
async with self.make_request('get', 'api/') as resp: async with self.make_request('get', 'api/') as resp:
if resp.status in (200, 201): if resp.status in (200, 201):
return True return True
err = resp.status status = resp.status
_LOGGER.warning("Home Assistant API config mismatch: %s", status)
_LOGGER.warning("Home Assistant API config mismatch: %d", err)
return False return False
async def _block_till_run(self): async def _block_till_run(self) -> None:
"""Block until Home-Assistant is booting up or startup timeout.""" """Block until Home-Assistant is booting up or startup timeout."""
start_time = time.monotonic() start_time = time.monotonic()
migration_progress = False migration_progress = False
migration_file = Path( migration_file = Path(self.sys_config.path_homeassistant,
self.sys_config.path_homeassistant, '.migration_progress') '.migration_progress')
def check_port(): def check_port():
"""Check if port is mapped.""" """Check if port is mapped."""
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try: try:
result = sock.connect_ex((str(self.api_ip), self.api_port)) result = sock.connect_ex((str(self.ip_address), self.api_port))
sock.close() sock.close()
# Check if the port is available # Check if the port is available
@@ -461,23 +521,20 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
return False return False
while True: while True:
await asyncio.sleep(10) await asyncio.sleep(5)
# 1 # 1: Check if Container is is_running
# Check if Container is is_running
if not await self.instance.is_running(): if not await self.instance.is_running():
_LOGGER.error("Home Assistant has crashed!") _LOGGER.error("Home Assistant has crashed!")
break break
# 2 # 2: Check if API response
# Check if API response
if await self.sys_run_in_executor(check_port): if await self.sys_run_in_executor(check_port):
_LOGGER.info("Detect a running Home Assistant instance") _LOGGER.info("Detect a running Home Assistant instance")
self._error_state = False self._error_state = False
return return
# 3 # 3: Running DB Migration
# Running DB Migration
if migration_file.exists(): if migration_file.exists():
if not migration_progress: if not migration_progress:
migration_progress = True migration_progress = True
@@ -488,11 +545,9 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
start_time = time.monotonic() start_time = time.monotonic()
_LOGGER.info("Home Assistant record migration done") _LOGGER.info("Home Assistant record migration done")
# 4 # 4: Timeout
# Timeout
if time.monotonic() - start_time > self.wait_boot: if time.monotonic() - start_time > self.wait_boot:
_LOGGER.warning( _LOGGER.warning("Don't wait anymore of Home Assistant startup!")
"Don't wait anymore of Home Assistant startup!")
break break
self._error_state = True self._error_state = True
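The `check_port()` helper above polls readiness with `socket.connect_ex()`, which returns 0 on success and an errno instead of raising. A minimal standalone sketch of that technique (host and port are illustrative parameters, not the supervisor's attributes):

```python
import socket


def check_port(host: str, port: int) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(0.5)
    try:
        # connect_ex returns 0 when the port accepts a connection
        result = sock.connect_ex((host, port))
    except OSError:
        return False
    finally:
        sock.close()
    return result == 0
```

Because nothing raises on a closed port, the supervisor can call this repeatedly from an executor without exception handling on every iteration.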


@@ -6,7 +6,8 @@ from string import Template
import attr

from ..const import (
    ATTR_INPUT, ATTR_OUTPUT, ATTR_DEVICES, ATTR_NAME, CHAN_ID, CHAN_TYPE)
from ..coresys import CoreSysAttributes

_LOGGER = logging.getLogger(__name__)

@@ -58,7 +59,9 @@ class AlsaAudio(CoreSysAttributes):
        # Process devices
        for dev_id, dev_data in self.sys_hardware.audio_devices.items():
            for chan_info in dev_data[ATTR_DEVICES]:
                chan_id = chan_info[CHAN_ID]
                chan_type = chan_info[CHAN_TYPE]
                alsa_id = f"{dev_id},{chan_id}"
                dev_name = dev_data[ATTR_NAME]

hassio/ingress.py Normal file
@@ -0,0 +1,103 @@
"""Fetch last versions from webserver."""
from datetime import timedelta
import logging
from typing import Dict, Optional
import secrets

from .addons.addon import Addon
from .const import ATTR_SESSION, FILE_HASSIO_INGRESS
from .coresys import CoreSys, CoreSysAttributes
from .utils.json import JsonConfig
from .utils.dt import utcnow, utc_from_timestamp
from .validate import SCHEMA_INGRESS_CONFIG

_LOGGER = logging.getLogger(__name__)


class Ingress(JsonConfig, CoreSysAttributes):
    """Fetch last versions from version.json."""

    def __init__(self, coresys: CoreSys):
        """Initialize updater."""
        super().__init__(FILE_HASSIO_INGRESS, SCHEMA_INGRESS_CONFIG)
        self.coresys: CoreSys = coresys
        self.tokens: Dict[str, str] = {}

    def get(self, token: str) -> Optional[Addon]:
        """Return addon they have this ingress token."""
        if token not in self.tokens:
            self._update_token_list()
        return self.sys_addons.get(self.tokens.get(token))

    @property
    def sessions(self) -> Dict[str, float]:
        """Return sessions."""
        return self._data[ATTR_SESSION]

    async def load(self) -> None:
        """Update internal data."""
        self._update_token_list()
        self._cleanup_sessions()

        _LOGGER.info("Load %d ingress session", len(self.sessions))

    async def reload(self) -> None:
        """Reload/Validate sessions."""
        self._cleanup_sessions()

    async def unload(self) -> None:
        """Shutdown sessions."""
        self.save_data()

    def _cleanup_sessions(self) -> None:
        """Remove not used sessions."""
        now = utcnow()

        sessions = {}
        for session, valid in self.sessions.items():
            valid_dt = utc_from_timestamp(valid)
            if valid_dt < now:
                continue

            # Is valid
            sessions[session] = valid

        # Write back
        self.sessions.clear()
        self.sessions.update(sessions)

    def _update_token_list(self) -> None:
        """Regenerate token <-> Add-on map."""
        self.tokens.clear()

        # Read all ingress token and build a map
        for addon in self.sys_addons.list_installed:
            if not addon.with_ingress:
                continue
            self.tokens[addon.ingress_token] = addon.slug

    def create_session(self) -> str:
        """Create new session."""
        session = secrets.token_hex(64)
        valid = utcnow() + timedelta(minutes=15)

        self.sessions[session] = valid.timestamp()
        self.save_data()

        return session

    def validate_session(self, session: str) -> bool:
        """Return True if session valid and make it longer valid."""
        if session not in self.sessions:
            return False
        valid_until = utc_from_timestamp(self.sessions[session])

        # Is still valid?
        if valid_until < utcnow():
            return False

        # Update time
        valid_until = valid_until + timedelta(minutes=15)
        self.sessions[session] = valid_until.timestamp()

        return True
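The session handling in `Ingress` above can be sketched standalone: a cryptographically random token is stored with a 15-minute validity timestamp, and each successful validation slides the expiry forward. The module-level dict below stands in for the supervisor's JSON-backed storage; names are illustrative.

```python
from datetime import datetime, timedelta, timezone
import secrets

SESSIONS: dict = {}  # token -> expiry as POSIX timestamp


def create_session() -> str:
    """Create a new session token valid for 15 minutes."""
    session = secrets.token_hex(64)
    valid = datetime.now(timezone.utc) + timedelta(minutes=15)
    SESSIONS[session] = valid.timestamp()
    return session


def validate_session(session: str) -> bool:
    """Return True if session is valid; extend validity on success."""
    if session not in SESSIONS:
        return False
    valid_until = datetime.fromtimestamp(SESSIONS[session], timezone.utc)
    if valid_until < datetime.now(timezone.utc):
        return False
    # Sliding expiry: each use buys another 15 minutes
    SESSIONS[session] = (valid_until + timedelta(minutes=15)).timestamp()
    return True
```

Storing timestamps rather than datetime objects keeps the session map directly JSON-serializable, which matches the `save_data()` persistence used above.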


@@ -13,9 +13,8 @@ COMMAND = "socat UDP-RECVFROM:53,fork UDP-SENDTO:127.0.0.11:53"
class DNSForward:
    """Manage DNS forwarding to internal DNS."""

    def __init__(self):
        """Initialize DNS forwarding."""
        self.proc = None

    async def start(self):
@@ -25,9 +24,7 @@ class DNSForward:
                *shlex.split(COMMAND),
                stdin=asyncio.subprocess.DEVNULL,
                stdout=asyncio.subprocess.DEVNULL,
                stderr=asyncio.subprocess.DEVNULL)
        except OSError as err:
            _LOGGER.error("Can't start DNS forwarding: %s", err)
        else:


@@ -6,7 +6,7 @@ import re
import pyudev

from ..const import ATTR_NAME, ATTR_TYPE, ATTR_DEVICES, CHAN_ID, CHAN_TYPE

_LOGGER = logging.getLogger(__name__)

@@ -20,6 +20,7 @@ PROC_STAT = Path("/proc/stat")
RE_BOOT_TIME = re.compile(r"btime (\d+)")

GPIO_DEVICES = Path("/sys/class/gpio")
SOC_DEVICES = Path("/sys/devices/platform/soc")
RE_TTY = re.compile(r"tty[A-Z]+")

@@ -60,6 +61,11 @@ class Hardware:
        return dev_list

    @property
    def support_audio(self):
        """Return True if the system have audio support."""
        return bool(self.audio_devices)

    @property
    def audio_devices(self):
        """Return all available audio interfaces."""
@@ -68,10 +74,8 @@ class Hardware:
            return {}

        try:
            cards = ASOUND_CARDS.read_text()
            devices = ASOUND_DEVICES.read_text()
        except OSError as err:
            _LOGGER.error("Can't read asound data: %s", err)
            return {}

@@ -83,20 +87,27 @@ class Hardware:
            audio_list[match.group(1)] = {
                ATTR_NAME: match.group(3),
                ATTR_TYPE: match.group(2),
                ATTR_DEVICES: [],
            }

        # parse devices
        for match in RE_DEVICES.finditer(devices):
            try:
                audio_list[match.group(1)][ATTR_DEVICES].append({
                    CHAN_ID: match.group(2),
                    CHAN_TYPE: match.group(3)
                })
            except KeyError:
                _LOGGER.warning("Wrong audio device found %s", match.group(0))
                continue

        return audio_list

    @property
    def support_gpio(self):
        """Return True if device support GPIOs."""
        return SOC_DEVICES.exists() and GPIO_DEVICES.exists()

    @property
    def gpio_devices(self):
        """Return list of GPIO interface on device."""


@@ -1,6 +1,7 @@
"""Schedule for Hass.io."""
import asyncio
from datetime import date, datetime, time, timedelta
import logging

_LOGGER = logging.getLogger(__name__)

@@ -13,9 +14,9 @@ TASK = 'task'
class Scheduler:
    """Schedule task inside Hass.io."""

    def __init__(self):
        """Initialize task schedule."""
        self.loop = asyncio.get_running_loop()
        self._data = {}
        self.suspend = False

@@ -57,8 +58,8 @@ class Scheduler:
            job = self.loop.call_later(interval, self._run_task, task_id)
        elif isinstance(interval, time):
            today = datetime.combine(date.today(), interval)
            tomorrow = datetime.combine(date.today() + timedelta(days=1),
                                        interval)

            # Check if we run it today or next day
            if today > datetime.today():
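The scheduler hunk above picks between today's and tomorrow's occurrence of a daily time-of-day trigger. A standalone sketch of that choice; unlike the diff, this version takes `now` as a parameter (an assumption for testability) rather than calling `datetime.today()` internally:

```python
from datetime import date, datetime, time, timedelta


def next_run(interval: time, now: datetime) -> datetime:
    """Return the next datetime at which a daily task should fire."""
    today = datetime.combine(now.date(), interval)
    tomorrow = datetime.combine(now.date() + timedelta(days=1), interval)

    # Run later today if the trigger time has not passed yet,
    # otherwise schedule it for the same time tomorrow.
    return today if today > now else tomorrow
```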


@@ -1,38 +1,38 @@
"""Handle internal services discovery."""
from typing import Dict, List

from ..coresys import CoreSys, CoreSysAttributes
from .const import SERVICE_MQTT
from .data import ServicesData
from .interface import ServiceInterface
from .modules.mqtt import MQTTService

AVAILABLE_SERVICES = {SERVICE_MQTT: MQTTService}


class ServiceManager(CoreSysAttributes):
    """Handle internal services discovery."""

    def __init__(self, coresys: CoreSys):
        """Initialize Services handler."""
        self.coresys: CoreSys = coresys
        self.data: ServicesData = ServicesData()
        self.services_obj: Dict[str, ServiceInterface] = {}

    @property
    def list_services(self) -> List[ServiceInterface]:
        """Return a list of services."""
        return list(self.services_obj.values())

    def get(self, slug: str) -> ServiceInterface:
        """Return service object from slug."""
        return self.services_obj.get(slug)

    async def load(self) -> None:
        """Load available services."""
        for slug, service in AVAILABLE_SERVICES.items():
            self.services_obj[slug] = service(self.coresys)

    def reset(self) -> None:
        """Reset available data."""
        self.data.reset_data()

hassio/services/const.py Normal file
@@ -0,0 +1,11 @@
"""Service API static data."""
ATTR_ADDON = "addon"
ATTR_HOST = "host"
ATTR_PASSWORD = "password"
ATTR_PORT = "port"
ATTR_PROTOCOL = "protocol"
ATTR_SSL = "ssl"
ATTR_USERNAME = "username"
SERVICE_MQTT = "mqtt"


@@ -1,8 +1,10 @@
"""Handle service data for persistent supervisor reboot."""
from typing import Any, Dict

from ..const import FILE_HASSIO_SERVICES
from ..utils.json import JsonConfig
from .const import SERVICE_MQTT
from .validate import SCHEMA_SERVICES_CONFIG


class ServicesData(JsonConfig):
@@ -13,6 +15,6 @@ class ServicesData(JsonConfig):
        super().__init__(FILE_HASSIO_SERVICES, SCHEMA_SERVICES_CONFIG)

    @property
    def mqtt(self) -> Dict[str, Any]:
        """Return settings for MQTT service."""
        return self._data[SERVICE_MQTT]


@@ -1,33 +1,37 @@
"""Interface for single service."""
from typing import Any, Dict, List, Optional

import voluptuous as vol

from ..addons.addon import Addon
from ..const import PROVIDE_SERVICE
from ..coresys import CoreSys, CoreSysAttributes


class ServiceInterface(CoreSysAttributes):
    """Interface class for service integration."""

    def __init__(self, coresys: CoreSys):
        """Initialize service interface."""
        self.coresys: CoreSys = coresys

    @property
    def slug(self) -> str:
        """Return slug of this service."""
        raise NotImplementedError()

    @property
    def _data(self) -> Dict[str, Any]:
        """Return data of this service."""
        raise NotImplementedError()

    @property
    def schema(self) -> vol.Schema:
        """Return data schema of this service."""
        raise NotImplementedError()

    @property
    def providers(self) -> List[str]:
        """Return name of service providers addon."""
        addons = []
        for addon in self.sys_addons.list_installed:
@@ -36,24 +40,24 @@ class ServiceInterface(CoreSysAttributes):
        return addons

    @property
    def enabled(self) -> bool:
        """Return True if the service is in use."""
        return bool(self._data)

    def save(self) -> None:
        """Save changes."""
        self.sys_services.data.save_data()

    def get_service_data(self) -> Optional[Dict[str, Any]]:
        """Return the requested service data."""
        if self.enabled:
            return self._data
        return None

    def set_service_data(self, addon: Addon, data: Dict[str, Any]) -> None:
        """Write the data into service object."""
        raise NotImplementedError()

    def del_service_data(self, addon: Addon) -> None:
        """Remove the data from service object."""
        raise NotImplementedError()


@@ -0,0 +1 @@
"""Services modules."""


@@ -0,0 +1,81 @@
"""Provide the MQTT Service."""
import logging
from typing import Any, Dict

from hassio.addons.addon import Addon
from hassio.exceptions import ServicesError
from hassio.validate import NETWORK_PORT
import voluptuous as vol

from ..const import (
    ATTR_ADDON,
    ATTR_HOST,
    ATTR_PASSWORD,
    ATTR_PORT,
    ATTR_PROTOCOL,
    ATTR_SSL,
    ATTR_USERNAME,
    SERVICE_MQTT,
)
from ..interface import ServiceInterface

_LOGGER = logging.getLogger(__name__)

# pylint: disable=no-value-for-parameter
SCHEMA_SERVICE_MQTT = vol.Schema(
    {
        vol.Required(ATTR_HOST): vol.Coerce(str),
        vol.Required(ATTR_PORT): NETWORK_PORT,
        vol.Optional(ATTR_USERNAME): vol.Coerce(str),
        vol.Optional(ATTR_PASSWORD): vol.Coerce(str),
        vol.Optional(ATTR_SSL, default=False): vol.Boolean(),
        vol.Optional(ATTR_PROTOCOL, default="3.1.1"): vol.All(
            vol.Coerce(str), vol.In(["3.1", "3.1.1"])
        ),
    }
)

SCHEMA_CONFIG_MQTT = SCHEMA_SERVICE_MQTT.extend(
    {vol.Required(ATTR_ADDON): vol.Coerce(str)}
)


class MQTTService(ServiceInterface):
    """Provide MQTT services."""

    @property
    def slug(self) -> str:
        """Return slug of this service."""
        return SERVICE_MQTT

    @property
    def _data(self) -> Dict[str, Any]:
        """Return data of this service."""
        return self.sys_services.data.mqtt

    @property
    def schema(self) -> vol.Schema:
        """Return data schema of this service."""
        return SCHEMA_SERVICE_MQTT

    def set_service_data(self, addon: Addon, data: Dict[str, Any]) -> None:
        """Write the data into service object."""
        if self.enabled:
            _LOGGER.error(
                "It is already a MQTT in use from %s", self._data[ATTR_ADDON])
            raise ServicesError()

        self._data.update(data)
        self._data[ATTR_ADDON] = addon.slug

        _LOGGER.info("Set %s as service provider for mqtt", addon.slug)
        self.save()

    def del_service_data(self, addon: Addon) -> None:
        """Remove the data from service object."""
        if not self.enabled:
            _LOGGER.warning("Can't remove not exists services")
            raise ServicesError()

        self._data.clear()
        self.save()

Some files were not shown because too many files have changed in this diff.