Compare commits

Comparing 121...150

191 commits

Author SHA1 Message Date
Pascal Vizeli
dfdcddfd0b Merge pull request #968 from home-assistant/dev
Release 150
2019-03-20 22:08:17 +01:00
Pascal Vizeli
0391277bad Fix panel for 0.90.0 (#967) 2019-03-20 22:03:31 +01:00
Pascal Vizeli
73643b9bfe Bump version 150 2019-03-19 21:29:47 +01:00
Pascal Vizeli
93a52b8382 Merge pull request #965 from home-assistant/dev
Release 149
2019-03-19 21:26:38 +01:00
Pascal Vizeli
7a91bb1f6c Update panel for 0.90.0 v6 (#963) 2019-03-19 19:01:52 +01:00
Pascal Vizeli
26efa998a1 Revert dev link (#956) 2019-03-18 09:42:31 +01:00
Pascal Vizeli
fc9f3fee0a Fix 2019-03-18 09:20:48 +01:00
cadwal
ec19bd570b Include serial device node links in container device mapping to allow for persistent names in the HA serial config (#944) 2019-03-18 09:05:04 +01:00
David McNett
3335bad9e1 Correct typo: 'ignore' -> 'ignored' (#947) 2019-03-18 09:02:29 +01:00
Pascal Vizeli
71ae334e24 Update pylint (#945) 2019-03-11 14:03:28 +01:00
Pascal Vizeli
0807651fbd Bump version 149 2019-03-08 11:59:41 +01:00
Pascal Vizeli
7026d42d77 Merge pull request #942 from home-assistant/dev
Release 148
2019-03-08 11:41:58 +01:00
Pascal Vizeli
31047b9ec2 Down or upgrade exists image on restore (#941) 2019-03-08 11:36:36 +01:00
Pascal Vizeli
714791de8f Bump version 148 2019-03-07 21:12:54 +01:00
Pascal Vizeli
c544fff2b2 Merge pull request #939 from home-assistant/dev
Release 147
2019-03-07 21:12:20 +01:00
Pascal Vizeli
fc45670686 Fix bug with update (#938) 2019-03-07 21:09:43 +01:00
Pascal Vizeli
5cefa0a2ee Bump version 147 2019-03-07 16:28:39 +01:00
Pascal Vizeli
a1910d4135 Merge pull request #937 from home-assistant/dev
Release 146
2019-03-07 16:28:09 +01:00
Pascal Vizeli
f1fecdde3a Enable Armv7 for Add-ons (#936)
* Enable Armv7 for Add-ons

* Cleanups

* fix tests
2019-03-07 16:00:41 +01:00
Pascal Vizeli
9ba4ea7d18 Check json files too 2019-03-07 10:03:07 +01:00
Pascal Vizeli
58a455d639 Fix lint 2019-03-04 10:09:34 +01:00
Pascal Vizeli
3ea85f6a28 Delete .travis.yml 2019-03-04 10:04:19 +01:00
Pascal Vizeli
4e1469ada4 Replace travis 2019-03-04 10:03:54 +01:00
Curtis Gibby
5778f78f28 Fix misspelling on "environment" (#934) 2019-03-04 10:00:41 +01:00
Pascal Vizeli
227125cc0b Change json error handling (#930)
* Change json error handling

* Typing + modern way to read file

* fix lint
2019-02-26 00:19:05 +01:00
Pascal Vizeli
b36e178c45 Bump version to 146 2019-02-21 17:24:01 +01:00
Pascal Vizeli
32c9198fb2 Merge pull request #929 from home-assistant/dev
Release 145
2019-02-21 17:21:43 +01:00
Pascal Vizeli
6983dcc267 Fix image arch version on restore/update (#928) 2019-02-21 16:40:49 +01:00
Pascal Vizeli
813fcc41f0 Bump version 145 2019-02-20 17:04:41 +01:00
Pascal Vizeli
f4e9dd0f1c Merge pull request #927 from home-assistant/dev
Release 144
2019-02-20 17:04:15 +01:00
Pascal Vizeli
7f074142bf Replace pycrpytodome with cryptocraphy (#923)
* Replace pycrpytodome with cryptocraphy

* Fix typing

* fix typing

* Fix lints

* Fix build

* Add musl libc

* Fix lint

* fix lint

* Fix algo

* Add more typing fix crypto imports v2

* Fix padding
2019-02-20 10:30:22 +01:00
Pascal Vizeli
b6df37628d Merge pull request #924 from home-assistant/feat-wait-time
Increase wait time for home assistant startup
2019-02-18 16:24:21 +01:00
Pascal Vizeli
7867eded50 Increase wait time for home assistant startup 2019-02-18 09:51:21 +01:00
Pascal Vizeli
311abb8a90 Bump version 144 2019-02-02 11:48:29 +01:00
Pascal Vizeli
21303f4b05 Merge pull request #913 from home-assistant/dev
Release 143
2019-02-02 11:47:13 +01:00
Pascal Vizeli
da3270af67 Fix that need_build work like image (#912) 2019-01-31 22:08:10 +01:00
Pascal Vizeli
35aae69f23 Support armv7 and allow support of multible arch types per CPU (#892)
* Support armv7 and first abstraction

* Change layout

* Add more type hints

* Fix imports

* Update

* move forward

* add tests

* fix type

* fix lint & tests

* fix tests

* Fix unittests

* Fix create folder

* cleanup

* Fix import order

* cleanup loop parameter

* cleanup init function

* Allow changeable image name

* fix setup

* Fix load of arch

* Fix lint

* Add typing

* fix init

* fix hassos cli problem & stick on supervisor arch

* address comments

* cleanup

* Fix image selfheal

* Add comment

* update uvloop

* remove uvloop

* fix tagging

* Fix install name

* Fix validate build config

* Abstract image_name from system cache
2019-01-31 18:47:44 +01:00
Franck Nijhof
118a2e1951 Revert "Delete move.yml" (#901)
This reverts commit 07c4058a8c.
2019-01-22 12:19:38 +01:00
Pascal Vizeli
9053341581 Fix wrong UTF-8 config files (#895)
* Fix wrong UTF-8 config files

* Fix lint

* Update data.py
2019-01-18 18:57:54 +01:00
Pascal Vizeli
27532a8a00 Update aioHttp 3.5.4 (#894) 2019-01-17 21:40:52 +01:00
Pascal Vizeli
7fdfa630b5 Bump version 143 2019-01-15 12:11:56 +01:00
Pascal Vizeli
3974d5859f Merge pull request #890 from home-assistant/dev
Release 142
2019-01-15 12:10:58 +01:00
Pascal Vizeli
aa1c765c4b Add support for SYS_MODULE (#889)
* Add support for SYS_MODULE

* Update flake stuff

* Fix lint

* Fix lint

* Fix lint

* Fix lint
2019-01-15 00:56:07 +01:00
Pascal Vizeli
e78385e7ea Support to map kernel modules ro into container (#888) 2019-01-14 23:20:30 +01:00
Pascal Vizeli
9d59b56c94 Fix lint 2019-01-14 23:20:07 +01:00
Pascal Vizeli
9d72dcabfc Support to map kernel modules ro into container 2019-01-14 21:57:14 +01:00
Pascal Vizeli
a0b5d0b67e Fix error on first run because the landing page already run (#886)
* Fix error on first run because the landing page already run

* Update homeassistant.py
2019-01-14 21:25:17 +01:00
Pascal Vizeli
2b5520405f Fix log info about update on dev (#885) 2019-01-14 20:05:03 +01:00
Pascal Vizeli
ca376b3fcd Update docker-py to 3.7.0 (#882)
* Update docker-py to 3.7.0

* Update __init__.py

* Update addon.py
2019-01-14 20:04:27 +01:00
Pascal Vizeli
11e3c0c547 Update aioHttp to 3.5.2 (#881) 2019-01-13 12:22:01 +01:00
Pascal Vizeli
9da136e037 Fix API descriptions 2019-01-02 23:31:35 +01:00
Pascal Vizeli
9b3e59d876 Merge pull request #861 from casperklein/patch-1
Duplicate entry removed.
2018-12-20 16:18:29 +01:00
Casper
7a592795b5 Duplicate entry removed. 2018-12-20 13:45:04 +01:00
Pascal Vizeli
5b92137699 Bump version 142 2018-12-11 23:46:01 +01:00
Pascal Vizeli
7520cdfeb4 Merge pull request #853 from home-assistant/dev
Release 141
2018-12-11 23:45:29 +01:00
Pascal Vizeli
0ada791e3a Update Panel for Home Assistant 0.84.0 (#852) 2018-12-11 20:54:30 +01:00
Pascal Vizeli
73afced4dc Bugfix stack trace on remove (#842) 2018-11-30 00:09:33 +01:00
Pascal Vizeli
633a2e93bf Create ISSUE_TEMPLATE.md 2018-11-22 14:53:49 +01:00
Pascal Vizeli
07c4058a8c Delete move.yml 2018-11-22 14:46:58 +01:00
Alastair D'Silva
b6f3938b14 Add support for the Orange Pi Prime (#829)
Signed-off-by: Alastair D'Silva <alastair@d-silva.org>
2018-11-21 17:03:25 +01:00
Pascal Vizeli
57534fac96 Bump version 141 2018-11-20 17:39:39 +01:00
Pascal Vizeli
4a03e72983 Merge pull request #827 from home-assistant/dev
Release 140
2018-11-20 17:39:12 +01:00
Pascal Vizeli
ddb29ea9b1 Speedup build 2018-11-20 17:17:04 +01:00
Pascal Vizeli
95179c30f7 Update Panel with new security functions (#826) 2018-11-20 17:13:55 +01:00
Pascal Vizeli
f49970ce2c Update .gitmodules 2018-11-20 12:25:45 +01:00
Pascal Vizeli
790818d1aa Update README.md 2018-11-20 10:56:19 +01:00
Pascal Vizeli
62f675e613 Fix documentation 2018-11-19 22:37:46 +01:00
Pascal Vizeli
f33434fb01 Downgrade discovery duplicate logging (#824) 2018-11-19 21:05:51 +01:00
Pascal Vizeli
254d6aee32 Small code cleanups (#822)
* Small code cleanups

* Update homeassistant.py
2018-11-19 16:44:21 +01:00
Pascal Vizeli
a5ecd597ed Add tests for add-ons map (#821) 2018-11-19 16:43:24 +01:00
Pascal Vizeli
0fab3e940a Merge pull request #820 from home-assistant/master
Master
2018-11-19 14:52:45 +01:00
Pascal Vizeli
60fbebc16b Rating add-on better they implement hass auth (#819)
* Rating add-on better they implement hass auth

* Update utils.py
2018-11-19 14:51:03 +01:00
Christian
ec366d8112 Provide options for legacy add-ons (#814)
* Provide options for legacy add-ons

* Remove whitespace from blank line

* Only provide primitive data types as Docker environment variable

* Fix linting issues

* Update addon.py
2018-11-19 12:05:12 +01:00
Christian
b8818788c9 Bugfix Add-on validate correct image url (#810)
* Bugfix Add-on validate correct image path

* Add tests for different add-on image urls
2018-11-18 19:29:23 +01:00
Pascal Vizeli
e23f6f6998 Update uvloop to version 0.11.3 (#818) 2018-11-18 12:08:59 +01:00
Pascal Vizeli
05b58d76b9 Add tests for hass.io (#817)
* Add tests for hass.io

* Fix folder

* Fix test command
2018-11-18 12:08:46 +01:00
Pascal Vizeli
644d13e3fa Bugfix Add-on validate on RO (#803) 2018-11-09 23:53:41 +01:00
Pascal Vizeli
9de71472d4 Remove links they are not needed 2018-11-09 10:26:01 +01:00
Pascal Vizeli
bf28227b91 Add developer guide 2018-11-09 10:25:29 +01:00
Pascal Vizeli
4c1ee49068 Bump version 140 2018-11-05 16:20:01 +01:00
Pascal Vizeli
6e7cf5e4c9 Merge pull request #796 from home-assistant/dev
Release 139
2018-11-05 16:19:17 +01:00
Pascal Vizeli
11f8c97347 Fix discovery update (#795)
* Update discovery.py

* Update discovery.py

* Update discovery.py

* Update discovery.py

* Update discovery.py

* Update discovery.py

* Update discovery.py
2018-11-05 14:59:57 +01:00
Pascal Vizeli
a1461fd518 Update requirements.txt 2018-11-05 13:53:16 +01:00
Pascal Vizeli
fa5c2e37d3 Discovery default config (#793) 2018-11-05 07:45:28 +01:00
luca-simonetti
1f091b20ad fix: use a different convention to handle multiple devices on same card (#767)
* fix: use a different convention to handle multiple devices on same card

* fix: use a different convention to handle multiple devices on same card

* Update alsa.py

* Update alsa.py
2018-11-02 10:47:25 +01:00
Pascal Vizeli
d3b4a03851 Catch exception on watchdog for pretty log (#778)
* Catch exception on watchdog for pretty log

* Update tasks.py
2018-10-29 16:40:19 +01:00
Jorim Tielemans
fb12fee59b Expand add-on installation error message (#783)
* Expand error message

Since an add-on is only available for certain machine and architecture combination we should log both.

* Update addon.py
2018-10-27 15:24:56 +02:00
Pascal Vizeli
7a87d2334a flake8 update to 3.6.0 (#777)
* flake8 update to 3.6.0

* fix lint
2018-10-27 15:23:26 +02:00
Pascal Vizeli
9591e71138 Update auth.py (#771) 2018-10-24 14:02:16 +02:00
Ville Skyttä
cecad526a2 Grammar and spelling fixes (#772) 2018-10-24 14:01:28 +02:00
Pascal Vizeli
53dab4ee45 Bump version 139 2018-10-16 12:52:19 +02:00
Pascal Vizeli
8abbba46c7 Merge pull request #766 from home-assistant/dev
Release 138
2018-10-16 12:51:47 +02:00
Pascal Vizeli
0f01ac1b59 Fix syntax 2018-10-16 12:45:06 +02:00
Pascal Vizeli
aa8ab593c0 Rename login_backend to auth_api (#764)
* Update const.py

* Update validate.py

* Update addon.py

* Update auth.py

* Update addons.py

* Update API.md
2018-10-16 12:33:40 +02:00
Pascal Vizeli
84f791220e Don't clean cache on fake auth (#765)
* Don't clean cache on fake auth

* Update auth.py
2018-10-16 12:30:24 +02:00
Pascal Vizeli
cee2c5469f Bump version 138 2018-10-15 15:25:29 +02:00
Pascal Vizeli
6e75964a8b Merge pull request #761 from home-assistant/dev
Release 137
2018-10-15 15:25:05 +02:00
Pascal Vizeli
5ab5036504 Fix proxy handling with failing connection (#760)
* Fix proxy handling with failing connection

* fix lint

* Fix exception handling

* clenaup error handling

* Fix type error

* Fix event stream

* Fix stream handling

* Fix

* Fix lint

* Handle

* Update proxy.py

* fix lint
2018-10-15 13:01:52 +02:00
Pascal Vizeli
000a3c1f7e Bump to 137 2018-10-12 14:39:47 +02:00
Pascal Vizeli
8ea123eb94 Merge pull request #754 from home-assistant/dev
Release 136
2018-10-12 14:39:18 +02:00
Pascal Vizeli
571c42ef7d Create role for backup add-ons (#755)
* Create role for backup add-ons

* Update validate.py

* Update security.py
2018-10-12 12:48:12 +02:00
Pascal Vizeli
8443da0b9f Add-on SSO support with Home Assistant auth system (#752)
* Create auth.py

* Finish auth cache

* Add documentation

* Add valid schema

* Update auth.py

* Update auth.py

* Update security.py

* Create auth.py

* Update coresys.py

* Update bootstrap.py

* Update const.py

* Update validate.py

* Update const.py

* Update addon.py

* Update auth.py

* Update __init__.py

* Update auth.py

* Update auth.py

* Update auth.py

* Update const.py

* Update auth.py

* Update auth.py

* Update auth.py

* Update validate.py

* Update coresys.py

* Update auth.py

* Update auth.py

* more security

* Update API.md

* Update auth.py

* Update auth.py

* Update auth.py

* Update auth.py

* Update auth.py

* Update homeassistant.py

* Update homeassistant.py
2018-10-12 12:21:48 +02:00
Pascal Vizeli
7dbbcf24c8 Check exists hardware for audio/gpio devices (#753)
* Update hardware.py

* Update addon.py

* Update hardware.py

* Update addon.py
2018-10-12 10:22:58 +02:00
Pascal Vizeli
468cb0c36b Rename info (#750)
* Rename version to info

* fix security
2018-10-10 16:46:34 +02:00
Pascal Vizeli
78e093df96 Bump version 136 2018-10-09 17:10:25 +02:00
Pascal Vizeli
ec4d7dab21 Merge pull request #749 from home-assistant/dev
Release 135
2018-10-09 17:08:19 +02:00
Pascal Vizeli
d00ee0adea Add hostname into version API call (#748) 2018-10-09 15:40:44 +02:00
Pascal Vizeli
55d5ee4ed4 Merge pull request #747 from mbo18/patch-1
Add missing tinker board
2018-10-09 14:00:44 +02:00
mbo18
0e51d74265 Add missing tinker board 2018-10-09 09:29:38 +02:00
Pascal Vizeli
916f3caedd Bump version 135 2018-10-08 00:21:59 +02:00
Pascal Vizeli
ff80ccce64 Merge pull request #745 from home-assistant/dev
Release 134
2018-10-08 00:20:20 +02:00
Pascal Vizeli
23f28b38e9 small code cleanups (#740)
* small code cleanups

* Update __init__.py

* Update homeassistant.py

* Update __init__.py

* Update homeassistant.py

* Update homeassistant.py

* Update __init__.py

* fix list

* Fix api call
2018-10-07 23:50:18 +02:00
Franck Nijhof
da425a0530 Adds support for privilege DAC_READ_SEARCH (#743)
* Adds support for privilege DAC_READ_SEARCH

* 🚑 Fixes security rating regarding privileges
2018-10-07 19:17:06 +02:00
Jorim Tielemans
79dca1608e Fix machine 'odroid-c2' (#744)
Odroid-cu2 does not exist AFAIK, it needs to be c2.
2018-10-07 19:16:29 +02:00
Pascal Vizeli
33b615e40d Fix manager access to /addons (#738) 2018-10-05 13:48:29 +02:00
Pascal Vizeli
c825c40c4d Bump version 134 2018-10-01 19:07:48 +02:00
Pascal Vizeli
8beb723cc2 Merge pull request #736 from home-assistant/dev
Release 133
2018-10-01 19:07:17 +02:00
Pascal Vizeli
94fd24c251 Bugfix message handling (#735) 2018-10-01 18:57:31 +02:00
Pascal Vizeli
bf75a8a439 Cleanup discovery data (#734)
* Cleanup discovery data

* Update API.md

* Update validate.py

* Update discovery.py

* Update const.py
2018-10-01 16:17:46 +02:00
Pascal Vizeli
36cdb05387 Don't allow add-on to update itself (#733) 2018-10-01 15:22:26 +02:00
Pascal Vizeli
dccc652d42 Bump version 133 2018-09-30 20:16:42 +02:00
Pascal Vizeli
74e03a9a2e Merge pull request #728 from home-assistant/dev
Release 132
2018-09-30 20:16:08 +02:00
Pascal Vizeli
2f6df3a946 Fix discovery on add-on uninstall (#731)
* Fix discovery on add-on uninstall

* Update discovery.py

* Update discovery.py
2018-09-30 18:24:10 +02:00
Pascal Vizeli
2872be6385 Update Panel (#730) 2018-09-30 17:58:26 +02:00
Pascal Vizeli
af19e95c81 Make discovery persistent (#727)
* Make discovery persistent

* fix file handling

* fix detection

* Smooth

* Fix ring import

* Fix handling

* fix schema

* fix validate

* fix discovery cleanup
2018-09-30 15:33:16 +02:00
Pascal Vizeli
e5451973bd Overwork Services/Discovery (#725)
* Update homeassistant.py

* Update validate.py

* Update exceptions.py

* Update services.py

* Update discovery.py

* fix gitignore

* Fix handling for discovery

* use object in ref

* lock down discovery API

* fix api

* Design

* Fix API

* fix lint

* fix

* Fix security layer

* add provide layer

* fix access

* change rating

* fix rights

* Fix API error handling

* raise error

* fix rights

* api

* fix handling

* fix

* debug

* debug json

* Fix validator

* fix error

* new url

* fix schema
2018-09-29 19:49:08 +02:00
Pascal Vizeli
4ef8c9d633 Change API for new UI & Add machine support (#720)
* Change API for new UI

* Update API.md

* Update validate.py

* Update addon.py

* Update API.md

* Update addons.py

* fix lint

* Update security.py

* Update version.py

* Update security.py

* Update security.py
2018-09-28 14:34:43 +02:00
Pascal Vizeli
4a9dcb540e Add support for long live token (#719)
* Add support for long live token

* Update proxy.py
2018-09-27 14:35:40 +02:00
Pascal Vizeli
61eefea358 Add version endpoint (#718)
* Add version endpoint

* Update API.md

* Update const.py

* Create version.py

* Update __init__.py

* Update security.py

* Update version.py
2018-09-26 11:39:45 +02:00
Pascal Vizeli
f2a5512bbf Fix not exists label bug (#717) 2018-09-25 13:46:48 +02:00
Pascal Vizeli
2f4e114f25 Fix wrong regex 2018-09-25 12:51:47 +02:00
Pascal Vizeli
c91bac2527 Add log to blacklist / reduce free calls (#713) 2018-09-24 17:03:21 +02:00
Pascal Vizeli
52da7605f5 Enable Security API (#710)
* Enable Security API

* Update addons.py

* Update proxy.py

* Update __init__.py

* Update security.py

* Fix lint
2018-09-24 15:11:33 +02:00
Fabian Affolter
267791833e Update docstrings, comments and log messages (#707) 2018-09-18 23:47:47 +02:00
Pascal Vizeli
67dcf1563b Bump version to 132 2018-09-18 21:20:10 +02:00
Pascal Vizeli
ccff0f5b9e Merge pull request #706 from home-assistant/dev
Release 131
2018-09-18 21:19:33 +02:00
Pascal Vizeli
9f8ad05471 Add API role system (#703)
* Add API role system

* Finish

* Simplify

* Fix lint

* Fix rights

* Fix lint

* Fix spell

* Fix log
2018-09-18 20:39:58 +02:00
Fabian Affolter
c2299ef8da Fix typos (#704) 2018-09-18 18:17:20 +02:00
Franck Nijhof
f5845564db 👕 Fixes a typo in method name (#702) 2018-09-17 23:11:53 +02:00
Franck Nijhof
17904d70d8 🚀 Adds venv to .dockerignore (#701) 2018-09-17 21:03:14 +02:00
Franck Nijhof
622e99e04c Adds host PID mode support for add-ons (#700)
* Adds host PID mode support for add-ons.

* 🔒 Disables host PID mode when in protected mode

* 🚦 Adds more negative rating weight to host PID mode
2018-09-17 21:02:28 +02:00
Pascal Vizeli
061420f279 Make Label handling more robust (#696)
* Make Label handling more robust

* Update interface.py

* Update interface.py

* Update interface.py
2018-09-15 22:07:05 +02:00
Franck Nijhof
3d459f1b8b Adds support for SYS_PTRACE add-on privileges (#697) 2018-09-15 22:05:50 +02:00
Pascal Vizeli
5f3dd6190a Bump version 130 2018-09-10 00:02:27 +02:00
Pascal Vizeli
ac824d3af6 Merge pull request #691 from home-assistant/dev
Release 130
2018-09-10 00:00:56 +02:00
Pascal Vizeli
dd25c29544 Bugfix Proxy with new token (#690)
* Update proxy.py

* Update security.py
2018-09-09 23:47:35 +02:00
Pascal Vizeli
5cbdbffbb2 Bump version to 130 2018-09-08 00:17:05 +02:00
Pascal Vizeli
bb81f14c2c Merge pull request #688 from home-assistant/dev
Release 129
2018-09-08 00:16:17 +02:00
Pascal Vizeli
cecefd6972 Change access to API (#686)
* Update API.md

* Update API.md

* Update API.md

* Update addons.py

* Update addons.py

* Update addons.py

* Update addons.py

* Update __init__.py

* Update security.py

* Update security.py

* Update const.py

* Update validate.py

* Update __init__.py

* Update validate.py

* Update homeassistant.py

* Update homeassistant.py

* Update homeassistant.py

* Update addon.py

* Update addon.py

* Update homeassistant.py

* Fix lint

* Fix lint

* Backward combatibility

* Make token more robust

* Fix bug

* Logic error

* Fix access

* fix valid
2018-09-07 22:59:31 +02:00
Pascal Vizeli
ff7f6a0b4c Bump version 129 2018-08-29 10:16:04 +02:00
Pascal Vizeli
1dc9f35e12 Merge pull request #674 from home-assistant/dev
Release 128
2018-08-29 10:13:57 +02:00
Pascal Vizeli
051b63c7cc Fix access token property (#673)
* Fix access token property

* revert
2018-08-28 17:04:39 +02:00
Pascal Vizeli
aac4b9b24a Snapshot/Restore Home-Assistant token (#672)
* Snapshot/Restore Home-Assistant token

* Encrypt token & check api

* fix lint
2018-08-28 16:32:17 +02:00
Paulus Schoutsen
1a208a20b6 Handle access token expiration (#671) 2018-08-28 12:14:40 +02:00
Pascal Vizeli
b1e8722ead Update: pycryptodome to 3.6.6 (#670) 2018-08-28 12:04:32 +02:00
Pascal Vizeli
a66af6e903 Update aiohttp to 3.4.0 (#668)
Update: aiohttp to 3.4.0
2018-08-28 01:18:38 +02:00
Pascal Vizeli
0c345fc615 Bump version 128 2018-08-19 22:05:42 +02:00
Pascal Vizeli
087b082a6b Merge pull request #660 from home-assistant/dev
Release 127
2018-08-19 22:03:49 +02:00
Pascal Vizeli
0b85209eae Detect running record migration (#659)
* Detect running record migration

* Fix order

* Change order second one
2018-08-19 21:58:19 +02:00
Pascal Vizeli
d81bc7de46 Change rating 1-6 (#658) 2018-08-19 18:17:14 +02:00
Pascal Vizeli
e3a99b9f89 Fix /share inside whitelist (#657) 2018-08-18 15:05:18 +02:00
Pascal Vizeli
5d319b37ea Bump verison 127 2018-08-16 23:38:57 +02:00
Pascal Vizeli
9f25606986 Merge pull request #653 from home-assistant/dev
Release 126
2018-08-16 23:38:24 +02:00
Pascal Vizeli
ecd12732ee New generation of security and access (#652)
* New generation of security and access

* Update const.py

* Update validate.py

* Update addon.py

* Update validate.py

* Fix name

* Allow access

* Fix

* add logs

* change message

* add rating

* fix lint

* fix lint

* fix

* Fix
2018-08-16 22:49:08 +02:00
Pascal Vizeli
85fbde8e36 Fix Dockerfile 2018-08-16 01:42:56 +02:00
Pascal Vizeli
6e6c2c3efb Change timezone handling (#641)
* Change timezone handling

* Update dt.py

* Update homeassistant.py

* fix

* Use new timezone

* fix handling

* fix regex

* fix regex

* Rename old config

* fix lint

* simplify

* fix regex

* fix

* cleanup

* cleanup

* fix

* fix find

* mm
2018-08-16 01:40:20 +02:00
Pascal Vizeli
0d4a808449 Improve docker build cache for supervisor (#651) 2018-08-15 23:52:52 +02:00
Pascal Vizeli
087f746647 update docker API to 3.5.0 (#650) 2018-08-15 22:05:13 +02:00
Pascal Vizeli
640d66ad1a Update uvloop 0.11.2 (#648) 2018-08-15 21:38:57 +02:00
Pascal Vizeli
f5f5ed83af Bump version 126 2018-08-09 14:38:34 +02:00
Pascal Vizeli
95f01a1161 Merge pull request #640 from home-assistant/dev
Release 125
2018-08-09 14:37:56 +02:00
Pascal Vizeli
b84e7e7d94 Allow to reset token (#639)
* Allow to reset token

* Update homeassistant.py
2018-08-09 14:37:00 +02:00
Pascal Vizeli
5d7018f3f0 Bump version 125 2018-08-09 01:05:21 +02:00
Pascal Vizeli
d87a85ceb5 Merge pull request #636 from home-assistant/dev
Release 124
2018-08-09 01:03:47 +02:00
Pascal Vizeli
9ab6e80b6f Cleanup logging (#637)
* Cleanup logging

* simplify
2018-08-09 01:03:00 +02:00
Pascal Vizeli
78e91e859e Add add-on support for docker sock ro (#635)
* Add add-on support for docker sock ro

* fix
2018-08-09 00:42:33 +02:00
Pascal Vizeli
9eee8eade6 Fix gpio mapping on amd64 systems (#634) 2018-08-09 00:29:20 +02:00
Pascal Vizeli
124ce0b8b7 Update voluptuous 0.11.5 (#622) 2018-08-09 00:06:49 +02:00
Pascal Vizeli
00e7d96472 Fix new auth system (#633)
* Fix new auth system

* Update exceptions.py

* Update exceptions.py

* Update homeassistant.py

* Update homeassistant.py

* Update homeassistant.py

* Fix some API Errors

* fix lint
2018-08-09 00:05:08 +02:00
Pascal Vizeli
398815efd8 Bump version to 124 2018-08-08 19:22:12 +02:00
Pascal Vizeli
bdc2bdcf56 Merge pull request #631 from ndarilek/dev
Add SYS_RESOURCE to list of valid privileges
2018-08-07 17:06:15 +02:00
Nolan Darilek
68eafb0a7d Add SYS_RESOURCE to list of valid privileges 2018-08-07 03:21:21 +00:00
Pascal Vizeli
7ca2fd7193 Merge pull request #618 from home-assistant/dev
Release 123
2018-08-04 01:25:55 +02:00
Pascal Vizeli
ec823edd8f Cleanup docker image (#617) 2018-08-04 00:41:14 +02:00
Pascal Vizeli
858c7a1fa7 Bump version 123 2018-08-02 23:40:52 +02:00
Pascal Vizeli
6ac45a24fc Merge pull request #615 from home-assistant/dev
Release 122
2018-08-02 23:39:43 +02:00
Pascal Vizeli
9430b39042 Update uvloop version 0.11.1 (#614) 2018-08-02 23:18:40 +02:00
Pascal Vizeli
ae7466ccfe Fix UnicodeDecodeError with read json file (#613)
* Update json.py

* Update data.py
2018-08-02 21:48:50 +02:00
Simon Holzmayer
2c17fe5da8 Adapt regex validation to allow docker images from other registries (#608)
* Adapt regex validation to allow images from other registries than dockerhub

Issue #564

* Update validate.py
2018-07-30 12:34:42 +02:00
Pascal Vizeli
a0fb91af29 Use requirements.txt (#607)
* Create requirements.txt

* Update setup.py

* Update Dockerfile

* Update Dockerfile

* Update requirements.txt

* Update requirements.txt

* Update Dockerfile

* Update tox.ini
2018-07-27 16:34:47 +02:00
Pascal Vizeli
f626e31fd3 Bump version 122 2018-07-25 01:52:24 +02:00
140 changed files with 3699 additions and 2668 deletions

.dockerignore

@@ -7,3 +7,7 @@
# Temporary files
**/__pycache__
# virtualenv
venv/
ENV/

.github/ISSUE_TEMPLATE.md (new file, 29 lines)

@@ -0,0 +1,29 @@
<!-- READ THIS FIRST:
- If you need additional help with this template please refer to https://www.home-assistant.io/help/reporting_issues/
- Make sure you are running the latest version of Home Assistant before reporting an issue: https://github.com/home-assistant/home-assistant/releases
- Do not report issues for components here, please refer to https://github.com/home-assistant/home-assistant/issues
- This is for bugs only. Feature and enhancement requests should go in our community forum: https://community.home-assistant.io/c/feature-requests
- Provide as many details as possible. Paste logs, configuration sample and code into the backticks. Do not delete any text from this template!
- If you have a problem with an Add-on, open an issue on its repository.
-->
**Home Assistant release with the issue:**
<!--
- Frontend -> Developer tools -> Info
- Or use this command: hass --version
-->
**Operating environment (HassOS/Generic):**
<!--
Please provide details about your environment.
-->
**Supervisor logs:**
<!--
- Frontend -> Hass.io -> System
- Or use this command: hassio su logs
-->
**Description of problem:**

.github/main.workflow (new file, 16 lines)

@@ -0,0 +1,16 @@
workflow "tox" {
on = "push"
resolves = [
"Python 3.7",
"Json Files",
]
}
action "Python 3.7" {
uses = "home-assistant/actions/py37-tox@master"
}
action "Json Files" {
uses = "home-assistant/actions/jq@master"
args = "**/*.json"
}

.gitignore (3 lines changed)

@@ -90,3 +90,6 @@ ENV/
# pylint
.pylint.d/
# VS Code
.vscode/

.gitmodules (1 line changed)

@@ -1,3 +1,4 @@
[submodule "home-assistant-polymer"]
path = home-assistant-polymer
url = https://github.com/home-assistant/home-assistant-polymer
branch = dev

.travis.yml (deleted)

@@ -1,6 +0,0 @@
sudo: true
dist: xenial
install: pip install -U tox
language: python
python: 3.7
script: tox

API.md (107 lines changed)

@@ -1,4 +1,4 @@
# Hass.io Server
# Hass.io
## Hass.io RESTful API
@@ -22,11 +22,14 @@ On success / Code 200:
}
```
For access to API you need set the `X-HASSIO-KEY` they will be available for Add-ons/HomeAssistant with envoriment `HASSIO_TOKEN`.
For access to API you need set the `X-HASSIO-KEY` they will be available for Add-ons/HomeAssistant with environment `HASSIO_TOKEN`.
### Hass.io
- GET `/supervisor/ping`
This API call don't need a token.
- GET `/supervisor/info`
The addons from `addons` are only installed one.
@@ -311,9 +314,10 @@ Load host configs from a USB stick.
"CARD_ID": {
"name": "xy",
"type": "microphone",
"devices": {
"DEV_ID": "type of device"
}
"devices": [
"chan_id": "channel ID",
"chan_type": "type of device"
]
}
}
}
@@ -342,6 +346,7 @@ Load host configs from a USB stick.
{
"version": "INSTALL_VERSION",
"last_version": "LAST_VERSION",
"arch": "arch",
"machine": "Image machine type",
"image": "str",
"custom": "bool -> if custom image",
@@ -349,7 +354,7 @@ Load host configs from a USB stick.
"port": 8123,
"ssl": "bool",
"watchdog": "bool",
"startup_time": 600
"wait_boot": 600
}
```
@@ -383,7 +388,7 @@ Output is the raw Docker log.
"password": "",
"refresh_token": "",
"watchdog": "bool",
"startup_time": 600
"wait_boot": 600
}
```
@@ -412,6 +417,8 @@ Proxy to real websocket instance.
### RESTful for API addons
If an add-on will call itself, you can use `/addons/self/...`.
- GET `/addons`
Get all available addons.
@@ -423,11 +430,11 @@ Get all available addons.
"name": "xy bla",
"slug": "xy",
"description": "description",
"arch": ["armhf", "aarch64", "i386", "amd64"],
"repository": "core|local|REP_ID",
"version": "LAST_VERSION",
"installed": "none|INSTALL_VERSION",
"detached": "bool",
"available": "bool",
"build": "bool",
"url": "null|url",
"icon": "bool",
@@ -458,6 +465,9 @@ Get all available addons.
"auto_update": "bool",
"url": "null|url of addon",
"detached": "bool",
"available": "bool",
"arch": ["armhf", "aarch64", "i386", "amd64"],
"machine": "[raspberrypi2, tinker]",
"repository": "12345678|null",
"version": "null|VERSION_INSTALLED",
"last_version": "LAST_VERSION",
@@ -467,6 +477,7 @@ Get all available addons.
"options": "{}",
"network": "{}|null",
"host_network": "bool",
"host_pid": "bool",
"host_ipc": "bool",
"host_dbus": "bool",
"privileged": ["NET_ADMIN", "SYS_ADMIN"],
@@ -477,16 +488,23 @@ Get all available addons.
"logo": "bool",
"changelog": "bool",
"hassio_api": "bool",
"hassio_role": "default|homeassistant|manager|admin",
"homeassistant_api": "bool",
"auth_api": "bool",
"full_access": "bool",
"protected": "bool",
"rating": "1-6",
"stdin": "bool",
"webui": "null|http(s)://[HOST]:port/xy/zx",
"gpio": "bool",
"kernel_modules": "bool",
"devicetree": "bool",
"docker_api": "bool",
"audio": "bool",
"audio_input": "null|0,0",
"audio_output": "null|0,0",
"services": "null|['mqtt']",
"discovery": "null|['component/platform']"
"services_role": "['service:access']",
"discovery": "['service']"
}
```
@@ -513,6 +531,16 @@ Get all available addons.
Reset custom network/audio/options, set it `null`.
- POST `/addons/{addon}/security`
This function is not callable by itself.
```json
{
"protected": "bool",
}
```
- POST `/addons/{addon}/start`
- POST `/addons/{addon}/stop`
@@ -550,39 +578,36 @@ Write data to add-on stdin
}
```
### Service discovery
### discovery
- GET `/services/discovery`
- GET `/discovery`
```json
{
"discovery": [
{
"provider": "name",
"addon": "slug",
"service": "name",
"uuid": "uuid",
"component": "component",
"platform": "null|platform",
"config": {}
}
]
}
```
- GET `/services/discovery/{UUID}`
- GET `/discovery/{UUID}`
```json
{
"provider": "name",
"addon": "slug",
"service": "name",
"uuid": "uuid",
"component": "component",
"platform": "null|platform",
"config": {}
}
```
- POST `/services/discovery`
- POST `/discovery`
```json
{
"component": "component",
"platform": "null|platform",
"service": "name",
"config": {}
}
```
@@ -594,7 +619,9 @@ return:
}
```
- DEL `/services/discovery/{UUID}`
- DEL `/discovery/{UUID}`
### Services
- GET `/services`
```json
@@ -603,7 +630,7 @@ return:
{
"slug": "name",
"available": "bool",
"provider": "null|name|list"
"providers": "list"
}
]
}
@@ -611,12 +638,10 @@ return:
#### MQTT
This service performs an auto discovery to Home-Assistant.
- GET `/services/mqtt`
```json
{
"provider": "name",
"addon": "name",
"host": "xy",
"port": "8883",
"ssl": "bool",
@@ -639,3 +664,31 @@ This service performs an auto discovery to Home-Assistant.
```
- DEL `/services/mqtt`
### Misc
- GET `/info`
```json
{
"supervisor": "version",
"homeassistant": "version",
"hassos": "null|version",
"hostname": "name",
"machine": "type",
"arch": "arch",
"supported_arch": ["arch1", "arch2"],
"channel": "stable|beta|dev"
}
```
### Auth / SSO API
You can use the user system on homeassistant. We handle this auth system on
supervisor.
You can call post `/auth`
We support:
- Json `{ "user|name": "...", "password": "..." }`
- application/x-www-form-urlencoded `user|name=...&password=...`
- BasicAuth
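
The Auth / SSO section above describes the new `/auth` endpoint that lets an add-on verify Home Assistant user credentials through the Supervisor. A minimal sketch of such a call, assuming the add-on reaches the Supervisor under the usual `http://hassio` host alias, authenticates with the `HASSIO_TOKEN` environment variable described earlier in API.md, and has the `requests` package installed:

```python
import os

import requests  # assumption: the requests package is available in the add-on

# Sketch of an add-on validating Home Assistant user credentials via the
# Supervisor's /auth endpoint. Header name and token variable follow the
# API.md excerpt above; the login field may be "user" or "username" per the
# "user|name" notation, and the credentials here are placeholders.
response = requests.post(
    "http://hassio/auth",
    headers={"X-HASSIO-KEY": os.environ["HASSIO_TOKEN"]},
    json={"username": "example-user", "password": "example-password"},
    timeout=10,
)
response.raise_for_status()  # HTTP 200 means the credentials were accepted
```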

Dockerfile

@@ -1,24 +1,29 @@
ARG BUILD_FROM
FROM $BUILD_FROM
# Add env
ENV LANG C.UTF-8
# Setup base
# Install base
RUN apk add --no-cache \
git \
socat \
glib \
libstdc++ \
eudev-libs \
&& apk add --no-cache --virtual .build-dependencies \
make \
g++ \
&& pip3 install --no-cache-dir \
uvloop==0.11.0 \
cchardet==2.1.1 \
pycryptodome==3.6.4 \
&& apk del .build-dependencies
openssl \
libffi \
musl \
git \
socat \
glib \
libstdc++ \
eudev-libs
# Install requirements
COPY requirements.txt /usr/src/
RUN apk add --no-cache --virtual .build-dependencies \
make \
g++ \
openssl-dev \
libffi-dev \
musl-dev \
&& export MAKEFLAGS="-j$(nproc)" \
&& pip3 install --no-cache-dir -r /usr/src/requirements.txt \
&& apk del .build-dependencies \
&& rm -f /usr/src/requirements.txt
# Install HassIO
COPY . /usr/src/hassio

README.md

@@ -10,9 +10,19 @@ and updating software.
![](misc/hassio.png?raw=true)
- [Hass.io Addons](https://github.com/home-assistant/hassio-addons)
- [Hass.io Build](https://github.com/home-assistant/hassio-build)
## Installation
Installation instructions can be found at <https://home-assistant.io/hassio>.
## Development
The development of the supervisor is a bit tricky. Not difficult but tricky.
- You can use the builder to build your supervisor: https://github.com/home-assistant/hassio-build/tree/master/builder
- Go into a HassOS device or VM and pull your supervisor.
- Set the developer modus on updater.json
- Tag it as `homeassistant/xy-hassio-supervisor:latest`
- Restart the service like `systemctl restart hassos-supervisor | journalctl -fu hassos-supervisor`
- Test your changes
Small Bugfix or improvements, make a PR. Significant change makes first an RFC.

hassio/__init__.py

@@ -1 +1 @@
"""Init file for HassIO."""
"""Init file for Hass.io."""

hassio/__main__.py

@@ -1,4 +1,4 @@
"""Main file for HassIO."""
"""Main file for Hass.io."""
import asyncio
from concurrent.futures import ThreadPoolExecutor
import logging
@@ -9,7 +9,7 @@ from hassio import bootstrap
_LOGGER = logging.getLogger(__name__)
def attempt_use_uvloop():
def initialize_event_loop():
"""Attempt to use uvloop."""
try:
import uvloop
@@ -17,13 +17,17 @@ def attempt_use_uvloop():
except ImportError:
pass
return asyncio.get_event_loop()
# pylint: disable=invalid-name
if __name__ == "__main__":
bootstrap.initialize_logging()
attempt_use_uvloop()
loop = asyncio.get_event_loop()
# Init async event loop
loop = initialize_event_loop()
# Check if all information are available to setup Hass.io
if not bootstrap.check_environment():
sys.exit(1)
@@ -31,8 +35,8 @@ if __name__ == "__main__":
executor = ThreadPoolExecutor(thread_name_prefix="SyncWorker")
loop.set_default_executor(executor)
_LOGGER.info("Initialize Hassio setup")
coresys = bootstrap.initialize_coresys(loop)
_LOGGER.info("Initialize Hass.io setup")
coresys = loop.run_until_complete(bootstrap.initialize_coresys())
bootstrap.migrate_system_env(coresys)
@@ -43,13 +47,13 @@ if __name__ == "__main__":
loop.call_soon_threadsafe(bootstrap.reg_signal, loop)
try:
_LOGGER.info("Run HassIO")
_LOGGER.info("Run Hass.io")
loop.run_forever()
finally:
_LOGGER.info("Stopping HassIO")
_LOGGER.info("Stopping Hass.io")
loop.run_until_complete(coresys.core.stop())
executor.shutdown(wait=False)
loop.close()
_LOGGER.info("Close Hassio")
_LOGGER.info("Close Hass.io")
sys.exit(0)
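
The hunks above rename `attempt_use_uvloop` to `initialize_event_loop` and have it return the loop it configures. The uvloop activation line itself is elided by the hunk context; a self-contained sketch of the pattern, with that assumed line marked, could look like this:

```python
import asyncio


def initialize_event_loop():
    """Attempt to use uvloop, falling back to the default asyncio loop."""
    try:
        import uvloop

        # Assumed detail: the line hidden by the hunk context most likely
        # installs uvloop's event-loop policy before a loop is created.
        asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
    except ImportError:
        pass

    return asyncio.get_event_loop()


loop = initialize_event_loop()
print(type(loop).__name__)  # "Loop" with uvloop, otherwise the stock loop class
```

Keeping the import inside the try block means the Supervisor still starts on platforms where uvloop is not installed.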

hassio/addons/__init__.py

@@ -1,4 +1,4 @@
"""Init file for HassIO addons."""
"""Init file for Hass.io add-ons."""
import asyncio
import logging
@@ -14,10 +14,10 @@ BUILTIN_REPOSITORIES = set((REPOSITORY_CORE, REPOSITORY_LOCAL))
class AddonManager(CoreSysAttributes):
"""Manage addons inside HassIO."""
"""Manage add-ons inside Hass.io."""
def __init__(self, coresys):
"""Initialize docker base wrapper."""
"""Initialize Docker base wrapper."""
self.coresys = coresys
self.data = AddonsData(coresys)
self.addons_obj = {}
@@ -25,44 +25,44 @@ class AddonManager(CoreSysAttributes):
@property
def list_addons(self):
"""Return a list of all addons."""
"""Return a list of all add-ons."""
return list(self.addons_obj.values())
@property
def list_installed(self):
"""Return a list of installed addons."""
"""Return a list of installed add-ons."""
return [addon for addon in self.addons_obj.values()
if addon.is_installed]
@property
def list_repositories(self):
"""Return list of addon repositories."""
"""Return list of add-on repositories."""
return list(self.repositories_obj.values())
def get(self, addon_slug):
"""Return an add-on from slug."""
return self.addons_obj.get(addon_slug)
def from_uuid(self, uuid):
"""Return an add-on from uuid."""
def from_token(self, token):
"""Return an add-on from Hass.io token."""
for addon in self.list_addons:
if addon.is_installed and uuid == addon.uuid:
if addon.is_installed and token == addon.hassio_token:
return addon
return None
async def load(self):
"""Startup addon management."""
"""Start up add-on management."""
self.data.reload()
# init hassio built-in repositories
# Init Hass.io built-in repositories
repositories = \
set(self.sys_config.addons_repositories) | BUILTIN_REPOSITORIES
# init custom repositories & load addons
# Init custom repositories and load add-ons
await self.load_repositories(repositories)
async def reload(self):
"""Update addons from repo and reload list."""
"""Update add-ons from repository and reload list."""
tasks = [repository.update() for repository in
self.repositories_obj.values()]
if tasks:
@@ -106,14 +106,14 @@ class AddonManager(CoreSysAttributes):
await self.load_addons()
async def load_addons(self):
"""Update/add internal addon store."""
"""Update/add internal add-on store."""
all_addons = set(self.data.system) | set(self.data.cache)
# calc diff
add_addons = all_addons - set(self.addons_obj)
del_addons = set(self.addons_obj) - all_addons
_LOGGER.info("Load addons: %d all - %d new - %d remove",
_LOGGER.info("Load add-ons: %d all - %d new - %d remove",
len(all_addons), len(add_addons), len(del_addons))
# new addons
@@ -132,14 +132,14 @@ class AddonManager(CoreSysAttributes):
self.addons_obj.pop(addon_slug)
async def boot(self, stage):
"""Boot addons with mode auto."""
"""Boot add-ons with mode auto."""
tasks = []
for addon in self.addons_obj.values():
if addon.is_installed and addon.boot == BOOT_AUTO and \
addon.startup == stage:
tasks.append(addon.start())
_LOGGER.info("Startup %s run %d addons", stage, len(tasks))
_LOGGER.info("Startup %s run %d add-ons", stage, len(tasks))
if tasks:
await asyncio.wait(tasks)
await asyncio.sleep(self.sys_config.wait_boot)
@@ -153,6 +153,6 @@ class AddonManager(CoreSysAttributes):
addon.startup == stage:
tasks.append(addon.stop())
_LOGGER.info("Shutdown %s stop %d addons", stage, len(tasks))
_LOGGER.info("Shutdown %s stop %d add-ons", stage, len(tasks))
if tasks:
await asyncio.wait(tasks)

hassio/addons/addon.py

@@ -1,37 +1,41 @@
"""Init file for HassIO addons."""
"""Init file for Hass.io add-ons."""
from contextlib import suppress
from copy import deepcopy
import logging
import json
from pathlib import Path, PurePath
import re
import shutil
import tarfile
from tempfile import TemporaryDirectory
from typing import Dict, Any
import voluptuous as vol
from voluptuous.humanize import humanize_error
from .validate import (
validate_options, SCHEMA_ADDON_SNAPSHOT, RE_VOLUME, RE_SERVICE)
from .utils import check_installed, remove_data
from ..const import (
ATTR_NAME, ATTR_VERSION, ATTR_SLUG, ATTR_DESCRIPTON, ATTR_BOOT, ATTR_MAP,
ATTR_OPTIONS, ATTR_PORTS, ATTR_SCHEMA, ATTR_IMAGE, ATTR_REPOSITORY,
ATTR_URL, ATTR_ARCH, ATTR_LOCATON, ATTR_DEVICES, ATTR_ENVIRONMENT,
ATTR_HOST_NETWORK, ATTR_TMPFS, ATTR_PRIVILEGED, ATTR_STARTUP, ATTR_UUID,
STATE_STARTED, STATE_STOPPED, STATE_NONE, ATTR_USER, ATTR_SYSTEM,
ATTR_STATE, ATTR_TIMEOUT, ATTR_AUTO_UPDATE, ATTR_NETWORK, ATTR_WEBUI,
ATTR_HASSIO_API, ATTR_AUDIO, ATTR_AUDIO_OUTPUT, ATTR_AUDIO_INPUT,
ATTR_GPIO, ATTR_HOMEASSISTANT_API, ATTR_STDIN, ATTR_LEGACY, ATTR_HOST_IPC,
ATTR_HOST_DBUS, ATTR_AUTO_UART, ATTR_DISCOVERY, ATTR_SERVICES,
ATTR_APPARMOR, ATTR_DEVICETREE, SECURITY_PROFILE, SECURITY_DISABLE,
SECURITY_DEFAULT)
ATTR_ACCESS_TOKEN, ATTR_APPARMOR, ATTR_ARCH, ATTR_AUDIO, ATTR_AUDIO_INPUT,
ATTR_AUDIO_OUTPUT, ATTR_AUTH_API, ATTR_AUTO_UART, ATTR_AUTO_UPDATE,
ATTR_BOOT, ATTR_DESCRIPTON, ATTR_DEVICES, ATTR_DEVICETREE, ATTR_DISCOVERY,
ATTR_DOCKER_API, ATTR_ENVIRONMENT, ATTR_FULL_ACCESS, ATTR_GPIO,
ATTR_HASSIO_API, ATTR_HASSIO_ROLE, ATTR_HOMEASSISTANT_API, ATTR_HOST_DBUS,
ATTR_HOST_IPC, ATTR_HOST_NETWORK, ATTR_HOST_PID, ATTR_IMAGE,
ATTR_KERNEL_MODULES, ATTR_LEGACY, ATTR_LOCATON, ATTR_MACHINE, ATTR_MAP,
ATTR_NAME, ATTR_NETWORK, ATTR_OPTIONS, ATTR_PORTS, ATTR_PRIVILEGED,
ATTR_PROTECTED, ATTR_REPOSITORY, ATTR_SCHEMA, ATTR_SERVICES, ATTR_SLUG,
ATTR_STARTUP, ATTR_STATE, ATTR_STDIN, ATTR_SYSTEM, ATTR_TIMEOUT,
ATTR_TMPFS, ATTR_URL, ATTR_USER, ATTR_UUID, ATTR_VERSION, ATTR_WEBUI,
SECURITY_DEFAULT, SECURITY_DISABLE, SECURITY_PROFILE, STATE_NONE,
STATE_STARTED, STATE_STOPPED)
from ..coresys import CoreSysAttributes
from ..docker.addon import DockerAddon
from ..utils.json import write_json_file, read_json_file
from ..exceptions import HostAppArmorError, JsonFileError
from ..utils import create_token
from ..utils.apparmor import adjust_profile
from ..exceptions import HostAppArmorError
from ..utils.json import read_json_file, write_json_file
from .utils import check_installed, remove_data
from .validate import (
MACHINE_ALL, RE_SERVICE, RE_VOLUME, SCHEMA_ADDON_SNAPSHOT,
validate_options)
_LOGGER = logging.getLogger(__name__)
@@ -41,7 +45,7 @@ RE_WEBUI = re.compile(
class Addon(CoreSysAttributes):
"""Hold data for addon inside HassIO."""
"""Hold data for add-on inside Hass.io."""
def __init__(self, coresys, slug):
"""Initialize data holder."""
@@ -52,65 +56,85 @@ class Addon(CoreSysAttributes):
async def load(self):
"""Async initialize of object."""
if self.is_installed:
await self.instance.attach()
if not self.is_installed:
return
await self.instance.attach()
@property
def slug(self):
"""Return slug/id of addon."""
"""Return slug/id of add-on."""
return self._id
@property
def _mesh(self):
"""Return addon data from system or cache."""
"""Return add-on data from system or cache."""
return self._data.system.get(self._id, self._data.cache.get(self._id))
@property
def _data(self):
"""Return addons data storage."""
"""Return add-ons data storage."""
return self.sys_addons.data
@property
def is_installed(self):
"""Return True if an addon is installed."""
"""Return True if an add-on is installed."""
return self._id in self._data.system
@property
def is_detached(self):
"""Return True if addon is detached."""
"""Return True if add-on is detached."""
return self._id not in self._data.cache
@property
def available(self):
"""Return True if this add-on is available on this platform."""
# Architecture
if not self.sys_arch.is_supported(self.supported_arch):
return False
# Machine / Hardware
if self.sys_machine not in self.supported_machine:
return False
return True
@property
def version_installed(self):
"""Return installed version."""
return self._data.user.get(self._id, {}).get(ATTR_VERSION)
def _set_install(self, version):
def _set_install(self, image: str, version: str) -> None:
"""Set addon as installed."""
self._data.system[self._id] = deepcopy(self._data.cache[self._id])
self._data.user[self._id] = {
ATTR_OPTIONS: {},
ATTR_VERSION: version,
ATTR_IMAGE: image,
}
self._data.save_data()
self.save_data()
def _set_uninstall(self):
"""Set addon as uninstalled."""
def _set_uninstall(self) -> None:
"""Set add-on as uninstalled."""
self._data.system.pop(self._id, None)
self._data.user.pop(self._id, None)
self._data.save_data()
self.save_data()
def _set_update(self, version):
"""Update version of addon."""
def _set_update(self, image: str, version: str) -> None:
"""Update version of add-on."""
self._data.system[self._id] = deepcopy(self._data.cache[self._id])
self._data.user[self._id][ATTR_VERSION] = version
self._data.save_data()
self._data.user[self._id].update({
ATTR_VERSION: version,
ATTR_IMAGE: image,
})
self.save_data()
def _restore_data(self, user, system):
"""Restore data to addon."""
def _restore_data(self, user: Dict[str, Any], system: Dict[str, Any], image: str) -> None:
"""Restore data to add-on."""
self._data.user[self._id] = deepcopy(user)
self._data.system[self._id] = deepcopy(system)
self._data.save_data()
self._data.user[self._id][ATTR_IMAGE] = image
self.save_data()
@property
def options(self):
@@ -124,7 +148,7 @@ class Addon(CoreSysAttributes):
@options.setter
def options(self, value):
"""Store user addon options."""
"""Store user add-on options."""
if value is None:
self._data.user[self._id][ATTR_OPTIONS] = {}
else:
@@ -156,7 +180,7 @@ class Addon(CoreSysAttributes):
@property
def name(self):
"""Return name of addon."""
"""Return name of add-on."""
return self._mesh[ATTR_NAME]
@property
@@ -171,9 +195,16 @@ class Addon(CoreSysAttributes):
return self._data.user[self._id][ATTR_UUID]
return None
@property
def hassio_token(self):
"""Return access token for Hass.io API."""
if self.is_installed:
return self._data.user[self._id].get(ATTR_ACCESS_TOKEN)
return None
@property
def description(self):
"""Return description of addon."""
"""Return description of add-on."""
return self._mesh[ATTR_DESCRIPTON]
@property
@@ -191,44 +222,55 @@ class Addon(CoreSysAttributes):
@property
def repository(self):
"""Return repository of addon."""
"""Return repository of add-on."""
return self._mesh[ATTR_REPOSITORY]
@property
def last_version(self):
"""Return version of addon."""
"""Return version of add-on."""
if self._id in self._data.cache:
return self._data.cache[self._id][ATTR_VERSION]
return self.version_installed
@property
def protected(self):
"""Return if add-on is in protected mode."""
if self.is_installed:
return self._data.user[self._id][ATTR_PROTECTED]
return True
@protected.setter
def protected(self, value):
"""Set add-on in protected mode."""
self._data.user[self._id][ATTR_PROTECTED] = value
@property
def startup(self):
"""Return startup type of addon."""
"""Return startup type of add-on."""
return self._mesh.get(ATTR_STARTUP)
@property
def services(self):
def services_role(self):
"""Return dict of services with rights."""
raw_services = self._mesh.get(ATTR_SERVICES)
if not raw_services:
return None
return {}
formated_services = {}
services = {}
for data in raw_services:
service = RE_SERVICE.match(data)
formated_services[service.group('service')] = \
service.group('rights') or 'ro'
services[service.group('service')] = service.group('rights')
return formated_services
return services
@property
def discovery(self):
"""Return list of discoverable components/platforms."""
return self._mesh.get(ATTR_DISCOVERY)
return self._mesh.get(ATTR_DISCOVERY, [])
@property
def ports(self):
"""Return ports of addon."""
"""Return ports of add-on."""
if self.host_network or ATTR_PORTS not in self._mesh:
return None
@@ -239,7 +281,7 @@ class Addon(CoreSysAttributes):
@ports.setter
def ports(self, value):
"""Set custom ports of addon."""
"""Set custom ports of add-on."""
if value is None:
self._data.user[self._id].pop(ATTR_NETWORK, None)
else:
@@ -283,47 +325,52 @@ class Addon(CoreSysAttributes):
@property
def host_network(self):
"""Return True if addon run on host network."""
"""Return True if add-on run on host network."""
return self._mesh[ATTR_HOST_NETWORK]
@property
def host_pid(self):
"""Return True if add-on run on host PID namespace."""
return self._mesh[ATTR_HOST_PID]
@property
def host_ipc(self):
"""Return True if addon run on host IPC namespace."""
"""Return True if add-on run on host IPC namespace."""
return self._mesh[ATTR_HOST_IPC]
@property
def host_dbus(self):
"""Return True if addon run on host DBUS."""
"""Return True if add-on run on host D-BUS."""
return self._mesh[ATTR_HOST_DBUS]
@property
def devices(self):
"""Return devices of addon."""
"""Return devices of add-on."""
return self._mesh.get(ATTR_DEVICES)
@property
def auto_uart(self):
"""Return True if we should map all uart device."""
"""Return True if we should map all UART device."""
return self._mesh.get(ATTR_AUTO_UART)
@property
def tmpfs(self):
"""Return tmpfs of addon."""
"""Return tmpfs of add-on."""
return self._mesh.get(ATTR_TMPFS)
@property
def environment(self):
"""Return environment of addon."""
"""Return environment of add-on."""
return self._mesh.get(ATTR_ENVIRONMENT)
@property
def privileged(self):
"""Return list of privilege."""
return self._mesh.get(ATTR_PRIVILEGED)
return self._mesh.get(ATTR_PRIVILEGED, [])
@property
def apparmor(self):
"""Return True if apparmor is enabled."""
"""Return True if AppArmor is enabled."""
if not self._mesh.get(ATTR_APPARMOR):
return SECURITY_DISABLE
elif self.sys_host.apparmor.exists(self.slug):
@@ -332,19 +379,29 @@ class Addon(CoreSysAttributes):
@property
def legacy(self):
"""Return if the add-on don't support hass labels."""
"""Return if the add-on don't support Home Assistant labels."""
return self._mesh.get(ATTR_LEGACY)
@property
def access_docker_api(self):
"""Return if the add-on need read-only Docker API access."""
return self._mesh.get(ATTR_DOCKER_API)
@property
def access_hassio_api(self):
"""Return True if the add-on access to hassio api."""
"""Return True if the add-on access to Hass.io REASTful API."""
return self._mesh[ATTR_HASSIO_API]
@property
def access_homeassistant_api(self):
"""Return True if the add-on access to Home-Assistant api proxy."""
"""Return True if the add-on access to Home Assistant API proxy."""
return self._mesh[ATTR_HOMEASSISTANT_API]
@property
def hassio_role(self):
"""Return Hass.io role for API."""
return self._mesh[ATTR_HASSIO_ROLE]
@property
def with_stdin(self):
"""Return True if the add-on access use stdin input."""
@@ -352,14 +409,29 @@ class Addon(CoreSysAttributes):
@property
def with_gpio(self):
"""Return True if the add-on access to gpio interface."""
"""Return True if the add-on access to GPIO interface."""
return self._mesh[ATTR_GPIO]
@property
def with_kernel_modules(self):
"""Return True if the add-on access to kernel modules."""
return self._mesh[ATTR_KERNEL_MODULES]
@property
def with_full_access(self):
"""Return True if the add-on want full access to hardware."""
return self._mesh[ATTR_FULL_ACCESS]
@property
def with_devicetree(self):
"""Return True if the add-on read access to devicetree."""
return self._mesh[ATTR_DEVICETREE]
@property
def access_auth_api(self):
"""Return True if the add-on access to login/auth backend."""
return self._mesh[ATTR_AUTH_API]
@property
def with_audio(self):
"""Return True if the add-on access to audio."""
@@ -404,7 +476,7 @@ class Addon(CoreSysAttributes):
@property
def url(self):
"""Return url of addon."""
"""Return URL of add-on."""
return self._mesh.get(ATTR_URL)
@property
@@ -428,27 +500,48 @@ class Addon(CoreSysAttributes):
return self._mesh[ATTR_ARCH]
@property
def image(self):
"""Return image name of addon."""
addon_data = self._mesh
def supported_machine(self):
"""Return list of supported machine."""
return self._mesh.get(ATTR_MACHINE) or MACHINE_ALL
# Repository with dockerhub images
@property
def image(self):
"""Return image name of add-on."""
if self.is_installed:
return self._data.user[self._id].get(ATTR_IMAGE)
return self.image_next
@property
def image_next(self):
"""Return image name for install/update."""
if self.is_detached:
addon_data = self._data.system.get(self._id)
else:
addon_data = self._data.cache.get(self._id)
return self._get_image(addon_data)
def _get_image(self, addon_data) -> str:
"""Generate image name from data."""
# Repository with Dockerhub images
if ATTR_IMAGE in addon_data:
return addon_data[ATTR_IMAGE].format(arch=self.sys_arch)
arch = self.sys_arch.match(addon_data[ATTR_ARCH])
return addon_data[ATTR_IMAGE].format(arch=arch)
# local build
return "{}/{}-addon-{}".format(
addon_data[ATTR_REPOSITORY], self.sys_arch,
addon_data[ATTR_SLUG])
return (f"{addon_data[ATTR_REPOSITORY]}/"
f"{self.sys_arch.default}-"
f"addon-{addon_data[ATTR_SLUG]}")
@property
def need_build(self):
"""Return True if this addon need a local build."""
return ATTR_IMAGE not in self._mesh
"""Return True if this add-on need a local build."""
if self.is_detached:
return ATTR_IMAGE not in self._data.system.get(self._id)
return ATTR_IMAGE not in self._data.cache.get(self._id)
@property
def map_volumes(self):
"""Return a dict of {volume: policy} from addon."""
"""Return a dict of {volume: policy} from add-on."""
volumes = {}
for volume in self._mesh[ATTR_MAP]:
result = RE_VOLUME.match(volume)
@@ -458,37 +551,37 @@ class Addon(CoreSysAttributes):
@property
def path_data(self):
"""Return addon data path inside supervisor."""
"""Return add-on data path inside Supervisor."""
return Path(self.sys_config.path_addons_data, self._id)
@property
def path_extern_data(self):
"""Return addon data path external for docker."""
"""Return add-on data path external for Docker."""
return PurePath(self.sys_config.path_extern_addons_data, self._id)
@property
def path_options(self):
"""Return path to addons options."""
"""Return path to add-on options."""
return Path(self.path_data, "options.json")
@property
def path_location(self):
"""Return path to this addon."""
"""Return path to this add-on."""
return Path(self._mesh[ATTR_LOCATON])
@property
def path_icon(self):
"""Return path to addon icon."""
"""Return path to add-on icon."""
return Path(self.path_location, 'icon.png')
@property
def path_logo(self):
"""Return path to addon logo."""
"""Return path to add-on logo."""
return Path(self.path_location, 'logo.png')
@property
def path_changelog(self):
"""Return path to addon changelog."""
"""Return path to add-on changelog."""
return Path(self.path_location, 'CHANGELOG.md')
@property
@@ -503,15 +596,15 @@ class Addon(CoreSysAttributes):
@property
def path_extern_asound(self):
"""Return path to asound config for docker."""
"""Return path to asound config for Docker."""
return Path(self.sys_config.path_extern_tmp, f"{self.slug}_asound")
def save_data(self):
"""Save data of addon."""
"""Save data of add-on."""
self.sys_addons.data.save_data()
def write_options(self):
"""Return True if addon options is written to data."""
"""Return True if add-on options is written to data."""
schema = self.schema
options = self.options
@@ -519,15 +612,22 @@ class Addon(CoreSysAttributes):
schema(options)
write_json_file(self.path_options, options)
except vol.Invalid as ex:
_LOGGER.error("Addon %s have wrong options: %s", self._id,
_LOGGER.error("Add-on %s have wrong options: %s", self._id,
humanize_error(options, ex))
except (OSError, json.JSONDecodeError) as err:
_LOGGER.error("Addon %s can't write options: %s", self._id, err)
except JsonFileError:
_LOGGER.error("Add-on %s can't write options", self._id)
else:
return True
return False
def remove_discovery(self):
"""Remove all discovery message from add-on."""
for message in self.sys_discovery.list_messages:
if message.addon != self.slug:
continue
self.sys_discovery.remove(message)
def write_asound(self):
"""Write asound config to file and return True on success."""
asound_config = self.sys_host.alsa.asound(
@@ -537,7 +637,7 @@ class Addon(CoreSysAttributes):
with self.path_asound.open('w') as config_file:
config_file.write(asound_config)
except OSError as err:
_LOGGER.error("Addon %s can't write asound: %s", self._id, err)
_LOGGER.error("Add-on %s can't write asound: %s", self._id, err)
return False
return True
@@ -565,15 +665,15 @@ class Addon(CoreSysAttributes):
@property
def schema(self):
"""Create a schema for addon options."""
"""Create a schema for add-on options."""
raw_schema = self._mesh[ATTR_SCHEMA]
if isinstance(raw_schema, bool):
return vol.Schema(dict)
return vol.Schema(vol.All(dict, validate_options(raw_schema)))
def test_udpate_schema(self):
"""Check if the exists config valid after update."""
def test_update_schema(self):
"""Check if the existing configuration is valid after update."""
if not self.is_installed or self.is_detached:
return True
@@ -603,39 +703,41 @@ class Addon(CoreSysAttributes):
return True
async def install(self):
"""Install an addon."""
if self.sys_arch not in self.supported_arch:
"""Install an add-on."""
if not self.available:
_LOGGER.error(
"Addon %s not supported on %s", self._id, self.sys_arch)
"Add-on %s not supported on %s with %s architecture",
self._id, self.sys_machine, self.sys_arch.supported)
return False
if self.is_installed:
_LOGGER.error("Addon %s is already installed", self._id)
_LOGGER.error("Add-on %s is already installed", self._id)
return False
if not self.path_data.is_dir():
_LOGGER.info(
"Create Home-Assistant addon data folder %s", self.path_data)
"Create Home Assistant add-on data folder %s", self.path_data)
self.path_data.mkdir()
# Setup/Fix AppArmor profile
await self._install_apparmor()
if not await self.instance.install(self.last_version):
if not await self.instance.install(
self.last_version, self.image_next):
return False
self._set_install(self.last_version)
self._set_install(self.image_next, self.last_version)
return True
@check_installed
async def uninstall(self):
"""Remove an addon."""
"""Remove an add-on."""
if not await self.instance.remove():
return False
if self.path_data.is_dir():
_LOGGER.info(
"Remove Home-Assistant addon data folder %s", self.path_data)
"Remove Home Assistant add-on data folder %s", self.path_data)
await remove_data(self.path_data)
# Cleanup audio settings
@@ -643,16 +745,19 @@ class Addon(CoreSysAttributes):
with suppress(OSError):
self.path_asound.unlink()
# Cleanup apparmor profile
# Cleanup AppArmor profile
if self.sys_host.apparmor.exists(self.slug):
with suppress(HostAppArmorError):
await self.sys_host.apparmor.remove_profile(self.slug)
# Remove discovery messages
self.remove_discovery()
self._set_uninstall()
return True
async def state(self):
"""Return running state of addon."""
"""Return running state of add-on."""
if not self.is_installed:
return STATE_NONE
@@ -662,7 +767,15 @@ class Addon(CoreSysAttributes):
@check_installed
async def start(self):
"""Set options and start addon."""
"""Set options and start add-on."""
if await self.instance.is_running():
_LOGGER.warning("%s already running!", self.slug)
return
# Access Token
self._data.user[self._id][ATTR_ACCESS_TOKEN] = create_token()
self.save_data()
# Options
if not self.write_options():
return False
@@ -675,7 +788,7 @@ class Addon(CoreSysAttributes):
@check_installed
def stop(self):
"""Stop addon.
"""Stop add-on.
Return a coroutine.
"""
@@ -683,16 +796,17 @@ class Addon(CoreSysAttributes):
@check_installed
async def update(self):
"""Update addon."""
"""Update add-on."""
last_state = await self.state()
if self.last_version == self.version_installed:
_LOGGER.warning("No update available for Addon %s", self._id)
_LOGGER.warning("No update available for add-on %s", self._id)
return False
if not await self.instance.update(self.last_version):
if not await self.instance.update(
self.last_version, self.image_next):
return False
self._set_update(self.last_version)
self._set_update(self.image_next, self.last_version)
# Setup/Fix AppArmor profile
await self._install_apparmor()
@@ -704,13 +818,13 @@ class Addon(CoreSysAttributes):
@check_installed
async def restart(self):
"""Restart addon."""
"""Restart add-on."""
await self.stop()
return await self.start()
@check_installed
def logs(self):
"""Return addons log output.
"""Return add-ons log output.
Return a coroutine.
"""
@@ -726,11 +840,11 @@ class Addon(CoreSysAttributes):
@check_installed
async def rebuild(self):
"""Performe a rebuild of local build addon."""
"""Perform a rebuild of local build add-on."""
last_state = await self.state()
if not self.need_build:
_LOGGER.error("Can't rebuild a none local build addon!")
_LOGGER.error("Can't rebuild a none local build add-on!")
return False
# remove docker container but not addon config
@@ -759,7 +873,7 @@ class Addon(CoreSysAttributes):
@check_installed
async def snapshot(self, tar_file):
"""Snapshot state of an addon."""
"""Snapshot state of an add-on."""
with TemporaryDirectory(dir=str(self.sys_config.path_tmp)) as temp:
# store local image
if self.need_build and not await \
@@ -773,11 +887,11 @@ class Addon(CoreSysAttributes):
ATTR_STATE: await self.state(),
}
# store local configs/state
# Store local configs/state
try:
write_json_file(Path(temp, 'addon.json'), data)
except (OSError, json.JSONDecodeError) as err:
_LOGGER.error("Can't save meta for %s: %s", self._id, err)
except JsonFileError:
_LOGGER.error("Can't save meta for %s", self._id)
return False
# Store AppArmor Profile
@@ -797,7 +911,7 @@ class Addon(CoreSysAttributes):
snapshot.add(self.path_data, arcname="data")
try:
_LOGGER.info("Build snapshot for addon %s", self._id)
_LOGGER.info("Build snapshot for add-on %s", self._id)
await self.sys_run_in_executor(_write_tarfile)
except (tarfile.TarError, OSError) as err:
_LOGGER.error("Can't write tarfile %s: %s", tar_file, err)
@@ -807,7 +921,7 @@ class Addon(CoreSysAttributes):
return True
async def restore(self, tar_file):
"""Restore state of an addon."""
"""Restore state of an add-on."""
with TemporaryDirectory(dir=str(self.sys_config.path_tmp)) as temp:
# extract snapshot
def _extract_tarfile():
@@ -821,13 +935,13 @@ class Addon(CoreSysAttributes):
_LOGGER.error("Can't read tarfile %s: %s", tar_file, err)
return False
# read snapshot data
# Read snapshot data
try:
data = read_json_file(Path(temp, 'addon.json'))
except (OSError, json.JSONDecodeError) as err:
_LOGGER.error("Can't read addon.json: %s", err)
except JsonFileError:
return False
# validate
# Validate
try:
data = SCHEMA_ADDON_SNAPSHOT(data)
except vol.Invalid as err:
@@ -835,25 +949,29 @@ class Addon(CoreSysAttributes):
self._id, humanize_error(data, err))
return False
# restore data / reload addon
# Restore local add-on information
_LOGGER.info("Restore config for addon %s", self._id)
self._restore_data(data[ATTR_USER], data[ATTR_SYSTEM])
restore_image = self._get_image(data[ATTR_SYSTEM])
self._restore_data(data[ATTR_USER], data[ATTR_SYSTEM], restore_image)
# check version / restore image
# Check version / restore image
version = data[ATTR_VERSION]
if not await self.instance.exists():
_LOGGER.info("Restore image for addon %s", self._id)
_LOGGER.info("Restore/Install image for addon %s", self._id)
image_file = Path(temp, 'image.tar')
if image_file.is_file():
await self.instance.import_image(image_file, version)
else:
if await self.instance.install(version):
if await self.instance.install(version, restore_image):
await self.instance.cleanup()
elif self.instance.version != version or self.legacy:
_LOGGER.info("Restore/Update image for addon %s", self._id)
await self.instance.update(version, restore_image)
else:
await self.instance.stop()
# restore data
# Restore data
def _restore_data():
"""Restore data."""
shutil.copytree(str(Path(temp, "data")), str(self.path_data))
@@ -877,9 +995,9 @@ class Addon(CoreSysAttributes):
_LOGGER.error("Can't restore AppArmor profile")
return False
# run addon
# Run add-on
if data[ATTR_STATE] == STATE_STARTED:
return await self.start()
_LOGGER.info("Finish restore for addon %s", self._id)
_LOGGER.info("Finish restore for add-on %s", self._id)
return True

View File

@@ -1,50 +1,55 @@
"""HassIO addons build environment."""
"""Hass.io add-on build environment."""
from __future__ import annotations
from pathlib import Path
from typing import TYPE_CHECKING, Dict
from .validate import SCHEMA_BUILD_CONFIG, BASE_IMAGE
from ..const import ATTR_SQUASH, ATTR_BUILD_FROM, ATTR_ARGS, META_ADDON
from ..coresys import CoreSysAttributes
from ..const import ATTR_ARGS, ATTR_BUILD_FROM, ATTR_SQUASH, META_ADDON
from ..coresys import CoreSys, CoreSysAttributes
from ..utils.json import JsonConfig
from .validate import SCHEMA_BUILD_CONFIG
if TYPE_CHECKING:
from .addon import Addon
class AddonBuild(JsonConfig, CoreSysAttributes):
"""Handle build options for addons."""
"""Handle build options for add-ons."""
def __init__(self, coresys, slug):
"""Initialize addon builder."""
self.coresys = coresys
self._id = slug
def __init__(self, coresys: CoreSys, slug: str) -> None:
"""Initialize Hass.io add-on builder."""
self.coresys: CoreSys = coresys
self._id: str = slug
super().__init__(
Path(self.addon.path_location, 'build.json'), SCHEMA_BUILD_CONFIG)
def save_data(self):
"""Ignore save function."""
pass
@property
def addon(self):
"""Return addon of build data."""
def addon(self) -> Addon:
"""Return add-on of build data."""
return self.sys_addons.get(self._id)
@property
def base_image(self):
"""Base images for this addon."""
def base_image(self) -> str:
"""Base images for this add-on."""
return self._data[ATTR_BUILD_FROM].get(
self.sys_arch, BASE_IMAGE[self.sys_arch])
self.sys_arch.default,
f"homeassistant/{self.sys_arch.default}-base:latest")
@property
def squash(self):
def squash(self) -> bool:
"""Return True or False if squash is active."""
return self._data[ATTR_SQUASH]
@property
def additional_args(self):
"""Return additional docker build arguments."""
def additional_args(self) -> Dict[str, str]:
"""Return additional Docker build arguments."""
return self._data[ATTR_ARGS]
def get_docker_args(self, version):
"""Create a dict with docker build arguments."""
"""Create a dict with Docker build arguments."""
args = {
'path': str(self.addon.path_location),
'tag': f"{self.addon.image}:{version}",
@@ -53,7 +58,7 @@ class AddonBuild(JsonConfig, CoreSysAttributes):
'squash': self.squash,
'labels': {
'io.hass.version': version,
'io.hass.arch': self.sys_arch,
'io.hass.arch': self.sys_arch.default,
'io.hass.type': META_ADDON,
'io.hass.name': self._fix_label('name'),
'io.hass.description': self._fix_label('description'),
@@ -61,7 +66,7 @@ class AddonBuild(JsonConfig, CoreSysAttributes):
'buildargs': {
'BUILD_FROM': self.base_image,
'BUILD_VERSION': version,
'BUILD_ARCH': self.sys_arch,
'BUILD_ARCH': self.sys_arch.default,
**self.additional_args,
}
}
@@ -71,7 +76,7 @@ class AddonBuild(JsonConfig, CoreSysAttributes):
return args
def _fix_label(self, label_name):
def _fix_label(self, label_name: str) -> str:
"""Remove characters they are not supported."""
label = getattr(self.addon, label_name, "")
return label.replace("'", "")

View File

@@ -1,25 +1,31 @@
"""Init file for HassIO addons."""
"""Init file for Hass.io add-on data."""
import logging
import json
from pathlib import Path
import voluptuous as vol
from voluptuous.humanize import humanize_error
from .utils import extract_hash_from_path
from .validate import (
SCHEMA_ADDON_CONFIG, SCHEMA_ADDONS_FILE, SCHEMA_REPOSITORY_CONFIG)
from ..const import (
FILE_HASSIO_ADDONS, ATTR_SLUG, ATTR_REPOSITORY, ATTR_LOCATON,
REPOSITORY_CORE, REPOSITORY_LOCAL, ATTR_USER, ATTR_SYSTEM)
ATTR_LOCATON,
ATTR_REPOSITORY,
ATTR_SLUG,
ATTR_SYSTEM,
ATTR_USER,
FILE_HASSIO_ADDONS,
REPOSITORY_CORE,
REPOSITORY_LOCAL,
)
from ..coresys import CoreSysAttributes
from ..exceptions import JsonFileError
from ..utils.json import JsonConfig, read_json_file
from .utils import extract_hash_from_path
from .validate import SCHEMA_ADDON_CONFIG, SCHEMA_ADDONS_FILE, SCHEMA_REPOSITORY_CONFIG
_LOGGER = logging.getLogger(__name__)
class AddonsData(JsonConfig, CoreSysAttributes):
"""Hold data for addons inside HassIO."""
"""Hold data for Add-ons inside Hass.io."""
def __init__(self, coresys):
"""Initialize data holder."""
@@ -30,36 +36,34 @@ class AddonsData(JsonConfig, CoreSysAttributes):
@property
def user(self):
"""Return local addon user data."""
"""Return local add-on user data."""
return self._data[ATTR_USER]
@property
def system(self):
"""Return local addon data."""
"""Return local add-on data."""
return self._data[ATTR_SYSTEM]
@property
def cache(self):
"""Return addon data from cache/repositories."""
"""Return add-on data from cache/repositories."""
return self._cache
@property
def repositories(self):
"""Return addon data from repositories."""
"""Return add-on data from repositories."""
return self._repositories
def reload(self):
"""Read data from addons repository."""
"""Read data from add-on repository."""
self._cache = {}
self._repositories = {}
# read core repository
self._read_addons_folder(
self.sys_config.path_addons_core, REPOSITORY_CORE)
self._read_addons_folder(self.sys_config.path_addons_core, REPOSITORY_CORE)
# read local repository
self._read_addons_folder(
self.sys_config.path_addons_local, REPOSITORY_LOCAL)
self._read_addons_folder(self.sys_config.path_addons_local, REPOSITORY_LOCAL)
# add built-in repositories information
self._set_builtin_repositories()
@@ -76,15 +80,12 @@ class AddonsData(JsonConfig, CoreSysAttributes):
# exists repository json
repository_file = Path(path, "repository.json")
try:
repository_info = SCHEMA_REPOSITORY_CONFIG(
read_json_file(repository_file)
repository_info = SCHEMA_REPOSITORY_CONFIG(read_json_file(repository_file))
except JsonFileError:
_LOGGER.warning(
"Can't read repository information from %s", repository_file
)
except (OSError, json.JSONDecodeError):
_LOGGER.warning("Can't read repository information from %s",
repository_file)
return
except vol.Invalid:
_LOGGER.warning("Repository parse error %s", repository_file)
return
@@ -94,43 +95,42 @@ class AddonsData(JsonConfig, CoreSysAttributes):
self._read_addons_folder(path, slug)
def _read_addons_folder(self, path, repository):
"""Read data from addons folder."""
"""Read data from add-ons folder."""
for addon in path.glob("**/config.json"):
try:
addon_config = read_json_file(addon)
except JsonFileError:
_LOGGER.warning("Can't read %s from repository %s", addon, repository)
continue
# validate
# validate
try:
addon_config = SCHEMA_ADDON_CONFIG(addon_config)
# Generate slug
addon_slug = "{}_{}".format(
repository, addon_config[ATTR_SLUG])
# store
addon_config[ATTR_REPOSITORY] = repository
addon_config[ATTR_LOCATON] = str(addon.parent)
self._cache[addon_slug] = addon_config
except (OSError, json.JSONDecodeError):
_LOGGER.warning("Can't read %s", addon)
except vol.Invalid as ex:
_LOGGER.warning("Can't read %s: %s", addon,
humanize_error(addon_config, ex))
_LOGGER.warning(
"Can't read %s: %s", addon, humanize_error(addon_config, ex)
)
continue
# Generate slug
addon_slug = "{}_{}".format(repository, addon_config[ATTR_SLUG])
# store
addon_config[ATTR_REPOSITORY] = repository
addon_config[ATTR_LOCATON] = str(addon.parent)
self._cache[addon_slug] = addon_config
def _set_builtin_repositories(self):
"""Add local built-in repository into dataset."""
try:
builtin_file = Path(__file__).parent.joinpath('built-in.json')
builtin_file = Path(__file__).parent.joinpath("built-in.json")
builtin_data = read_json_file(builtin_file)
except (OSError, json.JSONDecodeError) as err:
_LOGGER.warning("Can't read built-in json: %s", err)
except JsonFileError:
_LOGGER.warning("Can't read built-in json")
return
# core repository
self._repositories[REPOSITORY_CORE] = \
builtin_data[REPOSITORY_CORE]
self._repositories[REPOSITORY_CORE] = builtin_data[REPOSITORY_CORE]
# local repository
self._repositories[REPOSITORY_LOCAL] = \
builtin_data[REPOSITORY_LOCAL]
self._repositories[REPOSITORY_LOCAL] = builtin_data[REPOSITORY_LOCAL]

View File

@@ -1,4 +1,4 @@
"""Init file for HassIO addons git."""
"""Init file for Hass.io add-on Git."""
import asyncio
import logging
import functools as ft
@@ -16,10 +16,10 @@ _LOGGER = logging.getLogger(__name__)
class GitRepo(CoreSysAttributes):
"""Manage addons git repo."""
"""Manage Add-on Git repository."""
def __init__(self, coresys, path, url):
"""Initialize git base wrapper."""
"""Initialize Git base wrapper."""
self.coresys = coresys
self.repo = None
self.path = path
@@ -38,13 +38,13 @@ class GitRepo(CoreSysAttributes):
return self._data[ATTR_BRANCH]
async def load(self):
"""Init git addon repo."""
"""Init Git add-on repository."""
if not self.path.is_dir():
return await self.clone()
async with self.lock:
try:
_LOGGER.info("Load addon %s repository", self.path)
_LOGGER.info("Load add-on %s repository", self.path)
self.repo = await self.sys_run_in_executor(
git.Repo, str(self.path))
@@ -57,7 +57,7 @@ class GitRepo(CoreSysAttributes):
return True
async def clone(self):
"""Clone git addon repo."""
"""Clone git add-on repository."""
async with self.lock:
git_args = {
attribute: value
@@ -70,7 +70,7 @@ class GitRepo(CoreSysAttributes):
}
try:
_LOGGER.info("Clone addon %s repository", self.url)
_LOGGER.info("Clone add-on %s repository", self.url)
self.repo = await self.sys_run_in_executor(ft.partial(
git.Repo.clone_from, self.url, str(self.path),
**git_args
@@ -78,20 +78,20 @@ class GitRepo(CoreSysAttributes):
except (git.InvalidGitRepositoryError, git.NoSuchPathError,
git.GitCommandError) as err:
_LOGGER.error("Can't clone %s repo: %s.", self.url, err)
_LOGGER.error("Can't clone %s repository: %s.", self.url, err)
self._remove()
return False
return True
async def pull(self):
"""Pull git addon repo."""
"""Pull Git add-on repo."""
if self.lock.locked():
_LOGGER.warning("It is already a task in progress.")
_LOGGER.warning("It is already a task in progress")
return False
async with self.lock:
_LOGGER.info("Update addon %s repository", self.url)
_LOGGER.info("Update add-on %s repository", self.url)
branch = self.repo.active_branch.name
try:
@@ -130,19 +130,19 @@ class GitRepo(CoreSysAttributes):
class GitRepoHassIO(GitRepo):
"""HassIO addons repository."""
"""Hass.io add-ons repository."""
def __init__(self, coresys):
"""Initialize git hassio addon repository."""
"""Initialize Git Hass.io add-on repository."""
super().__init__(
coresys, coresys.config.path_addons_core, URL_HASSIO_ADDONS)
class GitRepoCustom(GitRepo):
"""Custom addons repository."""
"""Custom add-ons repository."""
def __init__(self, coresys, url):
"""Initialize git hassio addon repository."""
"""Initialize custom Git Hass.io addo-n repository."""
path = Path(
coresys.config.path_addons_git,
get_hash_from_repository(url))
@@ -151,5 +151,5 @@ class GitRepoCustom(GitRepo):
def remove(self):
"""Remove a custom repository."""
_LOGGER.info("Remove custom addon repository %s", self.url)
_LOGGER.info("Remove custom add-on repository %s", self.url)
self._remove()

View File

@@ -1,15 +1,16 @@
"""Represent a HassIO repository."""
"""Represent a Hass.io repository."""
from .git import GitRepoHassIO, GitRepoCustom
from .utils import get_hash_from_repository
from ..const import (
REPOSITORY_CORE, REPOSITORY_LOCAL, ATTR_NAME, ATTR_URL, ATTR_MAINTAINER)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError
UNKNOWN = 'unknown'
class Repository(CoreSysAttributes):
"""Repository in HassIO."""
"""Repository in Hass.io."""
def __init__(self, coresys, repository):
"""Initialize repository object."""
@@ -44,7 +45,7 @@ class Repository(CoreSysAttributes):
@property
def url(self):
"""Return url of repository."""
"""Return URL of repository."""
return self._mesh.get(ATTR_URL, self.source)
@property
@@ -59,14 +60,14 @@ class Repository(CoreSysAttributes):
return True
async def update(self):
"""Update addon repository."""
"""Update add-on repository."""
if self.git:
return await self.git.pull()
return True
def remove(self):
"""Remove addon repository."""
"""Remove add-on repository."""
if self._id in (REPOSITORY_CORE, REPOSITORY_LOCAL):
raise RuntimeError("Can't remove built-in repositories!")
raise APIError("Can't remove built-in repositories!")
self.git.remove()

View File

@@ -1,31 +1,108 @@
"""Util addons functions."""
"""Util add-ons functions."""
from __future__ import annotations
import asyncio
import hashlib
import logging
from pathlib import Path
import re
from typing import TYPE_CHECKING
from ..const import (
PRIVILEGED_DAC_READ_SEARCH,
PRIVILEGED_NET_ADMIN,
PRIVILEGED_SYS_ADMIN,
PRIVILEGED_SYS_MODULE,
PRIVILEGED_SYS_PTRACE,
PRIVILEGED_SYS_RAWIO,
ROLE_ADMIN,
ROLE_MANAGER,
SECURITY_DISABLE,
SECURITY_PROFILE,
)
if TYPE_CHECKING:
from .addon import Addon
RE_SHA1 = re.compile(r"[a-f0-9]{8}")
_LOGGER = logging.getLogger(__name__)
def get_hash_from_repository(name):
def rating_security(addon: Addon) -> int:
"""Return 1-6 for security rating.
1 = not secure
6 = highly secure
"""
rating = 5
# AppArmor
if addon.apparmor == SECURITY_DISABLE:
rating += -1
elif addon.apparmor == SECURITY_PROFILE:
rating += 1
# Home Assistant Login
if addon.access_auth_api:
rating += 1
# Privileged options
if any(
privilege in addon.privileged
for privilege in (
PRIVILEGED_NET_ADMIN,
PRIVILEGED_SYS_ADMIN,
PRIVILEGED_SYS_RAWIO,
PRIVILEGED_SYS_PTRACE,
PRIVILEGED_SYS_MODULE,
PRIVILEGED_DAC_READ_SEARCH,
)
):
rating += -1
# API Hass.io role
if addon.hassio_role == ROLE_MANAGER:
rating += -1
elif addon.hassio_role == ROLE_ADMIN:
rating += -2
# Not secure Networking
if addon.host_network:
rating += -1
# Insecure PID namespace
if addon.host_pid:
rating += -2
# Full Access
if addon.with_full_access:
rating += -2
# Docker Access
if addon.access_docker_api:
rating = 1
return max(min(6, rating), 1)
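To make the scoring concrete, a small walk-through of the arithmetic for a hypothetical add-on; the combination of flags is invented and not taken from any add-on in the diff.
rating = 5
rating += 1                      # ships its own AppArmor profile (SECURITY_PROFILE)
rating -= 1                      # uses host networking
rating -= 2                      # requests the admin Hass.io API role
print(max(min(6, rating), 1))    # -> 3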
def get_hash_from_repository(name: str) -> str:
"""Generate a hash from repository."""
key = name.lower().encode()
return hashlib.sha1(key).hexdigest()[:8]
def extract_hash_from_path(path):
def extract_hash_from_path(path: Path) -> str:
"""Extract repo id from path."""
repo_dir = path.parts[-1]
repository_dir = path.parts[-1]
if not RE_SHA1.match(repo_dir):
return get_hash_from_repository(repo_dir)
return repo_dir
if not RE_SHA1.match(repository_dir):
return get_hash_from_repository(repository_dir)
return repository_dir
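A short, self-contained illustration of how the two helpers above relate (the URL and directory layout are invented): a repository URL is reduced to the first eight hex characters of its SHA-1, and a directory already named with such an id is returned unchanged.
import hashlib
import re
from pathlib import Path

RE_SHA1 = re.compile(r"[a-f0-9]{8}")
url = "https://example.com/addons-repo.git"                    # invented URL
repo_id = hashlib.sha1(url.lower().encode()).hexdigest()[:8]   # get_hash_from_repository
path = Path("/data/addons/git") / repo_id                      # directory layout invented
print(bool(RE_SHA1.match(path.parts[-1])))                     # True -> returned unchanged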
def check_installed(method):
"""Wrap function with check if addon is installed."""
"""Wrap function with check if add-on is installed."""
async def wrap_check(addon, *args, **kwargs):
"""Return False if not installed or the function."""
if not addon.is_installed:
@@ -36,18 +113,18 @@ def check_installed(method):
return wrap_check
async def remove_data(folder):
async def remove_data(folder: Path) -> None:
"""Remove folder and reset privileged."""
try:
proc = await asyncio.create_subprocess_exec(
"rm", "-rf", str(folder),
stdout=asyncio.subprocess.DEVNULL
"rm", "-rf", str(folder), stdout=asyncio.subprocess.DEVNULL
)
_, error_msg = await proc.communicate()
except OSError as err:
error_msg = str(err)
else:
if proc.returncode == 0:
return
if proc.returncode == 0:
return
_LOGGER.error("Can't remove Add-on Data: %s", error_msg)

View File

@@ -1,4 +1,4 @@
"""Validate addons options schema."""
"""Validate add-ons options schema."""
import logging
import re
import uuid
@@ -6,27 +6,30 @@ import uuid
import voluptuous as vol
from ..const import (
ATTR_NAME, ATTR_VERSION, ATTR_SLUG, ATTR_DESCRIPTON, ATTR_STARTUP,
ATTR_BOOT, ATTR_MAP, ATTR_OPTIONS, ATTR_PORTS, STARTUP_ONCE,
STARTUP_SYSTEM, STARTUP_SERVICES, STARTUP_APPLICATION, STARTUP_INITIALIZE,
BOOT_AUTO, BOOT_MANUAL, ATTR_SCHEMA, ATTR_IMAGE, ATTR_URL, ATTR_MAINTAINER,
ATTR_ARCH, ATTR_DEVICES, ATTR_ENVIRONMENT, ATTR_HOST_NETWORK, ARCH_ARMHF,
ARCH_AARCH64, ARCH_AMD64, ARCH_I386, ATTR_TMPFS, ATTR_PRIVILEGED,
ATTR_USER, ATTR_STATE, ATTR_SYSTEM, STATE_STARTED, STATE_STOPPED,
ATTR_LOCATON, ATTR_REPOSITORY, ATTR_TIMEOUT, ATTR_NETWORK, ATTR_UUID,
ATTR_AUTO_UPDATE, ATTR_WEBUI, ATTR_AUDIO, ATTR_AUDIO_INPUT, ATTR_HOST_IPC,
ATTR_AUDIO_OUTPUT, ATTR_HASSIO_API, ATTR_BUILD_FROM, ATTR_SQUASH,
ATTR_ARGS, ATTR_GPIO, ATTR_HOMEASSISTANT_API, ATTR_STDIN, ATTR_LEGACY,
ATTR_HOST_DBUS, ATTR_AUTO_UART, ATTR_SERVICES, ATTR_DISCOVERY,
ATTR_APPARMOR, ATTR_DEVICETREE)
from ..validate import NETWORK_PORT, DOCKER_PORTS, ALSA_DEVICE
ARCH_ALL, ATTR_ACCESS_TOKEN, ATTR_APPARMOR, ATTR_ARCH, ATTR_ARGS,
ATTR_AUDIO, ATTR_AUDIO_INPUT, ATTR_AUDIO_OUTPUT, ATTR_AUTH_API,
ATTR_AUTO_UART, ATTR_AUTO_UPDATE, ATTR_BOOT, ATTR_BUILD_FROM,
ATTR_DESCRIPTON, ATTR_DEVICES, ATTR_DEVICETREE, ATTR_DISCOVERY,
ATTR_DOCKER_API, ATTR_ENVIRONMENT, ATTR_FULL_ACCESS, ATTR_GPIO,
ATTR_HASSIO_API, ATTR_HASSIO_ROLE, ATTR_HOMEASSISTANT_API, ATTR_HOST_DBUS,
ATTR_HOST_IPC, ATTR_HOST_NETWORK, ATTR_HOST_PID, ATTR_IMAGE,
ATTR_KERNEL_MODULES, ATTR_LEGACY, ATTR_LOCATON, ATTR_MACHINE,
ATTR_MAINTAINER, ATTR_MAP, ATTR_NAME, ATTR_NETWORK, ATTR_OPTIONS,
ATTR_PORTS, ATTR_PRIVILEGED, ATTR_PROTECTED, ATTR_REPOSITORY, ATTR_SCHEMA,
ATTR_SERVICES, ATTR_SLUG, ATTR_SQUASH, ATTR_STARTUP, ATTR_STATE,
ATTR_STDIN, ATTR_SYSTEM, ATTR_TIMEOUT, ATTR_TMPFS, ATTR_URL, ATTR_USER,
ATTR_UUID, ATTR_VERSION, ATTR_WEBUI, BOOT_AUTO, BOOT_MANUAL,
PRIVILEGED_ALL, ROLE_ALL, ROLE_DEFAULT, STARTUP_ALL, STARTUP_APPLICATION,
STARTUP_SERVICES, STATE_STARTED, STATE_STOPPED)
from ..services.validate import DISCOVERY_SERVICES
from ..validate import (
ALSA_DEVICE, DOCKER_PORTS, NETWORK_PORT, SHA256, UUID_MATCH)
_LOGGER = logging.getLogger(__name__)
RE_VOLUME = re.compile(r"^(config|ssl|addons|backup|share)(?::(rw|:ro))?$")
RE_SERVICE = re.compile(r"^(?P<service>mqtt)(?::(?P<rights>rw|:ro))?$")
RE_DISCOVERY = re.compile(r"^(?P<component>\w*)(?:/(?P<platform>\w*>))?$")
RE_VOLUME = re.compile(r"^(config|ssl|addons|backup|share)(?::(rw|ro))?$")
RE_SERVICE = re.compile(r"^(?P<service>mqtt):(?P<rights>provide|want|need)$")
V_STR = 'str'
V_INT = 'int'
@@ -46,33 +49,20 @@ RE_SCHEMA_ELEMENT = re.compile(
r")\??$"
)
RE_DOCKER_IMAGE = re.compile(
r"^([a-zA-Z\-\.:\d{}]+/)*?([\-\w{}]+)/([\-\w{}]+)$")
RE_DOCKER_IMAGE_BUILD = re.compile(
r"^([a-zA-Z\-\.:\d{}]+/)*?([\-\w{}]+)/([\-\w{}]+)(:[\.\-\w{}]+)?$")
SCHEMA_ELEMENT = vol.Match(RE_SCHEMA_ELEMENT)
ARCH_ALL = [
ARCH_ARMHF, ARCH_AARCH64, ARCH_AMD64, ARCH_I386
]
STARTUP_ALL = [
STARTUP_ONCE, STARTUP_INITIALIZE, STARTUP_SYSTEM, STARTUP_SERVICES,
STARTUP_APPLICATION
MACHINE_ALL = [
'intel-nuc', 'odroid-c2', 'odroid-xu', 'orangepi-prime', 'qemux86',
'qemux86-64', 'qemuarm', 'qemuarm-64', 'raspberrypi', 'raspberrypi2',
'raspberrypi3', 'raspberrypi3-64', 'tinker',
]
PRIVILEGED_ALL = [
"NET_ADMIN",
"SYS_ADMIN",
"SYS_RAWIO",
"IPC_LOCK",
"SYS_TIME",
"SYS_NICE"
]
BASE_IMAGE = {
ARCH_ARMHF: "homeassistant/armhf-base:latest",
ARCH_AARCH64: "homeassistant/aarch64-base:latest",
ARCH_I386: "homeassistant/i386-base:latest",
ARCH_AMD64: "homeassistant/amd64-base:latest",
}
def _simple_startup(value):
"""Simple startup schema."""
@@ -91,6 +81,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Required(ATTR_DESCRIPTON): vol.Coerce(str),
vol.Optional(ATTR_URL): vol.Url(),
vol.Optional(ATTR_ARCH, default=ARCH_ALL): [vol.In(ARCH_ALL)],
vol.Optional(ATTR_MACHINE): [vol.In(MACHINE_ALL)],
vol.Required(ATTR_STARTUP):
vol.All(_simple_startup, vol.In(STARTUP_ALL)),
vol.Required(ATTR_BOOT):
@@ -99,6 +90,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Optional(ATTR_WEBUI):
vol.Match(r"^(?:https?|\[PROTO:\w+\]):\/\/\[HOST\]:\[PORT:\d+\].*$"),
vol.Optional(ATTR_HOST_NETWORK, default=False): vol.Boolean(),
vol.Optional(ATTR_HOST_PID, default=False): vol.Boolean(),
vol.Optional(ATTR_HOST_IPC, default=False): vol.Boolean(),
vol.Optional(ATTR_HOST_DBUS, default=False): vol.Boolean(),
vol.Optional(ATTR_DEVICES): [vol.Match(r"^(.*):(.*):([rwm]{1,3})$")],
@@ -109,15 +101,20 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Optional(ATTR_ENVIRONMENT): {vol.Match(r"\w*"): vol.Coerce(str)},
vol.Optional(ATTR_PRIVILEGED): [vol.In(PRIVILEGED_ALL)],
vol.Optional(ATTR_APPARMOR, default=True): vol.Boolean(),
vol.Optional(ATTR_FULL_ACCESS, default=False): vol.Boolean(),
vol.Optional(ATTR_AUDIO, default=False): vol.Boolean(),
vol.Optional(ATTR_GPIO, default=False): vol.Boolean(),
vol.Optional(ATTR_DEVICETREE, default=False): vol.Boolean(),
vol.Optional(ATTR_KERNEL_MODULES, default=False): vol.Boolean(),
vol.Optional(ATTR_HASSIO_API, default=False): vol.Boolean(),
vol.Optional(ATTR_HASSIO_ROLE, default=ROLE_DEFAULT): vol.In(ROLE_ALL),
vol.Optional(ATTR_HOMEASSISTANT_API, default=False): vol.Boolean(),
vol.Optional(ATTR_STDIN, default=False): vol.Boolean(),
vol.Optional(ATTR_LEGACY, default=False): vol.Boolean(),
vol.Optional(ATTR_DOCKER_API, default=False): vol.Boolean(),
vol.Optional(ATTR_AUTH_API, default=False): vol.Boolean(),
vol.Optional(ATTR_SERVICES): [vol.Match(RE_SERVICE)],
vol.Optional(ATTR_DISCOVERY): [vol.Match(RE_DISCOVERY)],
vol.Optional(ATTR_DISCOVERY): [vol.In(DISCOVERY_SERVICES)],
vol.Required(ATTR_OPTIONS): dict,
vol.Required(ATTR_SCHEMA): vol.Any(vol.Schema({
vol.Coerce(str): vol.Any(SCHEMA_ELEMENT, [
@@ -129,7 +126,8 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Coerce(str): vol.Any(SCHEMA_ELEMENT, [SCHEMA_ELEMENT])
}))
}), False),
vol.Optional(ATTR_IMAGE): vol.Match(r"^[\w{}]+/[\-\w{}]+$"),
vol.Optional(ATTR_IMAGE):
vol.Match(RE_DOCKER_IMAGE),
vol.Optional(ATTR_TIMEOUT, default=10):
vol.All(vol.Coerce(int), vol.Range(min=10, max=120)),
}, extra=vol.REMOVE_EXTRA)
@@ -145,8 +143,8 @@ SCHEMA_REPOSITORY_CONFIG = vol.Schema({
# pylint: disable=no-value-for-parameter
SCHEMA_BUILD_CONFIG = vol.Schema({
vol.Optional(ATTR_BUILD_FROM, default=BASE_IMAGE): vol.Schema({
vol.In(ARCH_ALL): vol.Match(r"(?:^[\w{}]+/)?[\-\w{}]+:[\.\-\w{}]+$"),
vol.Optional(ATTR_BUILD_FROM, default=dict): vol.Schema({
vol.In(ARCH_ALL): vol.Match(RE_DOCKER_IMAGE_BUILD),
}),
vol.Optional(ATTR_SQUASH, default=False): vol.Boolean(),
vol.Optional(ATTR_ARGS, default=dict): vol.Schema({
@@ -158,8 +156,9 @@ SCHEMA_BUILD_CONFIG = vol.Schema({
# pylint: disable=no-value-for-parameter
SCHEMA_ADDON_USER = vol.Schema({
vol.Required(ATTR_VERSION): vol.Coerce(str),
vol.Optional(ATTR_UUID, default=lambda: uuid.uuid4().hex):
vol.Match(r"^[0-9a-f]{32}$"),
vol.Optional(ATTR_IMAGE): vol.Coerce(str),
vol.Optional(ATTR_UUID, default=lambda: uuid.uuid4().hex): UUID_MATCH,
vol.Optional(ATTR_ACCESS_TOKEN): SHA256,
vol.Optional(ATTR_OPTIONS, default=dict): dict,
vol.Optional(ATTR_AUTO_UPDATE, default=False): vol.Boolean(),
vol.Optional(ATTR_BOOT):
@@ -167,6 +166,7 @@ SCHEMA_ADDON_USER = vol.Schema({
vol.Optional(ATTR_NETWORK): DOCKER_PORTS,
vol.Optional(ATTR_AUDIO_OUTPUT): ALSA_DEVICE,
vol.Optional(ATTR_AUDIO_INPUT): ALSA_DEVICE,
vol.Optional(ATTR_PROTECTED, default=True): vol.Boolean(),
}, extra=vol.REMOVE_EXTRA)
@@ -197,7 +197,7 @@ SCHEMA_ADDON_SNAPSHOT = vol.Schema({
def validate_options(raw_schema):
"""Validate schema."""
def validate(struct):
"""Create schema validator for addons options."""
"""Create schema validator for add-ons options."""
options = {}
# read options

View File

@@ -1,15 +1,17 @@
"""Init file for HassIO rest api."""
"""Init file for Hass.io RESTful API."""
import logging
from pathlib import Path
from aiohttp import web
from .addons import APIAddons
from .auth import APIAuth
from .discovery import APIDiscovery
from .homeassistant import APIHomeAssistant
from .hardware import APIHardware
from .host import APIHost
from .hassos import APIHassOS
from .info import APIInfo
from .proxy import APIProxy
from .supervisor import APISupervisor
from .snapshots import APISnapshots
@@ -21,14 +23,14 @@ _LOGGER = logging.getLogger(__name__)
class RestAPI(CoreSysAttributes):
"""Handle rest api for hassio."""
"""Handle RESTful API for Hass.io."""
def __init__(self, coresys):
"""Initialize docker base wrapper."""
"""Initialize Docker base wrapper."""
self.coresys = coresys
self.security = SecurityMiddleware(coresys)
self.webapp = web.Application(
middlewares=[self.security.token_validation], loop=coresys.loop)
middlewares=[self.security.token_validation])
# service stuff
self._runner = web.AppRunner(self.webapp)
@@ -47,9 +49,11 @@ class RestAPI(CoreSysAttributes):
self._register_snapshots()
self._register_discovery()
self._register_services()
self._register_info()
self._register_auth()
def _register_host(self):
"""Register hostcontrol function."""
"""Register hostcontrol functions."""
api_host = APIHost()
api_host.coresys = self.coresys
@@ -62,14 +66,14 @@ class RestAPI(CoreSysAttributes):
web.get('/host/services', api_host.services),
web.post('/host/services/{service}/stop', api_host.service_stop),
web.post('/host/services/{service}/start', api_host.service_start),
web.post(
'/host/services/{service}/restart', api_host.service_restart),
web.post(
'/host/services/{service}/reload', api_host.service_reload),
web.post('/host/services/{service}/restart',
api_host.service_restart),
web.post('/host/services/{service}/reload',
api_host.service_reload),
])
def _register_hassos(self):
"""Register hassos function."""
"""Register HassOS functions."""
api_hassos = APIHassOS()
api_hassos.coresys = self.coresys
@@ -81,7 +85,7 @@ class RestAPI(CoreSysAttributes):
])
def _register_hardware(self):
"""Register hardware function."""
"""Register hardware functions."""
api_hardware = APIHardware()
api_hardware.coresys = self.coresys
@@ -90,8 +94,26 @@ class RestAPI(CoreSysAttributes):
web.get('/hardware/audio', api_hardware.audio),
])
def _register_info(self):
"""Register info functions."""
api_info = APIInfo()
api_info.coresys = self.coresys
self.webapp.add_routes([
web.get('/info', api_info.info),
])
def _register_auth(self):
"""Register auth functions."""
api_auth = APIAuth()
api_auth.coresys = self.coresys
self.webapp.add_routes([
web.post('/auth', api_auth.auth),
])
def _register_supervisor(self):
"""Register supervisor function."""
"""Register Supervisor functions."""
api_supervisor = APISupervisor()
api_supervisor.coresys = self.coresys
@@ -106,7 +128,7 @@ class RestAPI(CoreSysAttributes):
])
def _register_homeassistant(self):
"""Register homeassistant function."""
"""Register Home Assistant functions."""
api_hass = APIHomeAssistant()
api_hass.coresys = self.coresys
@@ -123,7 +145,7 @@ class RestAPI(CoreSysAttributes):
])
def _register_proxy(self):
"""Register HomeAssistant API Proxy."""
"""Register Home Assistant API Proxy."""
api_proxy = APIProxy()
api_proxy.coresys = self.coresys
@@ -137,7 +159,7 @@ class RestAPI(CoreSysAttributes):
])
def _register_addons(self):
"""Register homeassistant function."""
"""Register Add-on functions."""
api_addons = APIAddons()
api_addons.coresys = self.coresys
@@ -158,11 +180,12 @@ class RestAPI(CoreSysAttributes):
web.get('/addons/{addon}/logo', api_addons.logo),
web.get('/addons/{addon}/changelog', api_addons.changelog),
web.post('/addons/{addon}/stdin', api_addons.stdin),
web.post('/addons/{addon}/security', api_addons.security),
web.get('/addons/{addon}/stats', api_addons.stats),
])
def _register_snapshots(self):
"""Register snapshots function."""
"""Register snapshots functions."""
api_snapshots = APISnapshots()
api_snapshots.coresys = self.coresys
@@ -182,6 +205,7 @@ class RestAPI(CoreSysAttributes):
])
def _register_services(self):
"""Register services functions."""
api_services = APIServices()
api_services.coresys = self.coresys
@@ -193,19 +217,19 @@ class RestAPI(CoreSysAttributes):
])
def _register_discovery(self):
"""Register discovery functions."""
api_discovery = APIDiscovery()
api_discovery.coresys = self.coresys
self.webapp.add_routes([
web.get('/services/discovery', api_discovery.list),
web.get('/services/discovery/{uuid}', api_discovery.get_discovery),
web.delete('/services/discovery/{uuid}',
api_discovery.del_discovery),
web.post('/services/discovery', api_discovery.set_discovery),
web.get('/discovery', api_discovery.list),
web.get('/discovery/{uuid}', api_discovery.get_discovery),
web.delete('/discovery/{uuid}', api_discovery.del_discovery),
web.post('/discovery', api_discovery.set_discovery),
])
def _register_panel(self):
"""Register panel for homeassistant."""
"""Register panel for Home Assistant."""
panel_dir = Path(__file__).parent.joinpath("panel")
def create_response(panel_file):
@@ -214,8 +238,8 @@ class RestAPI(CoreSysAttributes):
return lambda request: web.FileResponse(path)
# This route is for backwards compatibility with HA < 0.58
self.webapp.add_routes([
web.get('/panel', create_response('hassio-main-es5'))])
self.webapp.add_routes(
[web.get('/panel', create_response('hassio-main-es5'))])
# This route is for backwards compatibility with HA 0.58 - 0.61
self.webapp.add_routes([
@@ -233,7 +257,7 @@ class RestAPI(CoreSysAttributes):
self.webapp.add_routes([web.static('/app', panel_dir)])
async def start(self):
"""Run rest api webserver."""
"""Run RESTful API webserver."""
await self._runner.setup()
self._site = web.TCPSite(
self._runner, host="0.0.0.0", port=80, shutdown_timeout=5)
@@ -241,13 +265,13 @@ class RestAPI(CoreSysAttributes):
try:
await self._site.start()
except OSError as err:
_LOGGER.fatal(
"Failed to create HTTP server at 0.0.0.0:80 -> %s", err)
_LOGGER.fatal("Failed to create HTTP server at 0.0.0.0:80 -> %s",
err)
else:
_LOGGER.info("Start API on %s", self.sys_docker.network.supervisor)
async def stop(self):
"""Stop rest api webserver."""
"""Stop RESTful API webserver."""
if not self._site:
return

View File

@@ -1,4 +1,4 @@
"""Init file for HassIO homeassistant rest api."""
"""Init file for Hass.io Home Assistant RESTful API."""
import asyncio
import logging
@@ -6,6 +6,7 @@ import voluptuous as vol
from voluptuous.humanize import humanize_error
from .utils import api_process, api_process_raw, api_validate
from ..addons.utils import rating_security
from ..const import (
ATTR_VERSION, ATTR_LAST_VERSION, ATTR_STATE, ATTR_BOOT, ATTR_OPTIONS,
ATTR_URL, ATTR_DESCRIPTON, ATTR_DETACHED, ATTR_NAME, ATTR_REPOSITORY,
@@ -17,10 +18,14 @@ from ..const import (
ATTR_CHANGELOG, ATTR_HOST_IPC, ATTR_HOST_DBUS, ATTR_LONG_DESCRIPTION,
ATTR_CPU_PERCENT, ATTR_MEMORY_LIMIT, ATTR_MEMORY_USAGE, ATTR_NETWORK_TX,
ATTR_NETWORK_RX, ATTR_BLK_READ, ATTR_BLK_WRITE, ATTR_ICON, ATTR_SERVICES,
ATTR_DISCOVERY, ATTR_APPARMOR, ATTR_DEVICETREE,
CONTENT_TYPE_PNG, CONTENT_TYPE_BINARY, CONTENT_TYPE_TEXT)
ATTR_DISCOVERY, ATTR_APPARMOR, ATTR_DEVICETREE, ATTR_DOCKER_API,
ATTR_FULL_ACCESS, ATTR_PROTECTED, ATTR_RATING, ATTR_HOST_PID,
ATTR_HASSIO_ROLE, ATTR_MACHINE, ATTR_AVAILABLE, ATTR_AUTH_API,
ATTR_KERNEL_MODULES,
CONTENT_TYPE_PNG, CONTENT_TYPE_BINARY, CONTENT_TYPE_TEXT, REQUEST_FROM)
from ..coresys import CoreSysAttributes
from ..validate import DOCKER_PORTS, ALSA_DEVICE
from ..exceptions import APIError
_LOGGER = logging.getLogger(__name__)
@@ -37,32 +42,35 @@ SCHEMA_OPTIONS = vol.Schema({
vol.Optional(ATTR_AUDIO_INPUT): ALSA_DEVICE,
})
# pylint: disable=no-value-for-parameter
SCHEMA_SECURITY = vol.Schema({
vol.Optional(ATTR_PROTECTED): vol.Boolean(),
})
class APIAddons(CoreSysAttributes):
"""Handle rest api for addons functions."""
"""Handle RESTful API for add-on functions."""
def _extract_addon(self, request, check_installed=True):
"""Return addon, throw an exception it it doesn't exist."""
addon = self.sys_addons.get(request.match_info.get('addon'))
addon_slug = request.match_info.get('addon')
# Lookup itself
if addon_slug == 'self':
return request.get(REQUEST_FROM)
addon = self.sys_addons.get(addon_slug)
if not addon:
raise RuntimeError("Addon does not exist")
raise APIError("Addon does not exist")
if check_installed and not addon.is_installed:
raise RuntimeError("Addon is not installed")
raise APIError("Addon is not installed")
return addon
@staticmethod
def _pretty_devices(addon):
"""Return a simplified device list."""
dev_list = addon.devices
if not dev_list:
return None
return [row.split(':')[0] for row in dev_list]
@api_process
async def list(self, request):
"""Return all addons / repositories ."""
"""Return all add-ons or repositories."""
data_addons = []
for addon in self.sys_addons.list_addons:
data_addons.append({
@@ -71,7 +79,7 @@ class APIAddons(CoreSysAttributes):
ATTR_DESCRIPTON: addon.description,
ATTR_VERSION: addon.last_version,
ATTR_INSTALLED: addon.version_installed,
ATTR_ARCH: addon.supported_arch,
ATTR_AVAILABLE: addon.available,
ATTR_DETACHED: addon.is_detached,
ATTR_REPOSITORY: addon.repository,
ATTR_BUILD: addon.need_build,
@@ -97,13 +105,13 @@ class APIAddons(CoreSysAttributes):
@api_process
async def reload(self, request):
"""Reload all addons data."""
"""Reload all add-on data."""
await asyncio.shield(self.sys_addons.reload())
return True
@api_process
async def info(self, request):
"""Return addon information."""
"""Return add-on information."""
addon = self._extract_addon(request, check_installed=False)
return {
@@ -116,37 +124,48 @@ class APIAddons(CoreSysAttributes):
ATTR_REPOSITORY: addon.repository,
ATTR_LAST_VERSION: addon.last_version,
ATTR_STATE: await addon.state(),
ATTR_PROTECTED: addon.protected,
ATTR_RATING: rating_security(addon),
ATTR_BOOT: addon.boot,
ATTR_OPTIONS: addon.options,
ATTR_ARCH: addon.supported_arch,
ATTR_MACHINE: addon.supported_machine,
ATTR_URL: addon.url,
ATTR_DETACHED: addon.is_detached,
ATTR_AVAILABLE: addon.available,
ATTR_BUILD: addon.need_build,
ATTR_NETWORK: addon.ports,
ATTR_HOST_NETWORK: addon.host_network,
ATTR_HOST_PID: addon.host_pid,
ATTR_HOST_IPC: addon.host_ipc,
ATTR_HOST_DBUS: addon.host_dbus,
ATTR_PRIVILEGED: addon.privileged,
ATTR_FULL_ACCESS: addon.with_full_access,
ATTR_APPARMOR: addon.apparmor,
ATTR_DEVICES: self._pretty_devices(addon),
ATTR_DEVICES: _pretty_devices(addon),
ATTR_ICON: addon.with_icon,
ATTR_LOGO: addon.with_logo,
ATTR_CHANGELOG: addon.with_changelog,
ATTR_WEBUI: addon.webui,
ATTR_STDIN: addon.with_stdin,
ATTR_HASSIO_API: addon.access_hassio_api,
ATTR_HASSIO_ROLE: addon.hassio_role,
ATTR_AUTH_API: addon.access_auth_api,
ATTR_HOMEASSISTANT_API: addon.access_homeassistant_api,
ATTR_GPIO: addon.with_gpio,
ATTR_KERNEL_MODULES: addon.with_kernel_modules,
ATTR_DEVICETREE: addon.with_devicetree,
ATTR_DOCKER_API: addon.access_docker_api,
ATTR_AUDIO: addon.with_audio,
ATTR_AUDIO_INPUT: addon.audio_input,
ATTR_AUDIO_OUTPUT: addon.audio_output,
ATTR_SERVICES: addon.services,
ATTR_SERVICES: _pretty_services(addon),
ATTR_DISCOVERY: addon.discovery,
}
@api_process
async def options(self, request):
"""Store user options for addon."""
"""Store user options for add-on."""
addon = self._extract_addon(request)
addon_schema = SCHEMA_OPTIONS.extend({
@@ -171,6 +190,19 @@ class APIAddons(CoreSysAttributes):
addon.save_data()
return True
@api_process
async def security(self, request):
"""Store security options for add-on."""
addon = self._extract_addon(request)
body = await api_validate(SCHEMA_SECURITY, request)
if ATTR_PROTECTED in body:
_LOGGER.warning("Protected flag changing for %s!", addon.slug)
addon.protected = body[ATTR_PROTECTED]
addon.save_data()
return True
@api_process
async def stats(self, request):
"""Return resource information."""
@@ -178,7 +210,7 @@ class APIAddons(CoreSysAttributes):
stats = await addon.stats()
if not stats:
raise RuntimeError("No stats available")
raise APIError("No stats available")
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
@@ -192,19 +224,19 @@ class APIAddons(CoreSysAttributes):
@api_process
def install(self, request):
"""Install addon."""
"""Install add-on."""
addon = self._extract_addon(request, check_installed=False)
return asyncio.shield(addon.install())
@api_process
def uninstall(self, request):
"""Uninstall addon."""
"""Uninstall add-on."""
addon = self._extract_addon(request)
return asyncio.shield(addon.uninstall())
@api_process
def start(self, request):
"""Start addon."""
"""Start add-on."""
addon = self._extract_addon(request)
# check options
@@ -212,83 +244,99 @@ class APIAddons(CoreSysAttributes):
try:
addon.schema(options)
except vol.Invalid as ex:
raise RuntimeError(humanize_error(options, ex)) from None
raise APIError(humanize_error(options, ex)) from None
return asyncio.shield(addon.start())
@api_process
def stop(self, request):
"""Stop addon."""
"""Stop add-on."""
addon = self._extract_addon(request)
return asyncio.shield(addon.stop())
@api_process
def update(self, request):
"""Update addon."""
"""Update add-on."""
addon = self._extract_addon(request)
if addon.last_version == addon.version_installed:
raise RuntimeError("No update available!")
raise APIError("No update available!")
return asyncio.shield(addon.update())
@api_process
def restart(self, request):
"""Restart addon."""
"""Restart add-on."""
addon = self._extract_addon(request)
return asyncio.shield(addon.restart())
@api_process
def rebuild(self, request):
"""Rebuild local build addon."""
"""Rebuild local build add-on."""
addon = self._extract_addon(request)
if not addon.need_build:
raise RuntimeError("Only local build addons are supported")
raise APIError("Only local build addons are supported")
return asyncio.shield(addon.rebuild())
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request):
"""Return logs from addon."""
"""Return logs from add-on."""
addon = self._extract_addon(request)
return addon.logs()
@api_process_raw(CONTENT_TYPE_PNG)
async def icon(self, request):
"""Return icon from addon."""
"""Return icon from add-on."""
addon = self._extract_addon(request, check_installed=False)
if not addon.with_icon:
raise RuntimeError("No icon found!")
raise APIError("No icon found!")
with addon.path_icon.open('rb') as png:
return png.read()
@api_process_raw(CONTENT_TYPE_PNG)
async def logo(self, request):
"""Return logo from addon."""
"""Return logo from add-on."""
addon = self._extract_addon(request, check_installed=False)
if not addon.with_logo:
raise RuntimeError("No logo found!")
raise APIError("No logo found!")
with addon.path_logo.open('rb') as png:
return png.read()
@api_process_raw(CONTENT_TYPE_TEXT)
async def changelog(self, request):
"""Return changelog from addon."""
"""Return changelog from add-on."""
addon = self._extract_addon(request, check_installed=False)
if not addon.with_changelog:
raise RuntimeError("No changelog found!")
raise APIError("No changelog found!")
with addon.path_changelog.open('r') as changelog:
return changelog.read()
@api_process
async def stdin(self, request):
"""Write to stdin of addon."""
"""Write to stdin of add-on."""
addon = self._extract_addon(request)
if not addon.with_stdin:
raise RuntimeError("STDIN not supported by addon")
raise APIError("STDIN not supported by add-on")
data = await request.read()
return await asyncio.shield(addon.write_stdin(data))
def _pretty_devices(addon):
"""Return a simplified device list."""
dev_list = addon.devices
if not dev_list:
return None
return [row.split(':')[0] for row in dev_list]
def _pretty_services(addon):
"""Return a simplified services role list."""
services = []
for name, access in addon.services_role.items():
services.append(f"{name}:{access}")
return services
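For illustration, the kind of output these two module-level helpers produce, using invented add-on data:
devices = ["/dev/ttyUSB0:/dev/ttyUSB0:rwm"]             # invented addon.devices entry
print([row.split(':')[0] for row in devices])           # -> ['/dev/ttyUSB0']

services_role = {"mqtt": "provide"}                     # invented addon.services_role
print([f"{n}:{a}" for n, a in services_role.items()])   # -> ['mqtt:provide']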

61
hassio/api/auth.py Normal file
View File

@@ -0,0 +1,61 @@
"""Init file for Hass.io auth/SSO RESTful API."""
import logging
from aiohttp import BasicAuth
from aiohttp.web_exceptions import HTTPUnauthorized
from aiohttp.hdrs import CONTENT_TYPE, AUTHORIZATION, WWW_AUTHENTICATE
from .utils import api_process
from ..const import REQUEST_FROM, CONTENT_TYPE_JSON, CONTENT_TYPE_URL
from ..coresys import CoreSysAttributes
from ..exceptions import APIForbidden
_LOGGER = logging.getLogger(__name__)
class APIAuth(CoreSysAttributes):
"""Handle RESTful API for auth functions."""
def _process_basic(self, request, addon):
"""Process login request with basic auth.
Return a coroutine.
"""
auth = BasicAuth.decode(request.headers[AUTHORIZATION])
return self.sys_auth.check_login(addon, auth.login, auth.password)
def _process_dict(self, request, addon, data):
"""Process login with dict data.
Return a coroutine.
"""
username = data.get('username') or data.get('user')
password = data.get('password')
return self.sys_auth.check_login(addon, username, password)
@api_process
async def auth(self, request):
"""Process login request."""
addon = request[REQUEST_FROM]
if not addon.access_auth_api:
raise APIForbidden("Can't use Home Assistant auth!")
# BasicAuth
if AUTHORIZATION in request.headers:
return await self._process_basic(request, addon)
# Json
if request.headers.get(CONTENT_TYPE) == CONTENT_TYPE_JSON:
data = await request.json()
return await self._process_dict(request, addon, data)
# URL encoded
if request.headers.get(CONTENT_TYPE) == CONTENT_TYPE_URL:
data = await request.post()
return await self._process_dict(request, addon, data)
raise HTTPUnauthorized(headers={
WWW_AUTHENTICATE: "Basic realm=\"Hass.io Authentication\""
})
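As a hedged aside, the BasicAuth branch above relies on aiohttp's header codec; a tiny round-trip shows the decode step used by _process_basic (credentials invented):
from aiohttp import BasicAuth

header = BasicAuth("demo", "secret").encode()   # e.g. 'Basic ZGVtbzpzZWNyZXQ='
auth = BasicAuth.decode(header)
print(auth.login, auth.password)                # -> demo secret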

View File

@@ -1,41 +1,47 @@
"""Init file for HassIO network rest api."""
"""Init file for Hass.io network RESTful API."""
import voluptuous as vol
from .utils import api_process, api_validate
from ..const import (
ATTR_PROVIDER, ATTR_UUID, ATTR_COMPONENT, ATTR_PLATFORM, ATTR_CONFIG,
ATTR_DISCOVERY, REQUEST_FROM)
ATTR_ADDON, ATTR_UUID, ATTR_CONFIG, ATTR_DISCOVERY, ATTR_SERVICE,
REQUEST_FROM)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError, APIForbidden
from ..validate import SERVICE_ALL
SCHEMA_DISCOVERY = vol.Schema({
vol.Required(ATTR_COMPONENT): vol.Coerce(str),
vol.Optional(ATTR_PLATFORM): vol.Any(None, vol.Coerce(str)),
vol.Optional(ATTR_CONFIG): vol.Any(None, dict),
vol.Required(ATTR_SERVICE): SERVICE_ALL,
vol.Optional(ATTR_CONFIG): vol.Maybe(dict),
})
class APIDiscovery(CoreSysAttributes):
"""Handle rest api for discovery functions."""
"""Handle RESTful API for discovery functions."""
def _extract_message(self, request):
"""Extract discovery message from URL."""
message = self.sys_discovery.get(request.match_info.get('uuid'))
if not message:
raise RuntimeError("Discovery message not found")
raise APIError("Discovery message not found")
return message
def _check_permission_ha(self, request):
"""Check permission for API call / Home Assistant."""
if request[REQUEST_FROM] != self.sys_homeassistant:
raise APIForbidden("Only HomeAssistant can use this API!")
@api_process
async def list(self, request):
"""Show register services."""
self._check_permission_ha(request)
discovery = []
for message in self.sys_discovery.list_messages:
discovery.append({
ATTR_PROVIDER: message.provider,
ATTR_ADDON: message.addon,
ATTR_SERVICE: message.service,
ATTR_UUID: message.uuid,
ATTR_COMPONENT: message.component,
ATTR_PLATFORM: message.platform,
ATTR_CONFIG: message.config,
})
@@ -45,8 +51,14 @@ class APIDiscovery(CoreSysAttributes):
async def set_discovery(self, request):
"""Write data into a discovery pipeline."""
body = await api_validate(SCHEMA_DISCOVERY, request)
message = self.sys_discovery.send(
provider=request[REQUEST_FROM], **body)
addon = request[REQUEST_FROM]
# Access?
if body[ATTR_SERVICE] not in addon.discovery:
raise APIForbidden(f"Can't use discovery!")
# Process discovery message
message = self.sys_discovery.send(addon, **body)
return {ATTR_UUID: message.uuid}
@@ -55,11 +67,13 @@ class APIDiscovery(CoreSysAttributes):
"""Read data into a discovery message."""
message = self._extract_message(request)
# HomeAssistant?
self._check_permission_ha(request)
return {
ATTR_PROVIDER: message.provider,
ATTR_ADDON: message.addon,
ATTR_SERVICE: message.service,
ATTR_UUID: message.uuid,
ATTR_COMPONENT: message.component,
ATTR_PLATFORM: message.platform,
ATTR_CONFIG: message.config,
}
@@ -67,6 +81,11 @@ class APIDiscovery(CoreSysAttributes):
async def del_discovery(self, request):
"""Delete data into a discovery message."""
message = self._extract_message(request)
addon = request[REQUEST_FROM]
# Permission
if message.addon != addon.slug:
raise APIForbidden(f"Can't remove discovery message")
self.sys_discovery.remove(message)
return True

View File

@@ -1,4 +1,4 @@
"""Init file for HassIO hardware rest api."""
"""Init file for Hass.io hardware RESTful API."""
import logging
from .utils import api_process
@@ -10,7 +10,7 @@ _LOGGER = logging.getLogger(__name__)
class APIHardware(CoreSysAttributes):
"""Handle rest api for hardware functions."""
"""Handle RESTful API for hardware functions."""
@api_process
async def info(self, request):

View File

@@ -1,4 +1,4 @@
"""Init file for Hass.io hassos rest api."""
"""Init file for Hass.io HassOS RESTful API."""
import asyncio
import logging
@@ -18,11 +18,11 @@ SCHEMA_VERSION = vol.Schema({
class APIHassOS(CoreSysAttributes):
"""Handle rest api for hassos functions."""
"""Handle RESTful API for HassOS functions."""
@api_process
async def info(self, request):
"""Return hassos information."""
"""Return HassOS information."""
return {
ATTR_VERSION: self.sys_hassos.version,
ATTR_VERSION_CLI: self.sys_hassos.version_cli,

View File

@@ -1,37 +1,42 @@
"""Init file for HassIO homeassistant rest api."""
"""Init file for Hass.io Home Assistant RESTful API."""
import asyncio
import logging
import voluptuous as vol
from .utils import api_process, api_process_raw, api_validate
from ..const import (
ATTR_VERSION, ATTR_LAST_VERSION, ATTR_IMAGE, ATTR_CUSTOM, ATTR_BOOT,
ATTR_PORT, ATTR_PASSWORD, ATTR_SSL, ATTR_WATCHDOG, ATTR_CPU_PERCENT,
ATTR_MEMORY_USAGE, ATTR_MEMORY_LIMIT, ATTR_NETWORK_RX, ATTR_NETWORK_TX,
ATTR_BLK_READ, ATTR_BLK_WRITE, ATTR_WAIT_BOOT, ATTR_MACHINE,
ATTR_REFRESH_TOKEN, CONTENT_TYPE_BINARY)
ATTR_ARCH, ATTR_BLK_READ, ATTR_BLK_WRITE, ATTR_BOOT, ATTR_CPU_PERCENT,
ATTR_CUSTOM, ATTR_IMAGE, ATTR_LAST_VERSION, ATTR_MACHINE, ATTR_MEMORY_LIMIT,
ATTR_MEMORY_USAGE, ATTR_NETWORK_RX, ATTR_NETWORK_TX, ATTR_PASSWORD,
ATTR_PORT, ATTR_REFRESH_TOKEN, ATTR_SSL, ATTR_VERSION, ATTR_WAIT_BOOT,
ATTR_WATCHDOG, CONTENT_TYPE_BINARY)
from ..coresys import CoreSysAttributes
from ..validate import NETWORK_PORT, DOCKER_IMAGE
from ..exceptions import APIError
from ..validate import DOCKER_IMAGE, NETWORK_PORT
from .utils import api_process, api_process_raw, api_validate
_LOGGER = logging.getLogger(__name__)
# pylint: disable=no-value-for-parameter
SCHEMA_OPTIONS = vol.Schema({
vol.Optional(ATTR_BOOT): vol.Boolean(),
vol.Optional(ATTR_BOOT):
vol.Boolean(),
vol.Inclusive(ATTR_IMAGE, 'custom_hass'):
vol.Any(None, vol.Coerce(str)),
vol.Maybe(vol.Coerce(str)),
vol.Inclusive(ATTR_LAST_VERSION, 'custom_hass'):
vol.Any(None, DOCKER_IMAGE),
vol.Optional(ATTR_PORT): NETWORK_PORT,
vol.Optional(ATTR_PASSWORD): vol.Any(None, vol.Coerce(str)),
vol.Optional(ATTR_SSL): vol.Boolean(),
vol.Optional(ATTR_WATCHDOG): vol.Boolean(),
vol.Optional(ATTR_PORT):
NETWORK_PORT,
vol.Optional(ATTR_PASSWORD):
vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_SSL):
vol.Boolean(),
vol.Optional(ATTR_WATCHDOG):
vol.Boolean(),
vol.Optional(ATTR_WAIT_BOOT):
vol.All(vol.Coerce(int), vol.Range(min=60)),
# Required once we enforce user system
vol.Optional(ATTR_REFRESH_TOKEN): str,
vol.Optional(ATTR_REFRESH_TOKEN):
vol.Maybe(vol.Coerce(str)),
})
SCHEMA_VERSION = vol.Schema({
@@ -40,7 +45,7 @@ SCHEMA_VERSION = vol.Schema({
class APIHomeAssistant(CoreSysAttributes):
"""Handle rest api for homeassistant functions."""
"""Handle RESTful API for Home Assistant functions."""
@api_process
async def info(self, request):
@@ -49,6 +54,7 @@ class APIHomeAssistant(CoreSysAttributes):
ATTR_VERSION: self.sys_homeassistant.version,
ATTR_LAST_VERSION: self.sys_homeassistant.last_version,
ATTR_MACHINE: self.sys_homeassistant.machine,
ATTR_ARCH: self.sys_homeassistant.arch,
ATTR_IMAGE: self.sys_homeassistant.image,
ATTR_CUSTOM: self.sys_homeassistant.is_custom_image,
ATTR_BOOT: self.sys_homeassistant.boot,
@@ -60,7 +66,7 @@ class APIHomeAssistant(CoreSysAttributes):
@api_process
async def options(self, request):
"""Set homeassistant options."""
"""Set Home Assistant options."""
body = await api_validate(SCHEMA_OPTIONS, request)
if ATTR_IMAGE in body and ATTR_LAST_VERSION in body:
@@ -95,7 +101,7 @@ class APIHomeAssistant(CoreSysAttributes):
"""Return resource information."""
stats = await self.sys_homeassistant.stats()
if not stats:
raise RuntimeError("No stats available")
raise APIError("No stats available")
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
@@ -109,7 +115,7 @@ class APIHomeAssistant(CoreSysAttributes):
@api_process
async def update(self, request):
"""Update homeassistant."""
"""Update Home Assistant."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_homeassistant.last_version)
@@ -117,29 +123,27 @@ class APIHomeAssistant(CoreSysAttributes):
@api_process
def stop(self, request):
"""Stop homeassistant."""
"""Stop Home Assistant."""
return asyncio.shield(self.sys_homeassistant.stop())
@api_process
def start(self, request):
"""Start homeassistant."""
"""Start Home Assistant."""
return asyncio.shield(self.sys_homeassistant.start())
@api_process
def restart(self, request):
"""Restart homeassistant."""
"""Restart Home Assistant."""
return asyncio.shield(self.sys_homeassistant.restart())
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request):
"""Return homeassistant docker logs."""
"""Return Home Assistant Docker logs."""
return self.sys_homeassistant.logs()
@api_process
async def check(self, request):
"""Check config of homeassistant."""
"""Check configuration of Home Assistant."""
result = await self.sys_homeassistant.check_config()
if not result.valid:
raise RuntimeError(result.log)
return True
raise APIError(result.log)

View File

@@ -1,4 +1,4 @@
"""Init file for HassIO host rest api."""
"""Init file for Hass.io host RESTful API."""
import asyncio
import logging
@@ -21,7 +21,7 @@ SCHEMA_OPTIONS = vol.Schema({
class APIHost(CoreSysAttributes):
"""Handle rest api for host functions."""
"""Handle RESTful API for host functions."""
@api_process
async def info(self, request):

28
hassio/api/info.py Normal file
View File

@@ -0,0 +1,28 @@
"""Init file for Hass.io info RESTful API."""
import logging
from ..const import (ATTR_ARCH, ATTR_CHANNEL, ATTR_HASSOS, ATTR_HOMEASSISTANT,
ATTR_HOSTNAME, ATTR_MACHINE, ATTR_SUPERVISOR,
ATTR_SUPPORTED_ARCH)
from ..coresys import CoreSysAttributes
from .utils import api_process
_LOGGER = logging.getLogger(__name__)
class APIInfo(CoreSysAttributes):
"""Handle RESTful API for info functions."""
@api_process
async def info(self, request):
"""Show system info."""
return {
ATTR_SUPERVISOR: self.sys_supervisor.version,
ATTR_HOMEASSISTANT: self.sys_homeassistant.version,
ATTR_HASSOS: self.sys_hassos.version,
ATTR_HOSTNAME: self.sys_host.info.hostname,
ATTR_MACHINE: self.sys_machine,
ATTR_ARCH: self.sys_arch.default,
ATTR_SUPPORTED_ARCH: self.sys_arch.supported,
ATTR_CHANNEL: self.sys_updater.channel,
}
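For reference, a hypothetical shape of the JSON this handler returns; the key names assume the ATTR_* constants resolve to their lowercase names, and all values are invented.
info_response = {
    "supervisor": "150",
    "homeassistant": "0.89.2",
    "hassos": "2.11",
    "hostname": "hassio",
    "machine": "raspberrypi3",
    "arch": "armv7",
    "supported_arch": ["armv7", "armhf"],
    "channel": "stable",
}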

File diff suppressed because one or more lines are too long

Binary file not shown.

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,142 @@
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright 2018 Google Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
/**
* @license
* Copyright 2016 Google Inc.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
/**
@license
Copyright (c) 2019 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
* @license
* Copyright (c) 2018 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2014 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/

Binary file not shown.


@@ -0,0 +1 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.2c1fb1dea4fa88f96920.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

Binary file not shown.

File diff suppressed because one or more lines are too long

Binary file not shown.

File diff suppressed because one or more lines are too long


@@ -1,419 +0,0 @@
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2014 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/


@@ -0,0 +1 @@
(window.webpackJsonp=window.webpackJsonp||[]).push([[4],{110:function(n,r,t){"use strict";t.r(r),t.d(r,"marked",function(){return a}),t.d(r,"filterXSS",function(){return c});var e=t(101),i=t.n(e),o=t(103),u=t.n(o),a=i.a,c=u.a}}]);

Binary file not shown.

File diff suppressed because one or more lines are too long


@@ -1 +0,0 @@
(window.webpackJsonp=window.webpackJsonp||[]).push([[5],{104:function(n,r,t){"use strict";t.r(r),t.d(r,"marked",function(){return a}),t.d(r,"filterXSS",function(){return c});var e=t(99),i=t.n(e),o=t(97),u=t.n(o),a=i.a,c=u.a}}]);

File diff suppressed because one or more lines are too long

Binary file not shown.

File diff suppressed because one or more lines are too long


@@ -0,0 +1,20 @@
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at
http://polymer.github.io/LICENSE.txt The complete set of authors may be found at
http://polymer.github.io/AUTHORS.txt The complete set of contributors may be
found at http://polymer.github.io/CONTRIBUTORS.txt Code distributed by Google as
part of the polymer project is also subject to an additional IP rights grant
found at http://polymer.github.io/PATENTS.txt
*/

Binary file not shown.


@@ -0,0 +1 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.e7d34dbf975fad4b7776.js","sourceRoot":""}

File diff suppressed because one or more lines are too long


@@ -1,389 +0,0 @@
/**
@license
Copyright (c) 2017 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
/**
@license
Copyright (c) 2015 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/

File diff suppressed because one or more lines are too long


@@ -1 +1 @@
!function(e){function n(n){for(var t,o,i=n[0],u=n[1],a=0,l=[];a<i.length;a++)o=i[a],r[o]&&l.push(r[o][0]),r[o]=0;for(t in u)Object.prototype.hasOwnProperty.call(u,t)&&(e[t]=u[t]);for(f&&f(n);l.length;)l.shift()()}var t={},r={6:0};function o(n){if(t[n])return t[n].exports;var r=t[n]={i:n,l:!1,exports:{}};return e[n].call(r.exports,r,r.exports,o),r.l=!0,r.exports}o.e=function(e){var n=[],t=r[e];if(0!==t)if(t)n.push(t[2]);else{var i=new Promise(function(n,o){t=r[e]=[n,o]});n.push(t[2]=i);var u,a=document.getElementsByTagName("head")[0],f=document.createElement("script");f.charset="utf-8",f.timeout=120,o.nc&&f.setAttribute("nonce",o.nc),f.src=function(e){return o.p+"chunk."+{0:"f3880aa331d3ef2ddf32",1:"a8e86d80be46b3b6e16d",2:"0ef4ef1053fe3d5107b5",3:"ff92199b0d422767d108",4:"c77b56beea1d4547ff5f",5:"c93f37c558ff32991708"}[e]+".js"}(e),u=function(n){f.onerror=f.onload=null,clearTimeout(l);var t=r[e];if(0!==t){if(t){var o=n&&("load"===n.type?"missing":n.type),i=n&&n.target&&n.target.src,u=new Error("Loading chunk "+e+" failed.\n("+o+": "+i+")");u.type=o,u.request=i,t[1](u)}r[e]=void 0}};var l=setTimeout(function(){u({type:"timeout",target:f})},12e4);f.onerror=f.onload=u,a.appendChild(f)}return Promise.all(n)},o.m=e,o.c=t,o.d=function(e,n,t){o.o(e,n)||Object.defineProperty(e,n,{enumerable:!0,get:t})},o.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},o.t=function(e,n){if(1&n&&(e=o(e)),8&n)return e;if(4&n&&"object"==typeof e&&e&&e.__esModule)return e;var t=Object.create(null);if(o.r(t),Object.defineProperty(t,"default",{enumerable:!0,value:e}),2&n&&"string"!=typeof e)for(var r in e)o.d(t,r,function(n){return e[n]}.bind(null,r));return t},o.n=function(e){var n=e&&e.__esModule?function(){return e.default}:function(){return e};return o.d(n,"a",n),n},o.o=function(e,n){return Object.prototype.hasOwnProperty.call(e,n)},o.p="/api/hassio/app/",o.oe=function(e){throw console.error(e),e};var i=window.webpackJsonp=window.webpackJsonp||[],u=i.push.bind(i);i.push=n,i=i.slice();for(var a=0;a<i.length;a++)n(i[a]);var f=u;o(o.s=0)}([function(e,n,t){window.loadES5Adapter().then(function(){Promise.all([t.e(0),t.e(3)]).then(t.bind(null,1)),Promise.all([t.e(0),t.e(1),t.e(2)]).then(t.bind(null,2))})}]);
!function(e){function t(t){for(var n,o,i=t[0],u=t[1],a=0,f=[];a<i.length;a++)o=i[a],r[o]&&f.push(r[o][0]),r[o]=0;for(n in u)Object.prototype.hasOwnProperty.call(u,n)&&(e[n]=u[n]);for(c&&c(t);f.length;)f.shift()()}var n={},r={1:0};function o(t){if(n[t])return n[t].exports;var r=n[t]={i:t,l:!1,exports:{}};return e[t].call(r.exports,r,r.exports,o),r.l=!0,r.exports}o.e=function(e){var t=[],n=r[e];if(0!==n)if(n)t.push(n[2]);else{var i=new Promise(function(t,o){n=r[e]=[t,o]});t.push(n[2]=i);var u,a=document.createElement("script");a.charset="utf-8",a.timeout=120,o.nc&&a.setAttribute("nonce",o.nc),a.src=function(e){return o.p+"chunk."+{0:"e7d34dbf975fad4b7776",2:"75766aa821239c9936dc",3:"6ff2deda34a647d6051c",4:"b74ddf4cacc7d5de8a55",5:"d33e783375f0db186ab5",6:"2c1fb1dea4fa88f96920",7:"088b1034e27d00ee9329"}[e]+".js"}(e),u=function(t){a.onerror=a.onload=null,clearTimeout(c);var n=r[e];if(0!==n){if(n){var o=t&&("load"===t.type?"missing":t.type),i=t&&t.target&&t.target.src,u=new Error("Loading chunk "+e+" failed.\n("+o+": "+i+")");u.type=o,u.request=i,n[1](u)}r[e]=void 0}};var c=setTimeout(function(){u({type:"timeout",target:a})},12e4);a.onerror=a.onload=u,document.head.appendChild(a)}return Promise.all(t)},o.m=e,o.c=n,o.d=function(e,t,n){o.o(e,t)||Object.defineProperty(e,t,{enumerable:!0,get:n})},o.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},o.t=function(e,t){if(1&t&&(e=o(e)),8&t)return e;if(4&t&&"object"==typeof e&&e&&e.__esModule)return e;var n=Object.create(null);if(o.r(n),Object.defineProperty(n,"default",{enumerable:!0,value:e}),2&t&&"string"!=typeof e)for(var r in e)o.d(n,r,function(t){return e[t]}.bind(null,r));return n},o.n=function(e){var t=e&&e.__esModule?function(){return e.default}:function(){return e};return o.d(t,"a",t),t},o.o=function(e,t){return Object.prototype.hasOwnProperty.call(e,t)},o.p="/api/hassio/app/",o.oe=function(e){throw console.error(e),e};var i=window.webpackJsonp=window.webpackJsonp||[],u=i.push.bind(i);i.push=t,i=i.slice();for(var a=0;a<i.length;a++)t(i[a]);var c=u;o(o.s=0)}([function(e,t,n){window.loadES5Adapter().then(function(){Promise.all([n.e(0),n.e(2)]).then(n.bind(null,2)),Promise.all([n.e(0),n.e(6),n.e(3)]).then(n.bind(null,1))}),document.body.style.height="100%"}]);

Binary file not shown.

File diff suppressed because one or more lines are too long


@@ -1,32 +1,37 @@
"""Utils for HomeAssistant Proxy."""
"""Utils for Home Assistant Proxy."""
import asyncio
from contextlib import asynccontextmanager
import logging
import aiohttp
from aiohttp import web
from aiohttp.web_exceptions import (
HTTPBadGateway, HTTPInternalServerError, HTTPUnauthorized)
from aiohttp.hdrs import CONTENT_TYPE
from aiohttp.web_exceptions import HTTPBadGateway, HTTPUnauthorized
from aiohttp.client_exceptions import ClientConnectorError
from aiohttp.hdrs import CONTENT_TYPE, AUTHORIZATION
import async_timeout
from ..const import HEADER_HA_ACCESS
from ..coresys import CoreSysAttributes
from ..exceptions import HomeAssistantAuthError, HomeAssistantAPIError
from ..exceptions import (
HomeAssistantAuthError, HomeAssistantAPIError, APIError)
_LOGGER = logging.getLogger(__name__)
class APIProxy(CoreSysAttributes):
"""API Proxy for Home-Assistant."""
"""API Proxy for Home Assistant."""
def _check_access(self, request):
"""Check the Hass.io token."""
hassio_token = request.headers.get(HEADER_HA_ACCESS)
addon = self.sys_addons.from_uuid(hassio_token)
if AUTHORIZATION in request.headers:
bearer = request.headers[AUTHORIZATION]
hassio_token = bearer.split(' ')[-1]
else:
hassio_token = request.headers.get(HEADER_HA_ACCESS)
addon = self.sys_addons.from_token(hassio_token)
if not addon:
_LOGGER.warning("Unknown HomeAssistant API access!")
_LOGGER.warning("Unknown Home Assistant API access!")
elif not addon.access_homeassistant_api:
_LOGGER.warning("Not permitted API access: %s", addon.slug)
else:
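The reworked _check_access above prefers a standard Authorization: Bearer header, falls back to the legacy X-HA-Access header, and then resolves the caller with from_token instead of from_uuid. A self-contained sketch of just the token-extraction step (the HEADER_HA_ACCESS value shown here is an assumption; the real constant lives in hassio.const):

from aiohttp.hdrs import AUTHORIZATION

HEADER_HA_ACCESS = "X-HA-Access"  # assumed value of the hassio.const constant


def extract_hassio_token(request):
    """Return the token carried by a proxied request, preferring Bearer auth."""
    if AUTHORIZATION in request.headers:
        # "Bearer <token>" -> "<token>"
        return request.headers[AUTHORIZATION].split(" ")[-1]
    # Legacy header used for Home Assistant API-password style access
    return request.headers.get(HEADER_HA_ACCESS)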
@@ -37,7 +42,7 @@ class APIProxy(CoreSysAttributes):
@asynccontextmanager
async def _api_client(self, request, path, timeout=300):
"""Return a client request with proxy origin for Home-Assistant."""
"""Return a client request with proxy origin for Home Assistant."""
try:
# read data
with async_timeout.timeout(30):
@@ -59,6 +64,8 @@ class APIProxy(CoreSysAttributes):
except HomeAssistantAuthError:
_LOGGER.error("Authenticate error on API for request %s", path)
except HomeAssistantAPIError:
_LOGGER.error("Error on API for request %s", path)
except aiohttp.ClientError as err:
_LOGGER.error("Client error on API %s request %s", path, err)
except asyncio.TimeoutError:
@@ -70,29 +77,23 @@ class APIProxy(CoreSysAttributes):
"""Proxy HomeAssistant EventStream Requests."""
self._check_access(request)
_LOGGER.info("Home-Assistant EventStream start")
_LOGGER.info("Home Assistant EventStream start")
async with self._api_client(request, 'stream', timeout=None) as client:
response = web.StreamResponse()
response.content_type = request.headers.get(CONTENT_TYPE)
try:
await response.prepare(request)
while True:
data = await client.content.read(10)
if not data:
break
async for data in client.content:
await response.write(data)
except aiohttp.ClientError:
except (aiohttp.ClientError, aiohttp.ClientPayloadError):
pass
finally:
client.close()
_LOGGER.info("Home-Assistant EventStream close")
_LOGGER.info("Home Assistant EventStream close")
return response
async def api(self, request):
"""Proxy HomeAssistant API Requests."""
"""Proxy Home Assistant API Requests."""
self._check_access(request)
# Normal request
@@ -106,14 +107,14 @@ class APIProxy(CoreSysAttributes):
)
async def _websocket_client(self):
"""Initialize a websocket api connection."""
"""Initialize a WebSocket API connection."""
url = f"{self.sys_homeassistant.api_url}/api/websocket"
try:
client = await self.sys_websession_ssl.ws_connect(
url, heartbeat=60, verify_ssl=False)
url, heartbeat=30, verify_ssl=False)
# handle authentication
# Handle authentication
data = await client.receive_json()
if data.get('type') == 'auth_ok':
@@ -122,8 +123,8 @@ class APIProxy(CoreSysAttributes):
if data.get('type') != 'auth_required':
# Invalid protocol
_LOGGER.error(
'Got unexpected response from HA websocket: %s', data)
raise HTTPBadGateway()
"Got unexpected response from HA WebSocket: %s", data)
raise APIError()
if self.sys_homeassistant.refresh_token:
await self.sys_homeassistant.ensure_access_token()
@@ -143,25 +144,25 @@ class APIProxy(CoreSysAttributes):
return client
# Renew the token if it is invalid
if (data.get('type') == 'invalid_auth' and
self.sys_homeassistant.refresh_token):
if data.get('type') == 'invalid_auth' and self.sys_homeassistant.refresh_token:
self.sys_homeassistant.access_token = None
return await self._websocket_client()
_LOGGER.error(
"Failed authentication to Home-Assistant websocket: %s", data)
raise HomeAssistantAuthError()
except (RuntimeError, HomeAssistantAPIError) as err:
_LOGGER.error("Client error on websocket API %s.", err)
except (RuntimeError, ValueError, ClientConnectorError) as err:
_LOGGER.error("Client error on WebSocket API %s.", err)
except HomeAssistantAuthError:
_LOGGER.error("Failed authentication to Home Assistant WebSocket")
raise HTTPBadGateway()
raise APIError()
async def websocket(self, request):
"""Initialize a websocket api connection."""
_LOGGER.info("Home-Assistant Websocket API request initialze")
"""Initialize a WebSocket API connection."""
_LOGGER.info("Home Assistant WebSocket API request initialize")
# init server
server = web.WebSocketResponse(heartbeat=60)
server = web.WebSocketResponse(heartbeat=30)
await server.prepare(request)
# handle authentication
@@ -173,19 +174,18 @@ class APIProxy(CoreSysAttributes):
# Check API access
response = await server.receive_json()
hassio_token = (response.get('api_password') or
response.get('access_token'))
addon = self.sys_addons.from_uuid(hassio_token)
hassio_token = response.get('api_password') or response.get('access_token')
addon = self.sys_addons.from_token(hassio_token)
if not addon or not addon.access_homeassistant_api:
_LOGGER.warning("Unauthorized websocket access!")
_LOGGER.warning("Unauthorized WebSocket access!")
await server.send_json({
'type': 'auth_invalid',
'message': 'Invalid access',
})
return server
_LOGGER.info("Websocket access from %s", addon.slug)
_LOGGER.info("WebSocket access from %s", addon.slug)
await server.send_json({
'type': 'auth_ok',
@@ -193,12 +193,15 @@ class APIProxy(CoreSysAttributes):
})
except (RuntimeError, ValueError) as err:
_LOGGER.error("Can't initialize handshake: %s", err)
raise HTTPInternalServerError() from None
return server
# init connection to hass
client = await self._websocket_client()
try:
client = await self._websocket_client()
except APIError:
return server
_LOGGER.info("Home-Assistant Websocket API request running")
_LOGGER.info("Home Assistant WebSocket API request running")
try:
client_read = None
server_read = None
@@ -231,8 +234,8 @@ class APIProxy(CoreSysAttributes):
except asyncio.CancelledError:
pass
except RuntimeError as err:
_LOGGER.info("Home-Assistant Websocket API error: %s", err)
except (RuntimeError, ConnectionError, TypeError) as err:
_LOGGER.info("Home Assistant WebSocket API error: %s", err)
finally:
if client_read:
@@ -241,8 +244,10 @@ class APIProxy(CoreSysAttributes):
server_read.cancel()
# close connections
await client.close()
await server.close()
if not client.closed:
await client.close()
if not server.closed:
await server.close()
_LOGGER.info("Home-Assistant Websocket API connection is closed")
_LOGGER.info("Home Assistant WebSocket API connection is closed")
return server
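The reworked _check_access accepts either the legacy X-HA-Access header or a standard Authorization bearer token. A minimal sketch of that token extraction, assuming an aiohttp-style request.headers mapping (extract_token is an illustrative helper, not part of the proxy itself):

    from aiohttp.hdrs import AUTHORIZATION

    HEADER_HA_ACCESS = "x-ha-access"  # legacy Hass.io header from ..const

    def extract_token(headers):
        """Return the Hass.io token from a request header mapping."""
        if AUTHORIZATION in headers:
            # "Bearer abc123" -> "abc123"
            return headers[AUTHORIZATION].split(" ")[-1]
        return headers.get(HEADER_HA_ACCESS)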


@@ -3,18 +3,76 @@ import logging
import re
from aiohttp.web import middleware
from aiohttp.web_exceptions import HTTPUnauthorized
from aiohttp.web_exceptions import HTTPUnauthorized, HTTPForbidden
from ..const import HEADER_TOKEN, REQUEST_FROM
from ..const import (
HEADER_TOKEN, REQUEST_FROM, ROLE_ADMIN, ROLE_DEFAULT, ROLE_HOMEASSISTANT,
ROLE_MANAGER, ROLE_BACKUP)
from ..coresys import CoreSysAttributes
_LOGGER = logging.getLogger(__name__)
NO_SECURITY_CHECK = set((
re.compile(r"^/homeassistant/api/.*$"),
re.compile(r"^/homeassistant/websocket$"),
re.compile(r"^/supervisor/ping$"),
))
# Block Anytime
BLACKLIST = re.compile(
r"^(?:"
r"|/homeassistant/api/hassio/.*"
r")$"
)
# Free to call or have own security concepts
NO_SECURITY_CHECK = re.compile(
r"^(?:"
r"|/homeassistant/api/.*"
r"|/homeassistant/websocket"
r"|/supervisor/ping"
r")$"
)
# Can be called by every add-on
ADDONS_API_BYPASS = re.compile(
r"^(?:"
r"|/addons/self/(?!security|update)[^/]+"
r"|/info"
r"|/services.*"
r"|/discovery.*"
r"|/auth"
r")$"
)
# Policy role add-on API access
ADDONS_ROLE_ACCESS = {
ROLE_DEFAULT: re.compile(
r"^(?:"
r"|/[^/]+/info"
r"|/addons"
r")$"
),
ROLE_HOMEASSISTANT: re.compile(
r"^(?:"
r"|/homeassistant/.+"
r")$"
),
ROLE_BACKUP: re.compile(
r"^(?:"
r"|/snapshots.*"
r")$"
),
ROLE_MANAGER: re.compile(
r"^(?:"
r"|/homeassistant/.+"
r"|/host/.+"
r"|/hardware/.+"
r"|/hassos/.+"
r"|/supervisor/.+"
r"|/addons(?:/[^/]+/(?!security).+)?"
r"|/snapshots.*"
r")$"
),
ROLE_ADMIN: re.compile(
r".*"
),
}
class SecurityMiddleware(CoreSysAttributes):
@@ -27,33 +85,56 @@ class SecurityMiddleware(CoreSysAttributes):
@middleware
async def token_validation(self, request, handler):
"""Check security access of this layer."""
request_from = None
hassio_token = request.headers.get(HEADER_TOKEN)
# Blacklist
if BLACKLIST.match(request.path):
_LOGGER.warning("%s is blacklisted!", request.path)
raise HTTPForbidden()
# Ignore security check
for rule in NO_SECURITY_CHECK:
if rule.match(request.path):
_LOGGER.debug("Passthrough %s", request.path)
return await handler(request)
if NO_SECURITY_CHECK.match(request.path):
_LOGGER.debug("Passthrough %s", request.path)
return await handler(request)
# Not token
if not hassio_token:
_LOGGER.warning("No API token provided for %s", request.path)
raise HTTPUnauthorized()
# Home-Assistant
if hassio_token == self.sys_homeassistant.uuid:
_LOGGER.debug("%s access from Home-Assistant", request.path)
request[REQUEST_FROM] = 'homeassistant'
# UUID check needs to be removed with 131
if hassio_token in (self.sys_homeassistant.uuid,
self.sys_homeassistant.hassio_token):
_LOGGER.debug("%s access from Home Assistant", request.path)
request_from = self.sys_homeassistant
# Host
if hassio_token == self.sys_machine_id:
_LOGGER.debug("%s access from Host", request.path)
request[REQUEST_FROM] = 'host'
request_from = self.sys_host
# Add-on
addon = self.sys_addons.from_uuid(hassio_token) \
if hassio_token else None
if addon:
_LOGGER.info("%s access from %s", request.path, addon.slug)
request[REQUEST_FROM] = addon.slug
addon = None
if hassio_token and not request_from:
addon = self.sys_addons.from_token(hassio_token)
if request.get(REQUEST_FROM):
# Check Add-on API access
if addon and ADDONS_API_BYPASS.match(request.path):
_LOGGER.debug("Passthrough %s from %s", request.path, addon.slug)
request_from = addon
elif addon and addon.access_hassio_api:
# Check Role
if ADDONS_ROLE_ACCESS[addon.hassio_role].match(request.path):
_LOGGER.info("%s access from %s", request.path, addon.slug)
request_from = addon
else:
_LOGGER.warning("%s no role for %s", request.path, addon.slug)
if request_from:
request[REQUEST_FROM] = request_from
return await handler(request)
_LOGGER.warning("Invalid token for access %s", request.path)
raise HTTPUnauthorized()
_LOGGER.error("Invalid token for access %s", request.path)
raise HTTPForbidden()
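The middleware now checks pre-compiled regex policies instead of looping over a rule set: a blacklist, a no-security-check list, an add-on bypass list, and one pattern per role. A reduced sketch of the role check in isolation, using a trimmed copy of the ADDONS_ROLE_ACCESS table above:

    import re

    # Trimmed subset of the policy table above, for illustration only
    ADDONS_ROLE_ACCESS = {
        "manager": re.compile(r"^(?:|/host/.+|/snapshots.*)$"),
        "default": re.compile(r"^(?:|/[^/]+/info|/addons)$"),
    }

    def allowed(role: str, path: str) -> bool:
        """Return True if the add-on role may call the given API path."""
        pattern = ADDONS_ROLE_ACCESS.get(role)
        return bool(pattern and pattern.match(path))

    assert allowed("manager", "/snapshots/new/full")
    assert not allowed("default", "/host/reboot")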


@@ -1,19 +1,21 @@
"""Init file for HassIO network rest api."""
"""Init file for Hass.io network RESTful API."""
from .utils import api_process, api_validate
from ..const import (
ATTR_AVAILABLE, ATTR_PROVIDER, ATTR_SLUG, ATTR_SERVICES, REQUEST_FROM)
ATTR_AVAILABLE, ATTR_PROVIDERS, ATTR_SLUG, ATTR_SERVICES, REQUEST_FROM,
PROVIDE_SERVICE)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError, APIForbidden
class APIServices(CoreSysAttributes):
"""Handle rest api for services functions."""
"""Handle RESTful API for services functions."""
def _extract_service(self, request):
"""Return service, throw an exception if it doesn't exist."""
service = self.sys_services.get(request.match_info.get('service'))
if not service:
raise RuntimeError("Service does not exist")
raise APIError("Service does not exist")
return service
@@ -25,7 +27,7 @@ class APIServices(CoreSysAttributes):
services.append({
ATTR_SLUG: service.slug,
ATTR_AVAILABLE: service.enabled,
ATTR_PROVIDER: service.provider,
ATTR_PROVIDERS: service.providers,
})
return {ATTR_SERVICES: services}
@@ -35,21 +37,39 @@ class APIServices(CoreSysAttributes):
"""Write data into a service."""
service = self._extract_service(request)
body = await api_validate(service.schema, request)
addon = request[REQUEST_FROM]
return service.set_service_data(request[REQUEST_FROM], body)
_check_access(request, service.slug)
service.set_service_data(addon, body)
@api_process
async def get_service(self, request):
"""Read data into a service."""
service = self._extract_service(request)
return {
ATTR_AVAILABLE: service.enabled,
service.slug: service.get_service_data(),
}
# Access
_check_access(request, service.slug)
if not service.enabled:
raise APIError("Service not enabled")
return service.get_service_data()
@api_process
async def del_service(self, request):
"""Delete data into a service."""
service = self._extract_service(request)
return service.del_service_data(request[REQUEST_FROM])
addon = request[REQUEST_FROM]
# Access
_check_access(request, service.slug, True)
service.del_service_data(addon)
def _check_access(request, service, provide=False):
"""Raise error if the rights are wrong."""
addon = request[REQUEST_FROM]
if not addon.services_role.get(service):
raise APIForbidden(f"No access to {service} service!")
if provide and addon.services_role.get(service) != PROVIDE_SERVICE:
raise APIForbidden(f"No access to write {service} service!")


@@ -1,4 +1,4 @@
"""Init file for HassIO snapshot rest api."""
"""Init file for Hass.io snapshot RESTful API."""
import asyncio
import logging
from pathlib import Path
@@ -14,6 +14,7 @@ from ..const import (
ATTR_HOMEASSISTANT, ATTR_VERSION, ATTR_SIZE, ATTR_FOLDERS, ATTR_TYPE,
ATTR_SNAPSHOTS, ATTR_PASSWORD, ATTR_PROTECTED, CONTENT_TYPE_TAR)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError
_LOGGER = logging.getLogger(__name__)
@@ -46,13 +47,13 @@ SCHEMA_SNAPSHOT_PARTIAL = SCHEMA_SNAPSHOT_FULL.extend({
class APISnapshots(CoreSysAttributes):
"""Handle rest api for snapshot functions."""
"""Handle RESTful API for snapshot functions."""
def _extract_snapshot(self, request):
"""Return snapshot, throw an exception if it doesn't exist."""
snapshot = self.sys_snapshots.get(request.match_info.get('snapshot'))
if not snapshot:
raise RuntimeError("Snapshot does not exist")
raise APIError("Snapshot does not exist")
return snapshot
@api_process


@@ -1,4 +1,4 @@
"""Init file for HassIO supervisor rest api."""
"""Init file for Hass.io Supervisor RESTful API."""
import asyncio
import logging
@@ -13,7 +13,9 @@ from ..const import (
ATTR_MEMORY_LIMIT, ATTR_NETWORK_RX, ATTR_NETWORK_TX, ATTR_BLK_READ,
ATTR_BLK_WRITE, CONTENT_TYPE_BINARY, ATTR_ICON)
from ..coresys import CoreSysAttributes
from ..validate import validate_timezone, WAIT_BOOT, REPOSITORIES, CHANNELS
from ..validate import WAIT_BOOT, REPOSITORIES, CHANNELS
from ..exceptions import APIError
from ..utils.validate import validate_timezone
_LOGGER = logging.getLogger(__name__)
@@ -30,11 +32,11 @@ SCHEMA_VERSION = vol.Schema({
class APISupervisor(CoreSysAttributes):
"""Handle rest api for supervisor functions."""
"""Handle RESTful API for Supervisor functions."""
@api_process
async def ping(self, request):
"""Return ok for signal that the api is ready."""
"""Return ok for signal that the API is ready."""
return True
@api_process
@@ -59,7 +61,7 @@ class APISupervisor(CoreSysAttributes):
ATTR_VERSION: HASSIO_VERSION,
ATTR_LAST_VERSION: self.sys_updater.version_hassio,
ATTR_CHANNEL: self.sys_updater.channel,
ATTR_ARCH: self.sys_arch,
ATTR_ARCH: self.sys_supervisor.arch,
ATTR_WAIT_BOOT: self.sys_config.wait_boot,
ATTR_TIMEZONE: self.sys_config.timezone,
ATTR_ADDONS: list_addons,
@@ -68,7 +70,7 @@ class APISupervisor(CoreSysAttributes):
@api_process
async def options(self, request):
"""Set supervisor options."""
"""Set Supervisor options."""
body = await api_validate(SCHEMA_OPTIONS, request)
if ATTR_CHANNEL in body:
@@ -93,7 +95,7 @@ class APISupervisor(CoreSysAttributes):
"""Return resource information."""
stats = await self.sys_supervisor.stats()
if not stats:
raise RuntimeError("No stats available")
raise APIError("No stats available")
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
@@ -107,32 +109,30 @@ class APISupervisor(CoreSysAttributes):
@api_process
async def update(self, request):
"""Update supervisor OS."""
"""Update Supervisor OS."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_updater.version_hassio)
if version == self.sys_supervisor.version:
raise RuntimeError("Version {} is already in use".format(version))
raise APIError("Version {} is already in use".format(version))
return await asyncio.shield(
self.sys_supervisor.update(version))
return await asyncio.shield(self.sys_supervisor.update(version))
@api_process
async def reload(self, request):
"""Reload addons, config etc."""
"""Reload add-ons, configuration, etc."""
tasks = [
self.sys_updater.reload(),
]
results, _ = await asyncio.shield(
asyncio.wait(tasks))
results, _ = await asyncio.shield(asyncio.wait(tasks))
for result in results:
if result.exception() is not None:
raise RuntimeError("Some reload task fails!")
raise APIError("Some reload task fails!")
return True
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request):
"""Return supervisor docker logs."""
"""Return supervisor Docker logs."""
return self.sys_supervisor.logs()


@@ -1,4 +1,4 @@
"""Init file for HassIO util for rest api."""
"""Init file for Hass.io util for RESTful API."""
import json
import logging
@@ -9,7 +9,7 @@ from voluptuous.humanize import humanize_error
from ..const import (
JSON_RESULT, JSON_DATA, JSON_MESSAGE, RESULT_OK, RESULT_ERROR,
CONTENT_TYPE_BINARY)
from ..exceptions import HassioError
from ..exceptions import HassioError, APIError, APIForbidden
_LOGGER = logging.getLogger(__name__)
@@ -21,19 +21,19 @@ def json_loads(data):
try:
return json.loads(data)
except json.JSONDecodeError:
raise RuntimeError("Invalid json")
raise APIError("Invalid json")
def api_process(method):
"""Wrap function with true/false calls to rest api."""
async def wrap_api(api, *args, **kwargs):
"""Return api information."""
"""Return API information."""
try:
answer = await method(api, *args, **kwargs)
except RuntimeError as err:
except (APIError, APIForbidden) as err:
return api_return_error(message=str(err))
except HassioError:
return api_return_error()
return api_return_error(message="Unknown Error, see logs")
if isinstance(answer, dict):
return api_return_ok(data=answer)
@@ -55,7 +55,7 @@ def api_process_raw(content):
try:
msg_data = await method(api, *args, **kwargs)
msg_type = content
except RuntimeError as err:
except (APIError, APIForbidden) as err:
msg_data = str(err).encode()
msg_type = CONTENT_TYPE_BINARY
except HassioError:
@@ -90,6 +90,6 @@ async def api_validate(schema, request):
try:
data = schema(data)
except vol.Invalid as ex:
raise RuntimeError(humanize_error(data, ex)) from None
raise APIError(humanize_error(data, ex)) from None
return data
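api_process wraps every handler and converts exceptions into the standard JSON result envelope; with this change only APIError/APIForbidden messages reach the client, while any other HassioError yields a generic message. A simplified, self-contained sketch of the decorator, with the api_return_* helpers reduced to plain dicts:

    from functools import wraps

    class HassioError(Exception): ...
    class APIError(HassioError): ...
    class APIForbidden(HassioError): ...

    def api_process(method):
        """Wrap an async handler with ok/error JSON results."""
        @wraps(method)
        async def wrap_api(api, *args, **kwargs):
            try:
                answer = await method(api, *args, **kwargs)
            except (APIError, APIForbidden) as err:
                return {"result": "error", "message": str(err)}
            except HassioError:
                return {"result": "error", "message": "Unknown Error, see logs"}
            if isinstance(answer, dict):
                return {"result": "ok", "data": answer}
            return {"result": "ok"}
        return wrap_api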

hassio/arch.json (new file, 49 lines)

@@ -0,0 +1,49 @@
{
"raspberrypi": [
"armhf"
],
"raspberrypi2": [
"armv7",
"armhf"
],
"raspberrypi3": [
"armv7",
"armhf"
],
"raspberrypi3-64": [
"aarch64",
"armv7",
"armhf"
],
"tinker": [
"armv7",
"armhf"
],
"odroid-c2": [
"aarch64"
],
"odroid-xu": [
"armv7",
"armhf"
],
"orangepi-prime": [
"aarch64"
],
"qemux86": [
"i386"
],
"qemux86-64": [
"amd64",
"i386"
],
"qemuarm": [
"armhf"
],
"qemuarm-64": [
"aarch64"
],
"intel-nuc": [
"amd64",
"i386"
]
}

hassio/arch.py (new file, 65 lines)

@@ -0,0 +1,65 @@
"""Handle Arch for underlay maschine/platforms."""
import logging
from typing import List
from pathlib import Path
from .coresys import CoreSysAttributes, CoreSys
from .exceptions import HassioArchNotFound, JsonFileError
from .utils.json import read_json_file
_LOGGER = logging.getLogger(__name__)
class CpuArch(CoreSysAttributes):
"""Manage available architectures."""
def __init__(self, coresys: CoreSys) -> None:
"""Initialize CPU Architecture handler."""
self.coresys = coresys
self._supported_arch: List[str] = []
self._default_arch: str
@property
def default(self) -> str:
"""Return system default arch."""
return self._default_arch
@property
def supervisor(self) -> str:
"""Return supervisor arch."""
return self.sys_supervisor.arch
@property
def supported(self) -> List[str]:
"""Return support arch by CPU/Machine."""
return self._supported_arch
async def load(self) -> None:
"""Load data and initialize default arch."""
try:
arch_data = read_json_file(Path(__file__).parent.joinpath("arch.json"))
except JsonFileError:
_LOGGER.warning("Can't read arch json")
return
# Evaluate current CPU/Platform
if not self.sys_machine or self.sys_machine not in arch_data:
_LOGGER.warning("Can't detect underlay machine type!")
self._default_arch = self.sys_supervisor.arch
self._supported_arch.append(self.default)
return
# Use configs from arch.json
self._supported_arch.extend(arch_data[self.sys_machine])
self._default_arch = self.supported[0]
def is_supported(self, arch_list: List[str]) -> bool:
"""Return True if there is a supported arch by this platform."""
return not set(self.supported).isdisjoint(set(arch_list))
def match(self, arch_list: List[str]) -> str:
"""Return best match for this CPU/Platform."""
for self_arch in self.supported:
if self_arch in arch_list:
return self_arch
raise HassioArchNotFound()
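CpuArch loads the machine-to-architecture table from arch.json and exposes is_supported and match on top of it. A minimal usage sketch with the table hard-coded for a raspberrypi3-64-style machine (no CoreSys wiring):

    from typing import List

    SUPPORTED = ["aarch64", "armv7", "armhf"]  # e.g. the arch.json entry for raspberrypi3-64

    def is_supported(arch_list: List[str]) -> bool:
        """Return True if any requested arch is supported on this machine."""
        return not set(SUPPORTED).isdisjoint(arch_list)

    def match(arch_list: List[str]) -> str:
        """Return the best (first listed) supported arch, like CpuArch.match."""
        for arch in SUPPORTED:
            if arch in arch_list:
                return arch
        raise LookupError("No supported arch found")

    print(match(["armv7", "amd64"]))  # -> armv7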

hassio/auth.py (new file, 95 lines)

@@ -0,0 +1,95 @@
"""Manage SSO for Add-ons with Home Assistant user."""
import logging
import hashlib
from .const import (
FILE_HASSIO_AUTH, ATTR_PASSWORD, ATTR_USERNAME, ATTR_ADDON)
from .coresys import CoreSysAttributes
from .utils.json import JsonConfig
from .validate import SCHEMA_AUTH_CONFIG
from .exceptions import AuthError, HomeAssistantAPIError
_LOGGER = logging.getLogger(__name__)
class Auth(JsonConfig, CoreSysAttributes):
"""Manage SSO for Add-ons with Home Assistant user."""
def __init__(self, coresys):
"""Initialize updater."""
super().__init__(FILE_HASSIO_AUTH, SCHEMA_AUTH_CONFIG)
self.coresys = coresys
def _check_cache(self, username, password):
"""Check password in cache."""
username_h = _rehash(username)
password_h = _rehash(password, username)
if self._data.get(username_h) == password_h:
_LOGGER.info("Cache hit for %s", username)
return True
_LOGGER.warning("No cache hit for %s", username)
return False
def _update_cache(self, username, password):
"""Cache a username, password."""
username_h = _rehash(username)
password_h = _rehash(password, username)
if self._data.get(username_h) == password_h:
return
self._data[username_h] = password_h
self.save_data()
def _dismatch_cache(self, username, password):
"""Remove user from cache."""
username_h = _rehash(username)
password_h = _rehash(password, username)
if self._data.get(username_h) != password_h:
return
self._data.pop(username_h, None)
self.save_data()
async def check_login(self, addon, username, password):
"""Check username login."""
if password is None:
_LOGGER.error("None as password is not supported!")
raise AuthError()
_LOGGER.info("Auth request from %s for %s", addon.slug, username)
# Check API state
if not await self.sys_homeassistant.check_api_state():
_LOGGER.info("Home Assistant not running, check cache")
return self._check_cache(username, password)
try:
async with self.sys_homeassistant.make_request(
'post', 'api/hassio_auth', json={
ATTR_USERNAME: username,
ATTR_PASSWORD: password,
ATTR_ADDON: addon.slug,
}) as req:
if req.status == 200:
_LOGGER.info("Success login from %s", username)
self._update_cache(username, password)
return True
_LOGGER.warning("Wrong login from %s", username)
self._dismatch_cache(username, password)
return False
except HomeAssistantAPIError:
_LOGGER.error("Can't request auth on Home Assistant!")
raise AuthError()
def _rehash(value, salt2=""):
"""Rehash a value."""
for idx in range(1, 20):
value = hashlib.sha256(f"{value}{idx}{salt2}".encode()).hexdigest()
return value
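The login cache never stores plain credentials: _rehash stretches a value through 19 rounds of SHA-256, salted with the round index and the username, and only those digests are compared. A stand-alone sketch of the scheme (the example credentials are made up):

    import hashlib

    def rehash(value: str, salt2: str = "") -> str:
        """Iteratively hash a value, mirroring auth._rehash above."""
        for idx in range(1, 20):
            value = hashlib.sha256(f"{value}{idx}{salt2}".encode()).hexdigest()
        return value

    # the cache compares digests only, never plain passwords
    cache = {rehash("paulus"): rehash("secret", "paulus")}
    assert cache.get(rehash("paulus")) == rehash("secret", "paulus")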


@@ -1,43 +1,47 @@
"""Bootstrap HassIO."""
"""Bootstrap Hass.io."""
import logging
import os
import signal
import shutil
from pathlib import Path
import shutil
import signal
from colorlog import ColoredFormatter
from .core import HassIO
from .addons import AddonManager
from .api import RestAPI
from .arch import CpuArch
from .auth import Auth
from .const import SOCKET_DOCKER
from .core import HassIO
from .coresys import CoreSys
from .supervisor import Supervisor
from .dbus import DBusManager
from .discovery import Discovery
from .hassos import HassOS
from .homeassistant import HomeAssistant
from .host import HostManager
from .services import ServiceManager
from .snapshots import SnapshotManager
from .supervisor import Supervisor
from .tasks import Tasks
from .updater import Updater
from .services import ServiceManager
from .services import Discovery
from .host import HostManager
from .dbus import DBusManager
from .hassos import HassOS
_LOGGER = logging.getLogger(__name__)
ENV_SHARE = 'SUPERVISOR_SHARE'
ENV_NAME = 'SUPERVISOR_NAME'
ENV_REPO = 'HOMEASSISTANT_REPOSITORY'
ENV_SHARE = "SUPERVISOR_SHARE"
ENV_NAME = "SUPERVISOR_NAME"
ENV_REPO = "HOMEASSISTANT_REPOSITORY"
MACHINE_ID = Path('/etc/machine-id')
MACHINE_ID = Path("/etc/machine-id")
def initialize_coresys(loop):
async def initialize_coresys():
"""Initialize HassIO coresys/objects."""
coresys = CoreSys(loop)
coresys = CoreSys()
# Initialize core objects
coresys.core = HassIO(coresys)
coresys.arch = CpuArch(coresys)
coresys.auth = Auth(coresys)
coresys.updater = Updater(coresys)
coresys.api = RestAPI(coresys)
coresys.supervisor = Supervisor(coresys)
@@ -62,54 +66,54 @@ def initialize_coresys(loop):
def initialize_system_data(coresys):
"""Setup default config and create folders."""
"""Set up the default configuration and create folders."""
config = coresys.config
# homeassistant config folder
if not config.path_config.is_dir():
_LOGGER.info(
"Create Home-Assistant config folder %s", config.path_config)
config.path_config.mkdir()
# Home Assistant configuration folder
if not config.path_homeassistant.is_dir():
_LOGGER.info("Create Home Assistant configuration folder %s",
config.path_homeassistant)
config.path_homeassistant.mkdir()
# hassio ssl folder
if not config.path_ssl.is_dir():
_LOGGER.info("Create hassio ssl folder %s", config.path_ssl)
_LOGGER.info("Create Hass.io SSL/TLS folder %s", config.path_ssl)
config.path_ssl.mkdir()
# hassio addon data folder
if not config.path_addons_data.is_dir():
_LOGGER.info(
"Create hassio addon data folder %s", config.path_addons_data)
_LOGGER.info("Create Hass.io Add-on data folder %s",
config.path_addons_data)
config.path_addons_data.mkdir(parents=True)
if not config.path_addons_local.is_dir():
_LOGGER.info("Create hassio addon local repository folder %s",
_LOGGER.info("Create Hass.io Add-on local repository folder %s",
config.path_addons_local)
config.path_addons_local.mkdir(parents=True)
if not config.path_addons_git.is_dir():
_LOGGER.info("Create hassio addon git repositories folder %s",
_LOGGER.info("Create Hass.io Add-on git repositories folder %s",
config.path_addons_git)
config.path_addons_git.mkdir(parents=True)
# hassio tmp folder
if not config.path_tmp.is_dir():
_LOGGER.info("Create hassio temp folder %s", config.path_tmp)
_LOGGER.info("Create Hass.io temp folder %s", config.path_tmp)
config.path_tmp.mkdir(parents=True)
# hassio backup folder
if not config.path_backup.is_dir():
_LOGGER.info("Create hassio backup folder %s", config.path_backup)
_LOGGER.info("Create Hass.io backup folder %s", config.path_backup)
config.path_backup.mkdir()
# share folder
if not config.path_share.is_dir():
_LOGGER.info("Create hassio share folder %s", config.path_share)
_LOGGER.info("Create Hass.io share folder %s", config.path_share)
config.path_share.mkdir()
# apparmor folder
if not config.path_apparmor.is_dir():
_LOGGER.info("Create hassio apparmor folder %s", config.path_apparmor)
_LOGGER.info("Create Hass.io Apparmor folder %s", config.path_apparmor)
config.path_apparmor.mkdir()
return config
@@ -125,32 +129,32 @@ def migrate_system_env(coresys):
try:
old_build.rmdir()
except OSError:
_LOGGER.warning("Can't cleanup old addons build dir.")
_LOGGER.warning("Can't cleanup old Add-on build directory")
def initialize_logging():
"""Setup the logging."""
logging.basicConfig(level=logging.INFO)
fmt = ("%(asctime)s %(levelname)s (%(threadName)s) "
"[%(name)s] %(message)s")
colorfmt = "%(log_color)s{}%(reset)s".format(fmt)
datefmt = '%y-%m-%d %H:%M:%S'
fmt = "%(asctime)s %(levelname)s (%(threadName)s) [%(name)s] %(message)s"
colorfmt = f"%(log_color)s{fmt}%(reset)s"
datefmt = "%y-%m-%d %H:%M:%S"
# suppress overly verbose logs from libraries that aren't helpful
logging.getLogger("aiohttp.access").setLevel(logging.WARNING)
logging.getLogger().handlers[0].setFormatter(ColoredFormatter(
colorfmt,
datefmt=datefmt,
reset=True,
log_colors={
'DEBUG': 'cyan',
'INFO': 'green',
'WARNING': 'yellow',
'ERROR': 'red',
'CRITICAL': 'red',
}
))
logging.getLogger().handlers[0].setFormatter(
ColoredFormatter(
colorfmt,
datefmt=datefmt,
reset=True,
log_colors={
"DEBUG": "cyan",
"INFO": "green",
"WARNING": "yellow",
"ERROR": "red",
"CRITICAL": "red",
},
))
def check_environment():
@@ -165,38 +169,38 @@ def check_environment():
# check docker socket
if not SOCKET_DOCKER.is_socket():
_LOGGER.fatal("Can't find docker socket!")
_LOGGER.fatal("Can't find Docker socket!")
return False
# check socat exec
if not shutil.which('socat'):
_LOGGER.fatal("Can't find socat program!")
if not shutil.which("socat"):
_LOGGER.fatal("Can't find socat!")
return False
# check socat exec
if not shutil.which('gdbus'):
_LOGGER.fatal("Can't find gdbus program!")
if not shutil.which("gdbus"):
_LOGGER.fatal("Can't find gdbus!")
return False
return True
def reg_signal(loop):
"""Register SIGTERM, SIGKILL to stop system."""
"""Register SIGTERM and SIGKILL to stop system."""
try:
loop.add_signal_handler(
signal.SIGTERM, lambda: loop.call_soon(loop.stop))
loop.add_signal_handler(signal.SIGTERM,
lambda: loop.call_soon(loop.stop))
except (ValueError, RuntimeError):
_LOGGER.warning("Could not bind to SIGTERM")
try:
loop.add_signal_handler(
signal.SIGHUP, lambda: loop.call_soon(loop.stop))
loop.add_signal_handler(signal.SIGHUP,
lambda: loop.call_soon(loop.stop))
except (ValueError, RuntimeError):
_LOGGER.warning("Could not bind to SIGHUP")
try:
loop.add_signal_handler(
signal.SIGINT, lambda: loop.call_soon(loop.stop))
loop.add_signal_handler(signal.SIGINT,
lambda: loop.call_soon(loop.stop))
except (ValueError, RuntimeError):
_LOGGER.warning("Could not bind to SIGINT")


@@ -1,9 +1,12 @@
"""Bootstrap HassIO."""
"""Bootstrap Hass.io."""
from datetime import datetime
import logging
import os
import re
from pathlib import Path, PurePath
import pytz
from .const import (
FILE_HASSIO_CONFIG, HASSIO_DATA, ATTR_TIMEZONE, ATTR_ADDONS_CUSTOM_LIST,
ATTR_LAST_BOOT, ATTR_WAIT_BOOT)
@@ -13,7 +16,7 @@ from .validate import SCHEMA_HASSIO_CONFIG
_LOGGER = logging.getLogger(__name__)
HOMEASSISTANT_CONFIG = PurePath("homeassistant")
HOMEASSISTANT_CONFIG = PurePath('homeassistant')
HASSIO_SSL = PurePath("ssl")
@@ -29,6 +32,8 @@ APPARMOR_DATA = PurePath("apparmor")
DEFAULT_BOOT_TIME = datetime.utcfromtimestamp(0).isoformat()
RE_TIMEZONE = re.compile(r"time_zone: (?P<timezone>[\w/\-+]+)")
class CoreConfig(JsonConfig):
"""Hold all core config data."""
@@ -40,7 +45,21 @@ class CoreConfig(JsonConfig):
@property
def timezone(self):
"""Return system timezone."""
return self._data[ATTR_TIMEZONE]
config_file = Path(self.path_homeassistant, 'configuration.yaml')
try:
assert config_file.exists()
configuration = config_file.read_text()
data = RE_TIMEZONE.search(configuration)
assert data
timezone = data.group('timezone')
pytz.timezone(timezone)
except (pytz.exceptions.UnknownTimeZoneError, OSError, AssertionError):
_LOGGER.debug("Can't parse Home Assistant timezone")
return self._data[ATTR_TIMEZONE]
return timezone
@timezone.setter
def timezone(self, value):
@@ -74,27 +93,27 @@ class CoreConfig(JsonConfig):
@property
def path_hassio(self):
"""Return hassio data path."""
"""Return Hass.io data path."""
return HASSIO_DATA
@property
def path_extern_hassio(self):
"""Return hassio data path extern for docker."""
"""Return Hass.io data path external for Docker."""
return PurePath(os.environ['SUPERVISOR_SHARE'])
@property
def path_extern_config(self):
"""Return config path extern for docker."""
def path_extern_homeassistant(self):
"""Return config path external for Docker."""
return str(PurePath(self.path_extern_hassio, HOMEASSISTANT_CONFIG))
@property
def path_config(self):
def path_homeassistant(self):
"""Return config path inside supervisor."""
return Path(HASSIO_DATA, HOMEASSISTANT_CONFIG)
@property
def path_extern_ssl(self):
"""Return SSL path extern for docker."""
"""Return SSL path external for Docker."""
return str(PurePath(self.path_extern_hassio, HASSIO_SSL))
@property
@@ -104,42 +123,42 @@ class CoreConfig(JsonConfig):
@property
def path_addons_core(self):
"""Return git path for core addons."""
"""Return git path for core Add-ons."""
return Path(HASSIO_DATA, ADDONS_CORE)
@property
def path_addons_git(self):
"""Return path for git addons."""
"""Return path for Git Add-on."""
return Path(HASSIO_DATA, ADDONS_GIT)
@property
def path_addons_local(self):
"""Return path for customs addons."""
"""Return path for custom Add-ons."""
return Path(HASSIO_DATA, ADDONS_LOCAL)
@property
def path_extern_addons_local(self):
"""Return path for customs addons."""
"""Return path for custom Add-ons."""
return PurePath(self.path_extern_hassio, ADDONS_LOCAL)
@property
def path_addons_data(self):
"""Return root addon data folder."""
"""Return root Add-on data folder."""
return Path(HASSIO_DATA, ADDONS_DATA)
@property
def path_extern_addons_data(self):
"""Return root addon data folder extern for docker."""
"""Return root add-on data folder external for Docker."""
return PurePath(self.path_extern_hassio, ADDONS_DATA)
@property
def path_tmp(self):
"""Return hass.io temp folder."""
"""Return Hass.io temp folder."""
return Path(HASSIO_DATA, TMP_DATA)
@property
def path_extern_tmp(self):
"""Return hass.io temp folder for docker."""
"""Return Hass.io temp folder for Docker."""
return PurePath(self.path_extern_hassio, TMP_DATA)
@property
@@ -149,7 +168,7 @@ class CoreConfig(JsonConfig):
@property
def path_extern_backup(self):
"""Return root backup data folder extern for docker."""
"""Return root backup data folder external for Docker."""
return PurePath(self.path_extern_hassio, BACKUP_DATA)
@property
@@ -159,17 +178,17 @@ class CoreConfig(JsonConfig):
@property
def path_apparmor(self):
"""Return root apparmor profile folder."""
"""Return root Apparmor profile folder."""
return Path(HASSIO_DATA, APPARMOR_DATA)
@property
def path_extern_share(self):
"""Return root share data folder extern for docker."""
"""Return root share data folder external for Docker."""
return PurePath(self.path_extern_hassio, SHARE_DATA)
@property
def addons_repositories(self):
"""Return list of addons custom repositories."""
"""Return list of custom Add-on repositories."""
return self._data[ATTR_ADDONS_CUSTOM_LIST]
def add_addon_repository(self, repo):
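The timezone property now prefers the value found in Home Assistant's own configuration.yaml, matched with RE_TIMEZONE and validated with pytz, and falls back to the stored setting when parsing fails. A small sketch of that extraction, assuming the configuration text is already in memory:

    import re
    import pytz

    RE_TIMEZONE = re.compile(r"time_zone: (?P<timezone>[\w/\-+]+)")
    FALLBACK = "UTC"  # stands in for self._data[ATTR_TIMEZONE]

    def parse_timezone(configuration: str) -> str:
        """Return the time_zone from configuration.yaml text, or the fallback."""
        match = RE_TIMEZONE.search(configuration)
        if not match:
            return FALLBACK
        timezone = match.group("timezone")
        try:
            pytz.timezone(timezone)  # validate the zone name
        except pytz.exceptions.UnknownTimeZoneError:
            return FALLBACK
        return timezone

    print(parse_timezone("homeassistant:\n  time_zone: Europe/Amsterdam"))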


@@ -1,233 +1,292 @@
"""Const file for HassIO."""
"""Constants file for Hass.io."""
from pathlib import Path
from ipaddress import ip_network
HASSIO_VERSION = '121'
HASSIO_VERSION = "150"
URL_HASSIO_ADDONS = "https://github.com/home-assistant/hassio-addons"
URL_HASSIO_VERSION = \
"https://s3.amazonaws.com/hassio-version/{channel}.json"
URL_HASSIO_APPARMOR = \
"https://s3.amazonaws.com/hassio-version/apparmor.txt"
URL_HASSIO_VERSION = "https://s3.amazonaws.com/hassio-version/{channel}.json"
URL_HASSIO_APPARMOR = "https://s3.amazonaws.com/hassio-version/apparmor.txt"
URL_HASSOS_OTA = (
"https://github.com/home-assistant/hassos/releases/download/"
"{version}/hassos_{board}-{version}.raucb")
"{version}/hassos_{board}-{version}.raucb"
)
HASSIO_DATA = Path("/data")
FILE_HASSIO_AUTH = Path(HASSIO_DATA, "auth.json")
FILE_HASSIO_ADDONS = Path(HASSIO_DATA, "addons.json")
FILE_HASSIO_CONFIG = Path(HASSIO_DATA, "config.json")
FILE_HASSIO_HOMEASSISTANT = Path(HASSIO_DATA, "homeassistant.json")
FILE_HASSIO_UPDATER = Path(HASSIO_DATA, "updater.json")
FILE_HASSIO_SERVICES = Path(HASSIO_DATA, "services.json")
FILE_HASSIO_DISCOVERY = Path(HASSIO_DATA, "discovery.json")
SOCKET_DOCKER = Path("/var/run/docker.sock")
DOCKER_NETWORK = 'hassio'
DOCKER_NETWORK_MASK = ip_network('172.30.32.0/23')
DOCKER_NETWORK_RANGE = ip_network('172.30.33.0/24')
DOCKER_NETWORK = "hassio"
DOCKER_NETWORK_MASK = ip_network("172.30.32.0/23")
DOCKER_NETWORK_RANGE = ip_network("172.30.33.0/24")
LABEL_VERSION = 'io.hass.version'
LABEL_ARCH = 'io.hass.arch'
LABEL_TYPE = 'io.hass.type'
LABEL_MACHINE = 'io.hass.machine'
LABEL_VERSION = "io.hass.version"
LABEL_ARCH = "io.hass.arch"
LABEL_TYPE = "io.hass.type"
LABEL_MACHINE = "io.hass.machine"
META_ADDON = 'addon'
META_SUPERVISOR = 'supervisor'
META_HOMEASSISTANT = 'homeassistant'
META_ADDON = "addon"
META_SUPERVISOR = "supervisor"
META_HOMEASSISTANT = "homeassistant"
JSON_RESULT = 'result'
JSON_DATA = 'data'
JSON_MESSAGE = 'message'
JSON_RESULT = "result"
JSON_DATA = "data"
JSON_MESSAGE = "message"
RESULT_ERROR = 'error'
RESULT_OK = 'ok'
RESULT_ERROR = "error"
RESULT_OK = "ok"
CONTENT_TYPE_BINARY = 'application/octet-stream'
CONTENT_TYPE_PNG = 'image/png'
CONTENT_TYPE_JSON = 'application/json'
CONTENT_TYPE_TEXT = 'text/plain'
CONTENT_TYPE_TAR = 'application/tar'
HEADER_HA_ACCESS = 'x-ha-access'
HEADER_TOKEN = 'x-hassio-key'
CONTENT_TYPE_BINARY = "application/octet-stream"
CONTENT_TYPE_PNG = "image/png"
CONTENT_TYPE_JSON = "application/json"
CONTENT_TYPE_TEXT = "text/plain"
CONTENT_TYPE_TAR = "application/tar"
CONTENT_TYPE_URL = "application/x-www-form-urlencoded"
HEADER_HA_ACCESS = "x-ha-access"
HEADER_TOKEN = "x-hassio-key"
ENV_TOKEN = 'HASSIO_TOKEN'
ENV_TIME = 'TZ'
ENV_TOKEN = "HASSIO_TOKEN"
ENV_TIME = "TZ"
REQUEST_FROM = 'HASSIO_FROM'
REQUEST_FROM = "HASSIO_FROM"
ATTR_MACHINE = 'machine'
ATTR_WAIT_BOOT = 'wait_boot'
ATTR_DEPLOYMENT = 'deployment'
ATTR_WATCHDOG = 'watchdog'
ATTR_CHANGELOG = 'changelog'
ATTR_DATE = 'date'
ATTR_ARCH = 'arch'
ATTR_LONG_DESCRIPTION = 'long_description'
ATTR_HOSTNAME = 'hostname'
ATTR_TIMEZONE = 'timezone'
ATTR_ARGS = 'args'
ATTR_OPERATING_SYSTEM = 'operating_system'
ATTR_CHASSIS = 'chassis'
ATTR_TYPE = 'type'
ATTR_SOURCE = 'source'
ATTR_FEATURES = 'features'
ATTR_ADDONS = 'addons'
ATTR_VERSION = 'version'
ATTR_VERSION_LATEST = 'version_latest'
ATTR_AUTO_UART = 'auto_uart'
ATTR_LAST_BOOT = 'last_boot'
ATTR_LAST_VERSION = 'last_version'
ATTR_CHANNEL = 'channel'
ATTR_NAME = 'name'
ATTR_SLUG = 'slug'
ATTR_DESCRIPTON = 'description'
ATTR_STARTUP = 'startup'
ATTR_BOOT = 'boot'
ATTR_PORTS = 'ports'
ATTR_PORT = 'port'
ATTR_SSL = 'ssl'
ATTR_MAP = 'map'
ATTR_WEBUI = 'webui'
ATTR_OPTIONS = 'options'
ATTR_INSTALLED = 'installed'
ATTR_DETACHED = 'detached'
ATTR_STATE = 'state'
ATTR_SCHEMA = 'schema'
ATTR_IMAGE = 'image'
ATTR_ICON = 'icon'
ATTR_LOGO = 'logo'
ATTR_STDIN = 'stdin'
ATTR_ADDONS_REPOSITORIES = 'addons_repositories'
ATTR_REPOSITORY = 'repository'
ATTR_REPOSITORIES = 'repositories'
ATTR_URL = 'url'
ATTR_MAINTAINER = 'maintainer'
ATTR_PASSWORD = 'password'
ATTR_TOTP = 'totp'
ATTR_INITIALIZE = 'initialize'
ATTR_SESSION = 'session'
ATTR_SESSIONS = 'sessions'
ATTR_LOCATON = 'location'
ATTR_BUILD = 'build'
ATTR_DEVICES = 'devices'
ATTR_ENVIRONMENT = 'environment'
ATTR_HOST_NETWORK = 'host_network'
ATTR_HOST_IPC = 'host_ipc'
ATTR_HOST_DBUS = 'host_dbus'
ATTR_NETWORK = 'network'
ATTR_TMPFS = 'tmpfs'
ATTR_PRIVILEGED = 'privileged'
ATTR_USER = 'user'
ATTR_SYSTEM = 'system'
ATTR_SNAPSHOTS = 'snapshots'
ATTR_HOMEASSISTANT = 'homeassistant'
ATTR_HASSIO = 'hassio'
ATTR_HASSIO_API = 'hassio_api'
ATTR_HOMEASSISTANT_API = 'homeassistant_api'
ATTR_UUID = 'uuid'
ATTR_FOLDERS = 'folders'
ATTR_SIZE = 'size'
ATTR_TYPE = 'type'
ATTR_TIMEOUT = 'timeout'
ATTR_AUTO_UPDATE = 'auto_update'
ATTR_CUSTOM = 'custom'
ATTR_AUDIO = 'audio'
ATTR_AUDIO_INPUT = 'audio_input'
ATTR_AUDIO_OUTPUT = 'audio_output'
ATTR_INPUT = 'input'
ATTR_OUTPUT = 'output'
ATTR_DISK = 'disk'
ATTR_SERIAL = 'serial'
ATTR_SECURITY = 'security'
ATTR_BUILD_FROM = 'build_from'
ATTR_SQUASH = 'squash'
ATTR_GPIO = 'gpio'
ATTR_LEGACY = 'legacy'
ATTR_ADDONS_CUSTOM_LIST = 'addons_custom_list'
ATTR_CPU_PERCENT = 'cpu_percent'
ATTR_NETWORK_RX = 'network_rx'
ATTR_NETWORK_TX = 'network_tx'
ATTR_MEMORY_LIMIT = 'memory_limit'
ATTR_MEMORY_USAGE = 'memory_usage'
ATTR_BLK_READ = 'blk_read'
ATTR_BLK_WRITE = 'blk_write'
ATTR_PROVIDER = 'provider'
ATTR_AVAILABLE = 'available'
ATTR_HOST = 'host'
ATTR_USERNAME = 'username'
ATTR_PROTOCOL = 'protocol'
ATTR_DISCOVERY = 'discovery'
ATTR_PLATFORM = 'platform'
ATTR_COMPONENT = 'component'
ATTR_CONFIG = 'config'
ATTR_DISCOVERY_ID = 'discovery_id'
ATTR_SERVICES = 'services'
ATTR_DISCOVERY = 'discovery'
ATTR_PROTECTED = 'protected'
ATTR_CRYPTO = 'crypto'
ATTR_BRANCH = 'branch'
ATTR_KERNEL = 'kernel'
ATTR_APPARMOR = 'apparmor'
ATTR_DEVICETREE = 'devicetree'
ATTR_CPE = 'cpe'
ATTR_BOARD = 'board'
ATTR_HASSOS = 'hassos'
ATTR_HASSOS_CLI = 'hassos_cli'
ATTR_VERSION_CLI = 'version_cli'
ATTR_VERSION_CLI_LATEST = 'version_cli_latest'
ATTR_REFRESH_TOKEN = 'refresh_token'
ATTR_MACHINE = "machine"
ATTR_WAIT_BOOT = "wait_boot"
ATTR_DEPLOYMENT = "deployment"
ATTR_WATCHDOG = "watchdog"
ATTR_CHANGELOG = "changelog"
ATTR_DATE = "date"
ATTR_ARCH = "arch"
ATTR_LONG_DESCRIPTION = "long_description"
ATTR_HOSTNAME = "hostname"
ATTR_TIMEZONE = "timezone"
ATTR_ARGS = "args"
ATTR_OPERATING_SYSTEM = "operating_system"
ATTR_CHASSIS = "chassis"
ATTR_TYPE = "type"
ATTR_SOURCE = "source"
ATTR_FEATURES = "features"
ATTR_ADDONS = "addons"
ATTR_PROVIDERS = "providers"
ATTR_VERSION = "version"
ATTR_VERSION_LATEST = "version_latest"
ATTR_AUTO_UART = "auto_uart"
ATTR_LAST_BOOT = "last_boot"
ATTR_LAST_VERSION = "last_version"
ATTR_CHANNEL = "channel"
ATTR_NAME = "name"
ATTR_SLUG = "slug"
ATTR_DESCRIPTON = "description"
ATTR_STARTUP = "startup"
ATTR_BOOT = "boot"
ATTR_PORTS = "ports"
ATTR_PORT = "port"
ATTR_SSL = "ssl"
ATTR_MAP = "map"
ATTR_WEBUI = "webui"
ATTR_OPTIONS = "options"
ATTR_INSTALLED = "installed"
ATTR_DETACHED = "detached"
ATTR_STATE = "state"
ATTR_SCHEMA = "schema"
ATTR_IMAGE = "image"
ATTR_ICON = "icon"
ATTR_LOGO = "logo"
ATTR_STDIN = "stdin"
ATTR_ADDONS_REPOSITORIES = "addons_repositories"
ATTR_REPOSITORY = "repository"
ATTR_REPOSITORIES = "repositories"
ATTR_URL = "url"
ATTR_MAINTAINER = "maintainer"
ATTR_PASSWORD = "password"
ATTR_TOTP = "totp"
ATTR_INITIALIZE = "initialize"
ATTR_LOCATON = "location"
ATTR_BUILD = "build"
ATTR_DEVICES = "devices"
ATTR_ENVIRONMENT = "environment"
ATTR_HOST_NETWORK = "host_network"
ATTR_HOST_PID = "host_pid"
ATTR_HOST_IPC = "host_ipc"
ATTR_HOST_DBUS = "host_dbus"
ATTR_NETWORK = "network"
ATTR_TMPFS = "tmpfs"
ATTR_PRIVILEGED = "privileged"
ATTR_USER = "user"
ATTR_SYSTEM = "system"
ATTR_SNAPSHOTS = "snapshots"
ATTR_HOMEASSISTANT = "homeassistant"
ATTR_HASSIO = "hassio"
ATTR_HASSIO_API = "hassio_api"
ATTR_HOMEASSISTANT_API = "homeassistant_api"
ATTR_UUID = "uuid"
ATTR_FOLDERS = "folders"
ATTR_SIZE = "size"
ATTR_TYPE = "type"
ATTR_TIMEOUT = "timeout"
ATTR_AUTO_UPDATE = "auto_update"
ATTR_CUSTOM = "custom"
ATTR_AUDIO = "audio"
ATTR_AUDIO_INPUT = "audio_input"
ATTR_AUDIO_OUTPUT = "audio_output"
ATTR_INPUT = "input"
ATTR_OUTPUT = "output"
ATTR_DISK = "disk"
ATTR_SERIAL = "serial"
ATTR_SECURITY = "security"
ATTR_BUILD_FROM = "build_from"
ATTR_SQUASH = "squash"
ATTR_GPIO = "gpio"
ATTR_LEGACY = "legacy"
ATTR_ADDONS_CUSTOM_LIST = "addons_custom_list"
ATTR_CPU_PERCENT = "cpu_percent"
ATTR_NETWORK_RX = "network_rx"
ATTR_NETWORK_TX = "network_tx"
ATTR_MEMORY_LIMIT = "memory_limit"
ATTR_MEMORY_USAGE = "memory_usage"
ATTR_BLK_READ = "blk_read"
ATTR_BLK_WRITE = "blk_write"
ATTR_ADDON = "addon"
ATTR_AVAILABLE = "available"
ATTR_HOST = "host"
ATTR_USERNAME = "username"
ATTR_PROTOCOL = "protocol"
ATTR_DISCOVERY = "discovery"
ATTR_CONFIG = "config"
ATTR_SERVICES = "services"
ATTR_SERVICE = "service"
ATTR_DISCOVERY = "discovery"
ATTR_PROTECTED = "protected"
ATTR_CRYPTO = "crypto"
ATTR_BRANCH = "branch"
ATTR_KERNEL = "kernel"
ATTR_APPARMOR = "apparmor"
ATTR_DEVICETREE = "devicetree"
ATTR_CPE = "cpe"
ATTR_BOARD = "board"
ATTR_HASSOS = "hassos"
ATTR_HASSOS_CLI = "hassos_cli"
ATTR_VERSION_CLI = "version_cli"
ATTR_VERSION_CLI_LATEST = "version_cli_latest"
ATTR_REFRESH_TOKEN = "refresh_token"
ATTR_ACCESS_TOKEN = "access_token"
ATTR_DOCKER_API = "docker_api"
ATTR_FULL_ACCESS = "full_access"
ATTR_PROTECTED = "protected"
ATTR_RATING = "rating"
ATTR_HASSIO_ROLE = "hassio_role"
ATTR_SUPERVISOR = "supervisor"
ATTR_AUTH_API = "auth_api"
ATTR_KERNEL_MODULES = "kernel_modules"
ATTR_SUPPORTED_ARCH = "supported_arch"
SERVICE_MQTT = 'mqtt'
SERVICE_MQTT = "mqtt"
PROVIDE_SERVICE = "provide"
NEED_SERVICE = "need"
WANT_SERVICE = "want"
STARTUP_INITIALIZE = 'initialize'
STARTUP_SYSTEM = 'system'
STARTUP_SERVICES = 'services'
STARTUP_APPLICATION = 'application'
STARTUP_ONCE = 'once'
STARTUP_INITIALIZE = "initialize"
STARTUP_SYSTEM = "system"
STARTUP_SERVICES = "services"
STARTUP_APPLICATION = "application"
STARTUP_ONCE = "once"
BOOT_AUTO = 'auto'
BOOT_MANUAL = 'manual'
STARTUP_ALL = [
STARTUP_ONCE,
STARTUP_INITIALIZE,
STARTUP_SYSTEM,
STARTUP_SERVICES,
STARTUP_APPLICATION,
]
STATE_STARTED = 'started'
STATE_STOPPED = 'stopped'
STATE_NONE = 'none'
BOOT_AUTO = "auto"
BOOT_MANUAL = "manual"
MAP_CONFIG = 'config'
MAP_SSL = 'ssl'
MAP_ADDONS = 'addons'
MAP_BACKUP = 'backup'
MAP_SHARE = 'share'
STATE_STARTED = "started"
STATE_STOPPED = "stopped"
STATE_NONE = "none"
ARCH_ARMHF = 'armhf'
ARCH_AARCH64 = 'aarch64'
ARCH_AMD64 = 'amd64'
ARCH_I386 = 'i386'
MAP_CONFIG = "config"
MAP_SSL = "ssl"
MAP_ADDONS = "addons"
MAP_BACKUP = "backup"
MAP_SHARE = "share"
CHANNEL_STABLE = 'stable'
CHANNEL_BETA = 'beta'
CHANNEL_DEV = 'dev'
ARCH_ARMHF = "armhf"
ARCH_ARMV7 = "armv7"
ARCH_AARCH64 = "aarch64"
ARCH_AMD64 = "amd64"
ARCH_I386 = "i386"
REPOSITORY_CORE = 'core'
REPOSITORY_LOCAL = 'local'
ARCH_ALL = [ARCH_ARMHF, ARCH_ARMV7, ARCH_AARCH64, ARCH_AMD64, ARCH_I386]
FOLDER_HOMEASSISTANT = 'homeassistant'
FOLDER_SHARE = 'share'
FOLDER_ADDONS = 'addons/local'
FOLDER_SSL = 'ssl'
CHANNEL_STABLE = "stable"
CHANNEL_BETA = "beta"
CHANNEL_DEV = "dev"
SNAPSHOT_FULL = 'full'
SNAPSHOT_PARTIAL = 'partial'
REPOSITORY_CORE = "core"
REPOSITORY_LOCAL = "local"
CRYPTO_AES128 = 'aes128'
FOLDER_HOMEASSISTANT = "homeassistant"
FOLDER_SHARE = "share"
FOLDER_ADDONS = "addons/local"
FOLDER_SSL = "ssl"
SECURITY_PROFILE = 'profile'
SECURITY_DEFAULT = 'default'
SECURITY_DISABLE = 'disable'
SNAPSHOT_FULL = "full"
SNAPSHOT_PARTIAL = "partial"
FEATURES_SHUTDOWN = 'shutdown'
FEATURES_REBOOT = 'reboot'
FEATURES_HASSOS = 'hassos'
FEATURES_HOSTNAME = 'hostname'
FEATURES_SERVICES = 'services'
CRYPTO_AES128 = "aes128"
SECURITY_PROFILE = "profile"
SECURITY_DEFAULT = "default"
SECURITY_DISABLE = "disable"
PRIVILEGED_NET_ADMIN = "NET_ADMIN"
PRIVILEGED_SYS_ADMIN = "SYS_ADMIN"
PRIVILEGED_SYS_RAWIO = "SYS_RAWIO"
PRIVILEGED_IPC_LOCK = "IPC_LOCK"
PRIVILEGED_SYS_TIME = "SYS_TIME"
PRIVILEGED_SYS_NICE = "SYS_NICE"
PRIVILEGED_SYS_MODULE = "SYS_MODULE"
PRIVILEGED_SYS_RESOURCE = "SYS_RESOURCE"
PRIVILEGED_SYS_PTRACE = "SYS_PTRACE"
PRIVILEGED_DAC_READ_SEARCH = "DAC_READ_SEARCH"
PRIVILEGED_ALL = [
PRIVILEGED_NET_ADMIN,
PRIVILEGED_SYS_ADMIN,
PRIVILEGED_SYS_RAWIO,
PRIVILEGED_IPC_LOCK,
PRIVILEGED_SYS_TIME,
PRIVILEGED_SYS_NICE,
PRIVILEGED_SYS_RESOURCE,
PRIVILEGED_SYS_PTRACE,
PRIVILEGED_SYS_MODULE,
PRIVILEGED_DAC_READ_SEARCH,
]
FEATURES_SHUTDOWN = "shutdown"
FEATURES_REBOOT = "reboot"
FEATURES_HASSOS = "hassos"
FEATURES_HOSTNAME = "hostname"
FEATURES_SERVICES = "services"
ROLE_DEFAULT = "default"
ROLE_HOMEASSISTANT = "homeassistant"
ROLE_BACKUP = "backup"
ROLE_MANAGER = "manager"
ROLE_ADMIN = "admin"
ROLE_ALL = [ROLE_DEFAULT, ROLE_HOMEASSISTANT, ROLE_BACKUP, ROLE_MANAGER, ROLE_ADMIN]
CHAN_ID = "chan_id"
CHAN_TYPE = "chan_type"


@@ -1,4 +1,4 @@
"""Main file for HassIO."""
"""Main file for Hass.io."""
from contextlib import suppress
import asyncio
import logging
@@ -6,18 +6,18 @@ import logging
import async_timeout
from .coresys import CoreSysAttributes
from .const import (
STARTUP_SYSTEM, STARTUP_SERVICES, STARTUP_APPLICATION, STARTUP_INITIALIZE)
from .const import (STARTUP_SYSTEM, STARTUP_SERVICES, STARTUP_APPLICATION,
STARTUP_INITIALIZE)
from .exceptions import HassioError, HomeAssistantError
_LOGGER = logging.getLogger(__name__)
class HassIO(CoreSysAttributes):
"""Main object of hassio."""
"""Main object of Hass.io."""
def __init__(self, coresys):
"""Initialize hassio object."""
"""Initialize Hass.io object."""
self.coresys = coresys
async def setup(self):
@@ -31,12 +31,15 @@ class HassIO(CoreSysAttributes):
# Load Host
await self.sys_host.load()
# Load HassOS
await self.sys_hassos.load()
# Load Home Assistant
await self.sys_homeassistant.load()
# Load CPU/Arch
await self.sys_arch.load()
# Load HassOS
await self.sys_hassos.load()
# Load Add-ons
await self.sys_addons.load()
@@ -52,18 +55,20 @@ class HassIO(CoreSysAttributes):
# load services
await self.sys_services.load()
# Load discovery
await self.sys_discovery.load()
# start dns forwarding
self.sys_create_task(self.sys_dns.start())
async def start(self):
"""Start HassIO orchestration."""
"""Start Hass.io orchestration."""
# on release channel, try update itself
# on dev mode, only read new versions
if not self.sys_dev and self.sys_supervisor.need_update:
if await self.sys_supervisor.update():
if self.sys_supervisor.need_update:
if self.sys_dev:
_LOGGER.warning("Ignore Hass.io updates on dev!")
elif await self.sys_supervisor.update():
return
else:
_LOGGER.info("Ignore Hass.io auto updates on dev channel")
# start api
await self.sys_api.start()


@@ -1,283 +1,455 @@
"""Handle core shared data."""
from __future__ import annotations
import asyncio
from typing import TYPE_CHECKING
import aiohttp
from .const import CHANNEL_DEV
from .config import CoreConfig
from .const import CHANNEL_DEV
from .docker import DockerAPI
from .misc.dns import DNSForward
from .misc.hardware import Hardware
from .misc.scheduler import Scheduler
if TYPE_CHECKING:
from .addons import AddonManager
from .api import RestAPI
from .arch import CpuArch
from .auth import Auth
from .core import HassIO
from .dbus import DBusManager
from .discovery import Discovery
from .hassos import HassOS
from .homeassistant import HomeAssistant
from .host import HostManager
from .services import ServiceManager
from .snapshots import SnapshotManager
from .supervisor import Supervisor
from .tasks import Tasks
from .updater import Updater
class CoreSys:
"""Class that handle all shared data."""
def __init__(self, loop):
def __init__(self):
"""Initialize coresys."""
# Static attributes
self.exit_code = 0
self.machine_id = None
self.machine_id: str = None
# External objects
self._loop = loop
self._websession = aiohttp.ClientSession(loop=loop)
self._websession_ssl = aiohttp.ClientSession(
connector=aiohttp.TCPConnector(verify_ssl=False), loop=loop)
self._loop: asyncio.BaseEventLoop = asyncio.get_running_loop()
self._websession: aiohttp.ClientSession = aiohttp.ClientSession()
self._websession_ssl: aiohttp.ClientSession = aiohttp.ClientSession(
connector=aiohttp.TCPConnector(ssl=False))
# Global objects
self._config = CoreConfig()
self._hardware = Hardware()
self._docker = DockerAPI()
self._scheduler = Scheduler(loop=loop)
self._dns = DNSForward(loop=loop)
self._config: CoreConfig = CoreConfig()
self._hardware: Hardware = Hardware()
self._docker: DockerAPI = DockerAPI()
self._scheduler: Scheduler = Scheduler()
self._dns: DNSForward = DNSForward()
# Internal objects pointers
self._core = None
self._homeassistant = None
self._supervisor = None
self._addons = None
self._api = None
self._updater = None
self._snapshots = None
self._tasks = None
self._host = None
self._dbus = None
self._hassos = None
self._services = None
self._discovery = None
self._core: HassIO = None
self._arch: CpuArch = None
self._auth: Auth = None
self._homeassistant: HomeAssistant = None
self._supervisor: Supervisor = None
self._addons: AddonManager = None
self._api: RestAPI = None
self._updater: Updater = None
self._snapshots: SnapshotManager = None
self._tasks: Tasks = None
self._host: HostManager = None
self._dbus: DBusManager = None
self._hassos: HassOS = None
self._services: ServiceManager = None
self._discovery: Discovery = None
@property
def arch(self):
"""Return running arch of hass.io system."""
if self._supervisor:
return self._supervisor.arch
return None
@property
def machine(self):
"""Return running machine type of hass.io system."""
def machine(self) -> str:
"""Return running machine type of the Hass.io system."""
if self._homeassistant:
return self._homeassistant.machine
return None
@property
def dev(self):
"""Return True if we run dev modus."""
def dev(self) -> str:
"""Return True if we run dev mode."""
return self._updater.channel == CHANNEL_DEV
@property
def loop(self):
def timezone(self) -> str:
"""Return timezone."""
return self._config.timezone
@property
def loop(self) -> asyncio.BaseEventLoop:
"""Return loop object."""
return self._loop
@property
def websession(self):
def websession(self) -> aiohttp.ClientSession:
"""Return websession object."""
return self._websession
@property
def websession_ssl(self):
def websession_ssl(self) -> aiohttp.ClientSession:
"""Return websession object with disabled SSL."""
return self._websession_ssl
@property
def config(self):
def config(self) -> CoreConfig:
"""Return CoreConfig object."""
return self._config
@property
def hardware(self):
def hardware(self) -> Hardware:
"""Return Hardware object."""
return self._hardware
@property
def docker(self):
def docker(self) -> DockerAPI:
"""Return DockerAPI object."""
return self._docker
@property
def scheduler(self):
def scheduler(self) -> Scheduler:
"""Return Scheduler object."""
return self._scheduler
@property
def dns(self):
def dns(self) -> DNSForward:
"""Return DNSForward object."""
return self._dns
@property
def core(self):
def core(self) -> HassIO:
"""Return HassIO object."""
return self._core
@core.setter
def core(self, value):
"""Set a HassIO object."""
def core(self, value: HassIO):
"""Set a Hass.io object."""
if self._core:
raise RuntimeError("HassIO already set!")
raise RuntimeError("Hass.io already set!")
self._core = value
@property
def homeassistant(self):
"""Return HomeAssistant object."""
def arch(self) -> CpuArch:
"""Return CpuArch object."""
return self._arch
@arch.setter
def arch(self, value: CpuArch):
"""Set a CpuArch object."""
if self._arch:
raise RuntimeError("CpuArch already set!")
self._arch = value
@property
def auth(self) -> Auth:
"""Return Auth object."""
return self._auth
@auth.setter
def auth(self, value: Auth):
"""Set a Auth object."""
if self._auth:
raise RuntimeError("Auth already set!")
self._auth = value
@property
def homeassistant(self) -> HomeAssistant:
"""Return Home Assistant object."""
return self._homeassistant
@homeassistant.setter
def homeassistant(self, value):
def homeassistant(self, value: HomeAssistant):
"""Set a HomeAssistant object."""
if self._homeassistant:
raise RuntimeError("HomeAssistant already set!")
raise RuntimeError("Home Assistant already set!")
self._homeassistant = value
@property
def supervisor(self):
def supervisor(self) -> Supervisor:
"""Return Supervisor object."""
return self._supervisor
@supervisor.setter
def supervisor(self, value):
def supervisor(self, value: Supervisor):
"""Set a Supervisor object."""
if self._supervisor:
raise RuntimeError("Supervisor already set!")
self._supervisor = value
@property
def api(self):
def api(self) -> RestAPI:
"""Return API object."""
return self._api
@api.setter
def api(self, value):
def api(self, value: RestAPI):
"""Set an API object."""
if self._api:
raise RuntimeError("API already set!")
self._api = value
@property
def updater(self):
def updater(self) -> Updater:
"""Return Updater object."""
return self._updater
@updater.setter
def updater(self, value):
def updater(self, value: Updater):
"""Set a Updater object."""
if self._updater:
raise RuntimeError("Updater already set!")
self._updater = value
@property
def addons(self):
def addons(self) -> AddonManager:
"""Return AddonManager object."""
return self._addons
@addons.setter
def addons(self, value):
def addons(self, value: AddonManager):
"""Set a AddonManager object."""
if self._addons:
raise RuntimeError("AddonManager already set!")
self._addons = value
@property
def snapshots(self):
def snapshots(self) -> SnapshotManager:
"""Return SnapshotManager object."""
return self._snapshots
@snapshots.setter
def snapshots(self, value):
def snapshots(self, value: SnapshotManager):
"""Set a SnapshotManager object."""
if self._snapshots:
raise RuntimeError("SnapshotsManager already set!")
self._snapshots = value
@property
def tasks(self):
def tasks(self) -> Tasks:
"""Return Tasks object."""
return self._tasks
@tasks.setter
def tasks(self, value):
def tasks(self, value: Tasks):
"""Set a Tasks object."""
if self._tasks:
raise RuntimeError("Tasks already set!")
self._tasks = value
@property
def services(self):
def services(self) -> ServiceManager:
"""Return ServiceManager object."""
return self._services
@services.setter
def services(self, value):
def services(self, value: ServiceManager):
"""Set a ServiceManager object."""
if self._services:
raise RuntimeError("Services already set!")
self._services = value
@property
def discovery(self):
def discovery(self) -> Discovery:
"""Return ServiceManager object."""
return self._discovery
@discovery.setter
def discovery(self, value):
def discovery(self, value: Discovery):
"""Set a Discovery object."""
if self._discovery:
raise RuntimeError("Discovery already set!")
self._discovery = value
@property
def dbus(self):
def dbus(self) -> DBusManager:
"""Return DBusManager object."""
return self._dbus
@dbus.setter
def dbus(self, value):
def dbus(self, value: DBusManager):
"""Set a DBusManager object."""
if self._dbus:
raise RuntimeError("DBusManager already set!")
self._dbus = value
@property
def host(self):
def host(self) -> HostManager:
"""Return HostManager object."""
return self._host
@host.setter
def host(self, value):
def host(self, value: HostManager):
"""Set a HostManager object."""
if self._host:
raise RuntimeError("HostManager already set!")
self._host = value
@property
def hassos(self):
def hassos(self) -> HassOS:
"""Return HassOS object."""
return self._hassos
@hassos.setter
def hassos(self, value):
def hassos(self, value: HassOS):
"""Set a HassOS object."""
if self._hassos:
raise RuntimeError("HassOS already set!")
self._hassos = value
def run_in_executor(self, funct, *args):
"""Wrapper for executor pool."""
return self._loop.run_in_executor(None, funct, *args)
def create_task(self, coroutine):
"""Wrapper for async task."""
return self._loop.create_task(coroutine)
class CoreSysAttributes:
"""Inheret basic CoreSysAttributes."""
coresys = None
def __getattr__(self, name):
"""Mapping to coresys."""
if name.startswith("sys_") and hasattr(self.coresys, name[4:]):
return getattr(self.coresys, name[4:])
raise AttributeError(f"Can't resolve {name} on {self}")
@property
def sys_machine(self) -> str:
"""Return running machine type of the Hass.io system."""
return self.coresys.machine
@property
def sys_dev(self) -> str:
"""Return True if we run dev mode."""
return self.coresys.dev
@property
def sys_timezone(self) -> str:
"""Return timezone."""
return self.coresys.timezone
@property
def sys_machine_id(self) -> str:
"""Return timezone."""
return self.coresys.machine_id
@property
def sys_loop(self) -> asyncio.BaseEventLoop:
"""Return loop object."""
return self.coresys.loop
@property
def sys_websession(self) -> aiohttp.ClientSession:
"""Return websession object."""
return self.coresys.websession
@property
def sys_websession_ssl(self) -> aiohttp.ClientSession:
"""Return websession object with disabled SSL."""
return self.coresys.websession_ssl
@property
def sys_config(self) -> CoreConfig:
"""Return CoreConfig object."""
return self.coresys.config
@property
def sys_hardware(self) -> Hardware:
"""Return Hardware object."""
return self.coresys.hardware
@property
def sys_docker(self) -> DockerAPI:
"""Return DockerAPI object."""
return self.coresys.docker
@property
def sys_scheduler(self) -> Scheduler:
"""Return Scheduler object."""
return self.coresys.scheduler
@property
def sys_dns(self) -> DNSForward:
"""Return DNSForward object."""
return self.coresys.dns
@property
def sys_core(self) -> HassIO:
"""Return HassIO object."""
return self.coresys.core
@property
def sys_arch(self) -> CpuArch:
"""Return CpuArch object."""
return self.coresys.arch
@property
def sys_auth(self) -> Auth:
"""Return Auth object."""
return self.coresys.auth
@property
def sys_homeassistant(self) -> HomeAssistant:
"""Return Home Assistant object."""
return self.coresys.homeassistant
@property
def sys_supervisor(self) -> Supervisor:
"""Return Supervisor object."""
return self.coresys.supervisor
@property
def sys_api(self) -> RestAPI:
"""Return API object."""
return self.coresys.api
@property
def sys_updater(self) -> Updater:
"""Return Updater object."""
return self.coresys.updater
@property
def sys_addons(self) -> AddonManager:
"""Return AddonManager object."""
return self.coresys.addons
@property
def sys_snapshots(self) -> SnapshotManager:
"""Return SnapshotManager object."""
return self.coresys.snapshots
@property
def sys_tasks(self) -> Tasks:
"""Return Tasks object."""
return self.coresys.tasks
@property
def sys_services(self) -> ServiceManager:
"""Return ServiceManager object."""
return self.coresys.services
@property
def sys_discovery(self) -> Discovery:
"""Return ServiceManager object."""
return self.coresys.discovery
@property
def sys_dbus(self) -> DBusManager:
"""Return DBusManager object."""
return self.coresys.dbus
@property
def sys_host(self) -> HostManager:
"""Return HostManager object."""
return self.coresys.host
@property
def sys_hassos(self) -> HassOS:
"""Return HassOS object."""
return self.coresys.hassos
def sys_run_in_executor(self, funct, *args) -> asyncio.Future:
"""Wrapper for executor pool."""
return self.sys_loop.run_in_executor(None, funct, *args)
def sys_create_task(self, coroutine) -> asyncio.Task:
"""Wrapper for async task."""
return self.sys_loop.create_task(coroutine)
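As a quick illustration of the sys_ delegation above, here is a minimal, self-contained sketch (not part of the diff; the Fake* names are invented) of how __getattr__ maps a sys_<name> lookup onto the coresys object:

class FakeCoreSys:
    """Stand-in for CoreSys with two plain attributes."""
    machine = "qemux86-64"
    timezone = "UTC"


class FakeComponent:
    """Mimics CoreSysAttributes: sys_<name> resolves to coresys.<name>."""

    def __init__(self, coresys):
        self.coresys = coresys

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails.
        if name.startswith("sys_") and hasattr(self.coresys, name[4:]):
            return getattr(self.coresys, name[4:])
        raise AttributeError(f"Can't resolve {name} on {self}")


component = FakeComponent(FakeCoreSys())
assert component.sys_machine == "qemux86-64"
assert component.sys_timezone == "UTC"

The explicit sys_* properties defined above add return-type annotations on top of this fallback.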

View File

@@ -1,4 +1,4 @@
"""DBus interface objects."""
"""D-Bus interface objects."""
from .systemd import Systemd
from .hostname import Hostname
@@ -7,10 +7,10 @@ from ..coresys import CoreSysAttributes
class DBusManager(CoreSysAttributes):
"""DBus Interface handler."""
"""A DBus Interface handler."""
def __init__(self, coresys):
"""Initialize DBus Interface."""
"""Initialize D-Bus interface."""
self.coresys = coresys
self._systemd = Systemd()
@@ -19,21 +19,21 @@ class DBusManager(CoreSysAttributes):
@property
def systemd(self):
"""Return Systemd Interface."""
"""Return the systemd interface."""
return self._systemd
@property
def hostname(self):
"""Return hostname Interface."""
"""Return the hostname interface."""
return self._hostname
@property
def rauc(self):
"""Return rauc Interface."""
"""Return the rauc interface."""
return self._rauc
async def load(self):
"""Connect interfaces to dbus."""
"""Connect interfaces to D-Bus."""
await self.systemd.connect()
await self.hostname.connect()
await self.rauc.connect()

View File

@@ -1,4 +1,4 @@
"""DBus interface for hostname."""
"""D-Bus interface for hostname."""
import logging
from .interface import DBusInterface
@@ -13,10 +13,10 @@ DBUS_OBJECT = '/org/freedesktop/hostname1'
class Hostname(DBusInterface):
"""Handle DBus interface for hostname/system."""
"""Handle D-Bus interface for hostname/system."""
async def connect(self):
"""Connect do bus."""
"""Connect to system's D-Bus."""
try:
self.dbus = await DBus.connect(DBUS_NAME, DBUS_OBJECT)
except DBusError:

View File

@@ -1,8 +1,8 @@
"""Interface class for dbus wrappers."""
"""Interface class for D-Bus wrappers."""
class DBusInterface:
"""Handle DBus interface for hostname/system."""
"""Handle D-Bus interface for hostname/system."""
def __init__(self):
"""Initialize systemd."""
@@ -10,9 +10,9 @@ class DBusInterface:
@property
def is_connected(self):
"""Return True, if they is connected to dbus."""
"""Return True, if they is connected to D-Bus."""
return self.dbus is not None
async def connect(self):
"""Connect do bus."""
"""Connect to D-Bus."""
raise NotImplementedError()

View File

@@ -1,4 +1,4 @@
"""DBus interface for rauc."""
"""D-Bus interface for rauc."""
import logging
from .interface import DBusInterface
@@ -13,10 +13,10 @@ DBUS_OBJECT = '/'
class Rauc(DBusInterface):
"""Handle DBus interface for rauc."""
"""Handle D-Bus interface for rauc."""
async def connect(self):
"""Connect do bus."""
"""Connect to D-Bus."""
try:
self.dbus = await DBus.connect(DBUS_NAME, DBUS_OBJECT)
except DBusError:

View File

@@ -1,4 +1,4 @@
"""Interface to Systemd over dbus."""
"""Interface to Systemd over D-Bus."""
import logging
from .interface import DBusInterface
@@ -16,7 +16,7 @@ class Systemd(DBusInterface):
"""Systemd function handler."""
async def connect(self):
"""Connect do bus."""
"""Connect to D-Bus."""
try:
self.dbus = await DBus.connect(DBUS_NAME, DBUS_OBJECT)
except DBusError:

View File

@@ -1,12 +1,12 @@
"""Utils for dbus."""
"""Utils for D-Bus."""
from ..exceptions import DBusNotConnectedError
def dbus_connected(method):
"""Wrapper for check if dbus is connected."""
"""Wrapper for check if D-Bus is connected."""
def wrap_dbus(api, *args, **kwargs):
"""Check if dbus is connected before call a method."""
"""Check if D-Bus is connected before call a method."""
if api.dbus is None:
raise DBusNotConnectedError()
return method(api, *args, **kwargs)
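A hedged sketch of how the dbus_connected guard above is meant to be used on a D-Bus wrapper method; the FakeHostname class, its method, and the plain RuntimeError subclass below are invented stand-ins, not the real hassio/dbus code:

class DBusNotConnectedError(RuntimeError):
    """Stand-in for the real exception raised without a connection."""


def dbus_connected(method):
    """Refuse to call the wrapped method while api.dbus is still None."""
    def wrap_dbus(api, *args, **kwargs):
        if api.dbus is None:
            raise DBusNotConnectedError()
        return method(api, *args, **kwargs)
    return wrap_dbus


class FakeHostname:
    dbus = None  # set to a proxy object by connect()

    @dbus_connected
    def get_properties(self):
        return {"StaticHostname": "hassio"}


iface = FakeHostname()
try:
    iface.get_properties()
except DBusNotConnectedError:
    print("not connected yet")

iface.dbus = object()          # pretend connect() succeeded
print(iface.get_properties())  # {'StaticHostname': 'hassio'}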

122
hassio/discovery.py Normal file
View File

@@ -0,0 +1,122 @@
"""Handle discover message for Home Assistant."""
import logging
from contextlib import suppress
from uuid import uuid4
import attr
import voluptuous as vol
from voluptuous.humanize import humanize_error
from .const import FILE_HASSIO_DISCOVERY, ATTR_CONFIG, ATTR_DISCOVERY
from .coresys import CoreSysAttributes
from .exceptions import DiscoveryError, HomeAssistantAPIError
from .validate import SCHEMA_DISCOVERY_CONFIG
from .utils.json import JsonConfig
from .services.validate import DISCOVERY_SERVICES
_LOGGER = logging.getLogger(__name__)
CMD_NEW = 'post'
CMD_DEL = 'delete'
class Discovery(CoreSysAttributes, JsonConfig):
"""Home Assistant Discovery handler."""
def __init__(self, coresys):
"""Initialize discovery handler."""
super().__init__(FILE_HASSIO_DISCOVERY, SCHEMA_DISCOVERY_CONFIG)
self.coresys = coresys
self.message_obj = {}
async def load(self):
"""Load exists discovery message into storage."""
messages = {}
for message in self._data[ATTR_DISCOVERY]:
discovery = Message(**message)
messages[discovery.uuid] = discovery
_LOGGER.info("Load %d messages", len(messages))
self.message_obj = messages
def save(self):
"""Write discovery message into data file."""
messages = []
for message in self.list_messages:
messages.append(attr.asdict(message))
self._data[ATTR_DISCOVERY].clear()
self._data[ATTR_DISCOVERY].extend(messages)
self.save_data()
def get(self, uuid):
"""Return discovery message."""
return self.message_obj.get(uuid)
@property
def list_messages(self):
"""Return list of available discovery messages."""
return list(self.message_obj.values())
def send(self, addon, service, config):
"""Send a discovery message to Home Assistant."""
try:
config = DISCOVERY_SERVICES[service](config)
except vol.Invalid as err:
_LOGGER.error(
"Invalid discovery %s config", humanize_error(config, err))
raise DiscoveryError() from None
# Create message
message = Message(addon.slug, service, config)
# Already exists?
for old_message in self.list_messages:
if old_message != message:
continue
_LOGGER.info("Duplicate discovery message from %s", addon.slug)
return old_message
_LOGGER.info("Send discovery to Home Assistant %s from %s",
service, addon.slug)
self.message_obj[message.uuid] = message
self.save()
self.sys_create_task(self._push_discovery(message, CMD_NEW))
return message
def remove(self, message):
"""Remove a discovery message from Home Assistant."""
self.message_obj.pop(message.uuid, None)
self.save()
_LOGGER.info("Delete discovery to Home Assistant %s from %s",
message.service, message.addon)
self.sys_create_task(self._push_discovery(message, CMD_DEL))
async def _push_discovery(self, message, command):
"""Send a discovery request."""
if not await self.sys_homeassistant.check_api_state():
_LOGGER.info("Discovery %s mesage ignore", message.uuid)
return
data = attr.asdict(message)
data.pop(ATTR_CONFIG)
with suppress(HomeAssistantAPIError):
async with self.sys_homeassistant.make_request(
command, f"api/hassio_push/discovery/{message.uuid}",
json=data, timeout=10):
_LOGGER.info("Discovery %s message send", message.uuid)
return
_LOGGER.warning("Discovery %s message fail", message.uuid)
@attr.s
class Message:
"""Represent a single Discovery message."""
addon = attr.ib()
service = attr.ib()
config = attr.ib(cmp=False)
uuid = attr.ib(factory=lambda: uuid4().hex, cmp=False)
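Why send() can spot duplicates with a plain equality check: config and uuid are excluded from comparison, so attrs compares only addon and service. A small self-contained sketch (the slug, service, and config values are invented; the diff spells the exclusion cmp=False, the pre-19.2 attrs name for eq=False):

import attr
from uuid import uuid4


@attr.s
class Message:
    """Represent a single Discovery message."""
    addon = attr.ib()
    service = attr.ib()
    # Excluded from __eq__; the repo code writes cmp=False for the same effect.
    config = attr.ib(eq=False)
    uuid = attr.ib(factory=lambda: uuid4().hex, eq=False)


first = Message("core_mosquitto", "mqtt", {"host": "a"})
second = Message("core_mosquitto", "mqtt", {"host": "b"})

# Different config and different uuid, yet treated as the same message,
# which is exactly what the duplicate check in send() relies on.
assert first == second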

View File

@@ -1,4 +1,4 @@
"""Init file for HassIO docker object."""
"""Init file for Hass.io Docker object."""
from contextlib import suppress
import logging
@@ -10,18 +10,22 @@ from ..const import SOCKET_DOCKER
_LOGGER = logging.getLogger(__name__)
# pylint: disable=invalid-name
CommandReturn = attr.make_class('CommandReturn', ['exit_code', 'output'])
@attr.s(frozen=True)
class CommandReturn:
"""Return object from command run."""
exit_code = attr.ib()
output = attr.ib()
class DockerAPI:
"""Docker hassio wrapper.
"""Docker Hass.io wrapper.
This class is not AsyncIO safe!
"""
def __init__(self):
"""Initialize docker base wrapper."""
"""Initialize Docker base wrapper."""
self.docker = docker.DockerClient(
base_url="unix:/{}".format(str(SOCKET_DOCKER)),
version='auto', timeout=900)
@@ -29,21 +33,21 @@ class DockerAPI:
@property
def images(self):
"""Return api images."""
"""Return API images."""
return self.docker.images
@property
def containers(self):
"""Return api containers."""
"""Return API containers."""
return self.docker.containers
@property
def api(self):
"""Return api containers."""
"""Return API containers."""
return self.docker.api
def run(self, image, **kwargs):
""""Create a docker and run it.
""""Create a Docker container and run it.
Need run inside executor.
"""
@@ -51,7 +55,7 @@ class DockerAPI:
network_mode = kwargs.get('network_mode')
hostname = kwargs.get('hostname')
# setup network
# Setup network
kwargs['dns_search'] = ["."]
if network_mode:
kwargs['dns'] = [str(self.network.supervisor)]
@@ -59,9 +63,10 @@ class DockerAPI:
else:
kwargs['network'] = None
# create container
# Create container
try:
container = self.docker.containers.create(image, **kwargs)
container = self.docker.containers.create(
image, use_config_proxy=False, **kwargs)
except docker.errors.DockerException as err:
_LOGGER.error("Can't create container from %s: %s", name, err)
return False
@@ -97,6 +102,7 @@ class DockerAPI:
image,
command=command,
network=self.network.name,
use_config_proxy=False,
**kwargs
)
@@ -108,8 +114,9 @@ class DockerAPI:
_LOGGER.error("Can't execute command: %s", err)
return CommandReturn(None, b"")
# cleanup container
with suppress(docker.errors.DockerException):
container.remove(force=True)
finally:
# cleanup container
with suppress(docker.errors.DockerException):
container.remove(force=True)
return CommandReturn(result.get('StatusCode'), output)
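The diff swaps attr.make_class for an explicit frozen class and moves the temporary container removal into a finally block so cleanup also happens when reading the output fails. A short sketch of what the frozen class buys (the values passed in are invented):

import attr


@attr.s(frozen=True)
class CommandReturn:
    """Return object from command run."""
    exit_code = attr.ib()
    output = attr.ib()


result = CommandReturn(0, b"ok\n")
print(result.exit_code, result.output)

try:
    result.exit_code = 1  # frozen instances reject mutation
except attr.exceptions.FrozenInstanceError:
    print("CommandReturn is immutable")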

View File

@@ -1,4 +1,4 @@
"""Init file for HassIO addon docker object."""
"""Init file for Hass.io add-on Docker object."""
import logging
import os
@@ -7,9 +7,8 @@ import requests
from .interface import DockerInterface
from ..addons.build import AddonBuild
from ..const import (
MAP_CONFIG, MAP_SSL, MAP_ADDONS, MAP_BACKUP, MAP_SHARE, ENV_TOKEN,
ENV_TIME, SECURITY_PROFILE, SECURITY_DISABLE)
from ..const import (MAP_CONFIG, MAP_SSL, MAP_ADDONS, MAP_BACKUP, MAP_SHARE,
ENV_TOKEN, ENV_TIME, SECURITY_PROFILE, SECURITY_DISABLE)
from ..utils import process_lock
_LOGGER = logging.getLogger(__name__)
@@ -18,45 +17,45 @@ AUDIO_DEVICE = "/dev/snd:/dev/snd:rwm"
class DockerAddon(DockerInterface):
"""Docker hassio wrapper for HomeAssistant."""
"""Docker Hass.io wrapper for Home Assistant."""
def __init__(self, coresys, slug):
"""Initialize docker homeassistant wrapper."""
"""Initialize Docker Home Assistant wrapper."""
super().__init__(coresys)
self._id = slug
@property
def addon(self):
"""Return addon of docker image."""
"""Return add-on of Docker image."""
return self.sys_addons.get(self._id)
@property
def image(self):
"""Return name of docker image."""
"""Return name of Docker image."""
return self.addon.image
@property
def timeout(self):
"""Return timeout for docker actions."""
"""Return timeout for Docker actions."""
return self.addon.timeout
@property
def version(self):
"""Return version of docker image."""
if not self.addon.legacy:
return super().version
return self.addon.version_installed
"""Return version of Docker image."""
if self.addon.legacy:
return self.addon.version_installed
return super().version
@property
def arch(self):
"""Return arch of docker image."""
if not self.addon.legacy:
return super().arch
return self.sys_arch
"""Return arch of Docker image."""
if self.addon.legacy:
return self.sys_arch.default
return super().arch
@property
def name(self):
"""Return name of docker container."""
"""Return name of Docker container."""
return "addon_{}".format(self.addon.slug)
@property
@@ -66,27 +65,34 @@ class DockerAddon(DockerInterface):
return 'host'
return None
@property
def full_access(self):
"""Return True if full access is enabled."""
return not self.addon.protected and self.addon.with_full_access
@property
def hostname(self):
"""Return slug/id of addon."""
"""Return slug/id of add-on."""
return self.addon.slug.replace('_', '-')
@property
def environment(self):
"""Return environment for docker add-on."""
"""Return environment for Docker add-on."""
addon_env = self.addon.environment or {}
# Need audio settings
if self.addon.with_audio:
addon_env.update({
'ALSA_OUTPUT': self.addon.audio_output,
'ALSA_INPUT': self.addon.audio_input,
})
# Provide options for legacy add-ons
if self.addon.legacy:
for key, value in self.addon.options.items():
if isinstance(value, (int, str)):
addon_env[key] = value
else:
_LOGGER.warning(
"Can not set nested option %s as Docker env", key)
return {
**addon_env,
ENV_TIME: self.sys_config.timezone,
ENV_TOKEN: self.addon.uuid,
ENV_TIME: self.sys_timezone,
ENV_TOKEN: self.addon.hassio_token,
}
@property
@@ -95,7 +101,7 @@ class DockerAddon(DockerInterface):
devices = self.addon.devices or []
# Use audio devices
if self.addon.with_audio and AUDIO_DEVICE not in devices:
if self.addon.with_audio and self.sys_hardware.support_audio:
devices.append(AUDIO_DEVICE)
# Auto mapping UART devices
@@ -108,7 +114,7 @@ class DockerAddon(DockerInterface):
@property
def ports(self):
"""Filter None from addon ports."""
"""Filter None from add-on ports."""
if not self.addon.ports:
return None
@@ -120,7 +126,7 @@ class DockerAddon(DockerInterface):
@property
def security_opt(self):
"""Controlling security opt."""
"""Controlling security options."""
security = []
# AppArmor
@@ -138,7 +144,7 @@ class DockerAddon(DockerInterface):
@property
def tmpfs(self):
"""Return tmpfs for docker add-on."""
"""Return tmpfs for Docker add-on."""
options = self.addon.tmpfs
if options:
return {"/tmpfs": f"{options}"}
@@ -154,97 +160,143 @@ class DockerAddon(DockerInterface):
@property
def network_mode(self):
"""Return network mode for addon."""
"""Return network mode for add-on."""
if self.addon.host_network:
return 'host'
return None
@property
def pid_mode(self):
"""Return PID mode for add-on."""
if not self.addon.protected and self.addon.host_pid:
return 'host'
return None
@property
def volumes(self):
"""Generate volumes for mappings."""
volumes = {
str(self.addon.path_extern_data): {
'bind': "/data", 'mode': 'rw'
}}
'bind': "/data",
'mode': 'rw'
}
}
addon_mapping = self.addon.map_volumes
# setup config mappings
if MAP_CONFIG in addon_mapping:
volumes.update({
str(self.sys_config.path_extern_config): {
'bind': "/config", 'mode': addon_mapping[MAP_CONFIG]
}})
str(self.sys_config.path_extern_homeassistant): {
'bind': "/config",
'mode': addon_mapping[MAP_CONFIG]
}
})
if MAP_SSL in addon_mapping:
volumes.update({
str(self.sys_config.path_extern_ssl): {
'bind': "/ssl", 'mode': addon_mapping[MAP_SSL]
}})
'bind': "/ssl",
'mode': addon_mapping[MAP_SSL]
}
})
if MAP_ADDONS in addon_mapping:
volumes.update({
str(self.sys_config.path_extern_addons_local): {
'bind': "/addons", 'mode': addon_mapping[MAP_ADDONS]
}})
'bind': "/addons",
'mode': addon_mapping[MAP_ADDONS]
}
})
if MAP_BACKUP in addon_mapping:
volumes.update({
str(self.sys_config.path_extern_backup): {
'bind': "/backup", 'mode': addon_mapping[MAP_BACKUP]
}})
'bind': "/backup",
'mode': addon_mapping[MAP_BACKUP]
}
})
if MAP_SHARE in addon_mapping:
volumes.update({
str(self.sys_config.path_extern_share): {
'bind': "/share", 'mode': addon_mapping[MAP_SHARE]
}})
'bind': "/share",
'mode': addon_mapping[MAP_SHARE]
}
})
# Init other hardware mappings
# GPIO support
if self.addon.with_gpio:
volumes.update({
"/sys/class/gpio": {
'bind': "/sys/class/gpio", 'mode': 'rw'
},
"/sys/devices/platform/soc": {
'bind': "/sys/devices/platform/soc", 'mode': 'rw'
},
})
if self.addon.with_gpio and self.sys_hardware.support_gpio:
for gpio_path in ("/sys/class/gpio", "/sys/devices/platform/soc"):
volumes.update({
gpio_path: {
'bind': gpio_path,
'mode': 'rw'
},
})
# DeviceTree support
if self.addon.with_devicetree:
volumes.update({
"/sys/firmware/devicetree/base": {
'bind': "/device-tree", 'mode': 'ro'
'bind': "/device-tree",
'mode': 'ro'
},
})
# Host dbus system
# Kernel Modules support
if self.addon.with_kernel_modules:
volumes.update({
"/lib/modules": {
'bind': "/lib/modules",
'mode': 'ro'
},
})
# Docker API support
if not self.addon.protected and self.addon.access_docker_api:
volumes.update({
"/var/run/docker.sock": {
'bind': "/var/run/docker.sock",
'mode': 'ro'
},
})
# Host D-Bus system
if self.addon.host_dbus:
volumes.update({
"/var/run/dbus": {
'bind': "/var/run/dbus", 'mode': 'rw'
}})
'bind': "/var/run/dbus",
'mode': 'rw'
}
})
# ALSA configuration
if self.addon.with_audio:
volumes.update({
str(self.addon.path_extern_asound): {
'bind': "/etc/asound.conf", 'mode': 'ro'
}})
'bind': "/etc/asound.conf",
'mode': 'ro'
}
})
return volumes
def _run(self):
"""Run docker image.
"""Run Docker image.
Need run inside executor.
"""
if self._is_running():
return True
# Security check
if not self.addon.protected:
_LOGGER.warning("%s run with disabled protected mode!",
self.addon.name)
# cleanup
self._stop()
@@ -254,9 +306,11 @@ class DockerAddon(DockerInterface):
hostname=self.hostname,
detach=True,
init=True,
privileged=self.full_access,
ipc_mode=self.ipc,
stdin_open=self.addon.with_stdin,
network_mode=self.network_mode,
pid_mode=self.pid_mode,
ports=self.ports,
extra_hosts=self.network_mapping,
devices=self.devices,
@@ -264,27 +318,26 @@ class DockerAddon(DockerInterface):
security_opt=self.security_opt,
environment=self.environment,
volumes=self.volumes,
tmpfs=self.tmpfs
)
tmpfs=self.tmpfs)
if ret:
_LOGGER.info("Start docker addon %s with version %s",
self.image, self.version)
_LOGGER.info("Start Docker add-on %s with version %s", self.image,
self.version)
return ret
def _install(self, tag):
"""Pull docker image or build it.
def _install(self, tag, image=None):
"""Pull Docker image or build it.
Need run inside executor.
"""
if self.addon.need_build:
return self._build(tag)
return super()._install(tag)
return super()._install(tag, image)
def _build(self, tag):
"""Build a docker container.
"""Build a Docker container.
Need run inside executor.
"""
@@ -293,7 +346,7 @@ class DockerAddon(DockerInterface):
_LOGGER.info("Start build %s:%s", self.image, tag)
try:
image, log = self.sys_docker.images.build(
**build_env.get_docker_args(tag))
use_config_proxy=False, **build_env.get_docker_args(tag))
_LOGGER.debug("Build %s:%s done: %s", self.image, tag, log)
image.tag(self.image, tag='latest')
@@ -301,7 +354,7 @@ class DockerAddon(DockerInterface):
# Update meta data
self._meta = image.attrs
except (docker.errors.DockerException) as err:
except docker.errors.DockerException as err:
_LOGGER.error("Can't build %s:%s: %s", self.image, tag, err)
return False
@@ -375,7 +428,7 @@ class DockerAddon(DockerInterface):
return False
try:
# load needed docker objects
# Load needed docker objects
container = self.sys_docker.containers.get(self.name)
socket = container.attach_socket(params={'stdin': 1, 'stream': 1})
except docker.errors.DockerException as err:
@@ -383,7 +436,7 @@ class DockerAddon(DockerInterface):
return False
try:
# write to stdin
# Write to stdin
data += b"\n"
os.write(socket.fileno(), data)
socket.close()

View File

@@ -3,35 +3,35 @@ import logging
import docker
from .interface import DockerInterface
from ..coresys import CoreSysAttributes
from .interface import DockerInterface
_LOGGER = logging.getLogger(__name__)
class DockerHassOSCli(DockerInterface, CoreSysAttributes):
"""Docker hassio wrapper for HassOS Cli."""
"""Docker Hass.io wrapper for HassOS Cli."""
@property
def image(self):
"""Return name of HassOS cli image."""
return f"homeassistant/{self.sys_arch}-hassio-cli"
"""Return name of HassOS CLI image."""
return f"homeassistant/{self.sys_arch.supervisor}-hassio-cli"
def _stop(self):
"""Don't need stop."""
return True
def _attach(self):
"""Attach to running docker container.
"""Attach to running Docker container.
Need run inside executor.
"""
try:
image = self.sys_docker.images.get(self.image)
except docker.errors.DockerException:
_LOGGER.warning("Can't find a HassOS cli %s", self.image)
_LOGGER.warning("Can't find a HassOS CLI %s", self.image)
else:
self._meta = image.attrs
_LOGGER.info("Found HassOS cli %s with version %s",
self.image, self.version)
_LOGGER.info("Found HassOS CLI %s with version %s", self.image,
self.version)

View File

@@ -1,4 +1,4 @@
"""Init file for HassIO docker object."""
"""Init file for Hass.io Docker object."""
import logging
import docker
@@ -12,35 +12,35 @@ HASS_DOCKER_NAME = 'homeassistant'
class DockerHomeAssistant(DockerInterface):
"""Docker hassio wrapper for HomeAssistant."""
"""Docker Hass.io wrapper for Home Assistant."""
@property
def machine(self):
"""Return machine of Home-Assistant docker image."""
"""Return machine of Home Assistant Docker image."""
if self._meta and LABEL_MACHINE in self._meta['Config']['Labels']:
return self._meta['Config']['Labels'][LABEL_MACHINE]
return None
@property
def image(self):
"""Return name of docker image."""
"""Return name of Docker image."""
return self.sys_homeassistant.image
@property
def name(self):
"""Return name of docker container."""
"""Return name of Docker container."""
return HASS_DOCKER_NAME
@property
def devices(self):
"""Create list of special device to map into docker."""
"""Create list of special device to map into Docker."""
devices = []
for device in self.sys_hardware.serial_devices:
devices.append(f"{device}:{device}:rwm")
return devices or None
def _run(self):
"""Run docker image.
"""Run Docker image.
Need run inside executor.
"""
@@ -61,11 +61,11 @@ class DockerHomeAssistant(DockerInterface):
network_mode='host',
environment={
'HASSIO': self.sys_docker.network.supervisor,
ENV_TIME: self.sys_config.timezone,
ENV_TOKEN: self.sys_homeassistant.uuid,
ENV_TIME: self.sys_timezone,
ENV_TOKEN: self.sys_homeassistant.hassio_token,
},
volumes={
str(self.sys_config.path_extern_config):
str(self.sys_config.path_extern_homeassistant):
{'bind': '/config', 'mode': 'rw'},
str(self.sys_config.path_extern_ssl):
{'bind': '/ssl', 'mode': 'ro'},
@@ -95,18 +95,20 @@ class DockerHomeAssistant(DockerInterface):
stdout=True,
stderr=True,
environment={
ENV_TIME: self.sys_config.timezone,
ENV_TIME: self.sys_timezone,
},
volumes={
str(self.sys_config.path_extern_config):
str(self.sys_config.path_extern_homeassistant):
{'bind': '/config', 'mode': 'rw'},
str(self.sys_config.path_extern_ssl):
{'bind': '/ssl', 'mode': 'ro'},
str(self.sys_config.path_extern_share):
{'bind': '/share', 'mode': 'ro'},
}
)
def is_initialize(self):
"""Return True if docker container exists."""
"""Return True if Docker container exists."""
return self.sys_run_in_executor(self._is_initialize)
def _is_initialize(self):

View File

@@ -1,4 +1,4 @@
"""Interface class for HassIO docker object."""
"""Interface class for Hass.io Docker object."""
import asyncio
from contextlib import suppress
import logging
@@ -14,44 +14,50 @@ _LOGGER = logging.getLogger(__name__)
class DockerInterface(CoreSysAttributes):
"""Docker hassio interface."""
"""Docker Hass.io interface."""
def __init__(self, coresys):
"""Initialize docker base wrapper."""
"""Initialize Docker base wrapper."""
self.coresys = coresys
self._meta = None
self.lock = asyncio.Lock(loop=coresys.loop)
@property
def timeout(self):
"""Return timeout for docker actions."""
"""Return timeout for Docker actions."""
return 30
@property
def name(self):
"""Return name of docker container."""
"""Return name of Docker container."""
return None
@property
def meta_config(self):
"""Return meta data of configuration for container/image."""
if not self._meta:
return {}
return self._meta.get('Config', {})
@property
def meta_labels(self):
"""Return meta data of labels for container/image."""
return self.meta_config.get('Labels') or {}
@property
def image(self):
"""Return name of docker image."""
if not self._meta:
return None
return self._meta['Config']['Image']
"""Return name of Docker image."""
return self.meta_config.get('Image')
@property
def version(self):
"""Return version of docker image."""
if self._meta and LABEL_VERSION in self._meta['Config']['Labels']:
return self._meta['Config']['Labels'][LABEL_VERSION]
return None
"""Return version of Docker image."""
return self.meta_labels.get(LABEL_VERSION)
@property
def arch(self):
"""Return arch of docker image."""
if self._meta and LABEL_ARCH in self._meta['Config']['Labels']:
return self._meta['Config']['Labels'][LABEL_ARCH]
return None
"""Return arch of Docker image."""
return self.meta_labels.get(LABEL_ARCH)
@property
def in_progress(self):
@@ -59,76 +65,78 @@ class DockerInterface(CoreSysAttributes):
return self.lock.locked()
@process_lock
def install(self, tag):
def install(self, tag, image=None):
"""Pull docker image."""
return self.sys_run_in_executor(self._install, tag)
return self.sys_run_in_executor(self._install, tag, image)
def _install(self, tag):
"""Pull docker image.
def _install(self, tag, image=None):
"""Pull Docker image.
Need run inside executor.
"""
try:
_LOGGER.info("Pull image %s tag %s.", self.image, tag)
image = self.sys_docker.images.pull(f"{self.image}:{tag}")
image = image or self.image
image.tag(self.image, tag='latest')
self._meta = image.attrs
try:
_LOGGER.info("Pull image %s tag %s.", image, tag)
docker_image = self.sys_docker.images.pull(f"{image}:{tag}")
docker_image.tag(image, tag='latest')
self._meta = docker_image.attrs
except docker.errors.APIError as err:
_LOGGER.error("Can't install %s:%s -> %s.", self.image, tag, err)
_LOGGER.error("Can't install %s:%s -> %s.", image, tag, err)
return False
_LOGGER.info("Tag image %s with version %s as latest", self.image, tag)
_LOGGER.info("Tag image %s with version %s as latest", image, tag)
return True
def exists(self):
"""Return True if docker image exists in local repo."""
"""Return True if Docker image exists in local repository."""
return self.sys_run_in_executor(self._exists)
def _exists(self):
"""Return True if docker image exists in local repo.
"""Return True if Docker image exists in local repository.
Need run inside executor.
"""
try:
image = self.sys_docker.images.get(self.image)
assert f"{self.image}:{self.version}" in image.tags
docker_image = self.sys_docker.images.get(self.image)
assert f"{self.image}:{self.version}" in docker_image.tags
except (docker.errors.DockerException, AssertionError):
return False
return True
def is_running(self):
"""Return True if docker is Running.
"""Return True if Docker is running.
Return a Future.
"""
return self.sys_run_in_executor(self._is_running)
def _is_running(self):
"""Return True if docker is Running.
"""Return True if Docker is running.
Need run inside executor.
"""
try:
container = self.sys_docker.containers.get(self.name)
image = self.sys_docker.images.get(self.image)
docker_container = self.sys_docker.containers.get(self.name)
docker_image = self.sys_docker.images.get(self.image)
except docker.errors.DockerException:
return False
# container is not running
if container.status != 'running':
if docker_container.status != 'running':
return False
# we run on an old image, stop and start it
if container.image.id != image.id:
if docker_container.image.id != docker_image.id:
return False
return True
@process_lock
def attach(self):
"""Attach to running docker container."""
"""Attach to running Docker container."""
return self.sys_run_in_executor(self._attach)
def _attach(self):
@@ -144,18 +152,18 @@ class DockerInterface(CoreSysAttributes):
except docker.errors.DockerException:
return False
_LOGGER.info(
"Attach to image %s with version %s", self.image, self.version)
_LOGGER.info("Attach to image %s with version %s", self.image,
self.version)
return True
@process_lock
def run(self):
"""Run docker image."""
"""Run Docker image."""
return self.sys_run_in_executor(self._run)
def _run(self):
"""Run docker image.
"""Run Docker image.
Need run inside executor.
"""
@@ -163,7 +171,7 @@ class DockerInterface(CoreSysAttributes):
@process_lock
def stop(self):
"""Stop/remove docker container."""
"""Stop/remove Docker container."""
return self.sys_run_in_executor(self._stop)
def _stop(self):
@@ -172,24 +180,24 @@ class DockerInterface(CoreSysAttributes):
Need run inside executor.
"""
try:
container = self.sys_docker.containers.get(self.name)
docker_container = self.sys_docker.containers.get(self.name)
except docker.errors.DockerException:
return False
if container.status == 'running':
_LOGGER.info("Stop %s docker application", self.image)
if docker_container.status == 'running':
_LOGGER.info("Stop %s Docker application", self.image)
with suppress(docker.errors.DockerException):
container.stop(timeout=self.timeout)
docker_container.stop(timeout=self.timeout)
with suppress(docker.errors.DockerException):
_LOGGER.info("Clean %s docker application", self.image)
container.remove(force=True)
_LOGGER.info("Clean %s Docker application", self.image)
docker_container.remove(force=True)
return True
@process_lock
def remove(self):
"""Remove docker images."""
"""Remove Docker images."""
return self.sys_run_in_executor(self._remove)
def _remove(self):
@@ -197,11 +205,11 @@ class DockerInterface(CoreSysAttributes):
Need run inside executor.
"""
# cleanup container
# Cleanup container
self._stop()
_LOGGER.info(
"Remove docker %s with latest and %s", self.image, self.version)
_LOGGER.info("Remove Docker %s with latest and %s", self.image,
self.version)
try:
with suppress(docker.errors.ImageNotFound):
@@ -220,49 +228,51 @@ class DockerInterface(CoreSysAttributes):
return True
@process_lock
def update(self, tag):
"""Update a docker image."""
return self.sys_run_in_executor(self._update, tag)
def update(self, tag, image=None):
"""Update a Docker image."""
return self.sys_run_in_executor(self._update, tag, image)
def _update(self, tag):
def _update(self, tag, image=None):
"""Update a docker image.
Need run inside executor.
"""
_LOGGER.info(
"Update docker %s with %s:%s", self.version, self.image, tag)
image = image or self.image
# update docker image
if not self._install(tag):
_LOGGER.info("Update Docker %s:%s to %s:%s", self.image, self.version,
image, tag)
# Update docker image
if not self._install(tag, image):
return False
# stop container & cleanup
# Stop container & cleanup
self._stop()
self._cleanup()
return True
def logs(self):
"""Return docker logs of container.
"""Return Docker logs of container.
Return a Future.
"""
return self.sys_run_in_executor(self._logs)
def _logs(self):
"""Return docker logs of container.
"""Return Docker logs of container.
Need run inside executor.
"""
try:
container = self.sys_docker.containers.get(self.name)
docker_container = self.sys_docker.containers.get(self.name)
except docker.errors.DockerException:
return b""
try:
return container.logs(tail=100, stdout=True, stderr=True)
return docker_container.logs(tail=100, stdout=True, stderr=True)
except docker.errors.DockerException as err:
_LOGGER.warning("Can't grap logs from %s: %s", self.image, err)
_LOGGER.warning("Can't grep logs from %s: %s", self.image, err)
@process_lock
def cleanup(self):
@@ -285,7 +295,7 @@ class DockerInterface(CoreSysAttributes):
continue
with suppress(docker.errors.DockerException):
_LOGGER.info("Cleanup docker images: %s", image.tags)
_LOGGER.info("Cleanup Docker images: %s", image.tags)
self.sys_docker.images.remove(image.id, force=True)
return True
@@ -312,12 +322,12 @@ class DockerInterface(CoreSysAttributes):
Need run inside executor.
"""
try:
container = self.sys_docker.containers.get(self.name)
docker_container = self.sys_docker.containers.get(self.name)
except docker.errors.DockerException:
return None
try:
stats = container.stats(stream=False)
stats = docker_container.stats(stream=False)
return DockerStats(stats)
except docker.errors.DockerException as err:
_LOGGER.error("Can't read stats from %s: %s", self.name, err)

View File

@@ -1,4 +1,4 @@
"""Internal network manager for HassIO."""
"""Internal network manager for Hass.io."""
import logging
import docker
@@ -9,13 +9,13 @@ _LOGGER = logging.getLogger(__name__)
class DockerNetwork:
"""Internal HassIO Network.
"""Internal Hass.io Network.
This class is not AsyncIO safe!
"""
def __init__(self, dock):
"""Initialize internal hassio network."""
"""Initialize internal Hass.io network."""
self.docker = dock
self.network = self._get_network()
@@ -44,7 +44,7 @@ class DockerNetwork:
try:
return self.docker.networks.get(DOCKER_NETWORK)
except docker.errors.NotFound:
_LOGGER.info("Can't find HassIO network, create new network")
_LOGGER.info("Can't find Hass.io network, create new network")
ipam_pool = docker.types.IPAMPool(
subnet=str(DOCKER_NETWORK_MASK),
@@ -61,7 +61,7 @@ class DockerNetwork:
})
def attach_container(self, container, alias=None, ipv4=None):
"""Attach container to hassio network.
"""Attach container to Hass.io network.
Need run inside executor.
"""
@@ -77,7 +77,7 @@ class DockerNetwork:
return True
def detach_default_bridge(self, container):
"""Detach default docker bridge.
"""Detach default Docker bridge.
Need run inside executor.
"""

View File

@@ -1,4 +1,4 @@
"""Calc & represent docker stats data."""
"""Calc and represent docker stats data."""
from contextlib import suppress
@@ -6,7 +6,7 @@ class DockerStats:
"""Hold stats data from container inside."""
def __init__(self, stats):
"""Initialize docker stats."""
"""Initialize Docker stats."""
self._cpu = 0.0
self._network_rx = 0
self._network_tx = 0

View File

@@ -1,4 +1,4 @@
"""Init file for HassIO docker object."""
"""Init file for Hass.io Docker object."""
import logging
import os
@@ -11,11 +11,11 @@ _LOGGER = logging.getLogger(__name__)
class DockerSupervisor(DockerInterface, CoreSysAttributes):
"""Docker hassio wrapper for Supervisor."""
"""Docker Hass.io wrapper for Supervisor."""
@property
def name(self):
"""Return name of docker container."""
"""Return name of Docker container."""
return os.environ['SUPERVISOR_NAME']
def _attach(self):
@@ -29,14 +29,14 @@ class DockerSupervisor(DockerInterface, CoreSysAttributes):
return False
self._meta = container.attrs
_LOGGER.info("Attach to supervisor %s with version %s",
_LOGGER.info("Attach to Supervisor %s with version %s",
self.image, self.version)
# if already attach
# If already attached
if container in self.sys_docker.network.containers:
return True
# attach to network
# Attach to network
return self.sys_docker.network.attach_container(
container, alias=['hassio'],
ipv4=self.sys_docker.network.supervisor)

View File

@@ -1,92 +1,115 @@
"""Core Exceptions."""
import asyncio
import aiohttp
class HassioError(Exception):
"""Root exception."""
pass
class HassioNotSupportedError(HassioError):
"""Function is not supported."""
pass
# HomeAssistant
class HomeAssistantError(HassioError):
"""Home Assistant exception."""
pass
class HomeAssistantUpdateError(HomeAssistantError):
"""Error on update of a Home Assistant."""
pass
class HomeAssistantAuthError(HomeAssistantError):
"""Home Assistant Auth API exception."""
pass
class HomeAssistantAPIError(
HomeAssistantAuthError, asyncio.TimeoutError, aiohttp.ClientError):
class HomeAssistantAPIError(HomeAssistantError):
"""Home Assistant API exception."""
pass
class HomeAssistantAuthError(HomeAssistantAPIError):
"""Home Assistant Auth API exception."""
# HassOS
class HassOSError(HassioError):
"""HassOS exception."""
pass
class HassOSUpdateError(HassOSError):
"""Error on update of a HassOS."""
pass
class HassOSNotSupportedError(HassioNotSupportedError):
"""Function not supported by HassOS."""
pass
# Arch
class HassioArchNotFound(HassioNotSupportedError):
"""No matches with exists arch."""
# Updater
class HassioUpdaterError(HassioError):
"""Error on Updater."""
pass
# Auth
class AuthError(HassioError):
"""Auth errors."""
# Host
class HostError(HassioError):
"""Internal Host error."""
pass
class HostNotSupportedError(HassioNotSupportedError):
"""Host function is not supprted."""
pass
class HostServiceError(HostError):
"""Host service functions fails."""
pass
class HostAppArmorError(HostError):
"""Host apparmor functions fails."""
# API
class APIError(HassioError, RuntimeError):
"""API errors."""
class APIForbidden(APIError):
"""API forbidden error."""
# Service / Discovery
class DiscoveryError(HassioError):
"""Discovery Errors."""
class ServicesError(HassioError):
"""Services Errors."""
# utils/gdbus
class DBusError(HassioError):
"""DBus generic error."""
pass
class DBusNotConnectedError(HostNotSupportedError):
@@ -95,26 +118,29 @@ class DBusNotConnectedError(HostNotSupportedError):
class DBusFatalError(DBusError):
"""DBus call going wrong."""
pass
class DBusParseError(DBusError):
"""DBus parse error."""
pass
# util/apparmor
class AppArmorError(HostAppArmorError):
"""General AppArmor error."""
pass
class AppArmorFileError(AppArmorError):
"""AppArmor profile file error."""
pass
class AppArmorInvalidError(AppArmorError):
"""AppArmor profile validate error."""
pass
# util/json
class JsonFileError(HassioError):
"""Invalid json file."""

View File

@@ -66,9 +66,9 @@ class HassOS(CoreSysAttributes):
return self._board
def _check_host(self):
"""Check if HassOS is availabe."""
"""Check if HassOS is available."""
if not self.available:
_LOGGER.error("No HassOS availabe")
_LOGGER.error("No HassOS available")
raise HassOSNotSupportedError()
async def _download_raucb(self, version):
@@ -97,7 +97,7 @@ class HassOS(CoreSysAttributes):
_LOGGER.warning("Can't fetch versions from %s: %s", url, err)
except OSError as err:
_LOGGER.error("Can't write ota file: %s", err)
_LOGGER.error("Can't write OTA file: %s", err)
raise HassOSUpdateError()
@@ -131,7 +131,7 @@ class HassOS(CoreSysAttributes):
"""
self._check_host()
_LOGGER.info("Sync config from USB on HassOS.")
_LOGGER.info("Syncing configuration from USB with HassOS.")
return self.sys_host.services.restart('hassos-config.service')
async def update(self, version=None):
@@ -182,5 +182,5 @@ class HassOS(CoreSysAttributes):
if await self.instance.update(version):
return
_LOGGER.error("HassOS CLI update fails.")
_LOGGER.error("HassOS CLI update fails")
raise HassOSUpdateError()

View File

@@ -1,9 +1,11 @@
"""HomeAssistant control object."""
"""Home Assistant control object."""
import asyncio
from contextlib import asynccontextmanager, suppress
from datetime import datetime, timedelta
import logging
import os
import re
from pathlib import Path
import socket
import time
@@ -11,17 +13,15 @@ import aiohttp
from aiohttp import hdrs
import attr
from .const import (
FILE_HASSIO_HOMEASSISTANT, ATTR_IMAGE, ATTR_LAST_VERSION, ATTR_UUID,
ATTR_BOOT, ATTR_PASSWORD, ATTR_PORT, ATTR_SSL, ATTR_WATCHDOG,
ATTR_WAIT_BOOT, ATTR_REFRESH_TOKEN,
HEADER_HA_ACCESS)
from .const import (FILE_HASSIO_HOMEASSISTANT, ATTR_IMAGE, ATTR_LAST_VERSION,
ATTR_UUID, ATTR_BOOT, ATTR_PASSWORD, ATTR_PORT, ATTR_SSL,
ATTR_WATCHDOG, ATTR_WAIT_BOOT, ATTR_REFRESH_TOKEN,
ATTR_ACCESS_TOKEN, HEADER_HA_ACCESS)
from .coresys import CoreSysAttributes
from .docker.homeassistant import DockerHomeAssistant
from .exceptions import (
HomeAssistantUpdateError, HomeAssistantError, HomeAssistantAPIError,
HomeAssistantAuthError)
from .utils import convert_to_ascii, process_lock
from .exceptions import (HomeAssistantUpdateError, HomeAssistantError,
HomeAssistantAPIError, HomeAssistantAuthError)
from .utils import convert_to_ascii, process_lock, create_token
from .utils.json import JsonConfig
from .validate import SCHEMA_HASS_CONFIG
@@ -29,15 +29,19 @@ _LOGGER = logging.getLogger(__name__)
RE_YAML_ERROR = re.compile(r"homeassistant\.util\.yaml")
# pylint: disable=invalid-name
ConfigResult = attr.make_class('ConfigResult', ['valid', 'log'], frozen=True)
@attr.s(frozen=True)
class ConfigResult:
"""Return object from config check."""
valid = attr.ib()
log = attr.ib()
class HomeAssistant(JsonConfig, CoreSysAttributes):
"""Hass core object for handle it."""
"""Home Assistant core object for handle it."""
def __init__(self, coresys):
"""Initialize hass object."""
"""Initialize Home Assistant object."""
super().__init__(FILE_HASSIO_HOMEASSISTANT, SCHEMA_HASS_CONFIG)
self.coresys = coresys
self.instance = DockerHomeAssistant(coresys)
@@ -45,20 +49,26 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
self._error_state = False
# We don't persist access tokens. Instead we fetch new ones when needed
self.access_token = None
self._access_token_expires = None
async def load(self):
"""Prepare HomeAssistant object."""
"""Prepare Home Assistant object."""
if await self.instance.attach():
return
_LOGGER.info("No HomeAssistant docker %s found.", self.image)
_LOGGER.info("No Home Assistant Docker image %s found.", self.image)
await self.install_landingpage()
@property
def machine(self):
"""Return System Machines."""
"""Return the system machines."""
return self.instance.machine
@property
def arch(self):
"""Return arch of running Home Assistant."""
return self.instance.arch
@property
def error_state(self):
"""Return True if system is in error."""
@@ -66,81 +76,80 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
@property
def api_ip(self):
"""Return IP of HomeAssistant instance."""
"""Return IP of Home Assistant instance."""
return self.sys_docker.network.gateway
@property
def api_port(self):
"""Return network port to home-assistant instance."""
"""Return network port to Home Assistant instance."""
return self._data[ATTR_PORT]
@api_port.setter
def api_port(self, value):
"""Set network port for home-assistant instance."""
"""Set network port for Home Assistant instance."""
self._data[ATTR_PORT] = value
@property
def api_password(self):
"""Return password for home-assistant instance."""
"""Return password for Home Assistant instance."""
return self._data.get(ATTR_PASSWORD)
@api_password.setter
def api_password(self, value):
"""Set password for home-assistant instance."""
"""Set password for Home Assistant instance."""
self._data[ATTR_PASSWORD] = value
@property
def api_ssl(self):
"""Return if we need ssl to home-assistant instance."""
"""Return if we need ssl to Home Assistant instance."""
return self._data[ATTR_SSL]
@api_ssl.setter
def api_ssl(self, value):
"""Set SSL for home-assistant instance."""
"""Set SSL for Home Assistant instance."""
self._data[ATTR_SSL] = value
@property
def api_url(self):
"""Return API url to Home-Assistant."""
return "{}://{}:{}".format(
'https' if self.api_ssl else 'http', self.api_ip, self.api_port
)
"""Return API url to Home Assistant."""
return "{}://{}:{}".format('https' if self.api_ssl else 'http',
self.api_ip, self.api_port)
@property
def watchdog(self):
"""Return True if the watchdog should protect Home-Assistant."""
"""Return True if the watchdog should protect Home Assistant."""
return self._data[ATTR_WATCHDOG]
@watchdog.setter
def watchdog(self, value):
"""Return True if the watchdog should protect Home-Assistant."""
"""Return True if the watchdog should protect Home Assistant."""
self._data[ATTR_WATCHDOG] = value
@property
def wait_boot(self):
"""Return time to wait for Home-Assistant startup."""
"""Return time to wait for Home Assistant startup."""
return self._data[ATTR_WAIT_BOOT]
@wait_boot.setter
def wait_boot(self, value):
"""Set time to wait for Home-Assistant startup."""
"""Set time to wait for Home Assistant startup."""
self._data[ATTR_WAIT_BOOT] = value
@property
def version(self):
"""Return version of running homeassistant."""
"""Return version of running Home Assistant."""
return self.instance.version
@property
def last_version(self):
"""Return last available version of homeassistant."""
"""Return last available version of Home Assistant."""
if self.is_custom_image:
return self._data.get(ATTR_LAST_VERSION)
return self.sys_updater.version_homeassistant
@last_version.setter
def last_version(self, value):
"""Set last available version of homeassistant."""
"""Set last available version of Home Assistant."""
if value:
self._data[ATTR_LAST_VERSION] = value
else:
@@ -148,14 +157,14 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
@property
def image(self):
"""Return image name of hass containter."""
"""Return image name of the Home Assistant container."""
if self._data.get(ATTR_IMAGE):
return self._data[ATTR_IMAGE]
return os.environ['HOMEASSISTANT_REPOSITORY']
@image.setter
def image(self, value):
"""Set image name of hass containter."""
"""Set image name of Home Assistant container."""
if value:
self._data[ATTR_IMAGE] = value
else:
@@ -164,27 +173,32 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
@property
def is_custom_image(self):
"""Return True if a custom image is used."""
return all(attr in self._data for attr in
(ATTR_IMAGE, ATTR_LAST_VERSION))
return all(
attr in self._data for attr in (ATTR_IMAGE, ATTR_LAST_VERSION))
@property
def boot(self):
"""Return True if home-assistant boot is enabled."""
"""Return True if Home Assistant boot is enabled."""
return self._data[ATTR_BOOT]
@boot.setter
def boot(self, value):
"""Set home-assistant boot options."""
"""Set Home Assistant boot options."""
self._data[ATTR_BOOT] = value
@property
def uuid(self):
"""Return a UUID of this HomeAssistant."""
"""Return a UUID of this Home Assistant instance."""
return self._data[ATTR_UUID]
@property
def hassio_token(self):
"""Return an access token for the Hass.io API."""
return self._data.get(ATTR_ACCESS_TOKEN)
@property
def refresh_token(self):
"""Return the refresh token to authenticate with HomeAssistant."""
"""Return the refresh token to authenticate with Home Assistant."""
return self._data.get(ATTR_REFRESH_TOKEN)
@refresh_token.setter
@@ -194,25 +208,18 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
@process_lock
async def install_landingpage(self):
"""Install a landingpage."""
"""Install a landing page."""
_LOGGER.info("Setup HomeAssistant landingpage")
while True:
if await self.instance.install('landingpage'):
break
_LOGGER.warning("Fails install landingpage, retry after 60sec")
await asyncio.sleep(60)
# Run landingpage after installation
_LOGGER.info("Start landingpage")
try:
await self._start()
except HomeAssistantError:
_LOGGER.warning("Can't start landingpage")
_LOGGER.warning("Fails install landingpage, retry after 30sec")
await asyncio.sleep(30)
@process_lock
async def install(self):
"""Install a landingpage."""
_LOGGER.info("Setup HomeAssistant")
"""Install a landing page."""
_LOGGER.info("Setup Home Assistant")
while True:
# read homeassistant tag and install it
if not self.last_version:
@@ -221,18 +228,18 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
tag = self.last_version
if tag and await self.instance.install(tag):
break
_LOGGER.warning("Error on install HomeAssistant. Retry in 60sec")
await asyncio.sleep(60)
_LOGGER.warning("Error on install Home Assistant. Retry in 30sec")
await asyncio.sleep(30)
# finishing
_LOGGER.info("HomeAssistant docker now installed")
_LOGGER.info("Home Assistant docker now installed")
try:
if not self.boot:
return
_LOGGER.info("Start HomeAssistant")
_LOGGER.info("Start Home Assistant")
await self._start()
except HomeAssistantError:
_LOGGER.error("Can't start HomeAssistant!")
_LOGGER.error("Can't start Home Assistant!")
finally:
await self.instance.cleanup()
@@ -248,17 +255,17 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
_LOGGER.warning("Version %s is already installed", version)
return HomeAssistantUpdateError()
# process a update
# process an update
async def _update(to_version):
"""Run Home Assistant update."""
try:
_LOGGER.info("Update HomeAssistant to version %s", to_version)
_LOGGER.info("Update Home Assistant to version %s", to_version)
if not await self.instance.update(to_version):
raise HomeAssistantUpdateError()
finally:
if running:
await self._start()
_LOGGER.info("Successfull run HomeAssistant %s", to_version)
_LOGGER.info("Successful run Home Assistant %s", to_version)
# Update Home Assistant
with suppress(HomeAssistantError):
@@ -273,14 +280,22 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
raise HomeAssistantUpdateError()
async def _start(self):
"""Start HomeAssistant docker & wait."""
"""Start Home Assistant Docker & wait."""
if await self.instance.is_running():
_LOGGER.warning("Home Assistant is already running!")
return
# Create new API token
self._data[ATTR_ACCESS_TOKEN] = create_token()
self.save_data()
if not await self.instance.run():
raise HomeAssistantError()
await self._block_till_run()
@process_lock
def start(self):
"""Run HomeAssistant docker.
"""Run Home Assistant docker.
Return a coroutine.
"""
@@ -288,7 +303,7 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
@process_lock
def stop(self):
"""Stop HomeAssistant docker.
"""Stop Home Assistant Docker.
Return a coroutine.
"""
@@ -296,7 +311,7 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
@process_lock
async def restart(self):
"""Restart HomeAssistant docker."""
"""Restart Home Assistant Docker."""
await self.instance.stop()
await self._start()
@@ -308,21 +323,21 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
return self.instance.logs()
def stats(self):
"""Return stats of HomeAssistant.
"""Return stats of Home Assistant.
Return a coroutine.
"""
return self.instance.stats()
def is_running(self):
"""Return True if docker container is running.
"""Return True if Docker container is running.
Return a coroutine.
"""
return self.instance.is_running()
def is_initialize(self):
"""Return True if a docker container is exists.
"""Return True if a Docker container is exists.
Return a coroutine.
"""
@@ -334,48 +349,55 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
return self.instance.in_progress or self.lock.locked()
async def check_config(self):
"""Run homeassistant config check."""
"""Run Home Assistant config check."""
result = await self.instance.execute_command(
"python3 -m homeassistant -c /config --script check_config"
)
"python3 -m homeassistant -c /config --script check_config")
# if not valid
if result.exit_code is None:
_LOGGER.error("Fatal error on config check!")
raise HomeAssistantError()
# parse output
log = convert_to_ascii(result.output)
if result.exit_code != 0 or RE_YAML_ERROR.search(log):
_LOGGER.error("Invalid Home Assistant config found!")
return ConfigResult(False, log)
_LOGGER.info("Home Assistant config is valid")
return ConfigResult(True, log)
async def ensure_access_token(self):
"""Ensures there is an access token."""
if self.access_token is not None:
if self.access_token is not None and self._access_token_expires > datetime.utcnow():
return
with suppress(asyncio.TimeoutError, aiohttp.ClientError):
async with self.sys_websession_ssl.get(
async with self.sys_websession_ssl.post(
f"{self.api_url}/auth/token",
timeout=30,
data={
"grant_type": "refresh_token",
"refresh_token": self.refresh_token
}
) as resp:
}) as resp:
if resp.status != 200:
_LOGGER.error("Authenticate problem with HomeAssistant!")
_LOGGER.error("Can't update Home Assistant access token!")
raise HomeAssistantAuthError()
_LOGGER.info("Updated Home Assistant API token")
tokens = await resp.json()
self.access_token = tokens['access_token']
return
_LOGGER.error("Can't update HomeAssistant access token!")
raise HomeAssistantAPIError()
self._access_token_expires = \
datetime.utcnow() + timedelta(seconds=tokens['expires_in'])
@asynccontextmanager
async def make_request(self, method, path, json=None, content_type=None,
data=None, timeout=30):
async def make_request(self,
method,
path,
json=None,
content_type=None,
data=None,
timeout=30):
"""Async context manager to make a request with right auth."""
url = f"{self.api_url}/{path}"
headers = {}
@@ -394,45 +416,39 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
await self.ensure_access_token()
headers[hdrs.AUTHORIZATION] = f'Bearer {self.access_token}'
async with getattr(self.sys_websession_ssl, method)(
url, data=data, timeout=timeout, json=json, headers=headers
) as resp:
# Access token expired
if resp.status == 401 and self.refresh_token:
self.access_token = None
continue
yield resp
return
try:
async with getattr(self.sys_websession_ssl, method)(
url, data=data, timeout=timeout, json=json,
headers=headers) as resp:
# Access token expired
if resp.status == 401 and self.refresh_token:
self.access_token = None
continue
yield resp
return
except (asyncio.TimeoutError, aiohttp.ClientError) as err:
_LOGGER.error("Error on call %s: %s", url, err)
break
raise HomeAssistantAPIError()
async def check_api_state(self):
"""Return True if Home-Assistant up and running."""
"""Return True if Home Assistant up and running."""
with suppress(HomeAssistantAPIError):
async with self.make_request('get', 'api/') as resp:
if resp.status in (200, 201):
return True
err = resp.status
status = resp.status
_LOGGER.warning("Home Assistant API config mismatch: %s", status)
_LOGGER.warning("Home-Assistant API config missmatch: %d", err)
return False
async def send_event(self, event_type, event_data=None):
"""Send event to Home-Assistant."""
with suppress(HomeAssistantAPIError):
async with self.make_request(
'get', f'api/events/{event_type}'
) as resp:
if resp.status in (200, 201):
return
err = resp.status
_LOGGER.warning("HomeAssistant event %s fails: %s", event_type, err)
return HomeAssistantError()
async def _block_till_run(self):
"""Block until Home-Assistant is booting up or startup timeout."""
start_time = time.monotonic()
migration_progress = False
migration_file = Path(self.sys_config.path_homeassistant,
'.migration_progress')
def check_port():
"""Check if port is mapped."""
@@ -448,21 +464,35 @@ class HomeAssistant(JsonConfig, CoreSysAttributes):
pass
return False
while time.monotonic() - start_time < self.wait_boot:
# Check if API response
while True:
await asyncio.sleep(5)
# 1: Check if Container is is_running
if not await self.instance.is_running():
_LOGGER.error("Home Assistant has crashed!")
break
# 2: Check if API response
if await self.sys_run_in_executor(check_port):
_LOGGER.info("Detect a running HomeAssistant instance")
_LOGGER.info("Detect a running Home Assistant instance")
self._error_state = False
return
# wait and don't hit the system
await asyncio.sleep(10)
# 3: Running DB Migration
if migration_file.exists():
if not migration_progress:
migration_progress = True
_LOGGER.info("Home Assistant record migration in progress")
continue
elif migration_progress:
migration_progress = False # Reset start time
start_time = time.monotonic()
_LOGGER.info("Home Assistant record migration done")
# Check if Container is is_running
if not await self.instance.is_running():
_LOGGER.error("Home Assistant is crashed!")
# 4: Timeout
if time.monotonic() - start_time > self.wait_boot:
_LOGGER.warning("Don't wait anymore of Home Assistant startup!")
break
_LOGGER.warning("Don't wait anymore of HomeAssistant startup!")
self._error_state = True
raise HomeAssistantError()
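Read as plain control flow, the new wait loop above polls every five seconds and distinguishes four cases: a crashed container, the API port answering, a database migration in progress (which pauses the timeout budget), and the timeout itself. A condensed sketch under the assumption that is_running() and check_port() are async callables; the real code runs the port probe in an executor.

import asyncio
import logging
import time
from pathlib import Path

_LOGGER = logging.getLogger(__name__)


async def block_till_run(is_running, check_port, migration_file: Path,
                         wait_boot: float = 300.0) -> bool:
    """Return True once the API port answers, False on crash or timeout."""
    start_time = time.monotonic()
    migration_progress = False

    while True:
        await asyncio.sleep(5)

        # 1: Container must still be running
        if not await is_running():
            _LOGGER.error("Home Assistant has crashed!")
            return False

        # 2: API port answers -> startup finished
        if await check_port():
            _LOGGER.info("Detect a running Home Assistant instance")
            return True

        # 3: Database migration in progress -> keep waiting, pause the clock
        if migration_file.exists():
            if not migration_progress:
                migration_progress = True
                _LOGGER.info("Home Assistant record migration in progress")
            continue
        if migration_progress:
            migration_progress = False
            start_time = time.monotonic()  # migration done, restart budget
            _LOGGER.info("Home Assistant record migration done")

        # 4: Startup budget exhausted
        if time.monotonic() - start_time > wait_boot:
            _LOGGER.warning("Don't wait anymore for Home Assistant startup!")
            return False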

View File

@@ -1,4 +1,4 @@
"""Host function like audio/dbus/systemd."""
"""Host function like audio, D-Bus or systemd."""
from contextlib import suppress
import logging
@@ -35,7 +35,7 @@ class HostManager(CoreSysAttributes):
@property
def apparmor(self):
"""Return host apparmor handler."""
"""Return host AppArmor handler."""
return self._apparmor
@property

View File

@@ -1,4 +1,4 @@
"""Host Audio-support."""
"""Host Audio support."""
import logging
import json
from pathlib import Path
@@ -6,7 +6,8 @@ from string import Template
import attr
from ..const import ATTR_INPUT, ATTR_OUTPUT, ATTR_DEVICES, ATTR_NAME
from ..const import (
ATTR_INPUT, ATTR_OUTPUT, ATTR_DEVICES, ATTR_NAME, CHAN_ID, CHAN_TYPE)
from ..coresys import CoreSysAttributes
_LOGGER = logging.getLogger(__name__)
@@ -19,7 +20,7 @@ class AlsaAudio(CoreSysAttributes):
"""Handle Audio ALSA host data."""
def __init__(self, coresys):
"""Initialize Alsa audio system."""
"""Initialize ALSA audio system."""
self.coresys = coresys
self._data = {
ATTR_INPUT: {},
@@ -58,7 +59,9 @@ class AlsaAudio(CoreSysAttributes):
# Process devices
for dev_id, dev_data in self.sys_hardware.audio_devices.items():
for chan_id, chan_type in dev_data[ATTR_DEVICES].items():
for chan_info in dev_data[ATTR_DEVICES]:
chan_id = chan_info[CHAN_ID]
chan_type = chan_info[CHAN_TYPE]
alsa_id = f"{dev_id},{chan_id}"
dev_name = dev_data[ATTR_NAME]
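The hunk above changes the shape of the audio channel data: dev_data[ATTR_DEVICES] used to be a mapping of channel id to channel type and is now a list of per-channel dicts keyed by CHAN_ID and CHAN_TYPE. A small illustration of the two shapes; the string values of the constants and the device/channel ids below are made up.

CHAN_ID = "chan_id"      # placeholder value, not the repository constant
CHAN_TYPE = "chan_type"  # placeholder value, not the repository constant

# Old shape: channel id -> channel type
devices_old = {"0": "playback", "1": "capture"}

# New shape: one dict per channel, which leaves room for more per-channel
# fields without changing the iteration
devices_new = [
    {CHAN_ID: "0", CHAN_TYPE: "playback"},
    {CHAN_ID: "1", CHAN_TYPE: "capture"},
]

dev_id = "0"  # made-up device id
for chan_info in devices_new:
    alsa_id = f"{dev_id},{chan_info[CHAN_ID]}"
    print(alsa_id, chan_info[CHAN_TYPE])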

View File

@@ -13,7 +13,7 @@ SYSTEMD_SERVICES = {'hassos-apparmor.service', 'hassio-apparmor.service'}
class AppArmorControl(CoreSysAttributes):
"""Handle host apparmor controls."""
"""Handle host AppArmor controls."""
def __init__(self, coresys):
"""Initialize host power handling."""
@@ -23,7 +23,7 @@ class AppArmorControl(CoreSysAttributes):
@property
def available(self):
"""Return True if AppArmor is availabe on host."""
"""Return True if AppArmor is available on host."""
return self._service is not None
def exists(self, profile):
@@ -62,12 +62,12 @@ class AppArmorControl(CoreSysAttributes):
if self.available:
await self._reload_service()
else:
_LOGGER.info("AppArmor is not enabled on Host")
_LOGGER.info("AppArmor is not enabled on host")
async def load_profile(self, profile_name, profile_file):
"""Load/Update a new/exists profile into AppArmor."""
if not validate_profile(profile_name, profile_file):
_LOGGER.error("profile is not valid with name %s", profile_name)
_LOGGER.error("Profile is not valid with name %s", profile_name)
raise HostAppArmorError()
# Copy to AppArmor folder
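For context on the load_profile hunk above: the profile is validated before it is staged into the AppArmor folder. A rough sketch with an intentionally naive validator and invented paths; the real validate_profile helper lives elsewhere in this repository and is stricter.

import logging
import shutil
from pathlib import Path

_LOGGER = logging.getLogger(__name__)


class HostAppArmorError(Exception):
    """Raised when a profile cannot be loaded."""


def validate_profile(profile_name: str, profile_file: Path) -> bool:
    """Naive stand-in: require the expected profile name in the file."""
    return f"profile {profile_name}" in profile_file.read_text()


def load_profile(profile_name: str, profile_file: Path,
                 apparmor_dir: Path) -> None:
    """Validate a profile and copy it into the AppArmor folder."""
    if not validate_profile(profile_name, profile_file):
        _LOGGER.error("Profile is not valid with name %s", profile_name)
        raise HostAppArmorError()
    shutil.copyfile(profile_file, apparmor_dir / profile_name)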

View File

@@ -24,7 +24,7 @@ class SystemControl(CoreSysAttributes):
if flag == HOSTNAME and self.sys_dbus.hostname.is_connected:
return
_LOGGER.error("No %s dbus connection available", flag)
_LOGGER.error("No %s D-Bus connection available", flag)
raise HostNotSupportedError()
async def reboot(self):
@@ -51,6 +51,6 @@ class SystemControl(CoreSysAttributes):
"""Set local a new Hostname."""
self._check_dbus(HOSTNAME)
_LOGGER.info("Set Hostname %s", hostname)
_LOGGER.info("Set hostname %s", hostname)
await self.sys_dbus.hostname.set_static_hostname(hostname)
await self.sys_host.info.update()

View File

@@ -48,7 +48,7 @@ class InfoCenter(CoreSysAttributes):
async def update(self):
"""Update properties over dbus."""
if not self.sys_dbus.hostname.is_connected:
_LOGGER.error("No hostname dbus connection available")
_LOGGER.error("No hostname D-Bus connection available")
raise HostNotSupportedError()
_LOGGER.info("Update local host information")

Some files were not shown because too many files have changed in this diff.