Compare commits

168 Commits
0.14 ... 0.29

Author SHA1 Message Date
Pascal Vizeli
81e1227a7b Merge pull request #59 from home-assistant/dev
Release 0.29
2017-05-17 23:43:09 +02:00
Pascal Vizeli
75be8666a6 Update version.json 2017-05-17 23:41:50 +02:00
Pascal Vizeli
6031a60084 Add addon share and allow to mount host mnt (#58)
* Add addon share and allow to mount host mnt

* fix comments logs
2017-05-17 17:21:54 +02:00
Pascal Vizeli
39d5785118 Allow config.json to manipulate docker env. (#57)
Allow config.json to manipulate docker env.
2017-05-17 14:45:02 +02:00
Pascal Vizeli
bddcdcadb2 Pump version 2017-05-17 14:25:57 +02:00
Pascal Vizeli
3eac6a3366 Merge pull request #56 from home-assistant/dev
Release 0.28
2017-05-16 22:31:47 +02:00
Pascal Vizeli
3c7b962cf9 update hass.io 2017-05-16 22:17:19 +02:00
Pascal Vizeli
bd756e2a9c fix dev regex 2017-05-16 22:11:42 +02:00
Pascal Vizeli
e7920bee2a Fix group regex 2017-05-16 22:03:51 +02:00
Pascal Vizeli
ebcc21370e Add policy for mappings (#55)
* Add policy for mappings

* fix travis
2017-05-16 17:15:35 +02:00
Pascal Vizeli
34c4acf199 Add device support (#54)
Add device support
2017-05-16 14:50:47 +02:00
Pascal Vizeli
47e45dfc9f Pump version 2017-05-16 11:45:20 +02:00
Pascal Vizeli
2ecea7c1b4 Merge pull request #53 from home-assistant/dev
Release 0.27
2017-05-16 00:20:29 +02:00
Pascal Vizeli
5c0eccd12f Bugfix attach container/image (#52) 2017-05-16 00:07:43 +02:00
Pascal Vizeli
f34ab9402b Fix remove (#51) 2017-05-15 23:39:34 +02:00
Pascal Vizeli
2569a82caf Update Hass.IO version 2017-05-15 23:27:18 +02:00
Pascal Vizeli
4bdd256000 Use label instead env, cleanup build (#50)
* Use label instead env, cleanup build

* Update const.py

* fix lint

* add space

* fix lint

* use dynamic type

* fix lint

* fix path

* fix label read

* fix bug
2017-05-15 23:19:35 +02:00
Pascal Vizeli
6f4f6338c5 Pump version 2017-05-15 16:35:30 +02:00
Pascal Vizeli
7cb72b55a8 Merge pull request #49 from home-assistant/dev
Release 0.26
2017-05-15 00:26:30 +02:00
Pascal Vizeli
1a9a08cbfb Update version.json 2017-05-15 00:17:59 +02:00
Pascal Vizeli
237ee0363d Update error message (#48) 2017-05-15 00:08:15 +02:00
Pascal Vizeli
86180ddc34 Allow every repository to make a local build (#47)
* Allow every repository to make a local build

* store version of build

* cleanup code

* fix lint
2017-05-14 23:32:54 +02:00
Pascal Vizeli
eed41d30ec Update const.py 2017-05-13 23:17:23 +02:00
Pascal Vizeli
0b0fd6b910 Add files via upload 2017-05-13 19:14:54 +02:00
Pascal Vizeli
1f887b47ab register panel on core 2017-05-13 17:44:16 +02:00
Pascal Vizeli
affd8057ca WIP Add panel to hass.io (#46)
* Add poliymare

* Add commit

* add static route

* fix name

* add panel
2017-05-13 17:41:46 +02:00
Pascal Vizeli
7a8ee2c46a Merge pull request #45 from home-assistant/dev
Release 0.25
2017-05-12 16:20:12 +02:00
Pascal Vizeli
35fe1f464c Update Hass.IO 0.25 2017-05-12 16:15:18 +02:00
Pascal Vizeli
0955bafebd Update data handling of addons (#44)
* Update data handling of addons

* Update addons api

* Update data.py

* Update data.py

* Add url fix bug
2017-05-12 16:14:49 +02:00
Pascal Vizeli
2e0c540c63 Pump version 2017-05-12 08:53:20 +02:00
Pascal Vizeli
6e9ef17a28 Merge pull request #43 from home-assistant/dev
Release 0.24
2017-05-12 01:47:48 +02:00
Pascal Vizeli
eb3cdbfeb9 update Hass.IO 0.24 2017-05-12 01:37:42 +02:00
Pascal Vizeli
f4cb16ad09 WIP: Add support for build docker on local repository (#42)
* Add support for build docker on local repository

* Add docker support

* finish build

* change api

* add dockerfile generator

* finish it

* fix lint

* fix path

* fix path

* fix copy

* add debug stuff

* fix docker template

* cleanups

* fix addons

* change handling

* fix lint / cleanup code

* fix lint

* tag
2017-05-12 01:37:03 +02:00
Pascal Vizeli
956af2bd62 Add security api and TOTP on supervisor (#41)
* Add security api and TOTP on supervisor

* finish security api

* fix lint

* fix lint p2

* add new api view to init

* Task session cleanup / fix hass wachdog

* fix lint

* fix api return

* fix check
2017-05-10 22:02:47 +02:00
Pascal Vizeli
b76cd5c004 Add files via upload 2017-05-10 17:01:35 +02:00
Pascal Vizeli
61d9301dcc Add files via upload 2017-05-10 11:06:34 +02:00
Pascal Vizeli
2ded05be83 Add files via upload 2017-05-10 10:39:00 +02:00
Pascal Vizeli
899d6766c5 Pump version to 0.24 2017-05-10 00:30:11 +02:00
Pascal Vizeli
c67d57cef4 Update version.json 2017-05-10 00:15:59 +02:00
Pascal Vizeli
b5cca7d341 Update __init__.py 2017-05-10 00:10:11 +02:00
Pascal Vizeli
8919f13911 Fix api (#40)
* Bugfix api call

* add log
2017-05-10 00:00:30 +02:00
Pascal Vizeli
990ae49608 Add tasks and watchdog for homeassistant (#39)
* Add tasks and watchdog for homeassistant

* code cleanup
2017-05-09 23:08:15 +02:00
Pascal Vizeli
c2ba02722c Add arch to addon config / hole api (#38)
* Add arch to addon config / hole api

* fix wrong copy past
2017-05-09 17:03:59 +02:00
Pascal Vizeli
5bd1957337 Update README.md 2017-05-09 09:06:48 +02:00
Pascal Vizeli
f59f0793bc Add restart support to homeassistant and add-ons (#37) 2017-05-08 23:31:30 +02:00
Pascal Vizeli
63b96700e0 Add more infos to /addons/xy/info (#36) 2017-05-08 19:46:08 +02:00
Pascal Vizeli
dffbcc2c7e Pump version to 0.23 2017-05-08 13:00:57 +02:00
Pascal Vizeli
0dbe1ecc2a Merge remote-tracking branch 'origin/master' into dev 2017-05-08 12:58:59 +02:00
Pascal Vizeli
da8526fcec Update Version 0.44.2 - 0.22 2017-05-08 12:00:47 +02:00
Pascal Vizeli
933b6f4d1e Update hass to 0.44.2 2017-05-08 07:11:35 +02:00
Pascal Vizeli
16f2dfeebd Revert 0.44.1 2017-05-08 06:59:53 +02:00
Pascal Vizeli
bc6eb5cab4 Merge pull request #35 from home-assistant/dev
Release 0.22
2017-05-08 06:47:04 +02:00
Pascal Vizeli
8833845b2e add url to api 2017-05-08 01:20:36 +02:00
Pascal Vizeli
391be6afac Update to hassio 0.22 and hass 0.44.1 2017-05-08 01:09:04 +02:00
Pascal Vizeli
a4f74676b6 add slug 2017-05-08 00:55:21 +02:00
Pascal Vizeli
600b32f75b add manifest (#34) 2017-05-08 00:44:07 +02:00
Pascal Vizeli
f199a5cf95 Update log info 2017-05-08 00:32:45 +02:00
Pascal Vizeli
5896fde441 Add support for 0.44.1 (#33)
* Add support for 0.44.1

* fix lint

* fix lint

* fix lint
2017-05-08 00:19:57 +02:00
Pascal Vizeli
9998f9720f pump version 2017-05-07 18:49:25 +02:00
Pascal Vizeli
f37589daa6 Merge remote-tracking branch 'origin/dev' 2017-05-07 18:47:21 +02:00
Pascal Vizeli
ce2513f175 Merge branch 'dev' 2017-05-07 18:46:53 +02:00
Pascal Vizeli
1a4c5d24a4 dev relase 0.21 from hass.io 2017-05-07 17:27:21 +02:00
Pascal Vizeli
886d202f39 Fix bug 2017-05-07 17:21:06 +02:00
Pascal Vizeli
5a42019ed7 Rollback update 0.21 2017-05-07 17:18:50 +02:00
Pascal Vizeli
354093c121 Update Hass.IO to 0.21 2017-05-07 17:15:53 +02:00
Pascal Vizeli
aa9c300d7c Return only installed addons inside info call (#32) 2017-05-07 16:33:27 +02:00
Pascal Vizeli
d9ad5daae3 Update HASS Version 2017-05-07 09:07:37 +02:00
Pascal Vizeli
4680ba6d0d Update HASS Version 2017-05-07 09:06:48 +02:00
Pascal Vizeli
1423062ac3 Merge pull request #31 from home-assistant/master
Master
2017-05-04 21:30:28 +02:00
Pascal Vizeli
a036096684 Update to ResinOS 0.7 2017-05-04 20:50:06 +02:00
Pascal Vizeli
a1c443a6f2 Update to ResinOS 0.7 2017-05-04 20:47:51 +02:00
Paulus Schoutsen
e6e1367cd6 Update README.md 2017-05-03 21:39:38 -07:00
Pascal Vizeli
303e741289 Pump version 2017-05-03 00:15:19 +02:00
Pascal Vizeli
79cc23e273 Merge pull request #30 from home-assistant/dev
Release 0.20
2017-05-03 00:12:22 +02:00
Pascal Vizeli
046ce0230a Fix bug 2017-05-02 23:50:39 +02:00
Pascal Vizeli
33a66bee01 Fix wrong parser 2017-05-02 23:35:31 +02:00
Pascal Vizeli
f76749a933 Cleanup output 2017-05-02 23:33:51 +02:00
Pascal Vizeli
385af5bef5 Update version.json 2017-05-02 23:17:23 +02:00
Pascal Vizeli
19b72b1a79 add logoutbut (#29)
* add logoutbut

* Protect host update

* schedule a hostcontrol update task

* change log level

* cleanup names
2017-05-02 23:16:58 +02:00
Pascal Vizeli
f6048467ad Update api infos (#28)
* New cleanup

* Cleanup addons data object from api stuff.

* Fix lint

* Fix repo export

* Fix part 2

* Update API.md
2017-05-02 18:50:35 +02:00
Pascal Vizeli
dbe6b860c7 Pump version 2017-05-02 02:28:40 +02:00
Pascal Vizeli
0a1e6b3e2d Merge pull request #27 from home-assistant/dev
Release 0.19
2017-05-02 02:28:01 +02:00
Pascal Vizeli
f178dde589 fix lint 2017-05-02 02:25:22 +02:00
Pascal Vizeli
545d45ecf0 Add more exceptions 2017-05-02 02:24:19 +02:00
Pascal Vizeli
c333f94cfa Fix options path 2017-05-02 02:01:12 +02:00
Pascal Vizeli
76a999f650 Update __init__.py 2017-05-02 01:55:48 +02:00
Pascal Vizeli
d2f8e35622 Fix path 2017-05-02 01:36:15 +02:00
Pascal Vizeli
1e4ed9c9d1 Fix str by docker extern 2017-05-02 01:34:54 +02:00
Pascal Vizeli
57c21b4eb5 Fix glob 2017-05-02 01:27:53 +02:00
Pascal Vizeli
088cc3ef15 Only loop dir in custom 2017-05-02 01:24:42 +02:00
Pascal Vizeli
9e326f6324 Remove old file handling 2017-05-02 01:09:39 +02:00
Pascal Vizeli
5840290df7 Fix bug 2017-05-02 00:57:03 +02:00
Pascal Vizeli
58fdabb8ff Fix relative glob 2017-05-02 00:32:56 +02:00
Pascal Vizeli
b8fc002fbc Update new path 2017-05-02 00:24:29 +02:00
Pascal Vizeli
d5e349266b Convert socket to string 2017-05-02 00:17:19 +02:00
Pascal Vizeli
4c135dd617 Convert dock to string 2017-05-02 00:16:39 +02:00
Pascal Vizeli
a98a0d1a1a Fix HC for new path lib 2017-05-02 00:15:50 +02:00
Pascal Vizeli
768b4d2b1a Start test for 0.19 2017-05-02 00:06:20 +02:00
Pascal Vizeli
ff640c598d Support for repository store. (#26)
* Support for repository store.

* Fix api

* part 1 of restruct and migrate pathlib

* Migrate p2

* fix lint / cleanups

* fix lint p2

* fix lint p3
2017-05-02 00:02:55 +02:00
Pascal Vizeli
c76408e4e8 Move list of all available addons to own api call (#24)
* Move list of all available addons to own api call

* fix lint

* fix lint

* fix style
2017-04-30 22:15:03 +02:00
Pascal Vizeli
2e168d089c Fix validate required arguments (#25) 2017-04-30 19:03:26 +02:00
Pascal Vizeli
a61311e928 Don't store homeassistant image from env to config (#23) 2017-04-30 09:46:19 +02:00
Pascal Vizeli
85ffba90b2 Fix severals things with json (#22) 2017-04-30 01:57:03 +02:00
Pascal Vizeli
6623ec9bbc Update README.md 2017-04-30 01:47:20 +02:00
Pascal Vizeli
7a4ed4029c Is not needed anymore 2017-04-28 23:40:11 +02:00
Pascal Vizeli
d4c52ce298 Merge remote-tracking branch 'origin/master' into dev 2017-04-28 23:37:01 +02:00
Pascal Vizeli
c0aab8497f Use now dev for beta_upstream 2017-04-28 23:06:33 +02:00
Pascal Vizeli
cabe5b143f update reamde 2017-04-28 22:56:32 +02:00
Pascal Vizeli
3a791bace6 Add schema 2017-04-28 22:54:09 +02:00
Pascal Vizeli
19dfdb094b Update 0.43.2 2017-04-28 19:00:17 +02:00
Pascal Vizeli
ec5caa483e Update HomeAssistant 0.34.2 2017-04-28 18:59:48 +02:00
Pascal Vizeli
e6967f8db5 Update version_beta.json 2017-04-28 11:18:03 +02:00
Pascal Vizeli
759356ff2e ResinOS 0.6 2017-04-28 11:17:45 +02:00
Pascal Vizeli
4d8dbdb486 Update README.md 2017-04-28 10:43:17 +02:00
Pascal Vizeli
cddb40a104 HassIO 0.18 2017-04-28 02:21:58 +02:00
Pascal Vizeli
36128d119a HassIO 0.18 2017-04-28 02:21:23 +02:00
Pascal Vizeli
dadb72aca8 Pump version 0.19 2017-04-28 02:09:21 +02:00
Pascal Vizeli
390d4fa6c7 Merge pull request #21 from home-assistant/dev
Release 0.18
2017-04-28 02:05:44 +02:00
Pascal Vizeli
9559c39351 fix update merge 2017-04-28 01:56:07 +02:00
Pascal Vizeli
9d95b70534 fix list remove 2017-04-28 01:40:06 +02:00
Pascal Vizeli
3bad896978 fix handling 2017-04-28 01:26:37 +02:00
Pascal Vizeli
9f406df129 Fix regex 2017-04-28 01:12:27 +02:00
Pascal Vizeli
a9b4174590 Add support for arch in custom image name 2017-04-28 01:07:54 +02:00
Pascal Vizeli
9109e3803b Use in every case a hash for addon name 2017-04-28 01:01:57 +02:00
Pascal Vizeli
422dd78489 Fix list handling 2017-04-28 00:45:53 +02:00
Pascal Vizeli
9e1d6c9d2b Change log output / fix bug with store repository 2017-04-28 00:26:58 +02:00
Pascal Vizeli
c8e3f2b48a Merge pull request #20 from pvizeli/multible_git_repos
Allow custom repository  / improve config validate
2017-04-27 23:54:29 +02:00
Pascal Vizeli
a287f52e47 fix spell 2017-04-27 23:43:18 +02:00
Pascal Vizeli
76952db3eb fix old code 2017-04-27 23:37:35 +02:00
Pascal Vizeli
871721f04b Fix lint p2 2017-04-27 23:31:34 +02:00
Pascal Vizeli
3c0ebdf643 fix lint 2017-04-27 23:28:58 +02:00
Pascal Vizeli
0e258a4ae0 short hash 2017-04-27 23:18:05 +02:00
Pascal Vizeli
645a8e2372 Add id to addons slug 2017-04-27 23:09:52 +02:00
Pascal Vizeli
c6cc8adbb7 Change handling with repo list 2017-04-27 21:58:21 +02:00
pvizeli
dd38c73b85 Fix lint p4 2017-04-27 21:08:28 +02:00
pvizeli
c916314704 Fix lint p3 2017-04-27 21:08:28 +02:00
pvizeli
e0dcce5895 Fix lint p2 2017-04-27 21:08:28 +02:00
pvizeli
906616e224 Update description 2017-04-27 21:08:28 +02:00
pvizeli
db20ea95d9 Fix lint 2017-04-27 21:08:28 +02:00
pvizeli
d142ea5d23 Allow custome repository / improve config validate 2017-04-27 21:08:28 +02:00
Pascal Vizeli
5d52404dab Update new docker hup 2017-04-27 17:11:33 +02:00
Pascal Vizeli
43f4b36cfe Change status code on api error 2017-04-27 16:25:49 +02:00
Pascal Vizeli
0393db19e6 Pump version to 0.18 2017-04-27 09:33:58 +02:00
Pascal Vizeli
b197578df4 HassIO 0.17 2017-04-27 08:55:42 +02:00
Pascal Vizeli
ed428c0df4 HassIO 0.17 2017-04-27 08:55:15 +02:00
Pascal Vizeli
d38707821c Merge pull request #19 from home-assistant/dev
Release 0.17
2017-04-27 08:44:15 +02:00
Pascal Vizeli
cfb392054e Merge pull request #18 from pvizeli/update_v3_api
* Update POST/GET api for Hass

* fix lint
2017-04-27 08:42:40 +02:00
pvizeli
0ea65efeb3 fix lint 2017-04-27 08:41:33 +02:00
pvizeli
c4ce7d1a74 Update POST/GET api for Hass 2017-04-27 08:31:42 +02:00
Pascal Vizeli
7ac95b98bc Update readme and fix links 2017-04-26 23:16:00 +02:00
Pascal Vizeli
f8413d8d63 Update version of HassIO 2017-04-26 23:08:30 +02:00
Pascal Vizeli
709b80b864 Pump version 0.17 2017-04-26 22:42:54 +02:00
Pascal Vizeli
b5b68c5c42 Release 0.16
Release 0.16
2017-04-26 22:38:16 +02:00
Pascal Vizeli
d58e847978 Merge remote-tracking branch 'origin/master' into dev 2017-04-26 22:19:08 +02:00
Pascal Vizeli
aad9ae6997 Add OS attribute for hostcontrol (#16)
* Add OS attribute for hostcontrol

* fix lint
2017-04-26 22:14:58 +02:00
Pascal Vizeli
139cf4fae4 Update path (#15) 2017-04-26 22:08:49 +02:00
Pascal Vizeli
e01b2da223 Cleanup 2017-04-26 21:49:29 +02:00
Pascal Vizeli
cbbe2d2d3c Cleanup 2017-04-26 21:48:27 +02:00
Pascal Vizeli
7ca11a96b9 Pump version 2017-04-26 21:47:04 +02:00
Pascal Vizeli
3443d6d715 Update API.md 2017-04-26 17:13:16 +02:00
Pascal Vizeli
99730734a0 update generic HostControll v0.2 2017-04-26 12:59:19 +02:00
Pascal Vizeli
20fcd28dbe Update version.json 2017-04-26 11:27:29 +02:00
Pascal Vizeli
76cead72e8 Update API for hass api v2 (#14)
* Update API for hass api v2

* fix lint

* Refactory the old version of host_control

* cleanup

* Cleanup name inside addons/data

* Cleanup name inside addons/data p2

* Rename api list

* Fix path bug

* Fix wrong config set
2017-04-26 11:15:56 +02:00
Pascal Vizeli
a0f17ffd1d Update version_beta.json 2017-04-26 10:38:20 +02:00
Pascal Vizeli
86d92bdfa2 Update version.json 2017-04-26 10:37:56 +02:00
pvizeli
25a0bc6549 Start rename options inside version 2017-04-26 08:19:16 +02:00
Pascal Vizeli
96971e7054 Update version.json 2017-04-26 00:21:22 +02:00
Pascal Vizeli
2729877fbf Update version_beta.json 2017-04-26 00:10:36 +02:00
41 changed files with 1698 additions and 613 deletions

.gitmodules (new vendored file, +3 lines)

@@ -0,0 +1,3 @@
[submodule "home-assistant-polymer"]
path = home-assistant-polymer
url = https://github.com/home-assistant/home-assistant-polymer

API.md (200 changed lines)

@@ -2,7 +2,7 @@
## HassIO REST API ## HassIO REST API
Interface for HomeAssistant to controll things from supervisor. Interface for HomeAssistant to control things from supervisor.
On error: On error:
```json ```json
@@ -22,29 +22,71 @@ On success
### HassIO ### HassIO
- `/supervisor/ping` - GET `/supervisor/ping`
- `/supervisor/info` - GET `/supervisor/info`
The addons from `addons` are only installed one.
```json ```json
{ {
"version": "INSTALL_VERSION", "version": "INSTALL_VERSION",
"current": "CURRENT_VERSION", "last_version": "LAST_VERSION",
"beta": "true|false", "arch": "armhf|aarch64|i386|amd64",
"beta_channel": "true|false",
"addons": [ "addons": [
{ {
"name": "xy bla", "name": "xy bla",
"slug": "xy", "slug": "xy",
"version": "CURRENT_VERSION", "description": "description",
"arch": ["armhf", "aarch64", "i386", "amd64"],
"repository": "12345678|null",
"version": "LAST_VERSION",
"installed": "INSTALL_VERSION",
"detached": "bool",
"build": "bool",
"url": "null|url"
}
],
"addons_repositories": [
"REPO_URL"
]
}
```
- GET `/supervisor/addons`
Get all available addons
```json
{
"addons": [
{
"name": "xy bla",
"slug": "xy",
"description": "description",
"arch": ["armhf", "aarch64", "i386", "amd64"],
"repository": "core|local|REP_ID",
"version": "LAST_VERSION",
"installed": "none|INSTALL_VERSION", "installed": "none|INSTALL_VERSION",
"dedicated": "bool", "detached": "bool",
"description": "description" "build": "bool",
"url": "null|url"
}
],
"repositories": [
{
"slug": "12345678",
"name": "Repitory Name",
"source": "URL_OF_REPOSITORY",
"url": "null|WEBSITE",
"maintainer": "null|BLA BLU <fla@dld.ch>"
} }
] ]
} }
``` ```
- `/supervisor/update` - POST `/supervisor/update`
Optional: Optional:
```json ```json
{ {
@@ -52,40 +94,78 @@ Optional:
} }
``` ```
- `/supervisor/option` - POST `/supervisor/options`
```json ```json
{ {
"beta": "true|false" "beta_channel": "true|false",
"addons_repositories": [
"REPO_URL"
]
} }
``` ```
- `/supervisor/reload` - POST `/supervisor/reload`
Reload addons/version. Reload addons/version.
- `/supervisor/logs` - GET `/supervisor/logs`
Output the raw docker log Output the raw docker log
### Host ### Security
- `/host/shutdown` - GET `/security/info`
- `/host/reboot`
- `/host/info`
See HostControll info command.
```json ```json
{ {
"os": "", "initialize": "bool",
"version": "", "totp": "bool"
"current": "",
"level": "",
"hostname": "",
} }
``` ```
- `/host/update` - POST `/security/options`
```json
{
"password": "xy"
}
```
- POST `/security/totp`
```json
{
"password": "xy"
}
```
Return QR-Code
- POST `/security/session`
```json
{
"password": "xy",
"totp": "null|123456"
}
```
### Host
- POST `/host/shutdown`
- POST `/host/reboot`
- GET `/host/info`
See HostControl info command.
```json
{
"type": "",
"version": "",
"last_version": "",
"features": ["shutdown", "reboot", "update", "network_info", "network_control"],
"hostname": "",
"os": ""
}
```
- POST `/host/update`
Optional: Optional:
```json ```json
{ {
@@ -95,9 +175,9 @@ Optional:
### Network ### Network
- `/network/info` - GET `/network/info`
- `/network/options` - POST `/network/options`
```json ```json
{ {
"hostname": "", "hostname": "",
@@ -111,16 +191,16 @@ Optional:
### HomeAssistant ### HomeAssistant
- `/homeassistant/info` - GET `/homeassistant/info`
```json ```json
{ {
"version": "INSTALL_VERSION", "version": "INSTALL_VERSION",
"current": "CURRENT_VERSION" "last_version": "LAST_VERSION"
} }
``` ```
- `/homeassistant/update` - POST `/homeassistant/update`
Optional: Optional:
```json ```json
{ {
@@ -128,36 +208,44 @@ Optional:
} }
``` ```
- `/homeassistant/logs` - GET `/homeassistant/logs`
Output the raw docker log Output the raw docker log
- POST `/homeassistant/restart`
### REST API addons ### REST API addons
- `/addons/{addon}/info` - GET `/addons/{addon}/info`
```json ```json
{ {
"name": "xy bla",
"description": "description",
"url": "null|url of addon",
"detached": "bool",
"repository": "12345678|null",
"version": "VERSION", "version": "VERSION",
"current": "CURRENT_VERSION", "last_version": "LAST_VERSION",
"state": "started|stopped", "state": "started|stopped",
"boot": "auto|manual",
"build": "bool",
"options": {},
}
```
- POST `/addons/{addon}/options`
```json
{
"boot": "auto|manual", "boot": "auto|manual",
"options": {}, "options": {},
} }
``` ```
- `/addons/{addon}/options` - POST `/addons/{addon}/start`
```json
{
"boot": "auto|manual",
"options": {},
}
```
- `/addons/{addon}/start` - POST `/addons/{addon}/stop`
- `/addons/{addon}/stop` - POST `/addons/{addon}/install`
- `/addons/{addon}/install`
Optional: Optional:
```json ```json
{ {
@@ -165,9 +253,9 @@ Optional:
} }
``` ```
- `/addons/{addon}/uninstall` - POST `/addons/{addon}/uninstall`
- `/addons/{addon}/update` - POST `/addons/{addon}/update`
Optional: Optional:
```json ```json
{ {
@@ -175,18 +263,20 @@ Optional:
} }
``` ```
- `/addons/{addon}/logs` - GET `/addons/{addon}/logs`
Output the raw docker log Output the raw docker log
## Host Controll - POST `/addons/{addon}/restart`
## Host Control
Communicate over unix socket with a host daemon. Communicate over unix socket with a host daemon.
- commands - commands
``` ```
# info # info
-> {'os', 'version', 'current', 'level', 'hostname'} -> {'type', 'version', 'last_version', 'features', 'hostname'}
# reboot # reboot
# shutdown # shutdown
# host-update [v] # host-update [v]
@@ -200,10 +290,12 @@ Communicate over unix socket with a host daemon.
# network int route xy # network int route xy
``` ```
level: features:
- 1: power functions - shutdown
- 2: host update - reboot
- 4: network functions - update
- network_info
- network_control
Answer: Answer:
``` ```
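
Since the documented endpoints are plain HTTP with JSON bodies, they can be exercised with any HTTP client. Below is a minimal client sketch against two of the new routes (GET `/supervisor/info`, POST `/supervisor/options`); the base URL and the use of the `requests` library are assumptions for illustration only, not something this changeset defines.

```python
"""Minimal client sketch for the supervisor REST API documented above.

Assumptions for illustration only: the supervisor is reachable at
SUPERVISOR_URL and the `requests` package is installed.
"""
import requests

SUPERVISOR_URL = "http://hassio"  # placeholder base URL, adjust for your setup


def supervisor_info():
    """GET /supervisor/info and return the parsed JSON body."""
    response = requests.get("{}/supervisor/info".format(SUPERVISOR_URL))
    response.raise_for_status()
    return response.json()


def set_beta_channel(enabled):
    """POST /supervisor/options to switch the beta channel on or off."""
    response = requests.post(
        "{}/supervisor/options".format(SUPERVISOR_URL),
        json={"beta_channel": enabled})
    response.raise_for_status()


if __name__ == "__main__":
    print(supervisor_info())
```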

MANIFEST.in (new file, +3 lines)

@@ -0,0 +1,3 @@
include LICENSE.md
graft hassio
recursive-exclude * *.py[co]

README.md

@@ -1,56 +1,26 @@
# HassIO # HassIO
First private cloud solution for home automation. ### First private cloud solution for home automation.
It is a docker image (supervisor) they manage HomeAssistant docker and give a interface to controll itself over UI. It have a own eco system with addons to extend the functionality in a easy way. Hass.io is a Docker based system for managing your Home Assistant installation and related applications. The system is controlled via Home Assistant which communicates with the supervisor. The supervisor provides an API to manage the installation. This includes changing network settings or installing and updating software.
[HassIO-Addons](https://github.com/pvizeli/hassio-addons) | [HassIO-Build](https://github.com/pvizeli/hassio-build) ![](misc/hassio.png?raw=true)
**HassIO is at the moment on development and not ready to use productive!** [HassIO-Addons](https://github.com/home-assistant/hassio-addons) | [HassIO-Build](https://github.com/home-assistant/hassio-build)
## Feature in progress **HassIO is under active development and is not ready yet for production use.**
- Backup/Restore
- MQTT addon ## Installing Hass.io
- DHCP-Server addon
Looks to our [website](https://home-assistant.io/hassio).
# HomeAssistant # HomeAssistant
## SSL ## SSL
All addons they can create SSL certs do that in same schema. So you can put follow lines to your `configuration.yaml`. All addons that create SSL certs follow the same file structure. If you use one, put follow lines in your `configuration.yaml`.
```yaml ```yaml
http: http:
ssl_certificate: /ssl/fullchain.pem ssl_certificate: /ssl/fullchain.pem
ssl_key: /ssl/privkey.pem ssl_key: /ssl/privkey.pem
``` ```
# Hardware Image
The image is based on ResinOS and Yocto Linux. It comes with the HassIO supervisor pre-installed. This includes support to update the supervisor over the air. After flashing your host OS will not require any more maintenance! The image does not include Home Assistant, instead it will downloaded when the image boots up for the first time.
Download can be found here: https://drive.google.com/drive/folders/0B2o1Uz6l1wVNbFJnb2gwNXJja28?usp=sharing
After extracting the archive, flash it to a drive using [Etcher](https://etcher.io/).
## History
- **0.1**: First techpreview with dumy supervisor (ResinOS 2.0.0-RC5)
- **0.2**: Fix some bugs and update it to HassIO 0.2
- **0.3**: Update HostControll and feature for HassIO 0.3 (ResinOS 2.0.0 / need reflash)
- **0.4**: Update HostControll and bring resinos OTA (resinhub) back (ResinOS 2.0.0-rev3)
## Configuring the image
You can configure the WiFi network that the image should connect to after flashing using [`resin-device-toolbox`](https://resinos.io/docs/raspberrypi3/gettingstarted/#install-resin-device-toolbox).
## Developer access to ResinOS host
Create an `authorized_keys` file in the boot partition of your SD card with your public key. After a boot it, you can acces your device as root over ssh on port 22222.
## Troubleshooting
Read logoutput from supervisor:
```bash
journalctl -f -u resin-supervisor.service
docker logs homeassistant
```
## Install on a own System
We have a installer to install HassIO on own linux device without our hardware image:
https://github.com/pvizeli/hassio-build/tree/master/install

hassio/addons/__init__.py

@@ -1,11 +1,10 @@
"""Init file for HassIO addons.""" """Init file for HassIO addons."""
import asyncio import asyncio
import logging import logging
import os
import shutil import shutil
from .data import AddonsData from .data import AddonsData
from .git import AddonsRepo from .git import AddonsRepoHassIO, AddonsRepoCustom
from ..const import STATE_STOPPED, STATE_STARTED from ..const import STATE_STOPPED, STATE_STARTED
from ..dock.addon import DockerAddon from ..dock.addon import DockerAddon
@@ -21,16 +20,29 @@ class AddonManager(AddonsData):
self.loop = loop self.loop = loop
self.dock = dock self.dock = dock
self.repo = AddonsRepo(config, loop) self.repositories = []
self.dockers = {} self.dockers = {}
async def prepare(self, arch): async def prepare(self, arch):
"""Startup addon management.""" """Startup addon management."""
self.arch = arch self.arch = arch
# init hassio repository
self.repositories.append(AddonsRepoHassIO(self.config, self.loop))
# init custom repositories
for url in self.config.addons_repositories:
self.repositories.append(
AddonsRepoCustom(self.config, self.loop, url))
# load addon repository # load addon repository
if await self.repo.load(): tasks = [addon.load() for addon in self.repositories]
self.read_addons_repo() if tasks:
await asyncio.wait(tasks, loop=self.loop)
# read data from repositories
self.read_data_from_repositories()
self.merge_update_config()
# load installed addons # load installed addons
for addon in self.list_installed: for addon in self.list_installed:
@@ -38,23 +50,53 @@ class AddonManager(AddonsData):
self.config, self.loop, self.dock, self, addon) self.config, self.loop, self.dock, self, addon)
await self.dockers[addon].attach() await self.dockers[addon].attach()
async def add_git_repository(self, url):
"""Add a new custom repository."""
if url in self.config.addons_repositories:
_LOGGER.warning("Repository already exists %s", url)
return False
repo = AddonsRepoCustom(self.config, self.loop, url)
if not await repo.load():
_LOGGER.error("Can't load from repository %s", url)
return False
self.config.addons_repositories = url
self.repositories.append(repo)
return True
def drop_git_repository(self, url):
"""Remove a custom repository."""
for repo in self.repositories:
if repo.url == url:
self.repositories.remove(repo)
self.config.drop_addon_repository(url)
repo.remove()
return True
return False
async def reload(self): async def reload(self):
"""Update addons from repo and reload list.""" """Update addons from repo and reload list."""
if not await self.repo.pull(): tasks = [addon.pull() for addon in self.repositories]
if not tasks:
return return
self.read_addons_repo()
await asyncio.wait(tasks, loop=self.loop)
# read data from repositories
self.read_data_from_repositories()
self.merge_update_config()
# remove stalled addons # remove stalled addons
for addon in self.list_removed: for addon in self.list_detached:
_LOGGER.warning("Dedicated addon '%s' found!", addon) _LOGGER.warning("Dedicated addon '%s' found!", addon)
async def auto_boot(self, start_type): async def auto_boot(self, start_type):
"""Boot addons with mode auto.""" """Boot addons with mode auto."""
boot_list = self.list_startup(start_type) boot_list = self.list_startup(start_type)
tasks = [] tasks = [self.start(addon) for addon in boot_list]
for addon in boot_list:
tasks.append(self.loop.create_task(self.start(addon)))
_LOGGER.info("Startup %s run %d addons", start_type, len(tasks)) _LOGGER.info("Startup %s run %d addons", start_type, len(tasks))
if tasks: if tasks:
@@ -66,19 +108,23 @@ class AddonManager(AddonsData):
_LOGGER.error("Addon %s not exists for install", addon) _LOGGER.error("Addon %s not exists for install", addon)
return False return False
if self.arch not in self.get_arch(addon):
_LOGGER.error("Addon %s not supported on %s", addon, self.arch)
return False
if self.is_installed(addon): if self.is_installed(addon):
_LOGGER.error("Addon %s is already installed", addon) _LOGGER.error("Addon %s is already installed", addon)
return False return False
if not os.path.isdir(self.path_data(addon)): if not self.path_data(addon).is_dir():
_LOGGER.info("Create Home-Assistant addon data folder %s", _LOGGER.info("Create Home-Assistant addon data folder %s",
self.path_data(addon)) self.path_data(addon))
os.mkdir(self.path_data(addon)) self.path_data(addon).mkdir()
addon_docker = DockerAddon( addon_docker = DockerAddon(
self.config, self.loop, self.dock, self, addon) self.config, self.loop, self.dock, self, addon)
version = version or self.get_version(addon) version = version or self.get_last_version(addon)
if not await addon_docker.install(version): if not await addon_docker.install(version):
return False return False
@@ -99,10 +145,10 @@ class AddonManager(AddonsData):
if not await self.dockers[addon].remove(): if not await self.dockers[addon].remove():
return False return False
if os.path.isdir(self.path_data(addon)): if self.path_data(addon).is_dir():
_LOGGER.info("Remove Home-Assistant addon data folder %s", _LOGGER.info("Remove Home-Assistant addon data folder %s",
self.path_data(addon)) self.path_data(addon))
shutil.rmtree(self.path_data(addon)) shutil.rmtree(str(self.path_data(addon)))
self.dockers.pop(addon) self.dockers.pop(addon)
self.set_addon_uninstall(addon) self.set_addon_uninstall(addon)
@@ -144,8 +190,8 @@ class AddonManager(AddonsData):
_LOGGER.error("No docker found for addon %s", addon) _LOGGER.error("No docker found for addon %s", addon)
return False return False
version = version or self.get_version(addon) version = version or self.get_last_version(addon)
is_running = self.dockers[addon].is_running() is_running = await self.dockers[addon].is_running()
# update # update
if await self.dockers[addon].update(version): if await self.dockers[addon].update(version):
@@ -155,6 +201,14 @@ class AddonManager(AddonsData):
return True return True
return False return False
async def restart(self, addon):
"""Restart addon."""
if addon not in self.dockers:
_LOGGER.error("No docker found for addon %s", addon)
return False
return await self.dockers[addon].restart()
async def logs(self, addon): async def logs(self, addon):
"""Return addons log output.""" """Return addons log output."""
if addon not in self.dockers: if addon not in self.dockers:

hassio/addons/built-in.json (new file, +14 lines)

@@ -0,0 +1,14 @@
{
"local": {
"slug": "local",
"name": "Local Add-Ons",
"url": "https://home-assistant.io/hassio",
"maintainer": "By our self"
},
"core": {
"slug": "core",
"name": "Built-in Add-Ons",
"url": "https://home-assistant.io/addons",
"maintainer": "Home Assistant authors"
}
}

hassio/addons/data.py

@@ -1,24 +1,34 @@
"""Init file for HassIO addons.""" """Init file for HassIO addons."""
import copy
import logging import logging
import glob import json
from pathlib import Path, PurePath
import re
import voluptuous as vol import voluptuous as vol
from voluptuous.humanize import humanize_error from voluptuous.humanize import humanize_error
from .validate import validate_options, SCHEMA_ADDON_CONFIG from .util import extract_hash_from_path
from .validate import (
validate_options, SCHEMA_ADDON_CONFIG, SCHEMA_REPOSITORY_CONFIG,
MAP_VOLUME)
from ..const import ( from ..const import (
FILE_HASSIO_ADDONS, ATTR_NAME, ATTR_VERSION, ATTR_SLUG, ATTR_DESCRIPTON, FILE_HASSIO_ADDONS, ATTR_NAME, ATTR_VERSION, ATTR_SLUG, ATTR_DESCRIPTON,
ATTR_STARTUP, ATTR_BOOT, ATTR_MAP_SSL, ATTR_MAP_CONFIG, ATTR_OPTIONS, ATTR_STARTUP, ATTR_BOOT, ATTR_MAP, ATTR_OPTIONS, ATTR_PORTS, BOOT_AUTO,
ATTR_PORTS, BOOT_AUTO, DOCKER_REPO, ATTR_INSTALLED, ATTR_SCHEMA, ATTR_SCHEMA, ATTR_IMAGE, ATTR_REPOSITORY, ATTR_URL, ATTR_ARCH,
ATTR_IMAGE, ATTR_DEDICATED) ATTR_LOCATON, ATTR_DEVICES, ATTR_ENVIRONMENT)
from ..config import Config from ..config import Config
from ..tools import read_json_file, write_json_file from ..tools import read_json_file, write_json_file
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
ADDONS_REPO_PATTERN = "{}/*/config.json" SYSTEM = 'system'
SYSTEM = "system" USER = 'user'
USER = "user"
REPOSITORY_CORE = 'core'
REPOSITORY_LOCAL = 'local'
RE_VOLUME = re.compile(MAP_VOLUME)
class AddonsData(Config): class AddonsData(Config):
@@ -28,79 +38,155 @@ class AddonsData(Config):
"""Initialize data holder.""" """Initialize data holder."""
super().__init__(FILE_HASSIO_ADDONS) super().__init__(FILE_HASSIO_ADDONS)
self.config = config self.config = config
self._addons_data = self._data.get(SYSTEM, {}) self._system_data = self._data.get(SYSTEM, {})
self._user_data = self._data.get(USER, {}) self._user_data = self._data.get(USER, {})
self._current_data = {} self._addons_cache = {}
self._repositories_data = {}
self.arch = None self.arch = None
def save(self): def save(self):
"""Store data to config file.""" """Store data to config file."""
self._data = { self._data = {
USER: self._user_data, USER: self._user_data,
SYSTEM: self._addons_data, SYSTEM: self._system_data,
} }
super().save() super().save()
def read_addons_repo(self): def read_data_from_repositories(self):
"""Read data from addons repository.""" """Read data from addons repository."""
self._current_data = {} self._addons_cache = {}
self._repositories_data = {}
self._read_addons_folder(self.config.path_addons_repo) # read core repository
self._read_addons_folder(self.config.path_addons_custom) self._read_addons_folder(
self.config.path_addons_core, REPOSITORY_CORE)
def _read_addons_folder(self, folder): # read local repository
self._read_addons_folder(
self.config.path_addons_local, REPOSITORY_LOCAL)
# add built-in repositories information
self._set_builtin_repositories()
# read custom git repositories
for repository_element in self.config.path_addons_git.iterdir():
if repository_element.is_dir():
self._read_git_repository(repository_element)
def _read_git_repository(self, path):
"""Process a custom repository folder."""
slug = extract_hash_from_path(path)
repository_info = {ATTR_SLUG: slug}
# exists repository json
repository_file = Path(path, "repository.json")
try:
repository_info.update(SCHEMA_REPOSITORY_CONFIG(
read_json_file(repository_file)
))
except OSError:
_LOGGER.warning("Can't read repository information from %s",
repository_file)
return
except vol.Invalid:
_LOGGER.warning("Repository parse error %s", repository_file)
return
# process data
self._repositories_data[slug] = repository_info
self._read_addons_folder(path, slug)
def _read_addons_folder(self, path, repository):
"""Read data from addons folder.""" """Read data from addons folder."""
pattern = ADDONS_REPO_PATTERN.format(folder) for addon in path.glob("**/config.json"):
for addon in glob.iglob(pattern):
try: try:
addon_config = read_json_file(addon) addon_config = read_json_file(addon)
# validate
addon_config = SCHEMA_ADDON_CONFIG(addon_config) addon_config = SCHEMA_ADDON_CONFIG(addon_config)
self._current_data[addon_config[ATTR_SLUG]] = addon_config
except (OSError, KeyError): # Generate slug
addon_slug = "{}_{}".format(
repository, addon_config[ATTR_SLUG])
# store
addon_config[ATTR_REPOSITORY] = repository
addon_config[ATTR_LOCATON] = str(addon.parent)
self._addons_cache[addon_slug] = addon_config
except OSError:
_LOGGER.warning("Can't read %s", addon) _LOGGER.warning("Can't read %s", addon)
except vol.Invalid as ex: except vol.Invalid as ex:
_LOGGER.warning("Can't read %s -> %s", addon, _LOGGER.warning("Can't read %s -> %s", addon,
humanize_error(addon_config, ex)) humanize_error(addon_config, ex))
def _set_builtin_repositories(self):
"""Add local built-in repository into dataset."""
try:
builtin_file = Path(__file__).parent.joinpath('built-in.json')
builtin_data = read_json_file(builtin_file)
except (OSError, json.JSONDecodeError) as err:
_LOGGER.warning("Can't read built-in.json -> %s", err)
return
# if core addons are available
for data in self._addons_cache.values():
if data[ATTR_REPOSITORY] == REPOSITORY_CORE:
self._repositories_data[REPOSITORY_CORE] = \
builtin_data[REPOSITORY_CORE]
break
# if local addons are available
for data in self._addons_cache.values():
if data[ATTR_REPOSITORY] == REPOSITORY_LOCAL:
self._repositories_data[REPOSITORY_LOCAL] = \
builtin_data[REPOSITORY_LOCAL]
break
def merge_update_config(self):
"""Update local config if they have update.
It need to be the same version as the local version is.
"""
have_change = False
for addon in self.list_installed:
# detached
if addon not in self._addons_cache:
continue
cache = self._addons_cache[addon]
data = self._system_data[addon]
if data[ATTR_VERSION] == cache[ATTR_VERSION]:
if data != cache:
self._system_data[addon] = copy.deepcopy(cache)
have_change = True
if have_change:
self.save()
@property @property
def list_installed(self): def list_installed(self):
"""Return a list of installed addons.""" """Return a list of installed addons."""
return set(self._addons_data.keys()) return set(self._system_data)
@property @property
def list(self): def list_all(self):
"""Return a list of available addons.""" """Return a dict of all addons."""
data = [] return set(self._system_data) | set(self._addons_cache)
all_addons = {**self._addons_data, **self._current_data}
dedicated = self.list_removed
for addon, values in all_addons.items():
i_version = self._user_data.get(addon, {}).get(ATTR_VERSION)
data.append({
ATTR_NAME: values[ATTR_NAME],
ATTR_SLUG: values[ATTR_SLUG],
ATTR_DESCRIPTON: values[ATTR_DESCRIPTON],
ATTR_VERSION: values[ATTR_VERSION],
ATTR_INSTALLED: i_version,
ATTR_DEDICATED: addon in dedicated,
})
return data
def list_startup(self, start_type): def list_startup(self, start_type):
"""Get list of installed addon with need start by type.""" """Get list of installed addon with need start by type."""
addon_list = set() addon_list = set()
for addon in self._addons_data.keys(): for addon in self._system_data.keys():
if self.get_boot(addon) != BOOT_AUTO: if self.get_boot(addon) != BOOT_AUTO:
continue continue
try: try:
if self._addons_data[addon][ATTR_STARTUP] == start_type: if self._system_data[addon][ATTR_STARTUP] == start_type:
addon_list.add(addon) addon_list.add(addon)
except KeyError: except KeyError:
_LOGGER.warning("Orphaned addon detect %s", addon) _LOGGER.warning("Orphaned addon detect %s", addon)
@@ -109,33 +195,35 @@ class AddonsData(Config):
return addon_list return addon_list
@property @property
def list_removed(self): def list_detached(self):
"""Return local addons they not support from repo.""" """Return local addons they not support from repo."""
addon_list = set() addon_list = set()
for addon in self._addons_data.keys(): for addon in self._system_data.keys():
if addon not in self._current_data: if addon not in self._addons_cache:
addon_list.add(addon) addon_list.add(addon)
return addon_list return addon_list
@property
def list_repositories(self):
"""Return list of addon repositories."""
return list(self._repositories_data.values())
def exists_addon(self, addon): def exists_addon(self, addon):
"""Return True if a addon exists.""" """Return True if a addon exists."""
return addon in self._current_data or addon in self._addons_data return addon in self._addons_cache or addon in self._system_data
def is_installed(self, addon): def is_installed(self, addon):
"""Return True if a addon is installed.""" """Return True if a addon is installed."""
return addon in self._addons_data return addon in self._system_data
def version_installed(self, addon): def version_installed(self, addon):
"""Return installed version.""" """Return installed version."""
if ATTR_VERSION not in self._user_data[addon]: return self._user_data.get(addon, {}).get(ATTR_VERSION)
return self._addons_data[addon][ATTR_VERSION]
return self._user_data[addon][ATTR_VERSION]
def set_addon_install(self, addon, version): def set_addon_install(self, addon, version):
"""Set addon as installed.""" """Set addon as installed."""
self._addons_data[addon] = self._current_data[addon] self._system_data[addon] = copy.deepcopy(self._addons_cache[addon])
self._user_data[addon] = { self._user_data[addon] = {
ATTR_OPTIONS: {}, ATTR_OPTIONS: {},
ATTR_VERSION: version, ATTR_VERSION: version,
@@ -144,19 +232,19 @@ class AddonsData(Config):
def set_addon_uninstall(self, addon): def set_addon_uninstall(self, addon):
"""Set addon as uninstalled.""" """Set addon as uninstalled."""
self._addons_data.pop(addon, None) self._system_data.pop(addon, None)
self._user_data.pop(addon, None) self._user_data.pop(addon, None)
self.save() self.save()
def set_addon_update(self, addon, version): def set_addon_update(self, addon, version):
"""Update version of addon.""" """Update version of addon."""
self._addons_data[addon] = self._current_data[addon] self._system_data[addon] = copy.deepcopy(self._addons_cache[addon])
self._user_data[addon][ATTR_VERSION] = version self._user_data[addon][ATTR_VERSION] = version
self.save() self.save()
def set_options(self, addon, options): def set_options(self, addon, options):
"""Store user addon options.""" """Store user addon options."""
self._user_data[addon][ATTR_OPTIONS] = options self._user_data[addon][ATTR_OPTIONS] = copy.deepcopy(options)
self.save() self.save()
def set_boot(self, addon, boot): def set_boot(self, addon, boot):
@@ -167,7 +255,7 @@ class AddonsData(Config):
def get_options(self, addon): def get_options(self, addon):
"""Return options with local changes.""" """Return options with local changes."""
return { return {
**self._addons_data[addon][ATTR_OPTIONS], **self._system_data[addon][ATTR_OPTIONS],
**self._user_data[addon][ATTR_OPTIONS], **self._user_data[addon][ATTR_OPTIONS],
} }
@@ -176,54 +264,101 @@ class AddonsData(Config):
if ATTR_BOOT in self._user_data[addon]: if ATTR_BOOT in self._user_data[addon]:
return self._user_data[addon][ATTR_BOOT] return self._user_data[addon][ATTR_BOOT]
return self._addons_data[addon][ATTR_BOOT] return self._system_data[addon][ATTR_BOOT]
def get_name(self, addon): def get_name(self, addon):
"""Return name of addon.""" """Return name of addon."""
return self._addons_data[addon][ATTR_NAME] if addon in self._addons_cache:
return self._addons_cache[addon][ATTR_NAME]
return self._system_data[addon][ATTR_NAME]
def get_description(self, addon): def get_description(self, addon):
"""Return description of addon.""" """Return description of addon."""
return self._addons_data[addon][ATTR_DESCRIPTON] if addon in self._addons_cache:
return self._addons_cache[addon][ATTR_DESCRIPTON]
return self._system_data[addon][ATTR_DESCRIPTON]
def get_version(self, addon): def get_repository(self, addon):
"""Return repository of addon."""
if addon in self._addons_cache:
return self._addons_cache[addon][ATTR_REPOSITORY]
return self._system_data[addon][ATTR_REPOSITORY]
def get_last_version(self, addon):
"""Return version of addon.""" """Return version of addon."""
if addon not in self._current_data: if addon in self._addons_cache:
return self.version_installed(addon) return self._addons_cache[addon][ATTR_VERSION]
return self._current_data[addon][ATTR_VERSION] return self.version_installed(addon)
def get_ports(self, addon): def get_ports(self, addon):
"""Return ports of addon.""" """Return ports of addon."""
return self._addons_data[addon].get(ATTR_PORTS) return self._system_data[addon].get(ATTR_PORTS)
def get_devices(self, addon):
"""Return devices of addon."""
return self._system_data[addon].get(ATTR_DEVICES)
def get_environment(self, addon):
"""Return environment of addon."""
return self._system_data[addon].get(ATTR_ENVIRONMENT)
def get_url(self, addon):
"""Return url of addon."""
if addon in self._addons_cache:
return self._addons_cache[addon].get(ATTR_URL)
return self._system_data[addon].get(ATTR_URL)
def get_arch(self, addon):
"""Return list of supported arch."""
if addon in self._addons_cache:
return self._addons_cache[addon][ATTR_ARCH]
return self._system_data[addon][ATTR_ARCH]
def get_image(self, addon): def get_image(self, addon):
"""Return image name of addon.""" """Return image name of addon."""
addon_data = self._addons_data.get(addon, self._current_data[addon]) addon_data = self._system_data.get(
addon, self._addons_cache.get(addon)
)
if ATTR_IMAGE not in addon_data: # Repository with dockerhub images
return "{}/{}-addon-{}".format(DOCKER_REPO, self.arch, addon) if ATTR_IMAGE in addon_data:
return addon_data[ATTR_IMAGE].format(arch=self.arch)
return addon_data[ATTR_IMAGE] # local build
return "{}/{}-addon-{}".format(
addon_data[ATTR_REPOSITORY], self.arch, addon_data[ATTR_SLUG])
def need_config(self, addon): def need_build(self, addon):
"""Return True if config map is needed.""" """Return True if this addon need a local build."""
return self._addons_data[addon][ATTR_MAP_CONFIG] addon_data = self._system_data.get(
addon, self._addons_cache.get(addon)
)
return ATTR_IMAGE not in addon_data
def need_ssl(self, addon): def map_volumes(self, addon):
"""Return True if ssl map is needed.""" """Return a dict of {volume: policy} from addon."""
return self._addons_data[addon][ATTR_MAP_SSL] volumes = {}
for volume in self._system_data[addon][ATTR_MAP]:
result = RE_VOLUME.match(volume)
volumes[result.group(1)] = result.group(2) or 'ro'
return volumes
def path_data(self, addon): def path_data(self, addon):
"""Return addon data path inside supervisor.""" """Return addon data path inside supervisor."""
return "{}/{}".format(self.config.path_addons_data, addon) return Path(self.config.path_addons_data, addon)
def path_data_docker(self, addon): def path_extern_data(self, addon):
"""Return addon data path external for docker.""" """Return addon data path external for docker."""
return "{}/{}".format(self.config.path_addons_data_docker, addon) return PurePath(self.config.path_extern_addons_data, addon)
def path_addon_options(self, addon): def path_addon_options(self, addon):
"""Return path to addons options.""" """Return path to addons options."""
return "{}/options.json".format(self.path_data(addon)) return Path(self.path_data(addon), "options.json")
def path_addon_location(self, addon):
"""Return path to this addon."""
return Path(self._addons_cache[addon][ATTR_LOCATON])
def write_addon_options(self, addon): def write_addon_options(self, addon):
"""Return True if addon options is written to data.""" """Return True if addon options is written to data."""
@@ -241,7 +376,7 @@ class AddonsData(Config):
def get_schema(self, addon): def get_schema(self, addon):
"""Create a schema for addon options.""" """Create a schema for addon options."""
raw_schema = self._addons_data[addon][ATTR_SCHEMA] raw_schema = self._system_data[addon][ATTR_SCHEMA]
schema = vol.Schema(vol.All(dict, validate_options(raw_schema))) schema = vol.Schema(vol.All(dict, validate_options(raw_schema)))
return schema return schema
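
The `map` list introduced in this file replaces the old `map_config`/`map_ssl` booleans: each entry names a volume, optionally with a policy suffix, and `map_volumes` turns the list into a `{volume: policy}` dict that defaults to read-only. A standalone sketch of that parsing, reusing the `MAP_VOLUME` pattern from this changeset (the example `map` list is made up):

```python
"""Standalone sketch of the volume-map parsing done by AddonsData.map_volumes."""
import re

# Pattern copied from hassio/addons/validate.py in this changeset
MAP_VOLUME = r"^(config|ssl|addons|backup|share|mnt)(?::(rw|:ro))?$"
RE_VOLUME = re.compile(MAP_VOLUME)


def map_volumes(map_entries):
    """Return {volume: policy} with 'ro' as the default policy."""
    volumes = {}
    for entry in map_entries:
        result = RE_VOLUME.match(entry)
        if not result:
            continue  # the real code relies on schema validation instead
        volumes[result.group(1)] = result.group(2) or 'ro'
    return volumes


# Example "map" list as it could appear in an add-on config.json
print(map_volumes(["config", "ssl", "share:rw"]))
# -> {'config': 'ro', 'ssl': 'ro', 'share': 'rw'}
```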

hassio/addons/git.py

@@ -1,10 +1,12 @@
"""Init file for HassIO addons git.""" """Init file for HassIO addons git."""
import asyncio import asyncio
import logging import logging
import os from pathlib import Path
import shutil
import git import git
from .util import get_hash_from_repository
from ..const import URL_HASSIO_ADDONS from ..const import URL_HASSIO_ADDONS
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
@@ -13,26 +15,29 @@ _LOGGER = logging.getLogger(__name__)
class AddonsRepo(object): class AddonsRepo(object):
"""Manage addons git repo.""" """Manage addons git repo."""
def __init__(self, config, loop): def __init__(self, config, loop, path, url):
"""Initialize docker base wrapper.""" """Initialize git base wrapper."""
self.config = config self.config = config
self.loop = loop self.loop = loop
self.repo = None self.repo = None
self.path = path
self.url = url
self._lock = asyncio.Lock(loop=loop) self._lock = asyncio.Lock(loop=loop)
async def load(self): async def load(self):
"""Init git addon repo.""" """Init git addon repo."""
if not os.path.isdir(self.config.path_addons_repo): if not self.path.is_dir():
return await self.clone() return await self.clone()
async with self._lock: async with self._lock:
try: try:
_LOGGER.info("Load addons repository") _LOGGER.info("Load addon %s repository", self.path)
self.repo = await self.loop.run_in_executor( self.repo = await self.loop.run_in_executor(
None, git.Repo, self.config.path_addons_repo) None, git.Repo, str(self.path))
except (git.InvalidGitRepositoryError, git.NoSuchPathError) as err: except (git.InvalidGitRepositoryError, git.NoSuchPathError,
_LOGGER.error("Can't load addons repo: %s.", err) git.GitCommandError) as err:
_LOGGER.error("Can't load %s repo: %s.", self.path, err)
return False return False
return True return True
@@ -41,13 +46,13 @@ class AddonsRepo(object):
"""Clone git addon repo.""" """Clone git addon repo."""
async with self._lock: async with self._lock:
try: try:
_LOGGER.info("Clone addons repository") _LOGGER.info("Clone addon %s repository", self.url)
self.repo = await self.loop.run_in_executor( self.repo = await self.loop.run_in_executor(
None, git.Repo.clone_from, URL_HASSIO_ADDONS, None, git.Repo.clone_from, self.url, str(self.path))
self.config.path_addons_repo)
except (git.InvalidGitRepositoryError, git.NoSuchPathError) as err: except (git.InvalidGitRepositoryError, git.NoSuchPathError,
_LOGGER.error("Can't clone addons repo: %s.", err) git.GitCommandError) as err:
_LOGGER.error("Can't clone %s repo: %s.", self.url, err)
return False return False
return True return True
@@ -60,12 +65,43 @@ class AddonsRepo(object):
async with self._lock: async with self._lock:
try: try:
_LOGGER.info("Pull addons repository") _LOGGER.info("Pull addon %s repository", self.url)
await self.loop.run_in_executor( await self.loop.run_in_executor(
None, self.repo.remotes.origin.pull) None, self.repo.remotes.origin.pull)
except (git.InvalidGitRepositoryError, git.NoSuchPathError) as err: except (git.InvalidGitRepositoryError, git.NoSuchPathError,
_LOGGER.error("Can't pull addons repo: %s.", err) git.exc.GitCommandError) as err:
_LOGGER.error("Can't pull %s repo: %s.", self.url, err)
return False return False
return True return True
class AddonsRepoHassIO(AddonsRepo):
"""HassIO addons repository."""
def __init__(self, config, loop):
"""Initialize git hassio addon repository."""
super().__init__(
config, loop, config.path_addons_core, URL_HASSIO_ADDONS)
class AddonsRepoCustom(AddonsRepo):
"""Custom addons repository."""
def __init__(self, config, loop, url):
"""Initialize git hassio addon repository."""
path = Path(config.path_addons_git, get_hash_from_repository(url))
super().__init__(config, loop, path, url)
def remove(self):
"""Remove a custom addon."""
if self.path.is_dir():
_LOGGER.info("Remove custom addon repository %s", self.url)
def log_err(funct, path, _):
"""Log error."""
_LOGGER.warning("Can't remove %s", path)
shutil.rmtree(str(self.path), onerror=log_err)

hassio/addons/util.py (new file, +26 lines)

@@ -0,0 +1,26 @@
"""Util addons functions."""
import hashlib
import re
RE_SLUGIFY = re.compile(r'[^a-z0-9_]+')
RE_SHA1 = re.compile(r"[a-f0-9]{8}")
def get_hash_from_repository(name):
"""Generate a hash from repository."""
key = name.lower().encode()
return hashlib.sha1(key).hexdigest()[:8]
def extract_hash_from_path(path):
"""Extract repo id from path."""
repo_dir = path.parts[-1]
if not RE_SHA1.match(repo_dir):
return get_hash_from_repository(repo_dir)
return repo_dir
def create_hash_index_list(name_list):
"""Create a dict with hash from repositories list."""
return {get_hash_from_repository(repo): repo for repo in name_list}
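
These two helpers are what turn a repository URL into the short slug used elsewhere in this changeset (the `"repository": "12345678|null"` field in API.md and the checkout directory for custom git repositories). A usage sketch, restating the helpers so the snippet runs standalone; the URL and checkout path are example values:

```python
"""Usage sketch for the helpers in hassio/addons/util.py."""
import hashlib
import re
from pathlib import Path

RE_SHA1 = re.compile(r"[a-f0-9]{8}")


def get_hash_from_repository(name):
    """Generate a short hash from a repository name/URL (as in util.py)."""
    key = name.lower().encode()
    return hashlib.sha1(key).hexdigest()[:8]


def extract_hash_from_path(path):
    """Extract the repo id from a checkout path (as in util.py)."""
    repo_dir = path.parts[-1]
    if not RE_SHA1.match(repo_dir):
        return get_hash_from_repository(repo_dir)
    return repo_dir


# A repository URL maps to a stable 8-character slug ...
slug = get_hash_from_repository("https://github.com/pvizeli/hassio-addons")
print(slug)  # prints 8 lowercase hex characters

# ... and the slug can be recovered later from the checkout directory name.
print(extract_hash_from_path(Path("/tmp/addons_git", slug)) == slug)  # True
```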

hassio/addons/validate.py

@@ -3,9 +3,13 @@ import voluptuous as vol
from ..const import ( from ..const import (
ATTR_NAME, ATTR_VERSION, ATTR_SLUG, ATTR_DESCRIPTON, ATTR_STARTUP, ATTR_NAME, ATTR_VERSION, ATTR_SLUG, ATTR_DESCRIPTON, ATTR_STARTUP,
ATTR_BOOT, ATTR_MAP_SSL, ATTR_MAP_CONFIG, ATTR_OPTIONS, ATTR_BOOT, ATTR_MAP, ATTR_OPTIONS, ATTR_PORTS, STARTUP_ONCE, STARTUP_AFTER,
ATTR_PORTS, STARTUP_ONCE, STARTUP_AFTER, STARTUP_BEFORE, BOOT_AUTO, STARTUP_BEFORE, BOOT_AUTO, BOOT_MANUAL, ATTR_SCHEMA, ATTR_IMAGE,
BOOT_MANUAL, ATTR_SCHEMA, ATTR_IMAGE) ATTR_URL, ATTR_MAINTAINER, ATTR_ARCH, ATTR_DEVICES, ATTR_ENVIRONMENT,
ARCH_ARMHF, ARCH_AARCH64, ARCH_AMD64, ARCH_I386)
MAP_VOLUME = r"^(config|ssl|addons|backup|share|mnt)(?::(rw|:ro))?$"
V_STR = 'str' V_STR = 'str'
V_INT = 'int' V_INT = 'int'
@@ -16,19 +20,26 @@ V_URL = 'url'
ADDON_ELEMENT = vol.In([V_STR, V_INT, V_FLOAT, V_BOOL, V_EMAIL, V_URL]) ADDON_ELEMENT = vol.In([V_STR, V_INT, V_FLOAT, V_BOOL, V_EMAIL, V_URL])
ARCH_ALL = [
ARCH_ARMHF, ARCH_AARCH64, ARCH_AMD64, ARCH_I386
]
# pylint: disable=no-value-for-parameter # pylint: disable=no-value-for-parameter
SCHEMA_ADDON_CONFIG = vol.Schema({ SCHEMA_ADDON_CONFIG = vol.Schema({
vol.Required(ATTR_NAME): vol.Coerce(str), vol.Required(ATTR_NAME): vol.Coerce(str),
vol.Required(ATTR_VERSION): vol.Coerce(str), vol.Required(ATTR_VERSION): vol.Coerce(str),
vol.Required(ATTR_SLUG): vol.Coerce(str), vol.Required(ATTR_SLUG): vol.Coerce(str),
vol.Required(ATTR_DESCRIPTON): vol.Coerce(str), vol.Required(ATTR_DESCRIPTON): vol.Coerce(str),
vol.Optional(ATTR_URL): vol.Url(),
vol.Optional(ATTR_ARCH, default=ARCH_ALL): [vol.In(ARCH_ALL)],
vol.Required(ATTR_STARTUP): vol.Required(ATTR_STARTUP):
vol.In([STARTUP_BEFORE, STARTUP_AFTER, STARTUP_ONCE]), vol.In([STARTUP_BEFORE, STARTUP_AFTER, STARTUP_ONCE]),
vol.Required(ATTR_BOOT): vol.Required(ATTR_BOOT):
vol.In([BOOT_AUTO, BOOT_MANUAL]), vol.In([BOOT_AUTO, BOOT_MANUAL]),
vol.Optional(ATTR_PORTS): dict, vol.Optional(ATTR_PORTS): dict,
vol.Optional(ATTR_MAP_CONFIG, default=False): vol.Boolean(), vol.Optional(ATTR_DEVICES): [vol.Match(r"^(.*):(.*):([rwm]{1,3})$")],
vol.Optional(ATTR_MAP_SSL, default=False): vol.Boolean(), vol.Optional(ATTR_MAP, default=[]): [vol.Match(MAP_VOLUME)],
vol.Optional(ATTR_ENVIRONMENT): {vol.Match(r"\w*"): vol.Coerce(str)},
vol.Required(ATTR_OPTIONS): dict, vol.Required(ATTR_OPTIONS): dict,
vol.Required(ATTR_SCHEMA): { vol.Required(ATTR_SCHEMA): {
vol.Coerce(str): vol.Any(ADDON_ELEMENT, [ vol.Coerce(str): vol.Any(ADDON_ELEMENT, [
@@ -36,7 +47,15 @@ SCHEMA_ADDON_CONFIG = vol.Schema({
]) ])
}, },
vol.Optional(ATTR_IMAGE): vol.Match(r"\w*/\w*"), vol.Optional(ATTR_IMAGE): vol.Match(r"\w*/\w*"),
}) }, extra=vol.ALLOW_EXTRA)
# pylint: disable=no-value-for-parameter
SCHEMA_REPOSITORY_CONFIG = vol.Schema({
vol.Required(ATTR_NAME): vol.Coerce(str),
vol.Optional(ATTR_URL): vol.Url(),
vol.Optional(ATTR_MAINTAINER): vol.Coerce(str),
}, extra=vol.ALLOW_EXTRA)
 def validate_options(raw_schema):
@@ -54,10 +73,10 @@ def validate_options(raw_schema):
         try:
             if isinstance(typ, list):
                 # nested value
-                options[key] = _nested_validate(typ[0], value)
+                options[key] = _nested_validate(typ[0], value, key)
             else:
                 # normal value
-                options[key] = _single_validate(typ, value)
+                options[key] = _single_validate(typ, value, key)
         except (IndexError, KeyError):
             raise vol.Invalid(
                 "Type error for {}.".format(key)) from None
@@ -68,9 +87,13 @@ def validate_options(raw_schema):
 # pylint: disable=no-value-for-parameter
-def _single_validate(typ, value):
+def _single_validate(typ, value, key):
     """Validate a single element."""
     try:
+        # if required argument
+        if value is None:
+            raise vol.Invalid("Missing required option '{}'.".format(key))
+
         if typ == V_STR:
             return str(value)
         elif typ == V_INT:
@@ -84,13 +107,13 @@ def _single_validate(typ, value):
         elif typ == V_URL:
             return vol.Url()(value)

-        raise vol.Invalid("Fatal error for {}.".format(value))
-    except TypeError:
+        raise vol.Invalid("Fatal error for {} type {}.".format(key, typ))
+    except ValueError:
         raise vol.Invalid(
-            "Type {} error for {}.".format(typ, value)) from None
+            "Type {} error for '{}' on {}.".format(typ, value, key)) from None


-def _nested_validate(typ, data_list):
+def _nested_validate(typ, data_list, key):
     """Validate nested items."""
     options = []
@@ -103,10 +126,10 @@ def _nested_validate(typ, data_list):
                         raise vol.Invalid(
                             "Unknown nested options {}.".format(c_key))
-                    c_options[c_key] = _single_validate(typ[c_key], c_value)
+                    c_options[c_key] = _single_validate(typ[c_key], c_value, c_key)
                 options.append(c_options)
             # normal list
             else:
-                options.append(_single_validate(typ, element))
+                options.append(_single_validate(typ, element, key))

     return options
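
The key-aware validators above report which option failed. A rough standalone sketch of the idea (simplified, not the hassio implementation) looks like this:

# Illustrative sketch: map declared option types onto real validation.
# The marker strings 'str'/'int'/'bool'/'float' stand in for V_STR, V_INT, etc.
import voluptuous as vol

def build_validator(raw_schema):
    """Return a callable that validates user options against raw_schema."""
    casts = {'str': str, 'int': int, 'bool': bool, 'float': float}

    def _validate(options):
        result = {}
        for key, typ in raw_schema.items():
            if key not in options or options[key] is None:
                raise vol.Invalid("Missing required option '{}'.".format(key))
            try:
                result[key] = casts[typ](options[key])
            except (KeyError, ValueError):
                raise vol.Invalid(
                    "Type {} error for '{}' on {}.".format(
                        typ, options[key], key)) from None
        return result

    return _validate

validator = build_validator({'port': 'int', 'ssl': 'bool'})
print(validator({'port': '8123', 'ssl': True}))   # {'port': 8123, 'ssl': True}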


@@ -1,5 +1,6 @@
"""Init file for HassIO rest api.""" """Init file for HassIO rest api."""
import logging import logging
from pathlib import Path
from aiohttp import web from aiohttp import web
@@ -8,6 +9,7 @@ from .homeassistant import APIHomeAssistant
from .host import APIHost from .host import APIHost
from .network import APINetwork from .network import APINetwork
from .supervisor import APISupervisor from .supervisor import APISupervisor
from .security import APISecurity
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
@@ -25,32 +27,36 @@ class RestAPI(object):
         self._handler = None
         self.server = None

-    def register_host(self, host_controll):
-        """Register hostcontroll function."""
-        api_host = APIHost(self.config, self.loop, host_controll)
+    def register_host(self, host_control):
+        """Register hostcontrol function."""
+        api_host = APIHost(self.config, self.loop, host_control)

         self.webapp.router.add_get('/host/info', api_host.info)
-        self.webapp.router.add_get('/host/reboot', api_host.reboot)
-        self.webapp.router.add_get('/host/shutdown', api_host.shutdown)
-        self.webapp.router.add_get('/host/update', api_host.update)
+        self.webapp.router.add_post('/host/reboot', api_host.reboot)
+        self.webapp.router.add_post('/host/shutdown', api_host.shutdown)
+        self.webapp.router.add_post('/host/update', api_host.update)

-    def register_network(self, host_controll):
+    def register_network(self, host_control):
         """Register network function."""
-        api_net = APINetwork(self.config, self.loop, host_controll)
+        api_net = APINetwork(self.config, self.loop, host_control)

         self.webapp.router.add_get('/network/info', api_net.info)
-        self.webapp.router.add_get('/network/options', api_net.options)
+        self.webapp.router.add_post('/network/options', api_net.options)

-    def register_supervisor(self, supervisor, addons):
+    def register_supervisor(self, supervisor, addons, host_control):
         """Register supervisor function."""
         api_supervisor = APISupervisor(
-            self.config, self.loop, supervisor, addons)
+            self.config, self.loop, supervisor, addons, host_control)

         self.webapp.router.add_get('/supervisor/ping', api_supervisor.ping)
         self.webapp.router.add_get('/supervisor/info', api_supervisor.info)
-        self.webapp.router.add_get('/supervisor/update', api_supervisor.update)
-        self.webapp.router.add_get('/supervisor/reload', api_supervisor.reload)
-        self.webapp.router.add_get(
-            '/supervisor/options', api_supervisor.options)
+        self.webapp.router.add_get(
+            '/supervisor/addons', api_supervisor.available_addons)
+        self.webapp.router.add_post(
+            '/supervisor/update', api_supervisor.update)
+        self.webapp.router.add_post(
+            '/supervisor/reload', api_supervisor.reload)
+        self.webapp.router.add_post(
+            '/supervisor/options', api_supervisor.options)
         self.webapp.router.add_get('/supervisor/logs', api_supervisor.logs)
@@ -59,7 +65,8 @@ class RestAPI(object):
         api_hass = APIHomeAssistant(self.config, self.loop, dock_homeassistant)

         self.webapp.router.add_get('/homeassistant/info', api_hass.info)
-        self.webapp.router.add_get('/homeassistant/update', api_hass.update)
+        self.webapp.router.add_post('/homeassistant/update', api_hass.update)
+        self.webapp.router.add_post('/homeassistant/restart', api_hass.restart)
         self.webapp.router.add_get('/homeassistant/logs', api_hass.logs)

     def register_addons(self, addons):
@@ -67,17 +74,36 @@ class RestAPI(object):
         api_addons = APIAddons(self.config, self.loop, addons)

         self.webapp.router.add_get('/addons/{addon}/info', api_addons.info)
-        self.webapp.router.add_get(
+        self.webapp.router.add_post(
             '/addons/{addon}/install', api_addons.install)
-        self.webapp.router.add_get(
+        self.webapp.router.add_post(
             '/addons/{addon}/uninstall', api_addons.uninstall)
-        self.webapp.router.add_get('/addons/{addon}/start', api_addons.start)
-        self.webapp.router.add_get('/addons/{addon}/stop', api_addons.stop)
-        self.webapp.router.add_get('/addons/{addon}/update', api_addons.update)
-        self.webapp.router.add_get(
-            '/addons/{addon}/options', api_addons.options)
+        self.webapp.router.add_post('/addons/{addon}/start', api_addons.start)
+        self.webapp.router.add_post('/addons/{addon}/stop', api_addons.stop)
+        self.webapp.router.add_post(
+            '/addons/{addon}/restart', api_addons.restart)
+        self.webapp.router.add_post(
+            '/addons/{addon}/update', api_addons.update)
+        self.webapp.router.add_post(
+            '/addons/{addon}/options', api_addons.options)
         self.webapp.router.add_get('/addons/{addon}/logs', api_addons.logs)

+    def register_security(self):
+        """Register security function."""
+        api_security = APISecurity(self.config, self.loop)
+
+        self.webapp.router.add_get('/security/info', api_security.info)
+        self.webapp.router.add_post('/security/options', api_security.options)
+        self.webapp.router.add_post('/security/totp', api_security.totp)
+        self.webapp.router.add_post('/security/session', api_security.session)
+
+    def register_panel(self):
+        """Register panel for homeassistant."""
+        panel_dir = Path(__file__).parents[1].joinpath('panel')
+        self.webapp.router.register_resource(
+            web.StaticResource('/panel', str(panel_dir)))
+
     async def start(self):
         """Run rest api webserver."""
         self._handler = self.webapp.make_handler(loop=self.loop)
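
The routing change above moves every state-changing endpoint from GET to POST. A self-contained toy sketch of that convention with aiohttp (hypothetical port and payloads, written against the aiohttp make_handler API used above):

# Illustrative sketch, not hassio source: read-only routes stay GET, changes go POST.
import asyncio
from aiohttp import web, ClientSession

async def info(request):
    return web.json_response({'result': 'ok', 'data': {'version': '0.29'}})

async def update(request):
    body = await request.json()
    return web.json_response({'result': 'ok', 'data': body})

async def main():
    app = web.Application()
    app.router.add_get('/supervisor/info', info)        # read-only -> GET
    app.router.add_post('/supervisor/update', update)   # state change -> POST

    loop = asyncio.get_event_loop()
    server = await loop.create_server(app.make_handler(), '127.0.0.1', 8099)

    async with ClientSession() as session:
        async with session.get('http://127.0.0.1:8099/supervisor/info') as resp:
            print(await resp.json())
        async with session.post('http://127.0.0.1:8099/supervisor/update',
                                json={'version': '0.29'}) as resp:
            print(await resp.json())

    server.close()
    await server.wait_closed()

asyncio.get_event_loop().run_until_complete(main())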


@@ -7,8 +7,9 @@ from voluptuous.humanize import humanize_error
 from .util import api_process, api_process_raw, api_validate
 from ..const import (
-    ATTR_VERSION, ATTR_CURRENT, ATTR_STATE, ATTR_BOOT, ATTR_OPTIONS,
-    STATE_STOPPED, STATE_STARTED, BOOT_AUTO, BOOT_MANUAL)
+    ATTR_VERSION, ATTR_LAST_VERSION, ATTR_STATE, ATTR_BOOT, ATTR_OPTIONS,
+    ATTR_URL, ATTR_DESCRIPTON, ATTR_DETACHED, ATTR_NAME, ATTR_REPOSITORY,
+    ATTR_BUILD, STATE_STOPPED, STATE_STARTED, BOOT_AUTO, BOOT_MANUAL)

 _LOGGER = logging.getLogger(__name__)
@@ -47,14 +48,19 @@ class APIAddons(object):
         """Return addon information."""
         addon = self._extract_addon(request)

-        info = {
+        return {
+            ATTR_NAME: self.addons.get_name(addon),
+            ATTR_DESCRIPTON: self.addons.get_description(addon),
             ATTR_VERSION: self.addons.version_installed(addon),
-            ATTR_CURRENT: self.addons.get_version(addon),
+            ATTR_REPOSITORY: self.addons.get_repository(addon),
+            ATTR_LAST_VERSION: self.addons.get_last_version(addon),
             ATTR_STATE: await self.addons.state(addon),
             ATTR_BOOT: self.addons.get_boot(addon),
             ATTR_OPTIONS: self.addons.get_options(addon),
+            ATTR_URL: self.addons.get_url(addon),
+            ATTR_DETACHED: addon in self.addons.list_detached,
+            ATTR_BUILD: self.addons.need_build(addon),
         }
-        return info

     @api_process
     async def options(self, request):
@@ -66,12 +72,12 @@ class APIAddons(object):
             vol.Optional(ATTR_OPTIONS): options_schema,
         })

-        addon_config = await api_validate(addon_schema, request)
+        body = await api_validate(addon_schema, request)

-        if ATTR_OPTIONS in addon_config:
-            self.addons.set_options(addon, addon_config[ATTR_OPTIONS])
-        if ATTR_BOOT in addon_config:
-            self.addons.set_options(addon, addon_config[ATTR_BOOT])
+        if ATTR_OPTIONS in body:
+            self.addons.set_options(addon, body[ATTR_OPTIONS])
+        if ATTR_BOOT in body:
+            self.addons.set_boot(addon, body[ATTR_BOOT])

         return True
@@ -81,7 +87,12 @@ class APIAddons(object):
         body = await api_validate(SCHEMA_VERSION, request)
         addon = self._extract_addon(request, check_installed=False)
         version = body.get(
-            ATTR_VERSION, self.addons.get_version(addon))
+            ATTR_VERSION, self.addons.get_last_version(addon))
+
+        # check if arch supported
+        if self.addons.arch not in self.addons.get_arch(addon):
+            raise RuntimeError(
+                "Addon is not supported on {}".format(self.addons.arch))

         return await asyncio.shield(
             self.addons.install(addon, version), loop=self.loop)
@@ -130,7 +141,7 @@ class APIAddons(object):
         body = await api_validate(SCHEMA_VERSION, request)
         addon = self._extract_addon(request)
         version = body.get(
-            ATTR_VERSION, self.addons.get_version(addon))
+            ATTR_VERSION, self.addons.get_last_version(addon))

         if version == self.addons.version_installed(addon):
             raise RuntimeError("Version is already in use")
@@ -138,6 +149,12 @@ class APIAddons(object):
         return await asyncio.shield(
             self.addons.update(addon, version), loop=self.loop)

+    @api_process
+    async def restart(self, request):
+        """Restart addon."""
+        addon = self._extract_addon(request)
+        return await asyncio.shield(self.addons.restart(addon), loop=self.loop)
+
     @api_process_raw
     def logs(self, request):
         """Return logs from addon."""


@@ -5,7 +5,7 @@ import logging
 import voluptuous as vol

 from .util import api_process, api_process_raw, api_validate
-from ..const import ATTR_VERSION, ATTR_CURRENT
+from ..const import ATTR_VERSION, ATTR_LAST_VERSION

 _LOGGER = logging.getLogger(__name__)
@@ -26,18 +26,16 @@ class APIHomeAssistant(object):
     @api_process
     async def info(self, request):
         """Return host information."""
-        info = {
+        return {
             ATTR_VERSION: self.homeassistant.version,
-            ATTR_CURRENT: self.config.current_homeassistant,
+            ATTR_LAST_VERSION: self.config.last_homeassistant,
         }
-        return info

     @api_process
     async def update(self, request):
-        """Update host OS."""
+        """Update homeassistant."""
         body = await api_validate(SCHEMA_VERSION, request)
-        version = body.get(ATTR_VERSION, self.config.current_homeassistant)
+        version = body.get(ATTR_VERSION, self.config.last_homeassistant)

         if self.homeassistant.in_progress:
             raise RuntimeError("Other task is in progress")
@@ -48,6 +46,15 @@ class APIHomeAssistant(object):
         return await asyncio.shield(
             self.homeassistant.update(version), loop=self.loop)

+    @api_process
+    async def restart(self, request):
+        """Restart homeassistant."""
+        if self.homeassistant.in_progress:
+            raise RuntimeError("Other task is in progress")
+
+        return await asyncio.shield(
+            self.homeassistant.restart(), loop=self.loop)
+
     @api_process_raw
     def logs(self, request):
         """Return homeassistant docker logs.


@@ -1,15 +1,16 @@
"""Init file for HassIO host rest api.""" """Init file for HassIO host rest api."""
import asyncio
import logging import logging
import voluptuous as vol import voluptuous as vol
from .util import api_process_hostcontroll, api_process, api_validate from .util import api_process_hostcontrol, api_process, api_validate
from ..const import ATTR_VERSION from ..const import (
ATTR_VERSION, ATTR_LAST_VERSION, ATTR_TYPE, ATTR_HOSTNAME, ATTR_FEATURES,
ATTR_OS)
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
UNKNOWN = 'unknown'
SCHEMA_VERSION = vol.Schema({ SCHEMA_VERSION = vol.Schema({
vol.Optional(ATTR_VERSION): vol.Coerce(str), vol.Optional(ATTR_VERSION): vol.Coerce(str),
}) })
@@ -18,44 +19,42 @@ SCHEMA_VERSION = vol.Schema({
 class APIHost(object):
     """Handle rest api for host functions."""

-    def __init__(self, config, loop, host_controll):
+    def __init__(self, config, loop, host_control):
         """Initialize host rest api part."""
         self.config = config
         self.loop = loop
-        self.host_controll = host_controll
+        self.host_control = host_control

     @api_process
     async def info(self, request):
         """Return host information."""
-        if not self.host_controll.active:
-            info = {
-                'os': UNKNOWN,
-                'version': UNKNOWN,
-                'current': UNKNOWN,
-                'level': 0,
-                'hostname': UNKNOWN,
-            }
-            return info
-
-        return await self.host_controll.info()
+        return {
+            ATTR_TYPE: self.host_control.type,
+            ATTR_VERSION: self.host_control.version,
+            ATTR_LAST_VERSION: self.host_control.last_version,
+            ATTR_FEATURES: self.host_control.features,
+            ATTR_HOSTNAME: self.host_control.hostname,
+            ATTR_OS: self.host_control.os_info,
+        }

-    @api_process_hostcontroll
+    @api_process_hostcontrol
     def reboot(self, request):
         """Reboot host."""
-        return self.host_controll.reboot()
+        return self.host_control.reboot()

-    @api_process_hostcontroll
+    @api_process_hostcontrol
     def shutdown(self, request):
         """Poweroff host."""
-        return self.host_controll.shutdown()
+        return self.host_control.shutdown()

-    @api_process_hostcontroll
+    @api_process_hostcontrol
     async def update(self, request):
         """Update host OS."""
         body = await api_validate(SCHEMA_VERSION, request)
-        version = body.get(ATTR_VERSION)
+        version = body.get(ATTR_VERSION, self.host_control.last_version)

-        if version == self.host_controll.version:
+        if version == self.host_control.version:
             raise RuntimeError("Version is already in use")

-        return await self.host_controll.host_update(version=version)
+        return await asyncio.shield(
+            self.host_control.update(version=version), loop=self.loop)


@@ -1,7 +1,7 @@
"""Init file for HassIO network rest api.""" """Init file for HassIO network rest api."""
import logging import logging
from .util import api_process_hostcontroll from .util import api_process_hostcontrol
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
@@ -9,18 +9,18 @@ _LOGGER = logging.getLogger(__name__)
class APINetwork(object): class APINetwork(object):
"""Handle rest api for network functions.""" """Handle rest api for network functions."""
def __init__(self, config, loop, host_controll): def __init__(self, config, loop, host_control):
"""Initialize network rest api part.""" """Initialize network rest api part."""
self.config = config self.config = config
self.loop = loop self.loop = loop
self.host_controll = host_controll self.host_control = host_control
@api_process_hostcontroll @api_process_hostcontrol
def info(self, request): def info(self, request):
"""Show network settings.""" """Show network settings."""
pass pass
@api_process_hostcontroll @api_process_hostcontrol
def options(self, request): def options(self, request):
"""Edit network settings.""" """Edit network settings."""
pass pass

hassio/api/security.py (new file, 102 lines)

@@ -0,0 +1,102 @@
"""Init file for HassIO security rest api."""
from datetime import datetime, timedelta
import io
import logging
import hashlib
import os
from aiohttp import web
import voluptuous as vol
import pyotp
import pyqrcode
from .util import api_process, api_validate, hash_password
from ..const import ATTR_INITIALIZE, ATTR_PASSWORD, ATTR_TOTP, ATTR_SESSION
_LOGGER = logging.getLogger(__name__)
SCHEMA_PASSWORD = vol.Schema({
vol.Required(ATTR_PASSWORD): vol.Coerce(str),
})
SCHEMA_SESSION = SCHEMA_PASSWORD.extend({
vol.Optional(ATTR_TOTP, default=None): vol.Coerce(str),
})
class APISecurity(object):
"""Handle rest api for security functions."""
def __init__(self, config, loop):
"""Initialize security rest api part."""
self.config = config
self.loop = loop
def _check_password(self, body):
"""Check if password is valid and security is initialize."""
if not self.config.security_initialize:
raise RuntimeError("First set a password")
password = hash_password(body[ATTR_PASSWORD])
if password != self.config.security_password:
raise RuntimeError("Wrong password")
@api_process
async def info(self, request):
"""Return host information."""
return {
ATTR_INITIALIZE: self.config.security_initialize,
ATTR_TOTP: self.config.security_totp is not None,
}
@api_process
async def options(self, request):
"""Set options / password."""
body = await api_validate(SCHEMA_PASSWORD, request)
if self.config.security_initialize:
raise RuntimeError("Password is already set!")
self.config.security_password = hash_password(body[ATTR_PASSWORD])
self.config.security_initialize = True
return True
@api_process
async def totp(self, request):
"""Set and initialze TOTP."""
body = await api_validate(SCHEMA_PASSWORD, request)
self._check_password(body)
# generate TOTP
totp_init_key = pyotp.random_base32()
totp = pyotp.TOTP(totp_init_key)
# init qrcode
buff = io.BytesIO()
qrcode = pyqrcode.create(totp.provisioning_uri("Hass.IO"))
qrcode.svg(buff)
# finish
self.config.security_totp = totp_init_key
return web.Response(body=buff.getvalue(), content_type='image/svg+xml')
@api_process
async def session(self, request):
"""Set and initialze session."""
body = await api_validate(SCHEMA_SESSION, request)
self._check_password(body)
# check TOTP
if self.config.security_totp:
totp = pyotp.TOTP(self.config.security_totp)
if body[ATTR_TOTP] != totp.now():
raise RuntimeError("Invalid TOTP token!")
# create session
valid_until = datetime.now() + timedelta(days=1)
session = hashlib.sha256(os.urandom(54)).hexdigest()
# store session
self.config.security_sessions = (session, valid_until)
return {ATTR_SESSION: session}
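
The TOTP handshake behind /security/totp and /security/session can be reproduced with pyotp alone; a short sketch (the account name is illustrative):

# Illustrative sketch: server-side TOTP secret plus client verification.
import pyotp

# server side: generate and store a secret (config.security_totp above)
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print(totp.provisioning_uri("Hass.IO"))   # this is what the QR code encodes

# later, when a session is requested, the submitted code must match
code = totp.now()                          # what the authenticator app would show
print(totp.verify(code))                   # True within the current time window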


@@ -5,14 +5,20 @@ import logging
import voluptuous as vol import voluptuous as vol
from .util import api_process, api_process_raw, api_validate from .util import api_process, api_process_raw, api_validate
from ..addons.util import create_hash_index_list
from ..const import ( from ..const import (
ATTR_ADDONS, ATTR_VERSION, ATTR_CURRENT, ATTR_BETA, HASSIO_VERSION) ATTR_ADDONS, ATTR_VERSION, ATTR_LAST_VERSION, ATTR_BETA_CHANNEL,
HASSIO_VERSION, ATTR_ADDONS_REPOSITORIES, ATTR_REPOSITORIES,
ATTR_REPOSITORY, ATTR_DESCRIPTON, ATTR_NAME, ATTR_SLUG, ATTR_INSTALLED,
ATTR_DETACHED, ATTR_SOURCE, ATTR_MAINTAINER, ATTR_URL, ATTR_ARCH,
ATTR_BUILD)
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
SCHEMA_OPTIONS = vol.Schema({ SCHEMA_OPTIONS = vol.Schema({
# pylint: disable=no-value-for-parameter # pylint: disable=no-value-for-parameter
vol.Optional(ATTR_BETA): vol.Boolean(), vol.Optional(ATTR_BETA_CHANNEL): vol.Boolean(),
vol.Optional(ATTR_ADDONS_REPOSITORIES): [vol.Url()],
}) })
SCHEMA_VERSION = vol.Schema({ SCHEMA_VERSION = vol.Schema({
@@ -23,12 +29,55 @@ SCHEMA_VERSION = vol.Schema({
class APISupervisor(object): class APISupervisor(object):
"""Handle rest api for supervisor functions.""" """Handle rest api for supervisor functions."""
def __init__(self, config, loop, supervisor, addons): def __init__(self, config, loop, supervisor, addons, host_control):
"""Initialize supervisor rest api part.""" """Initialize supervisor rest api part."""
self.config = config self.config = config
self.loop = loop self.loop = loop
self.supervisor = supervisor self.supervisor = supervisor
self.addons = addons self.addons = addons
self.host_control = host_control
def _addons_list(self, only_installed=False):
"""Return a list of addons."""
detached = self.addons.list_detached
if only_installed:
addons = self.addons.list_installed
else:
addons = self.addons.list_all
data = []
for addon in addons:
data.append({
ATTR_NAME: self.addons.get_name(addon),
ATTR_SLUG: addon,
ATTR_DESCRIPTON: self.addons.get_description(addon),
ATTR_VERSION: self.addons.get_last_version(addon),
ATTR_INSTALLED: self.addons.version_installed(addon),
ATTR_ARCH: self.addons.get_arch(addon),
ATTR_DETACHED: addon in detached,
ATTR_REPOSITORY: self.addons.get_repository(addon),
ATTR_BUILD: self.addons.need_build(addon),
ATTR_URL: self.addons.get_url(addon),
})
return data
def _repositories_list(self):
"""Return a list of addons repositories."""
data = []
list_id = create_hash_index_list(self.config.addons_repositories)
for repository in self.addons.list_repositories:
data.append({
ATTR_SLUG: repository[ATTR_SLUG],
ATTR_NAME: repository[ATTR_NAME],
ATTR_SOURCE: list_id.get(repository[ATTR_SLUG]),
ATTR_URL: repository.get(ATTR_URL),
ATTR_MAINTAINER: repository.get(ATTR_MAINTAINER),
})
return data
@api_process @api_process
async def ping(self, request): async def ping(self, request):
@@ -38,29 +87,56 @@ class APISupervisor(object):
@api_process @api_process
async def info(self, request): async def info(self, request):
"""Return host information.""" """Return host information."""
info = { return {
ATTR_VERSION: HASSIO_VERSION, ATTR_VERSION: HASSIO_VERSION,
ATTR_CURRENT: self.config.current_hassio, ATTR_LAST_VERSION: self.config.last_hassio,
ATTR_BETA: self.config.upstream_beta, ATTR_BETA_CHANNEL: self.config.upstream_beta,
ATTR_ADDONS: self.addons.list, ATTR_ARCH: self.addons.arch,
ATTR_ADDONS: self._addons_list(only_installed=True),
ATTR_ADDONS_REPOSITORIES: self.config.addons_repositories,
}
@api_process
async def available_addons(self, request):
"""Return information for all available addons."""
return {
ATTR_ADDONS: self._addons_list(),
ATTR_REPOSITORIES: self._repositories_list(),
} }
return info
@api_process @api_process
async def options(self, request): async def options(self, request):
"""Set supervisor options.""" """Set supervisor options."""
body = await api_validate(SCHEMA_OPTIONS, request) body = await api_validate(SCHEMA_OPTIONS, request)
if ATTR_BETA in body: if ATTR_BETA_CHANNEL in body:
self.config.upstream_beta = body[ATTR_BETA] self.config.upstream_beta = body[ATTR_BETA_CHANNEL]
return self.config.save() if ATTR_ADDONS_REPOSITORIES in body:
new = set(body[ATTR_ADDONS_REPOSITORIES])
old = set(self.config.addons_repositories)
# add new repositories
tasks = [self.addons.add_git_repository(url) for url in
set(new - old)]
if tasks:
await asyncio.shield(
asyncio.wait(tasks, loop=self.loop), loop=self.loop)
# remove old repositories
for url in set(old - new):
self.addons.drop_git_repository(url)
# read repository
self.addons.read_data_from_repositories()
return True
@api_process @api_process
async def update(self, request): async def update(self, request):
"""Update supervisor OS.""" """Update supervisor OS."""
body = await api_validate(SCHEMA_VERSION, request) body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.config.current_hassio) version = body.get(ATTR_VERSION, self.config.last_hassio)
if version == self.supervisor.version: if version == self.supervisor.version:
raise RuntimeError("Version is already in use") raise RuntimeError("Version is already in use")
@@ -71,7 +147,10 @@ class APISupervisor(object):
@api_process @api_process
async def reload(self, request): async def reload(self, request):
"""Reload addons, config ect.""" """Reload addons, config ect."""
tasks = [self.addons.reload(), self.config.fetch_update_infos()] tasks = [
self.addons.reload(), self.config.fetch_update_infos(),
self.host_control.load()
]
results, _ = await asyncio.shield( results, _ = await asyncio.shield(
asyncio.wait(tasks, loop=self.loop), loop=self.loop) asyncio.wait(tasks, loop=self.loop), loop=self.loop)
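
Updating addons_repositories in options() above boils down to set arithmetic between the submitted and the stored URL lists; a sketch with made-up repository URLs:

# Illustrative sketch: which repositories get cloned and which get dropped.
new = {'https://github.com/home-assistant/hassio-addons',
       'https://github.com/example/my-addons'}          # from the request body
old = {'https://github.com/home-assistant/hassio-addons',
       'https://github.com/example/stale-addons'}       # currently configured

to_add = new - old      # repositories that need a fresh git clone
to_drop = old - new     # repositories whose checkout gets removed

print(sorted(to_add))   # ['https://github.com/example/my-addons']
print(sorted(to_drop))  # ['https://github.com/example/stale-addons']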


@@ -1,5 +1,6 @@
"""Init file for HassIO util for rest api.""" """Init file for HassIO util for rest api."""
import json import json
import hashlib
import logging import logging
from aiohttp import web from aiohttp import web
@@ -32,6 +33,8 @@ def api_process(method):
if isinstance(answer, dict): if isinstance(answer, dict):
return api_return_ok(data=answer) return api_return_ok(data=answer)
if isinstance(answer, web.Response):
return answer
elif answer: elif answer:
return api_return_ok() return api_return_ok()
return api_return_error() return api_return_error()
@@ -39,11 +42,11 @@ def api_process(method):
return wrap_api return wrap_api
def api_process_hostcontroll(method): def api_process_hostcontrol(method):
"""Wrap HostControll calls to rest api.""" """Wrap HostControl calls to rest api."""
async def wrap_hostcontroll(api, *args, **kwargs): async def wrap_hostcontrol(api, *args, **kwargs):
"""Return host information.""" """Return host information."""
if not api.host_controll.active: if not api.host_control.active:
raise HTTPServiceUnavailable() raise HTTPServiceUnavailable()
try: try:
@@ -59,7 +62,7 @@ def api_process_hostcontroll(method):
return api_return_ok() return api_return_ok()
return api_return_error() return api_return_error()
return wrap_hostcontroll return wrap_hostcontrol
def api_process_raw(method): def api_process_raw(method):
@@ -81,7 +84,7 @@ def api_return_error(message=None):
return web.json_response({ return web.json_response({
JSON_RESULT: RESULT_ERROR, JSON_RESULT: RESULT_ERROR,
JSON_MESSAGE: message, JSON_MESSAGE: message,
}) }, status=400)
def api_return_ok(data=None): def api_return_ok(data=None):
@@ -101,3 +104,9 @@ async def api_validate(schema, request):
raise RuntimeError(humanize_error(data, ex)) from None raise RuntimeError(humanize_error(data, ex)) from None
return data return data
def hash_password(password):
"""Hash and salt our passwords."""
key = ")*()*SALT_HASSIO2123{}6554547485HSKA!!*JSLAfdasda$".format(password)
return hashlib.sha256(key.encode()).hexdigest()
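
hash_password() above is a plain SHA-256 over the password embedded in a fixed application salt string; copied here verbatim with a tiny usage demo:

# Demo of the helper added above (function body taken from the diff).
import hashlib

def hash_password(password):
    """Hash and salt our passwords."""
    key = ")*()*SALT_HASSIO2123{}6554547485HSKA!!*JSLAfdasda$".format(password)
    return hashlib.sha256(key.encode()).hexdigest()

stored = hash_password("correct horse")
print(stored)                                    # 64 hex characters
print(hash_password("correct horse") == stored)  # deterministic -> True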


@@ -1,7 +1,6 @@
"""Bootstrap HassIO.""" """Bootstrap HassIO."""
import logging import logging
import os import os
import stat
import signal import signal
from colorlog import ColoredFormatter from colorlog import ColoredFormatter
@@ -17,26 +16,46 @@ def initialize_system_data(websession):
config = CoreConfig(websession) config = CoreConfig(websession)
# homeassistant config folder # homeassistant config folder
if not os.path.isdir(config.path_config): if not config.path_config.is_dir():
_LOGGER.info( _LOGGER.info(
"Create Home-Assistant config folder %s", config.path_config) "Create Home-Assistant config folder %s", config.path_config)
os.mkdir(config.path_config) config.path_config.mkdir()
# homeassistant ssl folder # hassio ssl folder
if not os.path.isdir(config.path_ssl): if not config.path_ssl.is_dir():
_LOGGER.info("Create Home-Assistant ssl folder %s", config.path_ssl) _LOGGER.info("Create hassio ssl folder %s", config.path_ssl)
os.mkdir(config.path_ssl) config.path_ssl.mkdir()
# homeassistant addon data folder # hassio addon data folder
if not os.path.isdir(config.path_addons_data): if not config.path_addons_data.is_dir():
_LOGGER.info("Create Home-Assistant addon data folder %s", _LOGGER.info(
config.path_addons_data) "Create hassio addon data folder %s", config.path_addons_data)
os.mkdir(config.path_addons_data) config.path_addons_data.mkdir(parents=True)
if not os.path.isdir(config.path_addons_custom): if not config.path_addons_local.is_dir():
_LOGGER.info("Create Home-Assistant addon custom folder %s", _LOGGER.info("Create hassio addon local repository folder %s",
config.path_addons_custom) config.path_addons_local)
os.mkdir(config.path_addons_custom) config.path_addons_local.mkdir(parents=True)
if not config.path_addons_git.is_dir():
_LOGGER.info("Create hassio addon git repositories folder %s",
config.path_addons_git)
config.path_addons_git.mkdir(parents=True)
if not config.path_addons_build.is_dir():
_LOGGER.info("Create Home-Assistant addon build folder %s",
config.path_addons_build)
config.path_addons_build.mkdir(parents=True)
# hassio backup folder
if not config.path_backup.is_dir():
_LOGGER.info("Create hassio backup folder %s", config.path_backup)
config.path_backup.mkdir()
# share folder
if not config.path_share.is_dir():
_LOGGER.info("Create hassio share folder %s", config.path_share)
config.path_share.mkdir()
return config return config
@@ -76,8 +95,7 @@ def check_environment():
_LOGGER.fatal("Can't find %s in env!", key) _LOGGER.fatal("Can't find %s in env!", key)
return False return False
mode = os.stat(SOCKET_DOCKER)[stat.ST_MODE] if not SOCKET_DOCKER.is_socket():
if not stat.S_ISSOCK(mode):
_LOGGER.fatal("Can't find docker socket!") _LOGGER.fatal("Can't find docker socket!")
return False return False
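
The bootstrap changes above replace os.path/os.stat calls with pathlib. A small sketch of the calls involved, run against a temporary directory so it has no side effects:

# Illustrative sketch: the pathlib operations used by initialize_system_data().
from pathlib import Path
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    share = Path(tmp)

    addons_data = share / "addons" / "data"
    if not addons_data.is_dir():
        addons_data.mkdir(parents=True)        # create nested folders in one call
    print(addons_data.is_dir())                # True

    # the docker socket check; True only on a host with a running docker daemon
    print(Path("/var/run/docker.sock").is_socket())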


@@ -1,49 +1,85 @@
"""Bootstrap HassIO.""" """Bootstrap HassIO."""
from datetime import datetime
import logging import logging
import json
import os import os
from pathlib import Path, PurePath
import voluptuous as vol
from voluptuous.humanize import humanize_error
from .const import FILE_HASSIO_CONFIG, HASSIO_SHARE from .const import FILE_HASSIO_CONFIG, HASSIO_SHARE
from .tools import ( from .tools import (
fetch_current_versions, write_json_file, read_json_file) fetch_last_versions, write_json_file, read_json_file)
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
HOMEASSISTANT_CONFIG = "{}/homeassistant" DATETIME_FORMAT = "%Y%m%d %H:%M:%S"
HOMEASSISTANT_IMAGE = 'homeassistant_image'
HOMEASSISTANT_CURRENT = 'homeassistant_current'
HASSIO_SSL = "{}/ssl" HOMEASSISTANT_CONFIG = PurePath("homeassistant")
HASSIO_CURRENT = 'hassio_current' HOMEASSISTANT_LAST = 'homeassistant_last'
HASSIO_SSL = PurePath("ssl")
HASSIO_LAST = 'hassio_last'
HASSIO_CLEANUP = 'hassio_cleanup' HASSIO_CLEANUP = 'hassio_cleanup'
ADDONS_REPO = "{}/addons" ADDONS_CORE = PurePath("addons/core")
ADDONS_DATA = "{}/addons_data" ADDONS_LOCAL = PurePath("addons/local")
ADDONS_CUSTOM = "{}/addons_custom" ADDONS_GIT = PurePath("addons/git")
ADDONS_DATA = PurePath("addons/data")
ADDONS_BUILD = PurePath("addons/build")
ADDONS_CUSTOM_LIST = 'addons_custom_list'
BACKUP_DATA = PurePath("backup")
SHARE_DATA = PurePath("share")
UPSTREAM_BETA = 'upstream_beta' UPSTREAM_BETA = 'upstream_beta'
API_ENDPOINT = 'api_endpoint' API_ENDPOINT = 'api_endpoint'
SECURITY_INITIALIZE = 'security_initialize'
SECURITY_TOTP = 'security_totp'
SECURITY_PASSWORD = 'security_password'
SECURITY_SESSIONS = 'security_sessions'
# pylint: disable=no-value-for-parameter
SCHEMA_CONFIG = vol.Schema({
vol.Optional(UPSTREAM_BETA, default=False): vol.Boolean(),
vol.Optional(API_ENDPOINT): vol.Coerce(str),
vol.Optional(HOMEASSISTANT_LAST): vol.Coerce(str),
vol.Optional(HASSIO_LAST): vol.Coerce(str),
vol.Optional(HASSIO_CLEANUP): vol.Coerce(str),
vol.Optional(ADDONS_CUSTOM_LIST, default=[]): [vol.Url()],
vol.Optional(SECURITY_INITIALIZE, default=False): vol.Boolean(),
vol.Optional(SECURITY_TOTP): vol.Coerce(str),
vol.Optional(SECURITY_PASSWORD): vol.Coerce(str),
vol.Optional(SECURITY_SESSIONS, default={}):
{vol.Coerce(str): vol.Coerce(str)},
}, extra=vol.REMOVE_EXTRA)
class Config(object): class Config(object):
"""Hold all config data.""" """Hold all config data."""
def __init__(self, config_file): def __init__(self, config_file):
"""Initialize config object.""" """Initialize config object."""
self._filename = config_file self._file = config_file
self._data = {} self._data = {}
# init or load data # init or load data
if os.path.isfile(self._filename): if self._file.is_file():
try: try:
self._data = read_json_file(self._filename) self._data = read_json_file(self._file)
except OSError: except (OSError, json.JSONDecodeError):
_LOGGER.warning("Can't read %s", self._filename) _LOGGER.warning("Can't read %s", self._file)
self._data = {}
def save(self): def save(self):
"""Store data to config file.""" """Store data to config file."""
if not write_json_file(self._filename, self._data): if not write_json_file(self._file, self._data):
_LOGGER.exception("Can't store config in %s", self._filename) _LOGGER.error("Can't store config in %s", self._file)
return False return False
return True return True
@@ -57,23 +93,23 @@ class CoreConfig(Config):
super().__init__(FILE_HASSIO_CONFIG) super().__init__(FILE_HASSIO_CONFIG)
# init data # validate data
if not self._data: try:
self._data.update({ self._data = SCHEMA_CONFIG(self._data)
HOMEASSISTANT_IMAGE: os.environ['HOMEASSISTANT_REPOSITORY'],
UPSTREAM_BETA: False,
})
self.save() self.save()
except vol.Invalid as ex:
_LOGGER.warning(
"Invalid config %s", humanize_error(self._data, ex))
async def fetch_update_infos(self): async def fetch_update_infos(self):
"""Read current versions from web.""" """Read current versions from web."""
current = await fetch_current_versions( last = await fetch_last_versions(
self.websession, beta=self.upstream_beta) self.websession, beta=self.upstream_beta)
if current: if last:
self._data.update({ self._data.update({
HOMEASSISTANT_CURRENT: current.get('homeassistant_tag'), HOMEASSISTANT_LAST: last.get('homeassistant'),
HASSIO_CURRENT: current.get('hassio_tag'), HASSIO_LAST: last.get('hassio'),
}) })
self.save() self.save()
return True return True
@@ -93,7 +129,7 @@ class CoreConfig(Config):
@property @property
def upstream_beta(self): def upstream_beta(self):
"""Return True if we run in beta upstream.""" """Return True if we run in beta upstream."""
return self._data.get(UPSTREAM_BETA, False) return self._data[UPSTREAM_BETA]
@upstream_beta.setter @upstream_beta.setter
def upstream_beta(self, value): def upstream_beta(self, value):
@@ -117,59 +153,168 @@ class CoreConfig(Config):
@property @property
def homeassistant_image(self): def homeassistant_image(self):
"""Return docker homeassistant repository.""" """Return docker homeassistant repository."""
return self._data.get(HOMEASSISTANT_IMAGE) return os.environ['HOMEASSISTANT_REPOSITORY']
@property @property
def current_homeassistant(self): def last_homeassistant(self):
"""Actual version of homeassistant.""" """Actual version of homeassistant."""
return self._data.get(HOMEASSISTANT_CURRENT) return self._data.get(HOMEASSISTANT_LAST)
@property @property
def current_hassio(self): def last_hassio(self):
"""Actual version of hassio.""" """Actual version of hassio."""
return self._data.get(HASSIO_CURRENT) return self._data.get(HASSIO_LAST)
@property @property
def path_hassio_docker(self): def path_extern_hassio(self):
"""Return hassio data path extern for docker.""" """Return hassio data path extern for docker."""
return os.environ['SUPERVISOR_SHARE'] return PurePath(os.environ['SUPERVISOR_SHARE'])
@property @property
def path_config_docker(self): def path_extern_config(self):
"""Return config path extern for docker.""" """Return config path extern for docker."""
return HOMEASSISTANT_CONFIG.format(self.path_hassio_docker) return str(PurePath(self.path_extern_hassio, HOMEASSISTANT_CONFIG))
@property @property
def path_config(self): def path_config(self):
"""Return config path inside supervisor.""" """Return config path inside supervisor."""
return HOMEASSISTANT_CONFIG.format(HASSIO_SHARE) return Path(HASSIO_SHARE, HOMEASSISTANT_CONFIG)
@property @property
def path_ssl_docker(self): def path_extern_ssl(self):
"""Return SSL path extern for docker.""" """Return SSL path extern for docker."""
return HASSIO_SSL.format(self.path_hassio_docker) return str(PurePath(self.path_extern_hassio, HASSIO_SSL))
@property @property
def path_ssl(self): def path_ssl(self):
"""Return SSL path inside supervisor.""" """Return SSL path inside supervisor."""
return HASSIO_SSL.format(HASSIO_SHARE) return Path(HASSIO_SHARE, HASSIO_SSL)
@property @property
def path_addons_repo(self): def path_addons_core(self):
"""Return git repo path for addons.""" """Return git path for core addons."""
return ADDONS_REPO.format(HASSIO_SHARE) return Path(HASSIO_SHARE, ADDONS_CORE)
@property @property
def path_addons_custom(self): def path_addons_git(self):
"""Return path for git addons."""
return Path(HASSIO_SHARE, ADDONS_GIT)
@property
def path_addons_local(self):
"""Return path for customs addons.""" """Return path for customs addons."""
return ADDONS_CUSTOM.format(HASSIO_SHARE) return Path(HASSIO_SHARE, ADDONS_LOCAL)
@property
def path_extern_addons_local(self):
"""Return path for customs addons."""
return PurePath(self.path_extern_hassio, ADDONS_LOCAL)
@property @property
def path_addons_data(self): def path_addons_data(self):
"""Return root addon data folder.""" """Return root addon data folder."""
return ADDONS_DATA.format(HASSIO_SHARE) return Path(HASSIO_SHARE, ADDONS_DATA)
@property @property
def path_addons_data_docker(self): def path_extern_addons_data(self):
"""Return root addon data folder extern for docker.""" """Return root addon data folder extern for docker."""
return ADDONS_DATA.format(self.path_hassio_docker) return PurePath(self.path_extern_hassio, ADDONS_DATA)
@property
def path_addons_build(self):
"""Return root addon build folder."""
return Path(HASSIO_SHARE, ADDONS_BUILD)
@property
def path_backup(self):
"""Return root backup data folder."""
return Path(HASSIO_SHARE, BACKUP_DATA)
@property
def path_extern_backup(self):
"""Return root backup data folder extern for docker."""
return PurePath(self.path_extern_hassio, BACKUP_DATA)
@property
def path_share(self):
"""Return root share data folder."""
return Path(HASSIO_SHARE, SHARE_DATA)
@property
def path_extern_share(self):
"""Return root share data folder extern for docker."""
return PurePath(self.path_extern_hassio, SHARE_DATA)
@property
def addons_repositories(self):
"""Return list of addons custom repositories."""
return self._data[ADDONS_CUSTOM_LIST]
@addons_repositories.setter
def addons_repositories(self, repo):
"""Add a custom repository to list."""
if repo in self._data[ADDONS_CUSTOM_LIST]:
return
self._data[ADDONS_CUSTOM_LIST].append(repo)
self.save()
def drop_addon_repository(self, repo):
"""Remove a custom repository from list."""
if repo not in self._data[ADDONS_CUSTOM_LIST]:
return
self._data[ADDONS_CUSTOM_LIST].remove(repo)
self.save()
@property
def security_initialize(self):
"""Return is security was initialize."""
return self._data[SECURITY_INITIALIZE]
@security_initialize.setter
def security_initialize(self, value):
"""Set is security initialize."""
self._data[SECURITY_INITIALIZE] = value
self.save()
@property
def security_totp(self):
"""Return the TOTP key."""
return self._data.get(SECURITY_TOTP)
@security_totp.setter
def security_totp(self, value):
"""Set the TOTP key."""
self._data[SECURITY_TOTP] = value
self.save()
@property
def security_password(self):
"""Return the password key."""
return self._data.get(SECURITY_PASSWORD)
@security_password.setter
def security_password(self, value):
"""Set the password key."""
self._data[SECURITY_PASSWORD] = value
self.save()
@property
def security_sessions(self):
"""Return api sessions."""
return {session: datetime.strptime(until, DATETIME_FORMAT) for
session, until in self._data[SECURITY_SESSIONS].items()}
@security_sessions.setter
def security_sessions(self, value):
"""Set the a new session."""
session, valid = value
if valid is None:
self._data[SECURITY_SESSIONS].pop(session, None)
else:
self._data[SECURITY_SESSIONS].update(
{session: valid.strftime(DATETIME_FORMAT)}
)
self.save()
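
security_sessions above serializes expiry timestamps with DATETIME_FORMAT so they survive the JSON config file; a minimal round-trip sketch:

# Illustrative sketch: datetime round-trip through the string format used above.
from datetime import datetime, timedelta

DATETIME_FORMAT = "%Y%m%d %H:%M:%S"

valid_until = datetime.now() + timedelta(days=1)
stored = valid_until.strftime(DATETIME_FORMAT)        # what lands in config.json
restored = datetime.strptime(stored, DATETIME_FORMAT)

print(stored)
print(restored <= valid_until)   # True; sub-second precision is dropped on save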


@@ -1,28 +1,38 @@
"""Const file for HassIO.""" """Const file for HassIO."""
HASSIO_VERSION = '0.14' from pathlib import Path
URL_HASSIO_VERSION = \ HASSIO_VERSION = '0.29'
'https://raw.githubusercontent.com/pvizeli/hassio/master/version.json'
URL_HASSIO_VERSION_BETA = \
'https://raw.githubusercontent.com/pvizeli/hassio/master/version_beta.json'
URL_HASSIO_ADDONS = 'https://github.com/pvizeli/hassio-addons' URL_HASSIO_VERSION = ('https://raw.githubusercontent.com/home-assistant/'
'hassio/master/version.json')
URL_HASSIO_VERSION_BETA = ('https://raw.githubusercontent.com/home-assistant/'
'hassio/dev/version.json')
DOCKER_REPO = "pvizeli" URL_HASSIO_ADDONS = 'https://github.com/home-assistant/hassio-addons'
HASSIO_SHARE = "/data" HASSIO_SHARE = Path("/data")
RUN_UPDATE_INFO_TASKS = 28800 RUN_UPDATE_INFO_TASKS = 28800
RUN_UPDATE_SUPERVISOR_TASKS = 29100 RUN_UPDATE_SUPERVISOR_TASKS = 29100
RUN_RELOAD_ADDONS_TASKS = 28800 RUN_RELOAD_ADDONS_TASKS = 28800
RUN_WATCHDOG_HOMEASSISTANT = 15
RUN_CLEANUP_API_SESSIONS = 900
RESTART_EXIT_CODE = 100 RESTART_EXIT_CODE = 100
FILE_HASSIO_ADDONS = "{}/addons.json".format(HASSIO_SHARE) FILE_HASSIO_ADDONS = Path(HASSIO_SHARE, "addons.json")
FILE_HASSIO_CONFIG = "{}/config.json".format(HASSIO_SHARE) FILE_HASSIO_CONFIG = Path(HASSIO_SHARE, "config.json")
SOCKET_DOCKER = "/var/run/docker.sock" SOCKET_DOCKER = Path("/var/run/docker.sock")
SOCKET_HC = "/var/run/hassio-hc.sock" SOCKET_HC = Path("/var/run/hassio-hc.sock")
LABEL_VERSION = 'io.hass.version'
LABEL_ARCH = 'io.hass.arch'
LABEL_TYPE = 'io.hass.type'
META_ADDON = 'addon'
META_SUPERVISOR = 'supervisor'
META_HOMEASSISTANT = 'homeassistant'
JSON_RESULT = 'result' JSON_RESULT = 'result'
JSON_DATA = 'data' JSON_DATA = 'data'
@@ -31,24 +41,42 @@ JSON_MESSAGE = 'message'
RESULT_ERROR = 'error' RESULT_ERROR = 'error'
RESULT_OK = 'ok' RESULT_OK = 'ok'
ATTR_ARCH = 'arch'
ATTR_HOSTNAME = 'hostname'
ATTR_OS = 'os'
ATTR_TYPE = 'type'
ATTR_SOURCE = 'source'
ATTR_FEATURES = 'features'
ATTR_ADDONS = 'addons' ATTR_ADDONS = 'addons'
ATTR_VERSION = 'version' ATTR_VERSION = 'version'
ATTR_CURRENT = 'current' ATTR_LAST_VERSION = 'last_version'
ATTR_BETA = 'beta' ATTR_BETA_CHANNEL = 'beta_channel'
ATTR_NAME = 'name' ATTR_NAME = 'name'
ATTR_SLUG = 'slug' ATTR_SLUG = 'slug'
ATTR_DESCRIPTON = 'description' ATTR_DESCRIPTON = 'description'
ATTR_STARTUP = 'startup' ATTR_STARTUP = 'startup'
ATTR_BOOT = 'boot' ATTR_BOOT = 'boot'
ATTR_PORTS = 'ports' ATTR_PORTS = 'ports'
ATTR_MAP_CONFIG = 'map_config' ATTR_MAP = 'map'
ATTR_MAP_SSL = 'map_ssl'
ATTR_OPTIONS = 'options' ATTR_OPTIONS = 'options'
ATTR_INSTALLED = 'installed' ATTR_INSTALLED = 'installed'
ATTR_DEDICATED = 'dedicated' ATTR_DETACHED = 'detached'
ATTR_STATE = 'state' ATTR_STATE = 'state'
ATTR_SCHEMA = 'schema' ATTR_SCHEMA = 'schema'
ATTR_IMAGE = 'image' ATTR_IMAGE = 'image'
ATTR_ADDONS_REPOSITORIES = 'addons_repositories'
ATTR_REPOSITORY = 'repository'
ATTR_REPOSITORIES = 'repositories'
ATTR_URL = 'url'
ATTR_MAINTAINER = 'maintainer'
ATTR_PASSWORD = 'password'
ATTR_TOTP = 'totp'
ATTR_INITIALIZE = 'initialize'
ATTR_SESSION = 'session'
ATTR_LOCATON = 'location'
ATTR_BUILD = 'build'
ATTR_DEVICES = 'devices'
ATTR_ENVIRONMENT = 'environment'
STARTUP_BEFORE = 'before' STARTUP_BEFORE = 'before'
STARTUP_AFTER = 'after' STARTUP_AFTER = 'after'
@@ -59,3 +87,15 @@ BOOT_MANUAL = 'manual'
STATE_STARTED = 'started' STATE_STARTED = 'started'
STATE_STOPPED = 'stopped' STATE_STOPPED = 'stopped'
MAP_CONFIG = 'config'
MAP_SSL = 'ssl'
MAP_ADDONS = 'addons'
MAP_BACKUP = 'backup'
MAP_SHARE = 'share'
MAP_MNT = 'mnt'
ARCH_ARMHF = 'armhf'
ARCH_AARCH64 = 'aarch64'
ARCH_AMD64 = 'amd64'
ARCH_I386 = 'i386'


@@ -8,13 +8,17 @@ import docker
from . import bootstrap from . import bootstrap
from .addons import AddonManager from .addons import AddonManager
from .api import RestAPI from .api import RestAPI
from .host_controll import HostControll from .host_control import HostControl
from .const import ( from .const import (
SOCKET_DOCKER, RUN_UPDATE_INFO_TASKS, RUN_RELOAD_ADDONS_TASKS, SOCKET_DOCKER, RUN_UPDATE_INFO_TASKS, RUN_RELOAD_ADDONS_TASKS,
RUN_UPDATE_SUPERVISOR_TASKS, STARTUP_AFTER, STARTUP_BEFORE) RUN_UPDATE_SUPERVISOR_TASKS, RUN_WATCHDOG_HOMEASSISTANT,
RUN_CLEANUP_API_SESSIONS, STARTUP_AFTER, STARTUP_BEFORE)
from .scheduler import Scheduler from .scheduler import Scheduler
from .dock.homeassistant import DockerHomeAssistant from .dock.homeassistant import DockerHomeAssistant
from .dock.supervisor import DockerSupervisor from .dock.supervisor import DockerSupervisor
from .tasks import (
hassio_update, homeassistant_watchdog, homeassistant_setup,
api_sessions_cleanup)
from .tools import get_arch_from_image, get_local_ip from .tools import get_arch_from_image, get_local_ip
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
@@ -32,7 +36,7 @@ class HassIO(object):
self.scheduler = Scheduler(self.loop) self.scheduler = Scheduler(self.loop)
self.api = RestAPI(self.config, self.loop) self.api = RestAPI(self.config, self.loop)
self.dock = docker.DockerClient( self.dock = docker.DockerClient(
base_url="unix:/{}".format(SOCKET_DOCKER), version='auto') base_url="unix:/{}".format(str(SOCKET_DOCKER)), version='auto')
# init basic docker container # init basic docker container
self.supervisor = DockerSupervisor( self.supervisor = DockerSupervisor(
@@ -40,8 +44,8 @@ class HassIO(object):
self.homeassistant = DockerHomeAssistant( self.homeassistant = DockerHomeAssistant(
self.config, self.loop, self.dock) self.config, self.loop, self.dock)
# init HostControll # init HostControl
self.host_controll = HostControll(self.loop) self.host_control = HostControl(self.loop)
# init addon system # init addon system
self.addons = AddonManager(self.config, self.loop, self.dock) self.addons = AddonManager(self.config, self.loop, self.dock)
@@ -55,22 +59,27 @@ class HassIO(object):
# set api endpoint # set api endpoint
self.config.api_endpoint = await get_local_ip(self.loop) self.config.api_endpoint = await get_local_ip(self.loop)
# hostcontroll # hostcontrol
host_info = await self.host_controll.info() await self.host_control.load()
if host_info:
self.host_controll.version = host_info.get('version') # schedule update info tasks
_LOGGER.info( self.scheduler.register_task(
"Connected to HostControll. OS: %s Version: %s Hostname: %s " self.host_control.load, RUN_UPDATE_INFO_TASKS)
"Feature-lvl: %d", host_info.get('os'),
host_info.get('version'), host_info.get('hostname'),
host_info.get('level', 0))
# rest api views # rest api views
self.api.register_host(self.host_controll) self.api.register_host(self.host_control)
self.api.register_network(self.host_controll) self.api.register_network(self.host_control)
self.api.register_supervisor(self.supervisor, self.addons) self.api.register_supervisor(
self.supervisor, self.addons, self.host_control)
self.api.register_homeassistant(self.homeassistant) self.api.register_homeassistant(self.homeassistant)
self.api.register_addons(self.addons) self.api.register_addons(self.addons)
self.api.register_security()
self.api.register_panel()
# schedule api session cleanup
self.scheduler.register_task(
api_sessions_cleanup(self.config), RUN_CLEANUP_API_SESSIONS,
now=True)
# schedule update info tasks # schedule update info tasks
self.scheduler.register_task( self.scheduler.register_task(
@@ -80,7 +89,8 @@ class HassIO(object):
# first start of supervisor? # first start of supervisor?
if not await self.homeassistant.exists(): if not await self.homeassistant.exists():
_LOGGER.info("No HomeAssistant docker found.") _LOGGER.info("No HomeAssistant docker found.")
await self._setup_homeassistant() await homeassistant_setup(
self.config, self.loop, self.homeassistant)
# Load addons # Load addons
arch = get_arch_from_image(self.supervisor.image) arch = get_arch_from_image(self.supervisor.image)
@@ -92,7 +102,8 @@ class HassIO(object):
# schedule self update task # schedule self update task
self.scheduler.register_task( self.scheduler.register_task(
self._hassio_update, RUN_UPDATE_SUPERVISOR_TASKS) hassio_update(self.config, self.supervisor),
RUN_UPDATE_SUPERVISOR_TASKS)
async def start(self): async def start(self):
"""Start HassIO orchestration.""" """Start HassIO orchestration."""
@@ -100,19 +111,26 @@ class HassIO(object):
await self.api.start() await self.api.start()
_LOGGER.info("Start hassio api on %s", self.config.api_endpoint) _LOGGER.info("Start hassio api on %s", self.config.api_endpoint)
# HomeAssistant is already running / supervisor have only reboot try:
if await self.homeassistant.is_running(): # HomeAssistant is already running / supervisor have only reboot
_LOGGER.info("HassIO reboot detected") if await self.homeassistant.is_running():
return _LOGGER.info("HassIO reboot detected")
return
# start addon mark as before # start addon mark as before
await self.addons.auto_boot(STARTUP_BEFORE) await self.addons.auto_boot(STARTUP_BEFORE)
# run HomeAssistant # run HomeAssistant
await self.homeassistant.run() await self.homeassistant.run()
# start addon mark as after # start addon mark as after
await self.addons.auto_boot(STARTUP_AFTER) await self.addons.auto_boot(STARTUP_AFTER)
finally:
# schedule homeassistant watchdog
self.scheduler.register_task(
homeassistant_watchdog(self.loop, self.homeassistant),
RUN_WATCHDOG_HOMEASSISTANT)
async def stop(self, exit_code=0): async def stop(self, exit_code=0):
"""Stop a running orchestration.""" """Stop a running orchestration."""
@@ -125,28 +143,3 @@ class HassIO(object):
self.exit_code = exit_code self.exit_code = exit_code
self.loop.stop() self.loop.stop()
async def _setup_homeassistant(self):
"""Install a homeassistant docker container."""
while True:
# read homeassistant tag and install it
if not self.config.current_homeassistant:
await self.config.fetch_update_infos()
tag = self.config.current_homeassistant
if tag and await self.homeassistant.install(tag):
break
_LOGGER.warning("Error on setup HomeAssistant. Retry in 60.")
await asyncio.sleep(60, loop=self.loop)
# store version
_LOGGER.info("HomeAssistant docker now installed.")
async def _hassio_update(self):
"""Check and run update of supervisor hassio."""
if self.config.current_hassio == self.supervisor.version:
return
_LOGGER.info(
"Found new HassIO version %s.", self.config.current_hassio)
await self.supervisor.update(self.config.current_hassio)


@@ -5,6 +5,7 @@ import logging
import docker import docker
from ..const import LABEL_VERSION
from ..tools import get_version_from_env from ..tools import get_version_from_env
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
@@ -33,6 +34,19 @@ class DockerBase(object):
"""Return True if a task is in progress.""" """Return True if a task is in progress."""
return self._lock.locked() return self._lock.locked()
def process_metadata(self, metadata=None, force=False):
"""Read metadata and set it to object."""
if not force and self.version:
return
# read metadata
metadata = metadata or self.container.attrs
if LABEL_VERSION in metadata['Config']['Labels']:
self.version = metadata['Config']['Labels'][LABEL_VERSION]
else:
# dedicated
self.version = get_version_from_env(metadata['Config']['Env'])
async def install(self, tag): async def install(self, tag):
"""Pull docker image.""" """Pull docker image."""
if self._lock.locked(): if self._lock.locked():
@@ -52,12 +66,12 @@ class DockerBase(object):
image = self.dock.images.pull("{}:{}".format(self.image, tag)) image = self.dock.images.pull("{}:{}".format(self.image, tag))
image.tag(self.image, tag='latest') image.tag(self.image, tag='latest')
self.version = get_version_from_env(image.attrs['Config']['Env']) self.process_metadata(metadata=image.attrs, force=True)
_LOGGER.info("Tag image %s with version %s as latest",
self.image, self.version)
except docker.errors.APIError as err: except docker.errors.APIError as err:
_LOGGER.error("Can't install %s:%s -> %s.", self.image, tag, err) _LOGGER.error("Can't install %s:%s -> %s.", self.image, tag, err)
return False return False
_LOGGER.info("Tag image %s with version %s as latest", self.image, tag)
return True return True
def exists(self): def exists(self):
@@ -74,7 +88,7 @@ class DockerBase(object):
""" """
try: try:
image = self.dock.images.get(self.image) image = self.dock.images.get(self.image)
self.version = get_version_from_env(image.attrs['Config']['Env']) self.process_metadata(metadata=image.attrs)
except docker.errors.DockerException: except docker.errors.DockerException:
return False return False
@@ -95,8 +109,7 @@ class DockerBase(object):
if not self.container: if not self.container:
try: try:
self.container = self.dock.containers.get(self.docker_name) self.container = self.dock.containers.get(self.docker_name)
self.version = get_version_from_env( self.process_metadata()
self.container.attrs['Config']['Env'])
except docker.errors.DockerException: except docker.errors.DockerException:
return False return False
else: else:
@@ -121,8 +134,7 @@ class DockerBase(object):
try: try:
self.container = self.dock.containers.get(self.docker_name) self.container = self.dock.containers.get(self.docker_name)
self.image = self.container.attrs['Config']['Image'] self.image = self.container.attrs['Config']['Image']
self.version = get_version_from_env( self.process_metadata()
self.container.attrs['Config']['Env'])
_LOGGER.info("Attach to image %s with version %s", _LOGGER.info("Attach to image %s with version %s",
self.image, self.version) self.image, self.version)
except (docker.errors.DockerException, KeyError): except (docker.errors.DockerException, KeyError):
@@ -199,10 +211,14 @@ class DockerBase(object):
self.image, self.version) self.image, self.version)
try: try:
self.dock.images.remove( with suppress(docker.errors.ImageNotFound):
image="{}:latest".format(self.image), force=True) self.dock.images.remove(
self.dock.images.remove( image="{}:latest".format(self.image), force=True)
image="{}:{}".format(self.image, self.version), force=True)
with suppress(docker.errors.ImageNotFound):
self.dock.images.remove(
image="{}:{}".format(self.image, self.version), force=True)
except docker.errors.DockerException as err: except docker.errors.DockerException as err:
_LOGGER.warning("Can't remove image %s -> %s", self.image, err) _LOGGER.warning("Can't remove image %s -> %s", self.image, err)
return False return False
@@ -262,3 +278,30 @@ class DockerBase(object):
return self.container.logs(tail=100, stdout=True, stderr=True) return self.container.logs(tail=100, stdout=True, stderr=True)
except docker.errors.DockerException as err: except docker.errors.DockerException as err:
_LOGGER.warning("Can't grap logs from %s -> %s", self.image, err) _LOGGER.warning("Can't grap logs from %s -> %s", self.image, err)
async def restart(self):
"""Restart docker container."""
if self._lock.locked():
_LOGGER.error("Can't excute restart while a task is in progress")
return False
async with self._lock:
return await self.loop.run_in_executor(None, self._restart)
def _restart(self):
"""Restart docker container.
Need run inside executor.
"""
if not self.container:
return False
_LOGGER.info("Restart %s", self.image)
try:
self.container.restart(timeout=30)
except docker.errors.DockerException as err:
_LOGGER.warning("Can't restart %s -> %s", self.image, err)
return False
return True
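
process_metadata() above prefers the io.hass.version label and only falls back to the legacy environment variable. A sketch against a hand-built attrs dict (the env variable name is an assumption here, and no docker daemon is involved):

# Illustrative sketch: label-first version detection on a docker-py style attrs dict.
LABEL_VERSION = 'io.hass.version'

def version_from_attrs(attrs):
    """Prefer the io.hass.version label, fall back to an assumed VERSION env entry."""
    labels = attrs['Config'].get('Labels') or {}
    if LABEL_VERSION in labels:
        return labels[LABEL_VERSION]
    for env in attrs['Config'].get('Env') or []:
        if env.startswith('VERSION='):
            return env.split('=', 1)[1]
    return None

attrs = {'Config': {'Labels': {LABEL_VERSION: '1.1'}, 'Env': ['VERSION=1.0']}}
print(version_from_attrs(attrs))   # '1.1' -> the label wins over the env variable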


@@ -1,15 +1,18 @@
"""Init file for HassIO addon docker object.""" """Init file for HassIO addon docker object."""
import logging import logging
from pathlib import Path
import shutil
import docker import docker
from . import DockerBase from . import DockerBase
from ..tools import get_version_from_env from .util import dockerfile_template
from ..const import (
META_ADDON, MAP_CONFIG, MAP_SSL, MAP_ADDONS, MAP_BACKUP, MAP_SHARE,
MAP_MNT)
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
HASS_DOCKER_NAME = 'homeassistant'
class DockerAddon(DockerBase): class DockerAddon(DockerBase):
"""Docker hassio wrapper for HomeAssistant.""" """Docker hassio wrapper for HomeAssistant."""
@@ -26,6 +29,54 @@ class DockerAddon(DockerBase):
        """Return name of docker container."""
        return "addon_{}".format(self.addon)

+    @property
+    def volumes(self):
+        """Generate volumes for mappings."""
+        volumes = {
+            str(self.addons_data.path_extern_data(self.addon)): {
+                'bind': '/data', 'mode': 'rw'
+            }}
+
+        addon_mapping = self.addons_data.map_volumes(self.addon)
+
+        if MAP_CONFIG in addon_mapping:
+            volumes.update({
+                str(self.config.path_extern_config): {
+                    'bind': '/config', 'mode': addon_mapping[MAP_CONFIG]
+                }})
+
+        if MAP_SSL in addon_mapping:
+            volumes.update({
+                str(self.config.path_extern_ssl): {
+                    'bind': '/ssl', 'mode': addon_mapping[MAP_SSL]
+                }})
+
+        if MAP_ADDONS in addon_mapping:
+            volumes.update({
+                str(self.config.path_extern_addons_local): {
+                    'bind': '/addons', 'mode': addon_mapping[MAP_ADDONS]
+                }})
+
+        if MAP_BACKUP in addon_mapping:
+            volumes.update({
+                str(self.config.path_extern_backup): {
+                    'bind': '/backup', 'mode': addon_mapping[MAP_BACKUP]
+                }})
+
+        if MAP_SHARE in addon_mapping:
+            volumes.update({
+                str(self.config.path_extern_share): {
+                    'bind': '/share', 'mode': addon_mapping[MAP_SHARE]
+                }})
+
+        if MAP_MNT in addon_mapping:
+            volumes.update({
+                '/mnt': {
+                    'bind': '/mnt', 'mode': addon_mapping[MAP_MNT]
+                }})
+
+        return volumes
+
    def _run(self):
        """Run docker image.
@@ -37,22 +88,6 @@ class DockerAddon(DockerBase):
        # cleanup old container
        self._stop()

-        # volumes
-        volumes = {
-            self.addons_data.path_data_docker(self.addon): {
-                'bind': '/data', 'mode': 'rw'
-            }}
-        if self.addons_data.need_config(self.addon):
-            volumes.update({
-                self.config.path_config_docker: {
-                    'bind': '/config', 'mode': 'rw'
-                }})
-        if self.addons_data.need_ssl(self.addon):
-            volumes.update({
-                self.config.path_ssl_docker: {
-                    'bind': '/ssl', 'mode': 'rw'
-                }})
-
        try:
            self.container = self.dock.containers.run(
                self.image,
@@ -60,12 +95,12 @@ class DockerAddon(DockerBase):
                detach=True,
                network_mode='bridge',
                ports=self.addons_data.get_ports(self.addon),
-                volumes=volumes,
+                devices=self.addons_data.get_devices(self.addon),
+                environment=self.addons_data.get_environment(self.addon),
+                volumes=self.volumes,
            )

-            self.version = get_version_from_env(
-                self.container.attrs['Config']['Env'])
+            self.process_metadata()
            _LOGGER.info("Start docker addon %s with version %s",
                         self.image, self.version)
@@ -80,11 +115,87 @@ class DockerAddon(DockerBase):

        Need run inside executor.
        """
+        # read container
        try:
            self.container = self.dock.containers.get(self.docker_name)
-            self.version = get_version_from_env(
-                self.container.attrs['Config']['Env'])
+            self.process_metadata()
+            _LOGGER.info("Attach to container %s with version %s",
+                         self.image, self.version)
+            return
+        except (docker.errors.DockerException, KeyError):
+            pass
+
+        # read image
+        try:
+            image = self.dock.images.get(self.image)
+            self.process_metadata(metadata=image.attrs)
            _LOGGER.info("Attach to image %s with version %s",
                         self.image, self.version)
        except (docker.errors.DockerException, KeyError):
-            pass
+            _LOGGER.error("No container/image found for %s", self.image)
+
+    def _install(self, tag):
+        """Pull docker image or build it.
+
+        Need run inside executor.
+        """
+        if self.addons_data.need_build(self.addon):
+            return self._build(tag)
+
+        return super()._install(tag)
+
+    async def build(self, tag):
+        """Build a docker container."""
+        if self._lock.locked():
+            _LOGGER.error("Can't excute build while a task is in progress")
+            return False
+
+        async with self._lock:
+            return await self.loop.run_in_executor(None, self._build, tag)
+
+    def _build(self, tag):
+        """Build a docker container.
+
+        Need run inside executor.
+        """
+        build_dir = Path(self.config.path_addons_build, self.addon)
+        try:
+            # prepare temporary addon build folder
+            try:
+                source = self.addons_data.path_addon_location(self.addon)
+                shutil.copytree(str(source), str(build_dir))
+            except shutil.Error as err:
+                _LOGGER.error("Can't copy %s to temporary build folder -> %s",
+                              source, build_dir)
+                return False
+
+            # prepare Dockerfile
+            try:
+                dockerfile_template(
+                    Path(build_dir, 'Dockerfile'), self.addons_data.arch,
+                    tag, META_ADDON)
+            except OSError as err:
+                _LOGGER.error("Can't prepare dockerfile -> %s", err)
+
+            # run docker build
+            try:
+                build_tag = "{}:{}".format(self.image, tag)
+
+                _LOGGER.info("Start build %s on %s", build_tag, build_dir)
+                image = self.dock.images.build(
+                    path=str(build_dir), tag=build_tag, pull=True)
+
+                image.tag(self.image, tag='latest')
+                self.process_metadata(metadata=image.attrs, force=True)
+            except (docker.errors.DockerException, TypeError) as err:
+                _LOGGER.error("Can't build %s -> %s", build_tag, err)
+                return False
+
+            _LOGGER.info("Build %s done", build_tag)
+            return True
+
+        finally:
+            shutil.rmtree(str(build_dir), ignore_errors=True)
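For context, the new volumes property and the devices/environment keyword arguments feed straight into docker-py's containers.run(). A rough illustration of the dict the property can produce for an add-on that maps config read/write and ssl read-only follows; the host paths and the add-on slug are invented for the example, only the dict shape matches what docker-py expects.

# Illustration only: roughly what DockerAddon.volumes produces for such an add-on.
volumes = {
    '/resin-data/addons/data/example_addon': {'bind': '/data', 'mode': 'rw'},
    '/resin-data/homeassistant': {'bind': '/config', 'mode': 'rw'},
    '/resin-data/ssl': {'bind': '/ssl', 'mode': 'ro'},
}

# docker-py consumes it directly, e.g.:
#   self.dock.containers.run(self.image, detach=True, network_mode='bridge',
#                            ports=..., devices=..., environment=...,
#                            volumes=volumes)
print(volumes)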

hassio/dock/homeassistant.py

@@ -4,7 +4,6 @@ import logging

import docker

from . import DockerBase
-from ..tools import get_version_from_env

_LOGGER = logging.getLogger(__name__)

@@ -45,14 +44,15 @@ class DockerHomeAssistant(DockerBase):
                'HASSIO': self.config.api_endpoint,
            },
            volumes={
-                self.config.path_config_docker:
+                str(self.config.path_extern_config):
                    {'bind': '/config', 'mode': 'rw'},
-                self.config.path_ssl_docker:
-                    {'bind': '/ssl', 'mode': 'rw'},
+                str(self.config.path_extern_ssl):
+                    {'bind': '/ssl', 'mode': 'ro'},
+                str(self.config.path_extern_share):
+                    {'bind': '/share', 'mode': 'rw'},
            })

-        self.version = get_version_from_env(
-            self.container.attrs['Config']['Env'])
+        self.process_metadata()
        _LOGGER.info("Start docker addon %s with version %s",
                     self.image, self.version)

hassio/dock/supervisor.py

@@ -81,3 +81,7 @@ class DockerSupervisor(DockerBase):
    async def remove(self):
        """Remove docker image."""
        raise RuntimeError("Not support on supervisor docker container!")
+
+    async def restart(self):
+        """Restart docker container."""
+        raise RuntimeError("Not support on supervisor docker container!")

hassio/dock/util.py (new file)

@@ -0,0 +1,40 @@
"""HassIO docker utilitys."""
import re

from ..const import ARCH_AARCH64, ARCH_ARMHF, ARCH_I386, ARCH_AMD64

RESIN_BASE_IMAGE = {
    ARCH_ARMHF: "resin/armhf-alpine:3.5",
    ARCH_AARCH64: "resin/aarch64-alpine:3.5",
    ARCH_I386: "resin/i386-alpine:3.5",
    ARCH_AMD64: "resin/amd64-alpine:3.5",
}

TMPL_IMAGE = re.compile(r"%%BASE_IMAGE%%")


def dockerfile_template(dockerfile, arch, version, meta_type):
    """Prepare a Hass.IO dockerfile."""
    buff = []
    resin_image = RESIN_BASE_IMAGE[arch]

    # read docker
    with dockerfile.open('r') as dock_input:
        for line in dock_input:
            line = TMPL_IMAGE.sub(resin_image, line)
            buff.append(line)

    # add metadata
    buff.append(create_metadata(version, arch, meta_type))

    # write docker
    with dockerfile.open('w') as dock_output:
        dock_output.writelines(buff)


def create_metadata(version, arch, meta_type):
    """Generate docker label layer for hassio."""
    return ('LABEL io.hass.version="{}" '
            'io.hass.arch="{}" '
            'io.hass.type="{}"').format(version, arch, meta_type)
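A short usage sketch of the two helpers above: an add-on Dockerfile starts from the %%BASE_IMAGE%% placeholder, dockerfile_template() swaps in the Resin base image for the target architecture and appends the LABEL layer from create_metadata(). The snippet assumes the repository's hassio package is importable and passes the plain strings 'armhf' and 'addon' where the real code uses the ARCH_ARMHF and META_ADDON constants (an assumption about their values).

from pathlib import Path
from tempfile import TemporaryDirectory

from hassio.dock.util import dockerfile_template  # assumes the repo package is on sys.path

with TemporaryDirectory() as tmp:
    dockerfile = Path(tmp, 'Dockerfile')
    dockerfile.write_text("FROM %%BASE_IMAGE%%\nRUN apk add --no-cache python3\n")

    # 'armhf' / 'addon' stand in for the ARCH_ARMHF / META_ADDON constants
    dockerfile_template(dockerfile, 'armhf', '1.1', 'addon')
    print(dockerfile.read_text())
    # FROM resin/armhf-alpine:3.5
    # RUN apk add --no-cache python3
    # LABEL io.hass.version="1.1" io.hass.arch="armhf" io.hass.type="addon"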

hassio/host_control.py (new file)

@@ -0,0 +1,119 @@
"""Host control for HassIO."""
import asyncio
import json
import logging

import async_timeout

from .const import (
    SOCKET_HC, ATTR_LAST_VERSION, ATTR_VERSION, ATTR_TYPE, ATTR_FEATURES,
    ATTR_HOSTNAME, ATTR_OS)

_LOGGER = logging.getLogger(__name__)

TIMEOUT = 15
UNKNOWN = 'unknown'

FEATURES_SHUTDOWN = 'shutdown'
FEATURES_REBOOT = 'reboot'
FEATURES_UPDATE = 'update'
FEATURES_NETWORK_INFO = 'network_info'
FEATURES_NETWORK_CONTROL = 'network_control'


class HostControl(object):
    """Client for host control."""

    def __init__(self, loop):
        """Initialize HostControl socket client."""
        self.loop = loop
        self.active = False
        self.version = UNKNOWN
        self.last_version = UNKNOWN
        self.type = UNKNOWN
        self.features = []
        self.hostname = UNKNOWN
        self.os_info = UNKNOWN

        if SOCKET_HC.is_socket():
            self.active = True

    async def _send_command(self, command):
        """Send command to host.

        Is a coroutine.
        """
        if not self.active:
            return

        reader, writer = await asyncio.open_unix_connection(
            str(SOCKET_HC), loop=self.loop)

        try:
            # send
            _LOGGER.info("Send '%s' to HostControl.", command)

            with async_timeout.timeout(TIMEOUT, loop=self.loop):
                writer.write("{}\n".format(command).encode())
                data = await reader.readline()

            response = data.decode().rstrip()
            _LOGGER.info("Receive from HostControl: %s.", response)

            if response == "OK":
                return True
            elif response == "ERROR":
                return False
            elif response == "WRONG":
                return None
            else:
                try:
                    return json.loads(response)
                except json.JSONDecodeError:
                    _LOGGER.warning("Json parse error from HostControl '%s'.",
                                    response)

        except asyncio.TimeoutError:
            _LOGGER.error("Timeout from HostControl!")

        finally:
            writer.close()

    async def load(self):
        """Load Info from host.

        Return a coroutine.
        """
        info = await self._send_command("info")
        if not info:
            return

        self.version = info.get(ATTR_VERSION, UNKNOWN)
        self.last_version = info.get(ATTR_LAST_VERSION, UNKNOWN)
        self.type = info.get(ATTR_TYPE, UNKNOWN)
        self.features = info.get(ATTR_FEATURES, [])
        self.hostname = info.get(ATTR_HOSTNAME, UNKNOWN)
        self.os_info = info.get(ATTR_OS, UNKNOWN)

    def reboot(self):
        """Reboot the host system.

        Return a coroutine.
        """
        return self._send_command("reboot")

    def shutdown(self):
        """Shutdown the host system.

        Return a coroutine.
        """
        return self._send_command("shutdown")

    def update(self, version=None):
        """Update the host system.

        Return a coroutine.
        """
        if version:
            return self._send_command("update {}".format(version))

        return self._send_command("update")

hassio/host_controll.py (deleted)

@@ -1,102 +0,0 @@
"""Host controll for HassIO."""
import asyncio
import json
import logging
import os
import stat

import async_timeout

from .const import SOCKET_HC

_LOGGER = logging.getLogger(__name__)

TIMEOUT = 15

LEVEL_POWER = 1
LEVEL_UPDATE_HOST = 2
LEVEL_NETWORK = 4


class HostControll(object):
    """Client for host controll."""

    def __init__(self, loop):
        """Initialize HostControll socket client."""
        self.loop = loop
        self.active = False
        self.version = None

        mode = os.stat(SOCKET_HC)[stat.ST_MODE]
        if stat.S_ISSOCK(mode):
            self.active = True

    async def _send_command(self, command):
        """Send command to host.

        Is a coroutine.
        """
        if not self.active:
            return

        reader, writer = await asyncio.open_unix_connection(
            SOCKET_HC, loop=self.loop)

        try:
            # send
            _LOGGER.info("Send '%s' to HostControll.", command)

            with async_timeout.timeout(TIMEOUT, loop=self.loop):
                writer.write("{}\n".format(command).encode())
                data = await reader.readline()

            response = data.decode()
            _LOGGER.debug("Receive from HostControll: %s.", response)

            if response == "OK":
                return True
            elif response == "ERROR":
                return False
            elif response == "WRONG":
                return None
            else:
                try:
                    return json.loads(response)
                except json.JSONDecodeError:
                    _LOGGER.warning("Json parse error from HostControll.")

        except asyncio.TimeoutError:
            _LOGGER.error("Timeout from HostControll!")

        finally:
            writer.close()

    def info(self):
        """Return Info from host.

        Return a coroutine.
        """
        return self._send_command("info")

    def reboot(self):
        """Reboot the host system.

        Return a coroutine.
        """
        return self._send_command("reboot")

    def shutdown(self):
        """Shutdown the host system.

        Return a coroutine.
        """
        return self._send_command("shutdown")

    def host_update(self, version=None):
        """Update the host system.

        Return a coroutine.
        """
        if version:
            return self._send_command("host-update {}".format(version))

        return self._send_command("host-update")

File diff suppressed because one or more lines are too long

Binary file not shown.

hassio/tasks.py (new file)

@@ -0,0 +1,60 @@
"""Multible tasks."""
import asyncio
from datetime import datetime
import logging

_LOGGER = logging.getLogger(__name__)


def api_sessions_cleanup(config):
    """Create scheduler task for cleanup api sessions."""
    async def _api_sessions_cleanup():
        """Cleanup old api sessions."""
        now = datetime.now()
        for session, until_valid in config.security_sessions.items():
            if now >= until_valid:
                config.security_sessions = (session, None)

    return _api_sessions_cleanup


def hassio_update(config, supervisor):
    """Create scheduler task for update of supervisor hassio."""
    async def _hassio_update():
        """Check and run update of supervisor hassio."""
        if config.last_hassio == supervisor.version:
            return

        _LOGGER.info("Found new HassIO version %s.", config.last_hassio)
        await supervisor.update(config.last_hassio)

    return _hassio_update


def homeassistant_watchdog(loop, homeassistant):
    """Create scheduler task for montoring running state."""
    async def _homeassistant_watchdog():
        """Check running state and start if they is close."""
        if homeassistant.in_progress or await homeassistant.is_running():
            return

        loop.create_task(homeassistant.run())

    return _homeassistant_watchdog


async def homeassistant_setup(config, loop, homeassistant):
    """Install a homeassistant docker container."""
    while True:
        # read homeassistant tag and install it
        if not config.last_homeassistant:
            await config.fetch_update_infos()

        tag = config.last_homeassistant
        if tag and await homeassistant.install(tag):
            break
        _LOGGER.warning("Error on setup HomeAssistant. Retry in 60.")
        await asyncio.sleep(60, loop=loop)

    # store version
    _LOGGER.info("HomeAssistant docker now installed.")

hassio/tools.py

@@ -16,7 +16,7 @@ _RE_VERSION = re.compile(r"VERSION=(.*)")
_IMAGE_ARCH = re.compile(r".*/([a-z0-9]*)-hassio-supervisor")


-async def fetch_current_versions(websession, beta=False):
+async def fetch_last_versions(websession, beta=False):
    """Fetch current versions from github.

    Is a coroutine.

@@ -77,9 +77,10 @@ def get_local_ip(loop):
def write_json_file(jsonfile, data):
    """Write a json file."""
    try:
-        with open(jsonfile, 'w') as conf_file:
-            conf_file.write(json.dumps(data))
-    except OSError:
+        json_str = json.dumps(data, indent=2)
+        with jsonfile.open('w') as conf_file:
+            conf_file.write(json_str)
+    except (OSError, json.JSONDecodeError):
        return False

    return True

@@ -87,5 +88,5 @@ def write_json_file(jsonfile, data):
def read_json_file(jsonfile):
    """Read a json file and return a dict."""
-    with open(jsonfile, 'r') as cfile:
+    with jsonfile.open('r') as cfile:
        return json.loads(cfile.read())
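Note that write_json_file() and read_json_file() now expect a pathlib.Path rather than a filename string (they call .open() on the argument) and write with indent=2. A small usage sketch, assuming the repository's hassio package is importable and using an example path:

from pathlib import Path

from hassio.tools import write_json_file, read_json_file  # assumes the repo package is on sys.path

options = Path('/tmp/options.json')                  # example path
write_json_file(options, {'ssl': True, 'port': 8123})
print(read_json_file(options))                       # {'ssl': True, 'port': 8123}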

misc/hassio.png (new binary file, 42 KiB, not shown)

misc/hassio.xml (new file)

@@ -0,0 +1 @@
<mxfile userAgent="Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.81 Safari/537.36" version="6.5.6" editor="www.draw.io" type="device"><diagram name="Page-1">5Vptc6M2EP41/ng3gHj9mPiSy820c5n6Q3sfsVBsNTJyhYid/voKkABZkOBY+KYtmYnR6pVn99ld1l6A5e74laX77a80Q2ThOdlxAb4sPC8OY/G/Erw2At9xG8GG4awR9QQr/DeSQkdKS5yhQhvIKSUc73UhpHmOINdkKWP0oA97okTfdZ9ukCFYwZSY0t9xxrdS6oZJ1/GA8GYrt469sOlYp/B5w2iZy/0WHniqr6Z7l6q15IMW2zSjh54I3C3AklHKm7vdcYlIBa2CrZl3P9LbnpuhnE+Z4DUTXlJSInXikIipt09UrCAOyF8lKOFfJVUdn4paZTdigNjtKD5ERw206DtIYKrenLJdSrrJ4m5TfX5fqX3E2Zqtmg4JS7urd9hijlb7FFbtg7A2MWjLd0S03Oo0mJAlJZTVowXYKIRQyAvO6DPq9Tj1Jc+/kutLvF4Q4+g4CqHbKkbYO6I7xNmrGKImJKCZIm09SKRuD53l+Arobc9oQjkulca6aZfuFCZupM6G9QcM/X3LcaW31WvB0e5CNGGG1vF6CE0QggRkrb7sAhhNBNCzAKBvAPiFwmfELkUOokCQ/trI+SZy3hBywAJyoYHcw9JArXaFqJpRUe9MLscQDXN5HQd+4NjB0A8DHcPQxDBwTAgDCxAmBl4oE3FINinjW7qheUruOumtjmgPPXTE/I9K/DkKZPOH6srFwZq+QDV/yBX+RJy/ygiclpwKUbfxL5Tu5RrNUavzvQ20eBxaMihHRTJ4p2yDeM9uTHUwRFKOX/TVLwFX5RK20fXeQDcB3im+deMRMSweALGfBbp/JdCj0Xxi3UX48xIMN6wSjNMEYlXuEXvBhXAJagOm+h7Sovj2fTTBaMXr0aSjMwP3fbdluKflMgybVEN3aFmA4sy347ZAoLstMJB1uPGA33JtRE3Xm4Nbbo9Yyou13NJ4VbuxeUnkqveOHouiK7EIzOO6NHh1dE/iQtc89VyFwIPfVK9YQgCJYBqGSnyPidpzqm5QnpmLCWFvqcFMfrm0qlgvvlZQUm8cvaxJrPLpRjy6wLByU9dxRSmKn6CtLFR3Rd5A/t56HS1/9224ovDKXHE/O3qQ/+zG8aWBfiKtPmjxwLR4d0Sn1i3enyVUSJ30srCJCPYcTk5zpHmb8xQ2Vl+AJXtp+WpPYdeKPa5ZUrjJMpoXhhqLbbqvbveMQlQU73sn3ZVN9lX34qr9fZMTCt07XhiBxANhEHtx7PhgpqRqyJN5bmB6ssSCI1O1nDmJ0rVOHdWlqYAkU59uc7zoXEAAOfWR4vq9Q5WqneE0Wq3Q0FJO6hdSz1ynobKxTm0U7dNMs5PYJCjk1KxYKX6WO9IMALcVOzAUyKdrRB5pgTmmuRiyppzTnRhAqo7btoitVVbrMna3xg3Bm2oup+fRvCvEnpZu5QYWiHxS0wEDNR0wkJBYqciaNJ5AUifSWOq/x1LX5OgUOk5Ity8PgO97LQshEng/L0SqvXsMPBwOpvcmBO+LWg2SiZDQMrs4Tl6FQInuz3xnIKeP5iovgLcLo9K4P5DEn8mRmTLEXqzt3hyaQ3qj0faDNPFNmjTmaz+S+icmc+pN7YVAMP6tjfNQrkcjIUzZ5fQL62uAfkH1Z4d+CThJJ4boN1TdsxLBopnY17f7yGaWOT9lP8i+YAb2TVZjYJDkK+bbuekxFp2QmwUomocevnppvQo94v9LcEpCnaOR5dgU/idjk/m9+G9oX71qUYbReBXl30s+Vf6dgXyi2f0WqlFG93szcPcP</diagram></mxfile>

misc/security.png (new binary file, 36 KiB, not shown)

misc/security.xml (new file)

@@ -0,0 +1 @@
<mxfile userAgent="Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:53.0) Gecko/20100101 Firefox/53.0" version="6.5.8" editor="www.draw.io" type="device"><diagram name="Page-1">5Vxdd5s4EP01fmwOkgCbx9hp2j7sNrvpnnYfiVFsTjDyghwn++tXGMmAxileEB9O+9BjBhjM3GHmzjXKhCw2L58Sf7v+jQU0mmAreJmQmwnGU4eI/zPDa24gyM0NqyQMchMqDPfhv1QaLWndhQFNKwdyxiIebqvGJYtjuuQVm58kbF897JFF1atu/RUFhvulH0Hr9zDg69w6w25h/0zD1VpdGblevufBXz6tEraL5fUmmDwe/uW7N77yJW80XfsB25dM5OOELBLGeP5p87KgURZaFbb8vNs39h6/d0Jjfs4JOD/h2Y928tZvwyTlwnTP/YTLL8lfVWA4fRF+52u+iYQBiY8pT9gTXbCIJcISs1gcOX8Mo0gz+VG4isXmUnwzKuzzZ5rwUIT8Wu7YhEGQXWa+X4ec3m/9ZXbNvcivzCGL+b38Go7aztMGeWIb3rcMRXYV+lIyyTh8omxDefIqDpF7ySw/Q6asKxHaF/gjS9rWJewVkr5MudXRcRF28UFG/jQKBKDwVypipAe/FPUtC2N+uKIznzg3mYUmobhwFtoblvA1W7HYj+4KawcxQhgGyT0Vo5mBINkgSJ/9NB1hkDAiw0XJAVFaiyhdffk6wkDZ7oCBckGg2JbGh1uKs2b2drT0wvXAOGcbsYPGwXXWfDJbxJZPP4uSqK4ryiuZTYNKU4JhK4VFRSChkc/D52rbOhUW6e0uQ7pAwNOeZ1sLbMp2yZLKk8ptRPMjoNMc4aqj/HaBowNIxzs8C7cpwE2ckdLlLgm5uNPbMH5kvaLnDIYenmrPj9sQPuLUODIH3wzCNxVxFtdz/9llrGcexiEvtibkOiNwfpTS7KjpTVtsD085mQd+uqaBPE/slmRilm29hPyH+PzBurIcuf232LauCFH7S5XwxvpZpuQQVDKlyaPfMlNsy60AjK2mmYJrHJnLFA9kip8+ZfsP+WHdfe8+E856/kk/EOqsApOGECJS48gchGqcK2GYUm4Sw8vss7hpoT5GVDlyvM6wg6NhtdGyLQ9ZLAi4G2WF+kHMK+7qULK1gr4VBHTPkkAv6nrJt7b70iFGir1Kj/K4iC6vsWPPUGMHjgzmCxxiq/mS0jQVCfNGvvyvZOk1VxQdQFcWmlbowNRtRQfsMacc0XWNpikHHL2RcgIG/7V0mJxJWyYlFA306lSk5Rv5Jg94oq+mM66egDSqW31xSm16J9OmGTOrcWSwSEF5xMi43xGSA1FL0rTd6NQSODKIJNRvfmfJxodQvmPJGlfZoN2nZo2gEHMZorWDYJQ6UxkR1DsuRLXuN0xw2L8c2brXSGE4Ug+mW6vkHn6gdpqKIbpw7RDcVcc6JtpolGv11I1g3HAcQ+MGcGQQwBOKyBnaNU/E0XhROY4zvn2fGrfKqUZ1wrDK7TSWTXCNI4NJBWWTXOYejb6tiF7fU4jbVIHQpxDgyCB6UF/IZ4Xete3x9GK3aSnXxW3X7kzcPvHrfzdi5SAypVuVKV3itqros1EzhykyxByAoz6FylOvNbx7obI3XqANbNPG70nMahwZrFBQOBizUjkUSZjqM3VTkgAcGYQSihuXoZR5fQobBAobF6KU9RsmqCJcjlLWb6TguD6YUqaSe3h27plSyrzulDJS9ypB70qZeupGwHc9U0oZcGQQwPqf3dsoZflxFy6UkTZlwrBQ5pkSyoAjgzkFf7ovhLLbb1+/3XWfDGfVCnzubGyYCiPLlGAGPRmEESovZcXMCJAX2pqRZUo5Q1Z30hmpW4DRjXSWdYVDLzgcNcu64gVqaSrZRsotEDIlpkFPfapppH6VyftT03ojD/qqvebLjmZ1ngyWLSjCjFlPG4xEIFOCGvRkDky1TPHEy3+iSooiia2TPOLXeRVw5kqeVWoauKtXAW2oSY1U4LQ1noQ9G4SpuwXsGIRptAqnM2ScoPwzZolz0FBBouMvRTvwOT3WQJ2GywJZEHAzHLrgzIpB54wZ2a0Ys32iOaoHaQDGfHyd+rjQXWld7ZfMqwbaQb+E5Kc6s0mVzeDANsR6LNIy1fCJVDt3CUYXw5lWWWyvYaoRp85Tn8OZA8nbH39+WLCAts2YrtZTnVtuWg9Wem1pysXJTAPcsc8DvAmckPyNHM5z9ZbWo5UOgtvw+UWkzpNBOCFJ/ZKvzv7lJiqtPx8LV3l1lXpNp+VIJTaLv/mWo1b8XT3y8T8=</diagram></mxfile>

setup.py

@@ -38,5 +38,7 @@ setup(
        'colorlog',
        'voluptuous',
        'gitpython',
+        'pyotp',
+        'pyqrcode'
    ]
)
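The two new dependencies back the TOTP-based security flow sketched in misc/security.png and misc/security.xml: pyotp creates and verifies time-based one-time passwords, pyqrcode renders the provisioning URI as a QR code for an authenticator app. A generic sketch of how they fit together; the account label, issuer string and output file are illustrative, not the supervisor's actual code:

import pyotp
import pyqrcode

secret = pyotp.random_base32()            # stored once per installation
totp = pyotp.TOTP(secret)

uri = totp.provisioning_uri("hassio", issuer_name="Hass.IO")   # example labels
pyqrcode.create(uri).svg("totp.svg", scale=4)                  # QR code to scan

# later, when a client submits a code:
print(totp.verify("123456"))              # True only for the currently valid code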

version.json

@@ -1,7 +1,7 @@
{
-    "hassio_tag": "0.13",
-    "homeassistant_tag": "0.43.1",
-    "resinos_version": "0.4",
-    "resinhup_version": "0.1",
-    "generic_hc_version": "0.1"
+    "hassio": "0.29",
+    "homeassistant": "0.44.2",
+    "resinos": "0.7",
+    "resinhup": "0.1",
+    "generic": "0.3"
}
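The keys were shortened (hassio_tag becomes hassio, and so on), matching the fetch_last_versions() rename in hassio/tools.py. Consumers simply read the values they need; a tiny sketch with the new layout, using the literal content shown above:

import json

raw = ('{"hassio": "0.29", "homeassistant": "0.44.2", '
       '"resinos": "0.7", "resinhup": "0.1", "generic": "0.3"}')

versions = json.loads(raw)
print(versions['hassio'], versions['homeassistant'])   # 0.29 0.44.2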

(second version file with the old key names, deleted)

@@ -1,7 +0,0 @@
{
    "hassio_tag": "0.13",
    "homeassistant_tag": "0.43.1",
    "resinos_version": "0.4",
    "resinhup_version": "0.1",
    "generic_hc_version": "0.1"
}