Compare commits


439 Commits
165...222

Author SHA1 Message Date
Pascal Vizeli
56af4752f4 Merge pull request #1712 from home-assistant/dev
Release 222
2020-05-08 16:24:18 +02:00
Pascal Vizeli
81413d08ed Prevent boot loop (#1711)
* Prevent boot loop

* sort
2020-05-08 16:13:36 +02:00
Bram Kragten
2bc2a476d9 Bump frontend (#1710) 2020-05-08 13:58:20 +02:00
Pascal Vizeli
4d070a65c6 Bump version 222 2020-05-07 19:16:27 +02:00
Pascal Vizeli
6185fbaf26 Merge pull request #1704 from home-assistant/dev
Fix panel
2020-05-07 19:15:59 +02:00
Pascal Vizeli
698a126b93 Fix panel
Signed-off-by: Pascal Vizeli <pvizeli@syshack.ch>
2020-05-07 17:14:52 +00:00
Pascal Vizeli
acf921f55d Merge pull request #1703 from home-assistant/dev
Release 221
2020-05-07 19:02:31 +02:00
Bram Kragten
f5a78c88f8 Bump supervisor panel (#1702) 2020-05-07 17:41:10 +02:00
dependabot-preview[bot]
206ece1575 Bump pylint from 2.5.0 to 2.5.2 (#1700)
Bumps [pylint](https://github.com/PyCQA/pylint) from 2.5.0 to 2.5.2.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.5.0...pylint-2.5.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-05-07 17:04:08 +02:00
dependabot-preview[bot]
a8028dbe10 Bump gitpython from 3.1.1 to 3.1.2 (#1697)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.1.1 to 3.1.2.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.1.1...3.1.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-05-07 17:04:00 +02:00
Pascal Vizeli
c605af6ccc Add discovery support for new ozw integration (#1701) 2020-05-07 08:32:10 +02:00
Pascal Vizeli
b7b8e6c40e Bump version 221 2020-05-01 18:11:58 +02:00
Pascal Vizeli
3fcb1de419 Merge pull request #1692 from home-assistant/dev
Release 220
2020-05-01 18:07:35 +02:00
Pascal Vizeli
12034fe5fc Update panel with new tabs (#1691)
Signed-off-by: Pascal Vizeli <pvizeli@syshack.ch>
2020-05-01 17:51:35 +02:00
dependabot-preview[bot]
56959d781a Bump pylint from 2.4.4 to 2.5.0 (#1684)
* Bump pylint from 2.4.4 to 2.5.0

Bumps [pylint](https://github.com/PyCQA/pylint) from 2.4.4 to 2.5.0.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.4.4...pylint-2.5.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

* fix lint

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
Co-authored-by: Pascal Vizeli <pvizeli@syshack.ch>
2020-05-01 15:34:12 +02:00
Pascal Vizeli
9a2f025646 Add support for homematic (#1690) 2020-05-01 15:23:56 +02:00
Pascal Vizeli
12cc163058 New addon panel (#1689)
* Update add-on pages

* update panel
2020-05-01 14:23:54 +02:00
dependabot-preview[bot]
74971d9753 Bump pytz from 2019.3 to 2020.1 (#1686)
Bumps [pytz](https://github.com/stub42/pytz) from 2019.3 to 2020.1.
- [Release notes](https://github.com/stub42/pytz/releases)
- [Commits](https://github.com/stub42/pytz/compare/release_2019.3...release_2020.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-04-29 12:15:31 +02:00
Franck Nijhof
a9157e3a9f Fix possible Ingress port collisions (#1682)
* Fix possible Ingress port collisions

* Cleanup dynamic port assignment on uninstall

* Check port against gateway address

* gateway address is already of type IPv4Address

* Update supervisor/ingress.py

Co-Authored-By: Pascal Vizeli <pvizeli@syshack.ch>

Co-authored-by: Pascal Vizeli <pvizeli@syshack.ch>
2020-04-26 11:21:56 +02:00
dependabot-preview[bot]
b96697b708 Bump cryptography from 2.9.1 to 2.9.2 (#1676)
Bumps [cryptography](https://github.com/pyca/cryptography) from 2.9.1 to 2.9.2.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/2.9.1...2.9.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-04-23 18:44:32 +02:00
dependabot-preview[bot]
81e6896391 Bump cryptography from 2.9 to 2.9.1 (#1673)
Bumps [cryptography](https://github.com/pyca/cryptography) from 2.9 to 2.9.1.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/2.9...2.9.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-04-22 14:23:00 +02:00
dependabot-preview[bot]
2dcaa3608d Bump pulsectl from 20.2.4 to 20.4.3 (#1674)
Bumps [pulsectl](https://github.com/mk-fg/python-pulse-control) from 20.2.4 to 20.4.3.
- [Release notes](https://github.com/mk-fg/python-pulse-control/releases)
- [Changelog](https://github.com/mk-fg/python-pulse-control/blob/master/CHANGES.rst)
- [Commits](https://github.com/mk-fg/python-pulse-control/commits)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-04-22 13:14:54 +02:00
Pascal Vizeli
e21671ec5e Bump version to 220 2020-04-22 11:32:13 +02:00
Pascal Vizeli
7841f14163 Merge pull request #1671 from home-assistant/dev
Release 219
2020-04-22 11:31:50 +02:00
Pascal Vizeli
cc9f594ab4 Better initial image load handling (#1672) 2020-04-22 11:26:53 +02:00
Pascal Vizeli
ebfaaeaa6b Improve the flow with fallback if there is a network issue (#1670) 2020-04-22 11:06:59 +02:00
Pascal Vizeli
ffa91e150d Fix handling with reset/default on json (#1669)
* Fix handling with reset/default on json

* black
2020-04-22 11:06:49 +02:00
Pascal Vizeli
06fa9f9a9e Bump version to 219 2020-04-21 20:15:15 +02:00
Pascal Vizeli
9f203c42ec Merge pull request #1668 from home-assistant/dev
Release 218
2020-04-21 20:14:44 +02:00
Pascal Vizeli
5d0d34a4af Fix version detection 2020-04-21 15:32:20 +00:00
Pascal Vizeli
c2cfc0d3d4 Allow moving registry for supervisor (#1667)
* Allow moving registry for supervisor

* fix

* Filter number
2020-04-21 16:59:28 +02:00
Pascal Vizeli
0f4810d41f Add debug 2020-04-21 13:09:52 +00:00
Pascal Vizeli
175848f2a8 Fix update order (#1666) 2020-04-21 14:41:44 +02:00
Fabian Affolter
472bd66f4d Update README (#1662) 2020-04-14 13:38:00 +02:00
dependabot-preview[bot]
168ea32d2c Bump jinja2 from 2.11.1 to 2.11.2 (#1663)
Bumps [jinja2](https://github.com/pallets/jinja) from 2.11.1 to 2.11.2.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/master/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/2.11.1...2.11.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-04-14 13:37:49 +02:00
Fabian Affolter
e82d6b1ea4 Minor re-work (#1661)
* Minor re-work

* Fix typo
2020-04-14 10:27:10 +02:00
dependabot-preview[bot]
6c60ca088c Bump gitpython from 3.1.0 to 3.1.1 (#1659)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.1.0 to 3.1.1.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.1.0...3.1.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-04-14 10:25:59 +02:00
Fabian Affolter
83e8f935fd Remove double DNS section (#1660) 2020-04-14 10:25:38 +02:00
Pascal Vizeli
71867302a4 Bump version to 218 2020-04-12 12:00:53 +02:00
Pascal Vizeli
8bcc402c5f Merge pull request #1656 from home-assistant/dev
Release 217
2020-04-12 12:00:16 +02:00
Pascal Vizeli
72b7d2a123 Fix restore on older snapshots (#1655) 2020-04-12 11:56:37 +02:00
Aidan Timson
20c1183450 Remove Azure CI badge (#1648)
* Fix azure ci badge

* Remove Azure CI badge
2020-04-11 14:07:19 +02:00
Pascal Vizeli
0bbfbd2544 Re-add audio API 2020-04-11 13:35:26 +02:00
Pascal Vizeli
350bd9c32f Bump version to 217 2020-04-11 11:16:31 +02:00
Pascal Vizeli
dcca8b0a9a Merge pull request #1652 from home-assistant/dev
Release 216
2020-04-11 11:15:57 +02:00
Pascal Vizeli
f77b479e45 Add timeout for clean shutdown (#1650)
* Fix overlay

* Update Dockerfile
2020-04-10 23:31:24 +02:00
Pascal Vizeli
216565affb Give a bit more time to update apparmor (#1647) 2020-04-10 01:15:47 +02:00
Pascal Vizeli
6f235c2a11 force dns plugin v9 2020-04-09 16:45:10 +02:00
Pascal Vizeli
27a770bd1d force dns plugin v7 2020-04-08 23:39:18 +02:00
Pascal Vizeli
ef15b67571 Bump version to 216 2020-04-08 14:44:10 +02:00
Pascal Vizeli
6aad966c52 Merge pull request #1644 from home-assistant/dev
Release 215
2020-04-08 14:43:37 +02:00
Pascal Vizeli
9811f11859 Force plugin: cli v25 - dns v6 2020-04-08 12:34:42 +00:00
Pascal Vizeli
13148ec7fb Filter add-ons in the store based on advanced mode (#1643)
* Filter add-ons in the store based on advanced mode

Signed-off-by: Pascal Vizeli <pvizeli@syshack.ch>

* Filter add-ons in the store based on advanced mode p2

Signed-off-by: Pascal Vizeli <pvizeli@syshack.ch>
2020-04-08 14:29:43 +02:00
Pascal Vizeli
b2d7464790 Fix multicast API for logs (#1642) 2020-04-08 14:08:09 +02:00
Joakim Sørensen
ce84e185ad Restore repositories with partial snapshot (#1641)
* Restore repositories with partial snapshot

* Run black
2020-04-08 13:44:03 +02:00
Franck Nijhof
c3f5ee43b6 Add missing multicast/logs endpoint to API docs (#1640) 2020-04-08 12:11:59 +02:00
Pascal Vizeli
e2dc1a4471 Support plugin requirements & mdns (#1638)
* Support plugin requirements & mdns

* better error handling

* Use debug from LogLevel

* fix lint

* fix logger

* fix test env

* Use new style

* fix typo
2020-04-07 12:08:29 +02:00
Pascal Vizeli
e787e59b49 Cleanup network (#1637)
* Cleanup network forward

* Add permission
2020-04-06 14:00:52 +02:00
Pascal Vizeli
f0ed2eba2b Multicast support on Hass.io Network (#1634)
* Add multicast layer to docker

* support network forward

* add pluginmanager

* finish multicast plugin

* fix lint

* Add shutdown for plugins

* Add API

* Add watchdog

* Fix black

* Fix path
2020-04-05 23:26:22 +02:00
Pascal Vizeli
2364e1e652 Remove old authentication (#1635) 2020-04-05 19:16:12 +02:00
Pascal Vizeli
cc56944d75 relicense project 2020-04-05 16:23:52 +02:00
Pascal Vizeli
69cea9fc96 Plugin cleanup (#1632)
* Plugin cleanup

* Fix setup
2020-04-05 01:20:49 +02:00
Pascal Vizeli
fcebc9d1ed Use updater for image data (#1628)
* Use updater for image data

* Fix message

* Fix handling

* Update code

* fix lint

* fix names

* Fix log

* Fix error log

* make it better
2020-04-05 00:47:05 +02:00
Pascal Vizeli
9350e4f961 Store image on local config (#1627)
* Store image on local config

* Fix api validate

* Fix install handling
2020-04-03 17:12:18 +02:00
Pascal Vizeli
387e0ad03e Adjust dbus support for pulse (#1626) 2020-04-03 14:27:30 +02:00
dependabot-preview[bot]
61fec8b290 Bump cryptography from 2.8 to 2.9 (#1625)
Bumps [cryptography](https://github.com/pyca/cryptography) from 2.8 to 2.9.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/2.8...2.9)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-04-03 13:44:03 +02:00
Pascal Vizeli
1228baebf4 Cleanup host groups (#1623) 2020-04-02 18:00:17 +02:00
Pascal Vizeli
a30063e85c Bump version to 215 2020-03-30 12:38:37 +02:00
Pascal Vizeli
524cebac4d Merge pull request #1617 from home-assistant/dev
Release 214
2020-03-30 12:37:45 +02:00
Pascal Vizeli
c94114a566 Update panel (#1616) 2020-03-30 12:34:21 +02:00
Pascal Vizeli
b6ec7a9e64 Bump version to 214 2020-03-29 14:35:55 +02:00
Pascal Vizeli
69be7a6d22 Merge pull request #1611 from home-assistant/dev
Release 213
2020-03-29 14:35:10 +02:00
Pascal Vizeli
58155c35f9 Supervisor use own dns layer (#1610)
* Supervisor use own dns layer

* Add logs
2020-03-29 13:45:46 +02:00
Pascal Vizeli
7b2377291f Fix error handling on tasks (#1608) 2020-03-28 17:05:42 +01:00
Pascal Vizeli
657ee84e39 Bump version to 213 2020-03-28 16:10:22 +01:00
Pascal Vizeli
2e4b545265 Merge pull request #1607 from home-assistant/dev
Release 212
2020-03-28 16:09:49 +01:00
Pascal Vizeli
2de1d35dd1 Get host dmesg as host logs (#1606)
* Get host dmesg as host logs

* fix lint
2020-03-28 15:56:57 +01:00
Pascal Vizeli
2b082b362d Fix cli update (#1605)
* Fix cli update

* fix logging

* Fix
2020-03-28 15:21:32 +01:00
Pascal Vizeli
dfdd0d6b4b Bump version to 212 2020-03-27 22:27:52 +01:00
Pascal Vizeli
a00e81c03f Merge pull request #1604 from home-assistant/dev
Release 211
2020-03-27 22:26:56 +01:00
Pascal Vizeli
776e6bb418 Rename API last/latest (#1603)
* Rename API last/latest

* Update panel

* Fix for HA

* ha need this
2020-03-27 22:07:54 +01:00
Pascal Vizeli
b31fca656e Bump version to 211 2020-03-27 17:48:02 +01:00
Pascal Vizeli
fa783a0d2c Merge pull request #1602 from home-assistant/dev
Release 210
2020-03-27 17:46:51 +01:00
Pascal Vizeli
96c0fbaf10 Cli rebrand (#1601)
* Rebrand CLI

* forward

* Fix startup command

* add cli api

* Add token handling

* Fix security check

* fix repair

* fix lint

* Update for new cli

* Add watchdog

* rename

* use s6
2020-03-27 17:37:48 +01:00
Pascal Vizeli
24f7801ddc Fix wrong function for set profiles (#1600) 2020-03-27 12:32:14 +01:00
Pascal Vizeli
8e83e007e9 DNS loop protection (#1599)
* DNS loop protection

* Update supervisor/dns.py

Co-Authored-By: Franck Nijhof <git@frenck.dev>

* cleanup not needed code

* Fix

Co-authored-by: Franck Nijhof <git@frenck.dev>
2020-03-27 11:54:32 +01:00
Pascal Vizeli
d0db466e67 Use DoT as fallback (#1597)
* Use DoT as fallback / add cache

* Stage

* merge

* fix lint

* Fallback server

* use fallback

* add nxdomain

* Address comments
2020-03-27 00:38:54 +01:00
Franck Nijhof
3010bd4eb6 Remove Home Panel Discovery (#1594)
* Remove Home Panel Discovery

* Remove related tests
2020-03-23 10:32:56 +01:00
Phill (pssc)
069bed8815 Raise limit on container shutdown for the add-on use case of a tmpfs-based MariaDB for recorder taking over 2 mins to dump (#1595) 2020-03-23 09:15:35 +01:00
Pascal Vizeli
d2088ae5f8 Update azure-pipelines-wheels.yml for Azure Pipelines 2020-03-21 09:13:13 +01:00
dependabot-preview[bot]
0ca5a241bb Bump cchardet from 2.1.5 to 2.1.6 (#1593)
Bumps [cchardet](https://github.com/PyYoshi/cChardet) from 2.1.5 to 2.1.6.
- [Release notes](https://github.com/PyYoshi/cChardet/releases)
- [Changelog](https://github.com/PyYoshi/cChardet/blob/master/CHANGES.rst)
- [Commits](https://github.com/PyYoshi/cChardet/compare/2.1.5...2.1.6)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-03-18 15:21:12 +01:00
dependabot-preview[bot]
dff32a8e84 Bump pytest from 5.3.5 to 5.4.1 (#1591)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.3.5 to 5.4.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.3.5...5.4.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-03-16 14:36:11 +01:00
Pascal Vizeli
4a20344652 Log config check (#1583)
* Add more log for config check to debug

* Convert to ascii

* fix comment
2020-03-12 15:16:40 +01:00
Pascal Vizeli
98b969ef06 Bump version to 210 2020-03-06 12:43:11 +01:00
Pascal Vizeli
c8cb8aecf7 Merge pull request #1574 from home-assistant/dev
Release 209
2020-03-06 12:41:56 +01:00
Pascal Vizeli
73e8875018 Fix Issue with pulse folder (#1573)
* Fix Issue with pulse folder

* Fix config check
2020-03-06 12:32:29 +01:00
Pascal Vizeli
02aed9c084 Enforce Pulse client (#1572) 2020-03-05 15:54:23 +01:00
Pascal Vizeli
89148f8fff Bump packaging from 20.1 to 20.3 (#1570)
Bumps [packaging](https://github.com/pypa/packaging) from 20.1 to 20.3.
- [Release notes](https://github.com/pypa/packaging/releases)
- [Changelog](https://github.com/pypa/packaging/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pypa/packaging/compare/20.1...20.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-03-05 13:44:10 +01:00
Pascal Vizeli
6bde527f5c Bump version to 209 2020-03-04 19:09:20 +01:00
Pascal Vizeli
d62aabc01b Merge pull request #1567 from home-assistant/dev
Release 208
2020-03-04 19:08:12 +01:00
Pascal Vizeli
82299a3799 Fix udev error without privileged (#1566)
* Fix udev error without privileged

* Fix udev

* Remove context

* Update supervisor/hwmon.py

Co-Authored-By: Paulus Schoutsen <balloob@gmail.com>

* Update supervisor/hwmon.py

Co-Authored-By: Paulus Schoutsen <balloob@gmail.com>

Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
2020-03-04 18:55:07 +01:00
Pascal Vizeli
c02f30dd7e Enforce env check (#1565) 2020-03-04 18:15:10 +01:00
Pascal Vizeli
e91983adb4 Watchdog check in_progress for audio/dns (#1555) 2020-03-03 15:06:09 +01:00
Pascal Vizeli
ff88359429 Bump version to 208 2020-03-02 11:32:01 +01:00
Pascal Vizeli
5a60d5cbe8 Merge pull request #1554 from home-assistant/dev
Release 207
2020-03-02 11:31:26 +01:00
Pascal Vizeli
2b41ffe019 Bump version to 207 2020-03-01 17:52:35 +01:00
Pascal Vizeli
1c23e26f93 Check if SND is loaded (#1552)
* Check if SND is loaded

* add warning
2020-02-29 23:45:37 +01:00
Pascal Vizeli
3d555f951d Fix supervisor update flow with apparmor (#1551) 2020-02-29 23:07:34 +01:00
Bram Kragten
6d39b4d7cd Update audio.py (#1548) 2020-02-29 19:28:52 +01:00
Pascal Vizeli
4fe5d09f01 Bump version to 206 2020-02-29 12:19:42 +01:00
Pascal Vizeli
e52af3bfb4 Merge pull request #1547 from home-assistant/dev
Release 205
2020-02-29 12:18:51 +01:00
Pascal Vizeli
0467b33cd5 Core support audio settings (#1546) 2020-02-29 12:14:03 +01:00
Pascal Vizeli
14167f6e13 Merge pull request #1544 from home-assistant/dev
Release 204
2020-02-29 00:30:16 +01:00
Pascal Vizeli
7a1aba6f81 Fix old alsa format settings (#1543) 2020-02-29 00:25:11 +01:00
Pascal Vizeli
920f7f2ece Support for own init on image (#1542)
* Support for own init on image

* fix params
2020-02-28 23:15:46 +01:00
Pascal Vizeli
06fadbd70f fix lint 2020-02-28 19:25:15 +00:00
Pascal Vizeli
d4f486864f Make audio socket RO and aware of restarts (#1545) 2020-02-29 11:23:36 +01:00
Pascal Vizeli
d3a21303d9 Bump version to 205 2020-02-29 00:31:21 +01:00
Pascal Vizeli
e1cbfdd84b Support mute + applications from pulse (#1541)
* Support mute + applications from pulse

* Fix lint

* Fix application parser

* Fix type

* Add application endpoints

* error handling

* Fix
2020-02-28 17:52:12 +01:00
Pascal Vizeli
87170a4497 Restart add-ons attached to audio when pulse is updated (#1540) 2020-02-28 14:05:31 +01:00
Pascal Vizeli
ae6f8bd345 Bump version to 203 2020-02-28 10:57:05 +01:00
Pascal Vizeli
b9496e0972 Merge pull request #1539 from home-assistant/dev
Release 203
2020-02-28 10:56:22 +01:00
Pascal Vizeli
c36a6dcd65 Add default asound for pulse (#1538)
* Add default asound for pulse

* fix lint

* fix config
2020-02-28 01:14:43 +01:00
Pascal Vizeli
19ca836b78 Prevent using pulseaudio on event loop (#1536)
* Prevent using pulseaudio on event loop

* Fix name overwrite

* Fix value
2020-02-27 22:01:20 +01:00
Pascal Vizeli
8a6ea7ab50 Use shorter function for soundcard (#1535) 2020-02-27 17:22:12 +01:00
Pascal Vizeli
6721b8f265 Expose sound cards and profiles with endpoint (#1534)
* Expose sound cards and profiles with endpoint

* Fix naming

* Fix issue

* Update API
2020-02-27 16:25:04 +01:00
Pascal Vizeli
9393521f98 Update Panel for audio (#1533) 2020-02-27 13:47:46 +01:00
Pascal Vizeli
398b24e0ab Fix homeassistant config check with overlay-s6 (#1532) 2020-02-27 13:29:42 +01:00
Pascal Vizeli
374bcf8073 Adjust sound reload (#1531)
* Adjust sound reload & remove quirk

* clean info message

* fix hack
2020-02-27 11:58:28 +01:00
Pascal Vizeli
7e3859e2f5 Observe host hardware for realtime actions (#1530)
* Observe host hardware for realtime actions

* Better logging

* fix testenv
2020-02-27 10:31:35 +01:00
Pascal Vizeli
490ec0d462 Bump version to 203 2020-02-26 14:47:38 +01:00
Pascal Vizeli
15bf1ee50e Merge pull request #1528 from home-assistant/dev
Release 202
2020-02-26 14:46:50 +01:00
Pascal Vizeli
6376d92a0d Merge remote-tracking branch 'origin/master' into dev 2020-02-26 13:38:12 +00:00
Pascal Vizeli
10230b0b4c Support profiles on template (#1527) 2020-02-26 14:28:09 +01:00
Pascal Vizeli
2495cda5ec Add Pulse audio control basics (#1525)
* Add Pulse audio control basics

* add functionality

* Fix handling

* Give access to all

* Fix latest issues

* revert docker

* Fix pipeline
2020-02-26 11:48:11 +01:00
Pascal Vizeli
ae8ddca040 Delete entry.sh 2020-02-25 18:38:52 +01:00
Pascal Vizeli
0212d027fb Add Audio layer / PulseAudio (#1523)
* Improve alsa handling

* use default from image

* create alsa folder

* Map config into addon

* Add Audio object

* Fix dbus

* add host group file

* Fix persistent file

* Use new template

* fix lint

* Fix lint

* add API

* Update new base image / build system

* Add audio container

* extend new audio settings

* provide pulse client config

* Adjust files

* Use without auth

* reset did not exists now

* cleanup old alsa layer

* fix tasks

* fix black

* fix lint

* Add dbus support

* add dbus adjustments

* Fixups
2020-02-25 18:37:06 +01:00
dependabot-preview[bot]
a3096153ab Bump gitpython from 3.0.8 to 3.1.0 (#1524)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.8 to 3.1.0.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.8...3.1.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-02-25 18:30:11 +01:00
dependabot-preview[bot]
7434ca9e99 Bump gitpython from 3.0.8 to 3.1.0 (#1524)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.8 to 3.1.0.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.8...3.1.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-02-25 18:29:44 +01:00
Pascal Vizeli
4ac7f7dcf0 Rename Hass.io -> Supervisor (#1522)
* Rename Hass.io -> Supervisor

* part 2

* fix lint

* fix auth name
2020-02-21 17:55:41 +01:00
Pascal Vizeli
e9f5b13aa5 Fix wrong last boot (#1521)
* Protect overwrite last boot uptime

* Fix naming

* Fix lint
2020-02-20 21:37:59 +01:00
Pascal Vizeli
1fbb6d46ea Fix webui option (#1519) 2020-02-20 09:17:53 +01:00
Pascal Vizeli
8dbfea75b1 Extend video & add tests (#1518) 2020-02-20 00:29:48 +01:00
zewelor
3b3840c087 Update hardware.py (#1516)
Allow passing video* devices to containers. Should allow using USB webcams / UVC tuners.
2020-02-19 09:08:11 +01:00
dependabot-preview[bot]
a21353909d Bump gitpython from 3.0.7 to 3.0.8 (#1513)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.7 to 3.0.8.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.7...3.0.8)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-02-17 16:58:00 +01:00
Pascal Vizeli
5497ed885a Bump version to 202 2020-02-17 11:38:08 +01:00
Pascal Vizeli
39baea759a Merge pull request #1512 from home-assistant/dev
Release 201
2020-02-17 11:37:29 +01:00
Pascal Vizeli
80ddb1d262 Fix startup of dev environment 2020-02-14 17:05:03 +00:00
Pascal Vizeli
e24987a610 Fix ingress on panel after restore (#1508)
* Fix ingress on panel after restore

* Suppress errors
2020-02-14 15:58:26 +01:00
Pascal Vizeli
9e5c276e3b Home Assistant Core start flow / partial restore (#1507)
* Fix start flow logic

* Add a note

* Fix flow on partial restore
2020-02-14 15:25:43 +01:00
Pascal Vizeli
c33d31996d Set timeout of 10min for download OTA (#1505) 2020-02-13 17:25:37 +01:00
Franck Nijhof
aa1f08fe8a Update API docs to reflect latest changes (#1502) 2020-02-11 23:09:23 +01:00
Pascal Vizeli
d78689554a Cleanup old logic (#1500) 2020-02-10 23:52:22 +01:00
dependabot-preview[bot]
5bee1d851c Bump gitpython from 3.0.5 to 3.0.7 (#1497)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.5 to 3.0.7.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.5...3.0.7)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-02-10 16:08:56 +01:00
Pascal Vizeli
ddb8eef4d1 Bump version to 201 2020-02-09 22:41:22 +01:00
Pascal Vizeli
da513e7347 Merge pull request #1495 from home-assistant/dev
Release 200
2020-02-09 22:38:08 +01:00
Pascal Vizeli
4279d7fd16 Check if HA is running (#1494) 2020-02-09 22:15:34 +01:00
Pascal Vizeli
934eab2e8c Fix operating-system url for OTA updates (#1493) 2020-02-09 21:42:22 +01:00
Pascal Vizeli
2a31edc768 Guard addon self lookup (#1492) 2020-02-08 23:56:24 +01:00
Pascal Vizeli
fcdd66dc6e Fix Hardware list (#1490) 2020-02-07 18:30:39 +01:00
dependabot-preview[bot]
a65d3222b9 Bump docker from 4.1.0 to 4.2.0 (#1485)
Bumps [docker](https://github.com/docker/docker-py) from 4.1.0 to 4.2.0.
- [Release notes](https://github.com/docker/docker-py/releases)
- [Commits](https://github.com/docker/docker-py/compare/4.1.0...4.2.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-02-07 16:07:00 +01:00
Pascal Vizeli
36179596a0 Fix HA instance 2020-02-06 10:25:37 +00:00
Pascal Vizeli
c083c850c1 Bump version to 200 2020-02-06 11:20:45 +01:00
Pascal Vizeli
ff903d7b5a Merge pull request #1484 from home-assistant/dev
Release 199
2020-02-06 11:20:17 +01:00
Pascal Vizeli
dd603e1ec2 Support basic video mapping (#1483)
* Support basic video mapping

* Fix regex
2020-02-06 10:48:27 +01:00
Pascal Vizeli
a2f06b1553 VSCode: cleanup homeassistant on shutdown (#1481) 2020-02-06 09:41:22 +01:00
Pascal Vizeli
8115d2b3d3 Show landing page as soon as possible (#1480) 2020-02-06 09:31:52 +01:00
Pascal Vizeli
4f97bb9e0b Fix overwrite authorization / ingress (#1479) 2020-02-06 08:30:27 +01:00
Pascal Vizeli
84d24a2c4d [skip ci] fix dev builds 2020-02-05 11:01:35 +01:00
Pascal Vizeli
b709061656 Bump version to 199 2020-02-05 10:59:38 +01:00
Pascal Vizeli
cd9034b3f1 Merge pull request #1478 from home-assistant/dev
Release 198
2020-02-05 10:58:59 +01:00
Pascal Vizeli
25d324c73a First clean renaming for smooth migration (#1476)
* First clean renaming for smooth migration

* Update security

* fix lint

* Update hassio/const.py

Co-Authored-By: Franck Nijhof <git@frenck.dev>

Co-authored-by: Franck Nijhof <frenck@frenck.nl>
2020-02-05 10:57:57 +01:00
Bram Kragten
3a834d1a73 Update frontend (#1477) 2020-02-05 10:08:39 +01:00
Bram Kragten
e9fecb817d Update frontend (#1475) 2020-02-05 09:55:24 +01:00
Bram Kragten
56e70d7ec4 Update frontend (#1473) 2020-02-04 11:18:59 -08:00
Pascal Vizeli
2e73a85aa9 Bump version to 198 2020-02-04 17:55:52 +01:00
Pascal Vizeli
1e119e9c03 Merge pull request #1472 from home-assistant/dev
Release 197
2020-02-04 17:55:12 +01:00
Pascal Vizeli
6f6e5c97df [skip ci] fix pipelines 2020-02-04 16:50:06 +00:00
Pascal Vizeli
6ef99974cf Cleanup service (#1471)
* Cleanup services on uninstall

* Fix active list
2020-02-04 17:40:46 +01:00
Bram Kragten
8984b9aef6 Update frontend (#1469) 2020-02-04 16:53:38 +01:00
Pascal Vizeli
63e08b15bc Revert "Change loglevel INFO to use black textcolor (#1459)" (#1467)
This reverts commit 0b44df366c.
2020-02-03 17:07:00 +01:00
dependabot-preview[bot]
319b2b5d4c Bump pytest from 5.3.4 to 5.3.5 (#1463)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.3.4 to 5.3.5.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.3.4...5.3.5)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-01-30 17:03:31 +01:00
dependabot-preview[bot]
bae7bb8ce4 Bump pyudev from 0.21.0 to 0.22.0 (#1461)
Bumps [pyudev](https://github.com/pyudev/pyudev) from 0.21.0 to 0.22.0.
- [Release notes](https://github.com/pyudev/pyudev/releases)
- [Changelog](https://github.com/pyudev/pyudev/blob/develop-0.22/CHANGES.rst)
- [Commits](https://github.com/pyudev/pyudev/compare/v0.21.0...v0.22)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-01-29 12:17:33 +01:00
Franck Nijhof
0b44df366c Change loglevel INFO to use black textcolor (#1459) 2020-01-29 08:45:30 +01:00
Pascal Vizeli
f253c797af Add stage flag (#1460)
* Add stage flag

* Add filter

* Remove filter

* Fix lint
2020-01-28 17:58:29 +01:00
Pascal Vizeli
0a8b1c2797 Bump version 197 2020-01-27 21:46:37 +01:00
Pascal Vizeli
3b45fb417b Merge pull request #1457 from home-assistant/dev
Release 196
2020-01-27 21:44:07 +01:00
Pascal Vizeli
2a2d92e3c5 Change rating for ingress add-on (#1451)
* Change rating for ingress add-on

* Fix style
2020-01-27 21:42:58 +01:00
Pascal Vizeli
a320e42ed5 Fix services validation & add tests (#1456) 2020-01-27 21:38:26 +01:00
Pascal Vizeli
fdef712e01 New Panel (#1455)
* Update Hassio Panel

* Fix issues
2020-01-27 21:20:47 +01:00
Pascal Vizeli
5717ac19d7 Update test_env.sh 2020-01-27 20:41:21 +01:00
Franck Nijhof
33d7d76fee Add UniFi discovery service (#1454) 2020-01-27 17:59:06 +01:00
dependabot-preview[bot]
73bdaa623c Bump packaging from 20.0 to 20.1 (#1450)
Bumps [packaging](https://github.com/pypa/packaging) from 20.0 to 20.1.
- [Release notes](https://github.com/pypa/packaging/releases)
- [Changelog](https://github.com/pypa/packaging/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pypa/packaging/compare/20.0...20.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-01-24 13:06:46 +01:00
Pascal Vizeli
8ca8f59a0b Add mysql service to API (#1449) 2020-01-24 12:11:58 +01:00
Franck Nijhof
745af3c039 Add MySQL service support (#1448) 2020-01-23 19:05:13 +01:00
dependabot-preview[bot]
5d17e1011a Bump pytest from 5.3.3 to 5.3.4 (#1447)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.3.3 to 5.3.4.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.3.3...5.3.4)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-01-21 15:47:44 +01:00
Pascal Vizeli
826464c41b Fix builder version / latest 2020-01-20 21:50:29 +00:00
Pascal Vizeli
a643df8cac Update builder 2020-01-20 21:03:09 +00:00
Bram Kragten
24ded99286 Typo (#1444)
* Typo

* Typo test
2020-01-20 17:11:41 +01:00
dependabot-preview[bot]
6646eee504 Bump pytest from 5.3.2 to 5.3.3 (#1440)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.3.2 to 5.3.3.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.3.2...5.3.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-01-20 15:14:18 +01:00
Pascal Vizeli
f55c10914e UI Schema / Addon options (#1441)
* UI schema for add-on options

* Fix lint

* Add tests

* Address comments
2020-01-20 15:13:17 +01:00
Pascal Vizeli
b1e768f69e Add advanced property for HA simple-mode (#1439) 2020-01-20 10:17:14 +01:00
Pascal Vizeli
4702f8bd5e Add docs support to addon (#1438)
* Add docs support to addon

* Fix stale code
2020-01-20 10:01:22 +01:00
Pascal Vizeli
69959b2c97 Password reset (#1433)
* API to reset password

* Fix error handling

* fix lint

* fix typing

* fix await
2020-01-15 18:16:19 +01:00
Franck Nijhof
9d6f4f5392 Remove orangepi-prime, add odroid-n2 to add-on machine validator (#1432) 2020-01-13 16:25:18 +01:00
Pascal Vizeli
36b9a609bf Add dns reset to API (#1428) 2020-01-09 22:27:39 +01:00
Pascal Vizeli
36ae0c82b6 Revert "Update aiohttp to version 3.6.2 again (#1427)" (#1429)
This reverts commit e11011ee51.
2020-01-09 22:22:02 +01:00
Pascal Vizeli
e11011ee51 Update aiohttp to version 3.6.2 again (#1427) 2020-01-09 16:28:15 +01:00
Pascal Vizeli
9125211a57 Bump version 196 2020-01-09 16:25:07 +01:00
Pascal Vizeli
3a4ef6ceb3 Merge pull request #1426 from home-assistant/dev
Release 195
2020-01-09 16:24:00 +01:00
Pascal Vizeli
ca82993278 Update to Alpine3.11 2020-01-09 14:10:36 +00:00
Pascal Vizeli
0925af91e3 Fix snapshot HA / remove API password (#1425)
* Fix snapshot HA / remove API password

* fix lint

* Fix log

* cleanup API

* stale password handling

* fix lint
2020-01-09 14:35:37 +01:00
Franck Nijhof
80bc32243c Add configuration for Lock Threads on closed pull requests (#1424) 2020-01-09 11:40:23 +01:00
Pascal Vizeli
f0d232880d Allow big files on Ingress (#1423)
* Allow big files on Ingress

* Fix style

* Fix

* Cleanup

* Set to 16mb
2020-01-09 10:16:20 +01:00
Pascal Vizeli
7c790dbbd9 Fix issue generic (#1422)
* Fix issue on generic

* Fix style
2020-01-08 23:52:26 +01:00
Pascal Vizeli
899b17e992 Bump version 195 2020-01-07 21:10:40 +01:00
Pascal Vizeli
d1b4521290 Merge pull request #1421 from home-assistant/dev
Release 194
2020-01-07 21:09:50 +01:00
dependabot-preview[bot]
9bb4feef29 Bump pytest-timeout from 1.3.3 to 1.3.4 (#1418)
Bumps [pytest-timeout](https://github.com/pytest-dev/pytest-timeout) from 1.3.3 to 1.3.4.
- [Release notes](https://github.com/pytest-dev/pytest-timeout/releases)
- [Commits](https://github.com/pytest-dev/pytest-timeout/commits)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-01-07 16:07:07 +01:00
dependabot-preview[bot]
4bcdc98a31 Bump packaging from 19.2 to 20.0 (#1419)
Bumps [packaging](https://github.com/pypa/packaging) from 19.2 to 20.0.
- [Release notes](https://github.com/pypa/packaging/releases)
- [Changelog](https://github.com/pypa/packaging/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pypa/packaging/compare/19.2...20.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-01-07 16:06:56 +01:00
Pascal Vizeli
26f8c1df92 Don't reset dns container (#1417)
* Don't reset dns container

* Ignore more exception

* Fix handling
2020-01-06 15:06:57 +01:00
Matt White
a481ad73f3 Prefer admin defined DNS (#1399)
* Prefer admin defined DNS servers

* Remove space

* Update debug log

* Test for customisation of manual DNS servers.

* Warn that manual DNS will be removed on reset in v200

* Remove TODO

* Format with black

* Implement DNS fix for versions <194

* Insert missing docstring

* Add missing docstring

* Remove self.servers == DNS_SERVERS test
2020-01-06 14:22:46 +01:00
Pascal Vizeli
e4ac17fea6 Support Odroid N2 (#1416) 2020-01-06 14:08:30 +01:00
dependabot-preview[bot]
bcd940e95b Bump colorlog from 4.0.2 to 4.1.0 (#1412)
Bumps [colorlog](https://github.com/borntyping/python-colorlog) from 4.0.2 to 4.1.0.
- [Release notes](https://github.com/borntyping/python-colorlog/releases)
- [Commits](https://github.com/borntyping/python-colorlog/commits/v4.1.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-01-06 10:52:43 +01:00
Pascal Vizeli
5365aa4466 Offload rauc logic with partition handling (#1404)
* Offload rauc logic with partition handling

* Fix name

* Fix detection

* Add to API
2019-12-19 16:41:16 +01:00
Pascal Vizeli
a0d106529c Report errors correctly for rauc (#1403)
* Report errors correctly

* Use new style
2019-12-18 23:46:42 +01:00
dependabot-preview[bot]
bf1a9ec42d Bump pytest from 5.3.1 to 5.3.2 (#1400)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.3.1 to 5.3.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.3.1...5.3.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-12-16 13:34:19 +01:00
Pascal Vizeli
fc5d97562f Dns update (#1393)
* Improvements to DNS validator to include IPv6 (#1312)

* improvements to DNS validator to include IPv6

* fixed the DNS validators

* updated per suggestions

* Update const.py

* Update dns.py

* Update validate.py

* Update validate.py

* Update dns.py

* Update test_validate.py

* Update validate.py

* Cleanup

* Don't set default DNS server as default

* Remove update local resolver

* Fix lint
2019-12-05 21:52:55 +01:00
Issac
f5c171e44f Fix grammatical issue in log message (#1392) 2019-12-05 14:38:28 +01:00
dependabot-preview[bot]
a3c3f15806 Bump pytest from 5.3.0 to 5.3.1 (#1384)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.3.0 to 5.3.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.3.0...5.3.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-11-27 12:37:17 +01:00
dependabot-preview[bot]
ef58a219ec Bump pytest from 5.2.4 to 5.3.0 (#1379)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.2.4 to 5.3.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.4...5.3.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-11-20 12:23:23 +01:00
dependabot-preview[bot]
6708fe36e3 Bump uvloop from 0.13.0 to 0.14.0 (#1363)
Bumps [uvloop](https://github.com/MagicStack/uvloop) from 0.13.0 to 0.14.0.
- [Release notes](https://github.com/MagicStack/uvloop/releases)
- [Commits](https://github.com/MagicStack/uvloop/compare/v0.13.0...v0.14.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-11-18 12:20:21 +01:00
dependabot-preview[bot]
e02fa2824c Bump cchardet from 2.1.4 to 2.1.5 (#1376)
Bumps [cchardet](https://github.com/PyYoshi/cChardet) from 2.1.4 to 2.1.5.
- [Release notes](https://github.com/PyYoshi/cChardet/releases)
- [Changelog](https://github.com/PyYoshi/cChardet/blob/master/CHANGES.rst)
- [Commits](https://github.com/PyYoshi/cChardet/compare/2.1.4...2.1.5)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-11-18 12:19:31 +01:00
dependabot-preview[bot]
a20f927082 Bump pytest from 5.2.2 to 5.2.4 (#1375)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.2.2 to 5.2.4.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.2...5.2.4)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-11-18 10:46:23 +01:00
dependabot-preview[bot]
6d71e3fe81 Bump pylint from 2.4.3 to 2.4.4 (#1372)
Bumps [pylint](https://github.com/PyCQA/pylint) from 2.4.3 to 2.4.4.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.4.3...pylint-2.4.4)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-11-18 10:22:32 +01:00
dependabot-preview[bot]
4056fcd75d Bump gitpython from 3.0.4 to 3.0.5 (#1371)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.4 to 3.0.5.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.4...3.0.5)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-11-18 10:22:14 +01:00
Pascal Vizeli
1e723cf0e3 Bump version 194 2019-11-07 16:55:32 +01:00
Pascal Vizeli
ce3f670597 Merge pull request #1364 from home-assistant/dev
Hass.io 193
2019-11-07 16:54:21 +01:00
Pascal Vizeli
ce3d3d58ec Fix name 2019-11-07 15:12:42 +00:00
Pascal Vizeli
a92cab48e0 Support upload streaming (#1362)
* Support upload streaming

* Fix header

* Fix typing
2019-11-06 23:57:51 +01:00
dependabot-preview[bot]
ee76317392 Bump flake8 from 3.7.8 to 3.7.9 (#1352)
Bumps [flake8](https://gitlab.com/pycqa/flake8) from 3.7.8 to 3.7.9.
- [Release notes](https://gitlab.com/pycqa/flake8/tags)
- [Commits](https://gitlab.com/pycqa/flake8/compare/3.7.8...3.7.9)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-30 23:19:25 +01:00
Pascal Vizeli
380ca13be1 Pin Black (#1355)
* Pin black version

* fix devcontainer
2019-10-30 23:18:04 +01:00
Pascal Vizeli
93f4c5e207 Really optimize websocket proxy (#1351) 2019-10-28 17:57:03 +01:00
Pascal Vizeli
e438858da0 Better proxy handling (#1350) 2019-10-28 14:34:26 +01:00
Pascal Vizeli
428a4dd849 Bump version 193 2019-10-25 16:55:44 +02:00
Pascal Vizeli
39cc8aaa13 Merge pull request #1349 from home-assistant/dev
Release 192
2019-10-25 16:55:21 +02:00
dependabot-preview[bot]
39a62864de Bump pytest from 5.2.1 to 5.2.2 (#1348)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.2.1 to 5.2.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.1...5.2.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-25 16:46:45 +02:00
Franck Nijhof
71a162a871 Gracefully allow loading duplicate keys in secrets (#1347) 2019-10-25 11:44:03 +02:00
Franck Nijhof
05d7eff09a Fix secrets containing unsupported types (#1345)
* Fix secrets containing unsupported types

* Black

* Cleanup
2019-10-24 18:26:47 +02:00
Pascal Vizeli
7b8ad0782d Update Dockerfile 2019-10-23 16:06:48 +02:00
Pascal Vizeli
df3e9e3a5e Bump version 192 2019-10-23 16:02:42 +02:00
Pascal Vizeli
8cdc769ec8 Merge pull request #1343 from home-assistant/dev
Release 191
2019-10-23 15:59:42 +02:00
Pascal Vizeli
76e1304241 Downgrade aiohttp to 3.6.1 to fix lost connections (#1342) 2019-10-23 15:58:54 +02:00
Pascal Vizeli
eb9b1ff03d Bump version 191 2019-10-22 15:04:04 +02:00
Pascal Vizeli
b3b12d35fd Merge pull request #1341 from home-assistant/dev
Release 190
2019-10-22 14:57:25 +02:00
Pascal Vizeli
74485262e7 Prune network/interface on repair (#1340)
* Prune network/interface on repair

* Force disconnect
2019-10-22 14:30:14 +02:00
Pascal Vizeli
615e68b29b Add discovery support for Almond (#1339)
* Add discovery support for Almond

* Fix docstring
2019-10-22 13:39:46 +02:00
dependabot-preview[bot]
927b4695c9 Bump gitpython from 3.0.3 to 3.0.4 (#1338)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.3 to 3.0.4.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.3...3.0.4)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-22 13:31:14 +02:00
Pascal Vizeli
11811701d0 Add snapshot_exclude option (#1337)
* Add snapshot tar filter

* Add filter to add-on

* Fix bug

* Fix
2019-10-21 14:48:24 +02:00
Pascal Vizeli
05c8022db3 Check path on extractall (#1336)
* Check path on extractall

* code cleanup

* Add logger

* Fix issue

* Add tests
2019-10-21 12:23:00 +02:00
dependabot-preview[bot]
a9ebb147c5 Bump cryptography from 2.7 to 2.8 (#1332)
Bumps [cryptography](https://github.com/pyca/cryptography) from 2.7 to 2.8.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/2.7...2.8)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-18 14:54:27 +02:00
dependabot-preview[bot]
ba8ca4d9ee Bump pylint from 2.4.2 to 2.4.3 (#1334)
Bumps [pylint](https://github.com/PyCQA/pylint) from 2.4.2 to 2.4.3.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.4.2...pylint-2.4.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-18 14:53:35 +02:00
dependabot-preview[bot]
3574df1385 Bump attrs from 19.1.0 to 19.3.0 (#1329)
* Bump attrs from 19.1.0 to 19.3.0

Bumps [attrs](https://github.com/python-attrs/attrs) from 19.1.0 to 19.3.0.
- [Release notes](https://github.com/python-attrs/attrs/releases)
- [Changelog](https://github.com/python-attrs/attrs/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/python-attrs/attrs/compare/19.1.0...19.3.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

* Fix attr Deprecations
2019-10-17 16:48:06 +02:00
dependabot-preview[bot]
b4497d231b Bump pytz from 2019.2 to 2019.3 (#1323)
Bumps [pytz](https://github.com/stub42/pytz) from 2019.2 to 2019.3.
- [Release notes](https://github.com/stub42/pytz/releases)
- [Commits](https://github.com/stub42/pytz/compare/release_2019.2...release_2019.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-14 11:40:50 +02:00
dependabot-preview[bot]
5aa9b0245a Bump pytest from 5.2.0 to 5.2.1 (#1324)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.2.0 to 5.2.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.0...5.2.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-14 11:33:59 +02:00
dependabot-preview[bot]
4c72c3aafc Bump aiohttp from 3.6.1 to 3.6.2 (#1325)
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.6.1 to 3.6.2.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.6.1...v3.6.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-14 11:32:45 +02:00
dependabot-preview[bot]
bf4f40f991 Bump docker from 4.0.2 to 4.1.0 (#1321)
Bumps [docker](https://github.com/docker/docker-py) from 4.0.2 to 4.1.0.
- [Release notes](https://github.com/docker/docker-py/releases)
- [Commits](https://github.com/docker/docker-py/compare/4.0.2...4.1.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-14 11:31:03 +02:00
Timmo
603334f4f3 Add support for Home Panel discovery (#1327) 2019-10-14 11:30:18 +02:00
dependabot-preview[bot]
46548af165 Bump gitpython from 3.0.2 to 3.0.3 (#1319)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.2 to 3.0.3.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.2...3.0.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-10-03 13:06:20 +02:00
dependabot-preview[bot]
8ef32b40c8 Bump pylint from 2.4.1 to 2.4.2 (#1314)
Bumps [pylint](https://github.com/PyCQA/pylint) from 2.4.1 to 2.4.2.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.4.1...pylint-2.4.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-30 22:21:00 +02:00
dependabot-preview[bot]
fb25377087 Bump pytest from 5.1.3 to 5.2.0 (#1315)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.1.3 to 5.2.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.1.3...5.2.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-30 22:15:12 +02:00
Pascal Vizeli
a75fd2d07e Update devcontainer.json 2019-09-30 11:01:59 +02:00
Pascal Vizeli
e30f39e97e Update devcontainer.json 2019-09-30 11:01:35 +02:00
dependabot-preview[bot]
4818ad7465 Bump pylint from 2.4.0 to 2.4.1 (#1308)
Bumps [pylint](https://github.com/PyCQA/pylint) from 2.4.0 to 2.4.1.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.4.0...pylint-2.4.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-25 18:08:33 +02:00
dependabot-preview[bot]
5e4e9740c7 Bump pylint from 2.3.1 to 2.4.0 (#1307)
* Bump pylint from 2.3.1 to 2.4.0

Bumps [pylint](https://github.com/PyCQA/pylint) from 2.3.1 to 2.4.0.
- [Release notes](https://github.com/PyCQA/pylint/releases)
- [Changelog](https://github.com/PyCQA/pylint/blob/master/ChangeLog)
- [Commits](https://github.com/PyCQA/pylint/compare/pylint-2.3.1...pylint-2.4.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

* Update __main__.py

* Update bootstrap.py

* Update homeassistant.py

* Update __init__.py
2019-09-25 09:41:16 +02:00
Pascal Vizeli
d4e41dbf80 Bump version 190 2019-09-24 15:25:28 +02:00
Pascal Vizeli
cea1a1a15f Merge pull request #1306 from home-assistant/dev
Release 189
2019-09-24 15:24:27 +02:00
dependabot-preview[bot]
c2700b14dc Bump packaging from 19.1 to 19.2 (#1305)
Bumps [packaging](https://github.com/pypa/packaging) from 19.1 to 19.2.
- [Release notes](https://github.com/pypa/packaging/releases)
- [Changelog](https://github.com/pypa/packaging/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pypa/packaging/compare/19.1...19.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-24 10:31:56 +02:00
dependabot-preview[bot]
07d27170db Bump pytest from 5.1.2 to 5.1.3 (#1303)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.1.2 to 5.1.3.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.1.2...5.1.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-24 10:31:13 +02:00
Pascal Vizeli
8eb8c07df6 Update uvloop 0.13.0 (#1302) 2019-09-23 23:00:57 +02:00
Pascal Vizeli
7bee6f884c Update aiohttp 3.6.1 (#1301) 2019-09-23 22:45:02 +02:00
Franck Nijhof
78dd20e314 Fixes accidental string concatenation in classifiers list (#1300) 2019-09-23 12:23:57 +02:00
Pascal Vizeli
2a011b6448 Fix typo to validate list (#1298)
* Fix typo to validate list

* Fix lint

* Add Typo
2019-09-20 17:28:23 +02:00
Pascal Vizeli
5c90370ec8 Bump version 189 2019-09-15 15:08:12 +02:00
Pascal Vizeli
120465b88d Merge pull request #1294 from home-assistant/dev
Release 188
2019-09-15 15:07:39 +02:00
Pascal Vizeli
c77292439a Fix invalid secrets (#1293)
* Fix invalid secrets format

* Fix style
2019-09-15 15:06:22 +02:00
Pascal Vizeli
0a0209f81a Bump version 188 2019-09-12 23:32:20 +02:00
Pascal Vizeli
69a7ed8a5c Merge pull request #1291 from home-assistant/dev
Release 187
2019-09-12 23:30:53 +02:00
Pascal Vizeli
8df35ab488 Fix detection of HA container / image (#1290) 2019-09-12 23:28:55 +02:00
Pascal Vizeli
a12567d0a8 Update secrets handling (#1289)
* Update secrets handling

* Remove start pre_check

* fix lint

* remove tasker
2019-09-12 23:16:56 +02:00
Pascal Vizeli
64fe190119 Bump version 187 2019-09-11 18:29:24 +02:00
Pascal Vizeli
e3ede66943 Merge pull request #1287 from home-assistant/dev
Release 186
2019-09-11 18:26:22 +02:00
Pascal Vizeli
2672b800d4 DNS fallback to docker internal one (#1286)
* DNS fallback to docker internal one

* Fix log

* Fix style

* Fix startup handling
2019-09-11 17:54:16 +02:00
Pascal Vizeli
c60d4bda92 Check supervisor docker permission (#1285)
* Check supervisor docker permission

* Update log message
2019-09-11 17:47:49 +02:00
Pascal Vizeli
db9d0f2639 Fix lint (#1284) 2019-09-11 16:37:49 +02:00
Pascal Vizeli
02d4045ec3 Add secrets support for options (#1283)
* Add secrets API

* Don't expose secrets
2019-09-11 16:29:34 +02:00
Pascal Vizeli
a308ea6927 Update Dockerfile 2019-09-05 14:20:35 +02:00
Pascal Vizeli
edc5e5e812 Update Dockerfile 2019-09-05 12:41:42 +02:00
dependabot-preview[bot]
23b65cb479 Bump pytest from 5.1.1 to 5.1.2 (#1278)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.1.1 to 5.1.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.1.1...5.1.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-09-02 23:07:11 +02:00
Pascal Vizeli
e5eabd2143 Fix typing warning / hardware (#1276) 2019-09-02 15:54:37 +02:00
Pascal Vizeli
b0dd043975 Fix typing warning / hardware (#1277) 2019-09-02 15:45:31 +02:00
Pascal Vizeli
435a1096ed Cleanup debug gdbus output (#1275) 2019-09-02 15:08:26 +02:00
Pascal Vizeli
21a9084ca0 Bump version 186 2019-09-02 14:39:56 +02:00
Pascal Vizeli
10d9135d86 Merge pull request #1274 from home-assistant/dev
Release 185
2019-09-02 14:39:17 +02:00
Pascal Vizeli
272d8b29f3 Fix version handling with nightly (#1273)
* Fix version handling with nightly

* fix lint
2019-09-02 14:37:59 +02:00
Pascal Vizeli
3d665b9eec Support for udev device trigger (#1272) 2019-09-02 14:07:09 +02:00
Pascal Vizeli
c563f484c9 Add support for udev trigger 2019-09-02 11:28:49 +00:00
Pascal Vizeli
38268ea4ea Bump version to 185 2019-08-26 10:04:36 +02:00
Pascal Vizeli
c1ad64cddf Merge pull request #1264 from home-assistant/dev
Release 184
2019-08-26 10:03:53 +02:00
Pascal Vizeli
b898cd2a3a Preserve ordering of locals (#1263)
* Preserve ordering of locals

* fix lint
2019-08-26 09:45:10 +02:00
Pascal Vizeli
937b31d845 Bump version 184 2019-08-23 14:22:47 +02:00
Pascal Vizeli
e4e655493b Merge pull request #1259 from home-assistant/dev
Release 183
2019-08-23 14:21:58 +02:00
Pascal Vizeli
387d2dcc2e Add support for gvariant annotations (#1258) 2019-08-23 13:41:17 +02:00
Pascal Vizeli
8abe33d48a Bump version 183 2019-08-22 18:58:02 +02:00
Pascal Vizeli
860442d5c4 Merge pull request #1256 from home-assistant/dev
Release 182
2019-08-22 18:57:38 +02:00
Pascal Vizeli
ce5183ce16 Add support to read Host DNS (#1255)
* Add support to read Host DNS

* Include properties

* Improve host info handling

* Add API

* Better abstraction

* Change prio list

* Address lint

* fix get properties

* Fix nameserver list

* Small cleanups

* Bit more stability

* cleanup
2019-08-22 18:01:49 +02:00
dependabot-preview[bot]
3e69b04b86 Bump gitpython from 3.0.1 to 3.0.2 (#1254)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.1 to 3.0.2.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.1...3.0.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-22 16:20:47 +02:00
Pascal Vizeli
8b9cd4f122 Improve handling with nested objects (#1253) 2019-08-22 14:24:50 +02:00
Pascal Vizeli
c0e3ccdb83 Improve gdbus error handling (#1252)
* Improve gdbus error handling

* Fix logging type

* Detect no dbus

* Fix issue with complex

* Update hassio/dbus/__init__.py

Co-Authored-By: Franck Nijhof <frenck@frenck.nl>

* Update hassio/dbus/hostname.py

Co-Authored-By: Franck Nijhof <frenck@frenck.nl>

* Update hassio/dbus/rauc.py

Co-Authored-By: Franck Nijhof <frenck@frenck.nl>

* Update hassio/dbus/systemd.py

Co-Authored-By: Franck Nijhof <frenck@frenck.nl>

* Fix black
2019-08-22 12:48:02 +02:00
dependabot-preview[bot]
e8cc85c487 Bump pytest from 5.1.0 to 5.1.1 (#1250)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.1.0 to 5.1.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.1.0...5.1.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-21 23:41:16 +02:00
Pascal Vizeli
b3eff41692 Update devcontainer.json 2019-08-20 17:51:27 +02:00
Pascal Vizeli
1ea63f185c Bump version 182 2019-08-18 21:08:01 +02:00
Pascal Vizeli
a513d5c09a Merge pull request #1247 from home-assistant/dev
Release 181
2019-08-18 21:07:21 +02:00
Pascal Vizeli
fb8216c102 Fix AAAA resolv for nginx (#1246) 2019-08-18 21:05:42 +02:00
Pascal Vizeli
4f381d01df Log coredns errors (#1245) 2019-08-18 17:21:46 +02:00
Pascal Vizeli
de3382226e Update setting on startup (#1244)
* Update setting on startup

* Fix

* fix exception

* Cleanup handling
2019-08-18 17:11:42 +02:00
Pascal Vizeli
77be830b72 Bump version 181 2019-08-18 12:00:31 +02:00
Pascal Vizeli
09c0e1320f Merge pull request #1243 from home-assistant/dev
Release 180
2019-08-18 12:00:01 +02:00
Franck Nijhof
cc4ee59542 Replace Google DNS by Quad9, prefer CloudFlare (#1235) 2019-08-18 11:48:29 +02:00
Pascal Vizeli
1f448744f3 Add restart function / options change (#1242) 2019-08-18 11:46:23 +02:00
Franck Nijhof
ee2c257057 Adjust coredns do not forward local.hass.io (#1237) 2019-08-18 11:08:34 +02:00
Franck Nijhof
be8439d4ac Add localhost to hosts file (#1240) 2019-08-18 11:07:23 +02:00
Franck Nijhof
981f2b193c Adjust coredns to use upstream fowarding server in order (#1238) 2019-08-18 11:06:17 +02:00
Pascal Vizeli
39087e09ce Bump version 180 2019-08-16 20:33:56 +02:00
Pascal Vizeli
59960efb9c Merge pull request #1229 from home-assistant/dev
Release 179
2019-08-16 20:32:34 +02:00
Pascal Vizeli
5a53bb5981 Use hosts as list (#1228)
* Use hosts as list

* Fix

* Clean style

* Fix list remove

* hide warning
2019-08-16 20:29:10 +02:00
dependabot-preview[bot]
a67fe69cbb Bump pytest from 5.0.1 to 5.1.0 (#1227)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.0.1 to 5.1.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.0.1...5.1.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-16 17:59:15 +02:00
Pascal Vizeli
9ce2b0765f Bump version 179 2019-08-16 13:27:51 +02:00
Pascal Vizeli
2e53a48504 Merge pull request #1224 from home-assistant/dev
Release 178
2019-08-16 13:26:45 +02:00
Pascal Vizeli
8e4db0c3ec Strip resolv (#1226) 2019-08-16 13:22:07 +02:00
Pascal Vizeli
4072b06faf Fix issue on installed add-ons (#1225) 2019-08-16 13:12:39 +02:00
Pascal Vizeli
a2cf7ece70 Change handling with host files (#1223) 2019-08-16 12:47:32 +02:00
Pascal Vizeli
734fe3afde Bump version 178 2019-08-16 00:15:05 +02:00
Pascal Vizeli
7f3bc91c1d Merge pull request #1222 from home-assistant/dev
Release 177
2019-08-16 00:13:51 +02:00
Pascal Vizeli
9c2c95757d Validate dns better (#1221) 2019-08-15 23:48:14 +02:00
Franck Nijhof
b5ed6c586a Cleanup ingress panel on add-on uninstall (#1220)
* Cleanup ingress panel on add-on uninstall

* Update __init__.py
2019-08-15 23:05:03 +02:00
Franck Nijhof
35033d1f76 Allow manager role to access DNS API (#1219) 2019-08-15 22:38:34 +02:00
Pascal Vizeli
9e41d0c5b0 Bump version 177 2019-08-15 14:51:28 +02:00
Pascal Vizeli
62e92fada9 Merge pull request #1218 from home-assistant/dev
Release 176
2019-08-15 14:50:55 +02:00
dependabot-preview[bot]
ae0a1a657f Bump gitpython from 3.0.0 to 3.0.1 (#1216)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.0 to 3.0.1.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.0.0...3.0.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-15 14:46:16 +02:00
Pascal Vizeli
81e511ba8e Fix spell 2019-08-15 12:42:34 +00:00
Pascal Vizeli
d89cb91c8c Revert "Call update of resolv later (#1215)" (#1217)
This reverts commit dc31b6e6fe.
2019-08-15 14:42:05 +02:00
Pascal Vizeli
dc31b6e6fe Call update of resolv later (#1215) 2019-08-15 13:57:44 +02:00
Pascal Vizeli
930a32de1a Fix latest issue (#1214)
* Fix latest issue

* Use also update now

* Fix style
2019-08-15 12:42:21 +02:00
Pascal Vizeli
e40f2ed8e3 Bump version 176 2019-08-15 11:36:47 +02:00
Pascal Vizeli
abbd3d1078 Merge pull request #1213 from home-assistant/dev
Release 175
2019-08-15 11:36:06 +02:00
Pascal Vizeli
63c9948456 Add CoreDNS to update process (#1212) 2019-08-15 11:05:08 +02:00
Pascal Vizeli
b6c81d779a Use own coredns for supervisor 2019-08-15 08:51:42 +00:00
Pascal Vizeli
2480c83169 Fix socat command (#1211) 2019-08-15 10:17:41 +02:00
Pascal Vizeli
334cc66cf6 Bump Version 175 2019-08-14 15:39:44 +02:00
Pascal Vizeli
3cf189ad94 Merge pull request #1209 from home-assistant/dev
Release 174
2019-08-14 15:38:57 +02:00
dependabot-preview[bot]
6ffb94a0f5 Bump ptvsd from 4.3.1 to 4.3.2 (#1207)
Bumps [ptvsd](https://github.com/Microsoft/ptvsd) from 4.3.1 to 4.3.2.
- [Release notes](https://github.com/Microsoft/ptvsd/releases)
- [Commits](https://github.com/Microsoft/ptvsd/compare/v4.3.1...v4.3.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-14 14:35:43 +02:00
Pascal Vizeli
3593826441 Fix issue with windows dev env 2019-08-14 10:37:39 +00:00
Pascal Vizeli
0a0a62f238 Add-on provides its own udev support (#1206)
* Add-on provides its own udev support

* upgrade logger
2019-08-14 12:29:00 +02:00
Pascal Vizeli
41ce9913d2 Stats percent (#1205)
* Fix stats and add Memory percent

* Fix tasks

* round percent
2019-08-14 10:47:11 +02:00
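The memory percentage added here is simple arithmetic over the values Docker already reports; a hedged sketch using docker-py (a project dependency), with the container name as an example only.

import docker

client = docker.from_env()
container = client.containers.get("homeassistant")  # example name
stats = container.stats(stream=False)

usage = stats["memory_stats"]["usage"]
limit = stats["memory_stats"]["limit"]
memory_percent = round(usage / limit * 100, 2)
print(f"memory: {usage}/{limit} bytes ({memory_percent}%)")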
Pascal Vizeli
b77c42384d Add DNS to add-on (#1204) 2019-08-14 09:53:03 +02:00
Pascal Vizeli
138bb12f98 Add debug output to gdbus (#1203) 2019-08-13 21:25:04 +02:00
Pascal Vizeli
4fe2859f4e Rename scripts folder (#1202)
* Rename script folder

* Rename scripts
2019-08-13 14:39:32 +02:00
dependabot-preview[bot]
0768b2b4bc Bump ptvsd from 4.3.0 to 4.3.1 (#1200)
Bumps [ptvsd](https://github.com/Microsoft/ptvsd) from 4.3.0 to 4.3.1.
- [Release notes](https://github.com/Microsoft/ptvsd/releases)
- [Commits](https://github.com/Microsoft/ptvsd/compare/v4.3.0...v4.3.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-13 14:35:41 +02:00
dependabot-preview[bot]
e6f1772a93 Bump gitpython from 2.1.13 to 3.0.0 (#1199)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 2.1.13 to 3.0.0.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/2.1.13...3.0.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-13 14:35:15 +02:00
dependabot-preview[bot]
5374b2b3b9 Bump voluptuous from 0.11.5 to 0.11.7 (#1201)
Bumps [voluptuous](https://github.com/alecthomas/voluptuous) from 0.11.5 to 0.11.7.
- [Release notes](https://github.com/alecthomas/voluptuous/releases)
- [Changelog](https://github.com/alecthomas/voluptuous/blob/master/CHANGELOG.md)
- [Commits](https://github.com/alecthomas/voluptuous/commits)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-13 14:29:33 +02:00
Pascal Vizeli
1196788856 Add CoreDNS as DNS backend (#1195)
* Add CoreDNS / DNS configuration

* Support get version

* add version

* add coresys

* Add more logic

* move forwarder into dns

* Setup docker inside

* add docker to env

* Add more function

* more interface

* Update hosts template

* Add DNS folder

* Fix issues

* Add more logic

* Add handling for hosts

* Fix setting

* fix lint

* Fix some issues

* Fix issue

* Run with no cache

* Fix issue on validate

* Fix bug

* Allow to jump into dev mode

* Fix permission

* Fix issue

* Add dns search

* Add watchdog

* Fix set issues

* add API description

* Add API endpoint

* Add CLI support

* Fix logs + add hostname

* Add/remove DNS entry

* Fix attribute

* Fix style

* Better shutdown

* Remove ha from network mapping

* Add more options

* Fix env shutdown

* Add support for new repair function

* Start coreDNS faster after restart

* remove options

* Fix ha fix
2019-08-13 14:20:42 +02:00
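To illustrate the host-entry handling mentioned above: CoreDNS can serve names from a hosts-format file, so adding or removing a DNS entry amounts to rewriting that file and letting the plugin pick it up. The path and addresses below are placeholders, not the supervisor's actual layout.

from pathlib import Path

HOSTS_FILE = Path("/tmp/coredns-hosts")  # placeholder path

def write_hosts(entries):
    """entries maps hostname -> IP address; one hosts line per entry."""
    lines = [f"{ip} {name}" for name, ip in sorted(entries.items())]
    HOSTS_FILE.write_text("\n".join(lines) + "\n")

write_hosts({
    "supervisor.local.example": "172.30.32.2",    # example values
    "homeassistant.local.example": "172.30.32.1",
})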
Pascal Vizeli
9f3f47eb80 Bump version 174 2019-08-11 09:59:48 +02:00
Pascal Vizeli
1a90a478f2 Merge pull request #1197 from home-assistant/dev
Release 173
2019-08-11 09:39:17 +02:00
Pascal Vizeli
ee773f3b63 Fix hanging landingpage (#1196) 2019-08-11 09:05:47 +02:00
Pascal Vizeli
5ffc27f60c Bump version 173 2019-08-08 23:24:11 +02:00
Pascal Vizeli
4c13dfb43c Merge pull request #1194 from home-assistant/dev
Release 172
2019-08-08 23:21:26 +02:00
Pascal Vizeli
bc099f0d81 Fix version detection with existing container (#1193) 2019-08-08 23:20:26 +02:00
Pascal Vizeli
b26dd0af19 Add better log output for repair (#1191) 2019-08-08 10:14:13 +02:00
Pascal Vizeli
0dee5bd763 Fix black formating args 2019-08-08 10:13:44 +02:00
Pascal Vizeli
0765387ad8 Bump version 172 2019-08-07 18:18:09 +02:00
Pascal Vizeli
a07517bd3c Merge pull request #1190 from home-assistant/dev
Release 171
2019-08-07 18:17:30 +02:00
Pascal Vizeli
e5f0d80d96 Start API server before performing a self-update (#1189) 2019-08-07 18:03:56 +02:00
Pascal Vizeli
2fc5e3b7d9 Repair / fixup docker overlayfs issues (#1170)
* Add a repair modus

* Add repair to add-ons

* repair to cli

* Add API call

* fix sync call

* Clean all images

* Fix repair

* Fix supervisor

* Add new function to core

* fix tagging

* better style

* use retag

* new retag function

* Fix lint

* Fix import export
2019-08-07 17:26:32 +02:00
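The "retag" step referenced above boils down to pinning an already-downloaded image to an explicit version tag instead of trusting a floating latest tag; a hedged docker-py sketch with example image names.

import docker

client = docker.from_env()
# Example image name and version; the real names come from the config files.
image = client.images.get("homeassistant/amd64-hassio-supervisor:latest")
image.tag("homeassistant/amd64-hassio-supervisor", tag="171")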
Pascal Vizeli
778bc46848 Don't rely on latest with HA/Add-ons (#1175)
* Don't rely on latest with HA/Add-ons

* Fix latest on install

* Revert some options

* Fix attach

* migrate to new version handling

* Fix thread

* Fix is running

* Allow wait

* debug code

* Fix debug value

* Fix list

* Fix regex

* Some better log output

* Fix logic

* Improve cleanup handling

* Fix bug

* Cleanup old code

* Improve version handling

* Fix the way to attach
2019-08-07 09:51:27 +02:00
Pascal Vizeli
882586b246 Fix time adjustments on latest boot (#1187)
* Fix time adjustments on latest boot

* Fix spell
2019-08-06 09:24:22 +02:00
dependabot-preview[bot]
b7c07a2555 Bump pytz from 2019.1 to 2019.2 (#1184)
Bumps [pytz](https://github.com/stub42/pytz) from 2019.1 to 2019.2.
- [Release notes](https://github.com/stub42/pytz/releases)
- [Commits](https://github.com/stub42/pytz/compare/release_2019.1...release_2019.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-02 10:32:04 +02:00
dependabot-preview[bot]
814b504fa9 Bump ptvsd from 4.2.10 to 4.3.0 (#1179)
Bumps [ptvsd](https://github.com/Microsoft/ptvsd) from 4.2.10 to 4.3.0.
- [Release notes](https://github.com/Microsoft/ptvsd/releases)
- [Commits](https://github.com/Microsoft/ptvsd/compare/v4.2.10...v4.3.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-07-29 17:01:28 +02:00
dependabot-preview[bot]
7ae430e7a8 Bump gitpython from 2.1.12 to 2.1.13 (#1178)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 2.1.12 to 2.1.13.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/2.1.12...2.1.13)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-07-29 14:53:54 +02:00
dependabot-preview[bot]
0e7e95ba20 Bump gitpython from 2.1.11 to 2.1.12 (#1171)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 2.1.11 to 2.1.12.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/2.1.11...2.1.12)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-07-22 14:18:11 +02:00
Pascal Vizeli
e577d8acb2 Bump version 171 2019-07-19 11:49:00 +02:00
Pascal Vizeli
0a76ab5054 Merge pull request #1168 from home-assistant/dev
Release 170
2019-07-19 11:48:28 +02:00
Pascal Vizeli
03c5596e04 Fix machine version (#1167) 2019-07-19 11:47:55 +02:00
Pascal Vizeli
3af4e14e83 Bump version 170 2019-07-16 12:36:05 +02:00
Pascal Vizeli
7c8cf57820 Merge pull request #1164 from home-assistant/dev
Release 169
2019-07-16 12:35:10 +02:00
Pascal Vizeli
8d84a8a62e Update panel & support panel on devcontainer (#1163)
* Update panel & support panel on devcontainer

* small cleanups

* small size
2019-07-16 12:23:03 +02:00
Pascal Vizeli
08c45060bd Add support for RPi4 (#1162) 2019-07-16 10:33:56 +02:00
Pascal Vizeli
7ca8d2811b Update URL for version file (#1161) 2019-07-16 10:26:59 +02:00
Pascal Vizeli
bb6898b032 Update azure-pipelines.yml for Azure Pipelines 2019-07-12 09:57:55 +02:00
Pascal Vizeli
cd86c6814e Update azure-pipelines.yml for Azure Pipelines 2019-07-12 09:42:55 +02:00
Pascal Vizeli
b67e116650 Update azure-pipelines.yml 2019-07-12 09:41:40 +02:00
Pascal Vizeli
57ce411fb6 Update azure-pipelines.yml 2019-07-11 22:11:37 +02:00
Pascal Vizeli
85ed4d9e8d Update Dockerfile 2019-07-11 19:25:07 +02:00
dependabot-preview[bot]
ccb39da569 Bump flake8 from 3.7.7 to 3.7.8 (#1154)
Bumps [flake8](https://gitlab.com/pycqa/flake8) from 3.7.7 to 3.7.8.
- [Release notes](https://gitlab.com/pycqa/flake8/tags)
- [Commits](https://gitlab.com/pycqa/flake8/compare/3.7.7...3.7.8)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-07-09 14:05:43 +02:00
Pascal Vizeli
dd7ba64d32 Bump version 169 2019-07-08 16:03:59 +02:00
Pascal Vizeli
de3edb1654 Merge pull request #1153 from home-assistant/dev
Release 168
2019-07-08 16:02:39 +02:00
dependabot-preview[bot]
d262151727 Bump pytest from 4.6.3 to 5.0.1 (#1148)
* Bump pytest from 4.6.3 to 5.0.1

Bumps [pytest](https://github.com/pytest-dev/pytest) from 4.6.3 to 5.0.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/4.6.3...5.0.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

* Update tox.ini
2019-07-06 18:07:15 +02:00
Pascal Vizeli
a37c90af96 Forward Params (#1150) 2019-07-06 18:06:39 +02:00
Pascal Vizeli
0a3a752b4c Add timezone to info call (#1146) 2019-07-04 18:20:46 +02:00
Pascal Vizeli
0a34f427f8 Fix error on save special permission (#1145) 2019-07-04 17:30:42 +02:00
Pascal Vizeli
157740e374 Update devcontainer.json 2019-07-04 17:28:50 +02:00
Pascal Vizeli
b0e994f3f5 Bump version 168 2019-06-25 17:38:57 +02:00
Pascal Vizeli
f374852801 Merge pull request #1139 from home-assistant/dev
Release 167
2019-06-25 17:34:42 +02:00
Pascal Vizeli
709f034f2e New TimeZone handling (#1138)
* Remove function to get TZ from config file

* Readd old timezone handling

* Fix tests
2019-06-25 17:09:14 +02:00
Pascal Vizeli
6d6deb8c66 Cleanup udev handling (#1137)
* Cleanup udev handling

* Update hardware.py
2019-06-25 16:15:02 +02:00
Pascal Vizeli
5771b417bc sort import 2019-06-25 12:54:45 +00:00
Pascal Vizeli
51efcefdab Compile only hassio 2019-06-24 23:21:15 +00:00
Pascal Vizeli
d31ab5139d compile all 2019-06-24 23:09:08 +00:00
Pascal Vizeli
ce18183daa Allow update discovery messages (#1136)
* Allow update discovery messages

* Update __init__.py

* Update __init__.py

* Update __init__.py

* fix lint

* Fix style
2019-06-24 23:29:42 +02:00
Pascal Vizeli
b8b73cf880 remove diff wheels build 2019-06-24 19:16:26 +02:00
Pascal Vizeli
5291e6c1f3 Use multistage 2019-06-24 19:04:52 +02:00
Pascal Vizeli
626a9f06c4 Update to alpine 3.10 (#1135) 2019-06-24 18:49:43 +02:00
Pascal Vizeli
72338eb5b8 Add devcontainer support (#1134) 2019-06-24 14:48:10 +02:00
Jakub
7bd77c6e99 Append devlinks to serial dev_list (#1131)
* append devlinks to dev_list

* replace eudev-libs with eudev

* include only devlinks starting with /dev/serial/by-id

* add missing package, move udev init to entry.sh

* fix mode on entry.sh

* Update homeassistant.py

* Update homeassistant.py
2019-06-24 09:53:54 +02:00
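A minimal sketch of the devlink filtering described in this change, using pyudev (the library behind the supervisor's hardware handling): list tty devices and keep only the stable /dev/serial/by-id links.

import pyudev

context = pyudev.Context()
for device in context.list_devices(subsystem="tty"):
    links = [
        link for link in device.device_links
        if link.startswith("/dev/serial/by-id")
    ]
    if links:
        print(device.device_node, links)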
dependabot-preview[bot]
69151b962a Bump pytest from 4.6.2 to 4.6.3 (#1125)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 4.6.2 to 4.6.3.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/4.6.2...4.6.3)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-06-21 09:50:42 +02:00
dependabot-preview[bot]
86305d4fe4 Bump docker from 4.0.1 to 4.0.2 (#1133)
Bumps [docker](https://github.com/docker/docker-py) from 4.0.1 to 4.0.2.
- [Release notes](https://github.com/docker/docker-py/releases)
- [Commits](https://github.com/docker/docker-py/compare/4.0.1...4.0.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-06-21 09:50:28 +02:00
Pascal Vizeli
d5c3850a3f Don't break on supervisor update (#1118)
* Don't break on supervisor update

* Update core.py

* Fix lint
2019-06-06 10:57:36 +02:00
Pascal Vizeli
3e645b6175 Fix timeout on check port (#1116) 2019-06-05 17:49:05 +02:00
Pascal Vizeli
89dc78bc05 Bump version 167 2019-06-05 17:10:44 +02:00
Pascal Vizeli
164c403d05 Merge pull request #1115 from home-assistant/dev
Release 166
2019-06-05 17:10:14 +02:00
Pascal Vizeli
5e8007453f detect native arch (#1114) 2019-06-05 16:59:43 +02:00
Pascal Vizeli
0a0d97b084 Support pip progress for HA 0.94 (#1113)
* Support pip progress for HA 0.94

* fix black

* add tests

* add test for adguard

* Fix lint
2019-06-05 14:46:03 +02:00
dependabot-preview[bot]
eb604ed92d Bump pytest from 4.6.1 to 4.6.2 (#1112)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 4.6.1 to 4.6.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/4.6.1...4.6.2)
2019-06-04 23:47:35 +02:00
dependabot-preview[bot]
c47828dbaa Bump pytest from 4.5.0 to 4.6.1 (#1110)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 4.5.0 to 4.6.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/4.5.0...4.6.1)
2019-06-04 14:43:20 +02:00
Pascal Vizeli
ea437dc745 Update azure-pipelines.yml for Azure Pipelines 2019-06-02 14:12:45 +02:00
Franck Nijhof
c16a208b39 Support for AdGuard Home discovery (#1107)
* Support for AdGuard Home discovery

* 👕 Darkened the sky
2019-06-02 07:47:12 +02:00
dependabot-preview[bot]
55d803b2a0 Bump cryptography from 2.6.1 to 2.7 (#1108)
Bumps [cryptography](https://github.com/pyca/cryptography) from 2.6.1 to 2.7.
- [Release notes](https://github.com/pyca/cryptography/releases)
- [Changelog](https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/2.6.1...2.7)
2019-05-31 17:55:16 +02:00
Pascal Vizeli
611f6f2829 Don't follow requests itself (#1106)
* Don't follow requests itself

* Fix black lint
2019-05-31 13:53:46 +02:00
Pascal Vizeli
b94df76731 Update azure-pipelines.yml for Azure Pipelines 2019-05-31 12:42:37 +02:00
Pascal Vizeli
218619e7f0 Bump version 166 2019-05-31 12:36:06 +02:00
617 changed files with 113789 additions and 3273 deletions

51
.devcontainer/Dockerfile Normal file

@@ -0,0 +1,51 @@
FROM python:3.7
WORKDIR /workspaces
# Install Node/Yarn for Frontend
RUN apt-get update && apt-get install -y --no-install-recommends \
curl \
git \
apt-utils \
apt-transport-https \
&& curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - \
&& echo "deb https://dl.yarnpkg.com/debian/ stable main" | tee /etc/apt/sources.list.d/yarn.list \
&& apt-get update && apt-get install -y --no-install-recommends \
nodejs \
yarn \
&& curl -o - https://raw.githubusercontent.com/creationix/nvm/v0.34.0/install.sh | bash \
&& rm -rf /var/lib/apt/lists/*
ENV NVM_DIR /root/.nvm
# Install docker
# https://docs.docker.com/engine/installation/linux/docker-ce/ubuntu/
RUN apt-get update && apt-get install -y --no-install-recommends \
apt-transport-https \
ca-certificates \
curl \
software-properties-common \
gpg-agent \
&& curl -fsSL https://download.docker.com/linux/debian/gpg | apt-key add - \
&& add-apt-repository "deb https://download.docker.com/linux/debian $(lsb_release -cs) stable" \
&& apt-get update && apt-get install -y --no-install-recommends \
docker-ce \
docker-ce-cli \
containerd.io \
&& rm -rf /var/lib/apt/lists/*
# Install tools
RUN apt-get update && apt-get install -y --no-install-recommends \
jq \
dbus \
network-manager \
libpulse0 \
&& rm -rf /var/lib/apt/lists/*
# Install Python dependencies from requirements.txt if it exists
COPY requirements.txt requirements_tests.txt ./
RUN pip3 install -r requirements.txt -r requirements_tests.txt \
&& pip3 install tox \
&& rm -f requirements.txt requirements_tests.txt
# Set the default shell to bash instead of sh
ENV SHELL /bin/bash

24
.devcontainer/devcontainer.json Normal file

@@ -0,0 +1,24 @@
// See https://aka.ms/vscode-remote/devcontainer.json for format details.
{
"name": "Supervisor dev",
"context": "..",
"dockerFile": "Dockerfile",
"appPort": "9123:8123",
"runArgs": ["-e", "GIT_EDITOR=code --wait", "--privileged"],
"extensions": [
"ms-python.python",
"visualstudioexptteam.vscodeintellicode",
"esbenp.prettier-vscode"
],
"settings": {
"python.pythonPath": "/usr/local/bin/python",
"python.linting.pylintEnabled": true,
"python.linting.enabled": true,
"python.formatting.provider": "black",
"python.formatting.blackArgs": ["--target-version", "py37"],
"editor.formatOnPaste": false,
"editor.formatOnSave": true,
"editor.formatOnType": true,
"files.trimTrailingWhitespace": true
}
}

.dockerignore

@@ -1,13 +1,23 @@
# General files
.git
.github
.devcontainer
.vscode
# Test related files
.tox
# Temporary files
**/__pycache__
.pytest_cache
# virtualenv
venv/
ENV/
# Data
home-assistant-polymer/
script/
tests/
# Test ENV
data/

27
.github/lock.yml vendored Normal file

@@ -0,0 +1,27 @@
# Configuration for Lock Threads - https://github.com/dessant/lock-threads
# Number of days of inactivity before a closed issue or pull request is locked
daysUntilLock: 1
# Skip issues and pull requests created before a given timestamp. Timestamp must
# follow ISO 8601 (`YYYY-MM-DD`). Set to `false` to disable
skipCreatedBefore: 2020-01-01
# Issues and pull requests with these labels will be ignored. Set to `[]` to disable
exemptLabels: []
# Label to add before locking, such as `outdated`. Set to `false` to disable
lockLabel: false
# Comment to post before locking. Set to `false` to disable
lockComment: false
# Assign `resolved` as the reason for locking. Set to `false` to disable
setLockReason: false
# Limit to only `issues` or `pulls`
only: pulls
# Optionally, specify configuration settings just for `issues` or `pulls`
issues:
daysUntilLock: 30

5
.gitignore vendored

@@ -92,4 +92,7 @@ ENV/
.pylint.d/
# VS Code
.vscode/
.vscode/*
!.vscode/cSpell.json
!.vscode/tasks.json
!.vscode/launch.json

18
.vscode/launch.json vendored Normal file

@@ -0,0 +1,18 @@
{
"version": "0.2.0",
"configurations": [
{
"name": "Supervisor remote debug",
"type": "python",
"request": "attach",
"port": 33333,
"host": "172.30.32.2",
"pathMappings": [
{
"localRoot": "${workspaceFolder}",
"remoteRoot": "/usr/src/supervisor"
}
]
}
]
}
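This attach configuration expects a debug server listening inside the supervisor container; a hedged sketch of that side using ptvsd (pinned in the test requirements), with the port matching the launch configuration above.

import ptvsd

# Listen on all interfaces inside the container so VS Code can attach remotely.
ptvsd.enable_attach(address=("0.0.0.0", 33333))
ptvsd.wait_for_attach()  # optional: block until the debugger attaches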

90
.vscode/tasks.json vendored Normal file

@@ -0,0 +1,90 @@
{
"version": "2.0.0",
"tasks": [
{
"label": "Run Testenv",
"type": "shell",
"command": "./scripts/test_env.sh",
"group": {
"kind": "test",
"isDefault": true
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Run Testenv CLI",
"type": "shell",
"command": "docker exec -ti hassio_cli /usr/bin/cli.sh",
"group": {
"kind": "test",
"isDefault": true
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Update UI",
"type": "shell",
"command": "./scripts/update-frontend.sh",
"group": {
"kind": "build",
"isDefault": true
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Pytest",
"type": "shell",
"command": "pytest --timeout=10 tests",
"group": {
"kind": "test",
"isDefault": true
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Flake8",
"type": "shell",
"command": "flake8 hassio tests",
"group": {
"kind": "test",
"isDefault": true
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Pylint",
"type": "shell",
"command": "pylint hassio",
"dependsOn": ["Install all Requirements"],
"group": {
"kind": "test",
"isDefault": true
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
}
]
}

1094
API.md

File diff suppressed because it is too large

Dockerfile

@@ -1,28 +1,40 @@
ARG BUILD_FROM
FROM $BUILD_FROM
ARG BUILD_ARCH
ENV \
S6_SERVICES_GRACETIME=10000
# Install base
RUN apk add --no-cache \
openssl \
libffi \
musl \
git \
socat \
glib \
libstdc++ \
eudev-libs
RUN \
apk add --no-cache \
eudev \
eudev-libs \
git \
glib \
libffi \
libpulse \
musl \
openssl \
socat
ARG BUILD_ARCH
WORKDIR /usr/src
# Install requirements
COPY requirements.txt /usr/src/
RUN export MAKEFLAGS="-j$(nproc)" \
&& pip3 install --no-cache-dir --find-links https://wheels.hass.io/alpine-3.9/${BUILD_ARCH}/ \
-r /usr/src/requirements.txt \
&& rm -f /usr/src/requirements.txt
COPY requirements.txt .
RUN \
export MAKEFLAGS="-j$(nproc)" \
&& pip3 install --no-cache-dir --no-index --only-binary=:all: --find-links \
"https://wheels.home-assistant.io/alpine-$(cut -d '.' -f 1-2 < /etc/alpine-release)/${BUILD_ARCH}/" \
-r ./requirements.txt \
&& rm -f requirements.txt
# Install HassIO
COPY . /usr/src/hassio
RUN pip3 install --no-cache-dir -e /usr/src/hassio
# Install Home Assistant Supervisor
COPY . supervisor
RUN \
pip3 install --no-cache-dir -e ./supervisor \
&& python3 -m compileall ./supervisor/supervisor
CMD [ "python3", "-m", "hassio" ]
WORKDIR /
COPY rootfs /

811
LICENSE

@@ -1,201 +1,674 @@
The Apache License, Version 2.0 text (Copyright 2017 Pascal Vizeli) is removed and replaced with the full text of the GNU General Public License, Version 3, 29 June 2007 (Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>).
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<https://www.gnu.org/licenses/why-not-lgpl.html>.

View File

@@ -1,3 +1,3 @@
include LICENSE.md
graft hassio
graft supervisor
recursive-exclude * *.py[co]

View File

@@ -1,30 +1,26 @@
[![Build Status](https://dev.azure.com/home-assistant/Hass.io/_apis/build/status/hassio?branchName=dev)](https://dev.azure.com/home-assistant/Hass.io/_build/latest?definitionId=2&branchName=dev)
# Hass.io
# Home Assistant Supervisor
## First private cloud solution for home automation
Hass.io is a Docker-based system for managing your Home Assistant installation
and related applications. The system is controlled via Home Assistant which
communicates with the Supervisor. The Supervisor provides an API to manage the
installation. This includes changing network settings or installing
and updating software.
![](misc/hassio.png?raw=true)
Home Assistant (former Hass.io) is a container-based system for managing your
Home Assistant Core installation and related applications. The system is
controlled via Home Assistant which communicates with the Supervisor. The
Supervisor provides an API to manage the installation. This includes changing
network settings or installing and updating software.
## Installation
Installation instructions can be found at <https://home-assistant.io/hassio>.
Installation instructions can be found at https://home-assistant.io/hassio.
## Development
The development of the supervisor is a bit tricky. Not difficult but tricky.
The development of the Supervisor is not difficult but tricky.
- You can use the builder to build your supervisor: https://github.com/home-assistant/hassio-builder
- Go into a HassOS device or VM and pull your supervisor.
- Set the developer modus with cli `hassio supervisor options --channel=dev`
- You can use the builder to create your Supervisor: https://github.com/home-assistant/hassio-builder
- Access a HassOS device or VM and pull your Supervisor.
- Set the developer modus with the CLI tool: `ha supervisor options --channel=dev`
- Tag it as `homeassistant/xy-hassio-supervisor:latest`
- Restart the service like `systemctl restart hassos-supervisor | journalctl -fu hassos-supervisor`
- Restart the service with `systemctl restart hassos-supervisor | journalctl -fu hassos-supervisor`
- Test your changes
Small Bugfix or improvements, make a PR. Significant change makes first an RFC.
For small bugfixes or improvements, make a PR. For significant changes open a RFC first, please. Thanks.

azure-pipelines-ci.yml (new file, 52 lines)
View File

@@ -0,0 +1,52 @@
# https://dev.azure.com/home-assistant
trigger:
batch: true
branches:
include:
- master
- dev
pr:
- dev
variables:
- name: versionHadolint
value: "v1.16.3"
jobs:
- job: "Tox"
pool:
vmImage: "ubuntu-latest"
steps:
- script: |
sudo apt-get update
sudo apt-get install -y libpulse0 libudev1
displayName: "Install Host library"
- task: UsePythonVersion@0
displayName: "Use Python 3.7"
inputs:
versionSpec: "3.7"
- script: pip install tox
displayName: "Install Tox"
- script: tox
displayName: "Run Tox"
- job: "JQ"
pool:
vmImage: "ubuntu-latest"
steps:
- script: sudo apt-get install -y jq
displayName: "Install JQ"
- bash: |
shopt -s globstar
cat **/*.json | jq '.'
displayName: "Run JQ"
- job: "Hadolint"
pool:
vmImage: "ubuntu-latest"
steps:
- script: sudo docker pull hadolint/hadolint:$(versionHadolint)
displayName: "Install Hadolint"
- script: |
sudo docker run --rm -i \
-v $(pwd)/.hadolint.yaml:/.hadolint.yaml:ro \
hadolint/hadolint:$(versionHadolint) < Dockerfile
displayName: "Run Hadolint"

View File

@@ -0,0 +1,53 @@
# https://dev.azure.com/home-assistant
trigger:
batch: true
branches:
include:
- dev
tags:
include:
- "*"
pr: none
variables:
- name: versionBuilder
value: "7.0"
- group: docker
jobs:
- job: "VersionValidate"
pool:
vmImage: "ubuntu-latest"
steps:
- task: UsePythonVersion@0
displayName: "Use Python 3.7"
inputs:
versionSpec: "3.7"
- script: |
setup_version="$(python setup.py -V)"
branch_version="$(Build.SourceBranchName)"
if [ "${branch_version}" == "dev" ]; then
exit 0
elif [ "${setup_version}" != "${branch_version}" ]; then
echo "Version of tag ${branch_version} don't match with ${setup_version}!"
exit 1
fi
displayName: "Check version of branch/tag"
- job: "Release"
dependsOn:
- "VersionValidate"
pool:
vmImage: "ubuntu-latest"
steps:
- script: sudo docker login -u $(dockerUser) -p $(dockerPassword)
displayName: "Docker hub login"
- script: sudo docker pull homeassistant/amd64-builder:$(versionBuilder)
displayName: "Install Builder"
- script: |
sudo docker run --rm --privileged \
-v ~/.docker:/root/.docker \
-v /run/docker.sock:/run/docker.sock:rw -v $(pwd):/data:ro \
homeassistant/amd64-builder:$(versionBuilder) \
--generic $(Build.SourceBranchName) --all -t /data
displayName: "Build Release"

View File

@@ -0,0 +1,26 @@
# https://dev.azure.com/home-assistant
trigger:
batch: true
branches:
include:
- dev
pr: none
variables:
- name: versionWheels
value: '1.6.1-3.7-alpine3.11'
resources:
repositories:
- repository: azure
type: github
name: 'home-assistant/ci-azure'
endpoint: 'home-assistant'
jobs:
- template: templates/azp-job-wheels.yaml@azure
parameters:
builderVersion: '$(versionWheels)'
builderApk: 'build-base;libffi-dev;openssl-dev'
builderPip: 'Cython'
wheelsRequirement: 'requirements.txt'

View File

@@ -1,192 +0,0 @@
# https://dev.azure.com/home-assistant
trigger:
branches:
include:
- master
- dev
tags:
include:
- '*'
exclude:
- untagged*
pr:
- dev
variables:
- name: versionHadolint
value: 'v1.16.3'
- name: versionBuilder
value: '3.2'
- name: versionWheels
value: '0.3'
- group: docker
- group: wheels
jobs:
- job: 'Tox'
pool:
vmImage: 'ubuntu-latest'
steps:
- task: UsePythonVersion@0
displayName: 'Use Python 3.7'
inputs:
versionSpec: '3.7'
- script: pip install tox
displayName: 'Install Tox'
- script: tox
displayName: 'Run Tox'
- job: 'Black'
pool:
vmImage: 'ubuntu-latest'
steps:
- task: UsePythonVersion@0
displayName: 'Use Python $(python.version)'
inputs:
versionSpec: '3.7'
- script: pip install black
displayName: 'Install black'
- script: black --check hassio
displayName: 'Run Black'
- job: 'JQ'
pool:
vmImage: 'ubuntu-latest'
steps:
- script: sudo apt-get install -y jq
displayName: 'Install JQ'
- bash: |
shopt -s globstar
cat **/*.json | jq '.'
displayName: 'Run JQ'
- job: 'Hadolint'
pool:
vmImage: 'ubuntu-latest'
steps:
- script: sudo docker pull hadolint/hadolint:$(versionHadolint)
displayName: 'Install Hadolint'
- script: |
sudo docker run --rm -i \
-v $(pwd)/.hadolint.yaml:/.hadolint.yaml:ro \
hadolint/hadolint:$(versionHadolint) < Dockerfile
displayName: 'Run Hadolint'
- job: 'Wheels'
condition: eq(variables['Build.SourceBranchName'], 'dev')
timeoutInMinutes: 360
pool:
vmImage: 'ubuntu-latest'
strategy:
maxParallel: 3
matrix:
amd64:
buildArch: 'amd64'
i386:
buildArch: 'i386'
armhf:
buildArch: 'armhf'
armv7:
buildArch: 'armv7'
aarch64:
buildArch: 'aarch64'
steps:
- script: |
sudo apt-get update
sudo apt-get install -y --no-install-recommends \
qemu-user-static \
binfmt-support
sudo mount binfmt_misc -t binfmt_misc /proc/sys/fs/binfmt_misc
sudo update-binfmts --enable qemu-arm
sudo update-binfmts --enable qemu-aarch64
displayName: 'Initial cross build'
- script: |
mkdir -p .ssh
echo -e "-----BEGIN RSA PRIVATE KEY-----\n$(wheelsSSH)\n-----END RSA PRIVATE KEY-----" >> .ssh/id_rsa
ssh-keyscan -H $(wheelsHost) >> .ssh/known_hosts
chmod 600 .ssh/*
displayName: 'Install ssh key'
- script: sudo docker pull homeassistant/$(buildArch)-wheels:$(versionWheels)
displayName: 'Install wheels builder'
- script: |
sudo docker run --rm -v $(pwd):/data:ro -v $(pwd)/.ssh:/root/.ssh:rw \
homeassistant/$(buildArch)-wheels:$(versionWheels) \
--apk "build-base;libffi-dev;openssl-dev" \
--index https://wheels.hass.io \
--requirement requirements.txt \
--upload rsync \
--remote wheels@$(wheelsHost):/opt/wheels
displayName: 'Run wheels build'
- job: 'ReleaseDEV'
condition: and(eq(variables['Build.SourceBranchName'], 'dev'), succeeded('JQ'), succeeded('Tox'), succeeded('Hadolint'), succeeded('Wheels'))
dependsOn:
- 'JQ'
- 'Tox'
- 'Hadolint'
- 'Wheels'
pool:
vmImage: 'ubuntu-latest'
steps:
- script: sudo docker login -u $(dockerUser) -p $(dockerPassword)
displayName: 'Docker hub login'
- script: sudo docker pull homeassistant/amd64-builder:$(versionBuilder)
displayName: 'Install Builder'
- script: |
sudo docker run --rm --privileged \
-v ~/.docker:/root/.docker \
-v /run/docker.sock:/run/docker.sock:rw -v $(pwd):/data:ro \
homeassistant/amd64-builder:$(versionBuilder) \
--supervisor --all -t /data --version dev --docker-hub homeassistant
displayName: 'Build DEV'
- job: 'VersionValidate'
condition: startsWith(variables['Build.SourceBranch'], 'refs/tags')
pool:
vmImage: 'ubuntu-latest'
steps:
- task: UsePythonVersion@0
displayName: 'Use Python 3.7'
inputs:
versionSpec: '3.7'
- script: |
setup_version="$(python setup.py -V)"
branch_version="$(Build.SourceBranchName)"
if [ "${setup_version}" != "${branch_version}" ]; then
echo "Version of tag ${branch_version} don't match with ${setup_version}!"
exit 1
fi
displayName: 'Check version of branch/tag'
- job: 'Release'
condition: and(startsWith(variables['Build.SourceBranch'], 'refs/tags'), succeeded('JQ'), succeeded('Tox'), succeeded('Hadolint'), succeeded('VersionValidate'))
dependsOn:
- 'JQ'
- 'Tox'
- 'Hadolint'
- 'VersionValidate'
pool:
vmImage: 'ubuntu-latest'
steps:
- script: sudo docker login -u $(dockerUser) -p $(dockerPassword)
displayName: 'Docker hub login'
- script: sudo docker pull homeassistant/amd64-builder:$(versionBuilder)
displayName: 'Install Builder'
- script: |
sudo docker run --rm --privileged \
-v ~/.docker:/root/.docker \
-v /run/docker.sock:/run/docker.sock:rw -v $(pwd):/data:ro \
homeassistant/amd64-builder:$(versionBuilder) \
--supervisor --all -t /data --docker-hub homeassistant
displayName: 'Build Release'

build.json (new file, 13 lines)
View File

@@ -0,0 +1,13 @@
{
"image": "homeassistant/{arch}-hassio-supervisor",
"build_from": {
"aarch64": "homeassistant/aarch64-base-python:3.7-alpine3.11",
"armhf": "homeassistant/armhf-base-python:3.7-alpine3.11",
"armv7": "homeassistant/armv7-base-python:3.7-alpine3.11",
"amd64": "homeassistant/amd64-base-python:3.7-alpine3.11",
"i386": "homeassistant/i386-base-python:3.7-alpine3.11"
},
"labels": {
"io.hass.type": "supervisor"
}
}
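
The build_from map above pairs each target architecture with a base image, and the {arch} placeholder in the image name is filled in per build. A minimal sketch of that substitution (the local file path and variable names are assumptions for this example only):

    import json

    # Load the build configuration shown above (saved locally as build.json for this sketch).
    with open("build.json") as fp:
        build_cfg = json.load(fp)

    arch = "amd64"  # one of the keys listed under "build_from"
    image = build_cfg["image"].format(arch=arch)  # -> homeassistant/amd64-hassio-supervisor
    base = build_cfg["build_from"][arch]          # -> homeassistant/amd64-base-python:3.7-alpine3.11
    print(image, "built from", base)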

View File

@@ -1 +0,0 @@
"""Init file for Hass.io."""

View File

@@ -1,40 +0,0 @@
"""Init file for Hass.io hardware RESTful API."""
import logging
from .utils import api_process
from ..const import (
ATTR_SERIAL,
ATTR_DISK,
ATTR_GPIO,
ATTR_AUDIO,
ATTR_INPUT,
ATTR_OUTPUT,
)
from ..coresys import CoreSysAttributes
_LOGGER = logging.getLogger(__name__)
class APIHardware(CoreSysAttributes):
"""Handle RESTful API for hardware functions."""
@api_process
async def info(self, request):
"""Show hardware info."""
return {
ATTR_SERIAL: list(self.sys_hardware.serial_devices),
ATTR_INPUT: list(self.sys_hardware.input_devices),
ATTR_DISK: list(self.sys_hardware.disk_devices),
ATTR_GPIO: list(self.sys_hardware.gpio_devices),
ATTR_AUDIO: self.sys_hardware.audio_devices,
}
@api_process
async def audio(self, request):
"""Show ALSA audio devices."""
return {
ATTR_AUDIO: {
ATTR_INPUT: self.sys_host.alsa.input_devices,
ATTR_OUTPUT: self.sys_host.alsa.output_devices,
}
}
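
For orientation only, the dictionary assembled by APIHardware.info() has roughly the shape below; the device names are invented, it assumes the ATTR_*/CHAN_* constants resolve to the lowercase keys used here, and api_process additionally wraps the result in the API's standard response envelope:

    # Illustrative data in the shape returned by APIHardware.info() (values are made up)
    hardware_info = {
        "serial": ["/dev/ttyUSB0", "/dev/ttyAMA0"],
        "input": ["Logitech USB Keyboard"],
        "disk": ["/dev/sda"],
        "gpio": ["gpiochip0", "gpio18"],
        "audio": {
            "0": {
                "name": "bcm2835 ALSA",
                "type": "ALSA",
                "devices": [{"chan_id": "0", "chan_type": "digital audio playback"}],
            }
        },
    }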

View File

@@ -1,57 +0,0 @@
"""Init file for Hass.io HassOS RESTful API."""
import asyncio
import logging
from typing import Any, Awaitable, Dict
import voluptuous as vol
from aiohttp import web
from ..const import (
ATTR_BOARD,
ATTR_VERSION,
ATTR_VERSION_CLI,
ATTR_VERSION_CLI_LATEST,
ATTR_VERSION_LATEST,
)
from ..coresys import CoreSysAttributes
from .utils import api_process, api_validate
_LOGGER = logging.getLogger(__name__)
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
class APIHassOS(CoreSysAttributes):
"""Handle RESTful API for HassOS functions."""
@api_process
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return HassOS information."""
return {
ATTR_VERSION: self.sys_hassos.version,
ATTR_VERSION_CLI: self.sys_hassos.version_cli,
ATTR_VERSION_LATEST: self.sys_hassos.version_latest,
ATTR_VERSION_CLI_LATEST: self.sys_hassos.version_cli_latest,
ATTR_BOARD: self.sys_hassos.board,
}
@api_process
async def update(self, request: web.Request) -> None:
"""Update HassOS."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_hassos.version_latest)
await asyncio.shield(self.sys_hassos.update(version))
@api_process
async def update_cli(self, request: web.Request) -> None:
"""Update HassOS CLI."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_hassos.version_cli_latest)
await asyncio.shield(self.sys_hassos.update_cli(version))
@api_process
def config_sync(self, request: web.Request) -> Awaitable[None]:
"""Trigger config reload on HassOS."""
return asyncio.shield(self.sys_hassos.config_sync())

View File

@@ -1 +0,0 @@
(window.webpackJsonp=window.webpackJsonp||[]).push([[7],{102:function(n,r,t){"use strict";t.r(r),t.d(r,"marked",function(){return a}),t.d(r,"filterXSS",function(){return c});var e=t(124),i=t.n(e),o=t(126),u=t.n(o),a=i.a,c=u.a}}]);

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -1 +0,0 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.69dd3afa8a315523db98.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

View File

@@ -1 +0,0 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.807ac4267d577a6403cd.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -1 +0,0 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.b060a768bba881c3480a.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -1 +0,0 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.cff54ae25a1e80506a71.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

View File

@@ -1 +0,0 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.d1638e8b6cd63427acdd.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

View File

@@ -1 +0,0 @@
{"version":3,"sources":[],"names":[],"mappings":"","file":"chunk.fa1d90049d43bcf36e42.js","sourceRoot":""}

File diff suppressed because one or more lines are too long

View File

@@ -1 +0,0 @@
!function(e){function n(n){for(var t,o,a=n[0],i=n[1],f=0,c=[];f<a.length;f++)o=a[f],r[o]&&c.push(r[o][0]),r[o]=0;for(t in i)Object.prototype.hasOwnProperty.call(i,t)&&(e[t]=i[t]);for(u&&u(n);c.length;)c.shift()()}var t={},r={4:0};function o(n){if(t[n])return t[n].exports;var r=t[n]={i:n,l:!1,exports:{}};return e[n].call(r.exports,r,r.exports,o),r.l=!0,r.exports}o.e=function(e){var n=[],t=r[e];if(0!==t)if(t)n.push(t[2]);else{var a=new Promise(function(n,o){t=r[e]=[n,o]});n.push(t[2]=a);var i,f=document.createElement("script");f.charset="utf-8",f.timeout=120,o.nc&&f.setAttribute("nonce",o.nc),f.src=function(e){return o.p+"chunk."+{0:"cff54ae25a1e80506a71",1:"fa1d90049d43bcf36e42",2:"d1638e8b6cd63427acdd",3:"807ac4267d577a6403cd",5:"a1122a4f9ebd72cb73ff",6:"bef066214c930e19d0e8",7:"0f38aa06224ef471d7a4",8:"8c6a9e6fd2862fc2fd9f",9:"69dd3afa8a315523db98",10:"320d5a8af6741fdbfaaf",11:"b060a768bba881c3480a",12:"ff337d8fd6f580702170",13:"439f972b31b285e49c06"}[e]+".js"}(e),i=function(n){f.onerror=f.onload=null,clearTimeout(u);var t=r[e];if(0!==t){if(t){var o=n&&("load"===n.type?"missing":n.type),a=n&&n.target&&n.target.src,i=new Error("Loading chunk "+e+" failed.\n("+o+": "+a+")");i.type=o,i.request=a,t[1](i)}r[e]=void 0}};var u=setTimeout(function(){i({type:"timeout",target:f})},12e4);f.onerror=f.onload=i,document.head.appendChild(f)}return Promise.all(n)},o.m=e,o.c=t,o.d=function(e,n,t){o.o(e,n)||Object.defineProperty(e,n,{enumerable:!0,get:t})},o.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},o.t=function(e,n){if(1&n&&(e=o(e)),8&n)return e;if(4&n&&"object"==typeof e&&e&&e.__esModule)return e;var t=Object.create(null);if(o.r(t),Object.defineProperty(t,"default",{enumerable:!0,value:e}),2&n&&"string"!=typeof e)for(var r in e)o.d(t,r,function(n){return e[n]}.bind(null,r));return t},o.n=function(e){var n=e&&e.__esModule?function(){return e.default}:function(){return e};return o.d(n,"a",n),n},o.o=function(e,n){return Object.prototype.hasOwnProperty.call(e,n)},o.p="/api/hassio/app/",o.oe=function(e){throw console.error(e),e};var a=window.webpackJsonp=window.webpackJsonp||[],i=a.push.bind(a);a.push=n,a=a.slice();for(var f=0;f<a.length;f++)n(a[f]);var u=i;o(o.s=0)}([function(e,n,t){window.loadES5Adapter().then(function(){Promise.all([t.e(1),t.e(5)]).then(t.bind(null,2)),Promise.all([t.e(1),t.e(9),t.e(6)]).then(t.bind(null,1))});var r=document.createElement("style");r.innerHTML="\nbody {\n font-family: Roboto, sans-serif;\n -moz-osx-font-smoothing: grayscale;\n -webkit-font-smoothing: antialiased;\n font-weight: 400;\n margin: 0;\n padding: 0;\n height: 100vh;\n}\n",document.head.appendChild(r)}]);

Binary file not shown.

File diff suppressed because one or more lines are too long

View File

@@ -1,49 +0,0 @@
{
"raspberrypi": [
"armhf"
],
"raspberrypi2": [
"armv7",
"armhf"
],
"raspberrypi3": [
"armv7",
"armhf"
],
"raspberrypi3-64": [
"aarch64",
"armv7",
"armhf"
],
"tinker": [
"armv7",
"armhf"
],
"odroid-c2": [
"aarch64"
],
"odroid-xu": [
"armv7",
"armhf"
],
"orangepi-prime": [
"aarch64"
],
"qemux86": [
"i386"
],
"qemux86-64": [
"amd64",
"i386"
],
"qemuarm": [
"armhf"
],
"qemuarm-64": [
"aarch64"
],
"intel-nuc": [
"amd64",
"i386"
]
}

View File

@@ -1,158 +0,0 @@
"""Main file for Hass.io."""
from contextlib import suppress
import asyncio
import logging
import async_timeout
from .coresys import CoreSysAttributes
from .const import (
STARTUP_SYSTEM,
STARTUP_SERVICES,
STARTUP_APPLICATION,
STARTUP_INITIALIZE,
)
from .exceptions import HassioError, HomeAssistantError
_LOGGER = logging.getLogger(__name__)
class HassIO(CoreSysAttributes):
"""Main object of Hass.io."""
def __init__(self, coresys):
"""Initialize Hass.io object."""
self.coresys = coresys
async def setup(self):
"""Setup HassIO orchestration."""
# Load Supervisor
await self.sys_supervisor.load()
# Load DBus
await self.sys_dbus.load()
# Load Host
await self.sys_host.load()
# Load Home Assistant
await self.sys_homeassistant.load()
# Load CPU/Arch
await self.sys_arch.load()
# Load HassOS
await self.sys_hassos.load()
# Load Stores
await self.sys_store.load()
# Load Add-ons
await self.sys_addons.load()
# rest api views
await self.sys_api.load()
# load last available data
await self.sys_updater.load()
# load last available data
await self.sys_snapshots.load()
# load services
await self.sys_services.load()
# Load discovery
await self.sys_discovery.load()
# Load ingress
await self.sys_ingress.load()
# start dns forwarding
self.sys_create_task(self.sys_dns.start())
async def start(self):
"""Start Hass.io orchestration."""
# on release channel, try update itself
if self.sys_supervisor.need_update:
if self.sys_dev:
_LOGGER.warning("Ignore Hass.io updates on dev!")
elif await self.sys_supervisor.update():
return
# start api
await self.sys_api.start()
# start addon mark as initialize
await self.sys_addons.boot(STARTUP_INITIALIZE)
try:
# HomeAssistant is already running / supervisor have only reboot
if self.sys_hardware.last_boot == self.sys_config.last_boot:
_LOGGER.info("Hass.io reboot detected")
return
# reset register services / discovery
self.sys_services.reset()
# start addon mark as system
await self.sys_addons.boot(STARTUP_SYSTEM)
# start addon mark as services
await self.sys_addons.boot(STARTUP_SERVICES)
# run HomeAssistant
if self.sys_homeassistant.boot:
with suppress(HomeAssistantError):
await self.sys_homeassistant.start()
# start addon mark as application
await self.sys_addons.boot(STARTUP_APPLICATION)
# store new last boot
self.sys_config.last_boot = self.sys_hardware.last_boot
self.sys_config.save_data()
finally:
# Add core tasks into scheduler
await self.sys_tasks.load()
# If landingpage / run upgrade in background
if self.sys_homeassistant.version == "landingpage":
self.sys_create_task(self.sys_homeassistant.install())
_LOGGER.info("Hass.io is up and running")
async def stop(self):
"""Stop a running orchestration."""
# don't process scheduler anymore
self.sys_scheduler.suspend = True
# process async stop tasks
try:
with async_timeout.timeout(10):
await asyncio.wait(
[
self.sys_api.stop(),
self.sys_dns.stop(),
self.sys_websession.close(),
self.sys_websession_ssl.close(),
self.sys_ingress.unload(),
]
)
except asyncio.TimeoutError:
_LOGGER.warning("Force Shutdown!")
_LOGGER.info("Hass.io is down")
async def shutdown(self):
"""Shutdown all running containers in correct order."""
await self.sys_addons.shutdown(STARTUP_APPLICATION)
# Close Home Assistant
with suppress(HassioError):
await self.sys_homeassistant.stop()
await self.sys_addons.shutdown(STARTUP_SERVICES)
await self.sys_addons.shutdown(STARTUP_SYSTEM)
await self.sys_addons.shutdown(STARTUP_INITIALIZE)

View File

@@ -1,39 +0,0 @@
"""D-Bus interface objects."""
from .systemd import Systemd
from .hostname import Hostname
from .rauc import Rauc
from ..coresys import CoreSysAttributes
class DBusManager(CoreSysAttributes):
"""A DBus Interface handler."""
def __init__(self, coresys):
"""Initialize D-Bus interface."""
self.coresys = coresys
self._systemd = Systemd()
self._hostname = Hostname()
self._rauc = Rauc()
@property
def systemd(self):
"""Return the systemd interface."""
return self._systemd
@property
def hostname(self):
"""Return the hostname interface."""
return self._hostname
@property
def rauc(self):
"""Return the rauc interface."""
return self._rauc
async def load(self):
"""Connect interfaces to D-Bus."""
await self.systemd.connect()
await self.hostname.connect()
await self.rauc.connect()

View File

@@ -1,39 +0,0 @@
"""D-Bus interface for hostname."""
import logging
from .interface import DBusInterface
from .utils import dbus_connected
from ..exceptions import DBusError
from ..utils.gdbus import DBus
_LOGGER = logging.getLogger(__name__)
DBUS_NAME = "org.freedesktop.hostname1"
DBUS_OBJECT = "/org/freedesktop/hostname1"
class Hostname(DBusInterface):
"""Handle D-Bus interface for hostname/system."""
async def connect(self):
"""Connect to system's D-Bus."""
try:
self.dbus = await DBus.connect(DBUS_NAME, DBUS_OBJECT)
except DBusError:
_LOGGER.warning("Can't connect to hostname")
@dbus_connected
def set_static_hostname(self, hostname):
"""Change local hostname.
Return a coroutine.
"""
return self.dbus.SetStaticHostname(hostname, False)
@dbus_connected
def get_properties(self):
"""Return local host informations.
Return a coroutine.
"""
return self.dbus.get_properties(DBUS_NAME)

View File

@@ -1,55 +0,0 @@
"""D-Bus interface for rauc."""
import logging
from .interface import DBusInterface
from .utils import dbus_connected
from ..exceptions import DBusError
from ..utils.gdbus import DBus
_LOGGER = logging.getLogger(__name__)
DBUS_NAME = "de.pengutronix.rauc"
DBUS_OBJECT = "/"
class Rauc(DBusInterface):
"""Handle D-Bus interface for rauc."""
async def connect(self):
"""Connect to D-Bus."""
try:
self.dbus = await DBus.connect(DBUS_NAME, DBUS_OBJECT)
except DBusError:
_LOGGER.warning("Can't connect to rauc")
@dbus_connected
def install(self, raucb_file):
"""Install rauc bundle file.
Return a coroutine.
"""
return self.dbus.Installer.Install(raucb_file)
@dbus_connected
def get_slot_status(self):
"""Get slot status.
Return a coroutine.
"""
return self.dbus.Installer.GetSlotStatus()
@dbus_connected
def get_properties(self):
"""Return rauc informations.
Return a coroutine.
"""
return self.dbus.get_properties(f"{DBUS_NAME}.Installer")
@dbus_connected
def signal_completed(self):
"""Return a signal wrapper for completed signal.
Return a coroutine.
"""
return self.dbus.wait_signal(f"{DBUS_NAME}.Installer.Completed")

View File

@@ -1,38 +0,0 @@
"""HassOS Cli docker object."""
import logging
import docker
from ..coresys import CoreSysAttributes
from .interface import DockerInterface
_LOGGER = logging.getLogger(__name__)
class DockerHassOSCli(DockerInterface, CoreSysAttributes):
"""Docker Hass.io wrapper for HassOS Cli."""
@property
def image(self):
"""Return name of HassOS CLI image."""
return f"homeassistant/{self.sys_arch.supervisor}-hassio-cli"
def _stop(self, remove_container=True):
"""Don't need stop."""
return True
def _attach(self):
"""Attach to running Docker container.
Need run inside executor.
"""
try:
image = self.sys_docker.images.get(self.image)
except docker.errors.DockerException:
_LOGGER.warning("Can't find a HassOS CLI %s", self.image)
else:
self._meta = image.attrs
_LOGGER.info(
"Found HassOS CLI %s with version %s", self.image, self.version
)

View File

@@ -1,51 +0,0 @@
"""Init file for Hass.io Docker object."""
from ipaddress import IPv4Address
import logging
import os
import docker
from ..coresys import CoreSysAttributes
from ..exceptions import DockerAPIError
from .interface import DockerInterface
_LOGGER = logging.getLogger(__name__)
class DockerSupervisor(DockerInterface, CoreSysAttributes):
"""Docker Hass.io wrapper for Supervisor."""
@property
def name(self) -> str:
"""Return name of Docker container."""
return os.environ["SUPERVISOR_NAME"]
@property
def ip_address(self) -> IPv4Address:
"""Return IP address of this container."""
return self.sys_docker.network.supervisor
def _attach(self) -> None:
"""Attach to running docker container.
Need run inside executor.
"""
try:
docker_container = self.sys_docker.containers.get(self.name)
except docker.errors.DockerException:
raise DockerAPIError() from None
self._meta = docker_container.attrs
_LOGGER.info(
"Attach to Supervisor %s with version %s", self.image, self.version
)
# If already attach
if docker_container in self.sys_docker.network.containers:
return
# Attach to network
_LOGGER.info("Connect Supervisor to Hass.io Network")
self.sys_docker.network.attach_container(
docker_container, alias=["hassio"], ipv4=self.sys_docker.network.supervisor
)

View File

@@ -1,136 +0,0 @@
"""Host Audio support."""
import logging
import json
from pathlib import Path
from string import Template
import attr
from ..const import ATTR_INPUT, ATTR_OUTPUT, ATTR_DEVICES, ATTR_NAME, CHAN_ID, CHAN_TYPE
from ..coresys import CoreSysAttributes
_LOGGER = logging.getLogger(__name__)
# pylint: disable=invalid-name
DefaultConfig = attr.make_class("DefaultConfig", ["input", "output"])
class AlsaAudio(CoreSysAttributes):
"""Handle Audio ALSA host data."""
def __init__(self, coresys):
"""Initialize ALSA audio system."""
self.coresys = coresys
self._data = {ATTR_INPUT: {}, ATTR_OUTPUT: {}}
self._cache = 0
self._default = None
@property
def input_devices(self):
"""Return list of ALSA input devices."""
self._update_device()
return self._data[ATTR_INPUT]
@property
def output_devices(self):
"""Return list of ALSA output devices."""
self._update_device()
return self._data[ATTR_OUTPUT]
def _update_device(self):
"""Update Internal device DB."""
current_id = hash(frozenset(self.sys_hardware.audio_devices))
# Need rebuild?
if current_id == self._cache:
return
# Clean old stuff
self._data[ATTR_INPUT].clear()
self._data[ATTR_OUTPUT].clear()
# Init database
_LOGGER.info("Update ALSA device list")
database = self._audio_database()
# Process devices
for dev_id, dev_data in self.sys_hardware.audio_devices.items():
for chan_info in dev_data[ATTR_DEVICES]:
chan_id = chan_info[CHAN_ID]
chan_type = chan_info[CHAN_TYPE]
alsa_id = f"{dev_id},{chan_id}"
dev_name = dev_data[ATTR_NAME]
# Lookup type
if chan_type.endswith("playback"):
key = ATTR_OUTPUT
elif chan_type.endswith("capture"):
key = ATTR_INPUT
else:
_LOGGER.warning("Unknown channel type: %s", chan_type)
continue
# Use name from DB or a generic name
self._data[key][alsa_id] = (
database.get(self.sys_machine, {})
.get(dev_name, {})
.get(alsa_id, f"{dev_name}: {chan_id}")
)
self._cache = current_id
@staticmethod
def _audio_database():
"""Read local json audio data into dict."""
json_file = Path(__file__).parent.joinpath("data/audiodb.json")
try:
# pylint: disable=no-member
with json_file.open("r") as database:
return json.loads(database.read())
except (ValueError, OSError) as err:
_LOGGER.warning("Can't read audio DB: %s", err)
return {}
@property
def default(self):
"""Generate ALSA default setting."""
# Init defaults
if self._default is None:
database = self._audio_database()
alsa_input = database.get(self.sys_machine, {}).get(ATTR_INPUT)
alsa_output = database.get(self.sys_machine, {}).get(ATTR_OUTPUT)
self._default = DefaultConfig(alsa_input, alsa_output)
# Search exists/new output
if self._default.output is None and self.output_devices:
self._default.output = next(iter(self.output_devices))
_LOGGER.info("Detect output device %s", self._default.output)
# Search exists/new input
if self._default.input is None and self.input_devices:
self._default.input = next(iter(self.input_devices))
_LOGGER.info("Detect input device %s", self._default.input)
return self._default
def asound(self, alsa_input=None, alsa_output=None):
"""Generate an asound data."""
alsa_input = alsa_input or self.default.input
alsa_output = alsa_output or self.default.output
# Read Template
asound_file = Path(__file__).parent.joinpath("data/asound.tmpl")
try:
# pylint: disable=no-member
with asound_file.open("r") as asound:
asound_data = asound.read()
except OSError as err:
_LOGGER.error("Can't read asound.tmpl: %s", err)
return ""
# Process Template
asound_template = Template(asound_data)
return asound_template.safe_substitute(input=alsa_input, output=alsa_output)

View File

@@ -1,17 +0,0 @@
pcm.!default {
type asym
capture.pcm "mic"
playback.pcm "speaker"
}
pcm.mic {
type plug
slave {
pcm "hw:$input"
}
}
pcm.speaker {
type plug
slave {
pcm "hw:$output"
}
}
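
AlsaAudio.asound() above fills this template with string.Template.safe_substitute. A standalone sketch of that step, with the template text inlined and example device IDs (the "1,0"/"0,0" values and the target path are illustrative assumptions):

    from string import Template

    ASOUND_TMPL = """\
    pcm.!default {
        type asym
        capture.pcm "mic"
        playback.pcm "speaker"
    }
    pcm.mic {
        type plug
        slave { pcm "hw:$input" }
    }
    pcm.speaker {
        type plug
        slave { pcm "hw:$output" }
    }
    """

    # Device IDs follow the "card,channel" pattern used elsewhere in the codebase.
    asound_conf = Template(ASOUND_TMPL).safe_substitute(input="1,0", output="0,0")
    print(asound_conf)  # could then be written out as an asound.conf file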

View File

@@ -1,18 +0,0 @@
{
"raspberrypi3": {
"bcm2835 - bcm2835 ALSA": {
"0,0": "Raspberry Jack",
"0,1": "Raspberry HDMI"
},
"output": "0,0",
"input": null
},
"raspberrypi2": {
"output": "0,0",
"input": null
},
"raspberrypi": {
"output": "0,0",
"input": null
}
}

View File

@@ -1,58 +0,0 @@
"""Info control for host."""
import logging
from ..coresys import CoreSysAttributes
from ..exceptions import HassioError, HostNotSupportedError
_LOGGER = logging.getLogger(__name__)
class InfoCenter(CoreSysAttributes):
"""Handle local system information controls."""
def __init__(self, coresys):
"""Initialize system center handling."""
self.coresys = coresys
self._data = {}
@property
def hostname(self):
"""Return local hostname."""
return self._data.get("StaticHostname") or None
@property
def chassis(self):
"""Return local chassis type."""
return self._data.get("Chassis") or None
@property
def deployment(self):
"""Return local deployment type."""
return self._data.get("Deployment") or None
@property
def kernel(self):
"""Return local kernel version."""
return self._data.get("KernelRelease") or None
@property
def operating_system(self):
"""Return local operating system."""
return self._data.get("OperatingSystemPrettyName") or None
@property
def cpe(self):
"""Return local CPE."""
return self._data.get("OperatingSystemCPEName") or None
async def update(self):
"""Update properties over dbus."""
if not self.sys_dbus.hostname.is_connected:
_LOGGER.error("No hostname D-Bus connection available")
raise HostNotSupportedError()
_LOGGER.info("Update local host information")
try:
self._data = await self.sys_dbus.hostname.get_properties()
except HassioError:
_LOGGER.warning("Can't update host system information!")

View File

@@ -1 +0,0 @@
"""Special object and tools for Hass.io."""

View File

@@ -1,135 +0,0 @@
"""Read hardware info from system."""
from datetime import datetime
import logging
from pathlib import Path
import re
import pyudev
from ..const import ATTR_NAME, ATTR_TYPE, ATTR_DEVICES, CHAN_ID, CHAN_TYPE
_LOGGER = logging.getLogger(__name__)
ASOUND_CARDS = Path("/proc/asound/cards")
RE_CARDS = re.compile(r"(\d+) \[(\w*) *\]: (.*\w)")
ASOUND_DEVICES = Path("/proc/asound/devices")
RE_DEVICES = re.compile(r"\[.*(\d+)- (\d+).*\]: ([\w ]*)")
PROC_STAT = Path("/proc/stat")
RE_BOOT_TIME = re.compile(r"btime (\d+)")
GPIO_DEVICES = Path("/sys/class/gpio")
SOC_DEVICES = Path("/sys/devices/platform/soc")
RE_TTY = re.compile(r"tty[A-Z]+")
class Hardware:
"""Representation of an interface to procfs, sysfs and udev."""
def __init__(self):
"""Init hardware object."""
self.context = pyudev.Context()
@property
def serial_devices(self):
"""Return all serial and connected devices."""
dev_list = set()
for device in self.context.list_devices(subsystem="tty"):
if "ID_VENDOR" in device or RE_TTY.search(device.device_node):
dev_list.add(device.device_node)
return dev_list
@property
def input_devices(self):
"""Return all input devices."""
dev_list = set()
for device in self.context.list_devices(subsystem="input"):
if "NAME" in device:
dev_list.add(device["NAME"].replace('"', ""))
return dev_list
@property
def disk_devices(self):
"""Return all disk devices."""
dev_list = set()
for device in self.context.list_devices(subsystem="block"):
if device.device_node.startswith("/dev/sd"):
dev_list.add(device.device_node)
return dev_list
@property
def support_audio(self):
"""Return True if the system have audio support."""
return bool(self.audio_devices)
@property
def audio_devices(self):
"""Return all available audio interfaces."""
if not ASOUND_CARDS.exists():
_LOGGER.debug("No audio devices found")
return {}
try:
cards = ASOUND_CARDS.read_text()
devices = ASOUND_DEVICES.read_text()
except OSError as err:
_LOGGER.error("Can't read asound data: %s", err)
return {}
audio_list = {}
# parse cards
for match in RE_CARDS.finditer(cards):
audio_list[match.group(1)] = {
ATTR_NAME: match.group(3),
ATTR_TYPE: match.group(2),
ATTR_DEVICES: [],
}
# parse devices
for match in RE_DEVICES.finditer(devices):
try:
audio_list[match.group(1)][ATTR_DEVICES].append(
{CHAN_ID: match.group(2), CHAN_TYPE: match.group(3)}
)
except KeyError:
_LOGGER.warning("Wrong audio device found %s", match.group(0))
continue
return audio_list
@property
def support_gpio(self):
"""Return True if device support GPIOs."""
return SOC_DEVICES.exists() and GPIO_DEVICES.exists()
@property
def gpio_devices(self):
"""Return list of GPIO interface on device."""
dev_list = set()
for interface in GPIO_DEVICES.glob("gpio*"):
dev_list.add(interface.name)
return dev_list
@property
def last_boot(self):
"""Return last boot time."""
try:
with PROC_STAT.open("r") as stat_file:
stats = stat_file.read()
except OSError as err:
_LOGGER.error("Can't read stat data: %s", err)
return None
# parse stat file
found = RE_BOOT_TIME.search(stats)
if not found:
_LOGGER.error("Can't found last boot time!")
return None
return datetime.utcfromtimestamp(int(found.group(1)))

View File

@@ -1,12 +0,0 @@
"""Validate services schema."""
import voluptuous as vol
from ..utils.validate import schema_or
from .const import SERVICE_MQTT
from .modules.mqtt import SCHEMA_CONFIG_MQTT
SCHEMA_SERVICES_CONFIG = vol.Schema(
{vol.Optional(SERVICE_MQTT, default=dict): schema_or(SCHEMA_CONFIG_MQTT)},
extra=vol.REMOVE_EXTRA,
)

View File

@@ -1,121 +0,0 @@
"""Fetch last versions from webserver."""
import asyncio
from contextlib import suppress
from datetime import timedelta
import json
import logging
import aiohttp
from .const import (
URL_HASSIO_VERSION,
FILE_HASSIO_UPDATER,
ATTR_HOMEASSISTANT,
ATTR_HASSIO,
ATTR_CHANNEL,
ATTR_HASSOS,
ATTR_HASSOS_CLI,
)
from .coresys import CoreSysAttributes
from .utils import AsyncThrottle
from .utils.json import JsonConfig
from .validate import SCHEMA_UPDATER_CONFIG
from .exceptions import HassioUpdaterError
_LOGGER = logging.getLogger(__name__)
class Updater(JsonConfig, CoreSysAttributes):
"""Fetch last versions from version.json."""
def __init__(self, coresys):
"""Initialize updater."""
super().__init__(FILE_HASSIO_UPDATER, SCHEMA_UPDATER_CONFIG)
self.coresys = coresys
async def load(self):
"""Update internal data."""
with suppress(HassioUpdaterError):
await self.fetch_data()
async def reload(self):
"""Update internal data."""
with suppress(HassioUpdaterError):
await self.fetch_data()
@property
def version_homeassistant(self):
"""Return last version of Home Assistant."""
return self._data.get(ATTR_HOMEASSISTANT)
@property
def version_hassio(self):
"""Return last version of Hass.io."""
return self._data.get(ATTR_HASSIO)
@property
def version_hassos(self):
"""Return last version of HassOS."""
return self._data.get(ATTR_HASSOS)
@property
def version_hassos_cli(self):
"""Return last version of HassOS cli."""
return self._data.get(ATTR_HASSOS_CLI)
@property
def channel(self):
"""Return upstream channel of Hass.io instance."""
return self._data[ATTR_CHANNEL]
@channel.setter
def channel(self, value):
"""Set upstream mode."""
self._data[ATTR_CHANNEL] = value
@AsyncThrottle(timedelta(seconds=60))
async def fetch_data(self):
"""Fetch current versions from Github.
Is a coroutine.
"""
url = URL_HASSIO_VERSION.format(channel=self.channel)
machine = self.sys_machine or "default"
board = self.sys_hassos.board
try:
_LOGGER.info("Fetch update data from %s", url)
async with self.sys_websession.get(url, timeout=10) as request:
data = await request.json(content_type=None)
except (aiohttp.ClientError, asyncio.TimeoutError) as err:
_LOGGER.warning("Can't fetch versions from %s: %s", url, err)
raise HassioUpdaterError() from None
except json.JSONDecodeError as err:
_LOGGER.warning("Can't parse versions from %s: %s", url, err)
raise HassioUpdaterError() from None
# data valid?
if not data or data.get(ATTR_CHANNEL) != self.channel:
_LOGGER.warning("Invalid data from %s", url)
raise HassioUpdaterError() from None
try:
# update supervisor version
self._data[ATTR_HASSIO] = data["supervisor"]
# update Home Assistant version
self._data[ATTR_HOMEASSISTANT] = data["homeassistant"][machine]
# update hassos version
if self.sys_hassos.available and board:
self._data[ATTR_HASSOS] = data["hassos"][board]
self._data[ATTR_HASSOS_CLI] = data["hassos-cli"]
except KeyError as err:
_LOGGER.warning("Can't process version data: %s", err)
raise HassioUpdaterError() from None
else:
self.save_data()
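
For reference, fetch_data() above only requires the version file to carry the keys it reads. A hypothetical payload and the resulting lookups (all values are made up; only the key structure mirrors the code):

    # Hypothetical version data in the shape consumed by Updater.fetch_data()
    data = {
        "channel": "stable",
        "supervisor": "176",
        "homeassistant": {"default": "0.93.2", "raspberrypi3": "0.93.2"},
        "hassos": {"rpi3": "2.12"},
        "hassos-cli": "9",
    }

    machine = "default"  # fallback used when sys_machine is unset
    board = "rpi3"       # only consulted when a HassOS board is available

    latest_supervisor = data["supervisor"]
    latest_homeassistant = data["homeassistant"][machine]
    latest_hassos = data["hassos"][board]
    latest_hassos_cli = data["hassos-cli"]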

View File

@@ -1,55 +0,0 @@
"""Tools file for Hass.io."""
import logging
import re
from datetime import datetime
_LOGGER = logging.getLogger(__name__)
RE_STRING = re.compile(r"\x1b(\[.*?[@-~]|\].*?(\x07|\x1b\\))")
def convert_to_ascii(raw) -> str:
"""Convert binary to ascii and remove colors."""
return RE_STRING.sub("", raw.decode())
def process_lock(method):
"""Wrap function with only run once."""
async def wrap_api(api, *args, **kwargs):
"""Return api wrapper."""
if api.lock.locked():
_LOGGER.error(
"Can't execute %s while a task is in progress", method.__name__
)
return False
async with api.lock:
return await method(api, *args, **kwargs)
return wrap_api
class AsyncThrottle:
"""
Decorator that prevents a function from being called more than once every
time period.
"""
def __init__(self, delta):
"""Initialize async throttle."""
self.throttle_period = delta
self.time_of_last_call = datetime.min
def __call__(self, method):
"""Throttle function"""
async def wrapper(*args, **kwargs):
"""Throttle function wrapper"""
now = datetime.now()
time_since_last_call = now - self.time_of_last_call
if time_since_last_call > self.throttle_period:
self.time_of_last_call = now
return await method(*args, **kwargs)
return wrapper
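
A short usage sketch for the throttle helper above; the coroutine name and the 60-second window are arbitrary choices for the example, and it assumes the AsyncThrottle class shown above is in scope:

    import asyncio
    from datetime import timedelta

    @AsyncThrottle(timedelta(seconds=60))
    async def fetch_data():
        """Pretend network call; executed at most once per 60 seconds."""
        print("fetching...")

    async def main():
        await fetch_data()  # runs: first call is always outside the throttle period
        await fetch_data()  # skipped: still inside the period, the wrapper returns None

    asyncio.run(main())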

View File

@@ -1,145 +0,0 @@
"""Validate functions."""
import re
import uuid
import voluptuous as vol
from .const import (
ATTR_ACCESS_TOKEN,
ATTR_ADDONS_CUSTOM_LIST,
ATTR_BOOT,
ATTR_CHANNEL,
ATTR_DEBUG,
ATTR_DEBUG_BLOCK,
ATTR_HASSIO,
ATTR_HASSOS,
ATTR_HASSOS_CLI,
ATTR_HOMEASSISTANT,
ATTR_IMAGE,
ATTR_LAST_BOOT,
ATTR_LAST_VERSION,
ATTR_LOGGING,
ATTR_PASSWORD,
ATTR_PORT,
ATTR_PORTS,
ATTR_REFRESH_TOKEN,
ATTR_SESSION,
ATTR_SSL,
ATTR_TIMEZONE,
ATTR_UUID,
ATTR_WAIT_BOOT,
ATTR_WATCHDOG,
CHANNEL_BETA,
CHANNEL_DEV,
CHANNEL_STABLE,
)
from .utils.validate import validate_timezone
RE_REPOSITORY = re.compile(r"^(?P<url>[^#]+)(?:#(?P<branch>[\w\-]+))?$")
NETWORK_PORT = vol.All(vol.Coerce(int), vol.Range(min=1, max=65535))
WAIT_BOOT = vol.All(vol.Coerce(int), vol.Range(min=1, max=60))
DOCKER_IMAGE = vol.Match(r"^[\w{}]+/[\-\w{}]+$")
ALSA_DEVICE = vol.Maybe(vol.Match(r"\d+,\d+"))
CHANNELS = vol.In([CHANNEL_STABLE, CHANNEL_BETA, CHANNEL_DEV])
UUID_MATCH = vol.Match(r"^[0-9a-f]{32}$")
SHA256 = vol.Match(r"^[0-9a-f]{64}$")
TOKEN = vol.Match(r"^[0-9a-f]{32,256}$")
LOG_LEVEL = vol.In(["debug", "info", "warning", "error", "critical"])
def validate_repository(repository):
"""Validate a valid repository."""
data = RE_REPOSITORY.match(repository)
if not data:
raise vol.Invalid("No valid repository format!")
# Validate URL
# pylint: disable=no-value-for-parameter
vol.Url()(data.group("url"))
return repository
# pylint: disable=no-value-for-parameter
REPOSITORIES = vol.All([validate_repository], vol.Unique())
DOCKER_PORTS = vol.Schema(
{
vol.All(vol.Coerce(str), vol.Match(r"^\d+(?:/tcp|/udp)?$")): vol.Maybe(
NETWORK_PORT
)
}
)
DOCKER_PORTS_DESCRIPTION = vol.Schema(
{vol.All(vol.Coerce(str), vol.Match(r"^\d+(?:/tcp|/udp)?$")): vol.Coerce(str)}
)
# pylint: disable=no-value-for-parameter
SCHEMA_HASS_CONFIG = vol.Schema(
{
vol.Optional(ATTR_UUID, default=lambda: uuid.uuid4().hex): UUID_MATCH,
vol.Optional(ATTR_ACCESS_TOKEN): TOKEN,
vol.Optional(ATTR_BOOT, default=True): vol.Boolean(),
vol.Inclusive(ATTR_IMAGE, "custom_hass"): DOCKER_IMAGE,
vol.Inclusive(ATTR_LAST_VERSION, "custom_hass"): vol.Coerce(str),
vol.Optional(ATTR_PORT, default=8123): NETWORK_PORT,
vol.Optional(ATTR_PASSWORD): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_REFRESH_TOKEN): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_SSL, default=False): vol.Boolean(),
vol.Optional(ATTR_WATCHDOG, default=True): vol.Boolean(),
vol.Optional(ATTR_WAIT_BOOT, default=600): vol.All(
vol.Coerce(int), vol.Range(min=60)
),
},
extra=vol.REMOVE_EXTRA,
)
SCHEMA_UPDATER_CONFIG = vol.Schema(
{
vol.Optional(ATTR_CHANNEL, default=CHANNEL_STABLE): CHANNELS,
vol.Optional(ATTR_HOMEASSISTANT): vol.Coerce(str),
vol.Optional(ATTR_HASSIO): vol.Coerce(str),
vol.Optional(ATTR_HASSOS): vol.Coerce(str),
vol.Optional(ATTR_HASSOS_CLI): vol.Coerce(str),
},
extra=vol.REMOVE_EXTRA,
)
# pylint: disable=no-value-for-parameter
SCHEMA_HASSIO_CONFIG = vol.Schema(
{
vol.Optional(ATTR_TIMEZONE, default="UTC"): validate_timezone,
vol.Optional(ATTR_LAST_BOOT): vol.Coerce(str),
vol.Optional(
ATTR_ADDONS_CUSTOM_LIST,
default=["https://github.com/hassio-addons/repository"],
): REPOSITORIES,
vol.Optional(ATTR_WAIT_BOOT, default=5): WAIT_BOOT,
vol.Optional(ATTR_LOGGING, default="info"): LOG_LEVEL,
vol.Optional(ATTR_DEBUG, default=False): vol.Boolean(),
vol.Optional(ATTR_DEBUG_BLOCK, default=False): vol.Boolean(),
},
extra=vol.REMOVE_EXTRA,
)
SCHEMA_AUTH_CONFIG = vol.Schema({SHA256: SHA256})
SCHEMA_INGRESS_CONFIG = vol.Schema(
{
vol.Required(ATTR_SESSION, default=dict): vol.Schema(
{TOKEN: vol.Coerce(float)}
),
vol.Required(ATTR_PORTS, default=dict): vol.Schema(
{vol.Coerce(str): NETWORK_PORT}
),
},
extra=vol.REMOVE_EXTRA,
)
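For reference, a short illustration of how two of the removed validators behaved; the inputs are made up, and after the hassio-to-supervisor rename in this range the equivalent helpers live under supervisor/validate.py (partly as lowercase functions such as network_port and token):

import voluptuous as vol

# Assuming validate_repository and DOCKER_PORTS from the module above are in scope.

# Repository entries are "<url>" with an optional "#<branch>" suffix.
validate_repository("https://github.com/hassio-addons/repository")
validate_repository("https://github.com/example/addons#dev")

# Port mappings: "<container port>[/tcp|/udp]" -> host port, or None when not published.
DOCKER_PORTS({"8123/tcp": 8123, "53/udp": None})

try:
    validate_repository("not a repository #!")
except vol.Invalid:
    pass  # no valid URL before the optional "#branch" part -> rejected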

Binary file not shown.


View File

@@ -1 +0,0 @@
<mxfile userAgent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/63.0.3239.132 Safari/537.36" version="7.9.5" editor="www.draw.io" type="device"><diagram name="Page-1" id="535f6c39-9b73-04c2-941c-82630de90f1a">5VrLcqM4FP0aLzsFiOcycefRVTPVqfFippdYKLYmMvII4cd8/QiQDEKQ4Bicnmp7Yevqybk691zJnoH55vDI4u36d5ogMnOs5DADX2eOY1vAER+F5VhZ3DCoDCuGE9moNizwv0j1lNYcJyjTGnJKCcdb3QhpmiLINVvMGN3rzV4o0WfdxitkGBYwJqb1T5zwtbTaflRXPCG8WsupQ8evKpYxfF0xmqdyvpkDXspXVb2J1VjyQbN1nNB9wwTuZ2DOKOXVt81hjkiBrYKt6vfQU3taN0MpH9JB+mkXkxypFftEdL17oWIEsUB+lKD4/+RUVXzJSpfdigZitoP4EBUl0KJuL4EpalPKNjGpO4tvq+Lz+0LNI9ZWTVVVSFhOszr7NeZosY1hUd6L7SYarfmGiJJdrAYTMqeEsrK1ABv5EAp7xhl9RY0aq3zJ9S/k+B14SdMOMY4ODZPE7xHRDeLsKJqo2ghUXeRe9yLp2329c1wF9LqxaXzZLpabdXUaunaY+CJ91u0/YPjvW4oLvy2OGUebC9GECVqGyy40gQ8ikJz8NS6AwUAAnREAdA0Av1L4itilyEHkCdJfGznXRM7pQg6MgJxvIPc0N1ArQyEqehTUO5PLIUTdXF6GnutZ42Do+p6OoW9i6FkmhN4IEEYGXigROiSLlPE1XdE0Jve19U5HtIEeOmD+V2G+CTxZ/KGqUrGwqs5TxR9yhL8R50epwHHOqTDVE/9G6VaO0Qt1RnMG5fKlyvOYrRDXtknxYG+6gyESc7zTBfgScFUuMTa6zhvoRiLxaeFbFp4Rw+IBELsS6O5ngR705hPLWuHPSzBsv0gw2gnEIt8itsOZCAlqAqbqnuIs+/a9N8E4mZe9SUe9Dez3w5YRnuZz369SDT2gJR4KE3ecsAU8PWyBjqzDDjvilj2GatrOFNyyG8RSUezELY1XZRgbSqJMMIPfFqcCYYBEbA4MlfkBE7WKQVyz1WmkQbbgs8gGpolwmhd0J7Tkoy62A9xAzIe6EKWJOZgwNobqTPjn80sc64Sfpl0qHjSSKzHKl1vx6ALDIppdJ2LFKHyBYyWresRyOtL8U3DS0nx3jIjlX5kr9o2l5wI3dhhemg8MpFWDLilNkcaVN9NmjRHAZITal9dnhDuJ4kifNZK5kRAe7tC+awqYs92Jzx922Kdpk2veTHzAgRoIvd4832d9InK52zrx/rjrrqE1pqduk4SmmeGvbB1vi69bRiHKsvd1RhelwarzIF6lcleHAMFSy/EDEDnA90InDC0XTJRFd2mSY3umJkUjSJK6vJsypNWltuRcmtTJsNck2Sgn2/FClez6THF50JQuV2ei9rlJjVDRUnZyGjfnZ45TUdkYp9wUp6cZtk9Ck6CQU/OKUvEz35CqAbgrqIChQD5eIvJMM8wxTUWTJeWcbkQDUlTcnX610K7Sy98t6jFuCV4VfTk9j+b1zXv7rl5OMAKRW5d4oOMSD3SklqNcwZs0HkBSK9BY6r7HUtvk6BA6XkXzztTxQYqofkH8KZIZtZgGA/f7vRm9CcHbrHSDZCIkNE8u1smrECjS45lrdZzOgqnuk8DbN+Fyc3/gOHYmRybK5RtaW58Bq0U6vWo7jCauSRO1WydXUre1ZdrRdDwJBP0/01lP+bJXCWHMLqefX7466OcV73HoF4FWOtFFv67r3FEULJiIfc19H4yZZU5P2WHs867BvsFu9AySPGK+npoefeqE7MRDwTT0cNWh9Sr0CH8VcYp8naPBZdrk/xraZP4R4g+0LY5alGHUf4vy/yWfusifgHyiWP/5rXJG/Q9DcP8f</diagram></mxfile>

View File

@@ -1,14 +1,18 @@
aiohttp==3.5.4
aiohttp==3.6.1
async_timeout==3.0.1
attrs==19.1.0
cchardet==2.1.4
colorlog==4.0.2
attrs==19.3.0
cchardet==2.1.6
colorlog==4.1.0
cpe==1.2.1
cryptography==2.6.1
docker==4.0.1
gitpython==2.1.11
pytz==2019.1
pyudev==0.21.0
uvloop==0.12.2
voluptuous==0.11.5
ptvsd==4.2.10
cryptography==2.9.2
docker==4.2.0
gitpython==3.1.2
jinja2==2.11.2
packaging==20.3
ptvsd==4.3.2
pulsectl==20.4.3
pytz==2020.1
pyudev==0.22.0
ruamel.yaml==0.15.100
uvloop==0.14.0
voluptuous==0.11.7

View File

@@ -1,5 +1,6 @@
flake8==3.7.7
pylint==2.3.1
pytest==4.5.0
pytest-timeout==1.3.3
flake8==3.7.9
pylint==2.5.2
pytest==5.4.1
pytest-timeout==1.3.4
pytest-aiohttp==0.3.0
black==19.10b0

View File

@@ -0,0 +1,9 @@
#!/usr/bin/with-contenv bashio
# ==============================================================================
# Start udev service
# ==============================================================================
udevd --daemon
bashio::log.info "Update udev information"
udevadm trigger
udevadm settle

View File

@@ -0,0 +1,35 @@
# This file is part of PulseAudio.
#
# PulseAudio is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# PulseAudio is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with PulseAudio; if not, see <http://www.gnu.org/licenses/>.
## Configuration file for PulseAudio clients. See pulse-client.conf(5) for
## more information. Default values are commented out. Use either ; or # for
## commenting.
; default-sink =
; default-source =
default-server = unix://data/audio/external/pulse.sock
; default-dbus-server =
autospawn = no
; daemon-binary = /usr/bin/pulseaudio
; extra-arguments = --log-target=syslog
; cookie-file =
; enable-shm = yes
; shm-size-bytes = 0 # setting this 0 will use the system-default, usually 64 MiB
; auto-connect-localhost = no
; auto-connect-display = no
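This client.conf points every PulseAudio client inside a container at the audio plugin's socket and disables autospawn. As a rough sketch, a client using the pulsectl package from requirements.txt reaches the same server; the explicit server argument below simply mirrors the default-server line above, and the client name is made up:

import pulsectl

# Mirrors "default-server = unix://data/audio/external/pulse.sock" above.
with pulsectl.Pulse("supervisor-example", server="unix://data/audio/external/pulse.sock") as pulse:
    for sink in pulse.sink_list():
        print(sink.name, sink.volume.value_flat)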

View File

@@ -0,0 +1,5 @@
#!/usr/bin/execlineb -S0
# ==============================================================================
# Take down the S6 supervision tree when Supervisor fails
# ==============================================================================
s6-svscanctl -t /var/run/s6/services

View File

@@ -0,0 +1,7 @@
#!/usr/bin/with-contenv bashio
# ==============================================================================
# Start Supervisor service
# ==============================================================================
export LD_PRELOAD="/usr/local/lib/libjemalloc.so.2"
exec python3 -m supervisor

scripts/test_env.sh Executable file (133 lines)
View File

@@ -0,0 +1,133 @@
#!/bin/bash
set -eE
DOCKER_TIMEOUT=30
DOCKER_PID=0
function start_docker() {
local starttime
local endtime
echo "Starting docker."
dockerd 2> /dev/null &
DOCKER_PID=$!
echo "Waiting for docker to initialize..."
starttime="$(date +%s)"
endtime="$(date +%s)"
until docker info >/dev/null 2>&1; do
if [ $((endtime - starttime)) -le $DOCKER_TIMEOUT ]; then
sleep 1
endtime=$(date +%s)
else
echo "Timeout while waiting for docker to come up"
exit 1
fi
done
echo "Docker was initialized"
}
function stop_docker() {
local starttime
local endtime
echo "Stopping in container docker..."
if [ "$DOCKER_PID" -gt 0 ] && kill -0 "$DOCKER_PID" 2> /dev/null; then
starttime="$(date +%s)"
endtime="$(date +%s)"
# Now wait for it to die
kill "$DOCKER_PID"
while kill -0 "$DOCKER_PID" 2> /dev/null; do
if [ $((endtime - starttime)) -le $DOCKER_TIMEOUT ]; then
sleep 1
endtime=$(date +%s)
else
echo "Timeout while waiting for container docker to die"
exit 1
fi
done
else
echo "Your host might have been left with unreleased resources"
fi
}
function build_supervisor() {
docker pull homeassistant/amd64-builder:dev
docker run --rm --privileged \
-v /run/docker.sock:/run/docker.sock -v "$(pwd):/data" \
homeassistant/amd64-builder:dev \
--generic dev -t /data --test --amd64 --no-cache
}
function cleanup_lastboot() {
if [[ -f /workspaces/test_supervisor/config.json ]]; then
echo "Cleaning up last boot"
cp /workspaces/test_supervisor/config.json /tmp/config.json
jq -rM 'del(.last_boot)' /tmp/config.json > /workspaces/test_supervisor/config.json
rm /tmp/config.json
fi
}
function cleanup_docker() {
echo "Cleaning up stopped containers..."
docker rm $(docker ps -a -q) || true
}
function setup_test_env() {
mkdir -p /workspaces/test_supervisor
echo "Start Supervisor"
docker run --rm --privileged \
--name hassio_supervisor \
--security-opt seccomp=unconfined \
--security-opt apparmor:unconfined \
-v /run/docker.sock:/run/docker.sock \
-v /run/dbus:/run/dbus \
-v "/workspaces/test_supervisor":/data \
-v /etc/machine-id:/etc/machine-id:ro \
-e SUPERVISOR_SHARE="/workspaces/test_supervisor" \
-e SUPERVISOR_NAME=hassio_supervisor \
-e SUPERVISOR_DEV=1 \
-e SUPERVISOR_MACHINE="qemux86-64" \
homeassistant/amd64-hassio-supervisor:latest
}
function init_dbus() {
if pgrep dbus-daemon; then
echo "Dbus is running"
return 0
fi
echo "Startup dbus"
mkdir -p /var/lib/dbus
cp -f /etc/machine-id /var/lib/dbus/machine-id
# cleanups
mkdir -p /run/dbus
rm -f /run/dbus/pid
# run
dbus-daemon --system --print-address
}
echo "Start Test-Env"
start_docker
trap "stop_docker" ERR
build_supervisor
cleanup_lastboot
cleanup_docker
init_dbus
setup_test_env
stop_docker

scripts/update-frontend.sh Executable file (18 lines)
View File

@@ -0,0 +1,18 @@
#!/bin/bash
set -e
# Update frontend
git submodule update --init --recursive --remote
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
cd home-assistant-polymer
nvm install
script/bootstrap
# build frontend
cd hassio
./script/build_hassio
# Copy frontend
rm -f ../../supervisor/api/panel/chunk.*
cp -rf build/* ../../supervisor/api/panel/

View File

@@ -1,35 +1,44 @@
"""Home Assistant Supervisor setup."""
from setuptools import setup
from hassio.const import HASSIO_VERSION
from supervisor.const import SUPERVISOR_VERSION
setup(
name='HassIO',
version=HASSIO_VERSION,
license='BSD License',
author='The Home Assistant Authors',
author_email='hello@home-assistant.io',
url='https://home-assistant.io/',
description=('Open-source private cloud os for Home-Assistant'
' based on HassOS'),
long_description=('A maintainless private cloud operator system that'
'setup a Home-Assistant instance. Based on HassOS'),
name="Supervisor",
version=SUPERVISOR_VERSION,
license="BSD License",
author="The Home Assistant Authors",
author_email="hello@home-assistant.io",
url="https://home-assistant.io/",
description=("Open-source private cloud os for Home-Assistant" " based on HassOS"),
long_description=(
"A maintainless private cloud operator system that"
"setup a Home-Assistant instance. Based on HassOS"
),
classifiers=[
'Intended Audience :: End Users/Desktop',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Operating System :: OS Independent',
'Topic :: Home Automation'
'Topic :: Software Development :: Libraries :: Python Modules',
'Topic :: Scientific/Engineering :: Atmospheric Science',
'Development Status :: 5 - Production/Stable',
'Intended Audience :: Developers',
'Programming Language :: Python :: 3.6',
"Intended Audience :: End Users/Desktop",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent",
"Topic :: Home Automation",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Scientific/Engineering :: Atmospheric Science",
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"Programming Language :: Python :: 3.7",
],
keywords=['docker', 'home-assistant', 'api'],
keywords=["docker", "home-assistant", "api"],
zip_safe=False,
platforms='any',
platforms="any",
packages=[
'hassio', 'hassio.docker', 'hassio.addons', 'hassio.api', 'hassio.misc',
'hassio.utils', 'hassio.snapshots'
"supervisor",
"supervisor.docker",
"supervisor.addons",
"supervisor.api",
"supervisor.misc",
"supervisor.utils",
"supervisor.plugins",
"supervisor.snapshots",
],
include_package_data=True)
include_package_data=True,
)

supervisor/__init__.py Normal file (1 line)
View File

@@ -0,0 +1 @@
"""Init file for Supervisor."""

View File

@@ -1,17 +1,18 @@
"""Main file for Hass.io."""
"""Main file for Supervisor."""
import asyncio
from concurrent.futures import ThreadPoolExecutor
import logging
import sys
from hassio import bootstrap
from supervisor import bootstrap
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
def initialize_event_loop():
"""Attempt to use uvloop."""
try:
# pylint: disable=import-outside-toplevel
import uvloop
uvloop.install()
@@ -28,34 +29,34 @@ if __name__ == "__main__":
# Init async event loop
loop = initialize_event_loop()
# Check if all information are available to setup Hass.io
if not bootstrap.check_environment():
sys.exit(1)
# Check if all information are available to setup Supervisor
bootstrap.check_environment()
# init executor pool
executor = ThreadPoolExecutor(thread_name_prefix="SyncWorker")
loop.set_default_executor(executor)
_LOGGER.info("Initialize Hass.io setup")
_LOGGER.info("Initialize Supervisor setup")
coresys = loop.run_until_complete(bootstrap.initialize_coresys())
loop.run_until_complete(coresys.core.connect())
bootstrap.migrate_system_env(coresys)
bootstrap.supervisor_debugger(coresys)
bootstrap.migrate_system_env(coresys)
_LOGGER.info("Setup HassIO")
_LOGGER.info("Setup Supervisor")
loop.run_until_complete(coresys.core.setup())
loop.call_soon_threadsafe(loop.create_task, coresys.core.start())
loop.call_soon_threadsafe(bootstrap.reg_signal, loop)
try:
_LOGGER.info("Run Hass.io")
_LOGGER.info("Run Supervisor")
loop.run_forever()
finally:
_LOGGER.info("Stopping Hass.io")
_LOGGER.info("Stopping Supervisor")
loop.run_until_complete(coresys.core.stop())
executor.shutdown(wait=False)
loop.close()
_LOGGER.info("Close Hass.io")
_LOGGER.info("Close Supervisor")
sys.exit(0)

View File

@@ -1,4 +1,4 @@
"""Init file for Hass.io add-ons."""
"""Init file for Supervisor add-ons."""
import asyncio
from contextlib import suppress
import logging
@@ -10,20 +10,22 @@ from ..coresys import CoreSys, CoreSysAttributes
from ..exceptions import (
AddonsError,
AddonsNotSupportedError,
CoreDNSError,
DockerAPIError,
HomeAssistantAPIError,
HostAppArmorError,
)
from ..store.addon import AddonStore
from .addon import Addon
from .data import AddonsData
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
AnyAddon = Union[Addon, AddonStore]
class AddonManager(CoreSysAttributes):
"""Manage add-ons inside Hass.io."""
"""Manage add-ons inside Supervisor."""
def __init__(self, coresys: CoreSys):
"""Initialize Docker base wrapper."""
@@ -55,9 +57,9 @@ class AddonManager(CoreSysAttributes):
return self.store.get(addon_slug)
def from_token(self, token: str) -> Optional[Addon]:
"""Return an add-on from Hass.io token."""
"""Return an add-on from Supervisor token."""
for addon in self.installed:
if token == addon.hassio_token:
if token == addon.supervisor_token:
return addon
return None
@@ -73,6 +75,9 @@ class AddonManager(CoreSysAttributes):
if tasks:
await asyncio.wait(tasks)
# Sync DNS
await self.sync_dns()
async def boot(self, stage: str) -> None:
"""Boot add-ons with mode auto."""
tasks = []
@@ -130,6 +135,7 @@ class AddonManager(CoreSysAttributes):
raise AddonsError() from None
else:
self.local[slug] = addon
_LOGGER.info("Add-on '%s' successfully installed", slug)
async def uninstall(self, slug: str) -> None:
"""Remove an add-on."""
@@ -146,19 +152,40 @@ class AddonManager(CoreSysAttributes):
await addon.remove_data()
# Cleanup audio settings
if addon.path_asound.exists():
if addon.path_pulse.exists():
with suppress(OSError):
addon.path_asound.unlink()
addon.path_pulse.unlink()
# Cleanup AppArmor profile
with suppress(HostAppArmorError):
await addon.uninstall_apparmor()
# Cleanup internal data
addon.remove_discovery()
# Cleanup Ingress panel from sidebar
if addon.ingress_panel:
addon.ingress_panel = False
with suppress(HomeAssistantAPIError):
await self.sys_ingress.update_hass_panel(addon)
# Cleanup Ingress dynamic port assignment
self.sys_ingress.del_dynamic_port(slug)
# Cleanup discovery data
for message in self.sys_discovery.list_messages:
if message.addon != addon.slug:
continue
self.sys_discovery.remove(message)
# Cleanup services data
for service in self.sys_services.list_services:
if addon.slug not in service.active:
continue
service.del_service_data(addon)
self.data.uninstall(addon)
self.local.pop(slug)
_LOGGER.info("Add-on '%s' successfully removed", slug)
async def update(self, slug: str) -> None:
"""Update add-on."""
if slug not in self.local:
@@ -184,9 +211,15 @@ class AddonManager(CoreSysAttributes):
last_state = await addon.state()
try:
await addon.instance.update(store.version, store.image)
# Cleanup
with suppress(DockerAPIError):
await addon.instance.cleanup()
except DockerAPIError:
raise AddonsError() from None
self.data.update(store)
else:
self.data.update(store)
_LOGGER.info("Add-on '%s' successfully updated", slug)
# Setup/Fix AppArmor profile
await addon.install_apparmor()
@@ -224,6 +257,7 @@ class AddonManager(CoreSysAttributes):
raise AddonsError() from None
else:
self.data.update(store)
_LOGGER.info("Add-on '%s' successfully rebuilt", slug)
# restore state
if last_state == STATE_STARTED:
@@ -232,17 +266,72 @@ class AddonManager(CoreSysAttributes):
async def restore(self, slug: str, tar_file: tarfile.TarFile) -> None:
"""Restore state of an add-on."""
if slug not in self.local:
_LOGGER.debug("Add-on %s is not local available for restore")
_LOGGER.debug("Add-on %s is not local available for restore", slug)
addon = Addon(self.coresys, slug)
else:
_LOGGER.debug("Add-on %s is local available for restore")
_LOGGER.debug("Add-on %s is local available for restore", slug)
addon = self.local[slug]
await addon.restore(tar_file)
# Check if new
if slug in self.local:
if slug not in self.local:
_LOGGER.info("Detect new Add-on after restore %s", slug)
self.local[slug] = addon
# Update ingress
if addon.with_ingress:
with suppress(HomeAssistantAPIError):
await self.sys_ingress.update_hass_panel(addon)
async def repair(self) -> None:
"""Repair local add-ons."""
needs_repair: List[Addon] = []
# Evaluate Add-ons to repair
for addon in self.installed:
if await addon.instance.exists():
continue
needs_repair.append(addon)
_LOGGER.info("Found %d add-ons to repair", len(needs_repair))
if not needs_repair:
return
_LOGGER.info("Detect new Add-on after restore %s", slug)
self.local[slug] = addon
for addon in needs_repair:
_LOGGER.info("Start repair for add-on: %s", addon.slug)
await self.sys_run_in_executor(
self.sys_docker.network.stale_cleanup, addon.instance.name
)
with suppress(DockerAPIError, KeyError):
# Need to pull the image again
if not addon.need_build:
await addon.instance.install(addon.version, addon.image)
continue
# Need local lookup
if addon.need_build and not addon.is_detached:
store = self.store[addon.slug]
# If this add-on is available for rebuild
if addon.version == store.version:
await addon.instance.install(addon.version, addon.image)
continue
_LOGGER.error("Can't repair %s", addon.slug)
with suppress(AddonsError):
await self.uninstall(addon.slug)
async def sync_dns(self) -> None:
"""Sync add-ons DNS names."""
# Update hosts
for addon in self.installed:
if not await addon.instance.is_running():
continue
self.sys_plugins.dns.add_host(
ipv4=addon.ip_address, names=[addon.hostname], write=False
)
# Write hosts files
with suppress(CoreDNSError):
self.sys_plugins.dns.write_hosts()

View File

@@ -1,7 +1,7 @@
"""Init file for Hass.io add-ons."""
"""Init file for Supervisor add-ons."""
from contextlib import suppress
from copy import deepcopy
from ipaddress import IPv4Address, ip_address
from ipaddress import IPv4Address
import logging
from pathlib import Path, PurePath
import re
@@ -9,7 +9,7 @@ import secrets
import shutil
import tarfile
from tempfile import TemporaryDirectory
from typing import Any, Awaitable, Dict, Optional
from typing import Any, Awaitable, Dict, List, Optional
import voluptuous as vol
from voluptuous.humanize import humanize_error
@@ -35,7 +35,7 @@ from ..const import (
ATTR_USER,
ATTR_UUID,
ATTR_VERSION,
STATE_NONE,
DNS_SUFFIX,
STATE_STARTED,
STATE_STOPPED,
)
@@ -51,20 +51,23 @@ from ..exceptions import (
)
from ..utils.apparmor import adjust_profile
from ..utils.json import read_json_file, write_json_file
from ..utils.tar import exclude_filter, secure_path
from .model import AddonModel, Data
from .utils import remove_data
from .validate import SCHEMA_ADDON_SNAPSHOT, validate_options
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
RE_WEBUI = re.compile(
r"^(?:(?P<s_prefix>https?)|\[PROTO:(?P<t_proto>\w+)\])"
r":\/\/\[HOST\]:\[PORT:(?P<t_port>\d+)\](?P<s_suffix>.*)$"
)
RE_OLD_AUDIO = re.compile(r"\d+,\d+")
class Addon(AddonModel):
"""Hold data for add-on inside Hass.io."""
"""Hold data for add-on inside Supervisor."""
def __init__(self, coresys: CoreSys, slug: str):
"""Initialize data holder."""
@@ -75,13 +78,11 @@ class Addon(AddonModel):
async def load(self) -> None:
"""Async initialize of object."""
with suppress(DockerAPIError):
await self.instance.attach()
await self.instance.attach(tag=self.version)
@property
def ip_address(self) -> IPv4Address:
"""Return IP of Add-on instance."""
if not self.is_installed:
return ip_address("0.0.0.0")
return self.instance.ip_address
@property
@@ -119,6 +120,11 @@ class Addon(AddonModel):
"""Return installed version."""
return self.persist[ATTR_VERSION]
@property
def dns(self) -> List[str]:
"""Return list of DNS name for that add-on."""
return [f"{self.hostname}.{DNS_SUFFIX}"]
@property
def options(self) -> Dict[str, Any]:
"""Return options with local changes."""
@@ -158,13 +164,13 @@ class Addon(AddonModel):
return self.persist[ATTR_UUID]
@property
def hassio_token(self) -> Optional[str]:
"""Return access token for Hass.io API."""
def supervisor_token(self) -> Optional[str]:
"""Return access token for Supervisor API."""
return self.persist.get(ATTR_ACCESS_TOKEN)
@property
def ingress_token(self) -> Optional[str]:
"""Return access token for Hass.io API."""
"""Return access token for Supervisor API."""
return self.persist.get(ATTR_INGRESS_TOKEN)
@property
@@ -246,7 +252,7 @@ class Addon(AddonModel):
# lookup the correct protocol from config
if t_proto:
proto = "https" if self.options[t_proto] else "http"
proto = "https" if self.options.get(t_proto) else "http"
else:
proto = s_prefix
@@ -275,33 +281,39 @@ class Addon(AddonModel):
@property
def audio_output(self) -> Optional[str]:
"""Return ALSA config for output or None."""
"""Return a pulse profile for output or None."""
if not self.with_audio:
return None
return self.persist.get(ATTR_AUDIO_OUTPUT, self.sys_host.alsa.default.output)
# Fallback with old audio settings
# Remove after 210
output_data = self.persist.get(ATTR_AUDIO_OUTPUT)
if output_data and RE_OLD_AUDIO.fullmatch(output_data):
return None
return output_data
@audio_output.setter
def audio_output(self, value: Optional[str]):
"""Set/reset audio output settings."""
if value is None:
self.persist.pop(ATTR_AUDIO_OUTPUT, None)
else:
self.persist[ATTR_AUDIO_OUTPUT] = value
"""Set audio output profile settings."""
self.persist[ATTR_AUDIO_OUTPUT] = value
@property
def audio_input(self) -> Optional[str]:
"""Return ALSA config for input or None."""
"""Return pulse profile for input or None."""
if not self.with_audio:
return None
return self.persist.get(ATTR_AUDIO_INPUT, self.sys_host.alsa.default.input)
# Fallback with old audio settings
# Remove after 210
input_data = self.persist.get(ATTR_AUDIO_INPUT)
if input_data and RE_OLD_AUDIO.fullmatch(input_data):
return None
return input_data
@audio_input.setter
def audio_input(self, value: Optional[str]):
"""Set/reset audio input settings."""
if value is None:
self.persist.pop(ATTR_AUDIO_INPUT, None)
else:
self.persist[ATTR_AUDIO_INPUT] = value
"""Set audio input settings."""
self.persist[ATTR_AUDIO_INPUT] = value
@property
def image(self):
@@ -329,26 +341,29 @@ class Addon(AddonModel):
return Path(self.path_data, "options.json")
@property
def path_asound(self):
def path_pulse(self):
"""Return path to asound config."""
return Path(self.sys_config.path_tmp, f"{self.slug}_asound")
return Path(self.sys_config.path_tmp, f"{self.slug}_pulse")
@property
def path_extern_asound(self):
def path_extern_pulse(self):
"""Return path to asound config for Docker."""
return Path(self.sys_config.path_extern_tmp, f"{self.slug}_asound")
return Path(self.sys_config.path_extern_tmp, f"{self.slug}_pulse")
def save_persist(self):
"""Save data of add-on."""
self.sys_addons.data.save_data()
def write_options(self):
async def write_options(self):
"""Return True if add-on options is written to data."""
schema = self.schema
options = self.options
# Update secrets for validation
await self.sys_secrets.reload()
try:
schema(options)
options = schema(options)
write_json_file(self.path_options, options)
except vol.Invalid as ex:
_LOGGER.error(
@@ -364,13 +379,6 @@ class Addon(AddonModel):
raise AddonsError()
def remove_discovery(self):
"""Remove all discovery message from add-on."""
for message in self.sys_discovery.list_messages:
if message.addon != self.slug:
continue
self.sys_discovery.remove(message)
async def remove_data(self):
"""Remove add-on data."""
if not self.path_data.is_dir():
@@ -379,20 +387,28 @@ class Addon(AddonModel):
_LOGGER.info("Remove add-on data folder %s", self.path_data)
await remove_data(self.path_data)
def write_asound(self):
def write_pulse(self):
"""Write asound config to file and return True on success."""
asound_config = self.sys_host.alsa.asound(
alsa_input=self.audio_input, alsa_output=self.audio_output
pulse_config = self.sys_plugins.audio.pulse_client(
input_profile=self.audio_input, output_profile=self.audio_output
)
try:
with self.path_asound.open("w") as config_file:
config_file.write(asound_config)
except OSError as err:
_LOGGER.error("Add-on %s can't write asound: %s", self.slug, err)
raise AddonsError()
# Cleanup wrong maps
if self.path_pulse.is_dir():
shutil.rmtree(self.path_pulse, ignore_errors=True)
_LOGGER.debug("Add-on %s write asound: %s", self.slug, self.path_asound)
# Write pulse config
try:
with self.path_pulse.open("w") as config_file:
config_file.write(pulse_config)
except OSError as err:
_LOGGER.error(
"Add-on %s can't write pulse/client.config: %s", self.slug, err
)
else:
_LOGGER.debug(
"Add-on %s write pulse/client.config: %s", self.slug, self.path_pulse
)
async def install_apparmor(self) -> None:
"""Install or Update AppArmor profile for Add-on."""
@@ -435,7 +451,9 @@ class Addon(AddonModel):
options = {**self.persist[ATTR_OPTIONS], **default_options}
# create voluptuous
new_schema = vol.Schema(vol.All(dict, validate_options(new_raw_schema)))
new_schema = vol.Schema(
vol.All(dict, validate_options(self.coresys, new_raw_schema))
)
# validate
try:
@@ -447,9 +465,6 @@ class Addon(AddonModel):
async def state(self) -> str:
"""Return running state of add-on."""
if not self.is_installed:
return STATE_NONE
if await self.instance.is_running():
return STATE_STARTED
return STATE_STOPPED
@@ -465,12 +480,13 @@ class Addon(AddonModel):
self.save_persist()
# Options
self.write_options()
await self.write_options()
# Sound
if self.with_audio:
self.write_asound()
self.write_pulse()
# Start Add-on
try:
await self.instance.run()
except DockerAPIError:
@@ -519,7 +535,7 @@ class Addon(AddonModel):
async def snapshot(self, tar_file: tarfile.TarFile) -> None:
"""Snapshot state of an add-on."""
with TemporaryDirectory(dir=str(self.sys_config.path_tmp)) as temp:
with TemporaryDirectory(dir=self.sys_config.path_tmp) as temp:
# store local image
if self.need_build:
try:
@@ -554,8 +570,15 @@ class Addon(AddonModel):
def _write_tarfile():
"""Write tar inside loop."""
with tar_file as snapshot:
# Snapshot system
snapshot.add(temp, arcname=".")
snapshot.add(self.path_data, arcname="data")
# Snapshot data
snapshot.add(
self.path_data,
arcname="data",
filter=exclude_filter(self.snapshot_exclude),
)
try:
_LOGGER.info("Build snapshot for add-on %s", self.slug)
@@ -568,12 +591,12 @@ class Addon(AddonModel):
async def restore(self, tar_file: tarfile.TarFile) -> None:
"""Restore state of an add-on."""
with TemporaryDirectory(dir=str(self.sys_config.path_tmp)) as temp:
with TemporaryDirectory(dir=self.sys_config.path_tmp) as temp:
# extract snapshot
def _extract_tarfile():
"""Extract tar snapshot."""
with tar_file as snapshot:
snapshot.extractall(path=Path(temp))
snapshot.extractall(path=Path(temp), members=secure_path(snapshot))
try:
await self.sys_run_in_executor(_extract_tarfile)
@@ -618,7 +641,7 @@ class Addon(AddonModel):
image_file = Path(temp, "image.tar")
if image_file.is_file():
with suppress(DockerAPIError):
await self.instance.import_image(image_file, version)
await self.instance.import_image(image_file)
else:
with suppress(DockerAPIError):
await self.instance.install(version, restore_image)
@@ -634,7 +657,7 @@ class Addon(AddonModel):
# Restore data
def _restore_data():
"""Restore data."""
shutil.copytree(str(Path(temp, "data")), str(self.path_data))
shutil.copytree(Path(temp, "data"), self.path_data)
_LOGGER.info("Restore data for addon %s", self.slug)
if self.path_data.is_dir():

View File

@@ -1,4 +1,4 @@
"""Hass.io add-on build environment."""
"""Supervisor add-on build environment."""
from __future__ import annotations
from pathlib import Path
from typing import TYPE_CHECKING, Dict
@@ -16,7 +16,7 @@ class AddonBuild(JsonConfig, CoreSysAttributes):
"""Handle build options for add-ons."""
def __init__(self, coresys: CoreSys, addon: AnyAddon) -> None:
"""Initialize Hass.io add-on builder."""
"""Initialize Supervisor add-on builder."""
self.coresys: CoreSys = coresys
self.addon = addon

View File

@@ -1,4 +1,4 @@
"""Init file for Hass.io add-on data."""
"""Init file for Supervisor add-on data."""
from copy import deepcopy
import logging
from typing import Any, Dict
@@ -17,13 +17,13 @@ from ..store.addon import AddonStore
from .addon import Addon
from .validate import SCHEMA_ADDONS_FILE
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
Config = Dict[str, Any]
class AddonsData(JsonConfig, CoreSysAttributes):
"""Hold data for installed Add-ons inside Hass.io."""
"""Hold data for installed Add-ons inside Supervisor."""
def __init__(self, coresys: CoreSys):
"""Initialize data holder."""

View File

@@ -1,11 +1,12 @@
"""Init file for Hass.io add-ons."""
from distutils.version import StrictVersion
"""Init file for Supervisor add-ons."""
from pathlib import Path
from typing import Any, Awaitable, Dict, List, Optional
from packaging import version as pkg_version
import voluptuous as vol
from ..const import (
ATTR_ADVANCED,
ATTR_APPARMOR,
ATTR_ARCH,
ATTR_AUDIO,
@@ -30,6 +31,7 @@ from ..const import (
ATTR_HOST_PID,
ATTR_IMAGE,
ATTR_INGRESS,
ATTR_INIT,
ATTR_KERNEL_MODULES,
ATTR_LEGACY,
ATTR_LOCATON,
@@ -47,19 +49,24 @@ from ..const import (
ATTR_SCHEMA,
ATTR_SERVICES,
ATTR_SLUG,
ATTR_SNAPSHOT_EXCLUDE,
ATTR_STAGE,
ATTR_STARTUP,
ATTR_STDIN,
ATTR_TIMEOUT,
ATTR_TMPFS,
ATTR_UDEV,
ATTR_URL,
ATTR_VERSION,
ATTR_VIDEO,
ATTR_WEBUI,
SECURITY_DEFAULT,
SECURITY_DISABLE,
SECURITY_PROFILE,
AddonStages,
)
from ..coresys import CoreSysAttributes
from .validate import MACHINE_ALL, RE_SERVICE, RE_VOLUME, validate_options
from .validate import RE_SERVICE, RE_VOLUME, schema_ui_options, validate_options
Data = Dict[str, Any]
@@ -109,6 +116,16 @@ class AddonModel(CoreSysAttributes):
"""Return name of add-on."""
return self.data[ATTR_NAME]
@property
def hostname(self) -> str:
"""Return slug/id of add-on."""
return self.slug.replace("_", "-")
@property
def dns(self) -> List[str]:
"""Return list of DNS name for that add-on."""
return []
@property
def timeout(self) -> int:
"""Return timeout of addon for docker stop."""
@@ -120,13 +137,13 @@ class AddonModel(CoreSysAttributes):
return None
@property
def hassio_token(self) -> Optional[str]:
"""Return access token for Hass.io API."""
def supervisor_token(self) -> Optional[str]:
"""Return access token for Supervisor API."""
return None
@property
def ingress_token(self) -> Optional[str]:
"""Return access token for Hass.io API."""
"""Return access token for Supervisor API."""
return None
@property
@@ -177,6 +194,16 @@ class AddonModel(CoreSysAttributes):
"""Return startup type of add-on."""
return self.data.get(ATTR_STARTUP)
@property
def advanced(self) -> bool:
"""Return advanced mode of add-on."""
return self.data[ATTR_ADVANCED]
@property
def stage(self) -> AddonStages:
"""Return stage mode of add-on."""
return self.data[ATTR_STAGE]
@property
def services_role(self) -> Dict[str, str]:
"""Return dict of services with rights."""
@@ -300,7 +327,7 @@ class AddonModel(CoreSysAttributes):
@property
def access_hassio_api(self) -> bool:
"""Return True if the add-on access to Hass.io REASTful API."""
"""Return True if the add-on access to Supervisor REASTful API."""
return self.data[ATTR_HASSIO_API]
@property
@@ -310,9 +337,19 @@ class AddonModel(CoreSysAttributes):
@property
def hassio_role(self) -> str:
"""Return Hass.io role for API."""
"""Return Supervisor role for API."""
return self.data[ATTR_HASSIO_ROLE]
@property
def snapshot_exclude(self) -> List[str]:
"""Return Exclude list for snapshot."""
return self.data.get(ATTR_SNAPSHOT_EXCLUDE, [])
@property
def default_init(self) -> bool:
"""Return True if the add-on have no own init."""
return self.data[ATTR_INIT]
@property
def with_stdin(self) -> bool:
"""Return True if the add-on access use stdin input."""
@@ -333,6 +370,11 @@ class AddonModel(CoreSysAttributes):
"""Return True if the add-on access to GPIO interface."""
return self.data[ATTR_GPIO]
@property
def with_udev(self) -> bool:
"""Return True if the add-on have his own udev."""
return self.data[ATTR_UDEV]
@property
def with_kernel_modules(self) -> bool:
"""Return True if the add-on access to kernel modules."""
@@ -358,6 +400,11 @@ class AddonModel(CoreSysAttributes):
"""Return True if the add-on access to audio."""
return self.data[ATTR_AUDIO]
@property
def with_video(self) -> bool:
"""Return True if the add-on access to video."""
return self.data[ATTR_VIDEO]
@property
def homeassistant_version(self) -> Optional[str]:
"""Return min Home Assistant version they needed by Add-on."""
@@ -383,6 +430,11 @@ class AddonModel(CoreSysAttributes):
"""Return True if a changelog exists."""
return self.path_changelog.exists()
@property
def with_documentation(self) -> bool:
"""Return True if a documentation exists."""
return self.path_documentation.exists()
@property
def supported_arch(self) -> List[str]:
"""Return list of supported arch."""
@@ -391,7 +443,7 @@ class AddonModel(CoreSysAttributes):
@property
def supported_machine(self) -> List[str]:
"""Return list of supported machine."""
return self.data.get(ATTR_MACHINE, MACHINE_ALL)
return self.data.get(ATTR_MACHINE, [])
@property
def image(self) -> str:
@@ -433,6 +485,11 @@ class AddonModel(CoreSysAttributes):
"""Return path to add-on changelog."""
return Path(self.path_location, "CHANGELOG.md")
@property
def path_documentation(self) -> Path:
"""Return path to add-on changelog."""
return Path(self.path_location, "DOCS.md")
@property
def path_apparmor(self) -> Path:
"""Return path to custom AppArmor profile."""
@@ -445,7 +502,16 @@ class AddonModel(CoreSysAttributes):
if isinstance(raw_schema, bool):
return vol.Schema(dict)
return vol.Schema(vol.All(dict, validate_options(raw_schema)))
return vol.Schema(vol.All(dict, validate_options(self.coresys, raw_schema)))
@property
def schema_ui(self) -> Optional[List[Dict[str, Any]]]:
"""Create a UI schema for add-on options."""
raw_schema = self.data[ATTR_SCHEMA]
if isinstance(raw_schema, bool):
return None
return schema_ui_options(raw_schema)
def __eq__(self, other):
"""Compaired add-on objects."""
@@ -460,13 +526,15 @@ class AddonModel(CoreSysAttributes):
return False
# Machine / Hardware
machine = config.get(ATTR_MACHINE) or MACHINE_ALL
if self.sys_machine not in machine:
machine = config.get(ATTR_MACHINE)
if machine and self.sys_machine not in machine:
return False
# Home Assistant
version = config.get(ATTR_HOMEASSISTANT) or self.sys_homeassistant.version
if StrictVersion(self.sys_homeassistant.version) < StrictVersion(version):
if pkg_version.parse(self.sys_homeassistant.version) < pkg_version.parse(
version
):
return False
return True
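The move from distutils.version.StrictVersion to packaging.version matters because Home Assistant dev builds do not parse as strict versions; a small illustration (the version strings are examples only):

from distutils.version import StrictVersion
from packaging import version as pkg_version

assert pkg_version.parse("0.110.0.dev20200514") < pkg_version.parse("0.110.0")

try:
    StrictVersion("0.110.0.dev20200514")
except ValueError:
    pass  # StrictVersion rejects ".devN" suffixes, so the old comparison could raise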

View File

@@ -22,7 +22,7 @@ from ..const import (
if TYPE_CHECKING:
from .model import AddonModel
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
def rating_security(addon: AddonModel) -> int:
@@ -39,8 +39,10 @@ def rating_security(addon: AddonModel) -> int:
elif addon.apparmor == SECURITY_PROFILE:
rating += 1
# Home Assistant Login
if addon.access_auth_api:
# Home Assistant Login & Ingress
if addon.with_ingress:
rating += 2
elif addon.access_auth_api:
rating += 1
# Privileged options
@@ -57,7 +59,7 @@ def rating_security(addon: AddonModel) -> int:
):
rating += -1
# API Hass.io role
# API Supervisor role
if addon.hassio_role == ROLE_MANAGER:
rating += -1
elif addon.hassio_role == ROLE_ADMIN:

View File

@@ -2,6 +2,7 @@
import logging
import re
import secrets
from typing import Any, Dict, List
import uuid
import voluptuous as vol
@@ -9,6 +10,7 @@ import voluptuous as vol
from ..const import (
ARCH_ALL,
ATTR_ACCESS_TOKEN,
ATTR_ADVANCED,
ATTR_APPARMOR,
ATTR_ARCH,
ATTR_ARGS,
@@ -39,12 +41,10 @@ from ..const import (
ATTR_IMAGE,
ATTR_INGRESS,
ATTR_INGRESS_ENTRY,
ATTR_INGRESS_PANEL,
ATTR_INGRESS_PORT,
ATTR_INGRESS_TOKEN,
ATTR_INGRESS_PANEL,
ATTR_PANEL_ADMIN,
ATTR_PANEL_ICON,
ATTR_PANEL_TITLE,
ATTR_INIT,
ATTR_KERNEL_MODULES,
ATTR_LEGACY,
ATTR_LOCATON,
@@ -53,6 +53,9 @@ from ..const import (
ATTR_NAME,
ATTR_NETWORK,
ATTR_OPTIONS,
ATTR_PANEL_ADMIN,
ATTR_PANEL_ICON,
ATTR_PANEL_TITLE,
ATTR_PORTS,
ATTR_PORTS_DESCRIPTION,
ATTR_PRIVILEGED,
@@ -61,17 +64,21 @@ from ..const import (
ATTR_SCHEMA,
ATTR_SERVICES,
ATTR_SLUG,
ATTR_SNAPSHOT_EXCLUDE,
ATTR_SQUASH,
ATTR_STAGE,
ATTR_STARTUP,
ATTR_STATE,
ATTR_STDIN,
ATTR_SYSTEM,
ATTR_TIMEOUT,
ATTR_TMPFS,
ATTR_UDEV,
ATTR_URL,
ATTR_USER,
ATTR_UUID,
ATTR_VERSION,
ATTR_VIDEO,
ATTR_WEBUI,
BOOT_AUTO,
BOOT_MANUAL,
@@ -83,41 +90,58 @@ from ..const import (
STARTUP_SERVICES,
STATE_STARTED,
STATE_STOPPED,
AddonStages,
)
from ..coresys import CoreSys
from ..discovery.validate import valid_discovery_service
from ..validate import (
ALSA_DEVICE,
DOCKER_PORTS,
DOCKER_PORTS_DESCRIPTION,
NETWORK_PORT,
TOKEN,
UUID_MATCH,
network_port,
token,
uuid_match,
)
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
RE_VOLUME = re.compile(r"^(config|ssl|addons|backup|share)(?::(rw|ro))?$")
RE_SERVICE = re.compile(r"^(?P<service>mqtt):(?P<rights>provide|want|need)$")
RE_SERVICE = re.compile(r"^(?P<service>mqtt|mysql):(?P<rights>provide|want|need)$")
V_STR = "str"
V_INT = "int"
V_FLOAT = "float"
V_BOOL = "bool"
V_PASSWORD = "password"
V_EMAIL = "email"
V_URL = "url"
V_PORT = "port"
V_MATCH = "match"
V_LIST = "list"
RE_SCHEMA_ELEMENT = re.compile(
r"^(?:"
r"|str|bool|email|url|port"
r"|bool|email|url|port"
r"|str(?:\((?P<s_min>\d+)?,(?P<s_max>\d+)?\))?"
r"|password(?:\((?P<p_min>\d+)?,(?P<p_max>\d+)?\))?"
r"|int(?:\((?P<i_min>\d+)?,(?P<i_max>\d+)?\))?"
r"|float(?:\((?P<f_min>[\d\.]+)?,(?P<f_max>[\d\.]+)?\))?"
r"|match\((?P<match>.*)\)"
r"|list\((?P<list>.+)\)"
r")\??$"
)
_SCHEMA_LENGTH_PARTS = (
"i_min",
"i_max",
"f_min",
"f_max",
"s_min",
"s_max",
"p_min",
"p_max",
)
RE_DOCKER_IMAGE = re.compile(r"^([a-zA-Z\-\.:\d{}]+/)*?([\-\w{}]+)/([\-\w{}]+)$")
RE_DOCKER_IMAGE_BUILD = re.compile(
r"^([a-zA-Z\-\.:\d{}]+/)*?([\-\w{}]+)/([\-\w{}]+)(:[\.\-\w{}]+)?$"
@@ -129,16 +153,18 @@ SCHEMA_ELEMENT = vol.Match(RE_SCHEMA_ELEMENT)
MACHINE_ALL = [
"intel-nuc",
"odroid-c2",
"odroid-n2",
"odroid-xu",
"orangepi-prime",
"qemux86",
"qemux86-64",
"qemuarm",
"qemuarm-64",
"qemuarm",
"qemux86-64",
"qemux86",
"raspberrypi",
"raspberrypi2",
"raspberrypi3",
"raspberrypi3-64",
"raspberrypi3",
"raspberrypi4-64",
"raspberrypi4",
"tinker",
]
@@ -164,6 +190,9 @@ SCHEMA_ADDON_CONFIG = vol.Schema(
vol.Optional(ATTR_URL): vol.Url(),
vol.Required(ATTR_STARTUP): vol.All(_simple_startup, vol.In(STARTUP_ALL)),
vol.Required(ATTR_BOOT): vol.In([BOOT_AUTO, BOOT_MANUAL]),
vol.Optional(ATTR_INIT, default=True): vol.Boolean(),
vol.Optional(ATTR_ADVANCED, default=False): vol.Boolean(),
vol.Optional(ATTR_STAGE, default=AddonStages.STABLE): vol.Coerce(AddonStages),
vol.Optional(ATTR_PORTS): DOCKER_PORTS,
vol.Optional(ATTR_PORTS_DESCRIPTION): DOCKER_PORTS_DESCRIPTION,
vol.Optional(ATTR_WEBUI): vol.Match(
@@ -171,7 +200,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema(
),
vol.Optional(ATTR_INGRESS, default=False): vol.Boolean(),
vol.Optional(ATTR_INGRESS_PORT, default=8099): vol.Any(
NETWORK_PORT, vol.Equal(0)
network_port, vol.Equal(0)
),
vol.Optional(ATTR_INGRESS_ENTRY): vol.Coerce(str),
vol.Optional(ATTR_PANEL_ICON, default="mdi:puzzle"): vol.Coerce(str),
@@ -184,6 +213,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema(
vol.Optional(ATTR_HOST_DBUS, default=False): vol.Boolean(),
vol.Optional(ATTR_DEVICES): [vol.Match(r"^(.*):(.*):([rwm]{1,3})$")],
vol.Optional(ATTR_AUTO_UART, default=False): vol.Boolean(),
vol.Optional(ATTR_UDEV, default=False): vol.Boolean(),
vol.Optional(ATTR_TMPFS): vol.Match(r"^size=(\d)*[kmg](,uid=\d{1,4})?(,rw)?$"),
vol.Optional(ATTR_MAP, default=list): [vol.Match(RE_VOLUME)],
vol.Optional(ATTR_ENVIRONMENT): {vol.Match(r"\w*"): vol.Coerce(str)},
@@ -191,6 +221,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema(
vol.Optional(ATTR_APPARMOR, default=True): vol.Boolean(),
vol.Optional(ATTR_FULL_ACCESS, default=False): vol.Boolean(),
vol.Optional(ATTR_AUDIO, default=False): vol.Boolean(),
vol.Optional(ATTR_VIDEO, default=False): vol.Boolean(),
vol.Optional(ATTR_GPIO, default=False): vol.Boolean(),
vol.Optional(ATTR_DEVICETREE, default=False): vol.Boolean(),
vol.Optional(ATTR_KERNEL_MODULES, default=False): vol.Boolean(),
@@ -203,6 +234,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema(
vol.Optional(ATTR_AUTH_API, default=False): vol.Boolean(),
vol.Optional(ATTR_SERVICES): [vol.Match(RE_SERVICE)],
vol.Optional(ATTR_DISCOVERY): [valid_discovery_service],
vol.Optional(ATTR_SNAPSHOT_EXCLUDE): [vol.Coerce(str)],
vol.Required(ATTR_OPTIONS): dict,
vol.Required(ATTR_SCHEMA): vol.Any(
vol.Schema(
@@ -229,7 +261,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema(
),
vol.Optional(ATTR_IMAGE): vol.Match(RE_DOCKER_IMAGE),
vol.Optional(ATTR_TIMEOUT, default=10): vol.All(
vol.Coerce(int), vol.Range(min=10, max=120)
vol.Coerce(int), vol.Range(min=10, max=300)
),
},
extra=vol.REMOVE_EXTRA,
@@ -256,8 +288,8 @@ SCHEMA_ADDON_USER = vol.Schema(
{
vol.Required(ATTR_VERSION): vol.Coerce(str),
vol.Optional(ATTR_IMAGE): vol.Coerce(str),
vol.Optional(ATTR_UUID, default=lambda: uuid.uuid4().hex): UUID_MATCH,
vol.Optional(ATTR_ACCESS_TOKEN): TOKEN,
vol.Optional(ATTR_UUID, default=lambda: uuid.uuid4().hex): uuid_match,
vol.Optional(ATTR_ACCESS_TOKEN): token,
vol.Optional(ATTR_INGRESS_TOKEN, default=secrets.token_urlsafe): vol.Coerce(
str
),
@@ -265,8 +297,8 @@ SCHEMA_ADDON_USER = vol.Schema(
vol.Optional(ATTR_AUTO_UPDATE, default=False): vol.Boolean(),
vol.Optional(ATTR_BOOT): vol.In([BOOT_AUTO, BOOT_MANUAL]),
vol.Optional(ATTR_NETWORK): DOCKER_PORTS,
vol.Optional(ATTR_AUDIO_OUTPUT): ALSA_DEVICE,
vol.Optional(ATTR_AUDIO_INPUT): ALSA_DEVICE,
vol.Optional(ATTR_AUDIO_OUTPUT): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_AUDIO_INPUT): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_PROTECTED, default=True): vol.Boolean(),
vol.Optional(ATTR_INGRESS_PANEL, default=False): vol.Boolean(),
},
@@ -301,7 +333,7 @@ SCHEMA_ADDON_SNAPSHOT = vol.Schema(
)
def validate_options(raw_schema):
def validate_options(coresys: CoreSys, raw_schema: Dict[str, Any]):
"""Validate schema."""
def validate(struct):
@@ -319,13 +351,13 @@ def validate_options(raw_schema):
try:
if isinstance(typ, list):
# nested value list
options[key] = _nested_validate_list(typ[0], value, key)
options[key] = _nested_validate_list(coresys, typ[0], value, key)
elif isinstance(typ, dict):
# nested value dict
options[key] = _nested_validate_dict(typ, value, key)
options[key] = _nested_validate_dict(coresys, typ, value, key)
else:
# normal value
options[key] = _single_validate(typ, value, key)
options[key] = _single_validate(coresys, typ, value, key)
except (IndexError, KeyError):
raise vol.Invalid(f"Type error for {key}") from None
@@ -337,24 +369,31 @@ def validate_options(raw_schema):
# pylint: disable=no-value-for-parameter
# pylint: disable=inconsistent-return-statements
def _single_validate(typ, value, key):
def _single_validate(coresys: CoreSys, typ: str, value: Any, key: str):
"""Validate a single element."""
# if required argument
if value is None:
raise vol.Invalid(f"Missing required option '{key}'")
# Lookup secret
if str(value).startswith("!secret "):
secret: str = value.partition(" ")[2]
value = coresys.secrets.get(secret)
if value is None:
raise vol.Invalid(f"Unknown secret {secret}")
# parse extend data from type
match = RE_SCHEMA_ELEMENT.match(typ)
# prepare range
range_args = {}
for group_name in ("i_min", "i_max", "f_min", "f_max"):
for group_name in _SCHEMA_LENGTH_PARTS:
group_value = match.group(group_name)
if group_value:
range_args[group_name[2:]] = float(group_value)
if typ.startswith(V_STR):
return str(value)
if typ.startswith(V_STR) or typ.startswith(V_PASSWORD):
return vol.All(str(value), vol.Range(**range_args))(value)
elif typ.startswith(V_INT):
return vol.All(vol.Coerce(int), vol.Range(**range_args))(value)
elif typ.startswith(V_FLOAT):
@@ -366,29 +405,31 @@ def _single_validate(typ, value, key):
elif typ.startswith(V_URL):
return vol.Url()(value)
elif typ.startswith(V_PORT):
return NETWORK_PORT(value)
return network_port(value)
elif typ.startswith(V_MATCH):
return vol.Match(match.group("match"))(str(value))
elif typ.startswith(V_LIST):
return vol.In(match.group("list").split("|"))(str(value))
raise vol.Invalid(f"Fatal error for {key} type {typ}")
def _nested_validate_list(typ, data_list, key):
def _nested_validate_list(coresys, typ, data_list, key):
"""Validate nested items."""
options = []
for element in data_list:
# Nested?
if isinstance(typ, dict):
c_options = _nested_validate_dict(typ, element, key)
c_options = _nested_validate_dict(coresys, typ, element, key)
options.append(c_options)
else:
options.append(_single_validate(typ, element, key))
options.append(_single_validate(coresys, typ, element, key))
return options
def _nested_validate_dict(typ, data_dict, key):
def _nested_validate_dict(coresys, typ, data_dict, key):
"""Validate nested items."""
options = {}
@@ -400,9 +441,11 @@ def _nested_validate_dict(typ, data_dict, key):
# Nested?
if isinstance(typ[c_key], list):
options[c_key] = _nested_validate_list(typ[c_key][0], c_value, c_key)
options[c_key] = _nested_validate_list(
coresys, typ[c_key][0], c_value, c_key
)
else:
options[c_key] = _single_validate(typ[c_key], c_value, c_key)
options[c_key] = _single_validate(coresys, typ[c_key], c_value, c_key)
_check_missing_options(typ, options, key)
return options
@@ -415,3 +458,117 @@ def _check_missing_options(origin, exists, root):
if isinstance(origin[miss_opt], str) and origin[miss_opt].endswith("?"):
continue
raise vol.Invalid(f"Missing option {miss_opt} in {root}")
def schema_ui_options(raw_schema: Dict[str, Any]) -> List[Dict[str, Any]]:
"""Generate UI schema."""
ui_schema = []
# read options
for key, value in raw_schema.items():
if isinstance(value, list):
# nested value list
_nested_ui_list(ui_schema, value, key)
elif isinstance(value, dict):
# nested value dict
_nested_ui_dict(ui_schema, value, key)
else:
# normal value
_single_ui_option(ui_schema, value, key)
return ui_schema
def _single_ui_option(
ui_schema: List[Dict[str, Any]], value: str, key: str, multiple: bool = False
) -> None:
"""Validate a single element."""
ui_node = {"name": key}
# If multiple
if multiple:
ui_node["multiple"] = True
# Parse extend data from type
match = RE_SCHEMA_ELEMENT.match(value)
# Prepare range
for group_name in _SCHEMA_LENGTH_PARTS:
group_value = match.group(group_name)
if not group_value:
continue
if group_name[2:] == "min":
ui_node["lengthMin"] = float(group_value)
elif group_name[2:] == "max":
ui_node["lengthMax"] = float(group_value)
# If required
if value.endswith("?"):
ui_node["optional"] = True
else:
ui_node["required"] = True
# Data types
if value.startswith(V_STR):
ui_node["type"] = "string"
elif value.startswith(V_PASSWORD):
ui_node["type"] = "string"
ui_node["format"] = "password"
elif value.startswith(V_INT):
ui_node["type"] = "integer"
elif value.startswith(V_FLOAT):
ui_node["type"] = "float"
elif value.startswith(V_BOOL):
ui_node["type"] = "boolean"
elif value.startswith(V_EMAIL):
ui_node["type"] = "string"
ui_node["format"] = "email"
elif value.startswith(V_URL):
ui_node["type"] = "string"
ui_node["format"] = "url"
elif value.startswith(V_PORT):
ui_node["type"] = "integer"
elif value.startswith(V_MATCH):
ui_node["type"] = "string"
elif value.startswith(V_LIST):
ui_node["type"] = "select"
ui_node["options"] = match.group("list").split("|")
ui_schema.append(ui_node)
def _nested_ui_list(
ui_schema: List[Dict[str, Any]], option_list: List[Any], key: str
) -> None:
"""UI nested list items."""
try:
element = option_list[0]
except IndexError:
_LOGGER.error("Invalid schema %s", key)
return
if isinstance(element, dict):
_nested_ui_dict(ui_schema, element, key, multiple=True)
else:
_single_ui_option(ui_schema, element, key, multiple=True)
def _nested_ui_dict(
ui_schema: List[Dict[str, Any]],
option_dict: Dict[str, Any],
key: str,
multiple: bool = False,
) -> None:
"""UI nested dict items."""
ui_node = {"name": key, "type": "schema", "optional": True, "multiple": multiple}
nested_schema = []
for c_key, c_value in option_dict.items():
# Nested?
if isinstance(c_value, list):
_nested_ui_list(nested_schema, c_value, c_key)
else:
_single_ui_option(nested_schema, c_value, c_key)
ui_node["schema"] = nested_schema
ui_schema.append(ui_node)
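A short sketch of what schema_ui_options generates for a small add-on options schema; the option names below are invented for illustration. Note that in validate_options above, option values such as "!secret mqtt_password" are resolved through coresys.secrets before the type check runs.

raw_schema = {"password": "password", "port": "port?", "mode": "list(low|high)?"}

schema_ui_options(raw_schema)
# [
#     {"name": "password", "required": True, "type": "string", "format": "password"},
#     {"name": "port", "optional": True, "type": "integer"},
#     {"name": "mode", "optional": True, "type": "select", "options": ["low", "high"]},
# ]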

View File

@@ -1,4 +1,4 @@
"""Init file for Hass.io RESTful API."""
"""Init file for Supervisor RESTful API."""
import logging
from pathlib import Path
from typing import Optional
@@ -7,10 +7,13 @@ from aiohttp import web
from ..coresys import CoreSys, CoreSysAttributes
from .addons import APIAddons
from .audio import APIAudio
from .auth import APIAuth
from .cli import APICli
from .discovery import APIDiscovery
from .dns import APICoreDNS
from .hardware import APIHardware
from .hassos import APIHassOS
from .os import APIOS
from .homeassistant import APIHomeAssistant
from .host import APIHost
from .info import APIInfo
@@ -20,19 +23,24 @@ from .security import SecurityMiddleware
from .services import APIServices
from .snapshots import APISnapshots
from .supervisor import APISupervisor
from .multicast import APIMulticast
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
MAX_CLIENT_SIZE: int = 1024 ** 2 * 16
class RestAPI(CoreSysAttributes):
"""Handle RESTful API for Hass.io."""
"""Handle RESTful API for Supervisor."""
def __init__(self, coresys: CoreSys):
"""Initialize Docker base wrapper."""
self.coresys: CoreSys = coresys
self.security: SecurityMiddleware = SecurityMiddleware(coresys)
self.webapp: web.Application = web.Application(
middlewares=[self.security.token_validation]
client_max_size=MAX_CLIENT_SIZE,
middlewares=[self.security.token_validation],
)
# service stuff
@@ -43,7 +51,9 @@ class RestAPI(CoreSysAttributes):
"""Register REST API Calls."""
self._register_supervisor()
self._register_host()
self._register_hassos()
self._register_os()
self._register_cli()
self._register_multicast()
self._register_hardware()
self._register_homeassistant()
self._register_proxy()
@@ -55,6 +65,8 @@ class RestAPI(CoreSysAttributes):
self._register_services()
self._register_info()
self._register_auth()
self._register_dns()
self._register_audio()
def _register_host(self) -> None:
"""Register hostcontrol functions."""
@@ -64,6 +76,7 @@ class RestAPI(CoreSysAttributes):
self.webapp.add_routes(
[
web.get("/host/info", api_host.info),
web.get("/host/logs", api_host.logs),
web.post("/host/reboot", api_host.reboot),
web.post("/host/shutdown", api_host.shutdown),
web.post("/host/reload", api_host.reload),
@@ -76,17 +89,44 @@ class RestAPI(CoreSysAttributes):
]
)
def _register_hassos(self) -> None:
"""Register HassOS functions."""
api_hassos = APIHassOS()
api_hassos.coresys = self.coresys
def _register_os(self) -> None:
"""Register OS functions."""
api_os = APIOS()
api_os.coresys = self.coresys
self.webapp.add_routes(
[
web.get("/hassos/info", api_hassos.info),
web.post("/hassos/update", api_hassos.update),
web.post("/hassos/update/cli", api_hassos.update_cli),
web.post("/hassos/config/sync", api_hassos.config_sync),
web.get("/os/info", api_os.info),
web.post("/os/update", api_os.update),
web.post("/os/config/sync", api_os.config_sync),
]
)
def _register_cli(self) -> None:
"""Register HA cli functions."""
api_cli = APICli()
api_cli.coresys = self.coresys
self.webapp.add_routes(
[
web.get("/cli/info", api_cli.info),
web.get("/cli/stats", api_cli.stats),
web.post("/cli/update", api_cli.update),
]
)
def _register_multicast(self) -> None:
"""Register Multicast functions."""
api_multicast = APIMulticast()
api_multicast.coresys = self.coresys
self.webapp.add_routes(
[
web.get("/multicast/info", api_multicast.info),
web.get("/multicast/stats", api_multicast.stats),
web.get("/multicast/logs", api_multicast.logs),
web.post("/multicast/update", api_multicast.update),
web.post("/multicast/restart", api_multicast.restart),
]
)
@@ -99,6 +139,7 @@ class RestAPI(CoreSysAttributes):
[
web.get("/hardware/info", api_hardware.info),
web.get("/hardware/audio", api_hardware.audio),
web.post("/hardware/trigger", api_hardware.trigger),
]
)
@@ -114,7 +155,9 @@ class RestAPI(CoreSysAttributes):
api_auth = APIAuth()
api_auth.coresys = self.coresys
self.webapp.add_routes([web.post("/auth", api_auth.auth)])
self.webapp.add_routes(
[web.post("/auth", api_auth.auth), web.post("/auth/reset", api_auth.reset)]
)
def _register_supervisor(self) -> None:
"""Register Supervisor functions."""
@@ -130,6 +173,7 @@ class RestAPI(CoreSysAttributes):
web.post("/supervisor/update", api_supervisor.update),
web.post("/supervisor/reload", api_supervisor.reload),
web.post("/supervisor/options", api_supervisor.options),
web.post("/supervisor/repair", api_supervisor.repair),
]
)
@@ -140,6 +184,17 @@ class RestAPI(CoreSysAttributes):
self.webapp.add_routes(
[
web.get("/core/info", api_hass.info),
web.get("/core/logs", api_hass.logs),
web.get("/core/stats", api_hass.stats),
web.post("/core/options", api_hass.options),
web.post("/core/update", api_hass.update),
web.post("/core/restart", api_hass.restart),
web.post("/core/stop", api_hass.stop),
web.post("/core/start", api_hass.start),
web.post("/core/check", api_hass.check),
web.post("/core/rebuild", api_hass.rebuild),
# Remove with old Supervisor fallback
web.get("/homeassistant/info", api_hass.info),
web.get("/homeassistant/logs", api_hass.logs),
web.get("/homeassistant/stats", api_hass.stats),
@@ -160,6 +215,13 @@ class RestAPI(CoreSysAttributes):
self.webapp.add_routes(
[
web.get("/core/api/websocket", api_proxy.websocket),
web.get("/core/websocket", api_proxy.websocket),
web.get("/core/api/stream", api_proxy.stream),
web.post("/core/api/{path:.+}", api_proxy.api),
web.get("/core/api/{path:.+}", api_proxy.api),
web.get("/core/api/", api_proxy.api),
# Remove with old Supervisor fallback
web.get("/homeassistant/api/websocket", api_proxy.websocket),
web.get("/homeassistant/websocket", api_proxy.websocket),
web.get("/homeassistant/api/stream", api_proxy.stream),
@@ -191,6 +253,7 @@ class RestAPI(CoreSysAttributes):
web.get("/addons/{addon}/icon", api_addons.icon),
web.get("/addons/{addon}/logo", api_addons.logo),
web.get("/addons/{addon}/changelog", api_addons.changelog),
web.get("/addons/{addon}/documentation", api_addons.documentation),
web.post("/addons/{addon}/stdin", api_addons.stdin),
web.post("/addons/{addon}/security", api_addons.security),
web.get("/addons/{addon}/stats", api_addons.stats),
@@ -263,6 +326,45 @@ class RestAPI(CoreSysAttributes):
]
)
def _register_dns(self) -> None:
"""Register DNS functions."""
api_dns = APICoreDNS()
api_dns.coresys = self.coresys
self.webapp.add_routes(
[
web.get("/dns/info", api_dns.info),
web.get("/dns/stats", api_dns.stats),
web.get("/dns/logs", api_dns.logs),
web.post("/dns/update", api_dns.update),
web.post("/dns/options", api_dns.options),
web.post("/dns/restart", api_dns.restart),
web.post("/dns/reset", api_dns.reset),
]
)
def _register_audio(self) -> None:
"""Register Audio functions."""
api_audio = APIAudio()
api_audio.coresys = self.coresys
self.webapp.add_routes(
[
web.get("/audio/info", api_audio.info),
web.get("/audio/stats", api_audio.stats),
web.get("/audio/logs", api_audio.logs),
web.post("/audio/update", api_audio.update),
web.post("/audio/restart", api_audio.restart),
web.post("/audio/reload", api_audio.reload),
web.post("/audio/profile", api_audio.set_profile),
web.post("/audio/volume/{source}/application", api_audio.set_volume),
web.post("/audio/volume/{source}", api_audio.set_volume),
web.post("/audio/mute/{source}/application", api_audio.set_mute),
web.post("/audio/mute/{source}", api_audio.set_mute),
web.post("/audio/default/{source}", api_audio.set_default),
]
)
def _register_panel(self) -> None:
"""Register panel for Home Assistant."""
panel_dir = Path(__file__).parent.joinpath("panel")
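The routes above rename /hassos/* to /os/*, add /cli/, /multicast/, /dns/ and /audio/ endpoints, and keep the old /homeassistant/* paths as a fallback. As a rough sketch, an add-on could call one of the renamed endpoints like this; the http://supervisor hostname and the SUPERVISOR_TOKEN variable are assumptions about the add-on environment, not something this diff sets up:

import os

import aiohttp

async def get_os_info() -> dict:
    """Fetch /os/info (formerly /hassos/info) from the Supervisor API."""
    headers = {"Authorization": f"Bearer {os.environ['SUPERVISOR_TOKEN']}"}
    async with aiohttp.ClientSession(headers=headers) as session:
        async with session.get("http://supervisor/os/info") as resp:
            return await resp.json()

# Inside a running add-on: asyncio.run(get_os_info())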

View File

@@ -1,16 +1,17 @@
"""Init file for Hass.io Home Assistant RESTful API."""
"""Init file for Supervisor Home Assistant RESTful API."""
import asyncio
import logging
from typing import Any, Awaitable, Dict, List
from aiohttp import web
import voluptuous as vol
from voluptuous.humanize import humanize_error
from ..addons import AnyAddon
from ..addons.addon import Addon
from ..addons.utils import rating_security
from ..const import (
ATTR_ADDONS,
ATTR_ADVANCED,
ATTR_APPARMOR,
ATTR_ARCH,
ATTR_AUDIO,
@@ -30,7 +31,9 @@ from ..const import (
ATTR_DEVICES,
ATTR_DEVICETREE,
ATTR_DISCOVERY,
ATTR_DNS,
ATTR_DOCKER_API,
ATTR_DOCUMENTATION,
ATTR_FULL_ACCESS,
ATTR_GPIO,
ATTR_HASSIO_API,
@@ -41,6 +44,7 @@ from ..const import (
ATTR_HOST_IPC,
ATTR_HOST_NETWORK,
ATTR_HOST_PID,
ATTR_HOSTNAME,
ATTR_ICON,
ATTR_INGRESS,
ATTR_INGRESS_ENTRY,
@@ -50,12 +54,13 @@ from ..const import (
ATTR_INSTALLED,
ATTR_IP_ADDRESS,
ATTR_KERNEL_MODULES,
ATTR_LAST_VERSION,
ATTR_VERSION_LATEST,
ATTR_LOGO,
ATTR_LONG_DESCRIPTION,
ATTR_MACHINE,
ATTR_MAINTAINER,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_PERCENT,
ATTR_MEMORY_USAGE,
ATTR_NAME,
ATTR_NETWORK,
@@ -68,13 +73,17 @@ from ..const import (
ATTR_RATING,
ATTR_REPOSITORIES,
ATTR_REPOSITORY,
ATTR_SCHEMA,
ATTR_SERVICES,
ATTR_SLUG,
ATTR_SOURCE,
ATTR_STAGE,
ATTR_STATE,
ATTR_STDIN,
ATTR_UDEV,
ATTR_URL,
ATTR_VERSION,
ATTR_VIDEO,
ATTR_WEBUI,
BOOT_AUTO,
BOOT_MANUAL,
@@ -85,11 +94,12 @@ from ..const import (
STATE_NONE,
)
from ..coresys import CoreSysAttributes
from ..docker.stats import DockerStats
from ..exceptions import APIError
from ..validate import ALSA_DEVICE, DOCKER_PORTS
from ..validate import DOCKER_PORTS
from .utils import api_process, api_process_raw, api_validate
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
@@ -97,10 +107,10 @@ SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
SCHEMA_OPTIONS = vol.Schema(
{
vol.Optional(ATTR_BOOT): vol.In([BOOT_AUTO, BOOT_MANUAL]),
vol.Optional(ATTR_NETWORK): vol.Any(None, DOCKER_PORTS),
vol.Optional(ATTR_NETWORK): vol.Maybe(DOCKER_PORTS),
vol.Optional(ATTR_AUTO_UPDATE): vol.Boolean(),
vol.Optional(ATTR_AUDIO_OUTPUT): ALSA_DEVICE,
vol.Optional(ATTR_AUDIO_INPUT): ALSA_DEVICE,
vol.Optional(ATTR_AUDIO_OUTPUT): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_AUDIO_INPUT): vol.Maybe(vol.Coerce(str)),
vol.Optional(ATTR_INGRESS_PANEL): vol.Boolean(),
}
)
@@ -116,11 +126,14 @@ class APIAddons(CoreSysAttributes):
self, request: web.Request, check_installed: bool = True
) -> AnyAddon:
"""Return addon, throw an exception it it doesn't exist."""
addon_slug = request.match_info.get("addon")
addon_slug: str = request.match_info.get("addon")
# Lookup itself
if addon_slug == "self":
return request.get(REQUEST_FROM)
addon = request.get(REQUEST_FROM)
if not isinstance(addon, Addon):
raise APIError("Self is not an Addon")
return addon
addon = self.sys_addons.get(addon_slug)
if not addon:
@@ -141,6 +154,8 @@ class APIAddons(CoreSysAttributes):
ATTR_NAME: addon.name,
ATTR_SLUG: addon.slug,
ATTR_DESCRIPTON: addon.description,
ATTR_ADVANCED: addon.advanced,
ATTR_STAGE: addon.stage,
ATTR_VERSION: addon.latest_version,
ATTR_INSTALLED: addon.version if addon.is_installed else None,
ATTR_AVAILABLE: addon.available,
@@ -175,21 +190,26 @@ class APIAddons(CoreSysAttributes):
@api_process
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return add-on information."""
addon = self._extract_addon(request, check_installed=False)
addon: AnyAddon = self._extract_addon(request, check_installed=False)
data = {
ATTR_NAME: addon.name,
ATTR_SLUG: addon.slug,
ATTR_HOSTNAME: addon.hostname,
ATTR_DNS: addon.dns,
ATTR_DESCRIPTON: addon.description,
ATTR_LONG_DESCRIPTION: addon.long_description,
ATTR_ADVANCED: addon.advanced,
ATTR_STAGE: addon.stage,
ATTR_AUTO_UPDATE: None,
ATTR_REPOSITORY: addon.repository,
ATTR_VERSION: None,
ATTR_LAST_VERSION: addon.latest_version,
ATTR_VERSION_LATEST: addon.latest_version,
ATTR_PROTECTED: addon.protected,
ATTR_RATING: rating_security(addon),
ATTR_BOOT: addon.boot,
ATTR_OPTIONS: addon.options,
ATTR_SCHEMA: addon.schema_ui,
ATTR_ARCH: addon.supported_arch,
ATTR_MACHINE: addon.supported_machine,
ATTR_HOMEASSISTANT: addon.homeassistant_version,
@@ -211,6 +231,7 @@ class APIAddons(CoreSysAttributes):
ATTR_ICON: addon.with_icon,
ATTR_LOGO: addon.with_logo,
ATTR_CHANGELOG: addon.with_changelog,
ATTR_DOCUMENTATION: addon.with_documentation,
ATTR_STDIN: addon.with_stdin,
ATTR_WEBUI: None,
ATTR_HASSIO_API: addon.access_hassio_api,
@@ -220,7 +241,9 @@ class APIAddons(CoreSysAttributes):
ATTR_GPIO: addon.with_gpio,
ATTR_KERNEL_MODULES: addon.with_kernel_modules,
ATTR_DEVICETREE: addon.with_devicetree,
ATTR_UDEV: addon.with_udev,
ATTR_DOCKER_API: addon.access_docker_api,
ATTR_VIDEO: addon.with_video,
ATTR_AUDIO: addon.with_audio,
ATTR_AUDIO_INPUT: None,
ATTR_AUDIO_OUTPUT: None,
@@ -256,13 +279,18 @@ class APIAddons(CoreSysAttributes):
@api_process
async def options(self, request: web.Request) -> None:
"""Store user options for add-on."""
addon = self._extract_addon(request)
addon: AnyAddon = self._extract_addon(request)
# Update secrets for validation
await self.sys_secrets.reload()
# Extend schema with add-on specific validation
addon_schema = SCHEMA_OPTIONS.extend(
{vol.Optional(ATTR_OPTIONS): vol.Any(None, addon.schema)}
)
body = await api_validate(addon_schema, request)
# Validate/Process Body
body = await api_validate(addon_schema, request, origin=[ATTR_OPTIONS])
if ATTR_OPTIONS in body:
addon.options = body[ATTR_OPTIONS]
if ATTR_BOOT in body:
@@ -284,25 +312,26 @@ class APIAddons(CoreSysAttributes):
@api_process
async def security(self, request: web.Request) -> None:
"""Store security options for add-on."""
addon = self._extract_addon(request)
body = await api_validate(SCHEMA_SECURITY, request)
addon: AnyAddon = self._extract_addon(request)
body: Dict[str, Any] = await api_validate(SCHEMA_SECURITY, request)
if ATTR_PROTECTED in body:
_LOGGER.warning("Protected flag changing for %s!", addon.slug)
addon.protected = body[ATTR_PROTECTED]
addon.save_data()
addon.save_persist()
@api_process
async def stats(self, request: web.Request) -> Dict[str, Any]:
"""Return resource information."""
addon = self._extract_addon(request)
stats = await addon.stats()
addon: AnyAddon = self._extract_addon(request)
stats: DockerStats = await addon.stats()
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
ATTR_MEMORY_USAGE: stats.memory_usage,
ATTR_MEMORY_LIMIT: stats.memory_limit,
ATTR_MEMORY_PERCENT: stats.memory_percent,
ATTR_NETWORK_RX: stats.network_rx,
ATTR_NETWORK_TX: stats.network_tx,
ATTR_BLK_READ: stats.blk_read,
@@ -312,39 +341,31 @@ class APIAddons(CoreSysAttributes):
@api_process
def install(self, request: web.Request) -> Awaitable[None]:
"""Install add-on."""
addon = self._extract_addon(request, check_installed=False)
addon: AnyAddon = self._extract_addon(request, check_installed=False)
return asyncio.shield(addon.install())
@api_process
def uninstall(self, request: web.Request) -> Awaitable[None]:
"""Uninstall add-on."""
addon = self._extract_addon(request)
addon: AnyAddon = self._extract_addon(request)
return asyncio.shield(addon.uninstall())
@api_process
def start(self, request: web.Request) -> Awaitable[None]:
"""Start add-on."""
addon = self._extract_addon(request)
# check options
options = addon.options
try:
addon.schema(options)
except vol.Invalid as ex:
raise APIError(humanize_error(options, ex)) from None
addon: AnyAddon = self._extract_addon(request)
return asyncio.shield(addon.start())
@api_process
def stop(self, request: web.Request) -> Awaitable[None]:
"""Stop add-on."""
addon = self._extract_addon(request)
addon: AnyAddon = self._extract_addon(request)
return asyncio.shield(addon.stop())
@api_process
def update(self, request: web.Request) -> Awaitable[None]:
"""Update add-on."""
addon = self._extract_addon(request)
addon: AnyAddon = self._extract_addon(request)
if addon.latest_version == addon.version:
raise APIError("No update available!")
@@ -354,13 +375,13 @@ class APIAddons(CoreSysAttributes):
@api_process
def restart(self, request: web.Request) -> Awaitable[None]:
"""Restart add-on."""
addon = self._extract_addon(request)
addon: AnyAddon = self._extract_addon(request)
return asyncio.shield(addon.restart())
@api_process
def rebuild(self, request: web.Request) -> Awaitable[None]:
"""Rebuild local build add-on."""
addon = self._extract_addon(request)
addon: AnyAddon = self._extract_addon(request)
if not addon.need_build:
raise APIError("Only local build addons are supported")
@@ -369,13 +390,13 @@ class APIAddons(CoreSysAttributes):
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request: web.Request) -> Awaitable[bytes]:
"""Return logs from add-on."""
addon = self._extract_addon(request)
addon: AnyAddon = self._extract_addon(request)
return addon.logs()
@api_process_raw(CONTENT_TYPE_PNG)
async def icon(self, request: web.Request) -> bytes:
"""Return icon from add-on."""
addon = self._extract_addon(request, check_installed=False)
addon: AnyAddon = self._extract_addon(request, check_installed=False)
if not addon.with_icon:
raise APIError("No icon found!")
@@ -385,7 +406,7 @@ class APIAddons(CoreSysAttributes):
@api_process_raw(CONTENT_TYPE_PNG)
async def logo(self, request: web.Request) -> bytes:
"""Return logo from add-on."""
addon = self._extract_addon(request, check_installed=False)
addon: AnyAddon = self._extract_addon(request, check_installed=False)
if not addon.with_logo:
raise APIError("No logo found!")
@@ -395,17 +416,27 @@ class APIAddons(CoreSysAttributes):
@api_process_raw(CONTENT_TYPE_TEXT)
async def changelog(self, request: web.Request) -> str:
"""Return changelog from add-on."""
addon = self._extract_addon(request, check_installed=False)
addon: AnyAddon = self._extract_addon(request, check_installed=False)
if not addon.with_changelog:
raise APIError("No changelog found!")
with addon.path_changelog.open("r") as changelog:
return changelog.read()
@api_process_raw(CONTENT_TYPE_TEXT)
async def documentation(self, request: web.Request) -> str:
"""Return documentation from add-on."""
addon: AnyAddon = self._extract_addon(request, check_installed=False)
if not addon.with_documentation:
raise APIError("No documentation found!")
with addon.path_documentation.open("r") as documentation:
return documentation.read()
@api_process
async def stdin(self, request: web.Request) -> None:
"""Write to stdin of add-on."""
addon = self._extract_addon(request)
addon: AnyAddon = self._extract_addon(request)
if not addon.with_stdin:
raise APIError("STDIN not supported by add-on")

supervisor/api/audio.py (new file)
@@ -0,0 +1,170 @@
"""Init file for Supervisor Audio RESTful API."""
import asyncio
import logging
from typing import Any, Awaitable, Dict
from aiohttp import web
import attr
import voluptuous as vol
from ..const import (
ATTR_ACTIVE,
ATTR_APPLICATION,
ATTR_AUDIO,
ATTR_BLK_READ,
ATTR_BLK_WRITE,
ATTR_CARD,
ATTR_CPU_PERCENT,
ATTR_HOST,
ATTR_INDEX,
ATTR_INPUT,
ATTR_VERSION_LATEST,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_PERCENT,
ATTR_MEMORY_USAGE,
ATTR_NAME,
ATTR_NETWORK_RX,
ATTR_NETWORK_TX,
ATTR_OUTPUT,
ATTR_VERSION,
ATTR_VOLUME,
CONTENT_TYPE_BINARY,
)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError
from ..host.sound import StreamType
from .utils import api_process, api_process_raw, api_validate
_LOGGER: logging.Logger = logging.getLogger(__name__)
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
SCHEMA_VOLUME = vol.Schema(
{
vol.Required(ATTR_INDEX): vol.Coerce(int),
vol.Required(ATTR_VOLUME): vol.Coerce(float),
}
)
# pylint: disable=no-value-for-parameter
SCHEMA_MUTE = vol.Schema(
{
vol.Required(ATTR_INDEX): vol.Coerce(int),
vol.Required(ATTR_ACTIVE): vol.Boolean(),
}
)
SCHEMA_DEFAULT = vol.Schema({vol.Required(ATTR_NAME): vol.Coerce(str)})
SCHEMA_PROFILE = vol.Schema(
{vol.Required(ATTR_CARD): vol.Coerce(str), vol.Required(ATTR_NAME): vol.Coerce(str)}
)
class APIAudio(CoreSysAttributes):
"""Handle RESTful API for Audio functions."""
@api_process
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return Audio information."""
return {
ATTR_VERSION: self.sys_plugins.audio.version,
ATTR_VERSION_LATEST: self.sys_plugins.audio.latest_version,
ATTR_HOST: str(self.sys_docker.network.audio),
ATTR_AUDIO: {
ATTR_CARD: [attr.asdict(card) for card in self.sys_host.sound.cards],
ATTR_INPUT: [
attr.asdict(stream) for stream in self.sys_host.sound.inputs
],
ATTR_OUTPUT: [
attr.asdict(stream) for stream in self.sys_host.sound.outputs
],
ATTR_APPLICATION: [
attr.asdict(stream) for stream in self.sys_host.sound.applications
],
},
}
@api_process
async def stats(self, request: web.Request) -> Dict[str, Any]:
"""Return resource information."""
stats = await self.sys_plugins.audio.stats()
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
ATTR_MEMORY_USAGE: stats.memory_usage,
ATTR_MEMORY_LIMIT: stats.memory_limit,
ATTR_MEMORY_PERCENT: stats.memory_percent,
ATTR_NETWORK_RX: stats.network_rx,
ATTR_NETWORK_TX: stats.network_tx,
ATTR_BLK_READ: stats.blk_read,
ATTR_BLK_WRITE: stats.blk_write,
}
@api_process
async def update(self, request: web.Request) -> None:
"""Update Audio plugin."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_plugins.audio.latest_version)
if version == self.sys_plugins.audio.version:
raise APIError("Version {} is already in use".format(version))
await asyncio.shield(self.sys_plugins.audio.update(version))
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request: web.Request) -> Awaitable[bytes]:
"""Return Audio Docker logs."""
return self.sys_plugins.audio.logs()
@api_process
def restart(self, request: web.Request) -> Awaitable[None]:
"""Restart Audio plugin."""
return asyncio.shield(self.sys_plugins.audio.restart())
@api_process
def reload(self, request: web.Request) -> Awaitable[None]:
"""Reload Audio information."""
return asyncio.shield(self.sys_host.sound.update())
@api_process
async def set_volume(self, request: web.Request) -> None:
"""Set audio volume on stream."""
source: StreamType = StreamType(request.match_info.get("source"))
application: bool = request.path.endswith("application")
body = await api_validate(SCHEMA_VOLUME, request)
await asyncio.shield(
self.sys_host.sound.set_volume(
source, body[ATTR_INDEX], body[ATTR_VOLUME], application
)
)
@api_process
async def set_mute(self, request: web.Request) -> None:
"""Mute audio volume on stream."""
source: StreamType = StreamType(request.match_info.get("source"))
application: bool = request.path.endswith("application")
body = await api_validate(SCHEMA_MUTE, request)
await asyncio.shield(
self.sys_host.sound.set_mute(
source, body[ATTR_INDEX], body[ATTR_ACTIVE], application
)
)
@api_process
async def set_default(self, request: web.Request) -> None:
"""Set audio default stream."""
source: StreamType = StreamType(request.match_info.get("source"))
body = await api_validate(SCHEMA_DEFAULT, request)
await asyncio.shield(self.sys_host.sound.set_default(source, body[ATTR_NAME]))
@api_process
async def set_profile(self, request: web.Request) -> None:
"""Set audio default sources."""
body = await api_validate(SCHEMA_PROFILE, request)
await asyncio.shield(
self.sys_host.sound.ativate_profile(body[ATTR_CARD], body[ATTR_NAME])
)
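The new audio handlers take the stream source as a path segment (fed into StreamType) plus a small JSON body validated by the schemas above. A hedged sketch of setting an output volume via /audio/volume/{source}; the "output" source value, base URL, and token variable are assumptions for illustration.

"""Set an output volume through the new /audio endpoints (illustrative sketch)."""
import asyncio
import os

import aiohttp

SUPERVISOR_URL = "http://supervisor"  # assumed base URL
TOKEN = os.environ.get("SUPERVISOR_TOKEN", "")  # assumed token variable


async def set_output_volume(index: int, volume: float) -> None:
    # SCHEMA_VOLUME above requires "index" and "volume" in the JSON body.
    headers = {"Authorization": f"Bearer {TOKEN}"}
    async with aiohttp.ClientSession(headers=headers) as session:
        async with session.post(
            f"{SUPERVISOR_URL}/audio/volume/output",
            json={"index": index, "volume": volume},
        ) as resp:
            resp.raise_for_status()


if __name__ == "__main__":
    asyncio.run(set_output_volume(0, 0.5))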

supervisor/api/auth.py
@@ -1,22 +1,39 @@
"""Init file for Hass.io auth/SSO RESTful API."""
"""Init file for Supervisor auth/SSO RESTful API."""
import asyncio
import logging
from typing import Dict
from aiohttp import BasicAuth
from aiohttp import BasicAuth, web
from aiohttp.hdrs import AUTHORIZATION, CONTENT_TYPE, WWW_AUTHENTICATE
from aiohttp.web_exceptions import HTTPUnauthorized
from aiohttp.hdrs import CONTENT_TYPE, AUTHORIZATION, WWW_AUTHENTICATE
import voluptuous as vol
from .utils import api_process
from ..const import REQUEST_FROM, CONTENT_TYPE_JSON, CONTENT_TYPE_URL
from ..addons.addon import Addon
from ..const import (
ATTR_PASSWORD,
ATTR_USERNAME,
CONTENT_TYPE_JSON,
CONTENT_TYPE_URL,
REQUEST_FROM,
)
from ..coresys import CoreSysAttributes
from ..exceptions import APIForbidden
from .utils import api_process, api_validate
_LOGGER = logging.getLogger(__name__)
_LOGGER: logging.Logger = logging.getLogger(__name__)
SCHEMA_PASSWORD_RESET = vol.Schema(
{
vol.Required(ATTR_USERNAME): vol.Coerce(str),
vol.Required(ATTR_PASSWORD): vol.Coerce(str),
}
)
class APIAuth(CoreSysAttributes):
"""Handle RESTful API for auth functions."""
def _process_basic(self, request, addon):
def _process_basic(self, request: web.Request, addon: Addon) -> bool:
"""Process login request with basic auth.
Return a coroutine.
@@ -24,7 +41,9 @@ class APIAuth(CoreSysAttributes):
auth = BasicAuth.decode(request.headers[AUTHORIZATION])
return self.sys_auth.check_login(addon, auth.login, auth.password)
def _process_dict(self, request, addon, data):
def _process_dict(
self, request: web.Request, addon: Addon, data: Dict[str, str]
) -> bool:
"""Process login with dict data.
Return a coroutine.
@@ -35,7 +54,7 @@ class APIAuth(CoreSysAttributes):
return self.sys_auth.check_login(addon, username, password)
@api_process
async def auth(self, request):
async def auth(self, request: web.Request) -> bool:
"""Process login request."""
addon = request[REQUEST_FROM]
@@ -57,5 +76,13 @@ class APIAuth(CoreSysAttributes):
return await self._process_dict(request, addon, data)
raise HTTPUnauthorized(
headers={WWW_AUTHENTICATE: 'Basic realm="Hass.io Authentication"'}
headers={WWW_AUTHENTICATE: 'Basic realm="Home Assistant Authentication"'}
)
@api_process
async def reset(self, request: web.Request) -> None:
"""Process reset password request."""
body: Dict[str, str] = await api_validate(SCHEMA_PASSWORD_RESET, request)
await asyncio.shield(
self.sys_auth.change_password(body[ATTR_USERNAME], body[ATTR_PASSWORD])
)
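The auth API gains a /auth/reset handler that takes a username and password validated by SCHEMA_PASSWORD_RESET and forwards them to sys_auth.change_password. A hedged client sketch; the base URL and token variable are assumptions for illustration.

"""Trigger a Home Assistant password reset via /auth/reset (illustrative sketch)."""
import asyncio
import os

import aiohttp

SUPERVISOR_URL = "http://supervisor"  # assumed base URL
TOKEN = os.environ.get("SUPERVISOR_TOKEN", "")  # assumed token variable


async def reset_password(username: str, password: str) -> None:
    # SCHEMA_PASSWORD_RESET requires both "username" and "password".
    headers = {"Authorization": f"Bearer {TOKEN}"}
    async with aiohttp.ClientSession(headers=headers) as session:
        async with session.post(
            f"{SUPERVISOR_URL}/auth/reset",
            json={"username": username, "password": password},
        ) as resp:
            resp.raise_for_status()


if __name__ == "__main__":
    asyncio.run(reset_password("homeassistant", "new-secret"))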

supervisor/api/cli.py (new file)
@@ -0,0 +1,62 @@
"""Init file for Supervisor HA cli RESTful API."""
import asyncio
import logging
from typing import Any, Dict
from aiohttp import web
import voluptuous as vol
from ..const import (
ATTR_VERSION,
ATTR_VERSION_LATEST,
ATTR_BLK_READ,
ATTR_BLK_WRITE,
ATTR_CPU_PERCENT,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_PERCENT,
ATTR_MEMORY_USAGE,
ATTR_NETWORK_RX,
ATTR_NETWORK_TX,
)
from ..coresys import CoreSysAttributes
from .utils import api_process, api_validate
_LOGGER: logging.Logger = logging.getLogger(__name__)
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
class APICli(CoreSysAttributes):
"""Handle RESTful API for HA Cli functions."""
@api_process
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return HA cli information."""
return {
ATTR_VERSION: self.sys_plugins.cli.version,
ATTR_VERSION_LATEST: self.sys_plugins.cli.latest_version,
}
@api_process
async def stats(self, request: web.Request) -> Dict[str, Any]:
"""Return resource information."""
stats = await self.sys_plugins.cli.stats()
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
ATTR_MEMORY_USAGE: stats.memory_usage,
ATTR_MEMORY_LIMIT: stats.memory_limit,
ATTR_MEMORY_PERCENT: stats.memory_percent,
ATTR_NETWORK_RX: stats.network_rx,
ATTR_NETWORK_TX: stats.network_tx,
ATTR_BLK_READ: stats.blk_read,
ATTR_BLK_WRITE: stats.blk_write,
}
@api_process
async def update(self, request: web.Request) -> None:
"""Update HA CLI."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_plugins.cli.latest_version)
await asyncio.shield(self.sys_plugins.cli.update(version))
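supervisor/api/cli.py adds info, stats, and update handlers for the HA CLI plugin. Their route registration is not part of this excerpt, so the /cli/info path below is an assumption for illustration, as are the base URL and token variable.

"""Query the HA CLI plugin version via the Supervisor API (illustrative sketch)."""
import asyncio
import os
from typing import Any, Dict

import aiohttp

SUPERVISOR_URL = "http://supervisor"  # assumed base URL
TOKEN = os.environ.get("SUPERVISOR_TOKEN", "")  # assumed token variable


async def cli_info() -> Dict[str, Any]:
    headers = {"Authorization": f"Bearer {TOKEN}"}
    async with aiohttp.ClientSession(headers=headers) as session:
        # "/cli/info" assumes the APICli.info handler is registered under /cli.
        async with session.get(f"{SUPERVISOR_URL}/cli/info") as resp:
            resp.raise_for_status()
            return await resp.json()


if __name__ == "__main__":
    print(asyncio.run(cli_info()))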

supervisor/api/discovery.py
@@ -1,4 +1,4 @@
"""Init file for Hass.io network RESTful API."""
"""Init file for Supervisor network RESTful API."""
import voluptuous as vol
from .utils import api_process, api_validate
@@ -64,7 +64,7 @@ class APIDiscovery(CoreSysAttributes):
# Access?
if body[ATTR_SERVICE] not in addon.discovery:
raise APIForbidden(f"Can't use discovery!")
raise APIForbidden("Can't use discovery!")
# Process discovery message
message = self.sys_discovery.send(addon, **body)
@@ -94,7 +94,7 @@ class APIDiscovery(CoreSysAttributes):
# Permission
if message.addon != addon.slug:
raise APIForbidden(f"Can't remove discovery message")
raise APIForbidden("Can't remove discovery message")
self.sys_discovery.remove(message)
return True

supervisor/api/dns.py (new file)
@@ -0,0 +1,102 @@
"""Init file for Supervisor DNS RESTful API."""
import asyncio
import logging
from typing import Any, Awaitable, Dict
from aiohttp import web
import voluptuous as vol
from ..const import (
ATTR_BLK_READ,
ATTR_BLK_WRITE,
ATTR_CPU_PERCENT,
ATTR_HOST,
ATTR_VERSION_LATEST,
ATTR_LOCALS,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_PERCENT,
ATTR_MEMORY_USAGE,
ATTR_NETWORK_RX,
ATTR_NETWORK_TX,
ATTR_SERVERS,
ATTR_VERSION,
CONTENT_TYPE_BINARY,
)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError
from ..validate import dns_server_list
from .utils import api_process, api_process_raw, api_validate
_LOGGER: logging.Logger = logging.getLogger(__name__)
# pylint: disable=no-value-for-parameter
SCHEMA_OPTIONS = vol.Schema({vol.Optional(ATTR_SERVERS): dns_server_list})
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
class APICoreDNS(CoreSysAttributes):
"""Handle RESTful API for DNS functions."""
@api_process
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return DNS information."""
return {
ATTR_VERSION: self.sys_plugins.dns.version,
ATTR_VERSION_LATEST: self.sys_plugins.dns.latest_version,
ATTR_HOST: str(self.sys_docker.network.dns),
ATTR_SERVERS: self.sys_plugins.dns.servers,
ATTR_LOCALS: self.sys_host.network.dns_servers,
}
@api_process
async def options(self, request: web.Request) -> None:
"""Set DNS options."""
body = await api_validate(SCHEMA_OPTIONS, request)
if ATTR_SERVERS in body:
self.sys_plugins.dns.servers = body[ATTR_SERVERS]
self.sys_create_task(self.sys_plugins.dns.restart())
self.sys_plugins.dns.save_data()
@api_process
async def stats(self, request: web.Request) -> Dict[str, Any]:
"""Return resource information."""
stats = await self.sys_plugins.dns.stats()
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
ATTR_MEMORY_USAGE: stats.memory_usage,
ATTR_MEMORY_LIMIT: stats.memory_limit,
ATTR_MEMORY_PERCENT: stats.memory_percent,
ATTR_NETWORK_RX: stats.network_rx,
ATTR_NETWORK_TX: stats.network_tx,
ATTR_BLK_READ: stats.blk_read,
ATTR_BLK_WRITE: stats.blk_write,
}
@api_process
async def update(self, request: web.Request) -> None:
"""Update DNS plugin."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_plugins.dns.latest_version)
if version == self.sys_plugins.dns.version:
raise APIError("Version {} is already in use".format(version))
await asyncio.shield(self.sys_plugins.dns.update(version))
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request: web.Request) -> Awaitable[bytes]:
"""Return DNS Docker logs."""
return self.sys_plugins.dns.logs()
@api_process
def restart(self, request: web.Request) -> Awaitable[None]:
"""Restart CoreDNS plugin."""
return asyncio.shield(self.sys_plugins.dns.restart())
@api_process
def reset(self, request: web.Request) -> Awaitable[None]:
"""Reset CoreDNS plugin."""
return asyncio.shield(self.sys_plugins.dns.reset())
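The DNS options handler stores an upstream server list validated by dns_server_list and schedules a CoreDNS restart. A hedged sketch of calling /dns/options; the dns:// server format, base URL, and token variable are assumptions for illustration.

"""Point CoreDNS at custom upstream servers via /dns/options (illustrative sketch)."""
import asyncio
import os
from typing import List

import aiohttp

SUPERVISOR_URL = "http://supervisor"  # assumed base URL
TOKEN = os.environ.get("SUPERVISOR_TOKEN", "")  # assumed token variable


async def set_dns_servers(servers: List[str]) -> None:
    # SCHEMA_OPTIONS validates the body against dns_server_list, e.g. "dns://1.1.1.1".
    headers = {"Authorization": f"Bearer {TOKEN}"}
    async with aiohttp.ClientSession(headers=headers) as session:
        async with session.post(
            f"{SUPERVISOR_URL}/dns/options", json={"servers": servers}
        ) as resp:
            resp.raise_for_status()


if __name__ == "__main__":
    asyncio.run(set_dns_servers(["dns://1.1.1.1"]))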

Some files were not shown because too many files have changed in this diff.