Compare commits


45 Commits
168 ... 174

Author SHA1 Message Date
Pascal Vizeli
3cf189ad94 Merge pull request #1209 from home-assistant/dev
Release 174
2019-08-14 15:38:57 +02:00
dependabot-preview[bot]
6ffb94a0f5 Bump ptvsd from 4.3.1 to 4.3.2 (#1207)
Bumps [ptvsd](https://github.com/Microsoft/ptvsd) from 4.3.1 to 4.3.2.
- [Release notes](https://github.com/Microsoft/ptvsd/releases)
- [Commits](https://github.com/Microsoft/ptvsd/compare/v4.3.1...v4.3.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-14 14:35:43 +02:00
Pascal Vizeli
3593826441 Fix issue with windows dev env 2019-08-14 10:37:39 +00:00
Pascal Vizeli
0a0a62f238 Addon provide his own udev support (#1206)
* Addon provide his own udev support

* upgrade logger
2019-08-14 12:29:00 +02:00
Pascal Vizeli
41ce9913d2 Stats percent (#1205)
* Fix stats and add Memory percent

* Fix tasks

* round percent
2019-08-14 10:47:11 +02:00
Pascal Vizeli
b77c42384d Add DNS to add-on (#1204) 2019-08-14 09:53:03 +02:00
Pascal Vizeli
138bb12f98 Add debug output to gdbus (#1203) 2019-08-13 21:25:04 +02:00
Pascal Vizeli
4fe2859f4e Rename scripts folder (#1202)
* Rename script folder

* Rename scripts
2019-08-13 14:39:32 +02:00
dependabot-preview[bot]
0768b2b4bc Bump ptvsd from 4.3.0 to 4.3.1 (#1200)
Bumps [ptvsd](https://github.com/Microsoft/ptvsd) from 4.3.0 to 4.3.1.
- [Release notes](https://github.com/Microsoft/ptvsd/releases)
- [Commits](https://github.com/Microsoft/ptvsd/compare/v4.3.0...v4.3.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-13 14:35:41 +02:00
dependabot-preview[bot]
e6f1772a93 Bump gitpython from 2.1.13 to 3.0.0 (#1199)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 2.1.13 to 3.0.0.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/2.1.13...3.0.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-13 14:35:15 +02:00
dependabot-preview[bot]
5374b2b3b9 Bump voluptuous from 0.11.5 to 0.11.7 (#1201)
Bumps [voluptuous](https://github.com/alecthomas/voluptuous) from 0.11.5 to 0.11.7.
- [Release notes](https://github.com/alecthomas/voluptuous/releases)
- [Changelog](https://github.com/alecthomas/voluptuous/blob/master/CHANGELOG.md)
- [Commits](https://github.com/alecthomas/voluptuous/commits)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-13 14:29:33 +02:00
Pascal Vizeli
1196788856 Add CoreDNS as DNS backend (#1195)
* Add CoreDNS / DNS configuration

* Support get version

* add version

* add coresys

* Add more logic

* move forwareder into dns

* Setup docker inside

* add docker to env

* Add more function

* more interface

* Update hosts template

* Add DNS folder

* Fix issues

* Add more logic

* Add handling for hosts

* Fix setting

* fix lint

* Fix some issues

* Fix issue

* Run with no cache

* Fix issue on validate

* Fix bug

* Allow to jump into dev mode

* Fix permission

* Fix issue

* Add dns search

* Add watchdog

* Fix set issues

* add API description

* Add API endpoint

* Add CLI support

* Fix logs + add hostname

* Add/remove DNS entry

* Fix attribute

* Fix style

* Better shutdown

* Remove ha from network mapping

* Add more options

* Fix env shutdown

* Add support for new repair function

* Start coreDNS faster after restart

* remove options

* Fix ha fix
2019-08-13 14:20:42 +02:00
Pascal Vizeli
9f3f47eb80 Bump version 174 2019-08-11 09:59:48 +02:00
Pascal Vizeli
1a90a478f2 Merge pull request #1197 from home-assistant/dev
Release 173
2019-08-11 09:39:17 +02:00
Pascal Vizeli
ee773f3b63 Fix hanging landingpage (#1196) 2019-08-11 09:05:47 +02:00
Pascal Vizeli
5ffc27f60c Bump version 173 2019-08-08 23:24:11 +02:00
Pascal Vizeli
4c13dfb43c Merge pull request #1194 from home-assistant/dev
Release 172
2019-08-08 23:21:26 +02:00
Pascal Vizeli
bc099f0d81 Fix Version detection with exists container (#1193) 2019-08-08 23:20:26 +02:00
Pascal Vizeli
b26dd0af19 Add better log output for repair (#1191) 2019-08-08 10:14:13 +02:00
Pascal Vizeli
0dee5bd763 Fix black formating args 2019-08-08 10:13:44 +02:00
Pascal Vizeli
0765387ad8 Bump version 172 2019-08-07 18:18:09 +02:00
Pascal Vizeli
a07517bd3c Merge pull request #1190 from home-assistant/dev
Release 171
2019-08-07 18:17:30 +02:00
Pascal Vizeli
e5f0d80d96 Start API server before he beform a self update (#1189) 2019-08-07 18:03:56 +02:00
Pascal Vizeli
2fc5e3b7d9 Repair / fixup docker overlayfs issues (#1170)
* Add a repair modus

* Add repair to add-ons

* repair to cli

* Add API call

* fix sync call

* Clean all images

* Fix repair

* Fix supervisor

* Add new function to core

* fix tagging

* better style

* use retag

* new retag function

* Fix lint

* Fix import export
2019-08-07 17:26:32 +02:00
Pascal Vizeli
778bc46848 Don't relay on latest with HA/Addons (#1175)
* Don't relay on latest with HA/Addons

* Fix latest on install

* Revert some options

* Fix attach

* migrate to new version handling

* Fix thread

* Fix is running

* Allow wait

* debug code

* Fix debug value

* Fix list

* Fix regex

* Some better log output

* Fix logic

* Improve cleanup handling

* Fix bug

* Cleanup old code

* Improve version handling

* Fix the way to attach
2019-08-07 09:51:27 +02:00
Pascal Vizeli
882586b246 Fix time adjustments on latest boot (#1187)
* Fix time adjustments on latest boot

* Fix spell
2019-08-06 09:24:22 +02:00
dependabot-preview[bot]
b7c07a2555 Bump pytz from 2019.1 to 2019.2 (#1184)
Bumps [pytz](https://github.com/stub42/pytz) from 2019.1 to 2019.2.
- [Release notes](https://github.com/stub42/pytz/releases)
- [Commits](https://github.com/stub42/pytz/compare/release_2019.1...release_2019.2)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-08-02 10:32:04 +02:00
dependabot-preview[bot]
814b504fa9 Bump ptvsd from 4.2.10 to 4.3.0 (#1179)
Bumps [ptvsd](https://github.com/Microsoft/ptvsd) from 4.2.10 to 4.3.0.
- [Release notes](https://github.com/Microsoft/ptvsd/releases)
- [Commits](https://github.com/Microsoft/ptvsd/compare/v4.2.10...v4.3.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-07-29 17:01:28 +02:00
dependabot-preview[bot]
7ae430e7a8 Bump gitpython from 2.1.12 to 2.1.13 (#1178)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 2.1.12 to 2.1.13.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/2.1.12...2.1.13)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-07-29 14:53:54 +02:00
dependabot-preview[bot]
0e7e95ba20 Bump gitpython from 2.1.11 to 2.1.12 (#1171)
Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 2.1.11 to 2.1.12.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/master/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/2.1.11...2.1.12)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-07-22 14:18:11 +02:00
Pascal Vizeli
e577d8acb2 Bump version 171 2019-07-19 11:49:00 +02:00
Pascal Vizeli
0a76ab5054 Merge pull request #1168 from home-assistant/dev
Release 170
2019-07-19 11:48:28 +02:00
Pascal Vizeli
03c5596e04 Fix machine version (#1167) 2019-07-19 11:47:55 +02:00
Pascal Vizeli
3af4e14e83 Bump version 170 2019-07-16 12:36:05 +02:00
Pascal Vizeli
7c8cf57820 Merge pull request #1164 from home-assistant/dev
Release 169
2019-07-16 12:35:10 +02:00
Pascal Vizeli
8d84a8a62e Update panel & support panel on devcontainer (#1163)
* Update panel & support panel on devcontainer

* small cleanups

* small size
2019-07-16 12:23:03 +02:00
Pascal Vizeli
08c45060bd Add support for RPi4 (#1162) 2019-07-16 10:33:56 +02:00
Pascal Vizeli
7ca8d2811b Update URL for version file (#1161) 2019-07-16 10:26:59 +02:00
Pascal Vizeli
bb6898b032 Update azure-pipelines.yml for Azure Pipelines 2019-07-12 09:57:55 +02:00
Pascal Vizeli
cd86c6814e Update azure-pipelines.yml for Azure Pipelines 2019-07-12 09:42:55 +02:00
Pascal Vizeli
b67e116650 Update azure-pipelines.yml 2019-07-12 09:41:40 +02:00
Pascal Vizeli
57ce411fb6 Update azure-pipelines.yml 2019-07-11 22:11:37 +02:00
Pascal Vizeli
85ed4d9e8d Update Dockerfile 2019-07-11 19:25:07 +02:00
dependabot-preview[bot]
ccb39da569 Bump flake8 from 3.7.7 to 3.7.8 (#1154)
Bumps [flake8](https://gitlab.com/pycqa/flake8) from 3.7.7 to 3.7.8.
- [Release notes](https://gitlab.com/pycqa/flake8/tags)
- [Commits](https://gitlab.com/pycqa/flake8/compare/3.7.7...3.7.8)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2019-07-09 14:05:43 +02:00
Pascal Vizeli
dd7ba64d32 Bump version 169 2019-07-08 16:03:59 +02:00
132 changed files with 1577 additions and 335 deletions


@@ -1,9 +1,40 @@
 FROM python:3.7
-WORKDIR /workspace
+WORKDIR /workspaces
+
+# Install Node/Yarn for Frontent
+RUN apt-get update && apt-get install -y --no-install-recommends \
+        curl \
+        git \
+        apt-utils \
+        apt-transport-https \
+    && curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - \
+    && echo "deb https://dl.yarnpkg.com/debian/ stable main" | tee /etc/apt/sources.list.d/yarn.list \
+    && apt-get update && apt-get install -y --no-install-recommends \
+        nodejs \
+        yarn \
+    && curl -o - https://raw.githubusercontent.com/creationix/nvm/v0.34.0/install.sh | bash \
+    && rm -rf /var/lib/apt/lists/*
+ENV NVM_DIR /root/.nvm
+
+# Install docker
+# https://docs.docker.com/engine/installation/linux/docker-ce/ubuntu/
+RUN apt-get update && apt-get install -y --no-install-recommends \
+        apt-transport-https \
+        ca-certificates \
+        curl \
+        software-properties-common \
+        gpg-agent \
+    && curl -fsSL https://download.docker.com/linux/debian/gpg | apt-key add - \
+    && add-apt-repository "deb https://download.docker.com/linux/debian $(lsb_release -cs) stable" \
+    && apt-get update && apt-get install -y --no-install-recommends \
+        docker-ce \
+        docker-ce-cli \
+        containerd.io \
+    && rm -rf /var/lib/apt/lists/*
+
 # Install Python dependencies from requirements.txt if it exists
-COPY requirements.txt requirements_tests.txt /workspace/
+COPY requirements.txt requirements_tests.txt /workspaces/
 RUN pip install -r requirements.txt \
     && pip3 install -r requirements_tests.txt \
     && pip install black tox


@@ -3,8 +3,11 @@
     "name": "Hass.io dev",
     "context": "..",
     "dockerFile": "Dockerfile",
+    "appPort": "9123:8123",
     "runArgs": [
-        "-e", "GIT_EDTIOR='code --wait'"
+        "-e",
+        "GIT_EDITOR='code --wait'",
+        "--privileged"
     ],
     "extensions": [
         "ms-python.python"
@@ -14,9 +17,13 @@
         "python.linting.pylintEnabled": true,
         "python.linting.enabled": true,
         "python.formatting.provider": "black",
+        "python.formatting.blackArgs": [
+            "--target-version",
+            "py37"
+        ],
         "editor.formatOnPaste": false,
         "editor.formatOnSave": true,
         "editor.formatOnType": true,
         "files.trimTrailingWhitespace": true
     }
 }


@@ -1,13 +1,23 @@
 # General files
 .git
 .github
+.devcontainer
+.vscode
 
 # Test related files
 .tox
 
 # Temporary files
 **/__pycache__
+.pytest_cache
 
 # virtualenv
 venv/
+ENV/
+
+# HA
+home-assistant-polymer/*
+misc/*
+script/*
+
+# Test ENV
+data/

.gitignore

@@ -92,4 +92,6 @@ ENV/
 .pylint.d/
 
 # VS Code
-.vscode/
+.vscode/*
+!.vscode/cSpell.json
+!.vscode/tasks.json

.vscode/tasks.json (new file)

@@ -0,0 +1,92 @@
{
"version": "2.0.0",
"tasks": [
{
"label": "Run Testenv",
"type": "shell",
"command": "./scripts/test_env.sh",
"group": {
"kind": "test",
"isDefault": true,
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Run Testenv CLI",
"type": "shell",
"command": "docker run --rm -ti -v /etc/machine-id:/etc/machine-id --network=hassio --add-host hassio:172.30.32.2 homeassistant/amd64-hassio-cli:dev",
"group": {
"kind": "test",
"isDefault": true,
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Update UI",
"type": "shell",
"command": "./scripts/update-frontend.sh",
"group": {
"kind": "build",
"isDefault": true
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Pytest",
"type": "shell",
"command": "pytest --timeout=10 tests",
"group": {
"kind": "test",
"isDefault": true,
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Flake8",
"type": "shell",
"command": "flake8 hassio tests",
"group": {
"kind": "test",
"isDefault": true,
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Pylint",
"type": "shell",
"command": "pylint hassio",
"dependsOn": [
"Install all Requirements"
],
"group": {
"kind": "test",
"isDefault": true,
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
}
]
}

API.md

@@ -105,6 +105,7 @@ Output is the raw docker log.
     "cpu_percent": 0.0,
     "memory_usage": 283123,
     "memory_limit": 329392,
+    "memory_percent": 1.4,
     "network_tx": 0,
     "network_rx": 0,
     "blk_read": 0,
@@ -112,6 +113,10 @@ Output is the raw docker log.
 }
 ```
 
+- GET `/supervisor/repair`
+
+Repair overlayfs issue and restore lost images
+
 ### Snapshot
 
 - GET `/snapshots`
@@ -417,6 +422,7 @@ Proxy to real websocket instance.
     "cpu_percent": 0.0,
     "memory_usage": 283123,
     "memory_limit": 329392,
+    "memory_percent": 1.4,
     "network_tx": 0,
     "network_rx": 0,
     "blk_read": 0,
@@ -469,6 +475,8 @@ Get all available addons.
     {
         "name": "xy bla",
         "slug": "xdssd_xybla",
+        "hostname": "xdssd-xybla",
+        "dns": [],
         "description": "description",
         "long_description": "null|markdown",
         "auto_update": "bool",
@@ -494,6 +502,7 @@ Get all available addons.
     "privileged": ["NET_ADMIN", "SYS_ADMIN"],
     "apparmor": "disable|default|profile",
     "devices": ["/dev/xy"],
+    "udev": "bool",
     "auto_uart": "bool",
     "icon": "bool",
     "logo": "bool",
@@ -589,6 +598,7 @@ Write data to add-on stdin
     "cpu_percent": 0.0,
     "memory_usage": 283123,
     "memory_limit": 329392,
+    "memory_percent": 1.4,
     "network_tx": 0,
     "network_rx": 0,
     "blk_read": 0,
@@ -735,6 +745,48 @@ return:
 }
 ```
 
+### DNS
+
+- GET `/dns/info`
+
+```json
+{
+    "host": "ip-address",
+    "version": "1",
+    "latest_version": "2",
+    "servers": ["dns://8.8.8.8"]
+}
+```
+
+- POST `/dns/options`
+
+```json
+{
+    "servers": ["dns://8.8.8.8"]
+}
+```
+
+- POST `/dns/update`
+
+```json
+{
+    "version": "VERSION"
+}
+```
+
+- GET `/dns/logs`
+
+- GET `/dns/stats`
+
+```json
+{
+    "cpu_percent": 0.0,
+    "memory_usage": 283123,
+    "memory_limit": 329392,
+    "memory_percent": 1.4,
+    "network_tx": 0,
+    "network_rx": 0,
+    "blk_read": 0,
+    "blk_write": 0
+}
+```
+
 ### Auth / SSO API
 
 You can use the user system on homeassistant. We handle this auth system on
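For readers wiring up the new `/dns` endpoints documented above, here is a minimal client sketch. The host name (`hassio`) and the auth header (`X-Hassio-Key`) are assumptions about the deployment, not part of this changeset, and `memory_percent` merely recomputes the field the stats payloads now include:

```python
# Sketch of a client for the /dns endpoints above. Host and header
# names are assumptions; adjust for your Supervisor setup.
SUPERVISOR = "http://hassio"

DNS_ENDPOINTS = {
    "info": ("GET", "/dns/info"),
    "options": ("POST", "/dns/options"),
    "update": ("POST", "/dns/update"),
    "logs": ("GET", "/dns/logs"),
    "stats": ("GET", "/dns/stats"),
}


def dns_request(name: str, token: str) -> tuple:
    """Return (method, url, headers) for one of the DNS API calls."""
    method, path = DNS_ENDPOINTS[name]
    return method, f"{SUPERVISOR}{path}", {"X-Hassio-Key": token}


def memory_percent(stats: dict) -> float:
    """Recompute the memory_percent field from the raw stats values."""
    return round(stats["memory_usage"] / stats["memory_limit"] * 100, 1)
```

Any HTTP library can consume the returned triple; the sample values in the JSON payloads above are illustrative, not internally consistent.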


@@ -1,8 +1,6 @@
 ARG BUILD_FROM
 FROM $BUILD_FROM
 
-ARG BUILD_ARCH
-
 # Install base
 RUN apk add --no-cache \
     openssl \
@@ -15,20 +13,26 @@ RUN apk add --no-cache \
     eudev \
     eudev-libs
 
+ARG BUILD_ARCH
+WORKDIR /usr/src
+
 # Install requirements
-COPY requirements.txt /usr/src/
+COPY requirements.txt .
 RUN export MAKEFLAGS="-j$(nproc)" \
-    && pip3 install --no-cache-dir --find-links "https://wheels.home-assistant.io/alpine-$(cut -d '.' -f 1-2 < /etc/alpine-release)/${BUILD_ARCH}/" \
-        -r /usr/src/requirements.txt \
-    && rm -f /usr/src/requirements.txt
+    && pip3 install --no-cache-dir --no-index --only-binary=:all: --find-links \
+        "https://wheels.home-assistant.io/alpine-$(cut -d '.' -f 1-2 < /etc/alpine-release)/${BUILD_ARCH}/" \
+        -r ./requirements.txt \
+    && rm -f requirements.txt
 
 # Install HassIO
-COPY . /usr/src/hassio
-RUN pip3 install --no-cache-dir -e /usr/src/hassio \
-    && python3 -m compileall /usr/src/hassio/hassio
+COPY . hassio
+RUN pip3 install --no-cache-dir -e ./hassio \
+    && python3 -m compileall ./hassio/hassio
 
 # Initialize udev daemon, handle CMD
 COPY entry.sh /bin/
 ENTRYPOINT ["/bin/entry.sh"]
 
+WORKDIR /
 CMD [ "python3", "-m", "hassio" ]


@@ -21,7 +21,7 @@ variables:
   - name: versionBuilder
     value: '4.4'
   - name: versionWheels
-    value: '0.11'
+    value: '1.0-3.7-alpine3.10'
   - group: docker
   - group: wheels
@@ -52,7 +52,7 @@ stages:
           versionSpec: '3.7'
       - script: pip install black
        displayName: 'Install black'
-      - script: black --check hassio tests
+      - script: black --target-version py37 --check hassio tests
        displayName: 'Run Black'
  - job: 'JQ'
    pool:
@@ -84,7 +84,7 @@ stages:
    pool:
      vmImage: 'ubuntu-latest'
    strategy:
-      maxParallel: 3
+      maxParallel: 5
      matrix:
        amd64:
          buildArch: 'amd64'
@@ -114,11 +114,11 @@ stages:
          ssh-keyscan -H $(wheelsHost) >> .ssh/known_hosts
          chmod 600 .ssh/*
        displayName: 'Install ssh key'
-      - script: sudo docker pull homeassistant/$(buildArch)-wheels:$(versionWheels)-$(basePythonTag)
+      - script: sudo docker pull homeassistant/$(buildArch)-wheels:$(versionWheels)
        displayName: 'Install wheels builder'
      - script: |
          sudo docker run --rm -v $(pwd):/data:ro -v $(pwd)/.ssh:/root/.ssh:rw \
-            homeassistant/$(buildArch)-wheels:$(versionWheels)-$(basePythonTag) \
+            homeassistant/$(buildArch)-wheels:$(versionWheels) \
            --apk "build-base;libffi-dev;openssl-dev" \
            --index $(wheelsIndex) \
            --requirement requirements.txt \


@@ -38,9 +38,10 @@ if __name__ == "__main__":
     _LOGGER.info("Initialize Hass.io setup")
     coresys = loop.run_until_complete(bootstrap.initialize_coresys())
+    loop.run_until_complete(coresys.core.connect())
 
-    bootstrap.migrate_system_env(coresys)
     bootstrap.supervisor_debugger(coresys)
+    bootstrap.migrate_system_env(coresys)
 
     _LOGGER.info("Setup HassIO")
     loop.run_until_complete(coresys.core.setup())


@@ -130,6 +130,7 @@ class AddonManager(CoreSysAttributes):
             raise AddonsError() from None
         else:
             self.local[slug] = addon
+            _LOGGER.info("Add-on '%s' successfully installed", slug)
 
     async def uninstall(self, slug: str) -> None:
         """Remove an add-on."""
@@ -159,6 +160,8 @@ class AddonManager(CoreSysAttributes):
         self.data.uninstall(addon)
         self.local.pop(slug)
 
+        _LOGGER.info("Add-on '%s' successfully removed", slug)
+
     async def update(self, slug: str) -> None:
         """Update add-on."""
         if slug not in self.local:
@@ -184,9 +187,15 @@ class AddonManager(CoreSysAttributes):
         last_state = await addon.state()
         try:
             await addon.instance.update(store.version, store.image)
+
+            # Cleanup
+            with suppress(DockerAPIError):
+                await addon.instance.cleanup()
         except DockerAPIError:
             raise AddonsError() from None
-        self.data.update(store)
+        else:
+            self.data.update(store)
+            _LOGGER.info("Add-on '%s' successfully updated", slug)
 
         # Setup/Fix AppArmor profile
         await addon.install_apparmor()
@@ -224,6 +233,7 @@ class AddonManager(CoreSysAttributes):
             raise AddonsError() from None
         else:
             self.data.update(store)
+            _LOGGER.info("Add-on '%s' successfully rebuilded", slug)
 
         # restore state
         if last_state == STATE_STARTED:
@@ -246,3 +256,38 @@ class AddonManager(CoreSysAttributes):
             _LOGGER.info("Detect new Add-on after restore %s", slug)
             self.local[slug] = addon
+
+    async def repair(self) -> None:
+        """Repair local add-ons."""
+        needs_repair: List[Addon] = []
+
+        # Evaluate Add-ons to repair
+        for addon in self.installed:
+            if await addon.instance.exists():
+                continue
+            needs_repair.append(addon)
+
+        _LOGGER.info("Found %d add-ons to repair", len(needs_repair))
+        if not needs_repair:
+            return
+
+        for addon in needs_repair:
+            _LOGGER.info("Start repair for add-on: %s", addon.slug)
+
+            with suppress(DockerAPIError, KeyError):
+                # Need pull a image again
+                if not addon.need_build:
+                    await addon.instance.install(addon.version, addon.image)
+                    continue
+
+                # Need local lookup
+                elif addon.need_build and not addon.is_detached:
+                    store = self.store[addon.slug]
+                    # If this add-on is available for rebuild
+                    if addon.version == store.version:
+                        await addon.instance.install(addon.version, addon.image)
+                        continue
+
+            _LOGGER.error("Can't repair %s", addon.slug)
+            with suppress(AddonsError):
+                await self.uninstall(addon.slug)
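The repair loop added in this diff reduces to one decision per installed add-on; the sketch below is a simplified, synchronous restatement of that logic (the function name and return labels are illustrative, not part of the Supervisor API):

```python
def repair_action(exists: bool, need_build: bool, is_detached: bool,
                  version: str, store_version: str) -> str:
    """Mirror the per-add-on decision in AddonManager.repair (simplified)."""
    if exists:
        return "skip"        # image still present, nothing to repair
    if not need_build:
        return "pull"        # published image can simply be pulled again
    if not is_detached and version == store_version:
        return "rebuild"     # matching store entry allows a local rebuild
    return "uninstall"       # not repairable, remove the add-on
```

A detached or version-mismatched build-type add-on falls through to removal, which matches the `_LOGGER.error("Can't repair ...")` branch above.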


@@ -9,7 +9,7 @@ import secrets
 import shutil
 import tarfile
 from tempfile import TemporaryDirectory
-from typing import Any, Awaitable, Dict, Optional
+from typing import Any, Awaitable, Dict, List, Optional
 
 import voluptuous as vol
 from voluptuous.humanize import humanize_error
@@ -35,6 +35,7 @@ from ..const import (
     ATTR_USER,
     ATTR_UUID,
     ATTR_VERSION,
+    DNS_SUFFIX,
     STATE_NONE,
     STATE_STARTED,
     STATE_STOPPED,
@@ -75,7 +76,7 @@ class Addon(AddonModel):
     async def load(self) -> None:
         """Async initialize of object."""
         with suppress(DockerAPIError):
-            await self.instance.attach()
+            await self.instance.attach(tag=self.version)
 
     @property
     def ip_address(self) -> IPv4Address:
@@ -119,6 +120,11 @@ class Addon(AddonModel):
         """Return installed version."""
         return self.persist[ATTR_VERSION]
 
+    @property
+    def dns(self) -> List[str]:
+        """Return list of DNS name for that add-on."""
+        return [f"{self.hostname}.{DNS_SUFFIX}"]
+
     @property
     def options(self) -> Dict[str, Any]:
         """Return options with local changes."""
@@ -618,7 +624,7 @@ class Addon(AddonModel):
             image_file = Path(temp, "image.tar")
             if image_file.is_file():
                 with suppress(DockerAPIError):
-                    await self.instance.import_image(image_file, version)
+                    await self.instance.import_image(image_file)
             else:
                 with suppress(DockerAPIError):
                     await self.instance.install(version, restore_image)


@@ -51,6 +51,7 @@ from ..const import (
     ATTR_STDIN,
     ATTR_TIMEOUT,
     ATTR_TMPFS,
+    ATTR_UDEV,
     ATTR_URL,
     ATTR_VERSION,
     ATTR_WEBUI,
@@ -59,7 +60,7 @@ from ..const import (
     SECURITY_PROFILE,
 )
 from ..coresys import CoreSysAttributes
-from .validate import MACHINE_ALL, RE_SERVICE, RE_VOLUME, validate_options
+from .validate import RE_SERVICE, RE_VOLUME, validate_options
 
 Data = Dict[str, Any]
@@ -109,6 +110,16 @@ class AddonModel(CoreSysAttributes):
         """Return name of add-on."""
         return self.data[ATTR_NAME]
 
+    @property
+    def hostname(self) -> str:
+        """Return slug/id of add-on."""
+        return self.slug.replace("_", "-")
+
+    @property
+    def dns(self) -> List[str]:
+        """Return list of DNS name for that add-on."""
+        return []
+
     @property
     def timeout(self) -> int:
         """Return timeout of addon for docker stop."""
@@ -333,6 +344,11 @@ class AddonModel(CoreSysAttributes):
         """Return True if the add-on access to GPIO interface."""
         return self.data[ATTR_GPIO]
 
+    @property
+    def with_udev(self) -> bool:
+        """Return True if the add-on have his own udev."""
+        return self.data[ATTR_UDEV]
+
     @property
     def with_kernel_modules(self) -> bool:
         """Return True if the add-on access to kernel modules."""
@@ -391,7 +407,7 @@ class AddonModel(CoreSysAttributes):
     @property
     def supported_machine(self) -> List[str]:
         """Return list of supported machine."""
-        return self.data.get(ATTR_MACHINE, MACHINE_ALL)
+        return self.data.get(ATTR_MACHINE, [])
 
     @property
     def image(self) -> str:
@@ -460,8 +476,8 @@ class AddonModel(CoreSysAttributes):
             return False
 
         # Machine / Hardware
-        machine = config.get(ATTR_MACHINE) or MACHINE_ALL
-        if self.sys_machine not in machine:
+        machine = config.get(ATTR_MACHINE)
+        if machine and self.sys_machine not in machine:
             return False
 
         # Home Assistant
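The hostname/DNS scheme these properties introduce can be restated in a few lines. The `DNS_SUFFIX` value below is an assumption — the real constant lives in `hassio/const.py` and is not shown in this diff:

```python
# Sketch of the add-on hostname/DNS scheme above. DNS_SUFFIX is an
# assumed placeholder value; the real constant is defined elsewhere.
DNS_SUFFIX = "local.hass.io"


def addon_hostname(slug: str) -> str:
    """Hostname derived from the slug, as in AddonModel.hostname."""
    return slug.replace("_", "-")


def addon_dns(slug: str) -> list:
    """DNS names an installed add-on gets, as in Addon.dns."""
    return [f"{addon_hostname(slug)}.{DNS_SUFFIX}"]
```

This matches the API sample earlier in the page, where slug `xdssd_xybla` yields hostname `xdssd-xybla`.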


@@ -68,6 +68,7 @@ from ..const import (
     ATTR_SYSTEM,
     ATTR_TIMEOUT,
     ATTR_TMPFS,
+    ATTR_UDEV,
     ATTR_URL,
     ATTR_USER,
     ATTR_UUID,
@@ -139,6 +140,8 @@ MACHINE_ALL = [
     "raspberrypi2",
     "raspberrypi3",
     "raspberrypi3-64",
+    "raspberrypi4",
+    "raspberrypi4-64",
     "tinker",
 ]
@@ -184,6 +187,7 @@ SCHEMA_ADDON_CONFIG = vol.Schema(
         vol.Optional(ATTR_HOST_DBUS, default=False): vol.Boolean(),
         vol.Optional(ATTR_DEVICES): [vol.Match(r"^(.*):(.*):([rwm]{1,3})$")],
         vol.Optional(ATTR_AUTO_UART, default=False): vol.Boolean(),
+        vol.Optional(ATTR_UDEV, default=False): vol.Boolean(),
         vol.Optional(ATTR_TMPFS): vol.Match(r"^size=(\d)*[kmg](,uid=\d{1,4})?(,rw)?$"),
         vol.Optional(ATTR_MAP, default=list): [vol.Match(RE_VOLUME)],
         vol.Optional(ATTR_ENVIRONMENT): {vol.Match(r"\w*"): vol.Coerce(str)},


@@ -9,6 +9,7 @@ from ..coresys import CoreSys, CoreSysAttributes
 from .addons import APIAddons
 from .auth import APIAuth
 from .discovery import APIDiscovery
+from .dns import APICoreDNS
 from .hardware import APIHardware
 from .hassos import APIHassOS
 from .homeassistant import APIHomeAssistant
@@ -55,6 +56,7 @@ class RestAPI(CoreSysAttributes):
         self._register_services()
         self._register_info()
         self._register_auth()
+        self._register_dns()
 
     def _register_host(self) -> None:
         """Register hostcontrol functions."""
@@ -130,6 +132,7 @@ class RestAPI(CoreSysAttributes):
                 web.post("/supervisor/update", api_supervisor.update),
                 web.post("/supervisor/reload", api_supervisor.reload),
                 web.post("/supervisor/options", api_supervisor.options),
+                web.post("/supervisor/repair", api_supervisor.repair),
             ]
         )
@@ -263,6 +266,21 @@ class RestAPI(CoreSysAttributes):
             ]
         )
 
+    def _register_dns(self) -> None:
+        """Register DNS functions."""
+        api_dns = APICoreDNS()
+        api_dns.coresys = self.coresys
+
+        self.webapp.add_routes(
+            [
+                web.get("/dns/info", api_dns.info),
+                web.get("/dns/stats", api_dns.stats),
+                web.get("/dns/logs", api_dns.logs),
+                web.post("/dns/update", api_dns.update),
+                web.post("/dns/options", api_dns.options),
+            ]
+        )
+
     def _register_panel(self) -> None:
         """Register panel for Home Assistant."""
         panel_dir = Path(__file__).parent.joinpath("panel")


@@ -8,6 +8,7 @@ import voluptuous as vol
from voluptuous.humanize import humanize_error
from ..addons import AnyAddon
from ..docker.stats import DockerStats
from ..addons.utils import rating_security
from ..const import (
ATTR_ADDONS,
@@ -30,6 +31,7 @@ from ..const import (
ATTR_DEVICES,
ATTR_DEVICETREE,
ATTR_DISCOVERY,
ATTR_DNS,
ATTR_DOCKER_API,
ATTR_FULL_ACCESS,
ATTR_GPIO,
@@ -41,6 +43,7 @@ from ..const import (
ATTR_HOST_IPC,
ATTR_HOST_NETWORK,
ATTR_HOST_PID,
ATTR_HOSTNAME,
ATTR_ICON,
ATTR_INGRESS,
ATTR_INGRESS_ENTRY,
@@ -56,6 +59,7 @@ from ..const import (
ATTR_MACHINE,
ATTR_MAINTAINER,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_PERCENT,
ATTR_MEMORY_USAGE,
ATTR_NAME,
ATTR_NETWORK,
@@ -73,6 +77,7 @@ from ..const import (
ATTR_SOURCE,
ATTR_STATE,
ATTR_STDIN,
ATTR_UDEV,
ATTR_URL,
ATTR_VERSION,
ATTR_WEBUI,
@@ -116,7 +121,7 @@ class APIAddons(CoreSysAttributes):
self, request: web.Request, check_installed: bool = True
) -> AnyAddon:
"""Return addon, throw an exception if it doesn't exist."""
addon_slug: str = request.match_info.get("addon")
# Lookup itself
if addon_slug == "self":
@@ -175,11 +180,13 @@ class APIAddons(CoreSysAttributes):
@api_process
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return add-on information."""
addon: AnyAddon = self._extract_addon(request, check_installed=False)
data = {
ATTR_NAME: addon.name,
ATTR_SLUG: addon.slug,
ATTR_HOSTNAME: addon.hostname,
ATTR_DNS: addon.dns,
ATTR_DESCRIPTON: addon.description,
ATTR_LONG_DESCRIPTION: addon.long_description,
ATTR_AUTO_UPDATE: None,
@@ -220,6 +227,7 @@ class APIAddons(CoreSysAttributes):
ATTR_GPIO: addon.with_gpio,
ATTR_KERNEL_MODULES: addon.with_kernel_modules,
ATTR_DEVICETREE: addon.with_devicetree,
ATTR_UDEV: addon.with_udev,
ATTR_DOCKER_API: addon.access_docker_api,
ATTR_AUDIO: addon.with_audio,
ATTR_AUDIO_INPUT: None,
@@ -256,12 +264,12 @@ class APIAddons(CoreSysAttributes):
@api_process
async def options(self, request: web.Request) -> None:
"""Store user options for add-on."""
addon: AnyAddon = self._extract_addon(request)
addon_schema = SCHEMA_OPTIONS.extend(
{vol.Optional(ATTR_OPTIONS): vol.Any(None, addon.schema)}
)
body: Dict[str, Any] = await api_validate(addon_schema, request)
if ATTR_OPTIONS in body:
addon.options = body[ATTR_OPTIONS]
@@ -284,8 +292,8 @@ class APIAddons(CoreSysAttributes):
@api_process
async def security(self, request: web.Request) -> None:
"""Store security options for add-on."""
addon: AnyAddon = self._extract_addon(request)
body: Dict[str, Any] = await api_validate(SCHEMA_SECURITY, request)
if ATTR_PROTECTED in body:
_LOGGER.warning("Protected flag changing for %s!", addon.slug)
@@ -296,13 +304,14 @@ class APIAddons(CoreSysAttributes):
@api_process
async def stats(self, request: web.Request) -> Dict[str, Any]:
"""Return resource information."""
addon: AnyAddon = self._extract_addon(request)
stats: DockerStats = await addon.stats()
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
ATTR_MEMORY_USAGE: stats.memory_usage,
ATTR_MEMORY_LIMIT: stats.memory_limit,
ATTR_MEMORY_PERCENT: stats.memory_percent,
ATTR_NETWORK_RX: stats.network_rx,
ATTR_NETWORK_TX: stats.network_tx,
ATTR_BLK_READ: stats.blk_read,
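The new `ATTR_MEMORY_PERCENT` field pairs usage with the limit. A minimal stdlib-only sketch of the likely computation (the real logic lives in `DockerStats` in `hassio/docker/stats.py`; the one-decimal rounding is an assumption based on the "round percent" note in commit #1205):

```python
def memory_percent(memory_usage: int, memory_limit: int) -> float:
    """Memory usage as a percentage of the limit (assumed rounded to one decimal)."""
    if not memory_limit:
        # Containers without a configured limit report 0; avoid division by zero.
        return 0.0
    return round(memory_usage / memory_limit * 100, 1)

print(memory_percent(512 * 1024**2, 2 * 1024**3))  # 512 MiB of a 2 GiB limit -> 25.0
```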
@@ -312,19 +321,19 @@ class APIAddons(CoreSysAttributes):
@api_process
def install(self, request: web.Request) -> Awaitable[None]:
"""Install add-on."""
addon: AnyAddon = self._extract_addon(request, check_installed=False)
return asyncio.shield(addon.install())
@api_process
def uninstall(self, request: web.Request) -> Awaitable[None]:
"""Uninstall add-on."""
addon: AnyAddon = self._extract_addon(request)
return asyncio.shield(addon.uninstall())
@api_process
def start(self, request: web.Request) -> Awaitable[None]:
"""Start add-on."""
addon: AnyAddon = self._extract_addon(request)
# check options
options = addon.options
@@ -338,13 +347,13 @@ class APIAddons(CoreSysAttributes):
@api_process
def stop(self, request: web.Request) -> Awaitable[None]:
"""Stop add-on."""
addon: AnyAddon = self._extract_addon(request)
return asyncio.shield(addon.stop())
@api_process
def update(self, request: web.Request) -> Awaitable[None]:
"""Update add-on."""
addon: AnyAddon = self._extract_addon(request)
if addon.latest_version == addon.version:
raise APIError("No update available!")
@@ -354,13 +363,13 @@ class APIAddons(CoreSysAttributes):
@api_process
def restart(self, request: web.Request) -> Awaitable[None]:
"""Restart add-on."""
addon: AnyAddon = self._extract_addon(request)
return asyncio.shield(addon.restart())
@api_process
def rebuild(self, request: web.Request) -> Awaitable[None]:
"""Rebuild local build add-on."""
addon: AnyAddon = self._extract_addon(request)
if not addon.need_build:
raise APIError("Only local build addons are supported")
@@ -369,13 +378,13 @@ class APIAddons(CoreSysAttributes):
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request: web.Request) -> Awaitable[bytes]:
"""Return logs from add-on."""
addon: AnyAddon = self._extract_addon(request)
return addon.logs()
@api_process_raw(CONTENT_TYPE_PNG)
async def icon(self, request: web.Request) -> bytes:
"""Return icon from add-on."""
addon: AnyAddon = self._extract_addon(request, check_installed=False)
if not addon.with_icon:
raise APIError("No icon found!")
@@ -385,7 +394,7 @@ class APIAddons(CoreSysAttributes):
@api_process_raw(CONTENT_TYPE_PNG)
async def logo(self, request: web.Request) -> bytes:
"""Return logo from add-on."""
addon: AnyAddon = self._extract_addon(request, check_installed=False)
if not addon.with_logo:
raise APIError("No logo found!")
@@ -395,7 +404,7 @@ class APIAddons(CoreSysAttributes):
@api_process_raw(CONTENT_TYPE_TEXT)
async def changelog(self, request: web.Request) -> str:
"""Return changelog from add-on."""
addon: AnyAddon = self._extract_addon(request, check_installed=False)
if not addon.with_changelog:
raise APIError("No changelog found!")
@@ -405,7 +414,7 @@ class APIAddons(CoreSysAttributes):
@api_process
async def stdin(self, request: web.Request) -> None:
"""Write to stdin of add-on."""
addon: AnyAddon = self._extract_addon(request)
if not addon.with_stdin:
raise APIError("STDIN not supported by add-on")

hassio/api/dns.py Normal file

@@ -0,0 +1,89 @@
"""Init file for Hass.io DNS RESTful API."""
import asyncio
import logging
from typing import Any, Awaitable, Dict
from aiohttp import web
import voluptuous as vol
from ..const import (
ATTR_BLK_READ,
ATTR_BLK_WRITE,
ATTR_CPU_PERCENT,
ATTR_HOST,
ATTR_LATEST_VERSION,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_USAGE,
ATTR_MEMORY_PERCENT,
ATTR_NETWORK_RX,
ATTR_NETWORK_TX,
ATTR_SERVERS,
ATTR_VERSION,
CONTENT_TYPE_BINARY,
)
from ..coresys import CoreSysAttributes
from ..exceptions import APIError
from ..validate import DNS_SERVER_LIST
from .utils import api_process, api_process_raw, api_validate
_LOGGER = logging.getLogger(__name__)
# pylint: disable=no-value-for-parameter
SCHEMA_OPTIONS = vol.Schema({vol.Optional(ATTR_SERVERS): DNS_SERVER_LIST})
SCHEMA_VERSION = vol.Schema({vol.Optional(ATTR_VERSION): vol.Coerce(str)})
class APICoreDNS(CoreSysAttributes):
"""Handle RESTful API for DNS functions."""
@api_process
async def info(self, request: web.Request) -> Dict[str, Any]:
"""Return DNS information."""
return {
ATTR_VERSION: self.sys_dns.version,
ATTR_LATEST_VERSION: self.sys_dns.latest_version,
ATTR_HOST: str(self.sys_docker.network.dns),
ATTR_SERVERS: self.sys_dns.servers,
}
@api_process
async def options(self, request: web.Request) -> None:
"""Set DNS options."""
body = await api_validate(SCHEMA_OPTIONS, request)
if ATTR_SERVERS in body:
self.sys_dns.servers = body[ATTR_SERVERS]
self.sys_dns.save_data()
@api_process
async def stats(self, request: web.Request) -> Dict[str, Any]:
"""Return resource information."""
stats = await self.sys_dns.stats()
return {
ATTR_CPU_PERCENT: stats.cpu_percent,
ATTR_MEMORY_USAGE: stats.memory_usage,
ATTR_MEMORY_LIMIT: stats.memory_limit,
ATTR_MEMORY_PERCENT: stats.memory_percent,
ATTR_NETWORK_RX: stats.network_rx,
ATTR_NETWORK_TX: stats.network_tx,
ATTR_BLK_READ: stats.blk_read,
ATTR_BLK_WRITE: stats.blk_write,
}
@api_process
async def update(self, request: web.Request) -> None:
"""Update DNS plugin."""
body = await api_validate(SCHEMA_VERSION, request)
version = body.get(ATTR_VERSION, self.sys_dns.latest_version)
if version == self.sys_dns.version:
raise APIError("Version {} is already in use".format(version))
await asyncio.shield(self.sys_dns.update(version))
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request: web.Request) -> Awaitable[bytes]:
"""Return DNS Docker logs."""
return self.sys_dns.logs()
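The version guard in `APICoreDNS.update` above is self-contained enough to sketch in isolation. Here `resolve_update` is a made-up helper name and `APIError` is a simplified stand-in for `hassio.exceptions.APIError`:

```python
from typing import Optional


class APIError(Exception):
    """Simplified stand-in for hassio.exceptions.APIError."""


def resolve_update(requested: Optional[str], current: str, latest: str) -> str:
    """Default to the latest version and refuse a no-op update, as the handler does."""
    version = requested or latest
    if version == current:
        raise APIError("Version {} is already in use".format(version))
    return version

print(resolve_update(None, "1.0", "1.1"))  # no version in the body -> latest, "1.1"
```

A request that omits `version` gets the latest release, while asking for the running version raises an `APIError` instead of restarting the plugin for nothing.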


@@ -22,7 +22,9 @@ class APIHardware(CoreSysAttributes):
async def info(self, request):
"""Show hardware info."""
return {
ATTR_SERIAL: list(
self.sys_hardware.serial_devices | self.sys_hardware.serial_by_id
),
ATTR_INPUT: list(self.sys_hardware.input_devices),
ATTR_DISK: list(self.sys_hardware.disk_devices),
ATTR_GPIO: list(self.sys_hardware.gpio_devices),


@@ -18,6 +18,7 @@ from ..const import (
ATTR_MACHINE,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_USAGE,
ATTR_MEMORY_PERCENT,
ATTR_NETWORK_RX,
ATTR_NETWORK_TX,
ATTR_PASSWORD,
@@ -121,6 +122,7 @@ class APIHomeAssistant(CoreSysAttributes):
ATTR_CPU_PERCENT: stats.cpu_percent,
ATTR_MEMORY_USAGE: stats.memory_usage,
ATTR_MEMORY_LIMIT: stats.memory_limit,
ATTR_MEMORY_PERCENT: stats.memory_percent,
ATTR_NETWORK_RX: stats.network_rx,
ATTR_NETWORK_TX: stats.network_tx,
ATTR_BLK_READ: stats.blk_read,

(Frontend panel rebuild: minified JS chunks, source maps, and binary assets omitted.)

@@ -25,6 +25,7 @@ from ..const import (
ATTR_LOGO,
ATTR_MEMORY_LIMIT,
ATTR_MEMORY_USAGE,
ATTR_MEMORY_PERCENT,
ATTR_NAME,
ATTR_NETWORK_RX,
ATTR_NETWORK_TX,
@@ -140,6 +141,7 @@ class APISupervisor(CoreSysAttributes):
ATTR_CPU_PERCENT: stats.cpu_percent,
ATTR_MEMORY_USAGE: stats.memory_usage,
ATTR_MEMORY_LIMIT: stats.memory_limit,
ATTR_MEMORY_PERCENT: stats.memory_percent,
ATTR_NETWORK_RX: stats.network_rx,
ATTR_NETWORK_TX: stats.network_tx,
ATTR_BLK_READ: stats.blk_read,
@@ -161,6 +163,11 @@ class APISupervisor(CoreSysAttributes):
"""Reload add-ons, configuration, etc."""
return asyncio.shield(self.sys_updater.reload())
@api_process
def repair(self, request: web.Request) -> Awaitable[None]:
"""Try to repair the local setup / overlayfs."""
return asyncio.shield(self.sys_core.repair())
@api_process_raw(CONTENT_TYPE_BINARY)
def logs(self, request: web.Request) -> Awaitable[bytes]:
"""Return supervisor Docker logs."""


@@ -10,6 +10,8 @@ from .utils.json import read_json_file
_LOGGER = logging.getLogger(__name__)
ARCH_JSON: Path = Path(__file__).parent.joinpath("data/arch.json")
MAP_CPU = {
"armv7": "armv7",
"armv6": "armhf",
@@ -47,7 +49,7 @@ class CpuArch(CoreSysAttributes):
async def load(self) -> None:
"""Load data and initialize default arch."""
try:
arch_data = read_json_file(ARCH_JSON)
except JsonFileError:
_LOGGER.warning("Can't read arch json")
return
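The `ARCH_JSON` refactor just hoists the file path to a module constant; the read-with-fallback pattern itself can be sketched with stdlib `json`. Here `read_json_file` is a simplified stand-in for `hassio.utils.json.read_json_file`, which in the real code raises `JsonFileError` for the caller to catch:

```python
import json
import logging
from pathlib import Path

_LOGGER = logging.getLogger(__name__)


def read_json_file(path: Path):
    """Parse a JSON file, logging a warning and returning None on failure."""
    try:
        return json.loads(path.read_text())
    except (OSError, ValueError):
        # OSError covers a missing/unreadable file, ValueError a malformed document.
        _LOGGER.warning("Can't read %s", path)
        return None
```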


@@ -11,11 +11,12 @@ from .addons import AddonManager
from .api import RestAPI
from .arch import CpuArch
from .auth import Auth
from .const import CHANNEL_DEV, SOCKET_DOCKER
from .core import HassIO
from .coresys import CoreSys
from .dbus import DBusManager
from .discovery import Discovery
from .dns import CoreDNS
from .hassos import HassOS
from .homeassistant import HomeAssistant
from .host import HostManager
@@ -43,6 +44,7 @@ async def initialize_coresys():
# Initialize core objects
coresys.core = HassIO(coresys)
coresys.dns = CoreDNS(coresys)
coresys.arch = CpuArch(coresys)
coresys.auth = Auth(coresys)
coresys.updater = Updater(coresys)
@@ -127,9 +129,21 @@ def initialize_system_data(coresys: CoreSys):
_LOGGER.info("Create Hass.io Apparmor folder %s", config.path_apparmor)
config.path_apparmor.mkdir()
# dns folder
if not config.path_dns.is_dir():
_LOGGER.info("Create Hass.io DNS folder %s", config.path_dns)
config.path_dns.mkdir()
# Update log level
coresys.config.modify_log_level()
# Check if ENV is in development mode
if bool(os.environ.get("SUPERVISOR_DEV", 0)):
_LOGGER.warning("SUPERVISOR_DEV is set")
coresys.updater.channel = CHANNEL_DEV
coresys.config.logging = "debug"
coresys.config.debug = True
def migrate_system_env(coresys: CoreSys): def migrate_system_env(coresys: CoreSys):
"""Cleanup some stuff after update.""" """Cleanup some stuff after update."""
@@ -218,7 +232,7 @@ def reg_signal(loop):
 def supervisor_debugger(coresys: CoreSys) -> None:
     """Setup debugger if needed."""
-    if not coresys.config.debug or not coresys.dev:
+    if not coresys.config.debug:
         return
     import ptvsd
@@ -226,4 +240,5 @@ def supervisor_debugger(coresys: CoreSys) -> None:
     ptvsd.enable_attach(address=("0.0.0.0", 33333), redirect_output=True)
     if coresys.config.debug_block:
+        _LOGGER.info("Wait until debugger is attached")
         ptvsd.wait_for_attach()


@@ -34,6 +34,7 @@ BACKUP_DATA = PurePath("backup")
 SHARE_DATA = PurePath("share")
 TMP_DATA = PurePath("tmp")
 APPARMOR_DATA = PurePath("apparmor")
+DNS_DATA = PurePath("dns")

 DEFAULT_BOOT_TIME = datetime.utcfromtimestamp(0).isoformat()
@@ -99,7 +100,7 @@ class CoreConfig(JsonConfig):
     def modify_log_level(self) -> None:
         """Change log level."""
         lvl = getattr(logging, self.logging.upper())
-        logging.basicConfig(level=lvl)
+        logging.getLogger("hassio").setLevel(lvl)

     @property
     def last_boot(self):
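The replaced call is the interesting part: `logging.basicConfig(level=lvl)` reconfigures the root logger and therefore affects every library, while `logging.getLogger("hassio").setLevel(lvl)` scopes the level to the Supervisor's own logger subtree. A minimal demonstration:

```python
import logging


def set_package_log_level(package: str, level_name: str) -> None:
    """Apply a named level ("debug", "info", ...) to one logger subtree only."""
    lvl = getattr(logging, level_name.upper())
    logging.getLogger(package).setLevel(lvl)


# "hassio" and its child loggers now emit at DEBUG; the root logger keeps
# its default WARNING level, so chatty third-party libraries stay quiet.
set_package_log_level("hassio", "debug")
```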
@@ -211,6 +212,16 @@ class CoreConfig(JsonConfig):
         """Return root share data folder external for Docker."""
         return PurePath(self.path_extern_hassio, SHARE_DATA)

+    @property
+    def path_extern_dns(self):
+        """Return dns path external for Docker."""
+        return str(PurePath(self.path_extern_hassio, DNS_DATA))
+
+    @property
+    def path_dns(self):
+        """Return dns path inside supervisor."""
+        return Path(HASSIO_DATA, DNS_DATA)
+
     @property
     def addons_repositories(self):
         """Return list of custom Add-on repositories."""


@@ -3,11 +3,11 @@ from pathlib import Path
 from ipaddress import ip_network

-HASSIO_VERSION = "168"
+HASSIO_VERSION = "174"

 URL_HASSIO_ADDONS = "https://github.com/home-assistant/hassio-addons"
-URL_HASSIO_VERSION = "https://s3.amazonaws.com/hassio-version/{channel}.json"
-URL_HASSIO_APPARMOR = "https://s3.amazonaws.com/hassio-version/apparmor.txt"
+URL_HASSIO_VERSION = "https://version.home-assistant.io/{channel}.json"
+URL_HASSIO_APPARMOR = "https://version.home-assistant.io/apparmor.txt"

 URL_HASSOS_OTA = (
     "https://github.com/home-assistant/hassos/releases/download/"
@@ -24,6 +24,7 @@ FILE_HASSIO_UPDATER = Path(HASSIO_DATA, "updater.json")
 FILE_HASSIO_SERVICES = Path(HASSIO_DATA, "services.json")
 FILE_HASSIO_DISCOVERY = Path(HASSIO_DATA, "discovery.json")
 FILE_HASSIO_INGRESS = Path(HASSIO_DATA, "ingress.json")
+FILE_HASSIO_DNS = Path(HASSIO_DATA, "dns.json")

 SOCKET_DOCKER = Path("/var/run/docker.sock")
@@ -31,6 +32,9 @@ DOCKER_NETWORK = "hassio"
 DOCKER_NETWORK_MASK = ip_network("172.30.32.0/23")
 DOCKER_NETWORK_RANGE = ip_network("172.30.33.0/24")

+DNS_SERVERS = ["dns://8.8.8.8", "dns://1.1.1.1"]
+DNS_SUFFIX = "local.hass.io"
+
 LABEL_VERSION = "io.hass.version"
 LABEL_ARCH = "io.hass.arch"
 LABEL_TYPE = "io.hass.type"
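The default upstreams are written with the `dns://` scheme understood by CoreDNS's `forward` plugin. When a bare address is needed (for validation, say), the scheme has to be stripped; a small hypothetical helper, not part of this diff:

```python
DNS_SERVERS = ["dns://8.8.8.8", "dns://1.1.1.1"]


def plain_address(server: str) -> str:
    """Strip a leading dns:// scheme, leaving the bare server address."""
    prefix = "dns://"
    return server[len(prefix):] if server.startswith(prefix) else server
```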
@@ -86,6 +90,7 @@ ATTR_VERSION_LATEST = "version_latest"
 ATTR_AUTO_UART = "auto_uart"
 ATTR_LAST_BOOT = "last_boot"
 ATTR_LAST_VERSION = "last_version"
+ATTR_LATEST_VERSION = "latest_version"
 ATTR_CHANNEL = "channel"
 ATTR_NAME = "name"
 ATTR_SLUG = "slug"
@@ -159,6 +164,7 @@ ATTR_NETWORK_RX = "network_rx"
 ATTR_NETWORK_TX = "network_tx"
 ATTR_MEMORY_LIMIT = "memory_limit"
 ATTR_MEMORY_USAGE = "memory_usage"
+ATTR_MEMORY_PERCENT = "memory_percent"
 ATTR_BLK_READ = "blk_read"
 ATTR_BLK_WRITE = "blk_write"
 ATTR_ADDON = "addon"
@@ -210,6 +216,9 @@ ATTR_ADMIN = "admin"
 ATTR_PANELS = "panels"
 ATTR_DEBUG = "debug"
 ATTR_DEBUG_BLOCK = "debug_block"
+ATTR_DNS = "dns"
+ATTR_SERVERS = "servers"
+ATTR_UDEV = "udev"

 PROVIDE_SERVICE = "provide"
 NEED_SERVICE = "need"


@@ -24,10 +24,14 @@ class HassIO(CoreSysAttributes):
         """Initialize Hass.io object."""
         self.coresys = coresys

+    async def connect(self):
+        """Connect Supervisor container."""
+        await self.sys_supervisor.load()
+
     async def setup(self):
         """Setup HassIO orchestration."""
-        # Load Supervisor
-        await self.sys_supervisor.load()
+        # Load CoreDNS
+        await self.sys_dns.load()

         # Load DBus
         await self.sys_dbus.load()
@@ -68,11 +72,10 @@ class HassIO(CoreSysAttributes):
         # Load ingress
         await self.sys_ingress.load()

-        # start dns forwarding
-        self.sys_create_task(self.sys_dns.start())
-
     async def start(self):
         """Start Hass.io orchestration."""
+        await self.sys_api.start()
+
         # on release channel, try update itself
         if self.sys_supervisor.need_update:
             try:
@@ -86,9 +89,6 @@ class HassIO(CoreSysAttributes):
                 "future version of Home Assistant!"
             )

-        # start api
-        await self.sys_api.start()
-
         # start addon mark as initialize
         await self.sys_addons.boot(STARTUP_INITIALIZE)
@@ -116,8 +116,7 @@ class HassIO(CoreSysAttributes):
             await self.sys_addons.boot(STARTUP_APPLICATION)

             # store new last boot
-            self.sys_config.last_boot = self.sys_hardware.last_boot
-            self.sys_config.save_data()
+            self._update_last_boot()

         finally:
             # Add core tasks into scheduler
@@ -134,16 +133,19 @@ class HassIO(CoreSysAttributes):
         # don't process scheduler anymore
         self.sys_scheduler.suspend = True

+        # store new last boot / prevent time adjustments
+        self._update_last_boot()
+
         # process async stop tasks
         try:
             with async_timeout.timeout(10):
                 await asyncio.wait(
                     [
                         self.sys_api.stop(),
-                        self.sys_dns.stop(),
                         self.sys_websession.close(),
                         self.sys_websession_ssl.close(),
                         self.sys_ingress.unload(),
+                        self.sys_dns.unload(),
                     ]
                 )
         except asyncio.TimeoutError:
@@ -162,3 +164,26 @@ class HassIO(CoreSysAttributes):
         await self.sys_addons.shutdown(STARTUP_SERVICES)
         await self.sys_addons.shutdown(STARTUP_SYSTEM)
         await self.sys_addons.shutdown(STARTUP_INITIALIZE)
+
+    def _update_last_boot(self):
+        """Update last boot time."""
+        self.sys_config.last_boot = self.sys_hardware.last_boot
+        self.sys_config.save_data()
+
+    async def repair(self):
+        """Repair system integrity."""
+        _LOGGER.info("Start repairing of Hass.io Environment")
+        await self.sys_run_in_executor(self.sys_docker.repair)
+
+        # Restore core functionality
+        await self.sys_dns.repair()
+        await self.sys_addons.repair()
+        await self.sys_homeassistant.repair()
+
+        # Fix HassOS specific
+        if self.sys_hassos.available:
+            await self.sys_hassos.repair_cli()
+
+        # Tag version for latest
+        await self.sys_supervisor.repair()
+
+        _LOGGER.info("Finished repairing of Hass.io Environment")


@@ -1,14 +1,13 @@
 """Handle core shared data."""
 from __future__ import annotations

 import asyncio
-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, Optional

 import aiohttp

 from .config import CoreConfig
 from .const import CHANNEL_DEV
 from .docker import DockerAPI
-from .misc.dns import DNSForward
 from .misc.hardware import Hardware
 from .misc.scheduler import Scheduler
@@ -20,6 +19,7 @@ if TYPE_CHECKING:
     from .core import HassIO
     from .dbus import DBusManager
     from .discovery import Discovery
+    from .dns import CoreDNS
     from .hassos import HassOS
     from .homeassistant import HomeAssistant
     from .host import HostManager
@@ -52,26 +52,26 @@ class CoreSys:
         self._hardware: Hardware = Hardware()
         self._docker: DockerAPI = DockerAPI()
         self._scheduler: Scheduler = Scheduler()
-        self._dns: DNSForward = DNSForward()

         # Internal objects pointers
-        self._core: HassIO = None
-        self._arch: CpuArch = None
-        self._auth: Auth = None
-        self._homeassistant: HomeAssistant = None
-        self._supervisor: Supervisor = None
-        self._addons: AddonManager = None
-        self._api: RestAPI = None
-        self._updater: Updater = None
-        self._snapshots: SnapshotManager = None
-        self._tasks: Tasks = None
-        self._host: HostManager = None
-        self._ingress: Ingress = None
-        self._dbus: DBusManager = None
-        self._hassos: HassOS = None
-        self._services: ServiceManager = None
-        self._store: StoreManager = None
-        self._discovery: Discovery = None
+        self._core: Optional[HassIO] = None
+        self._arch: Optional[CpuArch] = None
+        self._auth: Optional[Auth] = None
+        self._dns: Optional[CoreDNS] = None
+        self._homeassistant: Optional[HomeAssistant] = None
+        self._supervisor: Optional[Supervisor] = None
+        self._addons: Optional[AddonManager] = None
+        self._api: Optional[RestAPI] = None
+        self._updater: Optional[Updater] = None
+        self._snapshots: Optional[SnapshotManager] = None
+        self._tasks: Optional[Tasks] = None
+        self._host: Optional[HostManager] = None
+        self._ingress: Optional[Ingress] = None
+        self._dbus: Optional[DBusManager] = None
+        self._hassos: Optional[HassOS] = None
+        self._services: Optional[ServiceManager] = None
+        self._store: Optional[StoreManager] = None
+        self._discovery: Optional[Discovery] = None

     @property
     def machine(self) -> str:
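Switching the pointer annotations from `HassIO = None` (which a strict type checker flags as an incompatible assignment) to `Optional[HassIO] = None` matches the objects' real lifecycle: each slot is `None` until bootstrap assigns it. In miniature:

```python
from typing import Optional


class Slots:
    """Pointers start empty and are attached during bootstrap."""

    def __init__(self) -> None:
        # Optional[...] tells the type checker that None is a legal value here.
        self._api: Optional[str] = None

    def attach_api(self, api: str) -> None:
        self._api = api

    @property
    def ready(self) -> bool:
        return self._api is not None
```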
@@ -125,11 +125,6 @@ class CoreSys:
         """Return Scheduler object."""
         return self._scheduler

-    @property
-    def dns(self) -> DNSForward:
-        """Return DNSForward object."""
-        return self._dns
-
     @property
     def core(self) -> HassIO:
         """Return HassIO object."""
@@ -298,6 +293,18 @@ class CoreSys:
             raise RuntimeError("DBusManager already set!")
         self._dbus = value

+    @property
+    def dns(self) -> CoreDNS:
+        """Return CoreDNS object."""
+        return self._dns
+
+    @dns.setter
+    def dns(self, value: CoreDNS):
+        """Set a CoreDNS object."""
+        if self._dns:
+            raise RuntimeError("CoreDNS already set!")
+        self._dns = value
+
     @property
     def host(self) -> HostManager:
         """Return HostManager object."""
@@ -395,11 +402,6 @@ class CoreSysAttributes:
         """Return Scheduler object."""
         return self.coresys.scheduler

-    @property
-    def sys_dns(self) -> DNSForward:
-        """Return DNSForward object."""
-        return self.coresys.dns
-
     @property
     def sys_core(self) -> HassIO:
         """Return HassIO object."""
@@ -470,6 +472,11 @@ class CoreSysAttributes:
         """Return DBusManager object."""
         return self.coresys.dbus

+    @property
+    def sys_dns(self) -> CoreDNS:
+        """Return CoreDNS object."""
+        return self.coresys.dns
+
     @property
     def sys_host(self) -> HostManager:
         """Return HostManager object."""

Some files were not shown because too many files have changed in this diff.