Compare commits

...

122 Commits

Author SHA1 Message Date
Otto Winter
b71b14cd06 Bump HassIO version to v1.10.1 2019-01-13 19:06:09 +01:00
Otto Winter
145e7b00ee Bump version to v1.10.1 2019-01-13 19:04:57 +01:00
Otto Winter
07c80dfcda Pin platformio platforms (#335) 2019-01-13 19:04:54 +01:00
Otto Winter
0e52c9a778 Introduce wifi fast connect mode (#333) 2019-01-13 19:04:54 +01:00
Otto Winter
94bd179256 Fix show logs with MQTT and dashboard (#332)
Fixes #327
2019-01-13 19:04:54 +01:00
Otto Winter
11c38ca4e8 Fix AsyncTCP compilation on ESP32 with Arduino breaking change (#334) 2019-01-13 19:04:54 +01:00
Otto Winter
dc71d11a21 Fix ESP32 not decoding stacktrace on broken PC (#330) 2019-01-13 19:04:53 +01:00
Otto Winter
e385c8435b Fix auto_uart 2019-01-10 10:56:25 +01:00
Otto Winter
13eca6012d Bump HassIO version to v1.10.0 2019-01-09 20:30:58 +01:00
Otto Winter
cb3e3e024d Bump version to v1.10.0 2019-01-09 20:29:28 +01:00
Otto Winter
79bdec32b8 Merge branch 'rc' 2019-01-09 20:29:15 +01:00
Otto Winter
c153dba5bc Fix Gitlab CI 2019-01-09 16:29:21 +01:00
Otto Winter
fa2c2917c1 Bump beta version to v1.10.0b2 2019-01-09 16:16:19 +01:00
Otto Winter
2a06f4dbf4 Bump version to v1.10.0b2 2019-01-09 16:14:47 +01:00
Otto Winter
49b618bb0c Fix interval trigger (#313) 2019-01-09 16:14:43 +01:00
escoand
20ec3900be use full space on small devices (#310)
* use full space on small devices

I often use my smartphone for quick actions and was always annoyed by this wasted space.

* hide card-image on small devices
2019-01-09 16:14:43 +01:00
Otto Winter
605be1a6ec OTA don't error when upgrading from no password to password mode (#309) 2019-01-09 16:14:43 +01:00
Otto Winter
33b67de32e Fix component.update action (#308)
Fixes https://github.com/OttoWinter/esphomeyaml/issues/286
2019-01-09 16:14:43 +01:00
Otto Winter
1ffedb291c Update beta config (#305)
* Update beta Hass.io add-on config

* Add README
2019-01-06 21:29:33 +01:00
Otto Winter
8f251848ef Bump beta version to v1.10.0b1 2019-01-06 19:39:15 +01:00
Otto Winter
4046a16d85 Merge branch 'dev' into rc 2019-01-06 19:38:23 +01:00
Otto Winter
5ed987adcd Bump HassIO version to v1.9.3 2018-12-01 13:38:05 +01:00
Otto Winter
a463c59733 Bump version to v1.9.3 2018-12-01 13:37:28 +01:00
Otto Winter
1726c4237b CSE7766 update interval (#250)
* CSE7766 update interval

* PollingComponent
2018-12-01 13:37:25 +01:00
Otto Winter
f1241af91d Bump HassIO version to v1.9.2 2018-11-25 19:18:17 +01:00
Otto Winter
7ae6777fd6 Bump version to v1.9.2 2018-11-25 19:17:32 +01:00
Otto Winter
a169d37557 Fix fastled lambda light effect
Fixes #232
2018-11-25 19:05:39 +01:00
Otto Winter
be21aa786d Bump HassIO version to v1.9.1 2018-11-19 23:05:18 +01:00
Otto Winter
9a881100e6 Bump version to v1.9.1 2018-11-19 23:04:56 +01:00
Otto Winter
c2f88776c7 Fix SNTP servers option (#237)
* Fix SNTP servers option

* Lint
2018-11-19 23:04:45 +01:00
Otto Winter
846fcb8ccd Bump HassIO version to v1.9.0 2018-11-15 12:07:23 +01:00
Otto Winter
44495c919c Bump version to v1.9.0 2018-11-15 12:05:25 +01:00
Otto Winter
d4ce7699d4 Merge branch 'rc' 2018-11-15 12:05:18 +01:00
Otto Winter
318bb8b254 Bump beta version to v1.9.0b6 2018-11-13 19:10:32 +01:00
Otto Winter
1ad65516cf Bump version to v1.9.0b6 2018-11-13 19:06:32 +01:00
Otto Winter
d4c7e6c634 Merge branch 'dev' into rc 2018-11-13 19:06:26 +01:00
Otto Winter
9b07bb6608 Fix my9231 IDs not being resolved 2018-11-13 19:00:36 +01:00
Otto Winter
f368255739 Update MY9231 2018-11-13 16:53:46 +01:00
puuu
083c2fce05 Add MY9231 support (#227) 2018-11-13 16:51:30 +01:00
Otto Winter
c99d4e2815 Deep Sleep Wake Up From Multiple Pins (#230) 2018-11-13 15:36:49 +01:00
Otto Winter
39457f7b8c Enable nodelay for phase 1 of OTAv2 2018-11-13 15:31:14 +01:00
Otto Winter
01aaf14078 Add frequency option to ESP8266 PWM 2018-11-13 12:02:19 +01:00
Otto Winter
9a939d2d27 Bump beta version to v1.9.0b5 2018-11-12 23:44:57 +01:00
Otto Winter
7f91141df2 Bump beta version to v1.9.0b5 2018-11-12 23:44:55 +01:00
Otto Winter
06eeed9ee9 Bump version to v1.9.0b5 2018-11-12 23:43:03 +01:00
Otto Winter
5655b5fe10 Merge branch 'dev' into rc 2018-11-12 23:43:00 +01:00
Otto Winter
15331edb78 Let esphomeyaml know about class inheritance (#229)
* Allow overriding setup priority

* Add inheritance tree

* Global variables

* Tests and better validation

* Fix

* Lint
2018-11-12 23:30:31 +01:00
Otto Winter
4f375757a5 Better validation error messages 2018-11-12 21:00:17 +01:00
Otto Winter
0dec7cfbf8 Lint 2018-11-10 15:39:25 +01:00
Otto Winter
f51d301d53 Bump beta version to v1.9.0b4 2018-11-10 15:33:03 +01:00
Otto Winter
c3b3ba4923 Bump beta version to v1.9.0b4 2018-11-10 15:33:01 +01:00
Otto Winter
30e7797577 Bump version to v1.9.0b4 2018-11-10 15:31:20 +01:00
Otto Winter
0e5cabadc1 Merge branch 'dev' into rc 2018-11-10 15:31:16 +01:00
Otto Winter
58f4fa53d0 Make Switch not inherit from Component 2018-11-10 14:07:11 +01:00
Otto Winter
04dc848620 Improve OTAv2 error messages 2018-11-10 11:45:38 +01:00
Otto Winter
8203b8fcd3 Add total daily energy sensor (#220)
* Add total daily energy sensor

* Add test
2018-11-09 20:27:19 +01:00
Otto Winter
0ad61f4a95 Add binary sensor multi click trigger (#226) 2018-11-09 20:18:04 +01:00
Otto Winter
9fd4076ab8 Revert Add power on value to GPIO switch (#223)
* Revert Add power on value to GPIO switch

* Fix tests

* Fix
2018-11-09 20:08:35 +01:00
Otto Winter
a9e799cb06 Clean up time API (#221) 2018-11-09 20:08:16 +01:00
Otto Winter
3ec931ffa4 Add restore state option to template switch (#222)
* Add restore state option to template switch

* Add test
2018-11-09 20:05:50 +01:00
Otto Winter
e3094d9689 OTA Send back acknowledgement 2018-11-09 14:27:12 +01:00
Otto Winter
3594779401 Better typing to components (#225)
* Add better typing

* Fix
2018-11-08 16:01:07 +01:00
Otto Winter
94978d0063 Fix HLW8012 Voltage Divider option not being added to source (#224) 2018-11-07 23:27:06 +01:00
Otto Winter
415e12309b Improve OTA v2 error and log messages 2018-11-07 22:41:54 +01:00
Otto Winter
aa5f887ff3 Fix Gitlab CI 2018-11-04 00:10:44 +01:00
Otto Winter
28561ea6a4 Merge branch 'dev' into rc 2018-11-03 17:52:22 +01:00
Otto Winter
3b3ff4fea9 Bump beta version to v1.9.0b3 2018-11-03 17:47:16 +01:00
Otto Winter
6b8125f5f2 Bump version to v1.9.0b3 2018-11-03 17:47:12 +01:00
Otto Winter
ab43390983 Merge branch 'dev' into rc 2018-11-03 17:47:06 +01:00
Otto Winter
611592170b Fix esphomeyaml-edge 2018-11-03 17:35:05 +01:00
Otto Winter
8a58ff91c3 Update Gitlab Build Script (#215) 2018-11-03 16:24:26 +01:00
Otto Winter
27b86d89b0 Add generate home assistant config command (#208)
* Add generate home assistant config command

* Lint

* Lint
2018-11-03 14:08:31 +01:00
Otto Winter
36da3b85d5 Auto-Decode stacktraces (#214)
* Auto-Decode stacktraces

* Fix Gitlab CI
2018-11-03 14:06:14 +01:00
Otto Winter
d7d3a4aa36 Bump beta version to v1.10.0-dev 2018-11-03 12:09:22 +01:00
Otto Winter
4e4ffc3a24 Bump beta version to 1.9.0b2 2018-10-31 17:47:34 +01:00
Otto Winter
4dce7fa103 Bump version to 1.9.0b2 2018-10-31 16:35:45 +01:00
Otto Winter
467ef9902f Merge branch 'dev' into rc 2018-10-31 16:34:36 +01:00
Otto Winter
486174073b Update .gitlab-ci.yml 2018-10-31 16:34:03 +01:00
Otto Winter
e9d9de448e Fix output actions 2018-10-31 16:27:30 +01:00
Otto Winter
08c16020c6 Fix output platforms 2018-10-29 22:46:00 +01:00
Otto Winter
0ade9baf65 Fix automation validation 2018-10-27 14:10:37 +02:00
Otto Winter
2fab7e73b9 Add send_first_at option to sliding window sensor filter (#207)
* Add send_first_at option to sliding window sensor filter

* Lint
2018-10-26 22:59:03 +02:00
Otto Winter
af4e2bf61d Add Stepper Motor Support (#206)
* Add stepper support

* Fix output set_level

* Lint
2018-10-26 22:57:03 +02:00
Otto Winter
6a2e9a8503 Add beta add-on 2018-10-20 19:15:49 +02:00
Otto Winter
74fefea5bb Fix gitlab 2018-10-20 19:14:27 +02:00
Otto Winter
21c22fe04a Bump version to 1.10.0-dev 2018-10-20 18:32:06 +02:00
Otto Winter
70206df8b5 Fix gitlab 2018-10-20 18:26:44 +02:00
Otto Winter
15732ca465 Bump version to 1.9.0b1 2018-10-20 18:24:02 +02:00
Otto Winter
8e0f4f93d4 Fix HassIO add-on archs 2018-10-20 18:20:42 +02:00
Otto Winter
361baea17f Add beta builds 2018-10-20 18:20:21 +02:00
Otto Winter
8bbfbc4cc1 Add logger.log action (#198)
* Add logger.log Action

* Simple schema

* Validate printf

* Improve error message

* Undo unfix tests :)
2018-10-20 15:19:59 +02:00
Otto Winter
8c5d12df51 Fix some typos (#202) 2018-10-20 15:18:12 +02:00
Otto Winter
25c66ed8ca Improve API naming convention consistency (#197)
* Improve API naming convention consistency

* Fix
2018-10-20 15:16:58 +02:00
Otto Winter
7a55521807 Fix rebase error 2018-10-20 15:08:55 +02:00
Otto Winter
d04de7baeb Fix value range trigger 😑 (#201) 2018-10-20 14:40:46 +02:00
Otto Winter
4aeb756388 Fix triggers being interpreted as a sequence of automations (#199) 2018-10-20 13:15:09 +02:00
Otto Winter
92b6ed4180 Add FastLED color correction option (#200)
* Add FastLED color correction option

* Add test
2018-10-20 13:14:02 +02:00
Otto Winter
9efd9f4fe8 Add PMSX003 Particulate Matter Sensor (#192) 2018-10-20 12:58:52 +02:00
Otto Winter
34fc2147a4 Add update component action and scripts (#196)
* Add update component action

* Add script component
2018-10-20 12:58:02 +02:00
Otto Winter
629f2b128e Add MQTT publish JSON action and subscribe JSON trigger (#193)
* Add MQTT publish JSON action and subscribe JSON trigger

* Lint
2018-10-20 12:41:00 +02:00
Otto Winter
81bc400340 Add PN532 On Tag Trigger (#189)
* Add PN532 On Tag Trigger

* Lint

* Fix 😶

* Fix
2018-10-17 21:29:44 +02:00
Otto Winter
75628b96a1 Add Text Sensors (#166)
* Text Sensors

* Add version text sensor

* Fix

* Fixes

* Add tests

* Add template text sensor

* Lint

* Fix test
2018-10-17 21:24:02 +02:00
Otto Winter
b1f7ed4fdc Add CSE776 for Sonoff Pow R2 (#190) 2018-10-17 21:14:31 +02:00
Otto Winter
b8d7185d99 Unify Xiaomi implementations (#188) 2018-10-17 20:56:55 +02:00
Otto Winter
2d20a1c0fb Decentralize Automation Generator Code (#182)
* Decentralize Automation Generator Code

* Lint
2018-10-16 23:16:06 +02:00
Otto Winter
820067ae5a GPIO Switch Power On Value v2 (#183) 2018-10-16 22:58:26 +02:00
Otto Winter
db8313e0d5 Fix config dump time output (#184) 2018-10-16 20:45:24 +02:00
Otto Winter
27a77c685d Rework OTA to be more stable (#177)
* Rework OTA to be more stable

* Lint
2018-10-14 19:14:28 +02:00
Otto Winter
e34365dc7c Add clean build files command and auto-clean on version change (#181)
* Add clean build files command and auto-clean on version change

* Update __main__.py
2018-10-14 18:52:21 +02:00
Otto Winter
8d395e5338 MQTT different log level (#167)
* Add option to have different log level over MQTT

* Add Test

* Lint
2018-10-14 18:46:17 +02:00
Otto Winter
6f54afec00 Add MQTT Subscribe sensor (#175) 2018-10-14 18:45:13 +02:00
Otto Winter
6a24145be6 Fix Wifi power_save_mode option (#178)
* Fix WIFI power_save_mode option

* Add Test
2018-10-13 21:17:19 +02:00
escoand
4a2cdbf31c Add Samsung IR protocol (#176)
* add Samsung ir protocol

* fix pylint

* add test

* add transmitter
2018-10-13 19:21:06 +02:00
Otto Winter
1d75ed1ff4 Add use_build_flags removal notice (#173) 2018-10-12 12:12:07 +02:00
Otto Winter
76b1c6f47b Fix readme broken link (#174) 2018-10-12 12:05:48 +02:00
Otto Winter
06371c9e2d Create issue templates (#171)
* Create issue templates

* Update bug_report.md
2018-10-12 11:27:14 +02:00
Otto Winter
a9c130dd50 Add Code of Conduct (Contributor Covenant) (#168) 2018-10-12 11:26:26 +02:00
Otto Winter
1f82c1a483 Create CONTRIBUTING.md (#169) 2018-10-12 11:26:05 +02:00
Otto Winter
cb28429231 Create Pull Request Template (#172)
* Create PULL_REQUEST_TEMPLATE.md

* Update PULL_REQUEST_TEMPLATE.md

* Rename PULL_REQUEST_TEMPLATE.md to .github/PULL_REQUEST_TEMPLATE.md
2018-10-12 11:25:43 +02:00
Otto Winter
f2cd2ec178 Fix raw remote receiver (#158) 2018-10-12 09:49:10 +02:00
Otto Winter
37360bb797 Log esphomelib version and compilation time on boot (#159) 2018-10-12 09:48:55 +02:00
JonnyaiR
6c1dc0f2b3 Add a link to Home Assistant in README (#152)
As "Home Assistant" could be interpreted as a generic term for automating a home, I've added links to the Home Assistant website :-)
2018-10-08 11:00:02 +02:00
243 changed files with 15950 additions and 4747 deletions

.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file

@@ -0,0 +1,49 @@
---
name: Bug report
about: Create a report to help esphomelib improve
---
<!-- Thanks for reporting a bug for this project. READ THIS FIRST:
- Please make sure to submit issues in the right GitHub repository, if unsure just post it here:
- esphomeyaml [here] - This is mostly for reporting bugs when compiling and when you get a long stack trace while compiling or if a configuration fails to validate.
- esphomelib [https://github.com/OttoWinter/esphomelib] - Report bugs there if the ESP is crashing or a feature is not working as expected.
- esphomedocs [https://github.com/OttoWinter/esphomedocs] - Report bugs there if the documentation is wrong/outdated.
- Provide as many details as possible. Paste logs, configuration sample and code into the backticks (```).
DO NOT DELETE ANY TEXT from this template! Otherwise the issue may be closed without a comment.
-->
**Operating environment (Hass.io/Docker/pip/etc.):**
<!--
Please provide details about your environment.
-->
**ESP (ESP32/ESP8266/Board/Sonoff):**
<!--
Please provide details about which ESP you're using.
-->
**Affected component:**
<!--
Please add the link to the documentation at https://esphomelib.com/esphomeyaml/index.html of the component in question.
-->
**Description of problem:**
**Problem-relevant YAML-configuration entries:**
```yaml
PASTE YAML FILE HERE
```
**Traceback (if applicable):**
<!--
Please copy the traceback here if compilation is failing. If possible, also connect to the ESP and copy its logs into the backticks.
-->
```
```
**Additional information:**


@@ -0,0 +1,21 @@
---
name: Feature request
about: Suggest an idea for this project
---
<!-- READ THIS FIRST:
- This is for feature requests only, if you want to have a certain new sensor/module supported, please use the "new integration" template.
- Please be as descriptive as possible, especially use-cases that can otherwise not be solved boost the problem's priority.
DO NOT DELETE ANY TEXT from this template! Otherwise the issue may be closed without a comment.
-->
**Is your feature request related to a problem/use-case? Please describe.**
<!-- A clear and concise description of what the problem is. -->
**Describe the solution you'd like:**
<!-- A description of what you want to happen. -->
**Additional context:**
<!-- Add any other context about the feature request here. -->


@@ -0,0 +1,13 @@
---
name: New integration
about: Suggest a new integration for esphomelib
---
DO NOT POST NEW INTEGRATION REQUESTS HERE!
Please post all new integration requests in the esphomelib repository:
https://github.com/OttoWinter/esphomelib/issues
Thank you!

.github/PULL_REQUEST_TEMPLATE.md vendored Normal file

@@ -0,0 +1,14 @@
## Description:
**Related issue (if applicable):** fixes <link to issue>
**Pull request in [esphomedocs](https://github.com/OttoWinter/esphomedocs) with documentation (if applicable):** OttoWinter/esphomedocs#<esphomedocs PR number goes here>
**Pull request in [esphomelib](https://github.com/OttoWinter/esphomelib) with C++ framework changes (if applicable):** OttoWinter/esphomelib#<esphomelib PR number goes here>
## Checklist:
- [ ] The code change is tested and works locally.
- [ ] Tests have been added to verify that the new code works (under `tests/` folder).
If user exposed functionality or configuration variables are added/changed:
- [ ] Documentation added/updated in [esphomedocs](https://github.com/OttoWinter/esphomedocs).

.gitignore vendored

@@ -105,3 +105,4 @@ venv.bak/
config/
tests/build/
tests/.esphomeyaml/


@@ -11,6 +11,8 @@ stages:
.lint: &lint
stage: lint
before_script:
- pip install -e .
tags:
- python2.7
- esphomeyaml-lint
@@ -24,9 +26,6 @@ stages:
- esphomeyaml-test
variables:
TZ: UTC
cache:
paths:
- tests/build
.docker-builder: &docker-builder
before_script:
@@ -61,127 +60,255 @@ test2:
<<: *docker-builder
stage: build
script:
- docker run --rm --privileged hassioaddons/qemu-user-static:latest
- BUILD_FROM=hassioaddons/ubuntu-base-${ADDON_ARCH}:2.2.0
- ADDON_VERSION="${CI_COMMIT_TAG#v}"
- ADDON_VERSION="${ADDON_VERSION:-${CI_COMMIT_SHA:0:7}}"
- echo "Build from ${BUILD_FROM}"
- echo "Add-on version ${ADDON_VERSION}"
- echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:dev"
- echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
- |
hassio-builder.sh \
-t . \
-i ottowinter/esphomeyaml-hassio-${ADDON_ARCH} \
-d "$CI_REGISTRY" \
--${ADDON_ARCH}
docker build \
--build-arg "BUILD_FROM=${BUILD_FROM}" \
--build-arg "BUILD_DATE=$(date +"%Y-%m-%dT%H:%M:%SZ")" \
--build-arg "BUILD_ARCH=${ADDON_ARCH}" \
--build-arg "BUILD_REF=${CI_COMMIT_SHA}" \
--build-arg "BUILD_VERSION=${ADDON_VERSION}" \
--tag "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:dev" \
--tag "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
--file "docker/Dockerfile.hassio" \
.
- |
docker tag \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:dev" \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:dev"
retry: 2
if [ "${DO_PUSH:-true}" = true ]; then
echo "Pushing to CI registry"
docker push ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}
docker push ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:dev
fi
# Generic deploy template
.deploy: &deploy
.deploy-release: &deploy-release
<<: *docker-builder
stage: deploy
script:
- version=${CI_COMMIT_TAG:1}
- echo "Publishing version ${version}"
- docker login -u "$DOCKER_USER" -p "$DOCKER_PASSWORD"
- version="${CI_COMMIT_TAG#v}"
- echo "Publishing release version ${version}"
- docker pull "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
- docker login -u "$DOCKER_USER" -p "$DOCKER_PASSWORD"
- echo "Tag ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- |
docker tag \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- echo "Tag ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
- |
docker tag \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
- echo "Tag ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
- |
docker tag \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
- echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- |
docker tag \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
- |
docker tag \
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}" \
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
- docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
- echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
- |
docker tag \
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}" \
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
- docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
only:
- /^v\d+\.\d+\.\d+$/
except:
- /^(?!master).+@/
.deploy-beta: &deploy-beta
<<: *docker-builder
stage: deploy
script:
- version="${CI_COMMIT_TAG#v}"
- echo "Publishing beta version ${version}"
- docker pull "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
- docker login -u "$DOCKER_USER" -p "$DOCKER_PASSWORD"
- echo "Tag ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- |
docker tag \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- echo "Tag ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
- |
docker tag \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
- echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- |
docker tag \
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
- echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
- |
docker tag \
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}" \
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
- docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
only:
- /^v\d+\.\d+\.\d+b\d+$/
except:
- /^(?!rc).+@/
# Build jobs
build:normal:
<<: *docker-builder
stage: build
script:
- docker build -t "${CI_REGISTRY}/ottowinter/esphomeyaml:dev" .
- |
docker tag \
"${CI_REGISTRY}/ottowinter/esphomeyaml:dev" \
"${CI_REGISTRY}/ottowinter/esphomeyaml:${CI_COMMIT_SHA}"
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml:${CI_COMMIT_SHA}"
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml:dev"
build:armhf:
.build-hassio-edge: &build-hassio-edge
<<: *build-hassio
except:
- /^v\d+\.\d+\.\d+$/
- /^v\d+\.\d+\.\d+b\d+$/
.build-hassio-release: &build-hassio-release
<<: *build-hassio
only:
- /^v\d+\.\d+\.\d+$/
- /^v\d+\.\d+\.\d+b\d+$/
build:hassio-armhf-edge:
<<: *build-hassio-edge
variables:
ADDON_ARCH: armhf
DO_PUSH: "false"
build:hassio-armhf:
<<: *build-hassio-release
variables:
ADDON_ARCH: armhf
#build:aarch64:
# <<: *build
# variables:
# ADDON_ARCH: aarch64
#build:hassio-aarch64-edge:
# <<: *build-hassio-edge
# variables:
# ADDON_ARCH: aarch64
# DO_PUSH: "false"
build:i386:
<<: *build-hassio
#build:hassio-aarch64:
# <<: *build-hassio-release
# variables:
# ADDON_ARCH: aarch64
build:hassio-i386-edge:
<<: *build-hassio-edge
variables:
ADDON_ARCH: i386
DO_PUSH: "false"
build:hassio-i386:
<<: *build-hassio-release
variables:
ADDON_ARCH: i386
build:amd64:
<<: *build-hassio
build:hassio-amd64-edge:
<<: *build-hassio-edge
variables:
ADDON_ARCH: amd64
DO_PUSH: "false"
build:hassio-amd64:
<<: *build-hassio-release
variables:
ADDON_ARCH: amd64
# Deploy jobs
deploy:armhf:
<<: *deploy
deploy-release:armhf:
<<: *deploy-release
variables:
ADDON_ARCH: armhf
only:
- /^v\d+\.\d+\.\d+(?:(?:(?:\+|\.)?[a-zA-Z0-9]+)*)?$/
except:
- /^(?!master).+@/
#deploy:aarch64:
# <<: *deploy
deploy-beta:armhf:
<<: *deploy-beta
variables:
ADDON_ARCH: armhf
#deploy-release:aarch64:
# <<: *deploy-release
# variables:
# ADDON_ARCH: aarch64
# only:
# - /^v\d+\.\d+\.\d+(?:(?:(?:\+|\.)?[a-zA-Z0-9]+)*)?$/
# except:
# - /^(?!master).+@/
deploy:i386:
<<: *deploy
#deploy-beta:aarch64:
# <<: *deploy-beta
# variables:
# ADDON_ARCH: aarch64
deploy-release:i386:
<<: *deploy-release
variables:
ADDON_ARCH: i386
only:
- /^v\d+\.\d+\.\d+(?:(?:(?:\+|\.)?[a-zA-Z0-9]+)*)?$/
except:
- /^(?!master).+@/
deploy:amd64:
<<: *deploy
deploy-beta:i386:
<<: *deploy-beta
variables:
ADDON_ARCH: i386
deploy-release:amd64:
<<: *deploy-release
variables:
ADDON_ARCH: amd64
deploy-beta:amd64:
<<: *deploy-beta
variables:
ADDON_ARCH: amd64
.deploy-pypi: &deploy-pypi
stage: deploy
before_script:
- pip install -e .
- pip install twine
script:
- python setup.py sdist
- twine upload dist/*
tags:
- python2.7
- esphomeyaml-test
deploy-release:pypi:
<<: *deploy-pypi
only:
- /^v\d+\.\d+\.\d+(?:(?:(?:\+|\.)?[a-zA-Z0-9]+)*)?$/
- /^v\d+\.\d+\.\d+$/
except:
- /^(?!master).+@/
deploy:pypi:
stage: deploy
before_script:
- pip install -e .
- pip install twine
script:
- python setup.py sdist
- twine upload dist/*
tags:
- python2.7
- esphomeyaml-test
deploy-beta:pypi:
<<: *deploy-pypi
only:
- /^v\d+\.\d+\.\d+(?:(?:(?:\+|\.)?[a-zA-Z0-9]+)*)?$/
- /^v\d+\.\d+\.\d+b\d+$/
except:
- /^(?!master).+@/
- /^(?!rc).+@/


@@ -1,20 +1,30 @@
sudo: false
language: python
python:
- "2.7"
jobs:
matrix:
fast_finish: true
include:
- name: "Lint"
install:
- pip install -r requirements.txt
- pip install flake8==3.5.0 pylint==1.9.3 tzlocal pillow
- python: "2.7"
env: TARGET=Lint2.7
install: pip install -e . && pip install flake8==3.6.0 pylint==1.9.4 pillow
script:
- flake8 esphomeyaml
- pylint esphomeyaml
- name: "Test"
install:
- pip install -e .
- pip install tzlocal pillow
- python: "3.5.3"
env: TARGET=Lint3.5
install: pip install -U https://github.com/platformio/platformio-core/archive/develop.zip && pip install -e . && pip install flake8==3.6.0 pylint==2.2.2 pillow
script:
- flake8 esphomeyaml
- pylint esphomeyaml
- python: "2.7"
env: TARGET=Test2.7
install: pip install -e . && pip install flake8==3.6.0 pylint==1.9.4 pillow
script:
- esphomeyaml tests/test1.yaml compile
- esphomeyaml tests/test2.yaml compile
- python: "3.5.3"
env: TARGET=Test3.5
install: pip install -U https://github.com/platformio/platformio-core/archive/develop.zip && pip install -e . && pip install flake8==3.6.0 pylint==2.2.2 pillow
script:
- esphomeyaml tests/test1.yaml compile
- esphomeyaml tests/test2.yaml compile

CODE_OF_CONDUCT.md Normal file

@@ -0,0 +1,46 @@
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at contact@otto-winter.com. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [http://contributor-covenant.org/version/1/4][version]
[homepage]: http://contributor-covenant.org
[version]: http://contributor-covenant.org/version/1/4/

CONTRIBUTING.md Normal file

@@ -0,0 +1,18 @@
# Contributing to esphomeyaml
esphomeyaml is a part of esphomelib and is responsible for reading in YAML configuration files,
converting them to C++ code. This code is then converted to a platformio project and compiled
with [esphomelib](https://github.com/OttoWinter/esphomelib), the C++ framework behind the project.
For a detailed guide, please see https://esphomelib.com/esphomeyaml/guides/contributing.html#contributing-to-esphomeyaml
Things to note when contributing:
- Please test your changes :)
- If a new feature is added or an existing user-facing feature is changed, you should also
update the [docs](https://github.com/OttoWinter/esphomedocs). See [contributing to esphomedocs](https://esphomelib.com/esphomeyaml/guides/contributing.html#contributing-to-esphomedocs)
for more information.
- Please also update the tests in the `tests/` folder. You can do so by just adding a line in one of the YAML files
which checks if your new feature compiles correctly (see the sketch after this list).
- Sometimes I will let pull requests linger because I'm not 100% sure about them. Please feel free to ping
me after some time.
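As a rough illustration of the test addition mentioned in the list above, assuming a hypothetical new `my_component` sensor platform (the platform name and options are made up for this sketch):

```yaml
# Hypothetical list item appended under the existing sensor: block of
# tests/test1.yaml, so that `esphomeyaml tests/test1.yaml compile` also
# exercises the new platform.
- platform: my_component   # made-up platform name for illustration
  name: "My Component Test"
  update_interval: 15s
```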


@@ -1,25 +1,27 @@
FROM python:2.7
ARG BUILD_FROM=python:2.7
FROM ${BUILD_FROM}
MAINTAINER Otto Winter <contact@otto-winter.com>
RUN apt-get update && apt-get install -y \
python-pil \
&& rm -rf /var/lib/apt/lists/*
git \
&& apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
pip install --no-cache-dir --no-binary :all: platformio && \
platformio settings set enable_telemetry No && \
platformio settings set check_libraries_interval 1000000 && \
platformio settings set check_platformio_interval 1000000 && \
platformio settings set check_platforms_interval 1000000
ENV ESPHOMEYAML_OTA_HOST_PORT=6123
EXPOSE 6123
VOLUME /config
WORKDIR /usr/src/app
RUN pip install --no-cache-dir --no-binary :all: platformio && \
platformio settings set enable_telemetry No
COPY docker/platformio.ini /usr/src/app/
RUN platformio settings set enable_telemetry No && \
platformio run -e espressif32 -e espressif8266; exit 0
COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio
COPY . .
RUN pip install --no-cache-dir -e . && \
pip install --no-cache-dir tzlocal pillow
RUN pip install --no-cache-dir --no-binary :all: -e .
WORKDIR /config
ENTRYPOINT ["esphomeyaml"]


@@ -1,4 +1,17 @@
include README.md
include esphomeyaml/dashboard/templates/index.html
include esphomeyaml/dashboard/templates/login.html
include esphomeyaml/dashboard/static/ace.js
include esphomeyaml/dashboard/static/esphomeyaml.css
include esphomeyaml/dashboard/static/esphomeyaml.js
include esphomeyaml/dashboard/static/favicon.ico
include esphomeyaml/dashboard/static/jquery.min.js
include esphomeyaml/dashboard/static/jquery.validate.min.js
include esphomeyaml/dashboard/static/jquery-ui.min.js
include esphomeyaml/dashboard/static/materialize.min.css
include esphomeyaml/dashboard/static/materialize.min.js
include esphomeyaml/dashboard/static/materialize-stepper.min.css
include esphomeyaml/dashboard/static/materialize-stepper.min.js
include esphomeyaml/dashboard/static/mode-yaml.js
include esphomeyaml/dashboard/static/theme-dreamweaver.js
include esphomeyaml/dashboard/static/ext-searchbox.js


@@ -1,10 +1,10 @@
# esphomeyaml for [esphomelib](https://github.com/OttoWinter/esphomelib)
### Getting Started Guide: https://esphomelib.com/esphomeyaml/getting-started.html
### Getting Started Guide: https://esphomelib.com/esphomeyaml/guides/getting_started_command_line.html
### Available Components: https://esphomelib.com/esphomeyaml/index.html
esphomeyaml is the solution for your ESP8266/ESP32 projects with Home Assistant. It allows you to create **custom firmwares** for your microcontrollers with no programming experience required. All you need to know is the YAML configuration format which is also used by Home Assistant.
esphomeyaml is the solution for your ESP8266/ESP32 projects with Home Assistant. It allows you to create **custom firmwares** for your microcontrollers with no programming experience required. All you need to know is the YAML configuration format which is also used by [Home Assistant](https://www.home-assistant.io).
esphomeyaml will:
@@ -26,7 +26,7 @@ esphomeyaml configuration.yaml run
files like you're used to with Home Assistant.
* **Flexible:** Use [esphomelib](https://github.com/OttoWinter/esphomelib)'s powerful core to create custom sensors/outputs.
* **Fast and efficient:** Written in C++ and keeps memory consumption to a minimum.
* **Made for Home Assistant:** Almost all Home Assistant features are supported out of the box. Including RGB lights and many more.
* **Made for [Home Assistant](https://www.home-assistant.io):** Almost all [Home Assistant](https://www.home-assistant.io) features are supported out of the box. Including RGB lights and many more.
* **Easy reproducible configuration:** No need to go through a long setup process for every single node. Just copy a configuration file and run a single command.
* **Smart Over The Air Updates:** esphomeyaml has OTA updates deeply integrated into the system. It even automatically enters a recovery mode if a boot loop is detected.
* **Powerful logging engine:** View colorful logs and debug issues remotely.


@@ -1,21 +0,0 @@
# Dockerfile for aarch64 version of HassIO add-on
FROM arm64v8/ubuntu:bionic
RUN apt-get update && apt-get install -y --no-install-recommends \
python \
python-pip \
python-setuptools \
python-pil \
git \
&& apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
pip install --no-cache-dir --no-binary :all: platformio && \
platformio settings set enable_telemetry No
COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio
COPY . .
RUN pip install --no-cache-dir --no-binary :all: -e . && \
pip install --no-cache-dir --no-binary :all: tzlocal
CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]


@@ -1,21 +0,0 @@
# Dockerfile for amd64 version of HassIO add-on
FROM ubuntu:bionic
RUN apt-get update && apt-get install -y --no-install-recommends \
python \
python-pip \
python-setuptools \
python-pil \
git \
&& apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
pip install --no-cache-dir --no-binary :all: platformio && \
platformio settings set enable_telemetry No
COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio
COPY . .
RUN pip install --no-cache-dir --no-binary :all: -e . && \
pip install --no-cache-dir --no-binary :all: tzlocal
CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]


@@ -1,31 +0,0 @@
# Dockerfile for armhf version of HassIO add-on
FROM homeassistant/armhf-base:latest
RUN apk add --no-cache \
python2 \
python2-dev \
py2-pip \
git \
gcc \
openssh \
libc6-compat \
jpeg-dev \
zlib-dev \
freetype-dev \
lcms2-dev \
openjpeg-dev \
tiff-dev \
libc-dev \
linux-headers \
&& \
pip install --no-cache-dir --no-binary :all: platformio && \
platformio settings set enable_telemetry No
COPY docker/platformio-esp8266.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio
COPY . .
RUN pip install --no-cache-dir --no-binary :all: -e . && \
pip install --no-cache-dir pillow tzlocal
CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]


@@ -27,6 +27,4 @@ RUN apt-get update && apt-get install -y \
binfmt-support \
&& rm -rf /var/lib/apt/lists/*
COPY docker/hassio-builder.sh /usr/bin/
WORKDIR /data

docker/Dockerfile.hassio Normal file

@@ -0,0 +1,75 @@
ARG BUILD_FROM=hassioaddons/ubuntu-base:2.2.0
# hadolint ignore=DL3006
FROM ${BUILD_FROM}
# Set shell
SHELL ["/bin/bash", "-o", "pipefail", "-c"]
# Copy root filesystem
COPY esphomeyaml-edge/rootfs /
COPY setup.py setup.cfg MANIFEST.in /opt/esphomeyaml/
COPY esphomeyaml /opt/esphomeyaml/esphomeyaml
RUN \
# Temporarily move nginx.conf (otherwise dpkg fails)
mv /etc/nginx/nginx.conf /etc/nginx/nginx.conf.bkp \
# Install add-on dependencies
&& apt-get update \
&& apt-get install -y --no-install-recommends \
# Python for esphomeyaml
python \
python-pip \
python-setuptools \
# Python Pillow for display component
python-pil \
# Git for esphomelib downloads
git \
# Ping for dashboard online/offline status
iputils-ping \
# NGINX proxy
nginx \
\
&& mv /etc/nginx/nginx.conf.bkp /etc/nginx/nginx.conf \
\
&& pip2 install --no-cache-dir --no-binary :all: -e /opt/esphomeyaml \
\
# Change some platformio settings
&& platformio settings set enable_telemetry No \
&& platformio settings set check_libraries_interval 1000000 \
&& platformio settings set check_platformio_interval 1000000 \
&& platformio settings set check_platforms_interval 1000000 \
\
# Build an empty platformio project to force platformio to install all fw build dependencies
# The return-code will be non-zero since there's nothing to build.
&& (platformio run -d /opt/pio; echo "Done") \
\
# Cleanup
&& rm -fr \
/tmp/* \
/var/{cache,log}/* \
/var/lib/apt/lists/* \
/opt/pio/
# Build arguments
ARG BUILD_ARCH=amd64
ARG BUILD_DATE
ARG BUILD_REF
ARG BUILD_VERSION
# Labels
LABEL \
io.hass.name="esphomeyaml" \
io.hass.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.type="addon" \
io.hass.version=${BUILD_VERSION} \
maintainer="Otto Winter <contact@otto-winter.com>" \
org.label-schema.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
org.label-schema.build-date=${BUILD_DATE} \
org.label-schema.name="esphomeyaml" \
org.label-schema.schema-version="1.0" \
org.label-schema.url="https://esphomelib.com" \
org.label-schema.usage="https://github.com/OttoWinter/esphomeyaml/tree/dev/esphomeyaml/README.md" \
org.label-schema.vcs-ref=${BUILD_REF} \
org.label-schema.vcs-url="https://github.com/OttoWinter/esphomeyaml" \
org.label-schema.vendor="esphomelib"


@@ -1,21 +0,0 @@
# Dockerfile for i386 version of HassIO add-on
FROM i386/ubuntu:bionic
RUN apt-get update && apt-get install -y --no-install-recommends \
python \
python-pip \
python-setuptools \
python-pil \
git \
&& apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
pip install --no-cache-dir --no-binary :all: platformio && \
platformio settings set enable_telemetry No
COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio
COPY . .
RUN pip install --no-cache-dir --no-binary :all: -e . && \
pip install --no-cache-dir --no-binary :all: tzlocal
CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]


@@ -3,4 +3,4 @@ FROM python:2.7
COPY requirements.txt /requirements.txt
RUN pip install -r /requirements.txt && \
pip install flake8==3.5.0 pylint==1.9.3 tzlocal pillow
pip install flake8==3.6.0 pylint==1.9.4 pillow


@@ -8,12 +8,14 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
git \
&& apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
pip install --no-cache-dir --no-binary :all: platformio && \
platformio settings set enable_telemetry No
platformio settings set enable_telemetry No && \
platformio settings set check_libraries_interval 1000000 && \
platformio settings set check_platformio_interval 1000000 && \
platformio settings set check_platforms_interval 1000000
COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt && \
pip install --no-cache-dir tzlocal pillow
RUN pip install --no-cache-dir -r /requirements.txt


@@ -1,318 +0,0 @@
#!/usr/bin/env bash
# Based on Home Assistant's docker builder
######################
# Hass.io Build-env
######################
set -e
echo -- "$@"
#### Variable ####
DOCKER_TIMEOUT=20
DOCKER_PID=-1
DOCKER_HUB=""
DOCKER_CACHE="true"
DOCKER_LOCAL="false"
TARGET=""
IMAGE=""
BUILD_LIST=()
BUILD_TASKS=()
#### Misc functions ####
function print_help() {
cat << EOF
Hass.io build-env for ecosystem:
docker run --rm homeassistant/{arch}-builder:latest [options]
Options:
-h, --help
Display this help and exit.
Repository / Data
-t, --target <PATH_TO_BUILD>
Set local folder or path inside repository for build.
Version/Image handling
-i, --image <IMAGE_NAME>
Overwrite image name of build / support {arch}
Architecture
--armhf
Build for arm.
--amd64
Build for intel/amd 64bit.
--aarch64
Build for arm 64bit.
--i386
Build for intel/amd 32bit.
--all
Build all architecture.
Build handling
--no-cache
Disable cache for the build (from latest).
-d, --docker-hub <DOCKER_REPOSITORY>
Set or overwrite the docker repository.
Use the host docker socket if mapped into container:
/var/run/docker.sock
EOF
exit 1
}
#### Docker functions ####
function start_docker() {
local starttime
local endtime
if [ -S "/var/run/docker.sock" ]; then
echo "[INFO] Use host docker setup with '/var/run/docker.sock'"
DOCKER_LOCAL="true"
return 0
fi
echo "[INFO] Starting docker."
dockerd 2> /dev/null &
DOCKER_PID=$!
echo "[INFO] Waiting for docker to initialize..."
starttime="$(date +%s)"
endtime="$(date +%s)"
until docker info >/dev/null 2>&1; do
if [ $((endtime - starttime)) -le ${DOCKER_TIMEOUT} ]; then
sleep 1
endtime=$(date +%s)
else
echo "[ERROR] Timeout while waiting for docker to come up"
exit 1
fi
done
echo "[INFO] Docker was initialized"
}
function stop_docker() {
local starttime
local endtime
if [ "$DOCKER_LOCAL" == "true" ]; then
return 0
fi
echo "[INFO] Stopping in container docker..."
if [ "$DOCKER_PID" -gt 0 ] && kill -0 "$DOCKER_PID" 2> /dev/null; then
starttime="$(date +%s)"
endtime="$(date +%s)"
# Now wait for it to die
kill "$DOCKER_PID"
while kill -0 "$DOCKER_PID" 2> /dev/null; do
if [ $((endtime - starttime)) -le ${DOCKER_TIMEOUT} ]; then
sleep 1
endtime=$(date +%s)
else
echo "[ERROR] Timeout while waiting for container docker to die"
exit 1
fi
done
else
echo "[WARN] Your host might have been left with unreleased resources"
fi
}
function run_build() {
local build_dir=$1
local repository=$2
local image=$3
local version=$4
local build_arch=$5
local docker_cli=("${!6}")
local push_images=()
# Overwrites
if [ ! -z "$DOCKER_HUB" ]; then repository="$DOCKER_HUB"; fi
if [ ! -z "$IMAGE" ]; then image="$IMAGE"; fi
# Init Cache
if [ "$DOCKER_CACHE" == "true" ]; then
echo "[INFO] Init cache for $repository/$image:$version"
if docker pull "$repository/$image:latest" > /dev/null 2>&1; then
docker_cli+=("--cache-from" "$repository/$image:latest")
else
docker_cli+=("--no-cache")
echo "[WARN] No cache image found. Cache is disabled for build"
fi
else
docker_cli+=("--no-cache")
fi
# Build image
echo "[INFO] Run build for $repository/$image:$version"
docker build --pull -t "$repository/$image:$version" \
--label "io.hass.version=$version" \
--label "io.hass.arch=$build_arch" \
-f "$TARGET/docker/Dockerfile.$build_arch" \
"${docker_cli[@]}" \
"$build_dir"
echo "[INFO] Finish build for $repository/$image:$version"
docker tag "$repository/$image:$version" "$repository/$image:dev"
}
#### HassIO functions ####
function build_addon() {
local build_arch=$1
local docker_cli=()
local image=""
local repository=""
local raw_image=""
local name=""
local description=""
local url=""
local args=""
# Read addon config.json
name="$(jq --raw-output '.name // empty' "$TARGET/esphomeyaml/config.json" | sed "s/'//g")"
description="$(jq --raw-output '.description // empty' "$TARGET/esphomeyaml/config.json" | sed "s/'//g")"
url="$(jq --raw-output '.url // empty' "$TARGET/esphomeyaml/config.json")"
version="$(jq --raw-output '.version' "$TARGET/esphomeyaml/config.json")"
raw_image="$(jq --raw-output '.image // empty' "$TARGET/esphomeyaml/config.json")"
# Read data from image
if [ ! -z "$raw_image" ]; then
repository="$(echo "$raw_image" | cut -f 1 -d '/')"
image="$(echo "$raw_image" | cut -f 2 -d '/')"
fi
# Set additional labels
docker_cli+=("--label" "io.hass.name=$name")
docker_cli+=("--label" "io.hass.description=$description")
docker_cli+=("--label" "io.hass.type=addon")
if [ ! -z "$url" ]; then
docker_cli+=("--label" "io.hass.url=$url")
fi
# Start build
run_build "$TARGET" "$repository" "$image" "$version" \
"$build_arch" docker_cli[@]
}
#### initialized cross-build ####
function init_crosscompile() {
echo "[INFO] Setup crosscompiling feature"
(
mount binfmt_misc -t binfmt_misc /proc/sys/fs/binfmt_misc
update-binfmts --enable qemu-arm
update-binfmts --enable qemu-aarch64
) > /dev/null 2>&1 || echo "[WARN] Can't enable crosscompiling feature"
}
function clean_crosscompile() {
echo "[INFO] Clean crosscompiling feature"
if [ -f /proc/sys/fs/binfmt_misc ]; then
umount /proc/sys/fs/binfmt_misc || true
fi
(
update-binfmts --disable qemu-arm
update-binfmts --disable qemu-aarch64
) > /dev/null 2>&1 || echo "[WARN] No crosscompiling feature found for cleanup"
}
#### Error handling ####
function error_handling() {
stop_docker
clean_crosscompile
exit 1
}
trap 'error_handling' SIGINT SIGTERM
#### Parse arguments ####
while [[ $# -gt 0 ]]; do
key=$1
case ${key} in
-h|--help)
print_help
;;
-t|--target)
TARGET=$2
shift
;;
-i|--image)
IMAGE=$2
shift
;;
--no-cache)
DOCKER_CACHE="false"
;;
-d|--docker-hub)
DOCKER_HUB=$2
shift
;;
--armhf)
BUILD_LIST+=("armhf")
;;
--amd64)
BUILD_LIST+=("amd64")
;;
--i386)
BUILD_LIST+=("i386")
;;
--aarch64)
BUILD_LIST+=("aarch64")
;;
--all)
BUILD_LIST=("armhf" "amd64" "i386" "aarch64")
;;
*)
echo "[WARN] $0 : Argument '$1' unknown will be Ignoring"
;;
esac
shift
done
# Check if an architecture is available
if [ "${#BUILD_LIST[@]}" -eq 0 ]; then
echo "[ERROR] You need select an architecture for build!"
exit 1
fi
#### Main ####
mkdir -p /data
# Setup docker env
init_crosscompile
start_docker
# Select arch build
for arch in "${BUILD_LIST[@]}"; do
(build_addon "$arch") &
BUILD_TASKS+=($!)
done
# Wait until all build jobs are done
wait "${BUILD_TASKS[@]}"
# Cleanup docker env
clean_crosscompile
stop_docker
exit 0


@@ -1,7 +0,0 @@
; This file allows the docker build file to install the required platformio
; platforms
[env:espressif8266]
platform = espressif8266
board = nodemcuv2
framework = arduino


@@ -2,11 +2,11 @@
; platforms
[env:espressif8266]
platform = espressif8266
platform = espressif8266@1.8.0
board = nodemcuv2
framework = arduino
[env:espressif32]
platform = espressif32
platform = espressif32@1.5.0
board = nodemcu-32s
framework = arduino

esphomeyaml-beta/README.md Normal file

@@ -0,0 +1,109 @@
# Esphomeyaml Hass.io Add-On
[![esphomeyaml logo](https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/logo.png)](https://esphomelib.com/esphomeyaml/index.html)
[![GitHub stars](https://img.shields.io/github/stars/OttoWinter/esphomelib.svg?style=social&label=Star&maxAge=2592000)](https://github.com/OttoWinter/esphomelib)
[![GitHub Release][releases-shield]][releases]
[![Discord][discord-shield]][discord]
## About
This add-on allows you to manage and program your ESP8266 and ESP32 based microcontrollers
directly through Hass.io **with no programming experience required**. All you need to do
is write YAML configuration files; the rest (over-the-air updates, compiling) is all
handled by esphomeyaml.
<p align="center">
<img title="esphomeyaml dashboard screenshot" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/screenshot.png" width="700px"></img>
</p>
[_View the esphomeyaml documentation here_](https://esphomelib.com/esphomeyaml/index.html)
## Example
With esphomeyaml, you can go from a few lines of YAML straight to a custom-made
firmware. For example, to include a [DHT22][dht22]
temperature and humidity sensor, you just need to include 8 lines of YAML
in your configuration file:
<img title="esphomeyaml DHT configuration example" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/dht-example.png" width="500px"></img>
Then just click UPLOAD and the sensor will magically appear in Home Assistant:
<img title="esphomelib Home Assistant MQTT discovery" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/temperature-humidity.png" width="600px"></img>
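The DHT configuration itself only appears above as a screenshot. As a rough sketch of what such an entry typically looks like in an esphomeyaml configuration (the pin, entity names and update interval below are illustrative assumptions, not taken from the image):

```yaml
# Illustrative sketch only - pin, entity names and update_interval are assumed values.
sensor:
  - platform: dht
    pin: D2
    temperature:
      name: "Living Room Temperature"
    humidity:
      name: "Living Room Humidity"
    update_interval: 60s
```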
## Installation
To install this Hass.io add-on you need to add the esphomeyaml add-on repository
first:
1. Add the esphomeyaml add-ons repository to your Hass.io instance. You can do this by navigating to the "Add-on Store" tab in the Hass.io panel and then entering https://github.com/OttoWinter/esphomeyaml in the "Add new repository by URL" field.
2. Now scroll down and select the "esphomeyaml" add-on.
3. Press install to download the add-on and unpack it on your machine. This can take some time.
4. Optional: If you're using SSL certificates and want to encrypt your communication to this add-on, please enter `true` into the `ssl` field and set the `fullchain` and `certfile` options accordingly.
5. Start the add-on, check the logs of the add-on to see if everything went well.
6. Click "OPEN WEB UI" to open the esphomeyaml dashboard. You will be asked for your Home Assistant credentials - esphomeyaml uses Hass.io's authentication system to log you in.
**NOTE**: Installation on RPis running in 64-bit mode is currently not possible. Please use the 32-bit variant of HassOS instead.
You can view the esphomeyaml docs here: https://esphomelib.com/esphomeyaml/index.html
## Configuration
**Note**: _Remember to restart the add-on when the configuration is changed._
Example add-on configuration:
```json
{
"ssl": false,
"certfile": "fullchain.pem",
"keyfile": "privkey.pem",
"port": 6052
}
```
### Option: `port`
The port to start the dashboard server on. Default is 6052.
### Option: `ssl`
Enables/Disables encrypted SSL (HTTPS) connections to the web server of this add-on.
Set it to `true` to encrypt communications, `false` otherwise.
Please note that if you set this to `true` you must also generate the key and certificate
files for encryption. For example using [Let's Encrypt](https://www.home-assistant.io/addons/lets_encrypt/)
or [Self-signed certificates](https://www.home-assistant.io/docs/ecosystem/certificates/tls_self_signed_certificate/).
### Option: `certfile`
The certificate file to use for SSL. If this file doesn't exist, the add-on start will fail.
**Note**: The file MUST be stored in `/ssl/`, which is the default for Hass.io
### Option: `keyfile`
The private key file to use for SSL. If this file doesn't exist, the add-on start will fail.
**Note**: The file MUST be stored in `/ssl/`, which is the default for Hass.io
### Option: `leave_front_door_open`
Adding this option to the add-on configuration allows you to disable
authentication by setting it to `true`.
### Option: `esphomeyaml_version`
Manually override which esphomeyaml version to use in the addon.
For example to install the latest development version, use `"esphomeyaml_version": "dev"`,
or for version 1.10.0: `"esphomeyaml_version": "v1.10.0"`.
Please note that this does not always work and is only meant for testing; usually the
esphomeyaml add-on and dashboard versions must match to guarantee a working system.
[discord-shield]: https://img.shields.io/discord/429907082951524364.svg
[dht22]: https://esphomelib.com/esphomeyaml/components/sensor/dht.html
[discord]: https://discord.me/KhAMKrd
[releases-shield]: https://img.shields.io/github/release/OttoWinter/esphomeyaml.svg
[releases]: https://esphomelib.com/esphomeyaml/changelog/index.html
[repository]: https://github.com/OttoWinter/esphomeyaml


@@ -0,0 +1,39 @@
{
"name": "esphomeyaml-beta",
"version": "1.10.1",
"slug": "esphomeyaml-beta",
"description": "Beta version of esphomeyaml Hass.io add-on.",
"url": "https://beta.esphomelib.com/esphomeyaml/index.html",
"webui": "http://[HOST]:[PORT:6052]",
"startup": "application",
"arch": [
"amd64",
"armhf",
"i386"
],
"hassio_api": true,
"auth_api": true,
"hassio_role": "default",
"homeassistant_api": false,
"host_network": true,
"boot": "auto",
"auto_uart": true,
"map": [
"ssl",
"config:rw"
],
"options": {
"ssl": false,
"certfile": "fullchain.pem",
"keyfile": "privkey.pem",
"port": 6052
},
"schema": {
"ssl": "bool",
"certfile": "str",
"keyfile": "str",
"port": "int",
"leave_front_door_open": "bool?",
"esphomeyaml_version": "str?"
}
}

esphomeyaml-beta/icon.png (new binary file, 2.8 KiB, not shown)

esphomeyaml-beta/logo.png (new binary file, 8.6 KiB, not shown)


@@ -1,59 +1,73 @@
# Dockerfile for HassIO add-on
ARG BUILD_FROM=ubuntu:bionic
ARG BUILD_FROM=hassioaddons/ubuntu-base:2.2.0
# hadolint ignore=DL3006
FROM ${BUILD_FROM}
# Re-declare BUILD_FROM to fix weird docker issue
ARG BUILD_FROM
# Set shell
SHELL ["/bin/bash", "-o", "pipefail", "-c"]
# On amd64 and alike, using ubuntu as the base is better as building
# for the ESP32 only works with glibc (and ubuntu). However, on armhf
# the build toolchain frequently produces segfaults under ubuntu.
# -> Use ubuntu for most architectures, except alpine for armhf
#
# * python and related required because this is a python project
# * git required for platformio library dependencies downloads
# * libc6-compat and openssh required on alpine for weird reasons
# * disable platformio telemetry on install
RUN /bin/bash -c "if [[ '$BUILD_FROM' = *\"ubuntu\"* ]]; then \
apt-get update && apt-get install -y --no-install-recommends \
python python-pip python-setuptools python-pil git && \
rm -rf /var/lib/apt/lists/* /tmp/*; \
else \
apk add --no-cache \
python2 \
python2-dev \
py2-pip \
git \
gcc \
openssh \
libc6-compat \
jpeg-dev \
zlib-dev \
freetype-dev \
lcms2-dev \
openjpeg-dev \
tiff-dev \
libc-dev \
linux-headers; \
fi" && \
pip install --no-cache-dir platformio && \
platformio settings set enable_telemetry No
# Copy root filesystem
COPY rootfs /
RUN \
# Temporarily move nginx.conf (otherwise dpkg fails)
mv /etc/nginx/nginx.conf /etc/nginx/nginx.conf.bkp \
# Install add-on dependencies
&& apt-get update \
&& apt-get install -y --no-install-recommends \
# Python for esphomeyaml
python \
python-pip \
python-setuptools \
# Python Pillow for display component
python-pil \
# Git for esphomelib downloads
git \
# Ping for dashboard online/offline status
iputils-ping \
# NGINX proxy
nginx \
\
&& mv /etc/nginx/nginx.conf.bkp /etc/nginx/nginx.conf \
\
&& pip2 install --no-cache-dir --no-binary :all: https://github.com/OttoWinter/esphomeyaml/archive/dev.zip \
\
# Change some platformio settings
&& platformio settings set enable_telemetry No \
&& platformio settings set check_libraries_interval 1000000 \
&& platformio settings set check_platformio_interval 1000000 \
&& platformio settings set check_platforms_interval 1000000 \
\
# Build an empty platformio project to force platformio to install all fw build dependencies
# The return-code will be non-zero since there's nothing to build.
&& (platformio run -d /opt/pio; echo "Done") \
\
# Cleanup
&& rm -fr \
/tmp/* \
/var/{cache,log}/* \
/var/lib/apt/lists/* \
/opt/pio/
# Create fake project to make platformio install all dependencies.
# * Ignore build errors from platformio - empty project
# * On alpine, only install ESP8266 toolchain
COPY platformio.ini /pio/platformio.ini
RUN /bin/bash -c "if [[ '$BUILD_FROM' = *\"ubuntu\"* ]]; then \
platformio run -e espressif32 -e espressif8266 -d /pio; exit 0; \
else \
echo \"\$(head -8 /pio/platformio.ini)\" >/pio/platformio.ini; \
platformio run -e espressif8266 -d /pio; exit 0; \
fi"
# Build arguments
ARG BUILD_ARCH=amd64
ARG BUILD_DATE
ARG BUILD_REF
ARG BUILD_VERSION
# Install latest esphomeyaml from git
RUN pip install --no-cache-dir \
git+git://github.com/OttoWinter/esphomeyaml.git && \
pip install --no-cache-dir pillow tzlocal
CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]
# Labels
LABEL \
io.hass.name="esphomeyaml-edge" \
io.hass.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.type="addon" \
io.hass.version=${BUILD_VERSION} \
maintainer="Otto Winter <contact@otto-winter.com>" \
org.label-schema.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
org.label-schema.build-date=${BUILD_DATE} \
org.label-schema.name="esphomeyaml-edge" \
org.label-schema.schema-version="1.0" \
org.label-schema.url="https://esphomelib.com" \
org.label-schema.usage="https://github.com/OttoWinter/esphomeyaml/tree/dev/esphomeyaml-edge/README.md" \
org.label-schema.vcs-ref=${BUILD_REF} \
org.label-schema.vcs-url="https://github.com/OttoWinter/esphomeyaml" \
org.label-schema.vendor="esphomelib"

109
esphomeyaml-edge/README.md Normal file
View File

@@ -0,0 +1,109 @@
# Esphomeyaml Hass.io Add-On
[![esphomeyaml logo](https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/logo.png)](https://esphomelib.com/esphomeyaml/index.html)
[![GitHub stars](https://img.shields.io/github/stars/OttoWinter/esphomelib.svg?style=social&label=Star&maxAge=2592000)](https://github.com/OttoWinter/esphomelib)
[![GitHub Release][releases-shield]][releases]
[![Discord][discord-shield]][discord]
## About
This add-on allows you to manage and program your ESP8266 and ESP32 based microcontrollers
directly through Hass.io **with no programming experience required**. All you need to do
is write YAML configuration files; the rest (over-the-air updates, compiling) is all
handled by esphomeyaml.
<p align="center">
<img title="esphomeyaml dashboard screenshot" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/screenshot.png" width="700px"></img>
</p>
[_View the esphomeyaml documentation here_](https://esphomelib.com/esphomeyaml/index.html)
## Example
With esphomeyaml, you can go from a few lines of YAML straight to a custom-made
firmware. For example, to include a [DHT22][dht22]
temperature and humidity sensor, you just need to include 8 lines of YAML
in your configuration file:
<img title="esphomeyaml DHT configuration example" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/dht-example.png" width="500px"></img>
Then just click UPLOAD and the sensor will magically appear in Home Assistant:
<img title="esphomelib Home Assistant MQTT discovery" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/temperature-humidity.png" width="600px"></img>
## Installation
To install this Hass.io add-on you need to add the esphomeyaml add-on repository
first:
1. Add the esphomeyaml add-ons repository to your Hass.io instance. You can do this by navigating to the "Add-on Store" tab in the Hass.io panel and then entering https://github.com/OttoWinter/esphomeyaml in the "Add new repository by URL" field.
2. Now scroll down and select the "esphomeyaml" add-on.
3. Press install to download the add-on and unpack it on your machine. This can take some time.
4. Optional: If you're using SSL certificates and want to encrypt your communication to this add-on, please enter `true` into the `ssl` field and set the `certfile` and `keyfile` options accordingly.
5. Start the add-on, check the logs of the add-on to see if everything went well.
6. Click "OPEN WEB UI" to open the esphomeyaml dashboard. You will be asked for your Home Assistant credentials - esphomeyaml uses Hass.io's authentication system to log you in.
**NOTE**: Installation on RPis running in 64-bit mode is currently not possible. Please use the 32-bit variant of HassOS instead.
You can view the esphomeyaml docs here: https://esphomelib.com/esphomeyaml/index.html
## Configuration
**Note**: _Remember to restart the add-on when the configuration is changed._
Example add-on configuration:
```json
{
"ssl": false,
"certfile": "fullchain.pem",
"keyfile": "privkey.pem",
"port": 6052
}
```
### Option: `port`
The port to start the dashboard server on. Default is 6052.
### Option: `ssl`
Enables/Disables encrypted SSL (HTTPS) connections to the web server of this add-on.
Set it to `true` to encrypt communications, `false` otherwise.
Please note that if you set this to `true` you must also generate the key and certificate
files for encryption. For example using [Let's Encrypt](https://www.home-assistant.io/addons/lets_encrypt/)
or [Self-signed certificates](https://www.home-assistant.io/docs/ecosystem/certificates/tls_self_signed_certificate/).
### Option: `certfile`
The certificate file to use for SSL. If this file doesn't exist, the add-on start will fail.
**Note**: The file MUST be stored in `/ssl/`, which is the default for Hass.io
### Option: `keyfile`
The private key file to use for SSL. If this file doesn't exist, the add-on start will fail.
**Note**: The file MUST be stored in `/ssl/`, which is the default for Hass.io
### Option: `leave_front_door_open`
Adding this option to the add-on configuration allows you to disable
authentication by setting it to `true`.
### Option: `esphomeyaml_version`
Manually override which esphomeyaml version to use in the add-on.
For example to install the latest development version, use `"esphomeyaml_version": "dev"`,
or for version 1.10.0: `"esphomeyaml_version": "v1.10.0"`.
Please note that this does not always work and is only meant for testing; usually, the
esphomeyaml add-on and dashboard versions must match to guarantee a working system.
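For example, a configuration with SSL enabled and a pinned version could look like the following sketch (the certificate file names below are just the defaults and may differ on your system):

```json
{
  "ssl": true,
  "certfile": "fullchain.pem",
  "keyfile": "privkey.pem",
  "port": 6052,
  "esphomeyaml_version": "v1.10.0"
}
```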
[discord-shield]: https://img.shields.io/discord/429907082951524364.svg
[dht22]: https://esphomelib.com/esphomeyaml/components/sensor/dht.html
[discord]: https://discord.me/KhAMKrd
[releases-shield]: https://img.shields.io/github/release/OttoWinter/esphomeyaml.svg
[releases]: https://esphomelib.com/esphomeyaml/changelog/index.html
[repository]: https://github.com/OttoWinter/esphomeyaml

View File

@@ -1,10 +1,10 @@
{
"squash": false,
"build_from": {
"aarch64": "arm64v8/ubuntu:bionic",
"amd64": "ubuntu:bionic",
"armhf": "homeassistant/armhf-base:latest",
"i386": "i386/ubuntu:bionic"
},
"args": {}
"squash": false,
"build_from": {
"aarch64": "hassioaddons/ubuntu-base-aarch64:2.2.0",
"amd64": "hassioaddons/ubuntu-base-amd64:2.2.0",
"armhf": "hassioaddons/ubuntu-base-armhf:2.2.0",
"i386": "hassioaddons/ubuntu-base-i386:2.2.0"
},
"args": {}
}

View File

@@ -2,32 +2,39 @@
"name": "esphomeyaml-edge",
"version": "dev",
"slug": "esphomeyaml-edge",
"description": "Development build of the esphomeyaml HassIO add-on.",
"url": "https://esphomelib.com/esphomeyaml/index.html",
"startup": "application",
"description": "Development Version! Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files",
"url": "https://github.com/OttoWinter/esphomeyaml/tree/dev/esphomeyaml-edge/README.md",
"webui": "http://[HOST]:[PORT:6052]",
"boot": "auto",
"ports": {
"6052/tcp": 6052,
"6053/tcp": 6053
},
"startup": "application",
"arch": [
"aarch64",
"amd64",
"armhf",
"i386"
],
"hassio_api": true,
"auth_api": true,
"hassio_role": "default",
"homeassistant_api": false,
"host_network": true,
"boot": "auto",
"auto_uart": true,
"map": [
"ssl",
"config:rw"
],
"options": {
"password": ""
"ssl": false,
"certfile": "fullchain.pem",
"keyfile": "privkey.pem",
"port": 6052
},
"schema": {
"password": "str?"
},
"environment": {
"ESPHOMEYAML_OTA_HOST_PORT": "6053"
"ssl": "bool",
"certfile": "str",
"keyfile": "str",
"port": "int",
"leave_front_door_open": "bool?",
"esphomeyaml_version": "str?"
}
}

BIN
esphomeyaml-edge/icon.png Normal file (new binary image, 2.8 KiB; preview not shown)

(Three further binary image files added; previews not shown. Sizes: 17 KiB, 50 KiB and 5.3 KiB.)

BIN
esphomeyaml-edge/logo.png Normal file (new binary image, 8.6 KiB; preview not shown)

View File

@@ -0,0 +1,35 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# This file checks that all user configuration requirements are met
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh
# Check SSL requirements, if enabled
if hass.config.true 'ssl'; then
if ! hass.config.has_value 'certfile'; then
hass.die 'SSL is enabled, but no certfile was specified.'
fi
if ! hass.config.has_value 'keyfile'; then
hass.die 'SSL is enabled, but no keyfile was specified'
fi
if ! hass.file_exists "/ssl/$(hass.config.get 'certfile')"; then
if ! hass.file_exists "/ssl/$(hass.config.get 'keyfile')"; then
# Both files are missing, let's print a friendlier error message
text="You enabled encrypted connections using the \"ssl\": true option.
However, the SSL files \"$(hass.config.get 'certfile')\" and \"$(hass.config.get 'keyfile')\"
were not found. If you're using Hass.io on your local network and don't want
to encrypt connections to the esphomeyaml dashboard, you can manually disable
SSL by setting \"ssl\" to false."
hass.die "${text}"
fi
hass.die 'The configured certfile is not found'
fi
if ! hass.file_exists "/ssl/$(hass.config.get 'keyfile')"; then
hass.die 'The configured keyfile is not found'
fi
fi

View File

@@ -0,0 +1,28 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# Configures NGINX for use with esphomeyaml
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh
declare certfile
declare keyfile
declare port
mkdir -p /var/log/nginx
# Enable SSL
if hass.config.true 'ssl'; then
rm /etc/nginx/nginx.conf
mv /etc/nginx/nginx-ssl.conf /etc/nginx/nginx.conf
certfile=$(hass.config.get 'certfile')
keyfile=$(hass.config.get 'keyfile')
sed -i "s/%%certfile%%/${certfile}/g" /etc/nginx/nginx.conf
sed -i "s/%%keyfile%%/${keyfile}/g" /etc/nginx/nginx.conf
fi
port=$(hass.config.get 'port')
sed -i "s/%%port%%/${port}/g" /etc/nginx/nginx.conf

View File

@@ -0,0 +1,14 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# This file installs the user-specified esphomeyaml version, if one is set
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh
declare esphomeyaml_version
if hass.config.has_value 'esphomeyaml_version'; then
esphomeyaml_version=$(hass.config.get 'esphomeyaml_version')
pip2 install --no-cache-dir --no-binary :all: "https://github.com/OttoWinter/esphomeyaml/archive/${esphomeyaml_version}.zip"
fi

View File

@@ -0,0 +1,62 @@
worker_processes 1;
pid /var/run/nginx.pid;
error_log stderr;
events {
worker_connections 1024;
}
http {
access_log stdout;
include mime.types;
default_type application/octet-stream;
sendfile on;
keepalive_timeout 65;
upstream esphomeyaml {
ip_hash;
server unix:/var/run/esphomeyaml.sock;
}
map $http_upgrade $connection_upgrade {
default upgrade;
'' close;
}
server {
server_name hassio.local;
listen %%port%% default_server ssl;
root /dev/null;
ssl_certificate /ssl/%%certfile%%;
ssl_certificate_key /ssl/%%keyfile%%;
ssl_protocols TLSv1.2;
ssl_prefer_server_ciphers on;
ssl_ciphers ECDHE-RSA-AES256-GCM-SHA512:DHE-RSA-AES256-GCM-SHA512:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:DHE-RSA-AES256-SHA;
ssl_ecdh_curve secp384r1;
ssl_session_timeout 10m;
ssl_session_cache shared:SSL:10m;
ssl_session_tickets off;
ssl_stapling on;
ssl_stapling_verify on;
# Redirect http requests to https on the same port.
# https://rageagainstshell.com/2016/11/redirect-http-to-https-on-the-same-port-in-nginx/
error_page 497 https://$http_host$request_uri;
location / {
proxy_redirect off;
proxy_pass http://esphomeyaml;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $connection_upgrade;
proxy_set_header Authorization "";
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header Host $http_host;
proxy_set_header X-NginX-Proxy true;
}
}
}

View File

@@ -0,0 +1,46 @@
worker_processes 1;
pid /var/run/nginx.pid;
error_log stderr;
events {
worker_connections 1024;
}
http {
access_log stdout;
include mime.types;
default_type application/octet-stream;
sendfile on;
keepalive_timeout 65;
upstream esphomeyaml {
ip_hash;
server unix:/var/run/esphomeyaml.sock;
}
map $http_upgrade $connection_upgrade {
default upgrade;
'' close;
}
server {
server_name hassio.local;
listen %%port%% default_server;
root /dev/null;
location / {
proxy_redirect off;
proxy_pass http://esphomeyaml;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $connection_upgrade;
proxy_set_header Authorization "";
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header Host $http_host;
proxy_set_header X-NginX-Proxy true;
}
}
}

View File

@@ -0,0 +1,9 @@
#!/usr/bin/execlineb -S0
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# Take down the S6 supervision tree when esphomeyaml fails
# ==============================================================================
if -n { s6-test $# -ne 0 }
if -n { s6-test ${1} -eq 256 }
s6-svscanctl -t /var/run/s6/services

View File

@@ -0,0 +1,14 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# Runs the esphomeyaml dashboard
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh
if hass.config.true 'leave_front_door_open'; then
export DISABLE_HA_AUTHENTICATION=true
fi
hass.log.info "Starting esphomeyaml dashboard..."
exec esphomeyaml /config/esphomeyaml dashboard --socket /var/run/esphomeyaml.sock --hassio

View File

@@ -0,0 +1,9 @@
#!/usr/bin/execlineb -S0
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# Take down the S6 supervision tree when NGINX fails
# ==============================================================================
if -n { s6-test $# -ne 0 }
if -n { s6-test ${1} -eq 256 }
s6-svscanctl -t /var/run/s6/services

View File

@@ -0,0 +1,10 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# Runs the NGINX proxy
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh
hass.log.info "Starting NGINX..."
exec nginx -g "daemon off;"

View File

@@ -2,11 +2,11 @@
; platforms
[env:espressif8266]
platform = espressif8266
platform = espressif8266@1.8.0
board = nodemcuv2
framework = arduino
[env:espressif32]
platform = espressif32
platform = espressif32@1.5.0
board = nodemcu-32s
framework = arduino

109
esphomeyaml/README.md Normal file
View File

@@ -0,0 +1,109 @@
# Esphomeyaml Hass.io Add-On
[![esphomeyaml logo](https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/logo.png)](https://esphomelib.com/esphomeyaml/index.html)
[![GitHub stars](https://img.shields.io/github/stars/OttoWinter/esphomelib.svg?style=social&label=Star&maxAge=2592000)](https://github.com/OttoWinter/esphomelib)
[![GitHub Release][releases-shield]][releases]
[![Discord][discord-shield]][discord]
## About
This add-on allows you to manage and program your ESP8266 and ESP32 based microcontrollers
directly through Hass.io **with no programming experience required**. All you need to do
is write YAML configuration files; the rest (over-the-air updates, compiling) is all
handled by esphomeyaml.
<p align="center">
<img title="esphomeyaml dashboard screenshot" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/screenshot.png" width="700px"></img>
</p>
[_View the esphomeyaml documentation here_](https://esphomelib.com/esphomeyaml/index.html)
## Example
With esphomeyaml, you can go from a few lines of YAML straight to a custom-made
firmware. For example, to include a [DHT22][dht22]
temperature and humidity sensor, you just need to include 8 lines of YAML
in your configuration file:
<img title="esphomeyaml DHT configuration example" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/dht-example.png" width="500px"></img>
Then just click UPLOAD and the sensor will magically appear in Home Assistant:
<img title="esphomelib Home Assistant MQTT discovery" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/temperature-humidity.png" width="600px"></img>
## Installation
To install this Hass.io add-on you need to add the esphomeyaml add-on repository
first:
1. Add the esphomeyaml add-ons repository to your Hass.io instance. You can do this by navigating to the "Add-on Store" tab in the Hass.io panel and then entering https://github.com/OttoWinter/esphomeyaml in the "Add new repository by URL" field.
2. Now scroll down and select the "esphomeyaml" add-on.
3. Press install to download the add-on and unpack it on your machine. This can take some time.
4. Optional: If you're using SSL certificates and want to encrypt your communication to this add-on, please enter `true` into the `ssl` field and set the `certfile` and `keyfile` options accordingly.
5. Start the add-on, check the logs of the add-on to see if everything went well.
6. Click "OPEN WEB UI" to open the esphomeyaml dashboard. You will be asked for your Home Assistant credentials - esphomeyaml uses Hass.io's authentication system to log you in.
**NOTE**: Installation on RPis running in 64-bit mode is currently not possible. Please use the 32-bit variant of HassOS instead.
You can view the esphomeyaml docs here: https://esphomelib.com/esphomeyaml/index.html
## Configuration
**Note**: _Remember to restart the add-on when the configuration is changed._
Example add-on configuration:
```json
{
"ssl": false,
"certfile": "fullchain.pem",
"keyfile": "privkey.pem",
"port": 6052
}
```
### Option: `port`
The port to start the dashboard server on. Default is 6052.
### Option: `ssl`
Enables/Disables encrypted SSL (HTTPS) connections to the web server of this add-on.
Set it to `true` to encrypt communications, `false` otherwise.
Please note that if you set this to `true` you must also generate the key and certificate
files for encryption. For example using [Let's Encrypt](https://www.home-assistant.io/addons/lets_encrypt/)
or [Self-signed certificates](https://www.home-assistant.io/docs/ecosystem/certificates/tls_self_signed_certificate/).
### Option: `certfile`
The certificate file to use for SSL. If this file doesn't exist, the add-on start will fail.
**Note**: The file MUST be stored in `/ssl/`, which is the default for Hass.io
### Option: `keyfile`
The private key file to use for SSL. If this file doesn't exist, the add-on start will fail.
**Note**: The file MUST be stored in `/ssl/`, which is the default for Hass.io
### Option: `leave_front_door_open`
Adding this option to the add-on configuration allows you to disable
authentication by setting it to `true`.
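For example, an add-on configuration with authentication disabled could look like the following sketch (only do this on a fully trusted network):

```json
{
  "ssl": false,
  "certfile": "fullchain.pem",
  "keyfile": "privkey.pem",
  "port": 6052,
  "leave_front_door_open": true
}
```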
### Option: `esphomeyaml_version`
Manually override which esphomeyaml version to use in the add-on.
For example to install the latest development version, use `"esphomeyaml_version": "dev"`,
or for version 1.10.0: `"esphomeyaml_version": "v1.10.0"`.
Please note that this does not always work and is only meant for testing; usually, the
esphomeyaml add-on and dashboard versions must match to guarantee a working system.
[discord-shield]: https://img.shields.io/discord/429907082951524364.svg
[dht22]: https://esphomelib.com/esphomeyaml/components/sensor/dht.html
[discord]: https://discord.me/KhAMKrd
[releases-shield]: https://img.shields.io/github/release/OttoWinter/esphomeyaml.svg
[releases]: https://esphomelib.com/esphomeyaml/changelog/index.html
[repository]: https://github.com/OttoWinter/esphomeyaml

View File

@@ -1,25 +1,30 @@
from __future__ import print_function
import argparse
from collections import OrderedDict
from datetime import datetime
import logging
import os
import random
import sys
from datetime import datetime
from esphomeyaml import const, core, core_config, mqtt, wizard, writer, yaml_util
from esphomeyaml.config import get_component, iter_components, read_config
from esphomeyaml.const import CONF_BAUD_RATE, CONF_BUILD_PATH, CONF_DOMAIN, CONF_ESPHOMEYAML, \
CONF_HOSTNAME, CONF_LOGGER, CONF_MANUAL_IP, CONF_NAME, CONF_STATIC_IP, CONF_USE_CUSTOM_CODE, \
CONF_WIFI, ESP_PLATFORM_ESP8266
from esphomeyaml.core import ESPHomeYAMLError
from esphomeyaml.helpers import AssignmentExpression, Expression, RawStatement, \
_EXPRESSIONS, add, \
add_job, color, flush_tasks, indent, quote, statement, relative_path
from esphomeyaml import const, core_config, mqtt, platformio_api, wizard, writer, yaml_util
from esphomeyaml.api.client import run_logs
from esphomeyaml.config import get_component, iter_components, read_config, strip_default_ids
from esphomeyaml.const import CONF_BAUD_RATE, CONF_ESPHOMEYAML, CONF_LOGGER, CONF_USE_CUSTOM_CODE, \
CONF_BROKER
from esphomeyaml.core import CORE, EsphomeyamlError
from esphomeyaml.cpp_generator import Expression, RawStatement, add, statement
from esphomeyaml.helpers import color, indent
from esphomeyaml.py_compat import safe_input, text_type, IS_PY2
from esphomeyaml.storage_json import StorageJSON, esphomeyaml_storage_path, \
start_update_check_thread, storage_path
from esphomeyaml.util import run_external_command, safe_print
_LOGGER = logging.getLogger(__name__)
PRE_INITIALIZE = ['esphomeyaml', 'logger', 'wifi', 'ota', 'mqtt', 'web_server', 'i2c']
PRE_INITIALIZE = ['esphomeyaml', 'logger', 'wifi', 'ethernet', 'ota', 'mqtt', 'web_server', 'api',
'i2c']
def get_serial_ports():
@@ -31,65 +36,64 @@ def get_serial_ports():
continue
if "VID:PID" in info:
result.append((port, desc))
result.sort(key=lambda x: x[0])
return result
def choose_serial_port(config):
result = get_serial_ports()
def choose_prompt(options):
if not options:
raise ValueError
if len(options) == 1:
return options[0][1]
safe_print(u"Found multiple options, please choose one:")
for i, (desc, _) in enumerate(options):
safe_print(u" [{}] {}".format(i + 1, desc))
if not result:
return 'OTA'
print(u"Found multiple serial port options, please choose one:")
for i, (res, desc) in enumerate(result):
print(u" [{}] {} ({})".format(i, res, desc))
print(u" [{}] Over The Air ({})".format(len(result), get_upload_host(config)))
print()
while True:
opt = raw_input('(number): ')
if opt in result:
opt = result.index(opt)
opt = safe_input('(number): ')
if opt in options:
opt = options.index(opt)
break
try:
opt = int(opt)
if opt < 0 or opt > len(result):
if opt < 1 or opt > len(options):
raise ValueError
break
except ValueError:
print(color('red', u"Invalid option: '{}'".format(opt)))
if opt == len(result):
return 'OTA'
return result[opt][0]
safe_print(color('red', u"Invalid option: '{}'".format(opt)))
return options[opt - 1][1]
def run_platformio(*cmd, **kwargs):
def mock_exit(return_code):
raise SystemExit(return_code)
orig_argv = sys.argv
orig_exit = sys.exit # mock sys.exit
full_cmd = u' '.join(quote(x) for x in cmd)
_LOGGER.info(u"Running: %s", full_cmd)
try:
func = kwargs.get('main')
if func is None:
import platformio.__main__
func = platformio.__main__.main
sys.argv = list(cmd)
sys.exit = mock_exit
return func() or 0
except KeyboardInterrupt:
return 1
except SystemExit as err:
return err.args[0]
except Exception as err: # pylint: disable=broad-except
_LOGGER.error(u"Running platformio failed: %s", err)
_LOGGER.error(u"Please try running %s locally.", full_cmd)
finally:
sys.argv = orig_argv
sys.exit = orig_exit
def choose_upload_log_host(default, check_default, show_ota, show_mqtt, show_api):
options = []
for res, desc in get_serial_ports():
options.append((u"{} ({})".format(res, desc), res))
if (show_ota and 'ota' in CORE.config) or (show_api and 'api' in CORE.config):
options.append((u"Over The Air ({})".format(CORE.address), CORE.address))
if default == 'OTA':
return CORE.address
if show_mqtt and 'mqtt' in CORE.config:
options.append((u"MQTT ({})".format(CORE.config['mqtt'][CONF_BROKER]), 'MQTT'))
if default == 'OTA':
return 'MQTT'
if default is not None:
return default
if check_default is not None and check_default in [opt[1] for opt in options]:
return check_default
return choose_prompt(options)
def run_miniterm(config, port, escape=False):
def get_port_type(port):
if port.startswith('/') or port.startswith('COM'):
return 'SERIAL'
if port == 'MQTT':
return 'MQTT'
return 'NETWORK'
def run_miniterm(config, port):
import serial
if CONF_LOGGER not in config:
_LOGGER.info("Logger is not enabled. Not starting UART logs.")
@@ -99,6 +103,7 @@ def run_miniterm(config, port, escape=False):
_LOGGER.info("UART logging is disabled (baud_rate=0). Not starting UART logs.")
_LOGGER.info("Starting log output from %s with baud rate %s", port, baud_rate)
backtrace_state = False
with serial.Serial(port, baudrate=baud_rate) as ser:
while True:
try:
@@ -106,133 +111,117 @@ def run_miniterm(config, port, escape=False):
except serial.SerialException:
_LOGGER.error("Serial port closed!")
return
line = raw.replace('\r', '').replace('\n', '')
if IS_PY2:
line = raw.replace('\r', '').replace('\n', '')
else:
line = raw.replace(b'\r', b'').replace(b'\n', b'').decode('utf8',
'backslashreplace')
time = datetime.now().time().strftime('[%H:%M:%S]')
message = time + line
if escape:
message = message.replace('\033', '\\033')
try:
print(message)
except UnicodeEncodeError:
print(message.encode('ascii', 'backslashreplace'))
safe_print(message)
backtrace_state = platformio_api.process_stacktrace(
config, line, backtrace_state=backtrace_state)
def write_cpp(config):
_LOGGER.info("Generating C++ source...")
add_job(core_config.to_code, config[CONF_ESPHOMEYAML], domain='esphomeyaml')
CORE.add_job(core_config.to_code, config[CONF_ESPHOMEYAML], domain='esphomeyaml')
for domain in PRE_INITIALIZE:
if domain == CONF_ESPHOMEYAML or domain not in config:
continue
add_job(get_component(domain).to_code, config[domain], domain=domain)
CORE.add_job(get_component(domain).to_code, config[domain], domain=domain)
for domain, component, conf in iter_components(config):
if domain in PRE_INITIALIZE or not hasattr(component, 'to_code'):
continue
add_job(component.to_code, conf, domain=domain)
CORE.add_job(component.to_code, conf, domain=domain)
flush_tasks()
CORE.flush_tasks()
add(RawStatement(''))
add(RawStatement(''))
all_code = []
for exp in _EXPRESSIONS:
for exp in CORE.expressions:
if not config[CONF_ESPHOMEYAML][CONF_USE_CUSTOM_CODE]:
if isinstance(exp, Expression) and not exp.required:
continue
if isinstance(exp, AssignmentExpression) and not exp.obj.required:
if not exp.has_side_effects():
continue
exp = exp.rhs
all_code.append(unicode(statement(exp)))
all_code.append(text_type(statement(exp)))
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
writer.write_platformio_project(config, build_path)
writer.write_platformio_project()
code_s = indent('\n'.join(line.rstrip() for line in all_code))
cpp_path = os.path.join(build_path, 'src', 'main.cpp')
writer.write_cpp(code_s, cpp_path)
writer.write_cpp(code_s)
return 0
def compile_program(args, config):
_LOGGER.info("Compiling app...")
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
command = ['platformio', 'run', '-d', build_path]
if args.verbose:
command.append('-v')
return run_platformio(*command)
def get_upload_host(config):
if CONF_MANUAL_IP in config[CONF_WIFI]:
host = str(config[CONF_WIFI][CONF_MANUAL_IP][CONF_STATIC_IP])
elif CONF_HOSTNAME in config[CONF_WIFI]:
host = config[CONF_WIFI][CONF_HOSTNAME] + config[CONF_WIFI][CONF_DOMAIN]
else:
host = config[CONF_ESPHOMEYAML][CONF_NAME] + config[CONF_WIFI][CONF_DOMAIN]
return host
update_check = not os.getenv('ESPHOMEYAML_NO_UPDATE_CHECK', '')
if update_check:
thread = start_update_check_thread(esphomeyaml_storage_path(CORE.config_dir))
rc = platformio_api.run_compile(config, args.verbose)
if update_check:
thread.join()
return rc
def upload_using_esptool(config, port):
import esptool
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
path = os.path.join(build_path, '.pioenvs', core.NAME, 'firmware.bin')
path = os.path.join(CORE.build_path, '.pioenvs', CORE.name, 'firmware.bin')
cmd = ['esptool.py', '--before', 'default_reset', '--after', 'hard_reset',
'--chip', 'esp8266', '--port', port, 'write_flash', '0x0', path]
# pylint: disable=protected-access
return run_platformio('esptool.py', '--before', 'default_reset', '--after', 'hard_reset',
'--chip', 'esp8266', '--port', port, 'write_flash', '0x0',
path, main=esptool._main)
return run_external_command(esptool._main, *cmd)
def upload_program(config, args, port):
_LOGGER.info("Uploading binary...")
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
def upload_program(config, args, host):
# if upload is to a serial port use platformio, otherwise assume ota
serial_port = port.startswith('/') or port.startswith('COM')
if port != 'OTA' and serial_port:
if core.ESP_PLATFORM == ESP_PLATFORM_ESP8266 and args.use_esptoolpy:
return upload_using_esptool(config, port)
command = ['platformio', 'run', '-d', build_path,
'-t', 'upload', '--upload-port', port]
if args.verbose:
command.append('-v')
return run_platformio(*command)
if 'ota' not in config:
_LOGGER.error("No serial port found and OTA not enabled. Can't upload!")
return -1
# If hostname/ip is explicitly provided as upload-port argument, use this instead of zeroconf
# hostname. This is to support use cases where zeroconf (hostname.local) does not work.
if port != 'OTA':
host = port
else:
host = get_upload_host(config)
if get_port_type(host) == 'SERIAL':
if CORE.is_esp8266:
return upload_using_esptool(config, host)
return platformio_api.run_upload(config, args.verbose, host)
from esphomeyaml.components import ota
from esphomeyaml import espota
from esphomeyaml import espota2
bin_file = os.path.join(build_path, '.pioenvs', core.NAME, 'firmware.bin')
if args.host_port is not None:
host_port = args.host_port
else:
host_port = int(os.getenv('ESPHOMEYAML_OTA_HOST_PORT', random.randint(10000, 60000)))
espota_args = ['espota.py', '--debug', '--progress', '-i', host,
'-p', str(ota.get_port(config)), '-f', bin_file,
'-a', ota.get_auth(config), '-P', str(host_port)]
if args.verbose:
espota_args.append('-d')
return espota.main(espota_args)
verbose = args.verbose
remote_port = ota.get_port(config)
password = ota.get_auth(config)
storage = StorageJSON.load(storage_path())
res = espota2.run_ota(host, remote_port, password, CORE.firmware_bin)
if res == 0:
if storage is not None and storage.use_legacy_ota:
storage.use_legacy_ota = False
storage.save(storage_path())
return res
if storage is not None and not storage.use_legacy_ota:
return res
_LOGGER.warning("OTA v2 method failed. Trying with legacy OTA...")
return espota2.run_legacy_ota(verbose, host_port, host, remote_port, password,
CORE.firmware_bin)
def show_logs(config, args, port, escape=False):
serial_port = port.startswith('/') or port.startswith('COM')
if port != 'OTA' and serial_port:
run_miniterm(config, port, escape=escape)
def show_logs(config, args, port):
if 'logger' not in config:
raise EsphomeyamlError("Logger is not configured!")
if get_port_type(port) == 'SERIAL':
run_miniterm(config, port)
return 0
return mqtt.show_logs(config, args.topic, args.username, args.password, args.client_id,
escape=escape)
if get_port_type(port) == 'NETWORK' and 'api' in config:
return run_logs(config, port)
if get_port_type(port) == 'MQTT' and 'mqtt' in config:
return mqtt.show_logs(config, args.topic, args.username, args.password, args.client_id)
raise ValueError
def clean_mqtt(config, args):
@@ -242,7 +231,7 @@ def clean_mqtt(config, args):
def setup_log(debug=False):
log_level = logging.DEBUG if debug else logging.INFO
logging.basicConfig(level=log_level)
fmt = "%(levelname)s [%(name)s] %(message)s"
fmt = "%(levelname)s %(message)s"
colorfmt = "%(log_color)s{}%(reset)s".format(fmt)
datefmt = '%H:%M:%S'
@@ -270,29 +259,11 @@ def command_wizard(args):
return wizard.wizard(args.configuration)
def strip_default_ids(config):
value = config
if isinstance(config, list):
value = type(config)()
for x in config:
if isinstance(x, core.ID) and not x.is_manual:
continue
value.append(strip_default_ids(x))
return value
elif isinstance(config, dict):
value = type(config)()
for k, v in config.iteritems():
if isinstance(v, core.ID) and not v.is_manual:
continue
value[k] = strip_default_ids(v)
return value
return value
def command_config(args, config):
_LOGGER.info("Configuration is valid!")
if not args.verbose:
config = strip_default_ids(config)
print(yaml_util.dump(config))
safe_print(yaml_util.dump(config))
return 0
@@ -311,7 +282,8 @@ def command_compile(args, config):
def command_upload(args, config):
port = args.upload_port or choose_serial_port(config)
port = choose_upload_log_host(default=args.upload_port, check_default=None,
show_ota=True, show_mqtt=False, show_api=False)
exit_code = upload_program(config, args, port)
if exit_code != 0:
return exit_code
@@ -320,8 +292,9 @@ def command_upload(args, config):
def command_logs(args, config):
port = args.serial_port or choose_serial_port(config)
return show_logs(config, args, port, escape=args.escape)
port = choose_upload_log_host(default=args.serial_port, check_default=None,
show_ota=False, show_mqtt=True, show_api=True)
return show_logs(config, args, port)
def command_run(args, config):
@@ -332,14 +305,17 @@ def command_run(args, config):
if exit_code != 0:
return exit_code
_LOGGER.info(u"Successfully compiled program.")
port = args.upload_port or choose_serial_port(config)
port = choose_upload_log_host(default=args.upload_port, check_default=None,
show_ota=True, show_mqtt=False, show_api=True)
exit_code = upload_program(config, args, port)
if exit_code != 0:
return exit_code
_LOGGER.info(u"Successfully uploaded program.")
if args.no_logs:
return 0
return show_logs(config, args, port, escape=args.escape)
port = choose_upload_log_host(default=args.upload_port, check_default=port,
show_ota=False, show_mqtt=True, show_api=True)
return show_logs(config, args, port)
def command_clean_mqtt(args, config):
@@ -351,7 +327,39 @@ def command_mqtt_fingerprint(args, config):
def command_version(args):
print(u"Version: {}".format(const.__version__))
safe_print(u"Version: {}".format(const.__version__))
return 0
def command_clean(args, config):
try:
writer.clean_build()
except OSError as err:
_LOGGER.error("Error deleting build files: %s", err)
return 1
_LOGGER.info("Done!")
return 0
def command_hass_config(args, config):
from esphomeyaml.components import mqtt as mqtt_component
_LOGGER.info("This is what you should put in your Home Assistant YAML configuration.")
_LOGGER.info("Please note this is only necessary if you're not using MQTT discovery.")
data = mqtt_component.GenerateHassConfigData(config)
hass_config = OrderedDict()
for domain, component, conf in iter_components(config):
if not hasattr(component, 'to_hass_config'):
continue
func = getattr(component, 'to_hass_config')
ret = func(data, conf)
if not isinstance(ret, (list, tuple)):
ret = [ret]
ret = [x for x in ret if x is not None]
domain_conf = hass_config.setdefault(domain.split('.')[0], [])
domain_conf += ret
safe_print(yaml_util.dump(hass_config))
return 0
@@ -375,6 +383,8 @@ POST_CONFIG_ACTIONS = {
'run': command_run,
'clean-mqtt': command_clean_mqtt,
'mqtt-fingerprint': command_mqtt_fingerprint,
'clean': command_clean,
'hass-config': command_hass_config,
}
@@ -382,6 +392,8 @@ def parse_args(argv):
parser = argparse.ArgumentParser(prog='esphomeyaml')
parser.add_argument('-v', '--verbose', help="Enable verbose esphomeyaml logs.",
action='store_true')
parser.add_argument('--dashboard', help="Internal flag to set if the command is run from the "
"dashboard.", action='store_true')
parser.add_argument('configuration', help='Your YAML configuration file.')
subparsers = parser.add_subparsers(help='Commands', dest='command')
@@ -399,9 +411,6 @@ def parse_args(argv):
parser_upload.add_argument('--upload-port', help="Manually specify the upload port to use. "
"For example /dev/cu.SLAB_USBtoUART.")
parser_upload.add_argument('--host-port', help="Specify the host port.", type=int)
parser_upload.add_argument('--use-esptoolpy',
help="Use esptool.py for the uploading (only for ESP8266)",
action='store_true')
parser_logs = subparsers.add_parser('logs', help='Validate the configuration '
'and show all MQTT logs.')
@@ -411,8 +420,6 @@ def parse_args(argv):
parser_logs.add_argument('--client-id', help='Manually set the client id.')
parser_logs.add_argument('--serial-port', help="Manually specify a serial port to use. "
"For example /dev/cu.SLAB_USBtoUART.")
parser_logs.add_argument('--escape', help="Escape ANSI color codes for running in dashboard",
action='store_true')
parser_run = subparsers.add_parser('run', help='Validate the configuration, create a binary, '
'upload it, and start MQTT logs.')
@@ -425,11 +432,6 @@ def parse_args(argv):
parser_run.add_argument('--username', help='Manually set the MQTT username for logs.')
parser_run.add_argument('--password', help='Manually set the MQTT password for logs.')
parser_run.add_argument('--client-id', help='Manually set the client id for logs.')
parser_run.add_argument('--escape', help="Escape ANSI color codes for running in dashboard",
action='store_true')
parser_run.add_argument('--use-esptoolpy',
help="Use esptool.py for the uploading (only for ESP8266)",
action='store_true')
parser_clean = subparsers.add_parser('clean-mqtt', help="Helper to clear an MQTT topic from "
"retain messages.")
@@ -445,48 +447,63 @@ def parse_args(argv):
subparsers.add_parser('version', help="Print the esphomeyaml version and exit.")
subparsers.add_parser('clean', help="Delete all temporary build files.")
dashboard = subparsers.add_parser('dashboard',
help="Create a simple webserver for a dashboard.")
dashboard.add_argument("--port", help="The HTTP port to open connections on.", type=int,
default=6052)
help="Create a simple web server for a dashboard.")
dashboard.add_argument("--port", help="The HTTP port to open connections on. Defaults to 6052.",
type=int, default=6052)
dashboard.add_argument("--password", help="The optional password to require for all requests.",
type=str, default='')
dashboard.add_argument("--open-ui", help="Open the dashboard UI in a browser.",
action='store_true')
dashboard.add_argument("--hassio",
help="Internal flag used to tell esphomeyaml is started as a Hass.io "
"add-on.",
action="store_true")
dashboard.add_argument("--socket",
help="Make the dashboard serve under a unix socket", type=str)
subparsers.add_parser('hass-config',
help="Dump the configuration entries that should be added "
"to Home Assistant when not using MQTT discovery.")
return parser.parse_args(argv[1:])
def run_esphomeyaml(argv):
args = parse_args(argv)
CORE.dashboard = args.dashboard
setup_log(args.verbose)
if args.command in PRE_CONFIG_ACTIONS:
try:
return PRE_CONFIG_ACTIONS[args.command](args)
except ESPHomeYAMLError as e:
except EsphomeyamlError as e:
_LOGGER.error(e)
return 1
core.CONFIG_PATH = args.configuration
CORE.config_path = args.configuration
config = read_config(core.CONFIG_PATH)
config = read_config(args.verbose)
if config is None:
return 1
CORE.config = config
if args.command in POST_CONFIG_ACTIONS:
try:
return POST_CONFIG_ACTIONS[args.command](args, config)
except ESPHomeYAMLError as e:
except EsphomeyamlError as e:
_LOGGER.error(e)
return 1
print(u"Unknown command {}".format(args.command))
safe_print(u"Unknown command {}".format(args.command))
return 1
def main():
try:
return run_esphomeyaml(sys.argv)
except ESPHomeYAMLError as e:
except EsphomeyamlError as e:
_LOGGER.error(e)
return 1
except KeyboardInterrupt:

View File

330
esphomeyaml/api/api.proto Normal file
View File

@@ -0,0 +1,330 @@
syntax = "proto3";
// The Home Assistant protocol is structured as a simple
// TCP socket with short binary messages encoded in the protocol buffers format
// First, a message in this protocol has a specific format:
// * VarInt denoting the size of the message object. (type is not part of this)
// * VarInt denoting the type of message.
// * The message object encoded as a ProtoBuf message
// The connection is established in 4 steps:
// * First, the client connects to the server and sends a "Hello Request" identifying itself
// * The server responds with a "Hello Response" and selects the protocol version
// * After receiving this message, the client attempts to authenticate itself using
// the password and a "Connect Request"
// * The server responds with a "Connect Response" and notifies of invalid password.
// If anything in this initial process fails, the connection must immediately be closed
// by both sides and _no_ disconnection message is to be sent.
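//
// As a concrete sketch of the framing (based on the Python client added in this
// change, which additionally prefixes every frame with a single 0x00 byte): an
// empty PingRequest (message type 7, zero payload bytes) is sent on the wire as
// the three bytes 0x00 0x00 0x07, i.e. preamble, VarInt length 0, VarInt type 7.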
// Message sent at the beginning of each connection
// Can only be sent by the client and only at the beginning of the connection
message HelloRequest {
// Description of client (like User Agent)
// For example "Home Assistant"
// Not strictly necessary to send but nice for debugging
// purposes.
string client_info = 1;
}
// Confirmation of successful connection request.
// Can only be sent by the server and only at the beginning of the connection
message HelloResponse {
// The version of the API to use. The _client_ (for example Home Assistant) needs to check
// for compatibility and, if necessary, adapt to an older API.
// Major is for breaking changes in the base protocol - a mismatch will lead to immediate disconnect_client_
// Minor is for breaking changes in individual messages - a mismatch will lead to a warning message
uint32 api_version_major = 1;
uint32 api_version_minor = 2;
// A string identifying the server (ESP); like client info this may be empty
// and only exists for debugging/logging purposes.
// For example "ESPHome v1.10.0 on ESP8266"
string server_info = 3;
}
// Message sent at the beginning of each connection to authenticate the client
// Can only be sent by the client and only at the beginning of the connection
message ConnectRequest {
// The password to log in with
string password = 1;
}
// Confirmation of successful connection. After this the connection is available for all traffic.
// Can only be sent by the server and only at the beginning of the connection
message ConnectResponse {
bool invalid_password = 1;
}
// Request to close the connection.
// Can be sent by both the client and server
message DisconnectRequest {
// Do not close the connection before the acknowledgement arrives
}
message DisconnectResponse {
// Empty - Both parties are required to close the connection after this
// message has been received.
}
message PingRequest {
// Empty
}
message PingResponse {
// Empty
}
message DeviceInfoRequest {
// Empty
}
message DeviceInfoResponse {
bool uses_password = 1;
// The name of the node, given by "App.set_name()"
string name = 2;
// The mac address of the device. For example "AC:BC:32:89:0E:A9"
string mac_address = 3;
// A string describing the ESPHome version. For example "1.10.0"
string esphome_core_version = 4;
// A string describing the date of compilation, this is generated by the compiler
// and therefore may not be in the same format all the time.
// If the user isn't using esphomeyaml, this will also not be set.
string compilation_time = 5;
// The model of the board. For example NodeMCU
string model = 6;
bool has_deep_sleep = 7;
}
message ListEntitiesRequest {
// Empty
}
message ListEntitiesBinarySensorResponse {
string object_id = 1;
fixed32 key = 2;
string name = 3;
string unique_id = 4;
string device_class = 5;
bool is_status_binary_sensor = 6;
}
message ListEntitiesCoverResponse {
string object_id = 1;
fixed32 key = 2;
string name = 3;
string unique_id = 4;
bool is_optimistic = 5;
}
message ListEntitiesFanResponse {
string object_id = 1;
fixed32 key = 2;
string name = 3;
string unique_id = 4;
bool supports_oscillation = 5;
bool supports_speed = 6;
}
message ListEntitiesLightResponse {
string object_id = 1;
fixed32 key = 2;
string name = 3;
string unique_id = 4;
bool supports_brightness = 5;
bool supports_rgb = 6;
bool supports_white_value = 7;
bool supports_color_temperature = 8;
float min_mireds = 9;
float max_mireds = 10;
repeated string effects = 11;
}
message ListEntitiesSensorResponse {
string object_id = 1;
fixed32 key = 2;
string name = 3;
string unique_id = 4;
string icon = 5;
string unit_of_measurement = 6;
int32 accuracy_decimals = 7;
}
message ListEntitiesSwitchResponse {
string object_id = 1;
fixed32 key = 2;
string name = 3;
string unique_id = 4;
string icon = 5;
bool optimistic = 6;
}
message ListEntitiesTextSensorResponse {
string object_id = 1;
fixed32 key = 2;
string name = 3;
string unique_id = 4;
string icon = 5;
}
message ListEntitiesDoneResponse {
// Empty
}
message SubscribeStatesRequest {
// Empty
}
message BinarySensorStateResponse {
fixed32 key = 1;
bool state = 2;
}
message CoverStateResponse {
fixed32 key = 1;
enum CoverState {
OPEN = 0;
CLOSED = 1;
}
CoverState state = 2;
}
enum FanSpeed {
LOW = 0;
MEDIUM = 1;
HIGH = 2;
}
message FanStateResponse {
fixed32 key = 1;
bool state = 2;
bool oscillating = 3;
FanSpeed speed = 4;
}
message LightStateResponse {
fixed32 key = 1;
bool state = 2;
float brightness = 3;
float red = 4;
float green = 5;
float blue = 6;
float white = 7;
float color_temperature = 8;
string effect = 9;
}
message SensorStateResponse {
fixed32 key = 1;
float state = 2;
}
message SwitchStateResponse {
fixed32 key = 1;
bool state = 2;
}
message TextSensorStateResponse {
fixed32 key = 1;
string state = 2;
}
message CoverCommandRequest {
fixed32 key = 1;
enum CoverCommand {
OPEN = 0;
CLOSE = 1;
STOP = 2;
}
bool has_state = 2;
CoverCommand command = 3;
}
message FanCommandRequest {
fixed32 key = 1;
bool has_state = 2;
bool state = 3;
bool has_speed = 4;
FanSpeed speed = 5;
bool has_oscillating = 6;
bool oscillating = 7;
}
message LightCommandRequest {
fixed32 key = 1;
bool has_state = 2;
bool state = 3;
bool has_brightness = 4;
float brightness = 5;
bool has_rgb = 6;
float red = 7;
float green = 8;
float blue = 9;
bool has_white = 10;
float white = 11;
bool has_color_temperature = 12;
float color_temperature = 13;
bool has_transition_length = 14;
uint32 transition_length = 15;
bool has_flash_length = 16;
uint32 flash_length = 17;
bool has_effect = 18;
string effect = 19;
}
message SwitchCommandRequest {
fixed32 key = 1;
bool state = 2;
}
enum LogLevel {
NONE = 0;
ERROR = 1;
WARN = 2;
INFO = 3;
DEBUG = 4;
VERBOSE = 5;
VERY_VERBOSE = 6;
}
message SubscribeLogsRequest {
LogLevel level = 1;
bool dump_config = 2;
}
message SubscribeLogsResponse {
LogLevel level = 1;
string tag = 2;
string message = 3;
bool send_failed = 4;
}
message SubscribeServiceCallsRequest {
}
message ServiceCallResponse {
string service = 1;
map<string, string> data = 2;
map<string, string> data_template = 3;
map<string, string> variables = 4;
}
// 1. Client sends SubscribeHomeAssistantStatesRequest
// 2. Server responds with zero or more SubscribeHomeAssistantStateResponse (async)
// 3. Client sends HomeAssistantStateResponse for state changes.
message SubscribeHomeAssistantStatesRequest {
}
message SubscribeHomeAssistantStateResponse {
string entity_id = 1;
}
message HomeAssistantStateResponse {
string entity_id = 1;
string state = 2;
}
message GetTimeRequest {
}
message GetTimeResponse {
fixed32 epoch_seconds = 1;
}

2484
esphomeyaml/api/api_pb2.py Normal file

File diff suppressed because one or more lines are too long

490
esphomeyaml/api/client.py Normal file
View File

@@ -0,0 +1,490 @@
from datetime import datetime
import functools
import logging
import socket
import threading
import time
# pylint: disable=unused-import
from typing import Optional # noqa
from google.protobuf import message # noqa
from esphomeyaml import const
import esphomeyaml.api.api_pb2 as pb
from esphomeyaml.const import CONF_PASSWORD, CONF_PORT
from esphomeyaml.core import EsphomeyamlError
from esphomeyaml.helpers import resolve_ip_address, indent, color
from esphomeyaml.py_compat import text_type, IS_PY2, byte_to_bytes, char_to_byte, format_bytes
from esphomeyaml.util import safe_print
_LOGGER = logging.getLogger(__name__)
class APIConnectionError(EsphomeyamlError):
pass
MESSAGE_TYPE_TO_PROTO = {
1: pb.HelloRequest,
2: pb.HelloResponse,
3: pb.ConnectRequest,
4: pb.ConnectResponse,
5: pb.DisconnectRequest,
6: pb.DisconnectResponse,
7: pb.PingRequest,
8: pb.PingResponse,
9: pb.DeviceInfoRequest,
10: pb.DeviceInfoResponse,
11: pb.ListEntitiesRequest,
12: pb.ListEntitiesBinarySensorResponse,
13: pb.ListEntitiesCoverResponse,
14: pb.ListEntitiesFanResponse,
15: pb.ListEntitiesLightResponse,
16: pb.ListEntitiesSensorResponse,
17: pb.ListEntitiesSwitchResponse,
18: pb.ListEntitiesTextSensorResponse,
19: pb.ListEntitiesDoneResponse,
20: pb.SubscribeStatesRequest,
21: pb.BinarySensorStateResponse,
22: pb.CoverStateResponse,
23: pb.FanStateResponse,
24: pb.LightStateResponse,
25: pb.SensorStateResponse,
26: pb.SwitchStateResponse,
27: pb.TextSensorStateResponse,
28: pb.SubscribeLogsRequest,
29: pb.SubscribeLogsResponse,
30: pb.CoverCommandRequest,
31: pb.FanCommandRequest,
32: pb.LightCommandRequest,
33: pb.SwitchCommandRequest,
34: pb.SubscribeServiceCallsRequest,
35: pb.ServiceCallResponse,
36: pb.GetTimeRequest,
37: pb.GetTimeResponse,
}
def _varuint_to_bytes(value):
if value <= 0x7F:
return byte_to_bytes(value)
ret = bytes()
while value:
temp = value & 0x7F
value >>= 7
if value:
ret += byte_to_bytes(temp | 0x80)
else:
ret += byte_to_bytes(temp)
return ret
def _bytes_to_varuint(value):
result = 0
bitpos = 0
for c in value:
val = char_to_byte(c)
result |= (val & 0x7F) << bitpos
bitpos += 7
if (val & 0x80) == 0:
return result
return None
# pylint: disable=too-many-instance-attributes,not-callable
class APIClient(threading.Thread):
def __init__(self, address, port, password):
threading.Thread.__init__(self)
self._address = address # type: str
self._port = port # type: int
self._password = password # type: Optional[str]
self._socket = None # type: Optional[socket.socket]
self._socket_open_event = threading.Event()
self._socket_write_lock = threading.Lock()
self._connected = False
self._authenticated = False
self._message_handlers = []
self._keepalive = 5
self._ping_timer = None
self._refresh_ping()
self.on_disconnect = None
self.on_connect = None
self.on_login = None
self.auto_reconnect = False
self._running_event = threading.Event()
self._stop_event = threading.Event()
@property
def stopped(self):
return self._stop_event.is_set()
def _refresh_ping(self):
if self._ping_timer is not None:
self._ping_timer.cancel()
self._ping_timer = None
def func():
self._ping_timer = None
if self._connected:
try:
self.ping()
except APIConnectionError:
self._fatal_error()
else:
self._refresh_ping()
self._ping_timer = threading.Timer(self._keepalive, func)
self._ping_timer.start()
def _cancel_ping(self):
if self._ping_timer is not None:
self._ping_timer.cancel()
self._ping_timer = None
def _close_socket(self):
self._cancel_ping()
if self._socket is not None:
self._socket.close()
self._socket = None
self._socket_open_event.clear()
self._connected = False
self._authenticated = False
self._message_handlers = []
def stop(self, force=False):
if self.stopped:
raise ValueError
if self._connected and not force:
try:
self.disconnect()
except APIConnectionError:
pass
self._close_socket()
self._stop_event.set()
if not force:
self.join()
def connect(self):
if not self._running_event.wait(0.1):
raise APIConnectionError("You need to call start() first!")
if self._connected:
raise APIConnectionError("Already connected!")
try:
ip = resolve_ip_address(self._address)
except EsphomeyamlError as err:
_LOGGER.warning("Error resolving IP address of %s. Is it connected to WiFi?",
self._address)
_LOGGER.warning("(If this error persists, please set a static IP address: "
"https://esphomelib.com/esphomeyaml/components/wifi.html#manual-ips)")
raise APIConnectionError(err)
_LOGGER.info("Connecting to %s:%s (%s)", self._address, self._port, ip)
self._socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self._socket.settimeout(10.0)
self._socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
try:
self._socket.connect((ip, self._port))
except socket.error as err:
self._fatal_error()
raise APIConnectionError("Error connecting to {}: {}".format(ip, err))
self._socket.settimeout(0.1)
self._socket_open_event.set()
hello = pb.HelloRequest()
hello.client_info = 'esphomeyaml v{}'.format(const.__version__)
try:
resp = self._send_message_await_response(hello, pb.HelloResponse)
except APIConnectionError as err:
self._fatal_error()
raise err
_LOGGER.debug("Successfully connected to %s ('%s' API=%s.%s)", self._address,
resp.server_info, resp.api_version_major, resp.api_version_minor)
self._connected = True
if self.on_connect is not None:
self.on_connect()
def _check_connected(self):
if not self._connected:
self._fatal_error()
raise APIConnectionError("Must be connected!")
def login(self):
self._check_connected()
if self._authenticated:
raise APIConnectionError("Already logged in!")
connect = pb.ConnectRequest()
if self._password is not None:
connect.password = self._password
resp = self._send_message_await_response(connect, pb.ConnectResponse)
if resp.invalid_password:
raise APIConnectionError("Invalid password!")
self._authenticated = True
if self.on_login is not None:
self.on_login()
def _fatal_error(self):
was_connected = self._connected
self._close_socket()
if was_connected and self.on_disconnect is not None:
self.on_disconnect()
def _write(self, data): # type: (bytes) -> None
if self._socket is None:
raise APIConnectionError("Socket closed")
_LOGGER.debug("Write: %s", format_bytes(data))
with self._socket_write_lock:
try:
self._socket.sendall(data)
except socket.error as err:
self._fatal_error()
raise APIConnectionError("Error while writing data: {}".format(err))
def _send_message(self, msg):
# type: (message.Message) -> None
for message_type, klass in MESSAGE_TYPE_TO_PROTO.items():
if isinstance(msg, klass):
break
else:
raise ValueError
encoded = msg.SerializeToString()
_LOGGER.debug("Sending %s:\n%s", type(msg), indent(text_type(msg)))
if IS_PY2:
req = chr(0x00)
else:
req = bytes([0])
req += _varuint_to_bytes(len(encoded))
req += _varuint_to_bytes(message_type)
req += encoded
self._write(req)
self._refresh_ping()
def _send_message_await_response_complex(self, send_msg, do_append, do_stop, timeout=1):
event = threading.Event()
responses = []
def on_message(resp):
if do_append(resp):
responses.append(resp)
if do_stop(resp):
event.set()
self._message_handlers.append(on_message)
self._send_message(send_msg)
ret = event.wait(timeout)
try:
self._message_handlers.remove(on_message)
except ValueError:
pass
if not ret:
raise APIConnectionError("Timeout while waiting for message response!")
return responses
def _send_message_await_response(self, send_msg, response_type, timeout=1):
def is_response(msg):
return isinstance(msg, response_type)
return self._send_message_await_response_complex(send_msg, is_response, is_response,
timeout)[0]
def device_info(self):
self._check_connected()
return self._send_message_await_response(pb.DeviceInfoRequest(), pb.DeviceInfoResponse)
def ping(self):
self._check_connected()
return self._send_message_await_response(pb.PingRequest(), pb.PingResponse)
def disconnect(self):
self._check_connected()
try:
self._send_message_await_response(pb.DisconnectRequest(), pb.DisconnectResponse)
except APIConnectionError:
pass
self._close_socket()
if self.on_disconnect is not None:
self.on_disconnect()
def _check_authenticated(self):
if not self._authenticated:
raise APIConnectionError("Must login first!")
def subscribe_logs(self, on_log, log_level=None, dump_config=False):
self._check_authenticated()
def on_msg(msg):
if isinstance(msg, pb.SubscribeLogsResponse):
on_log(msg)
self._message_handlers.append(on_msg)
req = pb.SubscribeLogsRequest(dump_config=dump_config)
if log_level is not None:
req.level = log_level
self._send_message(req)
def _recv(self, amount):
ret = bytes()
if amount == 0:
return ret
while len(ret) < amount:
if self.stopped:
raise APIConnectionError("Stopped!")
if not self._socket_open_event.is_set():
raise APIConnectionError("No socket!")
try:
val = self._socket.recv(amount - len(ret))
except AttributeError:
raise APIConnectionError("Socket was closed")
except socket.timeout:
continue
except socket.error as err:
raise APIConnectionError("Error while receiving data: {}".format(err))
ret += val
return ret
def _recv_varint(self):
raw = bytes()
while not raw or char_to_byte(raw[-1]) & 0x80:
raw += self._recv(1)
return _bytes_to_varuint(raw)
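The wire format handled by _send_message and _run_once is: one 0x00 preamble byte, a varuint-encoded payload length, a varuint-encoded message type, then the serialized protobuf payload. The module's own _varuint_to_bytes/_bytes_to_varuint helpers are defined earlier in this file and are not part of this hunk; a self-contained Python 3 sketch of the same base-128 encoding, for reference:

    def encode_varuint(value):
        # 7 data bits per byte, continuation bit (0x80) on every byte but the last
        out = b''
        while value > 0x7F:
            out += bytes([(value & 0x7F) | 0x80])
            value >>= 7
        return out + bytes([value])

    def decode_varuint(data):
        result = 0
        for i, byte in enumerate(bytearray(data)):
            result |= (byte & 0x7F) << (7 * i)
            if not byte & 0x80:
                break
        return result

    # A frame for a 300-byte message of type 5 would be:
    #   b'\x00' + encode_varuint(300) + encode_varuint(5) + payload
    assert decode_varuint(encode_varuint(300)) == 300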
def _run_once(self):
if not self._socket_open_event.wait(0.1):
return
# Preamble
if char_to_byte(self._recv(1)[0]) != 0x00:
raise APIConnectionError("Invalid preamble")
length = self._recv_varint()
msg_type = self._recv_varint()
raw_msg = self._recv(length)
if msg_type not in MESSAGE_TYPE_TO_PROTO:
_LOGGER.debug("Skipping message type %s", msg_type)
return
msg = MESSAGE_TYPE_TO_PROTO[msg_type]()
msg.ParseFromString(raw_msg)
_LOGGER.debug("Got message: %s:\n%s", type(msg), indent(str(msg)))
for msg_handler in self._message_handlers[:]:
msg_handler(msg)
self._handle_internal_messages(msg)
self._refresh_ping()
def run(self):
self._running_event.set()
while not self.stopped:
try:
self._run_once()
except APIConnectionError as err:
if self.stopped:
break
if self._connected:
_LOGGER.error("Error while reading incoming messages: %s", err)
self._fatal_error()
self._running_event.clear()
def _handle_internal_messages(self, msg):
if isinstance(msg, pb.DisconnectRequest):
self._send_message(pb.DisconnectResponse())
if self._socket is not None:
self._socket.close()
self._socket = None
self._connected = False
if self.on_disconnect is not None:
self.on_disconnect()
elif isinstance(msg, pb.PingRequest):
self._send_message(pb.PingResponse())
elif isinstance(msg, pb.GetTimeRequest):
resp = pb.GetTimeResponse()
resp.epoch_seconds = int(time.time())
self._send_message(resp)
def run_logs(config, address):
conf = config['api']
port = conf[CONF_PORT]
password = conf[CONF_PASSWORD]
_LOGGER.info("Starting log output from %s using esphomelib API", address)
cli = APIClient(address, port, password)
stopping = False
retry_timer = []
def try_connect(tries=0, is_disconnect=True):
if stopping:
return
if is_disconnect:
_LOGGER.warning(u"Disconnected from API.")
while retry_timer:
retry_timer.pop(0).cancel()
error = None
try:
cli.connect()
cli.login()
except APIConnectionError as err: # noqa
error = err
if error is None:
_LOGGER.info("Successfully connected to %s", address)
return
wait_time = min(2**tries, 300)
_LOGGER.warning(u"Couldn't connect to API (%s). Trying to reconnect in %s seconds",
error, wait_time)
timer = threading.Timer(wait_time, functools.partial(try_connect, tries + 1, is_disconnect))
timer.start()
retry_timer.append(timer)
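The retry delay above doubles on every failed attempt and is capped at five minutes; for illustration:

    # min(2**tries, 300) seconds between reconnect attempts:
    print([min(2 ** t, 300) for t in range(10)])
    # -> [1, 2, 4, 8, 16, 32, 64, 128, 256, 300]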
def on_log(msg):
time_ = datetime.now().time().strftime(u'[%H:%M:%S]')
text = msg.message
if msg.send_failed:
text = color('white', '(Message skipped because it was too big to fit in '
'TCP buffer - This is only cosmetic)')
safe_print(time_ + text)
has_connects = []
def on_login():
try:
cli.subscribe_logs(on_log, dump_config=not has_connects)
has_connects.append(True)
except APIConnectionError:
cli.disconnect()
cli.on_disconnect = try_connect
cli.on_login = on_login
cli.start()
try:
try_connect(is_disconnect=False)
while True:
time.sleep(1)
except KeyboardInterrupt:
stopping = True
cli.stop(True)
while retry_timer:
retry_timer.pop(0).cancel()
return 0

View File

@@ -1,37 +1,17 @@
import copy
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import core
from esphomeyaml.components import cover, deep_sleep, fan, output
from esphomeyaml.const import CONF_ABOVE, CONF_ACTION_ID, CONF_AND, CONF_AUTOMATION_ID, \
CONF_BELOW, CONF_BLUE, CONF_BRIGHTNESS, CONF_CONDITION, CONF_CONDITION_ID, CONF_DELAY, \
CONF_EFFECT, CONF_ELSE, CONF_FLASH_LENGTH, CONF_GREEN, CONF_ID, CONF_IF, CONF_LAMBDA, \
CONF_LEVEL, CONF_OR, CONF_OSCILLATING, CONF_PAYLOAD, CONF_QOS, CONF_RANGE, CONF_RED, \
CONF_RETAIN, CONF_SPEED, CONF_THEN, CONF_TOPIC, CONF_TRANSITION_LENGTH, CONF_TRIGGER_ID, \
CONF_WHITE, CONF_COLOR_TEMPERATURE
from esphomeyaml.core import ESPHomeYAMLError
from esphomeyaml.helpers import App, ArrayInitializer, Pvariable, TemplateArguments, add, add_job, \
bool_, esphomelib_ns, float_, get_variable, process_lambda, std_string, templatable, uint32, \
uint8
CONF_MQTT_PUBLISH = 'mqtt.publish'
CONF_LIGHT_TOGGLE = 'light.toggle'
CONF_LIGHT_TURN_OFF = 'light.turn_off'
CONF_LIGHT_TURN_ON = 'light.turn_on'
CONF_SWITCH_TOGGLE = 'switch.toggle'
CONF_SWITCH_TURN_OFF = 'switch.turn_off'
CONF_SWITCH_TURN_ON = 'switch.turn_on'
CONF_COVER_OPEN = 'cover.open'
CONF_COVER_CLOSE = 'cover.close'
CONF_COVER_STOP = 'cover.stop'
CONF_FAN_TOGGLE = 'fan.toggle'
CONF_FAN_TURN_OFF = 'fan.turn_off'
CONF_FAN_TURN_ON = 'fan.turn_on'
CONF_OUTPUT_TURN_ON = 'output.turn_on'
CONF_OUTPUT_TURN_OFF = 'output.turn_off'
CONF_OUTPUT_SET_LEVEL = 'output.set_level'
CONF_DEEP_SLEEP_ENTER = 'deep_sleep.enter'
CONF_DEEP_SLEEP_PREVENT = 'deep_sleep.prevent'
CONF_BELOW, CONF_CONDITION, CONF_CONDITION_ID, CONF_DELAY, CONF_ELSE, CONF_ID, CONF_IF, \
CONF_LAMBDA, CONF_OR, CONF_RANGE, CONF_THEN, CONF_TRIGGER_ID, CONF_WHILE
from esphomeyaml.core import CORE
from esphomeyaml.cpp_generator import ArrayInitializer, Pvariable, TemplateArguments, add, \
get_variable, process_lambda, templatable
from esphomeyaml.cpp_types import Action, App, Component, PollingComponent, Trigger, \
esphomelib_ns, float_, uint32, void, bool_
from esphomeyaml.util import ServiceRegistry
def maybe_simple_id(*validators):
@@ -46,426 +26,293 @@ def maybe_simple_id(*validators):
def validate_recursive_condition(value):
return CONDITIONS_SCHEMA(value)
is_list = isinstance(value, list)
value = cv.ensure_list()(value)[:]
for i, item in enumerate(value):
path = [i] if is_list else []
item = copy.deepcopy(item)
if not isinstance(item, dict):
raise vol.Invalid(u"Condition must consist of key-value mapping! Got {}".format(item),
path)
key = next((x for x in item if x != CONF_CONDITION_ID), None)
if key is None:
raise vol.Invalid(u"Key missing from action! Got {}".format(item), path)
if key not in CONDITION_REGISTRY:
raise vol.Invalid(u"Unable to find condition with the name '{}', is the "
u"component loaded?".format(key), path + [key])
item.setdefault(CONF_CONDITION_ID, None)
key2 = next((x for x in item if x not in (CONF_CONDITION_ID, key)), None)
if key2 is not None:
raise vol.Invalid(u"Cannot have two conditions in one item. Key '{}' overrides '{}'! "
u"Did you forget to indent the block inside the condition?"
u"".format(key, key2), path)
validator = CONDITION_REGISTRY[key][0]
try:
condition = validator(item[key])
except vol.Invalid as err:
err.prepend(path)
raise err
value[i] = {
CONF_CONDITION_ID: cv.declare_variable_id(Condition)(item[CONF_CONDITION_ID]),
key: condition,
}
return value
def validate_recursive_action(value):
return ACTIONS_SCHEMA(value)
is_list = isinstance(value, list)
if not is_list:
value = [value]
for i, item in enumerate(value):
path = [i] if is_list else []
item = copy.deepcopy(item)
if not isinstance(item, dict):
raise vol.Invalid(u"Action must consist of key-value mapping! Got {}".format(item),
path)
key = next((x for x in item if x != CONF_ACTION_ID), None)
if key is None:
raise vol.Invalid(u"Key missing from action! Got {}".format(item), path)
if key not in ACTION_REGISTRY:
raise vol.Invalid(u"Unable to find action with the name '{}', is the component loaded?"
u"".format(key), path + [key])
item.setdefault(CONF_ACTION_ID, None)
key2 = next((x for x in item if x not in (CONF_ACTION_ID, key)), None)
if key2 is not None:
raise vol.Invalid(u"Cannot have two actions in one item. Key '{}' overrides '{}'! "
u"Did you forget to indent the block inside the action?"
u"".format(key, key2), path)
validator = ACTION_REGISTRY[key][0]
try:
action = validator(item[key])
except vol.Invalid as err:
err.prepend(path)
raise err
value[i] = {
CONF_ACTION_ID: cv.declare_variable_id(Action)(item[CONF_ACTION_ID]),
key: action,
}
return value
ACTION_KEYS = [CONF_DELAY, CONF_MQTT_PUBLISH, CONF_LIGHT_TOGGLE, CONF_LIGHT_TURN_OFF,
CONF_LIGHT_TURN_ON, CONF_SWITCH_TOGGLE, CONF_SWITCH_TURN_OFF, CONF_SWITCH_TURN_ON,
CONF_LAMBDA, CONF_COVER_OPEN, CONF_COVER_CLOSE, CONF_COVER_STOP, CONF_FAN_TOGGLE,
CONF_FAN_TURN_OFF, CONF_FAN_TURN_ON, CONF_OUTPUT_TURN_ON, CONF_OUTPUT_TURN_OFF,
CONF_OUTPUT_SET_LEVEL, CONF_IF, CONF_DEEP_SLEEP_ENTER, CONF_DEEP_SLEEP_PREVENT]
ACTIONS_SCHEMA = vol.All(cv.ensure_list, [vol.All({
cv.GenerateID(CONF_ACTION_ID): cv.declare_variable_id(None),
vol.Optional(CONF_DELAY): cv.templatable(cv.positive_time_period_milliseconds),
vol.Optional(CONF_MQTT_PUBLISH): vol.Schema({
vol.Required(CONF_TOPIC): cv.templatable(cv.publish_topic),
vol.Required(CONF_PAYLOAD): cv.templatable(cv.mqtt_payload),
vol.Optional(CONF_QOS): cv.templatable(cv.mqtt_qos),
vol.Optional(CONF_RETAIN): cv.templatable(cv.boolean),
}),
vol.Optional(CONF_LIGHT_TOGGLE): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
vol.Optional(CONF_TRANSITION_LENGTH): cv.templatable(cv.positive_time_period_milliseconds),
}),
vol.Optional(CONF_LIGHT_TURN_OFF): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
vol.Optional(CONF_TRANSITION_LENGTH): cv.templatable(cv.positive_time_period_milliseconds),
}),
vol.Optional(CONF_LIGHT_TURN_ON): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
vol.Exclusive(CONF_TRANSITION_LENGTH, 'transformer'):
cv.templatable(cv.positive_time_period_milliseconds),
vol.Exclusive(CONF_FLASH_LENGTH, 'transformer'):
cv.templatable(cv.positive_time_period_milliseconds),
vol.Optional(CONF_BRIGHTNESS): cv.templatable(cv.percentage),
vol.Optional(CONF_RED): cv.templatable(cv.percentage),
vol.Optional(CONF_GREEN): cv.templatable(cv.percentage),
vol.Optional(CONF_BLUE): cv.templatable(cv.percentage),
vol.Optional(CONF_WHITE): cv.templatable(cv.percentage),
vol.Optional(CONF_COLOR_TEMPERATURE): cv.templatable(cv.positive_float),
vol.Optional(CONF_EFFECT): cv.templatable(cv.string),
}),
vol.Optional(CONF_SWITCH_TOGGLE): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
}),
vol.Optional(CONF_SWITCH_TURN_OFF): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
}),
vol.Optional(CONF_SWITCH_TURN_ON): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
}),
vol.Optional(CONF_COVER_OPEN): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
}),
vol.Optional(CONF_COVER_CLOSE): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
}),
vol.Optional(CONF_COVER_STOP): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
}),
vol.Optional(CONF_FAN_TOGGLE): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
}),
vol.Optional(CONF_FAN_TURN_OFF): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
}),
vol.Optional(CONF_FAN_TURN_ON): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
vol.Optional(CONF_OSCILLATING): cv.templatable(cv.boolean),
vol.Optional(CONF_SPEED): cv.templatable(fan.validate_fan_speed),
}),
vol.Optional(CONF_OUTPUT_TURN_OFF): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None),
}),
vol.Optional(CONF_OUTPUT_TURN_ON): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(None)
}),
vol.Optional(CONF_OUTPUT_SET_LEVEL): {
vol.Required(CONF_ID): cv.use_variable_id(None),
vol.Required(CONF_LEVEL): cv.percentage,
},
vol.Optional(CONF_DEEP_SLEEP_ENTER): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(deep_sleep.DeepSleepComponent),
}),
vol.Optional(CONF_DEEP_SLEEP_PREVENT): maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(deep_sleep.DeepSleepComponent),
}),
vol.Optional(CONF_IF): vol.All({
vol.Required(CONF_CONDITION): validate_recursive_condition,
vol.Optional(CONF_THEN): validate_recursive_action,
vol.Optional(CONF_ELSE): validate_recursive_action,
}, cv.has_at_least_one_key(CONF_THEN, CONF_ELSE)),
vol.Optional(CONF_LAMBDA): cv.lambda_,
}, cv.has_exactly_one_key(*ACTION_KEYS))])
ACTION_REGISTRY = ServiceRegistry()
CONDITION_REGISTRY = ServiceRegistry()
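Both registries behave like dictionaries mapping an action/condition name to a (validation schema, code-generation builder) pair: validate_recursive_action/validate_recursive_condition above use entry [0], and build_action/build_condition below use entry [1]. The real ServiceRegistry lives in esphomeyaml.util and is not shown in this diff; a minimal sketch of the shape assumed here:

    class ServiceRegistry(dict):
        # Assumed shape only: register() is used as a decorator that stores
        # (schema, builder) under the given action/condition name.
        def register(self, name, schema):
            def decorator(builder):
                self[name] = (schema, builder)
                return builder
            return decorator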
# pylint: disable=invalid-name
DelayAction = esphomelib_ns.DelayAction
LambdaAction = esphomelib_ns.LambdaAction
IfAction = esphomelib_ns.IfAction
Automation = esphomelib_ns.Automation
DelayAction = esphomelib_ns.class_('DelayAction', Action, Component)
LambdaAction = esphomelib_ns.class_('LambdaAction', Action)
IfAction = esphomelib_ns.class_('IfAction', Action)
WhileAction = esphomelib_ns.class_('WhileAction', Action)
UpdateComponentAction = esphomelib_ns.class_('UpdateComponentAction', Action)
Automation = esphomelib_ns.class_('Automation')
CONDITIONS_SCHEMA = vol.All(cv.ensure_list, [cv.templatable({
cv.GenerateID(CONF_CONDITION_ID): cv.declare_variable_id(None),
vol.Optional(CONF_AND): validate_recursive_condition,
vol.Optional(CONF_OR): validate_recursive_condition,
vol.Optional(CONF_RANGE): vol.All(vol.Schema({
vol.Optional(CONF_ABOVE): vol.Coerce(float),
vol.Optional(CONF_BELOW): vol.Coerce(float),
}), cv.has_at_least_one_key(CONF_ABOVE, CONF_BELOW)),
vol.Optional(CONF_LAMBDA): cv.lambda_,
})])
# pylint: disable=invalid-name
AndCondition = esphomelib_ns.AndCondition
OrCondition = esphomelib_ns.OrCondition
RangeCondition = esphomelib_ns.RangeCondition
LambdaCondition = esphomelib_ns.LambdaCondition
Condition = esphomelib_ns.class_('Condition')
AndCondition = esphomelib_ns.class_('AndCondition', Condition)
OrCondition = esphomelib_ns.class_('OrCondition', Condition)
RangeCondition = esphomelib_ns.class_('RangeCondition', Condition)
LambdaCondition = esphomelib_ns.class_('LambdaCondition', Condition)
def validate_automation(extra_schema=None):
schema = AUTOMATION_SCHEMA.extend(extra_schema or {})
def validate_automation(extra_schema=None, extra_validators=None, single=False):
if extra_schema is None:
extra_schema = {}
if isinstance(extra_schema, vol.Schema):
extra_schema = extra_schema.schema
schema = AUTOMATION_SCHEMA.extend(extra_schema)
def validator(value):
def validator_(value):
if isinstance(value, list):
return schema({CONF_THEN: value})
try:
# First try as a sequence of actions
return [schema({CONF_THEN: value})]
except vol.Invalid as err:
# Next try as a sequence of automations
try:
return vol.Schema([schema])(value)
except vol.Invalid as err2:
if 'Unable to find action' in str(err):
raise err2
raise vol.MultipleInvalid([err, err2])
elif isinstance(value, dict):
if CONF_THEN in value:
return schema(value)
return schema({CONF_THEN: value})
return schema(value)
return [schema(value)]
return [schema({CONF_THEN: value})]
# This should only happen with invalid configs, but let's have a nice error message.
return [schema(value)]
def validator(value):
value = validator_(value)
if extra_validators is not None:
value = vol.Schema([extra_validators])(value)
if single:
if len(value) != 1:
raise vol.Invalid("Cannot have more than 1 automation for templates")
return value[0]
return value
return validator
AUTOMATION_SCHEMA = vol.Schema({
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(None),
cv.GenerateID(CONF_AUTOMATION_ID): cv.declare_variable_id(None),
vol.Optional(CONF_IF): CONDITIONS_SCHEMA,
vol.Required(CONF_THEN): ACTIONS_SCHEMA,
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(Trigger),
cv.GenerateID(CONF_AUTOMATION_ID): cv.declare_variable_id(Automation),
vol.Optional(CONF_IF): validate_recursive_condition,
vol.Required(CONF_THEN): validate_recursive_action,
})
AND_CONDITION_SCHEMA = validate_recursive_condition
@CONDITION_REGISTRY.register(CONF_AND, AND_CONDITION_SCHEMA)
def and_condition_to_code(config, condition_id, arg_type, template_arg):
for conditions in build_conditions(config, arg_type):
yield
rhs = AndCondition.new(template_arg, conditions)
type = AndCondition.template(template_arg)
yield Pvariable(condition_id, rhs, type=type)
OR_CONDITION_SCHEMA = validate_recursive_condition
@CONDITION_REGISTRY.register(CONF_OR, OR_CONDITION_SCHEMA)
def or_condition_to_code(config, condition_id, arg_type, template_arg):
for conditions in build_conditions(config, arg_type):
yield
rhs = OrCondition.new(template_arg, conditions)
type = OrCondition.template(template_arg)
yield Pvariable(condition_id, rhs, type=type)
RANGE_CONDITION_SCHEMA = vol.All(vol.Schema({
vol.Optional(CONF_ABOVE): cv.templatable(cv.float_),
vol.Optional(CONF_BELOW): cv.templatable(cv.float_),
}), cv.has_at_least_one_key(CONF_ABOVE, CONF_BELOW))
@CONDITION_REGISTRY.register(CONF_RANGE, RANGE_CONDITION_SCHEMA)
def range_condition_to_code(config, condition_id, arg_type, template_arg):
for conditions in build_conditions(config, arg_type):
yield
rhs = RangeCondition.new(template_arg, conditions)
type = RangeCondition.template(template_arg)
condition = Pvariable(condition_id, rhs, type=type)
if CONF_ABOVE in config:
for template_ in templatable(config[CONF_ABOVE], arg_type, float_):
yield
condition.set_min(template_)
if CONF_BELOW in config:
for template_ in templatable(config[CONF_BELOW], arg_type, float_):
yield
condition.set_max(template_)
yield condition
DELAY_ACTION_SCHEMA = cv.templatable(cv.positive_time_period_milliseconds)
@ACTION_REGISTRY.register(CONF_DELAY, DELAY_ACTION_SCHEMA)
def delay_action_to_code(config, action_id, arg_type, template_arg):
rhs = App.register_component(DelayAction.new(template_arg))
type = DelayAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
for template_ in templatable(config, arg_type, uint32):
yield
add(action.set_delay(template_))
yield action
IF_ACTION_SCHEMA = vol.All({
vol.Required(CONF_CONDITION): validate_recursive_condition,
vol.Optional(CONF_THEN): validate_recursive_action,
vol.Optional(CONF_ELSE): validate_recursive_action,
}, cv.has_at_least_one_key(CONF_THEN, CONF_ELSE))
@ACTION_REGISTRY.register(CONF_IF, IF_ACTION_SCHEMA)
def if_action_to_code(config, action_id, arg_type, template_arg):
for conditions in build_conditions(config[CONF_CONDITION], arg_type):
yield None
rhs = IfAction.new(template_arg, conditions)
type = IfAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
if CONF_THEN in config:
for actions in build_actions(config[CONF_THEN], arg_type):
yield None
add(action.add_then(actions))
if CONF_ELSE in config:
for actions in build_actions(config[CONF_ELSE], arg_type):
yield None
add(action.add_else(actions))
yield action
WHILE_ACTION_SCHEMA = vol.Schema({
vol.Required(CONF_CONDITION): validate_recursive_condition,
vol.Required(CONF_THEN): validate_recursive_action,
})
def build_condition(config, arg_type):
template_arg = TemplateArguments(arg_type)
if isinstance(config, core.Lambda):
lambda_ = None
for lambda_ in process_lambda(config, [(arg_type, 'x')]):
yield
yield LambdaCondition.new(template_arg, lambda_)
elif CONF_AND in config:
yield AndCondition.new(template_arg, build_conditions(config[CONF_AND], template_arg))
elif CONF_OR in config:
yield OrCondition.new(template_arg, build_conditions(config[CONF_OR], template_arg))
elif CONF_LAMBDA in config:
lambda_ = None
for lambda_ in process_lambda(config[CONF_LAMBDA], [(arg_type, 'x')]):
yield
yield LambdaCondition.new(template_arg, lambda_)
elif CONF_RANGE in config:
conf = config[CONF_RANGE]
rhs = RangeCondition.new(template_arg)
type = RangeCondition.template(template_arg)
condition = Pvariable(config[CONF_CONDITION_ID], rhs, type=type)
if CONF_ABOVE in conf:
template_ = None
for template_ in templatable(conf[CONF_ABOVE], arg_type, float_):
yield
condition.set_min(template_)
if CONF_BELOW in conf:
template_ = None
for template_ in templatable(conf[CONF_BELOW], arg_type, float_):
yield
condition.set_max(template_)
yield condition
else:
raise ESPHomeYAMLError(u"Unsupported condition {}".format(config))
@ACTION_REGISTRY.register(CONF_WHILE, WHILE_ACTION_SCHEMA)
def while_action_to_code(config, action_id, arg_type, template_arg):
for conditions in build_conditions(config[CONF_CONDITION], arg_type):
yield None
rhs = WhileAction.new(template_arg, conditions)
type = WhileAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
for actions in build_actions(config[CONF_THEN], arg_type):
yield None
add(action.add_then(actions))
yield action
def build_conditions(config, arg_type):
conditions = []
for conf in config:
condition = None
for condition in build_condition(conf, arg_type):
yield None
conditions.append(condition)
yield ArrayInitializer(*conditions)
LAMBDA_ACTION_SCHEMA = cv.lambda_
@ACTION_REGISTRY.register(CONF_LAMBDA, LAMBDA_ACTION_SCHEMA)
def lambda_action_to_code(config, action_id, arg_type, template_arg):
for lambda_ in process_lambda(config, [(arg_type, 'x')], return_type=void):
yield None
rhs = LambdaAction.new(template_arg, lambda_)
type = LambdaAction.template(template_arg)
yield Pvariable(action_id, rhs, type=type)
LAMBDA_CONDITION_SCHEMA = cv.lambda_
@CONDITION_REGISTRY.register(CONF_LAMBDA, LAMBDA_CONDITION_SCHEMA)
def lambda_condition_to_code(config, condition_id, arg_type, template_arg):
for lambda_ in process_lambda(config, [(arg_type, 'x')], return_type=bool_):
yield
rhs = LambdaCondition.new(template_arg, lambda_)
type = LambdaCondition.template(template_arg)
yield Pvariable(condition_id, rhs, type=type)
CONF_COMPONENT_UPDATE = 'component.update'
COMPONENT_UPDATE_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(PollingComponent),
})
@ACTION_REGISTRY.register(CONF_COMPONENT_UPDATE, COMPONENT_UPDATE_ACTION_SCHEMA)
def component_update_action_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = UpdateComponentAction.new(template_arg, var)
type = UpdateComponentAction.template(template_arg)
yield Pvariable(action_id, rhs, type=type)
def build_action(full_config, arg_type):
from esphomeyaml.components import light, mqtt, switch
template_arg = TemplateArguments(arg_type)
# Keep pylint from freaking out
var = None
action_id = full_config[CONF_ACTION_ID]
key, config = next((k, v) for k, v in full_config.items() if k in ACTION_KEYS)
if key == CONF_DELAY:
rhs = App.register_component(DelayAction.new(template_arg))
type = DelayAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
for template_ in templatable(config, arg_type, uint32):
yield
add(action.set_delay(template_))
yield action
elif key == CONF_LAMBDA:
for lambda_ in process_lambda(config, [(arg_type, 'x')]):
yield None
rhs = LambdaAction.new(template_arg, lambda_)
type = LambdaAction.template(template_arg)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_MQTT_PUBLISH:
rhs = App.Pget_mqtt_client().Pmake_publish_action(template_arg)
type = mqtt.MQTTPublishAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
for template_ in templatable(config[CONF_TOPIC], arg_type, std_string):
yield None
add(action.set_topic(template_))
key, config = next((k, v) for k, v in full_config.items() if k in ACTION_REGISTRY)
for template_ in templatable(config[CONF_PAYLOAD], arg_type, std_string):
yield None
add(action.set_payload(template_))
if CONF_QOS in config:
for template_ in templatable(config[CONF_QOS], arg_type, uint8):
yield
add(action.set_qos(template_))
if CONF_RETAIN in config:
for template_ in templatable(config[CONF_RETAIN], arg_type, bool_):
yield None
add(action.set_retain(template_))
yield action
elif key == CONF_LIGHT_TOGGLE:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_toggle_action(template_arg)
type = light.ToggleAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
if CONF_TRANSITION_LENGTH in config:
for template_ in templatable(config[CONF_TRANSITION_LENGTH], arg_type, uint32):
yield None
add(action.set_transition_length(template_))
yield action
elif key == CONF_LIGHT_TURN_OFF:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_off_action(template_arg)
type = light.TurnOffAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
if CONF_TRANSITION_LENGTH in config:
for template_ in templatable(config[CONF_TRANSITION_LENGTH], arg_type, uint32):
yield None
add(action.set_transition_length(template_))
yield action
elif key == CONF_LIGHT_TURN_ON:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_on_action(template_arg)
type = light.TurnOnAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
if CONF_TRANSITION_LENGTH in config:
for template_ in templatable(config[CONF_TRANSITION_LENGTH], arg_type, uint32):
yield None
add(action.set_transition_length(template_))
if CONF_FLASH_LENGTH in config:
for template_ in templatable(config[CONF_FLASH_LENGTH], arg_type, uint32):
yield None
add(action.set_flash_length(template_))
if CONF_BRIGHTNESS in config:
for template_ in templatable(config[CONF_BRIGHTNESS], arg_type, float_):
yield None
add(action.set_brightness(template_))
if CONF_RED in config:
for template_ in templatable(config[CONF_RED], arg_type, float_):
yield None
add(action.set_red(template_))
if CONF_GREEN in config:
for template_ in templatable(config[CONF_GREEN], arg_type, float_):
yield None
add(action.set_green(template_))
if CONF_BLUE in config:
for template_ in templatable(config[CONF_BLUE], arg_type, float_):
yield None
add(action.set_blue(template_))
if CONF_WHITE in config:
for template_ in templatable(config[CONF_WHITE], arg_type, float_):
yield None
add(action.set_white(template_))
if CONF_COLOR_TEMPERATURE in config:
for template_ in templatable(config[CONF_COLOR_TEMPERATURE], arg_type, float_):
yield None
add(action.set_color_temperature(template_))
if CONF_EFFECT in config:
for template_ in templatable(config[CONF_EFFECT], arg_type, std_string):
yield None
add(action.set_effect(template_))
yield action
elif key == CONF_SWITCH_TOGGLE:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_toggle_action(template_arg)
type = switch.ToggleAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_SWITCH_TURN_OFF:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_off_action(template_arg)
type = switch.TurnOffAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_SWITCH_TURN_ON:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_on_action(template_arg)
type = switch.TurnOnAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_COVER_OPEN:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_open_action(template_arg)
type = cover.OpenAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_COVER_CLOSE:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_close_action(template_arg)
type = cover.CloseAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_COVER_STOP:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_stop_action(template_arg)
type = cover.StopAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_FAN_TOGGLE:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_toggle_action(template_arg)
type = fan.ToggleAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_FAN_TURN_OFF:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_off_action(template_arg)
type = fan.TurnOffAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_FAN_TURN_ON:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_on_action(template_arg)
type = fan.TurnOnAction.template(arg_type)
action = Pvariable(action_id, rhs, type=type)
if CONF_OSCILLATING in config:
for template_ in templatable(config[CONF_OSCILLATING], arg_type, bool_):
yield None
add(action.set_oscillating(template_))
if CONF_SPEED in config:
for template_ in templatable(config[CONF_SPEED], arg_type, fan.FanSpeed):
yield None
add(action.set_speed(template_))
yield action
elif key == CONF_OUTPUT_TURN_OFF:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_off_action(template_arg)
type = output.TurnOffAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_OUTPUT_TURN_ON:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_on_action(template_arg)
type = output.TurnOnAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_OUTPUT_SET_LEVEL:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_set_level_action(template_arg)
type = output.SetLevelAction.template(arg_type)
action = Pvariable(action_id, rhs, type=type)
for template_ in templatable(config[CONF_LEVEL], arg_type, bool_):
yield None
add(action.set_level(template_))
yield action
elif key == CONF_IF:
for conditions in build_conditions(config[CONF_CONDITION], arg_type):
yield None
rhs = IfAction.new(template_arg, conditions)
type = IfAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
if CONF_THEN in config:
for actions in build_actions(config[CONF_THEN], arg_type):
yield None
add(action.add_then(actions))
if CONF_ELSE in config:
for actions in build_actions(config[CONF_ELSE], arg_type):
yield None
add(action.add_else(actions))
yield action
elif key == CONF_DEEP_SLEEP_ENTER:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_enter_deep_sleep_action(template_arg)
type = deep_sleep.EnterDeepSleepAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
elif key == CONF_DEEP_SLEEP_PREVENT:
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_prevent_deep_sleep_action(template_arg)
type = deep_sleep.PreventDeepSleepAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
else:
raise ESPHomeYAMLError(u"Unsupported action {}".format(config))
builder = ACTION_REGISTRY[key][1]
template_arg = TemplateArguments(arg_type)
for result in builder(config, action_id, arg_type, template_arg):
yield None
yield result
def build_actions(config, arg_type):
@@ -478,6 +325,26 @@ def build_actions(config, arg_type):
yield ArrayInitializer(*actions, multiline=False)
def build_condition(full_config, arg_type):
action_id = full_config[CONF_CONDITION_ID]
key, config = next((k, v) for k, v in full_config.items() if k in CONDITION_REGISTRY)
builder = CONDITION_REGISTRY[key][1]
template_arg = TemplateArguments(arg_type)
for result in builder(config, action_id, arg_type, template_arg):
yield None
yield result
def build_conditions(config, arg_type):
conditions = []
for conf in config:
for condition in build_condition(conf, arg_type):
yield None
conditions.append(condition)
yield ArrayInitializer(*conditions, multiline=False)
def build_automation_(trigger, arg_type, config):
rhs = App.make_automation(TemplateArguments(arg_type), trigger)
type = Automation.template(arg_type)
@@ -495,4 +362,4 @@ def build_automation_(trigger, arg_type, config):
def build_automation(trigger, arg_type, config):
add_job(build_automation_, trigger, arg_type, config)
CORE.add_job(build_automation_, trigger, arg_type, config)

View File

@@ -1,30 +1,27 @@
import voluptuous as vol
from esphomeyaml.components import i2c, sensor
import esphomeyaml.config_validation as cv
from esphomeyaml.components import sensor
from esphomeyaml.const import CONF_ADDRESS, CONF_ID, CONF_RATE
from esphomeyaml.helpers import App, Pvariable
from esphomeyaml.const import CONF_ADDRESS, CONF_ID
from esphomeyaml.cpp_generator import Pvariable
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, Component
DEPENDENCIES = ['i2c']
MULTI_CONF = True
ADS1115Component = sensor.sensor_ns.ADS1115Component
ADS1115Component = sensor.sensor_ns.class_('ADS1115Component', Component, i2c.I2CDevice)
RATE_REMOVE_MESSAGE = """The rate option has been removed in 1.5.0 and is no longer required."""
ADS1115_SCHEMA = vol.Schema({
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(ADS1115Component),
vol.Required(CONF_ADDRESS): cv.i2c_address,
vol.Optional(CONF_RATE): cv.invalid(RATE_REMOVE_MESSAGE)
})
CONFIG_SCHEMA = vol.All(cv.ensure_list, [ADS1115_SCHEMA])
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
for conf in config:
rhs = App.make_ads1115_component(conf[CONF_ADDRESS])
Pvariable(conf[CONF_ID], rhs)
rhs = App.make_ads1115_component(config[CONF_ADDRESS])
var = Pvariable(config[CONF_ID], rhs)
setup_component(var, config)
BUILD_FLAGS = '-DUSE_ADS1115_SENSOR'

View File

@@ -0,0 +1,33 @@
import voluptuous as vol
from esphomeyaml.components import i2c, sensor
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ADDRESS, CONF_ID, CONF_UPDATE_INTERVAL
from esphomeyaml.cpp_generator import Pvariable, add
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, PollingComponent
DEPENDENCIES = ['i2c']
MULTI_CONF = True
CONF_APDS9960_ID = 'apds9960_id'
APDS9960 = sensor.sensor_ns.class_('APDS9960', PollingComponent, i2c.I2CDevice)
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(APDS9960),
vol.Optional(CONF_ADDRESS): cv.i2c_address,
vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
rhs = App.make_apds9960(config.get(CONF_UPDATE_INTERVAL))
var = Pvariable(config[CONF_ID], rhs)
if CONF_ADDRESS in config:
add(var.set_address(config[CONF_ADDRESS]))
setup_component(var, config)
BUILD_FLAGS = '-DUSE_APDS9960'

View File

@@ -0,0 +1,88 @@
import voluptuous as vol
from esphomeyaml.automation import ACTION_REGISTRY
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_DATA, CONF_DATA_TEMPLATE, CONF_ID, CONF_PASSWORD, CONF_PORT, \
CONF_SERVICE, CONF_VARIABLES, CONF_REBOOT_TIMEOUT
from esphomeyaml.core import CORE
from esphomeyaml.cpp_generator import ArrayInitializer, Pvariable, add, get_variable, process_lambda
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import Action, App, Component, StoringController, esphomelib_ns
api_ns = esphomelib_ns.namespace('api')
APIServer = api_ns.class_('APIServer', Component, StoringController)
HomeAssistantServiceCallAction = api_ns.class_('HomeAssistantServiceCallAction', Action)
KeyValuePair = api_ns.class_('KeyValuePair')
TemplatableKeyValuePair = api_ns.class_('TemplatableKeyValuePair')
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(APIServer),
vol.Optional(CONF_PORT, default=6053): cv.port,
vol.Optional(CONF_PASSWORD, default=''): cv.string_strict,
vol.Optional(CONF_REBOOT_TIMEOUT): cv.positive_time_period_milliseconds,
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
rhs = App.init_api_server()
api = Pvariable(config[CONF_ID], rhs)
if config[CONF_PORT] != 6053:
add(api.set_port(config[CONF_PORT]))
if config.get(CONF_PASSWORD):
add(api.set_password(config[CONF_PASSWORD]))
if CONF_REBOOT_TIMEOUT in config:
add(api.set_reboot_timeout(config[CONF_REBOOT_TIMEOUT]))
setup_component(api, config)
BUILD_FLAGS = '-DUSE_API'
def lib_deps(config):
if CORE.is_esp32:
return 'AsyncTCP@1.0.1'
if CORE.is_esp8266:
return 'ESPAsyncTCP@1.1.3'
raise NotImplementedError
CONF_HOMEASSISTANT_SERVICE = 'homeassistant.service'
HOMEASSISTANT_SERVIC_ACTION_SCHEMA = vol.Schema({
cv.GenerateID(): cv.use_variable_id(APIServer),
vol.Required(CONF_SERVICE): cv.string,
vol.Optional(CONF_DATA): vol.Schema({
cv.string: cv.string,
}),
vol.Optional(CONF_DATA_TEMPLATE): vol.Schema({
cv.string: cv.string,
}),
vol.Optional(CONF_VARIABLES): vol.Schema({
cv.string: cv.lambda_,
}),
})
@ACTION_REGISTRY.register(CONF_HOMEASSISTANT_SERVICE, HOMEASSISTANT_SERVIC_ACTION_SCHEMA)
def homeassistant_service_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_home_assistant_service_call_action(template_arg)
type = HomeAssistantServiceCallAction.template(arg_type)
act = Pvariable(action_id, rhs, type=type)
add(act.set_service(config[CONF_SERVICE]))
if CONF_DATA in config:
datas = [KeyValuePair(k, v) for k, v in config[CONF_DATA].items()]
add(act.set_data(ArrayInitializer(*datas)))
if CONF_DATA_TEMPLATE in config:
datas = [KeyValuePair(k, v) for k, v in config[CONF_DATA_TEMPLATE].items()]
add(act.set_data_template(ArrayInitializer(*datas)))
if CONF_VARIABLES in config:
datas = []
for key, value in config[CONF_VARIABLES].items():
for value_ in process_lambda(value, []):
yield None
datas.append(TemplatableKeyValuePair(key, value_))
add(act.set_variables(ArrayInitializer(*datas)))
yield act
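For orientation, the mapping HOMEASSISTANT_SERVIC_ACTION_SCHEMA accepts looks roughly like the following, shown as the equivalent Python dict; in a real configuration this sits in YAML under an automation's then: block, and the concrete service/data values here are made up:

    # {
    #     'service': 'notify.notify',                  # CONF_SERVICE (required)
    #     'data': {'message': 'Door opened'},          # CONF_DATA: static strings
    #     'data_template': {'title': '{{ title }}'},   # CONF_DATA_TEMPLATE
    #     'variables': {'title': 'return "Hi";'},      # CONF_VARIABLES: lambdas
    # }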

View File

@@ -1,13 +1,21 @@
import voluptuous as vol
from esphomeyaml import automation, core
from esphomeyaml.automation import maybe_simple_id, CONDITION_REGISTRY, Condition
from esphomeyaml.components import mqtt
from esphomeyaml.components.mqtt import setup_mqtt_component
import esphomeyaml.config_validation as cv
from esphomeyaml import automation
from esphomeyaml.const import CONF_DEVICE_CLASS, CONF_ID, CONF_INTERNAL, CONF_INVERTED, \
CONF_MAX_LENGTH, CONF_MIN_LENGTH, CONF_MQTT_ID, CONF_ON_CLICK, CONF_ON_DOUBLE_CLICK, \
CONF_ON_PRESS, CONF_ON_RELEASE, CONF_TRIGGER_ID, CONF_FILTERS, CONF_INVERT, CONF_DELAYED_ON, \
CONF_DELAYED_OFF, CONF_LAMBDA, CONF_HEARTBEAT
from esphomeyaml.helpers import App, NoArg, Pvariable, add, add_job, esphomelib_ns, \
setup_mqtt_component, bool_, process_lambda, ArrayInitializer
from esphomeyaml.const import CONF_DELAYED_OFF, CONF_DELAYED_ON, CONF_DEVICE_CLASS, CONF_FILTERS, \
CONF_HEARTBEAT, CONF_ID, CONF_INTERNAL, CONF_INVALID_COOLDOWN, CONF_INVERT, CONF_INVERTED, \
CONF_LAMBDA, CONF_MAX_LENGTH, CONF_MIN_LENGTH, CONF_MQTT_ID, CONF_ON_CLICK, \
CONF_ON_DOUBLE_CLICK, CONF_ON_MULTI_CLICK, CONF_ON_PRESS, CONF_ON_RELEASE, CONF_STATE, \
CONF_TIMING, CONF_TRIGGER_ID, CONF_ON_STATE
from esphomeyaml.core import CORE
from esphomeyaml.cpp_generator import process_lambda, ArrayInitializer, add, Pvariable, \
StructInitializer, get_variable
from esphomeyaml.cpp_types import esphomelib_ns, Nameable, Trigger, NoArg, Component, App, bool_, \
optional
from esphomeyaml.py_compat import string_types
DEVICE_CLASSES = [
'', 'battery', 'cold', 'connectivity', 'door', 'garage_door', 'gas',
@@ -21,51 +29,164 @@ PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
})
binary_sensor_ns = esphomelib_ns.namespace('binary_sensor')
PressTrigger = binary_sensor_ns.PressTrigger
ReleaseTrigger = binary_sensor_ns.ReleaseTrigger
ClickTrigger = binary_sensor_ns.ClickTrigger
DoubleClickTrigger = binary_sensor_ns.DoubleClickTrigger
BinarySensor = binary_sensor_ns.BinarySensor
InvertFilter = binary_sensor_ns.InvertFilter
LambdaFilter = binary_sensor_ns.LambdaFilter
DelayedOnFilter = binary_sensor_ns.DelayedOnFilter
DelayedOffFilter = binary_sensor_ns.DelayedOffFilter
HeartbeatFilter = binary_sensor_ns.HeartbeatFilter
MQTTBinarySensorComponent = binary_sensor_ns.MQTTBinarySensorComponent
BinarySensor = binary_sensor_ns.class_('BinarySensor', Nameable)
BinarySensorPtr = BinarySensor.operator('ptr')
MQTTBinarySensorComponent = binary_sensor_ns.class_('MQTTBinarySensorComponent', mqtt.MQTTComponent)
# Triggers
PressTrigger = binary_sensor_ns.class_('PressTrigger', Trigger.template(NoArg))
ReleaseTrigger = binary_sensor_ns.class_('ReleaseTrigger', Trigger.template(NoArg))
ClickTrigger = binary_sensor_ns.class_('ClickTrigger', Trigger.template(NoArg))
DoubleClickTrigger = binary_sensor_ns.class_('DoubleClickTrigger', Trigger.template(NoArg))
MultiClickTrigger = binary_sensor_ns.class_('MultiClickTrigger', Trigger.template(NoArg), Component)
MultiClickTriggerEvent = binary_sensor_ns.struct('MultiClickTriggerEvent')
StateTrigger = binary_sensor_ns.class_('StateTrigger', Trigger.template(bool_))
# Condition
BinarySensorCondition = binary_sensor_ns.class_('BinarySensorCondition', Condition)
# Filters
Filter = binary_sensor_ns.class_('Filter')
DelayedOnFilter = binary_sensor_ns.class_('DelayedOnFilter', Filter, Component)
DelayedOffFilter = binary_sensor_ns.class_('DelayedOffFilter', Filter, Component)
HeartbeatFilter = binary_sensor_ns.class_('HeartbeatFilter', Filter, Component)
InvertFilter = binary_sensor_ns.class_('InvertFilter', Filter)
LambdaFilter = binary_sensor_ns.class_('LambdaFilter', Filter)
FILTER_KEYS = [CONF_INVERT, CONF_DELAYED_ON, CONF_DELAYED_OFF, CONF_LAMBDA, CONF_HEARTBEAT]
FILTERS_SCHEMA = vol.All(cv.ensure_list, [vol.All({
FILTERS_SCHEMA = cv.ensure_list({
vol.Optional(CONF_INVERT): None,
vol.Optional(CONF_DELAYED_ON): cv.positive_time_period_milliseconds,
vol.Optional(CONF_DELAYED_OFF): cv.positive_time_period_milliseconds,
vol.Optional(CONF_HEARTBEAT): cv.positive_time_period_milliseconds,
vol.Optional(CONF_LAMBDA): cv.lambda_,
}, cv.has_exactly_one_key(*FILTER_KEYS))])
}, cv.has_exactly_one_key(*FILTER_KEYS))
MULTI_CLICK_TIMING_SCHEMA = vol.Schema({
vol.Optional(CONF_STATE): cv.boolean,
vol.Optional(CONF_MIN_LENGTH): cv.positive_time_period_milliseconds,
vol.Optional(CONF_MAX_LENGTH): cv.positive_time_period_milliseconds,
})
def parse_multi_click_timing_str(value):
if not isinstance(value, string_types):
return value
parts = value.lower().split(' ')
if len(parts) != 5:
raise vol.Invalid("Multi click timing grammar consists of exactly 5 words, not {}"
"".format(len(parts)))
try:
state = cv.boolean(parts[0])
except vol.Invalid:
raise vol.Invalid(u"First word must either be ON or OFF, not {}".format(parts[0]))
if parts[1] != 'for':
raise vol.Invalid(u"Second word must be 'for', got {}".format(parts[1]))
if parts[2] == 'at':
if parts[3] == 'least':
key = CONF_MIN_LENGTH
elif parts[3] == 'most':
key = CONF_MAX_LENGTH
else:
raise vol.Invalid(u"Third word after at must either be 'least' or 'most', got {}"
u"".format(parts[3]))
try:
length = cv.positive_time_period_milliseconds(parts[4])
except vol.Invalid as err:
raise vol.Invalid(u"Multi Click Grammar Parsing length failed: {}".format(err))
return {
CONF_STATE: state,
key: str(length)
}
if parts[3] != 'to':
raise vol.Invalid("Multi click grammar: 4th word must be 'to'")
try:
min_length = cv.positive_time_period_milliseconds(parts[2])
except vol.Invalid as err:
raise vol.Invalid(u"Multi Click Grammar Parsing minimum length failed: {}".format(err))
try:
max_length = cv.positive_time_period_milliseconds(parts[4])
except vol.Invalid as err:
raise vol.Invalid(u"Multi Click Grammar Parsing minimum length failed: {}".format(err))
return {
CONF_STATE: state,
CONF_MIN_LENGTH: str(min_length),
CONF_MAX_LENGTH: str(max_length)
}
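Illustrative inputs for the five-word grammar above (output values shown schematically; the actual min/max entries are the stringified TimePeriod objects produced by cv.positive_time_period_milliseconds):

    # 'ON for at least 100ms' -> {CONF_STATE: True,  CONF_MIN_LENGTH: <100 ms>}
    # 'OFF for at most 250ms' -> {CONF_STATE: False, CONF_MAX_LENGTH: <250 ms>}
    # 'ON for 40ms to 100ms'  -> {CONF_STATE: True,  CONF_MIN_LENGTH: <40 ms>,
    #                             CONF_MAX_LENGTH: <100 ms>}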
def validate_multi_click_timing(value):
if not isinstance(value, list):
raise vol.Invalid("Timing option must be a *list* of times!")
timings = []
state = None
for i, v_ in enumerate(value):
v_ = MULTI_CLICK_TIMING_SCHEMA(v_)
min_length = v_.get(CONF_MIN_LENGTH)
max_length = v_.get(CONF_MAX_LENGTH)
if min_length is None and max_length is None:
raise vol.Invalid("At least one of min_length and max_length is required!")
if min_length is None and max_length is not None:
min_length = core.TimePeriodMilliseconds(milliseconds=0)
new_state = v_.get(CONF_STATE, not state)
if new_state == state:
raise vol.Invalid("Timings must have alternating state. Indices {} and {} have "
"the same state {}".format(i, i + 1, state))
if max_length is not None and max_length < min_length:
raise vol.Invalid("Max length ({}) must be larger than min length ({})."
"".format(max_length, min_length))
state = new_state
tim = {
CONF_STATE: new_state,
CONF_MIN_LENGTH: min_length,
}
if max_length is not None:
tim[CONF_MAX_LENGTH] = max_length
timings.append(tim)
return timings
BINARY_SENSOR_SCHEMA = cv.MQTT_COMPONENT_SCHEMA.extend({
cv.GenerateID(CONF_MQTT_ID): cv.declare_variable_id(MQTTBinarySensorComponent),
cv.GenerateID(): cv.declare_variable_id(BinarySensor),
vol.Optional(CONF_DEVICE_CLASS): vol.All(vol.Lower, cv.one_of(*DEVICE_CLASSES)),
vol.Optional(CONF_DEVICE_CLASS): cv.one_of(*DEVICE_CLASSES, lower=True),
vol.Optional(CONF_FILTERS): FILTERS_SCHEMA,
vol.Optional(CONF_ON_PRESS): vol.All(cv.ensure_list, [automation.validate_automation({
vol.Optional(CONF_ON_PRESS): automation.validate_automation({
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(PressTrigger),
})]),
vol.Optional(CONF_ON_RELEASE): vol.All(cv.ensure_list, [automation.validate_automation({
}),
vol.Optional(CONF_ON_RELEASE): automation.validate_automation({
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(ReleaseTrigger),
})]),
vol.Optional(CONF_ON_CLICK): vol.All(cv.ensure_list, [automation.validate_automation({
}),
vol.Optional(CONF_ON_CLICK): automation.validate_automation({
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(ClickTrigger),
vol.Optional(CONF_MIN_LENGTH, default='50ms'): cv.positive_time_period_milliseconds,
vol.Optional(CONF_MAX_LENGTH, default='350ms'): cv.positive_time_period_milliseconds,
})]),
vol.Optional(CONF_ON_DOUBLE_CLICK):
vol.All(cv.ensure_list, [automation.validate_automation({
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(DoubleClickTrigger),
vol.Optional(CONF_MIN_LENGTH, default='50ms'): cv.positive_time_period_milliseconds,
vol.Optional(CONF_MAX_LENGTH, default='350ms'): cv.positive_time_period_milliseconds,
})]),
}),
vol.Optional(CONF_ON_DOUBLE_CLICK): automation.validate_automation({
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(DoubleClickTrigger),
vol.Optional(CONF_MIN_LENGTH, default='50ms'): cv.positive_time_period_milliseconds,
vol.Optional(CONF_MAX_LENGTH, default='350ms'): cv.positive_time_period_milliseconds,
}),
vol.Optional(CONF_ON_MULTI_CLICK): automation.validate_automation({
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(MultiClickTrigger),
vol.Required(CONF_TIMING): vol.All([parse_multi_click_timing_str],
validate_multi_click_timing),
vol.Optional(CONF_INVALID_COOLDOWN): cv.positive_time_period_milliseconds,
}),
vol.Optional(CONF_ON_STATE): automation.validate_automation({
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(StateTrigger),
}),
vol.Optional(CONF_INVERTED): cv.invalid(
"The inverted binary_sensor property has been replaced by the "
@@ -87,8 +208,8 @@ def setup_filter(config):
elif CONF_HEARTBEAT in config:
yield App.register_component(HeartbeatFilter.new(config[CONF_HEARTBEAT]))
elif CONF_LAMBDA in config:
lambda_ = None
for lambda_ in process_lambda(config[CONF_LAMBDA], [(bool_, 'x')]):
for lambda_ in process_lambda(config[CONF_LAMBDA], [(bool_, 'x')],
return_type=optional.template(bool_)):
yield None
yield LambdaFilter.new(lambda_)
@@ -137,6 +258,27 @@ def setup_binary_sensor_core_(binary_sensor_var, mqtt_var, config):
trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
automation.build_automation(trigger, NoArg, conf)
for conf in config.get(CONF_ON_MULTI_CLICK, []):
timings = []
for tim in conf[CONF_TIMING]:
timings.append(StructInitializer(
MultiClickTriggerEvent,
('state', tim[CONF_STATE]),
('min_length', tim[CONF_MIN_LENGTH]),
('max_length', tim.get(CONF_MAX_LENGTH, 4294967294)),
))
timings = ArrayInitializer(*timings, multiline=False)
rhs = App.register_component(binary_sensor_var.make_multi_click_trigger(timings))
trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
if CONF_INVALID_COOLDOWN in conf:
add(trigger.set_invalid_cooldown(conf[CONF_INVALID_COOLDOWN]))
automation.build_automation(trigger, NoArg, conf)
for conf in config.get(CONF_ON_STATE, []):
rhs = binary_sensor_var.make_state_trigger()
trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
automation.build_automation(trigger, bool_, conf)
setup_mqtt_component(mqtt_var, config)
@@ -145,14 +287,54 @@ def setup_binary_sensor(binary_sensor_obj, mqtt_obj, config):
has_side_effects=False)
mqtt_var = Pvariable(config[CONF_MQTT_ID], mqtt_obj,
has_side_effects=False)
add_job(setup_binary_sensor_core_, binary_sensor_var, mqtt_var, config)
CORE.add_job(setup_binary_sensor_core_, binary_sensor_var, mqtt_var, config)
def register_binary_sensor(var, config):
binary_sensor_var = Pvariable(config[CONF_ID], var, has_side_effects=True)
rhs = App.register_binary_sensor(binary_sensor_var)
mqtt_var = Pvariable(config[CONF_MQTT_ID], rhs, has_side_effects=True)
add_job(setup_binary_sensor_core_, binary_sensor_var, mqtt_var, config)
CORE.add_job(setup_binary_sensor_core_, binary_sensor_var, mqtt_var, config)
def core_to_hass_config(data, config):
ret = mqtt.build_hass_config(data, 'binary_sensor', config,
include_state=True, include_command=False)
if ret is None:
return None
if CONF_DEVICE_CLASS in config:
ret['device_class'] = config[CONF_DEVICE_CLASS]
return ret
BUILD_FLAGS = '-DUSE_BINARY_SENSOR'
CONF_BINARY_SENSOR_IS_ON = 'binary_sensor.is_on'
BINARY_SENSOR_IS_ON_CONDITION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(BinarySensor),
})
@CONDITION_REGISTRY.register(CONF_BINARY_SENSOR_IS_ON, BINARY_SENSOR_IS_ON_CONDITION_SCHEMA)
def binary_sensor_is_on_to_code(config, condition_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_binary_sensor_is_on_condition(template_arg)
type = BinarySensorCondition.template(arg_type)
yield Pvariable(condition_id, rhs, type=type)
CONF_BINARY_SENSOR_IS_OFF = 'binary_sensor.is_off'
BINARY_SENSOR_IS_OFF_CONDITION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(BinarySensor),
})
@CONDITION_REGISTRY.register(CONF_BINARY_SENSOR_IS_OFF, BINARY_SENSOR_IS_OFF_CONDITION_SCHEMA)
def binary_sensor_is_off_to_code(config, condition_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_binary_sensor_is_off_condition(template_arg)
type = BinarySensorCondition.template(arg_type)
yield Pvariable(condition_id, rhs, type=type)

View File

@@ -0,0 +1,36 @@
import voluptuous as vol
from esphomeyaml.components import binary_sensor, sensor
from esphomeyaml.components.apds9960 import APDS9960, CONF_APDS9960_ID
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_DIRECTION, CONF_NAME
from esphomeyaml.cpp_generator import get_variable
DEPENDENCIES = ['apds9960']
APDS9960GestureDirectionBinarySensor = sensor.sensor_ns.class_(
'APDS9960GestureDirectionBinarySensor', binary_sensor.BinarySensor)
DIRECTIONS = {
'UP': 'make_up_direction',
'DOWN': 'make_down_direction',
'LEFT': 'make_left_direction',
'RIGHT': 'make_right_direction',
}
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(APDS9960GestureDirectionBinarySensor),
vol.Required(CONF_DIRECTION): cv.one_of(*DIRECTIONS, upper=True),
cv.GenerateID(CONF_APDS9960_ID): cv.use_variable_id(APDS9960)
}))
def to_code(config):
for hub in get_variable(config[CONF_APDS9960_ID]):
yield
func = getattr(hub, DIRECTIONS[config[CONF_DIRECTION]])
rhs = func(config[CONF_NAME])
binary_sensor.register_binary_sensor(rhs, config)
def to_hass_config(data, config):
return binary_sensor.core_to_hass_config(data, config)

View File

@@ -0,0 +1,37 @@
import voluptuous as vol
from esphomeyaml.components import binary_sensor
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_BINARY_SENSORS, CONF_ID, CONF_LAMBDA
from esphomeyaml.cpp_generator import process_lambda, variable
from esphomeyaml.cpp_types import std_vector
CustomBinarySensorConstructor = binary_sensor.binary_sensor_ns.class_(
'CustomBinarySensorConstructor')
PLATFORM_SCHEMA = binary_sensor.PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(CustomBinarySensorConstructor),
vol.Required(CONF_LAMBDA): cv.lambda_,
vol.Required(CONF_BINARY_SENSORS):
cv.ensure_list(binary_sensor.BINARY_SENSOR_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(binary_sensor.BinarySensor),
})),
})
def to_code(config):
for template_ in process_lambda(config[CONF_LAMBDA], [],
return_type=std_vector.template(binary_sensor.BinarySensorPtr)):
yield
rhs = CustomBinarySensorConstructor(template_)
custom = variable(config[CONF_ID], rhs)
for i, sens in enumerate(config[CONF_BINARY_SENSORS]):
binary_sensor.register_binary_sensor(custom.get_binary_sensor(i), sens)
BUILD_FLAGS = '-DUSE_CUSTOM_BINARY_SENSOR'
def to_hass_config(data, config):
return [binary_sensor.core_to_hass_config(data, sens) for sens in config[CONF_BINARY_SENSORS]]

View File

@@ -1,15 +1,18 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml.components import binary_sensor
from esphomeyaml.components.esp32_ble_tracker import CONF_ESP32_BLE_ID, ESP32BLETracker, \
make_address_array
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_MAC_ADDRESS, CONF_NAME
from esphomeyaml.helpers import get_variable
from esphomeyaml.cpp_generator import get_variable
from esphomeyaml.cpp_types import esphomelib_ns
DEPENDENCIES = ['esp32_ble_tracker']
ESP32BLEPresenceDevice = esphomelib_ns.class_('ESP32BLEPresenceDevice', binary_sensor.BinarySensor)
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(ESP32BLEPresenceDevice),
vol.Required(CONF_MAC_ADDRESS): cv.mac_address,
cv.GenerateID(CONF_ESP32_BLE_ID): cv.use_variable_id(ESP32BLETracker)
}))
@@ -21,3 +24,7 @@ def to_code(config):
yield
rhs = hub.make_presence_sensor(config[CONF_NAME], make_address_array(config[CONF_MAC_ADDRESS]))
binary_sensor.register_binary_sensor(rhs, config)
def to_hass_config(data, config):
return binary_sensor.core_to_hass_config(data, config)

View File

@@ -4,7 +4,8 @@ import esphomeyaml.config_validation as cv
from esphomeyaml.components import binary_sensor
from esphomeyaml.components.esp32_touch import ESP32TouchComponent
from esphomeyaml.const import CONF_NAME, CONF_PIN, CONF_THRESHOLD, ESP_PLATFORM_ESP32
from esphomeyaml.helpers import get_variable, global_ns
from esphomeyaml.cpp_generator import get_variable
from esphomeyaml.cpp_types import global_ns
from esphomeyaml.pins import validate_gpio_pin
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
@@ -34,7 +35,11 @@ def validate_touch_pad(value):
return value
ESP32TouchBinarySensor = binary_sensor.binary_sensor_ns.class_('ESP32TouchBinarySensor',
binary_sensor.BinarySensor)
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(ESP32TouchBinarySensor),
vol.Required(CONF_PIN): validate_touch_pad,
vol.Required(CONF_THRESHOLD): cv.uint16_t,
cv.GenerateID(CONF_ESP32_TOUCH_ID): cv.use_variable_id(ESP32TouchComponent),
@@ -51,3 +56,7 @@ def to_code(config):
BUILD_FLAGS = '-DUSE_ESP32_TOUCH_BINARY_SENSOR'
def to_hass_config(data, config):
return binary_sensor.core_to_hass_config(data, config)

View File

@@ -4,14 +4,20 @@ import esphomeyaml.config_validation as cv
from esphomeyaml import pins
from esphomeyaml.components import binary_sensor
from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME, CONF_PIN
from esphomeyaml.helpers import App, gpio_input_pin_expression, variable, Application
from esphomeyaml.cpp_generator import variable
from esphomeyaml.cpp_helpers import gpio_input_pin_expression, setup_component
from esphomeyaml.cpp_types import Application, Component, App
MakeGPIOBinarySensor = Application.MakeGPIOBinarySensor
MakeGPIOBinarySensor = Application.struct('MakeGPIOBinarySensor')
GPIOBinarySensorComponent = binary_sensor.binary_sensor_ns.class_('GPIOBinarySensorComponent',
binary_sensor.BinarySensor,
Component)
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(GPIOBinarySensorComponent),
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeGPIOBinarySensor),
vol.Required(CONF_PIN): pins.gpio_input_pin_schema
}))
}).extend(cv.COMPONENT_SCHEMA.schema))
def to_code(config):
@@ -21,6 +27,11 @@ def to_code(config):
rhs = App.make_gpio_binary_sensor(config[CONF_NAME], pin)
gpio = variable(config[CONF_MAKE_ID], rhs)
binary_sensor.setup_binary_sensor(gpio.Pgpio, gpio.Pmqtt, config)
setup_component(gpio.Pgpio, config)
BUILD_FLAGS = '-DUSE_GPIO_BINARY_SENSOR'
def to_hass_config(data, config):
return binary_sensor.core_to_hass_config(data, config)

View File

@@ -1,16 +1,20 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml.components import binary_sensor
from esphomeyaml.components import binary_sensor, display
from esphomeyaml.components.display.nextion import Nextion
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_COMPONENT_ID, CONF_NAME, CONF_PAGE_ID
from esphomeyaml.helpers import get_variable
from esphomeyaml.cpp_generator import get_variable
DEPENDENCIES = ['display']
CONF_NEXTION_ID = 'nextion_id'
NextionTouchComponent = display.display_ns.class_('NextionTouchComponent',
binary_sensor.BinarySensor)
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(NextionTouchComponent),
vol.Required(CONF_PAGE_ID): cv.uint8_t,
vol.Required(CONF_COMPONENT_ID): cv.uint8_t,
cv.GenerateID(CONF_NEXTION_ID): cv.use_variable_id(Nextion)
@@ -18,9 +22,12 @@ PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend
def to_code(config):
hub = None
for hub in get_variable(config[CONF_NEXTION_ID]):
yield
rhs = hub.make_touch_component(config[CONF_NAME], config[CONF_PAGE_ID],
config[CONF_COMPONENT_ID])
binary_sensor.register_binary_sensor(rhs, config)
def to_hass_config(data, config):
return binary_sensor.core_to_hass_config(data, config)

View File

@@ -5,7 +5,7 @@ from esphomeyaml.components import binary_sensor
from esphomeyaml.components.pn532 import PN532Component
from esphomeyaml.const import CONF_NAME, CONF_UID
from esphomeyaml.core import HexInt
from esphomeyaml.helpers import ArrayInitializer, get_variable
from esphomeyaml.cpp_generator import get_variable, ArrayInitializer
DEPENDENCIES = ['pn532']
@@ -27,16 +27,23 @@ def validate_uid(value):
return value
PN532BinarySensor = binary_sensor.binary_sensor_ns.class_('PN532BinarySensor',
binary_sensor.BinarySensor)
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(PN532BinarySensor),
vol.Required(CONF_UID): validate_uid,
cv.GenerateID(CONF_PN532_ID): cv.use_variable_id(PN532Component)
}))
def to_code(config):
hub = None
for hub in get_variable(config[CONF_PN532_ID]):
yield
addr = [HexInt(int(x, 16)) for x in config[CONF_UID].split('-')]
rhs = hub.make_tag(config[CONF_NAME], ArrayInitializer(*addr, multiline=False))
binary_sensor.register_binary_sensor(rhs, config)
def to_hass_config(data, config):
return binary_sensor.core_to_hass_config(data, config)

View File

@@ -3,21 +3,28 @@ import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml.components import binary_sensor, rdm6300
from esphomeyaml.const import CONF_NAME, CONF_UID
from esphomeyaml.helpers import get_variable
from esphomeyaml.cpp_generator import get_variable
DEPENDENCIES = ['rdm6300']
CONF_RDM6300_ID = 'rdm6300_id'
RDM6300BinarySensor = binary_sensor.binary_sensor_ns.class_('RDM6300BinarySensor',
binary_sensor.BinarySensor)
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(RDM6300BinarySensor),
vol.Required(CONF_UID): cv.uint32_t,
cv.GenerateID(CONF_RDM6300_ID): cv.use_variable_id(rdm6300.RDM6300Component)
}))
def to_code(config):
hub = None
for hub in get_variable(config[CONF_RDM6300_ID]):
yield
rhs = hub.make_card(config[CONF_NAME], config[CONF_UID])
binary_sensor.register_binary_sensor(rhs, config)
def to_hass_config(data, config):
return binary_sensor.core_to_hass_config(data, config)

View File

@@ -9,42 +9,48 @@ from esphomeyaml.components.remote_transmitter import RC_SWITCH_RAW_SCHEMA, \
from esphomeyaml.const import CONF_ADDRESS, CONF_CHANNEL, CONF_CODE, CONF_COMMAND, CONF_DATA, \
CONF_DEVICE, CONF_FAMILY, CONF_GROUP, CONF_LG, CONF_NAME, CONF_NBITS, CONF_NEC, \
CONF_PANASONIC, CONF_PROTOCOL, CONF_RAW, CONF_RC_SWITCH_RAW, CONF_RC_SWITCH_TYPE_A, \
CONF_RC_SWITCH_TYPE_B, CONF_RC_SWITCH_TYPE_C, CONF_RC_SWITCH_TYPE_D, CONF_SONY, CONF_STATE
from esphomeyaml.helpers import ArrayInitializer, Pvariable, get_variable
CONF_RC_SWITCH_TYPE_B, CONF_RC_SWITCH_TYPE_C, CONF_RC_SWITCH_TYPE_D, CONF_SAMSUNG, CONF_SONY, \
CONF_STATE
from esphomeyaml.cpp_generator import ArrayInitializer, get_variable, Pvariable
DEPENDENCIES = ['remote_receiver']
REMOTE_KEYS = [CONF_NEC, CONF_LG, CONF_SONY, CONF_PANASONIC, CONF_RAW, CONF_RC_SWITCH_RAW,
CONF_RC_SWITCH_TYPE_A, CONF_RC_SWITCH_TYPE_B, CONF_RC_SWITCH_TYPE_C,
CONF_RC_SWITCH_TYPE_D]
REMOTE_KEYS = [CONF_NEC, CONF_LG, CONF_SONY, CONF_PANASONIC, CONF_SAMSUNG, CONF_RAW,
CONF_RC_SWITCH_RAW, CONF_RC_SWITCH_TYPE_A, CONF_RC_SWITCH_TYPE_B,
CONF_RC_SWITCH_TYPE_C, CONF_RC_SWITCH_TYPE_D]
CONF_REMOTE_RECEIVER_ID = 'remote_receiver_id'
CONF_RECEIVER_ID = 'receiver_id'
RemoteReceiver = remote_ns.RemoteReceiver
LGReceiver = remote_ns.LGReceiver
NECReceiver = remote_ns.NECReceiver
PanasonicReceiver = remote_ns.PanasonicReceiver
RawReceiver = remote_ns.RawReceiver
SonyReceiver = remote_ns.SonyReceiver
RCSwitchRawReceiver = remote_ns.RCSwitchRawReceiver
RCSwitchTypeAReceiver = remote_ns.RCSwitchTypeAReceiver
RCSwitchTypeBReceiver = remote_ns.RCSwitchTypeBReceiver
RCSwitchTypeCReceiver = remote_ns.RCSwitchTypeCReceiver
RCSwitchTypeDReceiver = remote_ns.RCSwitchTypeDReceiver
RemoteReceiver = remote_ns.class_('RemoteReceiver', binary_sensor.BinarySensor)
LGReceiver = remote_ns.class_('LGReceiver', RemoteReceiver)
NECReceiver = remote_ns.class_('NECReceiver', RemoteReceiver)
PanasonicReceiver = remote_ns.class_('PanasonicReceiver', RemoteReceiver)
RawReceiver = remote_ns.class_('RawReceiver', RemoteReceiver)
SamsungReceiver = remote_ns.class_('SamsungReceiver', RemoteReceiver)
SonyReceiver = remote_ns.class_('SonyReceiver', RemoteReceiver)
RCSwitchRawReceiver = remote_ns.class_('RCSwitchRawReceiver', RemoteReceiver)
RCSwitchTypeAReceiver = remote_ns.class_('RCSwitchTypeAReceiver', RCSwitchRawReceiver)
RCSwitchTypeBReceiver = remote_ns.class_('RCSwitchTypeBReceiver', RCSwitchRawReceiver)
RCSwitchTypeCReceiver = remote_ns.class_('RCSwitchTypeCReceiver', RCSwitchRawReceiver)
RCSwitchTypeDReceiver = remote_ns.class_('RCSwitchTypeDReceiver', RCSwitchRawReceiver)
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(RemoteReceiver),
vol.Optional(CONF_LG): vol.Schema({
vol.Required(CONF_DATA): cv.hex_uint32_t,
vol.Optional(CONF_NBITS, default=28): vol.All(vol.Coerce(int), cv.one_of(28, 32)),
vol.Optional(CONF_NBITS, default=28): cv.one_of(28, 32, int=True),
}),
vol.Optional(CONF_NEC): vol.Schema({
vol.Required(CONF_ADDRESS): cv.hex_uint16_t,
vol.Required(CONF_COMMAND): cv.hex_uint16_t,
}),
vol.Optional(CONF_SAMSUNG): vol.Schema({
vol.Required(CONF_DATA): cv.hex_uint32_t,
}),
vol.Optional(CONF_SONY): vol.Schema({
vol.Required(CONF_DATA): cv.hex_uint32_t,
vol.Optional(CONF_NBITS, default=12): vol.All(vol.Coerce(int), cv.one_of(12, 15, 20)),
vol.Optional(CONF_NBITS, default=12): cv.one_of(12, 15, 20, int=True),
}),
vol.Optional(CONF_PANASONIC): vol.Schema({
vol.Required(CONF_ADDRESS): cv.hex_uint16_t,
@@ -67,42 +73,43 @@ def receiver_base(full_config):
key, config = next((k, v) for k, v in full_config.items() if k in REMOTE_KEYS)
if key == CONF_LG:
return LGReceiver.new(name, config[CONF_DATA], config[CONF_NBITS])
elif key == CONF_NEC:
if key == CONF_NEC:
return NECReceiver.new(name, config[CONF_ADDRESS], config[CONF_COMMAND])
elif key == CONF_PANASONIC:
if key == CONF_PANASONIC:
return PanasonicReceiver.new(name, config[CONF_ADDRESS], config[CONF_COMMAND])
elif key == CONF_SONY:
if key == CONF_SAMSUNG:
return SamsungReceiver.new(name, config[CONF_DATA])
if key == CONF_SONY:
return SonyReceiver.new(name, config[CONF_DATA], config[CONF_NBITS])
elif key == CONF_RAW:
data = ArrayInitializer(*config[CONF_RAW], multiline=False)
if key == CONF_RAW:
data = ArrayInitializer(*config, multiline=False)
return RawReceiver.new(name, data)
elif key == CONF_RC_SWITCH_RAW:
if key == CONF_RC_SWITCH_RAW:
return RCSwitchRawReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
binary_code(config[CONF_CODE]), len(config[CONF_CODE]))
elif key == CONF_RC_SWITCH_TYPE_A:
if key == CONF_RC_SWITCH_TYPE_A:
return RCSwitchTypeAReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
binary_code(config[CONF_GROUP]),
binary_code(config[CONF_DEVICE]),
config[CONF_STATE])
elif key == CONF_RC_SWITCH_TYPE_B:
if key == CONF_RC_SWITCH_TYPE_B:
return RCSwitchTypeBReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
config[CONF_ADDRESS], config[CONF_CHANNEL],
config[CONF_STATE])
elif key == CONF_RC_SWITCH_TYPE_C:
if key == CONF_RC_SWITCH_TYPE_C:
return RCSwitchTypeCReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
ord(config[CONF_FAMILY][0]) - ord('a'),
config[CONF_GROUP], config[CONF_DEVICE],
config[CONF_STATE])
elif key == CONF_RC_SWITCH_TYPE_D:
if key == CONF_RC_SWITCH_TYPE_D:
return RCSwitchTypeDReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
ord(config[CONF_GROUP][0]) - ord('a'),
config[CONF_DEVICE], config[CONF_STATE])
else:
raise NotImplementedError("Unknown receiver type {}".format(config))
raise NotImplementedError("Unknown receiver type {}".format(config))
def to_code(config):
remote = None
for remote in get_variable(config[CONF_REMOTE_RECEIVER_ID]):
yield
rhs = receiver_base(config)
@@ -112,3 +119,7 @@ def to_code(config):
BUILD_FLAGS = '-DUSE_REMOTE_RECEIVER'
def to_hass_config(data, config):
return binary_sensor.core_to_hass_config(data, config)
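Note: a pattern repeated across these hunks is the replacement of chained validators such as vol.All(vol.Coerce(int), cv.one_of(28, 32)) with a single call like cv.one_of(28, 32, int=True). The keyword-argument form itself is not shown in this compare view, so the following is only a minimal sketch of what such a validator could look like; the flag names (int, upper, lower, space) are taken from the call sites in this diff, everything else is an assumption rather than the real esphomeyaml helper.

    import voluptuous as vol

    def one_of(*values, **kwargs):
        # Hypothetical reimplementation of cv.one_of with coercion flags, for
        # illustration only. int=True coerces to int before the membership test,
        # upper/lower normalize string case, space replaces spaces in the input.
        int_ = kwargs.pop('int', False)
        upper = kwargs.pop('upper', False)
        lower = kwargs.pop('lower', False)
        space = kwargs.pop('space', ' ')
        if kwargs:
            raise ValueError('unknown keyword arguments: {}'.format(kwargs))

        def validator(value):
            if int_:
                value = vol.Coerce(int)(value)
            if isinstance(value, str):
                if upper:
                    value = value.upper()
                if lower:
                    value = value.lower()
                value = value.replace(' ', space)
            if value not in values:
                raise vol.Invalid('expected one of {}, got {}'.format(values, value))
            return value

        return validator

    # For example, the Sony NBITS option above would accept "15" and return 15:
    assert one_of(12, 15, 20, int=True)('15') == 15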

View File

@@ -1,21 +1,31 @@
import esphomeyaml.config_validation as cv
from esphomeyaml.components import binary_sensor
from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME
from esphomeyaml.helpers import App, Application, variable
from esphomeyaml.cpp_generator import variable
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import Application, Component, App
DEPENDENCIES = ['mqtt']
MakeStatusBinarySensor = Application.MakeStatusBinarySensor
MakeStatusBinarySensor = Application.struct('MakeStatusBinarySensor')
StatusBinarySensor = binary_sensor.binary_sensor_ns.class_('StatusBinarySensor',
binary_sensor.BinarySensor,
Component)
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeStatusBinarySensor),
}))
cv.GenerateID(): cv.declare_variable_id(StatusBinarySensor),
}).extend(cv.COMPONENT_SCHEMA.schema))
def to_code(config):
rhs = App.make_status_binary_sensor(config[CONF_NAME])
status = variable(config[CONF_MAKE_ID], rhs)
binary_sensor.setup_binary_sensor(status.Pstatus, status.Pmqtt, config)
setup_component(status.Pstatus, config)
BUILD_FLAGS = '-DUSE_STATUS_BINARY_SENSOR'
def to_hass_config(data, config):
return binary_sensor.core_to_hass_config(data, config)

View File

@@ -1,22 +1,29 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml.components import binary_sensor
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_LAMBDA, CONF_MAKE_ID, CONF_NAME
from esphomeyaml.helpers import App, Application, process_lambda, variable, optional, bool_, add
from esphomeyaml.cpp_generator import variable, process_lambda, add
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import Application, Component, App, optional, bool_
MakeTemplateBinarySensor = Application.MakeTemplateBinarySensor
MakeTemplateBinarySensor = Application.struct('MakeTemplateBinarySensor')
TemplateBinarySensor = binary_sensor.binary_sensor_ns.class_('TemplateBinarySensor',
binary_sensor.BinarySensor,
Component)
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(TemplateBinarySensor),
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeTemplateBinarySensor),
vol.Required(CONF_LAMBDA): cv.lambda_,
}))
}).extend(cv.COMPONENT_SCHEMA.schema))
def to_code(config):
rhs = App.make_template_binary_sensor(config[CONF_NAME])
make = variable(config[CONF_MAKE_ID], rhs)
binary_sensor.setup_binary_sensor(make.Ptemplate_, make.Pmqtt, config)
setup_component(make.Ptemplate_, config)
template_ = None
for template_ in process_lambda(config[CONF_LAMBDA], [],
@@ -26,3 +33,7 @@ def to_code(config):
BUILD_FLAGS = '-DUSE_TEMPLATE_BINARY_SENSOR'
def to_hass_config(data, config):
return binary_sensor.core_to_hass_config(data, config)

View File

@@ -1,20 +1,30 @@
import voluptuous as vol
from esphomeyaml.automation import ACTION_REGISTRY, maybe_simple_id
from esphomeyaml.components import mqtt
from esphomeyaml.components.mqtt import setup_mqtt_component
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ID, CONF_MQTT_ID, CONF_INTERNAL
from esphomeyaml.helpers import Pvariable, esphomelib_ns, setup_mqtt_component, add
from esphomeyaml.const import CONF_ID, CONF_INTERNAL, CONF_MQTT_ID
from esphomeyaml.cpp_generator import Pvariable, add, get_variable
from esphomeyaml.cpp_types import Action, Nameable, esphomelib_ns
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
})
cover_ns = esphomelib_ns.namespace('cover')
Cover = cover_ns.Cover
MQTTCoverComponent = cover_ns.MQTTCoverComponent
CoverState = cover_ns.CoverState
Cover = cover_ns.class_('Cover', Nameable)
MQTTCoverComponent = cover_ns.class_('MQTTCoverComponent', mqtt.MQTTComponent)
CoverState = cover_ns.class_('CoverState')
COVER_OPEN = cover_ns.COVER_OPEN
COVER_CLOSED = cover_ns.COVER_CLOSED
OpenAction = cover_ns.OpenAction
CloseAction = cover_ns.CloseAction
StopAction = cover_ns.StopAction
# Actions
OpenAction = cover_ns.class_('OpenAction', Action)
CloseAction = cover_ns.class_('CloseAction', Action)
StopAction = cover_ns.class_('StopAction', Action)
COVER_SCHEMA = cv.MQTT_COMMAND_COMPONENT_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(Cover),
@@ -37,3 +47,54 @@ def setup_cover(cover_obj, mqtt_obj, config):
BUILD_FLAGS = '-DUSE_COVER'
CONF_COVER_OPEN = 'cover.open'
COVER_OPEN_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(Cover),
})
@ACTION_REGISTRY.register(CONF_COVER_OPEN, COVER_OPEN_ACTION_SCHEMA)
def cover_open_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_open_action(template_arg)
type = OpenAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
CONF_COVER_CLOSE = 'cover.close'
COVER_CLOSE_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(Cover),
})
@ACTION_REGISTRY.register(CONF_COVER_CLOSE, COVER_CLOSE_ACTION_SCHEMA)
def cover_close_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_close_action(template_arg)
type = CloseAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
CONF_COVER_STOP = 'cover.stop'
COVER_STOP_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(Cover),
})
@ACTION_REGISTRY.register(CONF_COVER_STOP, COVER_STOP_ACTION_SCHEMA)
def cover_stop_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_stop_action(template_arg)
type = StopAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
def core_to_hass_config(data, config):
ret = mqtt.build_hass_config(data, 'cover', config, include_state=True, include_command=True)
if ret is None:
return None
return ret
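Note: cover.open, cover.close and cover.stop above (and the fan and deep_sleep actions later in this diff) share the same generator-based code-generation pattern: the *_to_code function yields None while it waits for get_variable() to resolve the referenced object, then yields the generated Pvariable. The toy model below only sketches that control flow under the assumption that the framework keeps resuming these generators until a value is produced; VARIABLES, run_coroutine and the string output are illustrative stand-ins, not the esphomeyaml API.

    # Toy model of the suspend/resume flow used by the *_to_code generators above.
    VARIABLES = {}

    def get_variable(var_id):
        # Yield None until an earlier code-generation step has registered the id,
        # then yield the resolved object.
        while var_id not in VARIABLES:
            yield None
        yield VARIABLES[var_id]

    def cover_open_to_code(config):
        var = None
        for var in get_variable(config['id']):
            yield None  # propagate the suspension to whoever drives this generator
        yield '{}->make_open_action()'.format(var)

    def run_coroutine(gen):
        # Keep stepping the generator; the first non-None value is the result.
        for value in gen:
            if value is not None:
                return value
        return None

    VARIABLES['my_cover'] = 'id(my_cover)'
    print(run_coroutine(cover_open_to_code({'id': 'my_cover'})))  # id(my_cover)->make_open_action()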

View File

@@ -5,18 +5,22 @@ from esphomeyaml import automation
from esphomeyaml.components import cover
from esphomeyaml.const import CONF_CLOSE_ACTION, CONF_LAMBDA, CONF_MAKE_ID, CONF_NAME, \
CONF_OPEN_ACTION, CONF_STOP_ACTION, CONF_OPTIMISTIC
from esphomeyaml.helpers import App, Application, NoArg, add, process_lambda, variable, optional
from esphomeyaml.cpp_generator import variable, process_lambda, add
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import Application, App, optional, NoArg
MakeTemplateCover = Application.MakeTemplateCover
MakeTemplateCover = Application.struct('MakeTemplateCover')
TemplateCover = cover.cover_ns.class_('TemplateCover', cover.Cover)
PLATFORM_SCHEMA = cv.nameable(cover.COVER_PLATFORM_SCHEMA.extend({
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeTemplateCover),
cv.GenerateID(): cv.declare_variable_id(TemplateCover),
vol.Optional(CONF_LAMBDA): cv.lambda_,
vol.Optional(CONF_OPTIMISTIC): cv.boolean,
vol.Optional(CONF_OPEN_ACTION): automation.validate_automation(),
vol.Optional(CONF_CLOSE_ACTION): automation.validate_automation(),
vol.Optional(CONF_STOP_ACTION): automation.validate_automation(),
}), cv.has_at_least_one_key(CONF_LAMBDA, CONF_OPTIMISTIC))
vol.Optional(CONF_OPEN_ACTION): automation.validate_automation(single=True),
vol.Optional(CONF_CLOSE_ACTION): automation.validate_automation(single=True),
vol.Optional(CONF_STOP_ACTION): automation.validate_automation(single=True),
}).extend(cv.COMPONENT_SCHEMA.schema), cv.has_at_least_one_key(CONF_LAMBDA, CONF_OPTIMISTIC))
def to_code(config):
@@ -24,9 +28,9 @@ def to_code(config):
make = variable(config[CONF_MAKE_ID], rhs)
cover.setup_cover(make.Ptemplate_, make.Pmqtt, config)
setup_component(make.Ptemplate_, config)
if CONF_LAMBDA in config:
template_ = None
for template_ in process_lambda(config[CONF_LAMBDA], [],
return_type=optional.template(cover.CoverState)):
yield
@@ -45,3 +49,12 @@ def to_code(config):
BUILD_FLAGS = '-DUSE_TEMPLATE_COVER'
def to_hass_config(data, config):
ret = cover.core_to_hass_config(data, config)
if ret is None:
return None
if CONF_OPTIMISTIC in config:
ret['optimistic'] = config[CONF_OPTIMISTIC]
return ret

View File

@@ -0,0 +1,32 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ID, CONF_LAMBDA, CONF_COMPONENTS
from esphomeyaml.cpp_generator import process_lambda, variable
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import Component, ComponentPtr, esphomelib_ns, std_vector
CustomComponentConstructor = esphomelib_ns.class_('CustomComponentConstructor')
MULTI_CONF = True
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(CustomComponentConstructor),
vol.Required(CONF_LAMBDA): cv.lambda_,
vol.Optional(CONF_COMPONENTS): cv.ensure_list(vol.Schema({
cv.GenerateID(): cv.declare_variable_id(Component)
}).extend(cv.COMPONENT_SCHEMA.schema)),
})
def to_code(config):
for template_ in process_lambda(config[CONF_LAMBDA], [],
return_type=std_vector.template(ComponentPtr)):
yield
rhs = CustomComponentConstructor(template_)
custom = variable(config[CONF_ID], rhs)
for i, comp in enumerate(config.get(CONF_COMPONENTS, [])):
setup_component(custom.get_component(i), comp)
BUILD_FLAGS = '-DUSE_CUSTOM_COMPONENT'

View File

@@ -1,24 +1,27 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import pins
from esphomeyaml.components import sensor
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ID, CONF_PIN, CONF_UPDATE_INTERVAL
from esphomeyaml.helpers import App, Pvariable
from esphomeyaml.cpp_generator import Pvariable
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, PollingComponent
DallasComponent = sensor.sensor_ns.DallasComponent
DallasComponent = sensor.sensor_ns.class_('DallasComponent', PollingComponent)
MULTI_CONF = True
CONFIG_SCHEMA = vol.All(cv.ensure_list, [vol.Schema({
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(DallasComponent),
vol.Required(CONF_PIN): pins.input_output_pin,
vol.Required(CONF_PIN): pins.input_pullup_pin,
vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
})])
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
for conf in config:
rhs = App.make_dallas_component(conf[CONF_PIN], conf.get(CONF_UPDATE_INTERVAL))
Pvariable(conf[CONF_ID], rhs)
rhs = App.make_dallas_component(config[CONF_PIN], config.get(CONF_UPDATE_INTERVAL))
var = Pvariable(config[CONF_ID], rhs)
setup_component(var, config)
BUILD_FLAGS = '-DUSE_DALLAS_SENSOR'

View File

@@ -1,6 +1,7 @@
import voluptuous as vol
from esphomeyaml.helpers import App, add
from esphomeyaml.cpp_generator import add
from esphomeyaml.cpp_types import App
DEPENDENCIES = ['logger']

View File

@@ -1,9 +1,12 @@
import voluptuous as vol
from esphomeyaml import config_validation as cv, pins
from esphomeyaml.const import CONF_ID, CONF_NUMBER, CONF_RUN_CYCLES, CONF_RUN_DURATION, \
CONF_SLEEP_DURATION, CONF_WAKEUP_PIN
from esphomeyaml.helpers import App, Pvariable, add, gpio_input_pin_expression, esphomelib_ns
from esphomeyaml.automation import ACTION_REGISTRY, maybe_simple_id
from esphomeyaml.const import CONF_ID, CONF_MODE, CONF_NUMBER, CONF_PINS, CONF_RUN_CYCLES, \
CONF_RUN_DURATION, CONF_SLEEP_DURATION, CONF_WAKEUP_PIN
from esphomeyaml.cpp_generator import Pvariable, StructInitializer, add, get_variable
from esphomeyaml.cpp_helpers import gpio_input_pin_expression, setup_component
from esphomeyaml.cpp_types import Action, App, Component, esphomelib_ns, global_ns
def validate_pin_number(value):
@@ -14,28 +17,41 @@ def validate_pin_number(value):
return value
DeepSleepComponent = esphomelib_ns.DeepSleepComponent
EnterDeepSleepAction = esphomelib_ns.EnterDeepSleepAction
PreventDeepSleepAction = esphomelib_ns.PreventDeepSleepAction
DeepSleepComponent = esphomelib_ns.class_('DeepSleepComponent', Component)
EnterDeepSleepAction = esphomelib_ns.class_('EnterDeepSleepAction', Action)
PreventDeepSleepAction = esphomelib_ns.class_('PreventDeepSleepAction', Action)
WakeupPinMode = esphomelib_ns.enum('WakeupPinMode')
WAKEUP_PIN_MODES = {
'IGNORE': esphomelib_ns.WAKEUP_PIN_MODE_IGNORE,
'KEEP_AWAKE': esphomelib_ns.WAKEUP_PIN_MODE_KEEP_AWAKE,
'INVERT_WAKEUP': esphomelib_ns.WAKEUP_PIN_MODE_INVERT_WAKEUP,
'IGNORE': WakeupPinMode.WAKEUP_PIN_MODE_IGNORE,
'KEEP_AWAKE': WakeupPinMode.WAKEUP_PIN_MODE_KEEP_AWAKE,
'INVERT_WAKEUP': WakeupPinMode.WAKEUP_PIN_MODE_INVERT_WAKEUP,
}
esp_sleep_ext1_wakeup_mode_t = global_ns.enum('esp_sleep_ext1_wakeup_mode_t')
Ext1Wakeup = esphomelib_ns.struct('Ext1Wakeup')
EXT1_WAKEUP_MODES = {
'ALL_LOW': esp_sleep_ext1_wakeup_mode_t.ESP_EXT1_WAKEUP_ALL_LOW,
'ANY_HIGH': esp_sleep_ext1_wakeup_mode_t.ESP_EXT1_WAKEUP_ANY_HIGH,
}
CONF_WAKEUP_PIN_MODE = 'wakeup_pin_mode'
CONF_ESP32_EXT1_WAKEUP = 'esp32_ext1_wakeup'
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(DeepSleepComponent),
vol.Optional(CONF_SLEEP_DURATION): cv.positive_time_period_milliseconds,
vol.Optional(CONF_WAKEUP_PIN): vol.All(cv.only_on_esp32, pins.internal_gpio_input_pin_schema,
validate_pin_number),
vol.Optional(CONF_WAKEUP_PIN_MODE): vol.All(cv.only_on_esp32, vol.Upper,
cv.one_of(*WAKEUP_PIN_MODES)),
vol.Optional(CONF_WAKEUP_PIN_MODE): vol.All(cv.only_on_esp32,
cv.one_of(*WAKEUP_PIN_MODES), upper=True),
vol.Optional(CONF_ESP32_EXT1_WAKEUP): vol.All(cv.only_on_esp32, vol.Schema({
vol.Required(CONF_PINS): cv.ensure_list(pins.shorthand_input_pin, validate_pin_number),
vol.Required(CONF_MODE): cv.one_of(*EXT1_WAKEUP_MODES, upper=True),
})),
vol.Optional(CONF_RUN_CYCLES): cv.positive_int,
vol.Optional(CONF_RUN_DURATION): cv.positive_time_period_milliseconds,
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
@@ -44,7 +60,6 @@ def to_code(config):
if CONF_SLEEP_DURATION in config:
add(deep_sleep.set_sleep_duration(config[CONF_SLEEP_DURATION]))
if CONF_WAKEUP_PIN in config:
pin = None
for pin in gpio_input_pin_expression(config[CONF_WAKEUP_PIN]):
yield
add(deep_sleep.set_wakeup_pin(pin))
@@ -55,5 +70,48 @@ def to_code(config):
if CONF_RUN_DURATION in config:
add(deep_sleep.set_run_duration(config[CONF_RUN_DURATION]))
if CONF_ESP32_EXT1_WAKEUP in config:
conf = config[CONF_ESP32_EXT1_WAKEUP]
mask = 0
for pin in conf[CONF_PINS]:
mask |= 1 << pin[CONF_NUMBER]
struct = StructInitializer(
Ext1Wakeup,
('mask', mask),
('wakeup_mode', EXT1_WAKEUP_MODES[conf[CONF_MODE]])
)
add(deep_sleep.set_ext1_wakeup(struct))
setup_component(deep_sleep, config)
BUILD_FLAGS = '-DUSE_DEEP_SLEEP'
CONF_DEEP_SLEEP_ENTER = 'deep_sleep.enter'
DEEP_SLEEP_ENTER_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(DeepSleepComponent),
})
@ACTION_REGISTRY.register(CONF_DEEP_SLEEP_ENTER, DEEP_SLEEP_ENTER_ACTION_SCHEMA)
def deep_sleep_enter_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_enter_deep_sleep_action(template_arg)
type = EnterDeepSleepAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
CONF_DEEP_SLEEP_PREVENT = 'deep_sleep.prevent'
DEEP_SLEEP_PREVENT_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(DeepSleepComponent),
})
@ACTION_REGISTRY.register(CONF_DEEP_SLEEP_PREVENT, DEEP_SLEEP_PREVENT_ACTION_SCHEMA)
def deep_sleep_prevent_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_prevent_deep_sleep_action(template_arg)
type = PreventDeepSleepAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
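Note: the new esp32_ext1_wakeup handling above ORs the configured pin numbers into a single bit mask (mask |= 1 << number) before handing it to set_ext1_wakeup() inside an Ext1Wakeup struct. A quick standalone illustration of that arithmetic, with made-up pin numbers:

    # Hypothetical config: wakeup pins GPIO2, GPIO4 and GPIO15.
    pin_numbers = [2, 4, 15]
    mask = 0
    for number in pin_numbers:
        mask |= 1 << number   # set the bit matching the GPIO number
    print(hex(mask))          # 0x8014 -> becomes the 'mask' field of the Ext1Wakeup struct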

View File

@@ -3,14 +3,16 @@ import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_LAMBDA, CONF_ROTATION, CONF_UPDATE_INTERVAL
from esphomeyaml.helpers import add, add_job, esphomelib_ns
from esphomeyaml.core import CORE
from esphomeyaml.cpp_generator import add
from esphomeyaml.cpp_types import esphomelib_ns
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
})
display_ns = esphomelib_ns.namespace('display')
DisplayBuffer = display_ns.DisplayBuffer
DisplayBuffer = display_ns.class_('DisplayBuffer')
DisplayBufferRef = DisplayBuffer.operator('ref')
DISPLAY_ROTATIONS = {
@@ -50,7 +52,7 @@ def setup_display_core_(display_var, config):
def setup_display(display_var, config):
add_job(setup_display_core_, display_var, config)
CORE.add_job(setup_display_core_, display_var, config)
BUILD_FLAGS = '-DUSE_DISPLAY'

View File

@@ -1,15 +1,17 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import pins
from esphomeyaml.components import display
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_DATA_PINS, CONF_DIMENSIONS, CONF_ENABLE_PIN, CONF_ID, \
CONF_LAMBDA, CONF_RS_PIN, CONF_RW_PIN
from esphomeyaml.helpers import App, Pvariable, add, gpio_output_pin_expression, process_lambda
from esphomeyaml.cpp_generator import Pvariable, add, process_lambda
from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
from esphomeyaml.cpp_types import App, PollingComponent, void
GPIOLCDDisplay = display.display_ns.GPIOLCDDisplay
LCDDisplay = display.display_ns.LCDDisplay
LCDDisplay = display.display_ns.class_('LCDDisplay', PollingComponent)
LCDDisplayRef = LCDDisplay.operator('ref')
GPIOLCDDisplay = display.display_ns.class_('GPIOLCDDisplay', LCDDisplay)
def validate_lcd_dimensions(value):
@@ -36,7 +38,7 @@ PLATFORM_SCHEMA = display.BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
vol.Required(CONF_ENABLE_PIN): pins.gpio_output_pin_schema,
vol.Required(CONF_RS_PIN): pins.gpio_output_pin_schema,
vol.Optional(CONF_RW_PIN): pins.gpio_output_pin_schema,
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
@@ -62,11 +64,13 @@ def to_code(config):
add(lcd.set_rw_pin(rw))
if CONF_LAMBDA in config:
for lambda_ in process_lambda(config[CONF_LAMBDA], [(LCDDisplayRef, 'it')]):
for lambda_ in process_lambda(config[CONF_LAMBDA], [(LCDDisplayRef, 'it')],
return_type=void):
yield
add(lcd.set_writer(lambda_))
display.setup_display(lcd, config)
setup_component(lcd, config)
BUILD_FLAGS = '-DUSE_LCD_DISPLAY'

View File

@@ -1,20 +1,23 @@
import voluptuous as vol
from esphomeyaml.components import display, i2c
from esphomeyaml.components.display.lcd_gpio import LCDDisplay, LCDDisplayRef, \
validate_lcd_dimensions
import esphomeyaml.config_validation as cv
from esphomeyaml.components import display
from esphomeyaml.components.display.lcd_gpio import LCDDisplayRef, validate_lcd_dimensions
from esphomeyaml.const import CONF_ADDRESS, CONF_DIMENSIONS, CONF_ID, CONF_LAMBDA
from esphomeyaml.helpers import App, Pvariable, add, process_lambda
from esphomeyaml.cpp_generator import Pvariable, add, process_lambda
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, void
DEPENDENCIES = ['i2c']
PCF8574LCDDisplay = display.display_ns.PCF8574LCDDisplay
PCF8574LCDDisplay = display.display_ns.class_('PCF8574LCDDisplay', LCDDisplay, i2c.I2CDevice)
PLATFORM_SCHEMA = display.BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(PCF8574LCDDisplay),
vol.Required(CONF_DIMENSIONS): validate_lcd_dimensions,
vol.Optional(CONF_ADDRESS): cv.i2c_address,
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
@@ -25,11 +28,13 @@ def to_code(config):
add(lcd.set_address(config[CONF_ADDRESS]))
if CONF_LAMBDA in config:
for lambda_ in process_lambda(config[CONF_LAMBDA], [(LCDDisplayRef, 'it')]):
for lambda_ in process_lambda(config[CONF_LAMBDA], [(LCDDisplayRef, 'it')],
return_type=void):
yield
add(lcd.set_writer(lambda_))
display.setup_display(lcd, config)
setup_component(lcd, config)
BUILD_FLAGS = ['-DUSE_LCD_DISPLAY', '-DUSE_LCD_DISPLAY_PCF8574']

View File

@@ -1,17 +1,18 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import pins
from esphomeyaml.components import display
from esphomeyaml.components import display, spi
from esphomeyaml.components.spi import SPIComponent
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_CS_PIN, CONF_ID, CONF_INTENSITY, CONF_LAMBDA, CONF_NUM_CHIPS, \
CONF_SPI_ID
from esphomeyaml.helpers import App, Pvariable, add, get_variable, gpio_output_pin_expression, \
process_lambda
from esphomeyaml.cpp_generator import Pvariable, add, get_variable, process_lambda
from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
from esphomeyaml.cpp_types import App, PollingComponent, void
DEPENDENCIES = ['spi']
MAX7219Component = display.display_ns.MAX7219Component
MAX7219Component = display.display_ns.class_('MAX7219Component', PollingComponent, spi.SPIDevice)
MAX7219ComponentRef = MAX7219Component.operator('ref')
PLATFORM_SCHEMA = display.BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
@@ -21,15 +22,15 @@ PLATFORM_SCHEMA = display.BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
vol.Optional(CONF_NUM_CHIPS): vol.All(cv.uint8_t, vol.Range(min=1)),
vol.Optional(CONF_INTENSITY): vol.All(cv.uint8_t, vol.Range(min=0, max=15)),
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
for spi in get_variable(config[CONF_SPI_ID]):
for spi_ in get_variable(config[CONF_SPI_ID]):
yield
for cs in gpio_output_pin_expression(config[CONF_CS_PIN]):
yield
rhs = App.make_max7219(spi, cs)
rhs = App.make_max7219(spi_, cs)
max7219 = Pvariable(config[CONF_ID], rhs)
if CONF_NUM_CHIPS in config:
@@ -38,11 +39,13 @@ def to_code(config):
add(max7219.set_intensity(config[CONF_INTENSITY]))
if CONF_LAMBDA in config:
for lambda_ in process_lambda(config[CONF_LAMBDA], [(MAX7219ComponentRef, 'it')]):
for lambda_ in process_lambda(config[CONF_LAMBDA], [(MAX7219ComponentRef, 'it')],
return_type=void):
yield
add(max7219.set_writer(lambda_))
display.setup_display(max7219, config)
setup_component(max7219, config)
BUILD_FLAGS = '-DUSE_MAX7219'

View File

@@ -1,32 +1,36 @@
import esphomeyaml.config_validation as cv
from esphomeyaml.components import display
from esphomeyaml.components import display, uart
from esphomeyaml.components.uart import UARTComponent
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ID, CONF_LAMBDA, CONF_UART_ID
from esphomeyaml.helpers import App, Pvariable, add, get_variable, process_lambda
from esphomeyaml.cpp_generator import Pvariable, add, get_variable, process_lambda
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, PollingComponent, void
DEPENDENCIES = ['uart']
Nextion = display.display_ns.Nextion
Nextion = display.display_ns.class_('Nextion', PollingComponent, uart.UARTDevice)
NextionRef = Nextion.operator('ref')
PLATFORM_SCHEMA = display.BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(Nextion),
cv.GenerateID(CONF_UART_ID): cv.use_variable_id(UARTComponent),
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
for uart in get_variable(config[CONF_UART_ID]):
for uart_ in get_variable(config[CONF_UART_ID]):
yield
rhs = App.make_nextion(uart)
rhs = App.make_nextion(uart_)
nextion = Pvariable(config[CONF_ID], rhs)
if CONF_LAMBDA in config:
for lambda_ in process_lambda(config[CONF_LAMBDA], [(NextionRef, 'it')]):
for lambda_ in process_lambda(config[CONF_LAMBDA], [(NextionRef, 'it')],
return_type=void):
yield
add(nextion.set_writer(lambda_))
display.setup_display(nextion, config)
setup_component(nextion, config)
BUILD_FLAGS = '-DUSE_NEXTION'

View File

@@ -1,17 +1,18 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import pins
from esphomeyaml.components import display
from esphomeyaml.components.display import ssd1306_spi
from esphomeyaml.const import CONF_ADDRESS, CONF_EXTERNAL_VCC, CONF_ID, \
CONF_MODEL, CONF_RESET_PIN, CONF_LAMBDA
from esphomeyaml.helpers import App, Pvariable, add, \
gpio_output_pin_expression, process_lambda
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ADDRESS, CONF_EXTERNAL_VCC, CONF_ID, CONF_LAMBDA, CONF_MODEL, \
CONF_RESET_PIN
from esphomeyaml.cpp_generator import Pvariable, add, process_lambda
from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
from esphomeyaml.cpp_types import App, void
DEPENDENCIES = ['i2c']
I2CSSD1306 = display.display_ns.I2CSSD1306
I2CSSD1306 = display.display_ns.class_('I2CSSD1306', ssd1306_spi.SSD1306)
PLATFORM_SCHEMA = display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(I2CSSD1306),
@@ -19,7 +20,7 @@ PLATFORM_SCHEMA = display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
vol.Optional(CONF_RESET_PIN): pins.gpio_output_pin_schema,
vol.Optional(CONF_EXTERNAL_VCC): cv.boolean,
vol.Optional(CONF_ADDRESS): cv.i2c_address,
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
@@ -36,11 +37,12 @@ def to_code(config):
add(ssd.set_address(config[CONF_ADDRESS]))
if CONF_LAMBDA in config:
for lambda_ in process_lambda(config[CONF_LAMBDA],
[(display.DisplayBufferRef, 'it')]):
[(display.DisplayBufferRef, 'it')], return_type=void):
yield
add(ssd.set_writer(lambda_))
display.setup_display(ssd, config)
setup_component(ssd, config)
BUILD_FLAGS = '-DUSE_SSD1306'

View File

@@ -1,31 +1,33 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import pins
from esphomeyaml.components import display
from esphomeyaml.components import display, spi
from esphomeyaml.components.spi import SPIComponent
from esphomeyaml.const import CONF_CS_PIN, CONF_DC_PIN, CONF_EXTERNAL_VCC, \
CONF_ID, CONF_MODEL, \
CONF_RESET_PIN, CONF_SPI_ID, CONF_LAMBDA
from esphomeyaml.helpers import App, Pvariable, add, get_variable, \
gpio_output_pin_expression, process_lambda
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_CS_PIN, CONF_DC_PIN, CONF_EXTERNAL_VCC, CONF_ID, CONF_LAMBDA, \
CONF_MODEL, CONF_RESET_PIN, CONF_SPI_ID
from esphomeyaml.cpp_generator import Pvariable, add, get_variable, process_lambda
from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
from esphomeyaml.cpp_types import App, PollingComponent, void
DEPENDENCIES = ['spi']
SPISSD1306 = display.display_ns.SPISSD1306
SSD1306 = display.display_ns.class_('SSD1306', PollingComponent, display.DisplayBuffer)
SPISSD1306 = display.display_ns.class_('SPISSD1306', SSD1306, spi.SPIDevice)
SSD1306Model = display.display_ns.enum('SSD1306Model')
MODELS = {
'SSD1306_128X32': display.display_ns.SSD1306_MODEL_128_32,
'SSD1306_128X64': display.display_ns.SSD1306_MODEL_128_64,
'SSD1306_96X16': display.display_ns.SSD1306_MODEL_96_16,
'SSD1306_64X48': display.display_ns.SSD1306_MODEL_64_48,
'SH1106_128X32': display.display_ns.SH1106_MODEL_128_32,
'SH1106_128X64': display.display_ns.SH1106_MODEL_128_64,
'SH1106_96X16': display.display_ns.SH1106_MODEL_96_16,
'SH1106_64X48': display.display_ns.SH1106_MODEL_64_48,
'SSD1306_128X32': SSD1306Model.SSD1306_MODEL_128_32,
'SSD1306_128X64': SSD1306Model.SSD1306_MODEL_128_64,
'SSD1306_96X16': SSD1306Model.SSD1306_MODEL_96_16,
'SSD1306_64X48': SSD1306Model.SSD1306_MODEL_64_48,
'SH1106_128X32': SSD1306Model.SH1106_MODEL_128_32,
'SH1106_128X64': SSD1306Model.SH1106_MODEL_128_64,
'SH1106_96X16': SSD1306Model.SH1106_MODEL_96_16,
'SH1106_64X48': SSD1306Model.SH1106_MODEL_64_48,
}
SSD1306_MODEL = vol.All(vol.Upper, vol.Replace(' ', '_'), cv.one_of(*MODELS))
SSD1306_MODEL = cv.one_of(*MODELS, upper=True, space="_")
PLATFORM_SCHEMA = display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(SPISSD1306),
@@ -35,18 +37,18 @@ PLATFORM_SCHEMA = display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
vol.Required(CONF_MODEL): SSD1306_MODEL,
vol.Optional(CONF_RESET_PIN): pins.gpio_output_pin_schema,
vol.Optional(CONF_EXTERNAL_VCC): cv.boolean,
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
for spi in get_variable(config[CONF_SPI_ID]):
for spi_ in get_variable(config[CONF_SPI_ID]):
yield
for cs in gpio_output_pin_expression(config[CONF_CS_PIN]):
yield
for dc in gpio_output_pin_expression(config[CONF_DC_PIN]):
yield
rhs = App.make_spi_ssd1306(spi, cs, dc)
rhs = App.make_spi_ssd1306(spi_, cs, dc)
ssd = Pvariable(config[CONF_ID], rhs)
add(ssd.set_model(MODELS[config[CONF_MODEL]]))
@@ -58,11 +60,12 @@ def to_code(config):
add(ssd.set_external_vcc(config[CONF_EXTERNAL_VCC]))
if CONF_LAMBDA in config:
for lambda_ in process_lambda(config[CONF_LAMBDA],
[(display.DisplayBufferRef, 'it')]):
[(display.DisplayBufferRef, 'it')], return_type=void):
yield
add(ssd.set_writer(lambda_))
display.setup_display(ssd, config)
setup_component(ssd, config)
BUILD_FLAGS = '-DUSE_SSD1306'

View File

@@ -2,25 +2,32 @@ import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import pins
from esphomeyaml.components import display
from esphomeyaml.components import display, spi
from esphomeyaml.components.spi import SPIComponent
from esphomeyaml.const import CONF_BUSY_PIN, CONF_CS_PIN, CONF_DC_PIN, CONF_FULL_UPDATE_EVERY, \
CONF_ID, CONF_LAMBDA, CONF_MODEL, CONF_RESET_PIN, CONF_SPI_ID
from esphomeyaml.helpers import App, Pvariable, add, get_variable, gpio_input_pin_expression, \
gpio_output_pin_expression, process_lambda
from esphomeyaml.cpp_generator import get_variable, Pvariable, process_lambda, add
from esphomeyaml.cpp_helpers import gpio_output_pin_expression, gpio_input_pin_expression, \
setup_component
from esphomeyaml.cpp_types import PollingComponent, App, void
DEPENDENCIES = ['spi']
WaveshareEPaperTypeA = display.display_ns.WaveshareEPaperTypeA
WaveshareEPaper = display.display_ns.WaveshareEPaper
WaveshareEPaper = display.display_ns.class_('WaveshareEPaper',
PollingComponent, spi.SPIDevice, display.DisplayBuffer)
WaveshareEPaperTypeAModel = display.display_ns.enum('WaveshareEPaperTypeAModel')
WaveshareEPaperTypeBModel = display.display_ns.enum('WaveshareEPaperTypeBModel')
MODELS = {
'1.54in': ('a', display.display_ns.WAVESHARE_EPAPER_1_54_IN),
'2.13in': ('a', display.display_ns.WAVESHARE_EPAPER_2_13_IN),
'2.90in': ('a', display.display_ns.WAVESHARE_EPAPER_2_9_IN),
'2.70in': ('b', display.display_ns.WAVESHARE_EPAPER_2_7_IN),
'4.20in': ('b', display.display_ns.WAVESHARE_EPAPER_4_2_IN),
'7.50in': ('b', display.display_ns.WAVESHARE_EPAPER_7_5_IN),
'1.54in': ('a', WaveshareEPaperTypeAModel.WAVESHARE_EPAPER_1_54_IN),
'2.13in': ('a', WaveshareEPaperTypeAModel.WAVESHARE_EPAPER_2_13_IN),
'2.90in': ('a', WaveshareEPaperTypeAModel.WAVESHARE_EPAPER_2_9_IN),
'2.70in': ('b', WaveshareEPaperTypeBModel.WAVESHARE_EPAPER_2_7_IN),
'4.20in': ('b', WaveshareEPaperTypeBModel.WAVESHARE_EPAPER_4_2_IN),
'7.50in': ('b', WaveshareEPaperTypeBModel.WAVESHARE_EPAPER_7_5_IN),
}
@@ -34,19 +41,19 @@ def validate_full_update_every_only_type_a(value):
PLATFORM_SCHEMA = vol.All(display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(None),
cv.GenerateID(): cv.declare_variable_id(WaveshareEPaper),
cv.GenerateID(CONF_SPI_ID): cv.use_variable_id(SPIComponent),
vol.Required(CONF_CS_PIN): pins.gpio_output_pin_schema,
vol.Required(CONF_DC_PIN): pins.gpio_output_pin_schema,
vol.Required(CONF_MODEL): vol.All(vol.Lower, cv.one_of(*MODELS)),
vol.Required(CONF_MODEL): cv.one_of(*MODELS, lower=True),
vol.Optional(CONF_RESET_PIN): pins.gpio_output_pin_schema,
vol.Optional(CONF_BUSY_PIN): pins.gpio_input_pin_schema,
vol.Optional(CONF_FULL_UPDATE_EVERY): cv.uint32_t,
}), validate_full_update_every_only_type_a)
}).extend(cv.COMPONENT_SCHEMA.schema), validate_full_update_every_only_type_a)
def to_code(config):
for spi in get_variable(config[CONF_SPI_ID]):
for spi_ in get_variable(config[CONF_SPI_ID]):
yield
for cs in gpio_output_pin_expression(config[CONF_CS_PIN]):
yield
@@ -55,16 +62,17 @@ def to_code(config):
model_type, model = MODELS[config[CONF_MODEL]]
if model_type == 'a':
rhs = App.make_waveshare_epaper_type_a(spi, cs, dc, model)
rhs = App.make_waveshare_epaper_type_a(spi_, cs, dc, model)
epaper = Pvariable(config[CONF_ID], rhs, type=WaveshareEPaperTypeA)
elif model_type == 'b':
rhs = App.make_waveshare_epaper_type_b(spi, cs, dc, model)
rhs = App.make_waveshare_epaper_type_b(spi_, cs, dc, model)
epaper = Pvariable(config[CONF_ID], rhs, type=WaveshareEPaper)
else:
raise NotImplementedError()
if CONF_LAMBDA in config:
for lambda_ in process_lambda(config[CONF_LAMBDA], [(display.DisplayBufferRef, 'it')]):
for lambda_ in process_lambda(config[CONF_LAMBDA], [(display.DisplayBufferRef, 'it')],
return_type=void):
yield
add(epaper.set_writer(lambda_))
if CONF_RESET_PIN in config:
@@ -79,6 +87,7 @@ def to_code(config):
add(epaper.set_full_update_every(config[CONF_FULL_UPDATE_EVERY]))
display.setup_display(epaper, config)
setup_component(epaper, config)
BUILD_FLAGS = '-DUSE_WAVESHARE_EPAPER'

View File

@@ -1,5 +0,0 @@
from esphomeyaml import config_validation as cv
CONFIG_SCHEMA = cv.invalid("The 'esp32_ble' component has been renamed to the 'esp32_ble_tracker' "
"component in order to avoid confusion with the new 'esp32_ble_beacon' "
"component.")

View File

@@ -1,29 +1,31 @@
import voluptuous as vol
from esphomeyaml import config_validation as cv
from esphomeyaml.const import CONF_ID, CONF_SCAN_INTERVAL, ESP_PLATFORM_ESP32, CONF_UUID, CONF_TYPE
from esphomeyaml.helpers import App, Pvariable, add, esphomelib_ns, RawExpression, ArrayInitializer
from esphomeyaml.const import CONF_ID, CONF_SCAN_INTERVAL, CONF_TYPE, CONF_UUID, ESP_PLATFORM_ESP32
from esphomeyaml.cpp_generator import ArrayInitializer, Pvariable, RawExpression, add
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, Component, esphomelib_ns
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
ESP32BLEBeacon = esphomelib_ns.ESP32BLEBeacon
ESP32BLEBeacon = esphomelib_ns.class_('ESP32BLEBeacon', Component)
CONF_MAJOR = 'major'
CONF_MINOR = 'minor'
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(ESP32BLEBeacon),
vol.Required(CONF_TYPE): vol.All(vol.Upper, cv.one_of('IBEACON')),
vol.Required(CONF_TYPE): cv.one_of('IBEACON', upper=True),
vol.Required(CONF_UUID): cv.uuid,
vol.Optional(CONF_MAJOR): cv.uint16_t,
vol.Optional(CONF_MINOR): cv.uint16_t,
vol.Optional(CONF_SCAN_INTERVAL): cv.positive_time_period_milliseconds,
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
uuid = config[CONF_UUID].hex
uuid_arr = [RawExpression('0x{}'.format(uuid[i:i+2])) for i in range(0, len(uuid), 2)]
uuid_arr = [RawExpression('0x{}'.format(uuid[i:i + 2])) for i in range(0, len(uuid), 2)]
rhs = App.make_esp32_ble_beacon(ArrayInitializer(*uuid_arr, multiline=False))
ble = Pvariable(config[CONF_ID], rhs)
if CONF_MAJOR in config:
@@ -31,5 +33,7 @@ def to_code(config):
if CONF_MINOR in config:
add(ble.set_minor(config[CONF_MINOR]))
setup_component(ble, config)
BUILD_FLAGS = '-DUSE_ESP32_BLE_BEACON'

View File

@@ -1,19 +1,27 @@
import voluptuous as vol
from esphomeyaml import config_validation as cv
from esphomeyaml.components import sensor
from esphomeyaml.const import CONF_ID, CONF_SCAN_INTERVAL, ESP_PLATFORM_ESP32
from esphomeyaml.core import HexInt
from esphomeyaml.helpers import App, Pvariable, add, esphomelib_ns, ArrayInitializer
from esphomeyaml.cpp_generator import ArrayInitializer, Pvariable, add
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, Component, esphomelib_ns
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
CONF_ESP32_BLE_ID = 'esp32_ble_id'
ESP32BLETracker = esphomelib_ns.ESP32BLETracker
ESP32BLETracker = esphomelib_ns.class_('ESP32BLETracker', Component)
XiaomiSensor = esphomelib_ns.class_('XiaomiSensor', sensor.Sensor)
XiaomiDevice = esphomelib_ns.class_('XiaomiDevice')
XIAOMI_SENSOR_SCHEMA = sensor.SENSOR_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(XiaomiSensor)
})
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(ESP32BLETracker),
vol.Optional(CONF_SCAN_INTERVAL): cv.positive_time_period_milliseconds,
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def make_address_array(address):
@@ -27,5 +35,7 @@ def to_code(config):
if CONF_SCAN_INTERVAL in config:
add(ble.set_scan_interval(config[CONF_SCAN_INTERVAL]))
setup_component(ble, config)
BUILD_FLAGS = '-DUSE_ESP32_BLE_TRACKER'

View File

@@ -2,11 +2,13 @@ import voluptuous as vol
from esphomeyaml import config_validation as cv
from esphomeyaml.components import binary_sensor
from esphomeyaml.const import CONF_ID, CONF_SETUP_MODE, CONF_IIR_FILTER, \
CONF_SLEEP_DURATION, CONF_MEASUREMENT_DURATION, CONF_LOW_VOLTAGE_REFERENCE, \
CONF_HIGH_VOLTAGE_REFERENCE, CONF_VOLTAGE_ATTENUATION, ESP_PLATFORM_ESP32
from esphomeyaml.const import CONF_HIGH_VOLTAGE_REFERENCE, CONF_ID, CONF_IIR_FILTER, \
CONF_LOW_VOLTAGE_REFERENCE, CONF_MEASUREMENT_DURATION, CONF_SETUP_MODE, CONF_SLEEP_DURATION, \
CONF_VOLTAGE_ATTENUATION, ESP_PLATFORM_ESP32
from esphomeyaml.core import TimePeriod
from esphomeyaml.helpers import App, Pvariable, add, global_ns
from esphomeyaml.cpp_generator import Pvariable, add
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, Component, global_ns
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
@@ -19,6 +21,7 @@ def validate_voltage(values):
if not value.endswith('V'):
value += 'V'
return cv.one_of(*values)(value)
return validator
@@ -41,7 +44,7 @@ VOLTAGE_ATTENUATION = {
'0V': global_ns.TOUCH_HVOLT_ATTEN_0V,
}
ESP32TouchComponent = binary_sensor.binary_sensor_ns.ESP32TouchComponent
ESP32TouchComponent = binary_sensor.binary_sensor_ns.class_('ESP32TouchComponent', Component)
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(ESP32TouchComponent),
@@ -54,7 +57,7 @@ CONFIG_SCHEMA = vol.Schema({
vol.Optional(CONF_LOW_VOLTAGE_REFERENCE): validate_voltage(LOW_VOLTAGE_REFERENCE),
vol.Optional(CONF_HIGH_VOLTAGE_REFERENCE): validate_voltage(HIGH_VOLTAGE_REFERENCE),
vol.Optional(CONF_VOLTAGE_ATTENUATION): validate_voltage(VOLTAGE_ATTENUATION),
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
@@ -80,5 +83,7 @@ def to_code(config):
value = VOLTAGE_ATTENUATION[config[CONF_VOLTAGE_ATTENUATION]]
add(touch.set_voltage_attenuation(value))
setup_component(touch, config)
BUILD_FLAGS = '-DUSE_ESP32_TOUCH_BINARY_SENSOR'

View File

@@ -0,0 +1,73 @@
import voluptuous as vol
from esphomeyaml import pins
from esphomeyaml.components import wifi
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_DOMAIN, CONF_HOSTNAME, CONF_ID, CONF_MANUAL_IP, CONF_TYPE, \
ESP_PLATFORM_ESP32
from esphomeyaml.cpp_generator import Pvariable, add
from esphomeyaml.cpp_helpers import gpio_output_pin_expression
from esphomeyaml.cpp_types import App, Component, esphomelib_ns, global_ns
CONFLICTS_WITH = ['wifi']
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
CONF_PHY_ADDR = 'phy_addr'
CONF_MDC_PIN = 'mdc_pin'
CONF_MDIO_PIN = 'mdio_pin'
CONF_CLK_MODE = 'clk_mode'
CONF_POWER_PIN = 'power_pin'
EthernetType = esphomelib_ns.enum('EthernetType')
ETHERNET_TYPES = {
'LAN8720': EthernetType.ETHERNET_TYPE_LAN8720,
'TLK110': EthernetType.ETHERNET_TYPE_TLK110,
}
eth_clock_mode_t = global_ns.enum('eth_clock_mode_t')
CLK_MODES = {
'GPIO0_IN': eth_clock_mode_t.ETH_CLOCK_GPIO0_IN,
'GPIO0_OUT': eth_clock_mode_t.ETH_CLOCK_GPIO0_OUT,
'GPIO16_OUT': eth_clock_mode_t.ETH_CLOCK_GPIO16_OUT,
'GPIO17_OUT': eth_clock_mode_t.ETH_CLOCK_GPIO17_OUT,
}
EthernetComponent = esphomelib_ns.class_('EthernetComponent', Component)
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(EthernetComponent),
vol.Required(CONF_TYPE): cv.one_of(*ETHERNET_TYPES, upper=True),
vol.Required(CONF_MDC_PIN): pins.output_pin,
vol.Required(CONF_MDIO_PIN): pins.input_output_pin,
vol.Optional(CONF_CLK_MODE, default='GPIO0_IN'): cv.one_of(*CLK_MODES, upper=True, space='_'),
vol.Optional(CONF_PHY_ADDR, default=0): vol.All(cv.int_, vol.Range(min=0, max=31)),
vol.Optional(CONF_POWER_PIN): pins.gpio_output_pin_schema,
vol.Optional(CONF_MANUAL_IP): wifi.STA_MANUAL_IP_SCHEMA,
vol.Optional(CONF_HOSTNAME): cv.hostname,
vol.Optional(CONF_DOMAIN, default='.local'): cv.domain_name,
})
def to_code(config):
rhs = App.init_ethernet()
eth = Pvariable(config[CONF_ID], rhs)
add(eth.set_phy_addr(config[CONF_PHY_ADDR]))
add(eth.set_mdc_pin(config[CONF_MDC_PIN]))
add(eth.set_mdio_pin(config[CONF_MDIO_PIN]))
add(eth.set_type(ETHERNET_TYPES[config[CONF_TYPE]]))
add(eth.set_clk_mode(CLK_MODES[config[CONF_CLK_MODE]]))
if CONF_POWER_PIN in config:
for pin in gpio_output_pin_expression(config[CONF_POWER_PIN]):
yield
add(eth.set_power_pin(pin))
if CONF_HOSTNAME in config:
add(eth.set_hostname(config[CONF_HOSTNAME]))
if CONF_MANUAL_IP in config:
add(eth.set_manual_ip(wifi.manual_ip(config[CONF_MANUAL_IP])))
REQUIRED_BUILD_FLAGS = '-DUSE_ETHERNET'

View File

@@ -1,37 +1,46 @@
import voluptuous as vol
from esphomeyaml.automation import ACTION_REGISTRY, maybe_simple_id
from esphomeyaml.components import mqtt
from esphomeyaml.components.mqtt import setup_mqtt_component
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ID, CONF_MQTT_ID, CONF_OSCILLATION_COMMAND_TOPIC, \
CONF_OSCILLATION_STATE_TOPIC, CONF_SPEED_COMMAND_TOPIC, CONF_SPEED_STATE_TOPIC, CONF_INTERNAL
from esphomeyaml.helpers import Application, Pvariable, add, esphomelib_ns, setup_mqtt_component
from esphomeyaml.const import CONF_ID, CONF_INTERNAL, CONF_MQTT_ID, CONF_NAME, CONF_OSCILLATING, \
CONF_OSCILLATION_COMMAND_TOPIC, CONF_OSCILLATION_OUTPUT, CONF_OSCILLATION_STATE_TOPIC, \
CONF_SPEED, CONF_SPEED_COMMAND_TOPIC, CONF_SPEED_STATE_TOPIC
from esphomeyaml.cpp_generator import add, Pvariable, get_variable, templatable
from esphomeyaml.cpp_types import Application, Component, Nameable, esphomelib_ns, Action, bool_
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
})
fan_ns = esphomelib_ns.namespace('fan')
FanState = fan_ns.FanState
MQTTFanComponent = fan_ns.MQTTFanComponent
MakeFan = Application.MakeFan
TurnOnAction = fan_ns.TurnOnAction
TurnOffAction = fan_ns.TurnOffAction
ToggleAction = fan_ns.ToggleAction
FanSpeed = fan_ns.FanSpeed
FAN_SPEED_OFF = fan_ns.FAN_SPEED_OFF
FAN_SPEED_LOW = fan_ns.FAN_SPEED_LOW
FAN_SPEED_MEDIUM = fan_ns.FAN_SPEED_MEDIUM
FAN_SPEED_HIGH = fan_ns.FAN_SPEED_HIGH
FanState = fan_ns.class_('FanState', Nameable, Component)
MQTTFanComponent = fan_ns.class_('MQTTFanComponent', mqtt.MQTTComponent)
MakeFan = Application.struct('MakeFan')
# Actions
TurnOnAction = fan_ns.class_('TurnOnAction', Action)
TurnOffAction = fan_ns.class_('TurnOffAction', Action)
ToggleAction = fan_ns.class_('ToggleAction', Action)
FanSpeed = fan_ns.enum('FanSpeed')
FAN_SPEED_OFF = FanSpeed.FAN_SPEED_OFF
FAN_SPEED_LOW = FanSpeed.FAN_SPEED_LOW
FAN_SPEED_MEDIUM = FanSpeed.FAN_SPEED_MEDIUM
FAN_SPEED_HIGH = FanSpeed.FAN_SPEED_HIGH
FAN_SCHEMA = cv.MQTT_COMMAND_COMPONENT_SCHEMA.extend({
cv.GenerateID(): cv.declare_variable_id(FanState),
cv.GenerateID(CONF_MQTT_ID): cv.declare_variable_id(MQTTFanComponent),
vol.Optional(CONF_OSCILLATION_STATE_TOPIC): cv.publish_topic,
vol.Optional(CONF_OSCILLATION_COMMAND_TOPIC): cv.subscribe_topic,
vol.Optional(CONF_OSCILLATION_STATE_TOPIC): vol.All(cv.requires_component('mqtt'),
cv.publish_topic),
vol.Optional(CONF_OSCILLATION_COMMAND_TOPIC): vol.All(cv.requires_component('mqtt'),
cv.subscribe_topic),
})
FAN_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(FAN_SCHEMA.schema)
FAN_SPEEDS = {
'OFF': FAN_SPEED_OFF,
'LOW': FAN_SPEED_LOW,
@@ -40,10 +49,6 @@ FAN_SPEEDS = {
}
def validate_fan_speed(value):
return vol.All(vol.Upper, cv.one_of(*FAN_SPEEDS))(value)
def setup_fan_core_(fan_var, mqtt_var, config):
if CONF_INTERNAL in config:
add(fan_var.set_internal(config[CONF_INTERNAL]))
@@ -66,3 +71,70 @@ def setup_fan(fan_obj, mqtt_obj, config):
BUILD_FLAGS = '-DUSE_FAN'
CONF_FAN_TOGGLE = 'fan.toggle'
FAN_TOGGLE_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(FanState),
})
@ACTION_REGISTRY.register(CONF_FAN_TOGGLE, FAN_TOGGLE_ACTION_SCHEMA)
def fan_toggle_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_toggle_action(template_arg)
type = ToggleAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
CONF_FAN_TURN_OFF = 'fan.turn_off'
FAN_TURN_OFF_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(FanState),
})
@ACTION_REGISTRY.register(CONF_FAN_TURN_OFF, FAN_TURN_OFF_ACTION_SCHEMA)
def fan_turn_off_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_off_action(template_arg)
type = TurnOffAction.template(arg_type)
yield Pvariable(action_id, rhs, type=type)
CONF_FAN_TURN_ON = 'fan.turn_on'
FAN_TURN_ON_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(FanState),
vol.Optional(CONF_OSCILLATING): cv.templatable(cv.boolean),
vol.Optional(CONF_SPEED): cv.templatable(cv.one_of(*FAN_SPEEDS, upper=True)),
})
@ACTION_REGISTRY.register(CONF_FAN_TURN_ON, FAN_TURN_ON_ACTION_SCHEMA)
def fan_turn_on_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_on_action(template_arg)
type = TurnOnAction.template(arg_type)
action = Pvariable(action_id, rhs, type=type)
if CONF_OSCILLATING in config:
for template_ in templatable(config[CONF_OSCILLATING], arg_type, bool_):
yield None
add(action.set_oscillating(template_))
if CONF_SPEED in config:
for template_ in templatable(config[CONF_SPEED], arg_type, FanSpeed):
yield None
add(action.set_speed(template_))
yield action
def core_to_hass_config(data, config):
ret = mqtt.build_hass_config(data, 'fan', config, include_state=True, include_command=True)
if ret is None:
return None
if CONF_OSCILLATION_OUTPUT in config:
default = mqtt.get_default_topic_for(data, 'fan', config[CONF_NAME], 'oscillation/state')
ret['oscillation_state_topic'] = config.get(CONF_OSCILLATION_STATE_TOPIC, default)
default = mqtt.get_default_topic_for(data, 'fan', config[CONF_NAME], 'oscillation/command')
ret['oscillation_command_topic'] = config.get(CONF_OSCILLATION_COMMAND_TOPIC, default)
return ret

View File

@@ -1,29 +1,34 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml.components import fan
from esphomeyaml.components import fan, output
from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME, CONF_OSCILLATION_OUTPUT, CONF_OUTPUT
from esphomeyaml.helpers import App, add, get_variable, variable
from esphomeyaml.cpp_generator import get_variable, variable, add
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App
PLATFORM_SCHEMA = cv.nameable(fan.FAN_PLATFORM_SCHEMA.extend({
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(fan.MakeFan),
vol.Required(CONF_OUTPUT): cv.use_variable_id(None),
vol.Optional(CONF_OSCILLATION_OUTPUT): cv.use_variable_id(None),
}))
vol.Required(CONF_OUTPUT): cv.use_variable_id(output.BinaryOutput),
vol.Optional(CONF_OSCILLATION_OUTPUT): cv.use_variable_id(output.BinaryOutput),
}).extend(cv.COMPONENT_SCHEMA.schema))
def to_code(config):
output = None
for output in get_variable(config[CONF_OUTPUT]):
for output_ in get_variable(config[CONF_OUTPUT]):
yield
rhs = App.make_fan(config[CONF_NAME])
fan_struct = variable(config[CONF_MAKE_ID], rhs)
add(fan_struct.Poutput.set_binary(output))
add(fan_struct.Poutput.set_binary(output_))
if CONF_OSCILLATION_OUTPUT in config:
oscillation_output = None
for oscillation_output in get_variable(config[CONF_OSCILLATION_OUTPUT]):
yield
add(fan_struct.Poutput.set_oscillation(oscillation_output))
fan.setup_fan(fan_struct.Pstate, fan_struct.Pmqtt, config)
setup_component(fan_struct.Poutput, config)
def to_hass_config(data, config):
return fan.core_to_hass_config(data, config)

View File

@@ -1,45 +1,55 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml.components import fan
from esphomeyaml.components import fan, mqtt, output
from esphomeyaml.const import CONF_HIGH, CONF_LOW, CONF_MAKE_ID, CONF_MEDIUM, CONF_NAME, \
CONF_OSCILLATION_OUTPUT, CONF_OUTPUT, CONF_SPEED, CONF_SPEED_COMMAND_TOPIC, \
CONF_SPEED_STATE_TOPIC
from esphomeyaml.helpers import App, add, get_variable, variable
from esphomeyaml.cpp_generator import get_variable, variable, add
from esphomeyaml.cpp_types import App
PLATFORM_SCHEMA = cv.nameable(fan.FAN_PLATFORM_SCHEMA.extend({
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(fan.MakeFan),
vol.Required(CONF_OUTPUT): cv.use_variable_id(None),
vol.Required(CONF_OUTPUT): cv.use_variable_id(output.FloatOutput),
vol.Optional(CONF_SPEED_STATE_TOPIC): cv.publish_topic,
vol.Optional(CONF_SPEED_COMMAND_TOPIC): cv.subscribe_topic,
vol.Optional(CONF_OSCILLATION_OUTPUT): cv.use_variable_id(None),
vol.Optional(CONF_OSCILLATION_OUTPUT): cv.use_variable_id(output.BinaryOutput),
vol.Optional(CONF_SPEED): vol.Schema({
vol.Required(CONF_LOW): cv.percentage,
vol.Required(CONF_MEDIUM): cv.percentage,
vol.Required(CONF_HIGH): cv.percentage,
}),
}))
}).extend(cv.COMPONENT_SCHEMA.schema))
def to_code(config):
output = None
for output in get_variable(config[CONF_OUTPUT]):
for output_ in get_variable(config[CONF_OUTPUT]):
yield
rhs = App.make_fan(config[CONF_NAME])
fan_struct = variable(config[CONF_MAKE_ID], rhs)
if CONF_SPEED in config:
speeds = config[CONF_SPEED]
add(fan_struct.Poutput.set_speed(output, 0.0,
add(fan_struct.Poutput.set_speed(output_,
speeds[CONF_LOW],
speeds[CONF_MEDIUM],
speeds[CONF_HIGH]))
else:
add(fan_struct.Poutput.set_speed(output))
add(fan_struct.Poutput.set_speed(output_))
if CONF_OSCILLATION_OUTPUT in config:
oscillation_output = None
for oscillation_output in get_variable(config[CONF_OSCILLATION_OUTPUT]):
yield
add(fan_struct.Poutput.set_oscillation(oscillation_output))
fan.setup_fan(fan_struct.Pstate, fan_struct.Pmqtt, config)
def to_hass_config(data, config):
ret = fan.core_to_hass_config(data, config)
if ret is None:
return None
default = mqtt.get_default_topic_for(data, 'fan', config[CONF_NAME], 'speed/state')
ret['speed_state_topic'] = config.get(CONF_SPEED_STATE_TOPIC, default)
default = mqtt.get_default_topic_for(data, 'fan', config[CONF_NAME], 'speed/command')
ret['speed_command_topic'] = config.get(CONF_SPEED_COMMAND_TOPIC, default)
return ret
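The `mqtt.get_default_topic_for(...)` calls fill in per-entity default topics when `speed_state_topic`/`speed_command_topic` are not set explicitly. A hedged sketch of the likely layout (the helper below is an assumption, not the real implementation):

```python
# Hedged sketch of the default-topic fallback: esphomeyaml's defaults generally
# follow a "<topic prefix>/<type>/<object id>/<suffix>" layout. sanitize() and
# the exact format below are assumptions, not the real get_default_topic_for.
def sanitize(name):
    return name.lower().replace(' ', '_')

def default_topic(topic_prefix, component_type, name, suffix):
    return '{}/{}/{}/{}'.format(topic_prefix, component_type, sanitize(name), suffix)

print(default_topic('livingroom', 'fan', 'Ceiling Fan', 'speed/command'))
# -> livingroom/fan/ceiling_fan/speed/command
```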

@@ -1,18 +1,20 @@
# coding=utf-8
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import core
from esphomeyaml.components import display
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_FILE, CONF_GLYPHS, CONF_ID, CONF_SIZE
from esphomeyaml.core import HexInt
from esphomeyaml.helpers import App, ArrayInitializer, MockObj, Pvariable, RawExpression, add, \
relative_path
from esphomeyaml.core import CORE, HexInt
from esphomeyaml.cpp_generator import ArrayInitializer, MockObj, Pvariable, RawExpression, add
from esphomeyaml.cpp_types import App
from esphomeyaml.py_compat import sort_by_cmp
DEPENDENCIES = ['display']
MULTI_CONF = True
Font = display.display_ns.Font
Glyph = display.display_ns.Glyph
Font = display.display_ns.class_('Font')
Glyph = display.display_ns.class_('Glyph')
def validate_glyphs(value):
@@ -32,12 +34,11 @@ def validate_glyphs(value):
if len(x_) < len(y_):
return -1
elif len(x_) > len(y_):
if len(x_) > len(y_):
return 1
else:
raise vol.Invalid(u"Found duplicate glyph {}".format(x))
raise vol.Invalid(u"Found duplicate glyph {}".format(x))
value.sort(cmp=comparator)
sort_by_cmp(value, comparator)
return value
@@ -46,11 +47,11 @@ def validate_pillow_installed(value):
import PIL
except ImportError:
raise vol.Invalid("Please install the pillow python package to use this feature. "
"(pip2 install pillow)")
"(pip install pillow)")
if PIL.__version__[0] < '4':
raise vol.Invalid("Please update your pillow installation to at least 4.0.x. "
"(pip2 install -U pillow)")
"(pip install -U pillow)")
return value
@@ -76,46 +77,45 @@ FONT_SCHEMA = vol.Schema({
cv.GenerateID(CONF_RAW_DATA_ID): cv.declare_variable_id(None),
})
CONFIG_SCHEMA = vol.All(validate_pillow_installed, cv.ensure_list, [FONT_SCHEMA])
CONFIG_SCHEMA = vol.All(validate_pillow_installed, FONT_SCHEMA)
def to_code(config):
from PIL import ImageFont
for conf in config:
path = relative_path(conf[CONF_FILE])
try:
font = ImageFont.truetype(path, conf[CONF_SIZE])
except Exception as e:
raise core.ESPHomeYAMLError(u"Could not load truetype file {}: {}".format(path, e))
path = CORE.relative_path(config[CONF_FILE])
try:
font = ImageFont.truetype(path, config[CONF_SIZE])
except Exception as e:
raise core.EsphomeyamlError(u"Could not load truetype file {}: {}".format(path, e))
ascent, descent = font.getmetrics()
ascent, descent = font.getmetrics()
glyph_args = {}
data = []
for glyph in conf[CONF_GLYPHS]:
mask = font.getmask(glyph, mode='1')
_, (offset_x, offset_y) = font.font.getsize(glyph)
width, height = mask.size
width8 = ((width + 7) // 8) * 8
glyph_data = [0 for _ in range(height * width8 // 8)] # noqa: F812
for y in range(height):
for x in range(width):
if not mask.getpixel((x, y)):
continue
pos = x + y * width8
glyph_data[pos // 8] |= 0x80 >> (pos % 8)
glyph_args[glyph] = (len(data), offset_x, offset_y, width, height)
data += glyph_data
glyph_args = {}
data = []
for glyph in config[CONF_GLYPHS]:
mask = font.getmask(glyph, mode='1')
_, (offset_x, offset_y) = font.font.getsize(glyph)
width, height = mask.size
width8 = ((width + 7) // 8) * 8
glyph_data = [0 for _ in range(height * width8 // 8)] # noqa: F812
for y in range(height):
for x in range(width):
if not mask.getpixel((x, y)):
continue
pos = x + y * width8
glyph_data[pos // 8] |= 0x80 >> (pos % 8)
glyph_args[glyph] = (len(data), offset_x, offset_y, width, height)
data += glyph_data
raw_data = MockObj(conf[CONF_RAW_DATA_ID])
add(RawExpression('static const uint8_t {}[{}] PROGMEM = {}'.format(
raw_data, len(data),
ArrayInitializer(*[HexInt(x) for x in data], multiline=False))))
raw_data = MockObj(config[CONF_RAW_DATA_ID])
add(RawExpression('static const uint8_t {}[{}] PROGMEM = {}'.format(
raw_data, len(data),
ArrayInitializer(*[HexInt(x) for x in data], multiline=False))))
glyphs = []
for glyph in conf[CONF_GLYPHS]:
glyphs.append(Glyph(glyph, raw_data, *glyph_args[glyph]))
glyphs = []
for glyph in config[CONF_GLYPHS]:
glyphs.append(Glyph(glyph, raw_data, *glyph_args[glyph]))
rhs = App.make_font(ArrayInitializer(*glyphs), ascent, ascent + descent)
Pvariable(conf[CONF_ID], rhs)
rhs = App.make_font(ArrayInitializer(*glyphs), ascent, ascent + descent)
Pvariable(config[CONF_ID], rhs)
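The inner loops above pack each glyph into a row-padded, MSB-first bitmap before it is emitted as a PROGMEM byte array. The same packing can be reproduced standalone (no PIL needed):

```python
# Standalone reproduction of the row-padded, MSB-first bit packing used for the
# glyph (and image) data above.
def pack_bitmap(rows):
    height, width = len(rows), len(rows[0])
    width8 = ((width + 7) // 8) * 8            # pad each row to a byte boundary
    data = [0] * (height * width8 // 8)
    for y in range(height):
        for x in range(width):
            if not rows[y][x]:
                continue
            pos = x + y * width8
            data[pos // 8] |= 0x80 >> (pos % 8)  # MSB is the leftmost pixel
    return data

def get_bit(data, width, x, y):
    width8 = ((width + 7) // 8) * 8
    pos = x + y * width8
    return bool(data[pos // 8] & (0x80 >> (pos % 8)))

bitmap = [
    [1, 0, 0, 1],
    [0, 1, 1, 0],
]
packed = pack_bitmap(bitmap)
assert all(get_bit(packed, 4, x, y) == bool(bitmap[y][x])
           for y in range(2) for x in range(4))
print(['0x%02X' % b for b in packed])          # ['0x90', '0x60']
```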

@@ -0,0 +1,35 @@
import voluptuous as vol
from esphomeyaml import config_validation as cv
from esphomeyaml.const import CONF_ID, CONF_INITIAL_VALUE, CONF_RESTORE_VALUE, CONF_TYPE
from esphomeyaml.cpp_generator import Pvariable, RawExpression, TemplateArguments, add
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, Component, esphomelib_ns
GlobalVariableComponent = esphomelib_ns.class_('GlobalVariableComponent', Component)
MULTI_CONF = True
CONFIG_SCHEMA = vol.Schema({
vol.Required(CONF_ID): cv.declare_variable_id(GlobalVariableComponent),
vol.Required(CONF_TYPE): cv.string_strict,
vol.Optional(CONF_INITIAL_VALUE): cv.string_strict,
vol.Optional(CONF_RESTORE_VALUE): cv.boolean,
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
type_ = RawExpression(config[CONF_TYPE])
template_args = TemplateArguments(type_)
res_type = GlobalVariableComponent.template(template_args)
initial_value = None
if CONF_INITIAL_VALUE in config:
initial_value = RawExpression(config[CONF_INITIAL_VALUE])
rhs = App.Pmake_global_variable(template_args, initial_value)
glob = Pvariable(config[CONF_ID], rhs, type=res_type)
if config.get(CONF_RESTORE_VALUE, False):
hash_ = hash(config[CONF_ID].id) % 2**32
add(glob.set_restore_value(hash_))
setup_component(glob, config)
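`restore_value` derives a 32-bit key from the variable's ID so the value can be saved to and restored from flash. A small sketch of that derivation (the uint32 interpretation is an assumption based on the modulo above):

```python
# Sketch of the restore key derived above: Python's hash() of the variable's
# ID string, reduced modulo 2**32 (presumably so it fits a 32-bit preferences
# slot on the device). Note that str hashes are randomized per process in
# Python 3 unless PYTHONHASHSEED is fixed.
def restore_hash(variable_id):
    return hash(variable_id) % 2 ** 32

print(restore_hash('my_global_var'))
```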

@@ -1,24 +1,27 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import pins
from esphomeyaml.const import CONF_FREQUENCY, CONF_SCL, CONF_SDA, CONF_SCAN, CONF_ID, \
CONF_RECEIVE_TIMEOUT
from esphomeyaml.helpers import App, add, Pvariable, esphomelib_ns
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_FREQUENCY, CONF_ID, CONF_RECEIVE_TIMEOUT, CONF_SCAN, CONF_SCL, \
CONF_SDA
from esphomeyaml.cpp_generator import Pvariable, add
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, Component, esphomelib_ns
I2CComponent = esphomelib_ns.I2CComponent
I2CComponent = esphomelib_ns.class_('I2CComponent', Component)
I2CDevice = pins.I2CDevice
CONFIG_SCHEMA = vol.Schema({
cv.GenerateID(): cv.declare_variable_id(I2CComponent),
vol.Required(CONF_SDA, default='SDA'): pins.input_output_pin,
vol.Required(CONF_SCL, default='SCL'): pins.input_output_pin,
vol.Optional(CONF_SDA, default='SDA'): pins.input_pullup_pin,
vol.Optional(CONF_SCL, default='SCL'): pins.input_pullup_pin,
vol.Optional(CONF_FREQUENCY): vol.All(cv.frequency, vol.Range(min=0, min_included=False)),
vol.Optional(CONF_SCAN): cv.boolean,
vol.Optional(CONF_RECEIVE_TIMEOUT): cv.invalid("The receive_timeout option has been removed "
"because timeouts are already handled by the "
"low-level i2c interface.")
})
}).extend(cv.COMPONENT_SCHEMA.schema)
def to_code(config):
@@ -27,6 +30,8 @@ def to_code(config):
if CONF_FREQUENCY in config:
add(i2c.set_frequency(config[CONF_FREQUENCY]))
setup_component(i2c, config)
BUILD_FLAGS = '-DUSE_I2C'

@@ -3,19 +3,20 @@ import logging
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import core
from esphomeyaml.components import display, font
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_FILE, CONF_ID, CONF_RESIZE
from esphomeyaml.core import HexInt
from esphomeyaml.helpers import App, ArrayInitializer, MockObj, Pvariable, RawExpression, add, \
relative_path
from esphomeyaml.core import CORE, HexInt
from esphomeyaml.cpp_generator import ArrayInitializer, MockObj, Pvariable, RawExpression, add
from esphomeyaml.cpp_types import App
_LOGGER = logging.getLogger(__name__)
DEPENDENCIES = ['display']
MULTI_CONF = True
Image_ = display.display_ns.Image
Image_ = display.display_ns.class_('Image')
CONF_RAW_DATA_ID = 'raw_data_id'
@@ -26,40 +27,39 @@ IMAGE_SCHEMA = vol.Schema({
cv.GenerateID(CONF_RAW_DATA_ID): cv.declare_variable_id(None),
})
CONFIG_SCHEMA = vol.All(font.validate_pillow_installed, cv.ensure_list, [IMAGE_SCHEMA])
CONFIG_SCHEMA = vol.All(font.validate_pillow_installed, IMAGE_SCHEMA)
def to_code(config):
from PIL import Image
for conf in config:
path = relative_path(conf[CONF_FILE])
try:
image = Image.open(path)
except Exception as e:
raise core.ESPHomeYAMLError(u"Could not load image file {}: {}".format(path, e))
path = CORE.relative_path(config[CONF_FILE])
try:
image = Image.open(path)
except Exception as e:
raise core.EsphomeyamlError(u"Could not load image file {}: {}".format(path, e))
if CONF_RESIZE in conf:
image.thumbnail(conf[CONF_RESIZE])
if CONF_RESIZE in config:
image.thumbnail(config[CONF_RESIZE])
image = image.convert('1', dither=Image.NONE)
width, height = image.size
if width > 500 or height > 500:
_LOGGER.warning("The image you requested is very big. Please consider using the resize "
"parameter")
width8 = ((width + 7) // 8) * 8
data = [0 for _ in range(height * width8 // 8)]
for y in range(height):
for x in range(width):
if image.getpixel((x, y)):
continue
pos = x + y * width8
data[pos // 8] |= 0x80 >> (pos % 8)
image = image.convert('1', dither=Image.NONE)
width, height = image.size
if width > 500 or height > 500:
_LOGGER.warning("The image you requested is very big. Please consider using the resize "
"parameter")
width8 = ((width + 7) // 8) * 8
data = [0 for _ in range(height * width8 // 8)]
for y in range(height):
for x in range(width):
if image.getpixel((x, y)):
continue
pos = x + y * width8
data[pos // 8] |= 0x80 >> (pos % 8)
raw_data = MockObj(conf[CONF_RAW_DATA_ID])
add(RawExpression('static const uint8_t {}[{}] PROGMEM = {}'.format(
raw_data, len(data),
ArrayInitializer(*[HexInt(x) for x in data], multiline=False))))
raw_data = MockObj(config[CONF_RAW_DATA_ID])
add(RawExpression('static const uint8_t {}[{}] PROGMEM = {}'.format(
raw_data, len(data),
ArrayInitializer(*[HexInt(x) for x in data], multiline=False))))
rhs = App.make_image(raw_data, width, height)
Pvariable(conf[CONF_ID], rhs)
rhs = App.make_image(raw_data, width, height)
Pvariable(config[CONF_ID], rhs)

@@ -0,0 +1,24 @@
import voluptuous as vol
from esphomeyaml import automation
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ID, CONF_INTERVAL
from esphomeyaml.cpp_generator import Pvariable
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, NoArg, PollingComponent, Trigger, esphomelib_ns
IntervalTrigger = esphomelib_ns.class_('IntervalTrigger', Trigger.template(NoArg), PollingComponent)
CONFIG_SCHEMA = automation.validate_automation(vol.Schema({
cv.GenerateID(): cv.declare_variable_id(IntervalTrigger),
vol.Required(CONF_INTERVAL): cv.positive_time_period_milliseconds,
}).extend(cv.COMPONENT_SCHEMA.schema))
def to_code(config):
for conf in config:
rhs = App.register_component(IntervalTrigger.new(conf[CONF_INTERVAL]))
trigger = Pvariable(conf[CONF_ID], rhs)
setup_component(trigger, conf)
automation.build_automation(trigger, NoArg, conf)
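Declarations such as `esphomelib_ns.class_('IntervalTrigger', Trigger.template(NoArg), PollingComponent)` come from esphomeyaml's expression-builder objects: the Python side only accumulates the C++ expression text, while the listed parent classes are used for config-time ID type checking. An illustrative-only sketch of the idea (not the real `cpp_generator` code):

```python
# Illustrative-only sketch of the MockObj/namespace idiom seen throughout this
# diff (display_ns.class_('Font'), esphomelib_ns.class_('I2CComponent',
# Component), Trigger.template(NoArg), ...). The real cpp_generator is more
# involved; here the objects just accumulate a C++ expression string.
class MockObj(object):
    def __init__(self, base):
        self.base = base

    def __getattr__(self, attr):
        return MockObj('{}::{}'.format(self.base, attr))

    def class_(self, name, *parents):
        # parents matter for esphomeyaml's ID type checking, not for the text
        return MockObj('{}::{}'.format(self.base, name))

    def template(self, arg):
        return MockObj('{}<{}>'.format(self.base, arg))

    def __str__(self):
        return self.base

esphomelib_ns = MockObj('esphomelib')
Trigger = esphomelib_ns.class_('Trigger')
NoArg = MockObj('NoArg')
print(Trigger.template(NoArg))                   # esphomelib::Trigger<NoArg>
print(esphomelib_ns.class_('IntervalTrigger'))   # esphomelib::IntervalTrigger
```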

@@ -1,6 +0,0 @@
import voluptuous as vol
def CONFIG_SCHEMA(config):
raise vol.Invalid("The ir_transmitter component has been renamed to "
"remote_transmitter because of 433MHz signal support.")

@@ -1,54 +1,73 @@
import voluptuous as vol
from esphomeyaml.automation import ACTION_REGISTRY, maybe_simple_id
from esphomeyaml.components import mqtt
from esphomeyaml.components.mqtt import setup_mqtt_component
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ALPHA, CONF_BLUE, CONF_BRIGHTNESS, CONF_COLORS, \
CONF_DEFAULT_TRANSITION_LENGTH, CONF_DURATION, CONF_EFFECTS, CONF_EFFECT_ID, \
CONF_GAMMA_CORRECT, \
CONF_GREEN, CONF_ID, CONF_INTERNAL, CONF_LAMBDA, CONF_MQTT_ID, CONF_NAME, CONF_NUM_LEDS, \
CONF_RANDOM, CONF_RED, CONF_SPEED, CONF_STATE, CONF_TRANSITION_LENGTH, CONF_UPDATE_INTERVAL, \
CONF_WHITE, CONF_WIDTH
from esphomeyaml.helpers import Application, ArrayInitializer, Pvariable, RawExpression, \
StructInitializer, add, add_job, esphomelib_ns, process_lambda, setup_mqtt_component
CONF_GAMMA_CORRECT, CONF_GREEN, CONF_ID, CONF_INTERNAL, CONF_LAMBDA, CONF_MQTT_ID, CONF_NAME, \
CONF_NUM_LEDS, CONF_RANDOM, CONF_RED, CONF_SPEED, CONF_STATE, CONF_TRANSITION_LENGTH, \
CONF_UPDATE_INTERVAL, CONF_WHITE, CONF_WIDTH, CONF_FLASH_LENGTH, CONF_COLOR_TEMPERATURE, \
CONF_EFFECT
from esphomeyaml.core import CORE
from esphomeyaml.cpp_generator import process_lambda, Pvariable, add, StructInitializer, \
ArrayInitializer, get_variable, templatable
from esphomeyaml.cpp_types import esphomelib_ns, Application, Component, Nameable, Action, uint32, \
float_, std_string, void
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
})
# Base
light_ns = esphomelib_ns.namespace('light')
LightState = light_ns.LightState
LightColorValues = light_ns.LightColorValues
MQTTJSONLightComponent = light_ns.MQTTJSONLightComponent
ToggleAction = light_ns.ToggleAction
TurnOffAction = light_ns.TurnOffAction
TurnOnAction = light_ns.TurnOnAction
MakeLight = Application.MakeLight
RandomLightEffect = light_ns.RandomLightEffect
LambdaLightEffect = light_ns.LambdaLightEffect
StrobeLightEffect = light_ns.StrobeLightEffect
StrobeLightEffectColor = light_ns.StrobeLightEffectColor
FlickerLightEffect = light_ns.FlickerLightEffect
FastLEDLambdaLightEffect = light_ns.FastLEDLambdaLightEffect
FastLEDRainbowLightEffect = light_ns.FastLEDRainbowLightEffect
FastLEDColorWipeEffect = light_ns.FastLEDColorWipeEffect
FastLEDColorWipeEffectColor = light_ns.FastLEDColorWipeEffectColor
FastLEDScanEffect = light_ns.FastLEDScanEffect
FastLEDScanEffectColor = light_ns.FastLEDScanEffectColor
FastLEDTwinkleEffect = light_ns.FastLEDTwinkleEffect
FastLEDRandomTwinkleEffect = light_ns.FastLEDRandomTwinkleEffect
FastLEDFireworksEffect = light_ns.FastLEDFireworksEffect
FastLEDFlickerEffect = light_ns.FastLEDFlickerEffect
FastLEDLightOutputComponent = light_ns.FastLEDLightOutputComponent
LightState = light_ns.class_('LightState', Nameable, Component)
MakeLight = Application.struct('MakeLight')
LightOutput = light_ns.class_('LightOutput')
AddressableLight = light_ns.class_('AddressableLight')
AddressableLightRef = AddressableLight.operator('ref')
# Actions
ToggleAction = light_ns.class_('ToggleAction', Action)
TurnOffAction = light_ns.class_('TurnOffAction', Action)
TurnOnAction = light_ns.class_('TurnOnAction', Action)
LightColorValues = light_ns.class_('LightColorValues')
MQTTJSONLightComponent = light_ns.class_('MQTTJSONLightComponent', mqtt.MQTTComponent)
# Effects
LightEffect = light_ns.class_('LightEffect')
RandomLightEffect = light_ns.class_('RandomLightEffect', LightEffect)
LambdaLightEffect = light_ns.class_('LambdaLightEffect', LightEffect)
StrobeLightEffect = light_ns.class_('StrobeLightEffect', LightEffect)
StrobeLightEffectColor = light_ns.class_('StrobeLightEffectColor', LightEffect)
FlickerLightEffect = light_ns.class_('FlickerLightEffect', LightEffect)
AddressableLightEffect = light_ns.class_('AddressableLightEffect', LightEffect)
AddressableLambdaLightEffect = light_ns.class_('AddressableLambdaLightEffect',
AddressableLightEffect)
AddressableRainbowLightEffect = light_ns.class_('AddressableRainbowLightEffect',
AddressableLightEffect)
AddressableColorWipeEffect = light_ns.class_('AddressableColorWipeEffect', AddressableLightEffect)
AddressableColorWipeEffectColor = light_ns.struct('AddressableColorWipeEffectColor')
AddressableScanEffect = light_ns.class_('AddressableScanEffect', AddressableLightEffect)
AddressableTwinkleEffect = light_ns.class_('AddressableTwinkleEffect', AddressableLightEffect)
AddressableRandomTwinkleEffect = light_ns.class_('AddressableRandomTwinkleEffect',
AddressableLightEffect)
AddressableFireworksEffect = light_ns.class_('AddressableFireworksEffect', AddressableLightEffect)
AddressableFlickerEffect = light_ns.class_('AddressableFlickerEffect', AddressableLightEffect)
CONF_STROBE = 'strobe'
CONF_FLICKER = 'flicker'
CONF_FASTLED_LAMBDA = 'fastled_lambda'
CONF_FASTLED_RAINBOW = 'fastled_rainbow'
CONF_FASTLED_COLOR_WIPE = 'fastled_color_wipe'
CONF_FASTLED_SCAN = 'fastled_scan'
CONF_FASTLED_TWINKLE = 'fastled_twinkle'
CONF_FASTLED_RANDOM_TWINKLE = 'fastled_random_twinkle'
CONF_FASTLED_FIREWORKS = 'fastled_fireworks'
CONF_FASTLED_FLICKER = 'fastled_flicker'
CONF_ADDRESSABLE_LAMBDA = 'addressable_lambda'
CONF_ADDRESSABLE_RAINBOW = 'addressable_rainbow'
CONF_ADDRESSABLE_COLOR_WIPE = 'addressable_color_wipe'
CONF_ADDRESSABLE_SCAN = 'addressable_scan'
CONF_ADDRESSABLE_TWINKLE = 'addressable_twinkle'
CONF_ADDRESSABLE_RANDOM_TWINKLE = 'addressable_random_twinkle'
CONF_ADDRESSABLE_FIREWORKS = 'addressable_fireworks'
CONF_ADDRESSABLE_FLICKER = 'addressable_flicker'
CONF_ADD_LED_INTERVAL = 'add_led_interval'
CONF_REVERSE = 'reverse'
@@ -63,10 +82,10 @@ CONF_INTENSITY = 'intensity'
BINARY_EFFECTS = [CONF_LAMBDA, CONF_STROBE]
MONOCHROMATIC_EFFECTS = BINARY_EFFECTS + [CONF_FLICKER]
RGB_EFFECTS = MONOCHROMATIC_EFFECTS + [CONF_RANDOM]
FASTLED_EFFECTS = RGB_EFFECTS + [CONF_FASTLED_LAMBDA, CONF_FASTLED_RAINBOW, CONF_FASTLED_COLOR_WIPE,
CONF_FASTLED_SCAN, CONF_FASTLED_TWINKLE,
CONF_FASTLED_RANDOM_TWINKLE, CONF_FASTLED_FIREWORKS,
CONF_FASTLED_FLICKER]
ADDRESSABLE_EFFECTS = RGB_EFFECTS + [CONF_ADDRESSABLE_LAMBDA, CONF_ADDRESSABLE_RAINBOW,
CONF_ADDRESSABLE_COLOR_WIPE, CONF_ADDRESSABLE_SCAN,
CONF_ADDRESSABLE_TWINKLE, CONF_ADDRESSABLE_RANDOM_TWINKLE,
CONF_ADDRESSABLE_FIREWORKS, CONF_ADDRESSABLE_FLICKER]
EFFECTS_SCHEMA = vol.Schema({
vol.Optional(CONF_LAMBDA): vol.Schema({
@@ -83,7 +102,7 @@ EFFECTS_SCHEMA = vol.Schema({
vol.Optional(CONF_STROBE): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(StrobeLightEffect),
vol.Optional(CONF_NAME, default="Strobe"): cv.string,
vol.Optional(CONF_COLORS): vol.All(cv.ensure_list, [vol.All(vol.Schema({
vol.Optional(CONF_COLORS): vol.All(cv.ensure_list(vol.Schema({
vol.Optional(CONF_STATE, default=True): cv.boolean,
vol.Optional(CONF_BRIGHTNESS, default=1.0): cv.percentage,
vol.Optional(CONF_RED, default=1.0): cv.percentage,
@@ -92,7 +111,7 @@ EFFECTS_SCHEMA = vol.Schema({
vol.Optional(CONF_WHITE, default=1.0): cv.percentage,
vol.Required(CONF_DURATION): cv.positive_time_period_milliseconds,
}), cv.has_at_least_one_key(CONF_STATE, CONF_BRIGHTNESS, CONF_RED, CONF_GREEN, CONF_BLUE,
CONF_WHITE))], vol.Length(min=2)),
CONF_WHITE)), vol.Length(min=2)),
}),
vol.Optional(CONF_FLICKER): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FlickerLightEffect),
@@ -100,58 +119,59 @@ EFFECTS_SCHEMA = vol.Schema({
vol.Optional(CONF_ALPHA): cv.percentage,
vol.Optional(CONF_INTENSITY): cv.percentage,
}),
vol.Optional(CONF_FASTLED_LAMBDA): vol.Schema({
vol.Optional(CONF_ADDRESSABLE_LAMBDA): vol.Schema({
vol.Required(CONF_NAME): cv.string,
vol.Required(CONF_LAMBDA): cv.lambda_,
vol.Optional(CONF_UPDATE_INTERVAL, default='0ms'): cv.positive_time_period_milliseconds,
}),
vol.Optional(CONF_FASTLED_RAINBOW): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDRainbowLightEffect),
vol.Optional(CONF_ADDRESSABLE_RAINBOW): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableRainbowLightEffect),
vol.Optional(CONF_NAME, default="Rainbow"): cv.string,
vol.Optional(CONF_SPEED): cv.uint32_t,
vol.Optional(CONF_WIDTH): cv.uint32_t,
}),
vol.Optional(CONF_FASTLED_COLOR_WIPE): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDColorWipeEffect),
vol.Optional(CONF_ADDRESSABLE_COLOR_WIPE): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableColorWipeEffect),
vol.Optional(CONF_NAME, default="Color Wipe"): cv.string,
vol.Optional(CONF_COLORS): vol.All(cv.ensure_list, [vol.Schema({
vol.Optional(CONF_COLORS): cv.ensure_list({
vol.Optional(CONF_RED, default=1.0): cv.percentage,
vol.Optional(CONF_GREEN, default=1.0): cv.percentage,
vol.Optional(CONF_BLUE, default=1.0): cv.percentage,
vol.Optional(CONF_WHITE, default=1.0): cv.percentage,
vol.Optional(CONF_RANDOM, default=False): cv.boolean,
vol.Required(CONF_NUM_LEDS): vol.All(cv.uint32_t, vol.Range(min=1)),
})]),
}),
vol.Optional(CONF_ADD_LED_INTERVAL): cv.positive_time_period_milliseconds,
vol.Optional(CONF_REVERSE): cv.boolean,
}),
vol.Optional(CONF_FASTLED_SCAN): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDScanEffect),
vol.Optional(CONF_ADDRESSABLE_SCAN): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableScanEffect),
vol.Optional(CONF_NAME, default="Scan"): cv.string,
vol.Optional(CONF_MOVE_INTERVAL): cv.positive_time_period_milliseconds,
}),
vol.Optional(CONF_FASTLED_TWINKLE): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDTwinkleEffect),
vol.Optional(CONF_ADDRESSABLE_TWINKLE): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableTwinkleEffect),
vol.Optional(CONF_NAME, default="Twinkle"): cv.string,
vol.Optional(CONF_TWINKLE_PROBABILITY): cv.percentage,
vol.Optional(CONF_PROGRESS_INTERVAL): cv.positive_time_period_milliseconds,
}),
vol.Optional(CONF_FASTLED_RANDOM_TWINKLE): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDRandomTwinkleEffect),
vol.Optional(CONF_ADDRESSABLE_RANDOM_TWINKLE): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableRandomTwinkleEffect),
vol.Optional(CONF_NAME, default="Random Twinkle"): cv.string,
vol.Optional(CONF_TWINKLE_PROBABILITY): cv.percentage,
vol.Optional(CONF_PROGRESS_INTERVAL): cv.positive_time_period_milliseconds,
}),
vol.Optional(CONF_FASTLED_FIREWORKS): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDFireworksEffect),
vol.Optional(CONF_ADDRESSABLE_FIREWORKS): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableFireworksEffect),
vol.Optional(CONF_NAME, default="Fireworks"): cv.string,
vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
vol.Optional(CONF_SPARK_PROBABILITY): cv.percentage,
vol.Optional(CONF_USE_RANDOM_COLOR): cv.boolean,
vol.Optional(CONF_FADE_OUT_RATE): cv.uint8_t,
}),
vol.Optional(CONF_FASTLED_FLICKER): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDFlickerEffect),
vol.Optional(CONF_NAME, default="FastLED Flicker"): cv.string,
vol.Optional(CONF_ADDRESSABLE_FLICKER): vol.Schema({
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableFlickerEffect),
vol.Optional(CONF_NAME, default="Addressable Flicker"): cv.string,
vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
vol.Optional(CONF_INTENSITY): cv.percentage,
}),
@@ -160,28 +180,64 @@ EFFECTS_SCHEMA = vol.Schema({
def validate_effects(allowed_effects):
def validator(value):
value = cv.ensure_list(value)
is_list = isinstance(value, list)
if not is_list:
value = [value]
names = set()
ret = []
errors = []
for i, effect in enumerate(value):
path = [i] if is_list else []
if not isinstance(effect, dict):
raise vol.Invalid("Each effect must be a dictionary, not {}".format(type(value)))
errors.append(
vol.Invalid("Each effect must be a dictionary, not {}".format(type(value)),
path)
)
continue
if len(effect) > 1:
raise vol.Invalid("Each entry in the 'effects:' option must be a single effect.")
errors.append(
vol.Invalid("Each entry in the 'effects:' option must be a single effect.",
path)
)
continue
if not effect:
raise vol.Invalid("Found no effect for the {}th entry in 'effects:'!".format(i))
errors.append(
vol.Invalid("Found no effect for the {}th entry in 'effects:'!".format(i),
path)
)
continue
key = next(iter(effect.keys()))
if key.startswith('fastled'):
errors.append(
vol.Invalid("FastLED effects have been renamed to addressable effects. "
"Please use '{}'".format(key.replace('fastled', 'addressable')),
path)
)
continue
if key not in allowed_effects:
raise vol.Invalid("The effect '{}' does not exist or is not allowed for this "
"light type".format(key))
errors.append(
vol.Invalid("The effect '{}' does not exist or is not allowed for this "
"light type".format(key), path)
)
continue
effect[key] = effect[key] or {}
conf = EFFECTS_SCHEMA(effect)
try:
conf = EFFECTS_SCHEMA(effect)
except vol.Invalid as err:
err.prepend(path)
errors.append(err)
continue
name = conf[key][CONF_NAME]
if name in names:
raise vol.Invalid(u"Found the effect name '{}' twice. All effects must have "
u"unique names".format(name))
errors.append(
vol.Invalid(u"Found the effect name '{}' twice. All effects must have "
u"unique names".format(name), [i])
)
continue
names.add(name)
ret.append(conf)
if errors:
raise vol.MultipleInvalid(errors)
return ret
return validator
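The rewritten validator collects one `vol.Invalid` per bad entry (with the list index as its path) and raises them together as a `vol.MultipleInvalid`, so the user sees every problem in `effects:` at once. A standalone demo of that pattern with plain voluptuous:

```python
# Standalone demo of the error-collection pattern above: gather one
# vol.Invalid per bad entry (with the list index as its path) and raise them
# together, instead of stopping at the first problem.
import voluptuous as vol

def validate_all_ints(value):
    errors, ret = [], []
    for i, item in enumerate(value):
        if not isinstance(item, int):
            errors.append(vol.Invalid('expected an integer', [i]))
            continue
        ret.append(item)
    if errors:
        raise vol.MultipleInvalid(errors)
    return ret

try:
    validate_all_ints([1, 'two', 3, 'four'])
except vol.MultipleInvalid as err:
    for e in err.errors:
        print(e.path, e.msg)   # [1] expected an integer / [3] expected an integer
```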
@@ -199,7 +255,7 @@ def build_effect(full_config):
key, config = next(iter(full_config.items()))
if key == CONF_LAMBDA:
lambda_ = None
for lambda_ in process_lambda(config[CONF_LAMBDA], []):
for lambda_ in process_lambda(config[CONF_LAMBDA], [], return_type=void):
yield None
yield LambdaLightEffect.new(config[CONF_NAME], lambda_, config[CONF_UPDATE_INTERVAL])
elif key == CONF_RANDOM:
@@ -233,22 +289,22 @@ def build_effect(full_config):
if CONF_INTENSITY in config:
add(effect.set_intensity(config[CONF_INTENSITY]))
yield effect
elif key == CONF_FASTLED_LAMBDA:
lambda_ = None
args = [(RawExpression('FastLEDLightOutputComponent &'), 'it')]
for lambda_ in process_lambda(config[CONF_LAMBDA], args):
elif key == CONF_ADDRESSABLE_LAMBDA:
args = [(AddressableLightRef, 'it')]
for lambda_ in process_lambda(config[CONF_LAMBDA], args, return_type=void):
yield None
yield FastLEDLambdaLightEffect.new(config[CONF_NAME], lambda_, config[CONF_UPDATE_INTERVAL])
elif key == CONF_FASTLED_RAINBOW:
rhs = FastLEDRainbowLightEffect.new(config[CONF_NAME])
yield AddressableLambdaLightEffect.new(config[CONF_NAME], lambda_,
config[CONF_UPDATE_INTERVAL])
elif key == CONF_ADDRESSABLE_RAINBOW:
rhs = AddressableRainbowLightEffect.new(config[CONF_NAME])
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
if CONF_SPEED in config:
add(effect.set_speed(config[CONF_SPEED]))
if CONF_WIDTH in config:
add(effect.set_width(config[CONF_WIDTH]))
yield effect
elif key == CONF_FASTLED_COLOR_WIPE:
rhs = FastLEDColorWipeEffect.new(config[CONF_NAME])
elif key == CONF_ADDRESSABLE_COLOR_WIPE:
rhs = AddressableColorWipeEffect.new(config[CONF_NAME])
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
if CONF_ADD_LED_INTERVAL in config:
add(effect.set_add_led_interval(config[CONF_ADD_LED_INTERVAL]))
@@ -257,40 +313,41 @@ def build_effect(full_config):
colors = []
for color in config.get(CONF_COLORS, []):
colors.append(StructInitializer(
FastLEDColorWipeEffectColor,
('r', color[CONF_RED]),
('g', color[CONF_GREEN]),
('b', color[CONF_BLUE]),
AddressableColorWipeEffectColor,
('r', int(round(color[CONF_RED] * 255))),
('g', int(round(color[CONF_GREEN] * 255))),
('b', int(round(color[CONF_BLUE] * 255))),
('w', int(round(color[CONF_WHITE] * 255))),
('random', color[CONF_RANDOM]),
('num_leds', color[CONF_NUM_LEDS]),
))
if colors:
add(effect.set_colors(ArrayInitializer(*colors)))
yield effect
elif key == CONF_FASTLED_SCAN:
rhs = FastLEDScanEffect.new(config[CONF_NAME])
elif key == CONF_ADDRESSABLE_SCAN:
rhs = AddressableScanEffect.new(config[CONF_NAME])
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
if CONF_MOVE_INTERVAL in config:
add(effect.set_move_interval(config[CONF_MOVE_INTERVAL]))
yield effect
elif key == CONF_FASTLED_TWINKLE:
rhs = FastLEDTwinkleEffect.new(config[CONF_NAME])
elif key == CONF_ADDRESSABLE_TWINKLE:
rhs = AddressableTwinkleEffect.new(config[CONF_NAME])
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
if CONF_TWINKLE_PROBABILITY in config:
add(effect.set_twinkle_probability(config[CONF_TWINKLE_PROBABILITY]))
if CONF_PROGRESS_INTERVAL in config:
add(effect.set_progress_interval(config[CONF_PROGRESS_INTERVAL]))
yield effect
elif key == CONF_FASTLED_RANDOM_TWINKLE:
rhs = FastLEDRandomTwinkleEffect.new(config[CONF_NAME])
elif key == CONF_ADDRESSABLE_RANDOM_TWINKLE:
rhs = AddressableRandomTwinkleEffect.new(config[CONF_NAME])
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
if CONF_TWINKLE_PROBABILITY in config:
add(effect.set_twinkle_probability(config[CONF_TWINKLE_PROBABILITY]))
if CONF_PROGRESS_INTERVAL in config:
add(effect.set_progress_interval(config[CONF_PROGRESS_INTERVAL]))
yield effect
elif key == CONF_FASTLED_FIREWORKS:
rhs = FastLEDFireworksEffect.new(config[CONF_NAME])
elif key == CONF_ADDRESSABLE_FIREWORKS:
rhs = AddressableFireworksEffect.new(config[CONF_NAME])
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
if CONF_UPDATE_INTERVAL in config:
add(effect.set_update_interval(config[CONF_UPDATE_INTERVAL]))
@@ -301,8 +358,8 @@ def build_effect(full_config):
if CONF_FADE_OUT_RATE in config:
add(effect.set_spark_probability(config[CONF_FADE_OUT_RATE]))
yield effect
elif key == CONF_FASTLED_FLICKER:
rhs = FastLEDFlickerEffect.new(config[CONF_NAME])
elif key == CONF_ADDRESSABLE_FLICKER:
rhs = AddressableFlickerEffect.new(config[CONF_NAME])
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
if CONF_UPDATE_INTERVAL in config:
add(effect.set_update_interval(config[CONF_UPDATE_INTERVAL]))
@@ -334,7 +391,132 @@ def setup_light_core_(light_var, mqtt_var, config):
def setup_light(light_obj, mqtt_obj, config):
light_var = Pvariable(config[CONF_ID], light_obj, has_side_effects=False)
mqtt_var = Pvariable(config[CONF_MQTT_ID], mqtt_obj, has_side_effects=False)
add_job(setup_light_core_, light_var, mqtt_var, config)
CORE.add_job(setup_light_core_, light_var, mqtt_var, config)
BUILD_FLAGS = '-DUSE_LIGHT'
CONF_LIGHT_TOGGLE = 'light.toggle'
LIGHT_TOGGLE_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(LightState),
vol.Optional(CONF_TRANSITION_LENGTH): cv.templatable(cv.positive_time_period_milliseconds),
})
@ACTION_REGISTRY.register(CONF_LIGHT_TOGGLE, LIGHT_TOGGLE_ACTION_SCHEMA)
def light_toggle_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_toggle_action(template_arg)
type = ToggleAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
if CONF_TRANSITION_LENGTH in config:
for template_ in templatable(config[CONF_TRANSITION_LENGTH], arg_type, uint32):
yield None
add(action.set_transition_length(template_))
yield action
CONF_LIGHT_TURN_OFF = 'light.turn_off'
LIGHT_TURN_OFF_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(LightState),
vol.Optional(CONF_TRANSITION_LENGTH): cv.templatable(cv.positive_time_period_milliseconds),
})
@ACTION_REGISTRY.register(CONF_LIGHT_TURN_OFF, LIGHT_TURN_OFF_ACTION_SCHEMA)
def light_turn_off_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_off_action(template_arg)
type = TurnOffAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
if CONF_TRANSITION_LENGTH in config:
for template_ in templatable(config[CONF_TRANSITION_LENGTH], arg_type, uint32):
yield None
add(action.set_transition_length(template_))
yield action
CONF_LIGHT_TURN_ON = 'light.turn_on'
LIGHT_TURN_ON_ACTION_SCHEMA = maybe_simple_id({
vol.Required(CONF_ID): cv.use_variable_id(LightState),
vol.Exclusive(CONF_TRANSITION_LENGTH, 'transformer'):
cv.templatable(cv.positive_time_period_milliseconds),
vol.Exclusive(CONF_FLASH_LENGTH, 'transformer'):
cv.templatable(cv.positive_time_period_milliseconds),
vol.Optional(CONF_BRIGHTNESS): cv.templatable(cv.percentage),
vol.Optional(CONF_RED): cv.templatable(cv.percentage),
vol.Optional(CONF_GREEN): cv.templatable(cv.percentage),
vol.Optional(CONF_BLUE): cv.templatable(cv.percentage),
vol.Optional(CONF_WHITE): cv.templatable(cv.percentage),
vol.Optional(CONF_COLOR_TEMPERATURE): cv.templatable(cv.positive_float),
vol.Optional(CONF_EFFECT): cv.templatable(cv.string),
})
@ACTION_REGISTRY.register(CONF_LIGHT_TURN_ON, LIGHT_TURN_ON_ACTION_SCHEMA)
def light_turn_on_to_code(config, action_id, arg_type, template_arg):
for var in get_variable(config[CONF_ID]):
yield None
rhs = var.make_turn_on_action(template_arg)
type = TurnOnAction.template(template_arg)
action = Pvariable(action_id, rhs, type=type)
if CONF_TRANSITION_LENGTH in config:
for template_ in templatable(config[CONF_TRANSITION_LENGTH], arg_type, uint32):
yield None
add(action.set_transition_length(template_))
if CONF_FLASH_LENGTH in config:
for template_ in templatable(config[CONF_FLASH_LENGTH], arg_type, uint32):
yield None
add(action.set_flash_length(template_))
if CONF_BRIGHTNESS in config:
for template_ in templatable(config[CONF_BRIGHTNESS], arg_type, float_):
yield None
add(action.set_brightness(template_))
if CONF_RED in config:
for template_ in templatable(config[CONF_RED], arg_type, float_):
yield None
add(action.set_red(template_))
if CONF_GREEN in config:
for template_ in templatable(config[CONF_GREEN], arg_type, float_):
yield None
add(action.set_green(template_))
if CONF_BLUE in config:
for template_ in templatable(config[CONF_BLUE], arg_type, float_):
yield None
add(action.set_blue(template_))
if CONF_WHITE in config:
for template_ in templatable(config[CONF_WHITE], arg_type, float_):
yield None
add(action.set_white(template_))
if CONF_COLOR_TEMPERATURE in config:
for template_ in templatable(config[CONF_COLOR_TEMPERATURE], arg_type, float_):
yield None
add(action.set_color_temperature(template_))
if CONF_EFFECT in config:
for template_ in templatable(config[CONF_EFFECT], arg_type, std_string):
yield None
add(action.set_effect(template_))
yield action
def core_to_hass_config(data, config, brightness=True, rgb=True, color_temp=True,
white_value=True):
ret = mqtt.build_hass_config(data, 'light', config, include_state=True, include_command=True)
if ret is None:
return None
ret['schema'] = 'json'
if brightness:
ret['brightness'] = True
if rgb:
ret['rgb'] = True
if color_temp:
ret['color_temp'] = True
if white_value:
ret['white_value'] = True
for effect in config.get(CONF_EFFECTS, []):
ret["effect"] = True
effects = ret.setdefault("effect_list", [])
effects.append(next(x for x in effect.values())[CONF_NAME])
return ret
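For reference, the discovery payload this builds for an RGBW JSON-schema light with two effects has roughly the following shape (topic values are placeholders; `mqtt.build_hass_config` supplies the real ones):

```python
# Rough shape of the MQTT discovery payload core_to_hass_config assembles for
# an RGBW JSON-schema light with two effects. Topic values are placeholders;
# the real ones come from mqtt.build_hass_config.
hass_light_config = {
    'name': 'Living Room Light',
    'state_topic': '<prefix>/light/living_room_light/state',
    'command_topic': '<prefix>/light/living_room_light/command',
    'schema': 'json',
    'brightness': True,
    'rgb': True,
    'color_temp': True,
    'white_value': True,
    'effect': True,
    'effect_list': ['Random', 'Strobe'],
}
print(hass_light_config)
```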

@@ -1,21 +1,28 @@
import voluptuous as vol
from esphomeyaml.components import light, output
import esphomeyaml.config_validation as cv
from esphomeyaml.components import light
from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME, CONF_OUTPUT, CONF_EFFECTS
from esphomeyaml.helpers import App, get_variable, variable
from esphomeyaml.const import CONF_EFFECTS, CONF_MAKE_ID, CONF_NAME, CONF_OUTPUT
from esphomeyaml.cpp_generator import get_variable, variable
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App
PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),
vol.Required(CONF_OUTPUT): cv.use_variable_id(None),
vol.Required(CONF_OUTPUT): cv.use_variable_id(output.BinaryOutput),
vol.Optional(CONF_EFFECTS): light.validate_effects(light.BINARY_EFFECTS),
}))
}).extend(cv.COMPONENT_SCHEMA.schema))
def to_code(config):
output = None
for output in get_variable(config[CONF_OUTPUT]):
for output_ in get_variable(config[CONF_OUTPUT]):
yield
rhs = App.make_binary_light(config[CONF_NAME], output)
rhs = App.make_binary_light(config[CONF_NAME], output_)
light_struct = variable(config[CONF_MAKE_ID], rhs)
light.setup_light(light_struct.Pstate, light_struct.Pmqtt, config)
setup_component(light_struct.Pstate, config)
def to_hass_config(data, config):
return light.core_to_hass_config(data, config, brightness=False, rgb=False, color_temp=False,
white_value=False)

@@ -1,25 +1,27 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml.components import light
from esphomeyaml.components import light, output
from esphomeyaml.components.light.rgbww import validate_cold_white_colder, \
validate_color_temperature
from esphomeyaml.const import CONF_COLD_WHITE, CONF_COLD_WHITE_COLOR_TEMPERATURE, \
CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, CONF_GAMMA_CORRECT, CONF_MAKE_ID, \
CONF_NAME, CONF_WARM_WHITE, CONF_WARM_WHITE_COLOR_TEMPERATURE
from esphomeyaml.helpers import App, get_variable, variable
from esphomeyaml.cpp_generator import get_variable, variable
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App
PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),
vol.Required(CONF_COLD_WHITE): cv.use_variable_id(None),
vol.Required(CONF_WARM_WHITE): cv.use_variable_id(None),
vol.Required(CONF_COLD_WHITE): cv.use_variable_id(output.FloatOutput),
vol.Required(CONF_WARM_WHITE): cv.use_variable_id(output.FloatOutput),
vol.Required(CONF_COLD_WHITE_COLOR_TEMPERATURE): validate_color_temperature,
vol.Required(CONF_WARM_WHITE_COLOR_TEMPERATURE): validate_color_temperature,
vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
vol.Optional(CONF_EFFECTS): light.validate_effects(light.MONOCHROMATIC_EFFECTS),
}), validate_cold_white_colder)
}).extend(cv.COMPONENT_SCHEMA.schema), validate_cold_white_colder)
def to_code(config):
@@ -32,3 +34,9 @@ def to_code(config):
cold_white, warm_white)
light_struct = variable(config[CONF_MAKE_ID], rhs)
light.setup_light(light_struct.Pstate, light_struct.Pmqtt, config)
setup_component(light_struct.Pstate, config)
def to_hass_config(data, config):
return light.core_to_hass_config(data, config, brightness=True, rgb=False, color_temp=True,
white_value=False)

@@ -1,14 +1,15 @@
import voluptuous as vol
import esphomeyaml.config_validation as cv
from esphomeyaml import pins
from esphomeyaml.components import light
from esphomeyaml.components.power_supply import PowerSupplyComponent
from esphomeyaml.const import CONF_CHIPSET, CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, \
CONF_MAKE_ID, CONF_MAX_REFRESH_RATE, CONF_NAME, CONF_NUM_LEDS, CONF_PIN, CONF_POWER_SUPPLY, \
CONF_RGB_ORDER, CONF_EFFECTS
from esphomeyaml.helpers import App, Application, RawExpression, TemplateArguments, add, \
get_variable, variable
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_CHIPSET, CONF_COLOR_CORRECT, CONF_DEFAULT_TRANSITION_LENGTH, \
CONF_EFFECTS, CONF_GAMMA_CORRECT, CONF_MAKE_ID, CONF_MAX_REFRESH_RATE, CONF_NAME, \
CONF_NUM_LEDS, CONF_PIN, CONF_POWER_SUPPLY, CONF_RGB_ORDER
from esphomeyaml.cpp_generator import RawExpression, TemplateArguments, add, get_variable, variable
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, Application
TYPES = [
'NEOPIXEL',
@@ -53,23 +54,24 @@ def validate(value):
return value
MakeFastLEDLight = Application.MakeFastLEDLight
MakeFastLEDLight = Application.struct('MakeFastLEDLight')
PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeFastLEDLight),
vol.Required(CONF_CHIPSET): vol.All(vol.Upper, cv.one_of(*TYPES)),
vol.Required(CONF_CHIPSET): cv.one_of(*TYPES, upper=True),
vol.Required(CONF_PIN): pins.output_pin,
vol.Required(CONF_NUM_LEDS): cv.positive_not_null_int,
vol.Optional(CONF_MAX_REFRESH_RATE): cv.positive_time_period_microseconds,
vol.Optional(CONF_RGB_ORDER): vol.All(vol.Upper, cv.one_of(*RGB_ORDERS)),
vol.Optional(CONF_RGB_ORDER): cv.one_of(*RGB_ORDERS, upper=True),
vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
vol.Optional(CONF_COLOR_CORRECT): vol.All([cv.percentage], vol.Length(min=3, max=3)),
vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
vol.Optional(CONF_POWER_SUPPLY): cv.use_variable_id(PowerSupplyComponent),
vol.Optional(CONF_EFFECTS): light.validate_effects(light.FASTLED_EFFECTS),
}), validate)
vol.Optional(CONF_EFFECTS): light.validate_effects(light.ADDRESSABLE_EFFECTS),
}).extend(cv.COMPONENT_SCHEMA.schema), validate)
def to_code(config):
@@ -88,12 +90,23 @@ def to_code(config):
add(fast_led.set_max_refresh_rate(config[CONF_MAX_REFRESH_RATE]))
if CONF_POWER_SUPPLY in config:
power_supply = None
for power_supply in get_variable(config[CONF_POWER_SUPPLY]):
yield
add(fast_led.set_power_supply(power_supply))
if CONF_COLOR_CORRECT in config:
r, g, b = config[CONF_COLOR_CORRECT]
add(fast_led.set_correction(r, g, b))
light.setup_light(make.Pstate, make.Pmqtt, config)
setup_component(fast_led, config)
BUILD_FLAGS = '-DUSE_FAST_LED_LIGHT'
LIB_DEPS = 'FastLED@3.2.0'
def to_hass_config(data, config):
return light.core_to_hass_config(data, config, brightness=True, rgb=True, color_temp=False,
white_value=False)
