diff --git a/_config.yml b/_config.yml index 27f14114207..e5b0a64f9e4 100644 --- a/_config.yml +++ b/_config.yml @@ -140,11 +140,11 @@ social: # Home Assistant release details current_major_version: 0 -current_minor_version: 43 -current_patch_version: 2 -date_released: 2017-04-27 +current_minor_version: 44 +current_patch_version: 0 +date_released: 2017-05-06 # Either # or the anchor link to latest release notes in the blog post. # Must be prefixed with a # and have double quotes around it. # Example #release-0431---april-25 -patch_version_notes: "#release-0432---april-27" +patch_version_notes: "" diff --git a/source/_components/binary_sensor.eight_sleep.markdown b/source/_components/binary_sensor.eight_sleep.markdown new file mode 100644 index 00000000000..b8dc41c9f9e --- /dev/null +++ b/source/_components/binary_sensor.eight_sleep.markdown @@ -0,0 +1,18 @@ +--- +layout: page +title: "Eight Sleep Binary Sensor" +description: "Instructions how to integrate binary motion sensors for Eight Sleep within Home Assistant." +date: 2017-04-24 00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: eight_sleep.png +ha_category: Binary Sensor +ha_release: "0.44" +--- + + +The `eight_sleep` binary sensor platform lets you observe the presence state of a [Eight Sleep](https://eightsleep.com/) cover/mattress through Home Assistant. + +Devices will be configured automatically. Please refer to the [component](/components/eight_sleep/) configuration on how to setup. diff --git a/source/_components/binary_sensor.maxcube.markdown b/source/_components/binary_sensor.maxcube.markdown new file mode 100644 index 00000000000..ac6e5299ba9 --- /dev/null +++ b/source/_components/binary_sensor.maxcube.markdown @@ -0,0 +1,16 @@ +--- +layout: page +title: "eQ-3 MAX! Cube binary sensors" +description: "Instructions on how to integrate eQ-3 MAX! components with Home Assistant via eQ-3 MAX! Cube." +date: 2017-02-04 22:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: maxcube.png +ha_category: Climate +ha_release: "0.40" +ha_iot_class: "Local Polling" +--- + +See instructions at the [main component](/components/maxcube/). diff --git a/source/_components/binary_sensor.pilight.markdown b/source/_components/binary_sensor.pilight.markdown new file mode 100644 index 00000000000..8851767bd21 --- /dev/null +++ b/source/_components/binary_sensor.pilight.markdown @@ -0,0 +1,37 @@ +--- +layout: page +title: "Pilight Binary Sensor" +description: "Instructions how to integrate pilight binary sensors within Home Assistant." +date: 2017-03-24 20:41 +sidebar: true +comments: false +sharing: true +footer: true +logo: pilight.png +ha_category: Binary Sensor +ha_release: 0.44 +ha_iot_class: Local Poll +--- + +This component implement the [pilight hub](https://github.com/home-assistant/home-assistant.github.io/source/_components/pilight.markdown) binary sensor functionality. +Two type of pilight binary sensor configuration available. A normal sensor which send the on and off state cyclical and a trigger sensor which send only a trigger when an event happend (for example lots of cheap PIR motion detector) (see example configuration below). 
+ +```yaml +# Example configuration.yml entry +binary_sensor: + - platform: pilight + name: 'Motion' + variable: 'state' + payload: + unitcode: 371399 + payload_on: 'closed' + disarm_after_trigger: True  # use this if you want trigger-type behavior +``` + +Configuration variables: +- **variable** (*Required*): The variable name in the data stream that defines the sensor value. +- **payload** (*Required*): Message payload identifiers. The sensor value is only set if all identifiers match. +- **name** (*Optional*): Name of the sensor. +- **payload_on** (*Optional*): Variable `on` value. The component will recognize this as logical '1'. +- **payload_off** (*Optional*): Variable `off` value. The component will recognize this as logical '0'. +- **disarm_after_trigger** (*Optional*): Configures the sensor as a trigger type. diff --git a/source/_components/binary_sensor.ping.markdown b/source/_components/binary_sensor.ping.markdown index 615dcdd2065..613e64316b1 100644 --- a/source/_components/binary_sensor.ping.markdown +++ b/source/_components/binary_sensor.ping.markdown @@ -38,6 +38,6 @@ The sensor exposes the different round trip times values measured by `ping` as a - `round trip time max`
-This sensor was only tested on a Linux-based system. +When run on Windows systems, the round trip time attributes are rounded to the nearest millisecond and the mdev value is unavailable.
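If you want to track one of these attributes as its own entity, a template sensor can expose it. The snippet below is an illustrative sketch only: the entity id `binary_sensor.ping_target` and the sensor name are assumptions and depend on the `name` you gave your ping sensor.

```yaml
# Sketch only - adjust the binary_sensor entity id to match your setup
sensor:
  - platform: template
    sensors:
      ping_rtt_max:
        friendly_name: "Ping round trip time (max)"
        unit_of_measurement: ms
        value_template: '{% raw %}{{ states.binary_sensor.ping_target.attributes["round trip time max"] }}{% endraw %}'
```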
diff --git a/source/_components/binary_sensor.zha.markdown b/source/_components/binary_sensor.zha.markdown new file mode 100644 index 00000000000..c976c2fb698 --- /dev/null +++ b/source/_components/binary_sensor.zha.markdown @@ -0,0 +1,16 @@ +--- +layout: page +title: "ZigBee Home Automation Binary Sensor" +description: "Instructions how to setup ZigBee Home Automation binary sensors within Home Assistant." +date: 2017-02-22 00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: zigbee.png +ha_category: Binary Sensor +--- + +To get your ZigBee binary sensors working with Home Assistant, follow the +instructions for the general [ZigBee Home Automation +component](/components/zha/). diff --git a/source/_components/climate.ecobee.markdown b/source/_components/climate.ecobee.markdown index 930f2988c48..a6052218898 100644 --- a/source/_components/climate.ecobee.markdown +++ b/source/_components/climate.ecobee.markdown @@ -61,7 +61,7 @@ The following attributes are provided by the Ecobee Thermostat: `target_temperature_low`, `target_temperature_high`, `desired_fan_mode`, `fan`, `current_hold_mode`, `current_operation`, `operation_list`, `operation_mode`, `mode`, `fan_min_on_time`, `device_state_attributes`, -`is_away_mode_on`, `vacation`, `climate_list`. +`is_away_mode_on`, `vacation`, `climate_list`, `aux_heat`. The attributes `min_temp` and `max_temp` are meaningless constant values. @@ -202,6 +202,13 @@ Returns the currently active vacation or `None`. Returns the list of climates defined in the thermostat. +### {% linkable_title Attribute `aux_heat` %} + +Returns the current auxiliary heat state. + +| Attribute type | Description | +| ---------------| ----------- | +| String | 'on', 'off' ## {% linkable_title Services %} diff --git a/source/_components/climate.maxcube.markdown b/source/_components/climate.maxcube.markdown new file mode 100644 index 00000000000..ac6e5299ba9 --- /dev/null +++ b/source/_components/climate.maxcube.markdown @@ -0,0 +1,16 @@ +--- +layout: page +title: "eQ-3 MAX! Cube binary sensors" +description: "Instructions on how to integrate eQ-3 MAX! components with Home Assistant via eQ-3 MAX! Cube." +date: 2017-02-04 22:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: maxcube.png +ha_category: Climate +ha_release: "0.40" +ha_iot_class: "Local Polling" +--- + +See instructions at the [main component](/components/maxcube/). diff --git a/source/_components/climate.sensibo.markdown b/source/_components/climate.sensibo.markdown new file mode 100644 index 00000000000..71111a88477 --- /dev/null +++ b/source/_components/climate.sensibo.markdown @@ -0,0 +1,48 @@ +--- +layout: page +title: "Sensibo A/C controller" +description: "Instructions how to integrate Sensibo A/C controller into Home Assistant." +date: 2017-04-01 15:00 +0200 +sidebar: true +comments: false +sharing: true +footer: true +logo: sensibo.png +ha_category: Climate +ha_release: 0.44 +ha_iot_class: "Cloud Polling" +--- + +Integrates [Sensibo](https://sensibo.com) Air Conditioning controller into Home Assistant. + +To enable this platform, add the following lines to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +climate: + - platform: sensibo + api_key:+If you create the API key using a dedicated user (and not your main user), +then in the Sensibo app log you will be able to distinguish between actions +done in the app and actions done by Home Assistant. +
+ +### {% linkable_title Full config example %} +```yaml +climate: + - platform: sensibo + api_key: deadbeaf + id: + - id1 + - id2 +``` diff --git a/source/_components/cover.mqtt.markdown b/source/_components/cover.mqtt.markdown old mode 100644 new mode 100755 index d70dc6ed88e..05b5f801c0c --- a/source/_components/cover.mqtt.markdown +++ b/source/_components/cover.mqtt.markdown @@ -46,11 +46,41 @@ Configuration variables: - **qos** (*Optional*): The maximum QoS level of the state topic. Default is `0`. Will also be used when publishing messages. - **retain** (*Optional*): If the published message should have the retain flag on or not. Default is `false`. - **value_template** (*Optional*): Defines a [template](/topics/templating/) to extract a value from the payload. +- **tilt_command_topic** (*Optional*): The MQTT topic to publish commands to control the cover tilt. +- **tilt_status_topic** (*Optional*): The MQTT topic subscribed to receive tilt status update values. +- **tilt_min** (*Optional*): The minimum tilt value. Default is `0` +- **tilt_max** (*Optional*): The maximum tilt value. Default is `100` +- **tilt_closed_value** (*Optional*): The value that will be sent on a `close_cover_tilt` command. Default is `0` +- **tilt_opened_value** (*Optional*): The value that will be sent on an `open_cover_tilt` command. Default is `100` +- **tilt_status_optimistic** (*Optional*): Flag that determines if tilt works in optimistic mode. Default is `true` if `tilt_status_topic` is not deinfed, else `false` +- **tilt_invert_state** (*Optional*): Flag that determines if open/close are flipped; higher values toward closed and lower values toward open. Default is `False` ## {% linkable_title Examples %} In this section you find some real life examples of how to use this sensor. +### {% linkable_title Full configuration without tilt %} + +The example below shows a full configuration for a cover without tilt. + +```yaml +# Example configuration.yml entry +cover: + - platform: mqtt + state_topic: "home-assistant/cover" + command_topic: "home-assistant/cover/set" + name: "MQTT Cover" + qos: 0 + retain: true + payload_open: "OPEN" + payload_close: "CLOSE" + payload_stop: "STOP" + state_open: "OPEN" + state_closed: "STATE" + optimistic: false + value_template: '{% raw %}{{ value.x }}{% endraw %}' +``` + ### {% linkable_title Full configuration %} The example below shows a full configuration for a cover. @@ -71,6 +101,12 @@ cover: state_closed: "STATE" optimistic: false value_template: '{% raw %}{{ value.x }}{% endraw %}' + tilt_command_topic: 'home-assistant/cover/tilt' + tilt_status_topic: 'home-assistant/cover/tilt-status' + tilt_min: 0 + tilt_max: 180 + tilt_closed_value: 70 + tilt_opened_value: 180 ``` For a check you can use the command line tools `mosquitto_pub` shipped with `mosquitto` to send MQTT messages. This allows you to operate your cover manually: diff --git a/source/_components/cover.opengarage.markdown b/source/_components/cover.opengarage.markdown new file mode 100644 index 00000000000..7ee7fccb49b --- /dev/null +++ b/source/_components/cover.opengarage.markdown @@ -0,0 +1,109 @@ +--- +layout: page +title: "OpenGarage Cover" +description: "Instructions how to integrate OpenGarage.io covers within Home Assistant." 
+date: 2017-04-07 14:25 +sidebar: true +comments: false +sharing: true +footer: true +logo: opengarage.png +ha_category: Cover +ha_release: 0.44 +--- + + +The `opengarage` cover platform lets you control the open-source [OpenGarage.io](https://opengarage.io/) device through Home Assistant. + +To enable OpenGarage Covers in your installation, add the following to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +cover: + platform: opengarage + covers: + garage: + host: 192.168.1.12 + device_key: opendoor + name: Left Garage Door + garage2: + host: 192.168.1.13 + device_key: opendoor + name: Right Garage Door +``` + +Configuration variables: + +- **covers** array (*Required*): List of your doors. + - **identifier** (*Required*): Name of the cover as slug. Multiple entries are possible. + - **host** (*Required*): IP address of device. + - **port** (*Optional*): HTTP Port. Default is `80`. + - **device_key** (*Required*): Access key to control device. Default is `opendoor`. + - **name** (*Optional*): Name to use in the Frontend. If not provided, it will use name configured in device. + + + +**Example with more detail:** +
+
+
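The more detailed example itself is not reproduced in this excerpt. As a rough illustration only, using nothing beyond the options documented above (`host`, `port`, `device_key`, `name`) and made-up values, it might look like this:

```yaml
# Hypothetical values - host, port and device_key must match your own device
cover:
  platform: opengarage
  covers:
    garage:
      host: 192.168.1.12
      port: 8080
      device_key: MySecretKey
      name: Left Garage Door
```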
-TP-Link devices typically only allow one login at a time to the admin console. This component will count towards your one allowed login. Depending on how aggressively you configure device_tracker you may not be able to access the admin console of your TP-Link device without first stopping Home Assistant (and waiting a few minutes for the session to timeout) before you'll be able to login. +TP-Link devices typically only allow one login at a time to the admin console. This component will count towards your one allowed login. Depending on how aggressively you configure `device_tracker`, you may not be able to access the admin console of your TP-Link device without first stopping Home Assistant. Home Assistant takes a few seconds to log in, collect data, and log out. If you log into the admin console manually, remember to log out so that Home Assistant can log in again.
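One way to make the tracker less aggressive is to raise the generic `device_tracker` scan interval. The snippet below is a sketch, not taken from this page: `interval_seconds` and `consider_home` are the standard `device_tracker` options, and the host and credential values are placeholders.

```yaml
# Sketch only - placeholder credentials; a larger interval_seconds means fewer logins
device_tracker:
  - platform: tplink
    host: 192.168.1.1
    username: admin
    password: YOUR_ROUTER_PASSWORD
    interval_seconds: 60
    consider_home: 300
```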
diff --git a/source/_components/eight_sleep.markdown b/source/_components/eight_sleep.markdown new file mode 100644 index 00000000000..ee8f95b789d --- /dev/null +++ b/source/_components/eight_sleep.markdown @@ -0,0 +1,67 @@ +--- +layout: page +title: "Eight Sleep" +description: "Interface an Eight Sleep smart cover or mattress to Home Assistant" +date: 2017-04-24 00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: eight_sleep.png +ha_category: Hub +ha_release: "0.44" +--- + +The `eight_sleep` component allows Home Assistant to fetch data from your [Eight Sleep](https://eightsleep.com/) smart cover or mattress. + +It's setup utilizing 'Sensor' components to convey the current state of your bed and results of your sleep sessions and a 'Binary Sensor' component to indicate your presence in the bed. A service is also provided to set the heating level and duration of the bed. + +To get started add the following information to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +eight_sleep: + username: "user@email.com" + password: "password" +``` + +Configuration variables: + +- **username** (*Required*): The email address associated with your Eight Sleep account. +- **password** (*Required*): The password associated with your Eight Sleep account. +- **partner** (*Optional*): Default is False. Defines if you'd like to fetch data for both sides of the bed. + +### {% linkable_title Supported features %} + +Sensors: + +- eight_left/right_bed_state +- eight_left/right_sleep_session +- eight_left/right_previous_sleep_session + +Binary Sensors: + +- eight_left/right_bed_presence + +### {% linkable_title Service `heat_set` %} + +You can use the service eight_sleep/heat_set to adjust the target heating level and heating duration of your bed. + +| Service data attribute | Optional | Description | +| ---------------------- | -------- | ----------- | +| `entity_id` | no | Entity ID of bed state to adjust. +| `target` | no | Target heating level from 0-100. +| `duration` | no | Duration to heat at the target level in seconds. + +Script Example: + +```yaml +script: + bed_set_heat: + sequence: + - service: eight_sleep.heat_set + data: + entity_id: "sensor.eight_left_bed_state" + target: 35 + duration: 3600 +``` diff --git a/source/_components/image_processing.dlib_face_detect.markdown b/source/_components/image_processing.dlib_face_detect.markdown new file mode 100644 index 00000000000..e6a9fc48981 --- /dev/null +++ b/source/_components/image_processing.dlib_face_detect.markdown @@ -0,0 +1,34 @@ +--- +layout: page +title: "Dlib Face Detect" +description: "Instructions how to integrate Dlib Face Detect into Home Assistant." +date: 2017-05-05 00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: dlib.png +ha_category: Image Processing +featured: false +ha_release: 0.44 +--- + +The `dlib_face_detect` image processing platform allows you to use the [Dlib](http://www.dlib.net/) through Home Assistant. This platform enables you do detect face on camera and fire a event with attributes. + +For using the result inside an automation rule, take a look at the [component](/components/image_processing/) page. + +### {% linkable_title Configuration Home Assistant %} + +```yaml +# Example configuration.yaml entry +image_processing: + - platform: dlib_face_detect + source: + - entity_id: camera.door +``` + +Configuration variables: + +- **source** array (*Required*): List of image sources. 
+ - **entity_id** (*Required*): A camera entity id to get picture from. + - **name** (*Optional*): This parameter allows you to override the name of your `image_processing` entity. diff --git a/source/_components/image_processing.dlib_face_identify.markdown b/source/_components/image_processing.dlib_face_identify.markdown new file mode 100644 index 00000000000..f7f1ab1e75b --- /dev/null +++ b/source/_components/image_processing.dlib_face_identify.markdown @@ -0,0 +1,38 @@ +--- +layout: page +title: "Dlib Face Identify" +description: "Instructions how to integrate Dlib Face Identify into Home Assistant." +date: 2017-01-25 00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: dlib.png +ha_category: Image Processing +featured: false +ha_release: 0.44 +--- + +The `dlib_face_identify` image processing platform allows you to use the [Dlib](http://www.dlib.net/) through Home Assistant. This platform allow you do identify persons on camera and fire a event with identify persons. + +For using the result inside an automation rule, take a look at the [component](/components/image_processing/) page. + +### {% linkable_title Configuration Home Assistant %} + +```yaml +# Example configuration.yaml entry +image_processing: + - platform: dlib_face_identify + source: + - entity_id: camera.door + faces: + Jon: /home/hass/jon.jpg + Bob: /home/hass/bob.jpg +``` + +Configuration variables: + +- **source** array (*Required*): List of image sources. + - **entity_id** (*Required*): A camera entity id to get picture from. + - **name** (*Optional*): This parameter allows you to override the name of your `image_processing` entity. +- **faces** array (*Required*): List of faces sources. diff --git a/source/_components/image_processing.microsoft_face_detect.markdown b/source/_components/image_processing.microsoft_face_detect.markdown index aa2a1cd1a83..c15e5baab07 100644 --- a/source/_components/image_processing.microsoft_face_detect.markdown +++ b/source/_components/image_processing.microsoft_face_detect.markdown @@ -31,7 +31,6 @@ image_processing: Configuration variables: -- **group** (*Required*): Microsoft Face group used to detect the person. - **confidence** (*Optional*): The minimum of confidence in percent to process with Home Assistant. Defaults to 80. - **source** array (*Required*): List of image sources. - **entity_id** (*Required*): A camera entity id to get picture from. diff --git a/source/_components/image_processing.opencv.markdown b/source/_components/image_processing.opencv.markdown new file mode 100644 index 00000000000..99554722af6 --- /dev/null +++ b/source/_components/image_processing.opencv.markdown @@ -0,0 +1,18 @@ +--- +layout: page +title: "OpenCV" +description: "Instructions how to integrate OpenCV image processing into Home Assistant." +date: 2017-01-25 00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: opencv.png +ha_category: Image Processing +featured: false +ha_release: 0.44 +--- + +The `opencv` image processing platform allows you to create a standalone image processor without the linked camera entity as mentioned in the [OpenCV page](https://home-assistant.io/components/opencv). + +Please refer to the [component](/components/opencv/) configuration on how to setup the image processor. 
diff --git a/source/_components/influxdb.markdown b/source/_components/influxdb.markdown index 4c69b010c1a..14b00a1a0fb 100644 --- a/source/_components/influxdb.markdown +++ b/source/_components/influxdb.markdown @@ -36,8 +36,12 @@ Configuration variables: - **verify_ssl** (*Optional*): Verify SSL certificate for https request. Defaults to false. - **default_measurement** (*Optional*): Measurement name to use when an entity doesn't have a unit. Defaults to entity id. - **override_measurement** (*Optional*): Measurement name to use instead of unit or default measurement. This will store all data points in a single measurement. -- **blacklist** (*Optional*): List of entities that should not be logged to InfluxDB. -- **whitelist** (*Optional*): List of the entities (only) that will be logged to InfluxDB. If not set, all entities will be logged. Values set by the **blacklist** option will prevail. +- **exclude** (*Optional*): Configure which components should be excluded from recording to InfluxDB. + - **entities** (*Optional*): The list of entity ids to be excluded from recording to InfluxDB. + - **domains** (*Optional*): The list of domains to be excluded from recording to InfluxDB. +- **include** (*Optional*): Configure which components should be included in recordings to InfluxDB. If set, all other entities will not be recorded to InfluxDB. Values set by the **blacklist** option will prevail. + - **entities** (*Optional*): The list of entity ids to be included from recordings to InfluxDB. + - **domains** (*Optional*): The list of domains to be included from recordings to InfluxDB. - **tags** (*Optional*): Tags to mark the data. ## {% linkable_title Data migration %} @@ -98,6 +102,62 @@ optional arguments: - The step option defaults to `1000`. +## {% linkable_title Data import script %} + +If you want to import all the recorded data from your recorder database you can use the data import script. +It will read all your state_change events from the database and add them as data-points to the InfluxDB. +You can specify the source database either by pointing the `--config` option to the config directory which includes the default sqlite database or by giving a sqlalchemy connection URI with `--uri`. +The writing to InfluxDB is done in batches that can be changed with `--step`. + +You can control, which data is imported by using the commandline options `--exclude-entities` and `--exclude-domain`. +Both get a comma separated list of either entity-ids or domain names that are excluded from the import. + +To test what gets imported you can use the `--simulate` option, which disables the actual write to the InfluxDB instance. +This only writes the statistics how much points would be imported from which entity. 
+ +Example to run the script: + +```bash +$ hass --script influxdb_import --config CONFIG_DIR \ + -H IP_INFLUXDB_HOST -u INFLUXDB_USERNAME -p INFLUXDB_PASSWORD \ + --dbname INFLUXDB_DB_NAME --exclude-domain automation,configurator +``` +Script arguments: + +``` +required arguments: + -d dbname, --dbname dbname + InfluxDB database name + +optional arguments: + -h, --help show this help message and exit + -c path_to_config_dir, --config path_to_config_dir + Directory that contains the Home Assistant + configuration + --uri URI Connect to URI and import (if other than default + sqlite) eg: mysql://localhost/homeassistant + + -H host, --host host InfluxDB host address + -P port, --port port InfluxDB host port + -u username, --username username + InfluxDB username + -p password, --password password + InfluxDB password + -s step, --step step How many points to import at the same time + -t tags, --tags tags Comma separated list of tags (key:value) for all + points + -D default_measurement, --default-measurement default_measurement + Store all your points in the same measurement + -o override_measurement, --override-measurement override_measurement + Store all your points in the same measurement + -e exclude_entities, --exclude_entities exclude_entities + Comma separated list of excluded entities + -E exclude_domains, --exclude_domains exclude_domains + Comma separated list of excluded domains + -S, --simulate Do not write points but simulate preprocessing + and print statistics +``` + ## {% linkable_title Examples %} @@ -113,12 +173,16 @@ influxdb: ssl: true verify_ssl: true default_measurement: state - blacklist: - - entity.id1 - - entity.id2 - whitelist: - - entity.id3 - - entity.id4 + exclude: + entities: + - entity.id1 + - entity.id2 + domains: + - automation + include: + entities: + - entity.id3 + - entity.id4 tags: instance: prod source: hass diff --git a/source/_components/joaoapps_join.markdown b/source/_components/joaoapps_join.markdown index ca6ea4650c2..a66389a6c88 100644 --- a/source/_components/joaoapps_join.markdown +++ b/source/_components/joaoapps_join.markdown @@ -15,16 +15,19 @@ ha_release: "0.24" The Join platform exposes services from [Join](http://joaoapps.com/join). In Home Assistant, the Join features are divided up in two locations, the Join component, and the Join notify platform. The notify platform allows us to send messages to Join devices, the the component allows us to access the other special features that Join offers. -In the `configuration.yaml` file you need to provide the device id of the target device. If you want to send to a group of devices, you need to provide an api key. You can find you device id and api key [here](https://joinjoaomgcd.appspot.com/). +In the `configuration.yaml` file you need to provide the api key and device id or name of the target device. You can find your device id and api key [here](https://joinjoaomgcd.appspot.com/). 
To set it up, add the following information to your `configuration.yaml` file: ```yaml notify: - platform: joaoapps_join - device_id: d5asdfasdf54645h45h368761dfe5gt8a - name: droid *optional - api_key: asd97823jb628a34fwsdfwefd5384345tf2d *optional + api_key: asd97823jb628a34fwsdfwefd5384345tf2d + device_id: d5asdfasdf54645h45h368761dfe5gt8a *optional + device_ids: d5asdfasdf54645h45h368761dfe5gt8a, a4asdfasdf54645h45h368761dfe5gt3b *optional + device_names: Pixel, iPhone *optional + name: Phones *optional + joaoapps_join: - name: android @@ -34,23 +37,25 @@ joaoapps_join: Configuration variables: -- **device_id** (*Required*): The Id of your device. - **api_key** (*Required*): The API key for Join. +- **device_id** (*Optional*): The id of your device. +- **device_ids** (*Optional*): Comma separated list of device ids. +- **device_names** (*Optional*): Comma separated list of device names. -The notify service has two optional parameters: `icon` and `small icon`. You can use them like so: +The notify service has two optional parameters: `icon` and `vibration`. You can use them like so: ```json -{"message":"Hello!","title":"From Hass","data":{"icon":"https://goo.gl/KVqcYi","smallicon":"http://goo.gl/AU4Wf1"}} +{"message":"Hello from Home Assistant!","title":"Home Assistant","data":{"icon":"https://goo.gl/xeetdy", "vibration":"0,65,706,86,657,95,668,100"}} ``` The services exposed in the joaoapps_join component can be used with the service data described below: -| Service | Data | -|------------------------------ |------------------------------------------------------------------ | -| joaoapps_join/ring | | -| joaoapps_join/send_sms | {"number":"5553334444", "message":"Hello!"} | -| joaoapps_join/send_tasker | {"command":"test"} | -| joaoapps_join/send_url | {"url":"http://google.com"} | -| joaoapps_join/send_wallpaper | {"url":"http://www.planwallpaper.com/static/images/ZhGEqAP.jpg"} | -| joaoapps_join/send_file | {"url":"http://download.thinkbroadband.com/5MB.zip"} | +| Service | Data | +|------------------------------ |------------------------------------------------------------------ | +| joaoapps_join/ring | | +| joaoapps_join/send_sms | {"number":"5553334444", "message":"Hello!"} | +| joaoapps_join/send_tasker | {"command":"test"} | +| joaoapps_join/send_url | {"url":"http://google.com"} | +| joaoapps_join/send_wallpaper | {"url":"http://www.planwallpaper.com/static/images/ZhGEqAP.jpg"} | +| joaoapps_join/send_file | {"url":"http://download.thinkbroadband.com/5MB.zip"} | diff --git a/source/_components/light.blinkt.markdown b/source/_components/light.blinkt.markdown new file mode 100644 index 00000000000..cb268a83ac0 --- /dev/null +++ b/source/_components/light.blinkt.markdown @@ -0,0 +1,23 @@ +--- +layout: page +title: "Blinkt!" +description: "Instructions how to setup Blinkt! RGB LED lights within Home Assistant." +date: 2017-04-30 9:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: raspberry-pi.png +ha_category: Light +ha_iot_class: "Local Push" +--- + +The `blinkt` light platform lets you control the [Blinkt!](https://shop.pimoroni.com/products/blinkt) board, featuring eight super-bright RGB LEDs. 
+ +To enable `blinkt` in your installation, add the following to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +light: + - platform: blinkt +``` diff --git a/source/_components/light.flux_led.markdown b/source/_components/light.flux_led.markdown index e85e3083823..e74e6942d96 100644 --- a/source/_components/light.flux_led.markdown +++ b/source/_components/light.flux_led.markdown @@ -54,7 +54,7 @@ Configuration variables within devices list:Depending on your controller or bulb type, there are two ways to configure brightness. -The component defaults to rgbw. If your device has a separate white channel, you do not need to specify anything else; changing the brighness will set the device to white with your chosen brightness. However, if your device does not have a separate white channel, you will need to set the mode to rgb. In this mode, the device will keep the same color, and adjust the rgb values to dim or brighten the color. +The component defaults to rgbw. If your device has a separate white channel, you do not need to specify anything else; changing the white value will adjust the brightness of white channel keeping rgb color constant. However, if your device does not have a separate white channel, you will need to set the mode to rgb. In this mode, the device will keep the same color, and adjust the rgb values to dim or brighten the color.
@@ -101,7 +101,7 @@ Will add a light without the white mode: mode: "rgb" ``` -Will add a light with white mode (default). Changing the brightness will set the bulb in white mode: +Will add a light with rgb+white mode (default). White and RGB channels can be adjusted independently using a slider and color picker respectively. ```yaml 192.168.1.10: diff --git a/source/_components/light.sensehat.markdown b/source/_components/light.sensehat.markdown new file mode 100644 index 00000000000..b2bb9dbce5a --- /dev/null +++ b/source/_components/light.sensehat.markdown @@ -0,0 +1,27 @@ +--- +layout: page +title: "Sense HAT Light" +description: "Instructions how to setup Sense HAT LED lights within Home Assistant." +date: 2017-04-29 16:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: sense-hat.png +ha_version: 0.44 +ha_category: Light +ha_iot_class: "Assumed State" +--- + +The `sensehat` light platform lets you control the [Sense HAT](https://www.raspberrypi.org/products/sense-hat/) board's 8x8 RGB LED matrix on your Raspberry Pi from within Home Assistant. + +To add `sensehat` to your installation, add the following to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +light: + platform: sensehat + name: SenseHAT +``` + +For setting up the Sense HAT sensors, please see the [Sense HAT sensor component](/components/sensor.sensehat/). diff --git a/source/_components/light.tradfri.markdown b/source/_components/light.tradfri.markdown index afce03a9b4c..82da9e0d331 100644 --- a/source/_components/light.tradfri.markdown +++ b/source/_components/light.tradfri.markdown @@ -14,4 +14,3 @@ ha_release: 0.43 --- For installation instructions, see [the Trådfri component](/components/tradfri/). - diff --git a/source/_components/light.zha.markdown b/source/_components/light.zha.markdown new file mode 100644 index 00000000000..fe71e06fc07 --- /dev/null +++ b/source/_components/light.zha.markdown @@ -0,0 +1,16 @@ +--- +layout: page +title: "ZigBee Home Automation Light" +description: "Instructions how to setup ZigBee Home Automation lights within Home Assistant." +date: 2017-02-22 00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: zigbee.png +ha_category: Light +--- + +To get your ZigBee lights working with Home Assistant, follow the +instructions for the general [ZigBee Home Automation +component](/components/zha/). diff --git a/source/_components/media_player.denon.markdown b/source/_components/media_player.denon.markdown index d579463a5ce..d0b9da9b0ec 100644 --- a/source/_components/media_player.denon.markdown +++ b/source/_components/media_player.denon.markdown @@ -24,11 +24,13 @@ Supported devices: - Denon AVR receivers with Integrated Network support (partial support) - Denon AVR-X4100W (via denonavr platform) - Denon AVR receivers (via denonavr platform (untested)) +- Marantz M-RC610 (via denonavr platform) +- Marantz receivers (experimental via denonavr platform) To add a Denon Network Receiver to your installation, add the following to your `configuration.yaml` file: -**Telnet interface** +**Telnet platform** ```yaml # Example configuration.yaml entry media_player: @@ -49,7 +51,7 @@ A few notes for platform: denon - Seeking cannot be implemented as the UI sends absolute positions. Only seeking via simulated button presses is possible. -**denonavr interface** +**denonavr platform** ```yaml # Example configuration.yaml entry media_player: @@ -63,6 +65,6 @@ Configuration variables: - **name** (*Optional*): Name of the device. 
If not set, friendlyName of receiver is used. A few notes for platform: denonavr -- Additional option the control Denon AVR receivers with a builtin web server is using the HTTP interface with denonavr platform -- denonavr platform supports some additional functionalities like album covers, custom input source names and auto discovery -- Still be careful with the volume. 100% in an action movie will tear down your walls. +- Additional option the control Denon AVR receivers with a builtin web server is using the HTTP interface with denonavr platform. +- denonavr platform supports some additional functionalities like album covers, custom input source names and auto discovery. +- Marantz receivers seem to a have quite simliar interface. Thus if you own one, give it a try. diff --git a/source/_components/media_player.panasonic_viera.markdown b/source/_components/media_player.panasonic_viera.markdown index 73c4499ee24..aa3c4edcc88 100644 --- a/source/_components/media_player.panasonic_viera.markdown +++ b/source/_components/media_player.panasonic_viera.markdown @@ -22,6 +22,7 @@ Currently known supported models: - TC-P65VT30 - TX-55CX700E - TX-49DX650B +- TX-50DX700B If your model is not on the list then give it a test, if everything works correctly then add it to the list on [GitHub](https://github.com/home-assistant/home-assistant.io). diff --git a/source/_components/notify.smtp.markdown b/source/_components/notify.smtp.markdown index aa65fb72033..d5f33126c8a 100644 --- a/source/_components/notify.smtp.markdown +++ b/source/_components/notify.smtp.markdown @@ -35,7 +35,7 @@ Configuration variables: - **sender** (*Optional*): E-mail address of the sender. - **username** (*Optional*): Username for the SMTP account. - **password** (*Optional*): Password for the SMTP server that belongs to the given username. If the password contains a colon it need to be wrapped in apostrophes. -- **recipient** (*Required*): Recipient of the notification. +- **recipient** (*Required*): E-mail address of the recipient of the notification. This can be a recpient address or a list of addresses for multiple recipients. - **starttls** (*Optional*): Enables STARTTLS, eg. True or False. Defaults to False. - **debug** (*Optional*): Enables Debug, eg. True or False. Defaults to False. @@ -53,7 +53,9 @@ notify: starttls: true username: john@gmail.com password: thePassword - recipient: james@gmail.com + recipient: + - james@gmail.com + - bob@gmail.com ``` Keep in mind that Google has some extra layers of protection which need special attention (Hint: 'Less secure apps'). diff --git a/source/_components/opencv.markdown b/source/_components/opencv.markdown new file mode 100644 index 00000000000..301d1b3ed39 --- /dev/null +++ b/source/_components/opencv.markdown @@ -0,0 +1,76 @@ +--- +layout: page +title: "OpenCV" +description: "Instructions how to setup OpenCV within Home Assistant." +date: 2017-04-01 22:36 +sidebar: true +comments: false +sharing: true +footer: true +logo: opencv.png +ha_category: Hub +ha_release: 0.44 +ha_iot_class: "Local Push" +--- + +[OpenCV](https://www.opencv.org) is an open source computer vision image and video processing library. 
+ +Some pre-defined classifiers can be found here: https://github.com/opencv/opencv/tree/master/data + +### {% linkable_title Configuration %} + +To setup OpenCV with Home Assistant, add the following section to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry + +opencv: + classifier_group: + - name: Family + add_camera: True + entity_id: + - camera.front_door + - camera.living_room + classifier: + - file_path: /path/to/classifier/face.xml + name: Bob + - file_path: /path/to/classifier/face_profile.xml + name: Jill + min_size: (20, 20) + color: (255, 0, 0) + scale: 1.6 + neighbors: 5 + - file_path: /path/to/classifier/kid_face.xml + name: Little Jimmy +``` + +Configuration variables: + +- **name** (*Required*): The name of the OpenCV image processor. +- **entity_id** (*Required*): The camera entity or list of camera entities that this classification group will be applied to. +- **classifier** (*Required*): The classification configuration for to be applied: + - **file_path** (*Required*): The path to the HAARS or LBP classification file (xml). + - **name** (*Optional*): The classification name, the default is `Face`. + - **min_size** (*Optional*): The minimum size for detection as a tuple `(width, height)`, the default is `(30, 30)`. + - **color** (*Optional*): The color, as a tuple `(Blue, Green, Red)` to draw the rectangle when linked to a dispatcher camera, the default is `(255, 255, 0)`. + - **scale** (*Optional*): The scale to perform when processing, this is a `float` value that must be greater than or equal to `1.0`, default is `1.1`. + - **neighbors** (*Optional*): The minimum number of neighbors required for a match, default is `4`. The higher this number, the more picky the matching will be; lower the number, the more false positives you may experience. + +Once OpenCV is configured, it will create an `image_processing` entity for each classification group/camera entity combination as well as a camera so you can see what Home Assistant sees. + +The attributes on the `image_processing` entity will be: + +```json +'matches': { + 'Bob': [ + (x, y, w, h) + ], + 'Jill': [ + (x, y, w, h) + ], + 'Little Jimmy': [ + (x, y, w, h) + ] +} +``` + diff --git a/source/_components/plant.markdown b/source/_components/plant.markdown new file mode 100644 index 00000000000..e78f946c382 --- /dev/null +++ b/source/_components/plant.markdown @@ -0,0 +1,54 @@ +--- +layout: page +title: "Plant Observer" +description: "Automation component to observe the status of your plants." 
+date: 2017-05-06 08:00 +sidebar: true +comments: false +sharing: true +footer: true +ha_category: Other +ha_release: 0.44 +--- + +```yaml +plant: + simulated_plant: + sensors: + moisture: sensor.mqtt_plant_moisture + battery: sensor.mqtt_plant_battery + temperature: sensor.mqtt_plant_temperature + conductivity: sensor.mqtt_plant_conductivity + brightness: sensor.mqtt_plant_brightness + min_moisture: 20 + max_moisture: 60 + min_battery: 17 + min_conductivity: 500 + min_temperature: 15 +``` + +## Using plain MQTT sensor to get the data + +```yaml +sensor: + - platform: mqtt + name: mqtt_plant_moisture + state_topic: test/simulated_plant + value_template: '{{ value_json.moisture }}' + - platform: mqtt + name: mqtt_plant_battery + state_topic: test/simulated_plant + value_template: '{{ value_json.battery }}' + - platform: mqtt + name: mqtt_plant_temperature + state_topic: test/simulated_plant + value_template: '{{ value_json.temperature }}' + - platform: mqtt + name: mqtt_plant_conductivity + state_topic: test/simulated_plant + value_template: '{{ value_json.conductivity }}' + - platform: mqtt + name: mqtt_plant_brightness + state_topic: test/simulated_plant + value_template: '{{ value_json.brightness }}' +``` diff --git a/source/_components/rss_feed_template.markdown b/source/_components/rss_feed_template.markdown new file mode 100644 index 00000000000..2ab720f43cd --- /dev/null +++ b/source/_components/rss_feed_template.markdown @@ -0,0 +1,37 @@ +--- +layout: page +title: "RSS feed template" +description: "Instructions how to setup an RSS feed for sensor information and other." +date: 2017-04-11 20:42 +sidebar: true +comments: false +sharing: true +footer: true +ha_category: Front end +ha_release: 0.43 +--- + +The `rss_feed_template` component can export any information from Home Assistant as static RSS feed. This can be used to display those information on several devices using RSS readers. While native apps for Home Assistant are not widely available, native RSS readers exists for almost any platform. + +E.g. on android, the app "Simple RSS Widget" can be used to display temperatures on the home screen. + + +```yaml +# Example configuration.yml entry +rss_feed_template: + garden: + requires_api_password: False + title: "Garden {% raw %}{{ as_timestamp(now())|timestamp_custom('%H:%m', True) }}{% endraw %}" + items: + - title: "Outside temperature" + description: "{% raw %}{% if is_state('sensor.temp_outside','unknown') %}---{% else %}{{states.sensor.temp_outside.state}} °C{% endif %}{% endraw %}" +``` + +Configuration variables: + +- **requires_api_password:** (*Optional*): If true and an api password is set, the password must be passed via '?api_password=...' parameter (Default: True) +- **feed_id** (*Required*): The key is used as id of the feed. The feed can be accessed at /api/rss_template/feed_id (example: 'garden') +- **title** (*Optional*): The title of the feed, which is parsed as [template](/topics/templating/). +- **items** (*Required*): A list of feed items +- **items/title** (*Optional*): The title of the item, which is parsed as [template](/topics/templating/). +- **items/description** (*Optional*): The description of the item, which is parsed as [template](/topics/templating/). 
diff --git a/source/_components/sensor.cert_expiry.markdown b/source/_components/sensor.cert_expiry.markdown new file mode 100644 index 00000000000..0321320e1d8 --- /dev/null +++ b/source/_components/sensor.cert_expiry.markdown @@ -0,0 +1,34 @@ +--- +layout: page +title: "Certificate Expiry" +description: "Instructions on how to set up HTTPS (SSL) certificate expiry sensors within Home Assistant." +date: 2017-04-24 14:14 +sidebar: true +comments: false +sharing: true +footer: true +logo: home-assistant.png +ha_category: System Monitor +ha_release: 0.44 +--- + +The `cert_expiry` sensor fetches information from a configured URL and displays the certificate expiry in days. + +To add the Certificate Expiry sensor to your installation, add these options to `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +sensor: + - platform: cert_expiry + host: home-assistant.io +``` + +Configuration variables: + +- **host** (*Required*): The host FQDN (or IP) to retrieve certificate from. +- **port** (*Optional*): The port number where the server is running. Defaults to `443`. + ++Make sure that the URL exactly matches your endpoint or resource. +
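Because the sensor's state is the number of days until the certificate expires, it pairs naturally with a `numeric_state` automation. The example below is a sketch: the entity id and the notify service name are assumptions and depend on your own setup.

```yaml
# Sketch only - check the actual entity id created for your host
automation:
  - alias: Warn about expiring certificate
    trigger:
      - platform: numeric_state
        entity_id: sensor.ssl_certificate_expiry
        below: 14
    action:
      - service: notify.notify
        data:
          message: "The SSL certificate for home-assistant.io expires in less than 14 days."
```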
+ diff --git a/source/_components/sensor.eight_sleep.markdown b/source/_components/sensor.eight_sleep.markdown new file mode 100644 index 00000000000..42ed89e0586 --- /dev/null +++ b/source/_components/sensor.eight_sleep.markdown @@ -0,0 +1,18 @@ +--- +layout: page +title: "Eight Sleep Sensor" +description: "Instructions how to integrate sensors for Eight Sleep within Home Assistant." +date: 2017-04-24 00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: eight_sleep.png +ha_category: Sensor +ha_release: "0.44" +--- + + +The `eight_sleep` sensor platform lets you observe states of [Eight Sleep](https://eightsleep.com/) sensors through Home Assistant. This includes bed state and results of the current and previous sleep sessions. + +Devices will be configured automatically. Please refer to the [component](/components/eight_sleep/) configuration on how to setup. diff --git a/source/_components/sensor.envirophat.markdown b/source/_components/sensor.envirophat.markdown new file mode 100644 index 00000000000..cf899363f7f --- /dev/null +++ b/source/_components/sensor.envirophat.markdown @@ -0,0 +1,118 @@ +--- +layout: page +title: "Enviro pHAT" +description: "Instructions how to integrate the Enviro pHAT within Home Assistant." +date: 2017-05-03 17:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: raspberry-pi.png +ha_category: Sensor +ha_iot_class: "Local Polling" +ha_release: 0.44 +--- + +The `envirophat` sensor platform allows you to display information collected by an [Enviro pHAT](https://shop.pimoroni.com/products/enviro-phat) add-on board for the Raspberry Pi. The board featues a wide range of sensors, such as: + +- BMP280 temperature/pressure sensor +- TCS3472 light and RGB colour sensor with two LEDs for illumination +- LSM303D accelerometer/magnetometer sensor +- ADS1015 4-channel 3.3v, analog to digital sensor (ADC) + +To add this platform to your installation, add the following to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry, +# which is equivalent to the default setup +sensor: + - platform: envirophat + use_led: false + display_options: + - temperature + - pressure + - light + - light_red + - light_green + - light_blue + - accelerometer_x + - accelerometer_y + - accelerometer_z + - magnetometer_x + - magnetometer_y + - magnetometer_z + - voltage_0 + - voltage_1 + - voltage_2 + - voltage_3 +``` + +Configuration variables: + +- **display_options** (*Optional*) array: List of readings to monitor. Default is monitoring all of them: + - **temperature**: ambient temperature in Celsius. Since the sensor is close to the Raspberry Pi, that migth affect the accuracy of the reading (ie. the Pi might heat up the sensor) + - **pressure**: atmospheric pressure in hPa. 
+ - **light**: ambient light, as an integer in the 0-65535 range + - **light_red**: red color reading scaled to the ambient light, as an integer in the 0-255 range + - **light_green**: green color reading scaled to the ambient light, as an integer in the 0-255 range + - **light_blue**: blue color reading scaled to the ambient light, as an integer in the 0-255 range + - **accelerometer_x**: accelerometer reading in units of G, along the X axis + - **accelerometer_y**: accelerometer reading in units of G, along the Y axis + - **accelerometer_z**: accelerometer reading in units of G, along the Z axis + - **magnetometer_x**: magnetometer reading, the X component of the raw vector + - **magnetometer_y**: magnetometer reading, the Y component of the raw vector + - **magnetometer_z**: magnetometer reading, the X component of the raw vector + - **voltage_0**: voltage reading on Analog In 0 in units of V + - **voltage_1**: voltage reading on Analog In 1 in units of V + - **voltage_2**: voltage reading on Analog In 2 in units of V + - **voltage_3**: voltage reading on Analog In 3 in units of V +- **use_led** (*Optional*) True / False boolean; Default value is False, declaring that the onboard LEDs are *not* used for the color measurements thus these readings are based on the ambient light. If the value is set to True, the onboard LEDs will blink whenever a reading is taken. + +### Notes + +* **X, Y, Z axes** + * X is parallel with the long edge of the board + * Y is parallel with the short edge of the board + * Z is perpendicular to the board +* **Voltages** + * voltage readings are done in the 0-3.3V range, please do not connect higher voltages than that! See the [Enviro pHAT's getting started guide](https://learn.pimoroni.com/tutorial/sandyj/getting-started-with-enviro-phat) regarding how to make a voltage divider + +### Give the values friendly names & icons + +Add something like the following to your [customize section](/docs/configuration/customizing-devices/): + +```yaml +# Example configuration.yaml entry + customize: + sensor.accelerometer_z: + icon: mdi:airplane-landing + friendly_name: "Acc Z" + sensor.magnetometer_x: + icon: mdi:arrow-up-bold-hexagon-outline + friendly_name: "Magnetic X" + sensor.pressure: + icon: mdi:weight + friendly_name: "Pressure" +``` + +### Create groups + +```yaml +# Example configuration.yaml entry +group: + enviro_phat_voltages: + name: Enviro pHAT Volages` + entities: + - sensor.voltage_0 + - sensor.voltage_1 + - sensor.voltage_2 + - sensor.voltage_3 +``` + +### Enabling the required `i2c-1` device + +Since the Enviro pHAT communicates over I2C, you might also need to make sure that the I2C devices are enabled, by adding or uncommenting the following line in `/boot/config.txt` (see the [DT Parameters section](https://www.raspberrypi.org/documentation/configuration/device-tree.md) in the Raspberry Pi documentation): + +``` +dtparam=i2c_arm=on +``` diff --git a/source/_components/sensor.ios.markdown b/source/_components/sensor.ios.markdown new file mode 100644 index 00000000000..e69de29bb2d diff --git a/source/_components/sensor.pushbullet.markdown b/source/_components/sensor.pushbullet.markdown new file mode 100644 index 00000000000..4b60f5cf5fb --- /dev/null +++ b/source/_components/sensor.pushbullet.markdown @@ -0,0 +1,48 @@ +--- +layout: page +title: "Pushbullet Mirrors" +description: "Instructions how to read user pushes in Home Assitant" +date: 2017-04-20 16:44 +sidebar: true +comments: false +sharing: true +footer: true +logo: pushbullet.png +ha_category: 
Sensor +ha_release: 0.44 +--- + +The `pushbullet` sensor platform reads messages from [Pushbullet](https://www.pushbullet.com/), a free service to send information between your phones, browsers, and friends. +This sensor platform provide sensors that show the properties of the latest recevied pushbullet notification mirror. + +Notification Mirroring allows users to see their Android device's notifications on their computer. It must be first enabled in the app and is currently only available on the Android platform. For more information, please see [this announcement](https://blog.pushbullet.com/2013/11/12/real-time-notification-mirroring-from-android-to-your-computer/) on the Pushbullet Blog + +To enable the Pushbullet sensor in your installation, add the following to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +sensor: + - platform: pushbullet + api_key: YOUR_API_KEY + monitored_conditions: + - body +``` + +Configuration variables: + +- **api_key** (*Required*): Enter the API key for Pushbullet. Go to [https://www.pushbullet.com/#settings/account](https://www.pushbullet.com/#settings/account) to retrieve your API key/access token. + +- **monitored_conditions** array (*Optional*): Properties of the push to monitor. Defaults to `body` and `title`. + - **application_name**: The application sending the push. + - **body**: Body of the message. + - **notification_id**: ID of the notification. + - **notification_tag**: Tag (if the application sending supports it). + - **package_name**: Name of the sender's package. + - **receiver_email**: The email of the push's target. + - **sender_email**: The sender of the push. + - **source_device_iden**: ID of the sender's device. + - **title**: Title of the push. + - **type**: Type of push. + + +All properties will be displayed as attributes. The properties array are just for logging the sensor readings for multiple properties. diff --git a/source/_components/sensor.sensehat.markdown b/source/_components/sensor.sensehat.markdown index 733dcf617d3..03a4a95a6d0 100644 --- a/source/_components/sensor.sensehat.markdown +++ b/source/_components/sensor.sensehat.markdown @@ -150,3 +150,5 @@ This fix has been tested with a clean install of: and * [Home-Assistant 0.37.1](https://home-assistant.io/getting-started/installation-raspberry-pi-all-in-one/) + +For setting up the Sense HAT's RGB LED matrix as lights within Home Assistant, please see the [Sense HAT light component](/components/light.sensehat/). diff --git a/source/_components/sensor.zha.markdown b/source/_components/sensor.zha.markdown new file mode 100644 index 00000000000..75f0655db4e --- /dev/null +++ b/source/_components/sensor.zha.markdown @@ -0,0 +1,16 @@ +--- +layout: page +title: "ZigBee Home Automation Sensor" +description: "Instructions how to setup ZigBee Home Automation sensors within Home Assistant." +date: 2017-02-22 00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: zigbee.png +ha_category: Sensor +--- + +To get your ZigBee sensors working with Home Assistant, follow the +instructions for the general [ZigBee Home Automation +component](/components/zha/). diff --git a/source/_components/switch.zha.markdown b/source/_components/switch.zha.markdown new file mode 100644 index 00000000000..e730624d63c --- /dev/null +++ b/source/_components/switch.zha.markdown @@ -0,0 +1,16 @@ +--- +layout: page +title: "ZigBee Home Automation Switch" +description: "Instructions how to setup ZigBee Home Automation switches within Home Assistant." 
+date: 2017-02-22 00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: zigbee.png +ha_category: Switch +--- + +To get your ZigBee switches working with Home Assistant, follow the +instructions for the general [ZigBee Home Automation +component](/components/zha/). diff --git a/source/_components/tradfri.markdown b/source/_components/tradfri.markdown index 320fbe02f45..60319a6d8aa 100644 --- a/source/_components/tradfri.markdown +++ b/source/_components/tradfri.markdown @@ -38,7 +38,6 @@ macOS: $ sudo brew install libtool $ sudo brew install autoconf $ sudo brew install automake - $ git clone --depth 1 --recursive -b dtls https://github.com/home-assistant/libcoap.git $ cd libcoap $ ./autogen.sh @@ -51,7 +50,7 @@ You will be prompted to configure the gateway through the Home Assistant interfaIf you see an "Unable to connect" message, restart the gateway and try again.
- + The gateway can also be manually configured by adding the following lines to your `configuration.yaml` file: ```yaml diff --git a/source/_components/zha.markdown b/source/_components/zha.markdown new file mode 100644 index 00000000000..46fd0c96a26 --- /dev/null +++ b/source/_components/zha.markdown @@ -0,0 +1,47 @@ +--- +layout: page +title: "Zigbee Home Automation" +description: "Instructions how to integrate your Zigbee Home Automation within Home Assistant." +date: 2017-02-22 19:59 +sidebar: true +comments: false +sharing: true +footer: true +logo: zigbee.png +ha_category: Hub +ha_release: 0.39 +--- + +[ZigBee Home Automation](http://www.zigbee.org/zigbee-for-developers/applicationstandards/zigbeehomeautomation/) +integration for Home Assistant allows you to connect many off-the-shelf ZigBee +devices to Home Assistant, using a compatible ZigBee radio. + +There is currently support for the following device types within Home Assistant: + +- [Binary Sensor](../binary_sensor.zha) (e.g. motion and door sensors) +- [Sensor](../sensor.zha) (e.g. temperature sensors) +- [Light](../light.zha) +- [Switch](../switch.zha) + +Known working ZigBee radios: + +- Nortek/GoControl Z-Wave & Zigbee USB Adaptor - Model HUSBZB-1 + +To configure the component, a `zha` section must be present in the `configuration.yaml`, +and the path to the serial device for the radio and path to the database which will persist your network data is required. + +```yaml +# Example configuration.yaml entry +zha: + usb_path: /dev/ttyUSB2 + database_path: zigbee.db +``` + +Configuration variables: + + - **usb_path** (*Required*): Path to the serial device for the radio. + - **database_path** (*Required*): Path to the database which will keep persisten newtork data. + + + +To add new devices to the network, call the `permit` service on the `zha` domain, and then follow the device instructions. diff --git a/source/_cookbook/configuration_yaml_by_fredsmith.markdown b/source/_cookbook/configuration_yaml_by_fredsmith.markdown new file mode 100644 index 00000000000..8cd2bdf2080 --- /dev/null +++ b/source/_cookbook/configuration_yaml_by_fredsmith.markdown @@ -0,0 +1,13 @@ +--- +layout: page +title: "Configuration.yaml by fredsmith" +description: "" +date: 2017-04-28 18:30 +sidebar: true +comments: false +sharing: true +footer: true +ha_category: Example configuration.yaml +ha_external_link: https://git.smith.bz/derf/homeautomation +--- + diff --git a/source/_posts/2017-05-06-zigbee-opencv-dlib.markdown b/source/_posts/2017-05-06-zigbee-opencv-dlib.markdown new file mode 100644 index 00000000000..f6c0334ea22 --- /dev/null +++ b/source/_posts/2017-05-06-zigbee-opencv-dlib.markdown @@ -0,0 +1,487 @@ +--- +layout: post +title: "Home Assistant 0.44: ZigBee, OpenCV and DLib" +description: "Speak natively with Zigbee network, detect faces with OpenCV: 0.44 is here." +date: 2017-05-06 01:04:05 +0000 +date_formatted: "May 6, 2017" +author: Paulus Schoutsen +author_twitter: balloob +comments: true +categories: Release-Notes +og_image: /images/blog/2017-05-0.44/components.png +--- + +