diff --git a/_config.yml b/_config.yml index 9c226bb45af..10e1ef759d7 100644 --- a/_config.yml +++ b/_config.yml @@ -142,12 +142,12 @@ social: # Home Assistant release details current_major_version: 0 -current_minor_version: 48 -current_patch_version: 1 -date_released: 2017-07-05 +current_minor_version: 49 +current_patch_version: 0 +date_released: 2017-07-16 # Either # or the anchor link to latest release notes in the blog post. # Must be prefixed with a # and have double quotes around it. # Major release: -patch_version_notes: "#release-0481---july-5" +patch_version_notes: "#" # Minor release (Example #release-0431---april-25): diff --git a/source/_components/amcrest.markdown b/source/_components/amcrest.markdown new file mode 100644 index 00000000000..194592081bd --- /dev/null +++ b/source/_components/amcrest.markdown @@ -0,0 +1,63 @@ +--- +layout: page +title: "Amcrest IP Camera" +description: "Instructions how to integrate Amcrest IP cameras within Home Assistant." +date: 2017-06-24 10:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: amcrest.png +ha_category: Hub +ha_iot_class: "Local Polling" +ha_release: 0.49 +--- + +The `amcrest` platform allows you to integrate your [Amcrest](https://amcrest.com/) IP camera in Home Assistant. + +To enable your camera in your installation, add the following to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +amcrest: + - host: IP_ADDRESS + username: USERNAME + password: PASSWORD + sensors: + - motion_detector + - sdcard + + - host: IP_ADDRESS + username: USERNAME + password: PASSWORD + resolution: low + stream_source: snapshot + sensors: + - ptz_preset +``` + +Configuration variables: + +- **host** (*Required*): The IP address or hostname of your camera. If using hostname, make sure the DNS works as expected. +- **username** (*Required*): The username for accessing your camera. +- **password** (*Required*): The password for accessing your camera. +- **name** (*Optional*): This parameter allows you to override the name of your camera. The default is "Amcrest Camera". +- **port** (*Optional*): The port that the camera is running on. The default is 80. +- **resolution** (*Optional*): This parameter allows you to specify the camera resolution. For a high resolution (1080/720p), specify the option `high`. For VGA resolution (640x480p), specify the option `low`. If omitted, it defaults to *high*. +- **stream_source** (*Optional*): The data source for the live stream. `mjpeg` will use the camera's native MJPEG stream, whereas `snapshot` will use the camera's snapshot API to create a stream from still images. You can also set the `rtsp` option to generate the streaming via RTSP protocol. If omitted, it defaults to *snapshot*. +- **ffmpeg_arguments**: (*Optional*): Extra options to pass to ffmpeg, e.g. image quality or video filter options. +- **authentication**: (*Optional*): Defines which authentication method to use only when **stream_source** is **mjpeg**. Currently *aiohttp* only support *basic*. It defaults to *basic*. +- **scan_interval** (*Optional*): Defines the update interval of the sensor in seconds. The default is 10 seconds. +- **sensors** array (*Optional*): Conditions to display in the frontend. By default, *none* of the conditions are enabled. The following conditions can be monitored. 
+ - **motion_detector**: Return True/False when a motion is detected + - **sdcard**: Return the SD card usage by reporting the total and used space + - **ptz_preset**: Return the number of PTZ preset positions configured for the given camera + +**Note:** Amcrest cameras with newer firmwares no longer have the ability to stream `high` definition video with MJPEG encoding. You may need to use `low` resolution stream or the `snapshot` stream source instead. If the quality seems too poor, lower the `Frame Rate (FPS)` and max out the `Bit Rate` settings in your camera's configuration manager. If you defined the *stream_source* to **mjpeg**, make sure your camera supports *Basic* HTTP authentication. Newer Amcrest firwmares may not work, then **rtsp** is recommended instead. + +**Note:** If you set the `stream_source` option to `rtsp`, make sure to follow the steps mentioned at +[FFMPEG](https://home-assistant.io/components/ffmpeg/) documentation to install the `ffmpeg`. + +Finish its configuration by visiting the [Amcrest sensor page](/components/sensor.amcrest/) or [Amcrest camera page](/components/camera.amcrest/). + +To check if your Amcrest camera is supported/tested, visit the [supportability matrix](https://github.com/tchellomello/python-amcrest#supportability-matrix) link from the `python-amcrest` project. diff --git a/source/_components/axis.markdown b/source/_components/axis.markdown index bf4736548d3..192c5774803 100644 --- a/source/_components/axis.markdown +++ b/source/_components/axis.markdown @@ -50,21 +50,23 @@ axis: Configuration variables: -- **device** (*Required*): Unique name for the Axis device. - - **host** (*Required*): The IP address to your Axis device. - - **username** (*Optional*): The username to your Axis device. Defaults to `root`. - - **password** (*Optional*): The password to your Axis device. Defaults to `pass`. - - **trigger_time** (*Optional*): Minimum time (in seconds) a sensor should keep its positive value. Defaults to 0. - - **location** (*Optional*): Physical location of your Axis device. Default not set. - - **include** (*Required*): This cannot be empty else there would be no use adding the device at all. - - **camera**: Stream MJPEG video to Home Assistant. - - **motion**: The built-in motion detection in Axis cameras. - - **vmd3**: ACAP Motion Detection app which has better algorithms for motion detection. - - **pir**: PIR sensor that can trigger on motion. - - **sound**: Sound detector. - - **daynight**: Certain cameras have day/night mode if they have built-in IR lights. - - **tampering**: Signals when camera believes that it has been tampered with. - - **input**: Trigger on whatever you have connected to device input port. +## {% linkable_title Configuration variables %} + +- **device** (*Required*): Unique name +- **host** (*Required*): The IP address to your Axis device. +- **username** (*Optional*): The username to your Axis device. Default 'root'. +- **password** (*Optional*): The password to your Axis device. Default 'pass'. +- **trigger_time** (*Optional*): Minimum time (in seconds) a sensor should keep its positive value. Default 0. +- **location** (*Optional*): Physical location of your Axis device. Default not set. +- **include** (*Required*): This cannot be empty else there would be no use adding the device at all. + - **camera**: Stream MJPEG video to Home Assistant. + - **motion**: The built-in motion detection in Axis cameras. + - **vmd3**: ACAP Motion Detection app which has better algorithms for motion detection. 
+ - **pir**: PIR sensor that can trigger on motion. + - **sound**: Sound detector. + - **daynight**: Certain cameras have day/night mode if they have built-in IR lights. + - **tampering**: Signals when camera believes that it has been tampered with. + - **input**: Trigger on whatever you have connected to device input port. A full configuration example could look like this: @@ -85,6 +87,10 @@ axis: location: köket ``` +
+If you are using Python3.6 you might need to replace the 34m with 36m in the _gi.*.so filename in the gi folder. +
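A minimal sketch of that workaround is to symlink the Python 3.6 name to the existing library. The virtualenv path and the exact `.so` filenames below are assumptions, so check what is actually present in your `gi` folder first:

```bash
# Illustrative only: adjust the site-packages path and the filenames to your installation
$ cd /srv/homeassistant/lib/python3.6/site-packages/gi
$ ln -s _gi.cpython-34m-x86_64-linux-gnu.so _gi.cpython-36m-x86_64-linux-gnu.so
```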
+Any specific levels for triggers needs to be configured on the device.
@@ -92,3 +98,18 @@ Any specific levels for triggers needs to be configured on the device.It is recommended that you create a user on your Axis device specifically for Home Assistant. For all current functionality it is enough to create a user belonging to user group viewer.
+ +## {% linkable_title Device services %} +Available services: `vapix_call`. + +#### {% linkable_title Service `axis/vapix_call` %} +Send a command using [Vapix](https://www.axis.com/support/developer-support/vapix). For details please read the API specifications. + +| Service data attribute | Optional | Description | +|---------------------------|----------|--------------------------------------------------| +| `name` | no | Name of device to communicate with. | +| `param` | no | What parameter to operate on. | +| `cgi` | yes | Which cgi to call on device. Default is `param.cgi`. | +| `action` | yes | What type of call. Default is `update`. | + +Response to call can be subscribed to on event `vapix_call_response` diff --git a/source/_components/binary_sensor.pilight.markdown b/source/_components/binary_sensor.pilight.markdown index 813ed0d22f8..a372098918f 100644 --- a/source/_components/binary_sensor.pilight.markdown +++ b/source/_components/binary_sensor.pilight.markdown @@ -20,18 +20,30 @@ Two type of pilight binary sensor configuration available. A normal sensor which # Example configuration.yml entry binary_sensor: - platform: pilight - name: 'Motion' variable: 'state' - payload: - unitcode: 371399 - payload_on: 'closed' - disarm_after_trigger: True <-- use this if you want trigger type behavior ``` Configuration variables: + - **variable** (*Required*): The variable name in the data stream that defines the sensor value. - **payload** (*Required*): Message payload identifiers. Only if all identifiers are matched the sensor value is set. - **name** (*Optional*): Name of the sensor. - **payload_on** (*Optional*): Variable `on` value. The component will recognize this as logical '1'. - **payload_off** (*Optional*): Variable `off` value. The component will recognize this as logical '0'. - **disarm_after_trigger:** (*Optional*): Configure sensor as trigger type. +- **reset_delay_sec** (*Optional*): Seconds before the sensor is disarmed if `disarm_after_trigger` is set to true. Default is 30 seconds. + +A full configuration example could look like this: + +```yaml +# Example configuration.yml entry +binary_sensor: + - platform: pilight + name: 'Motion' + variable: 'state' + payload: + unitcode: 371399 + payload_on: 'closed' + disarm_after_trigger: True + reset_delay_sec: 30 +``` diff --git a/source/_components/camera.amcrest.markdown b/source/_components/camera.amcrest.markdown index ee9d7761118..2c1a042c877 100644 --- a/source/_components/camera.amcrest.markdown +++ b/source/_components/camera.amcrest.markdown @@ -13,33 +13,14 @@ ha_iot_class: "Local Polling" ha_release: 0.34 --- -The `amcrest` platform allows you to integrate your [Amcrest](https://amcrest.com/) IP camera in Home Assistant. +To get your [Amcrest](https://amcrest.com/) cameras working within Home Assistant, please follow the instructions for the general [Amcrest component](/components/amcrest). -To enable your camera in your installation, add the following to your `configuration.yaml` file: +Once you have enabled the [Amcrest component](/components/amcrest), add the following to your `configuration.yaml` file: ```yaml # Example configuration.yaml entry camera: - platform: amcrest - host: IP_ADDRESS - username: USERNAME - password: PASSWORD ``` -Configuration variables: - -- **host** (*Required*): The IP address or hostname of your camera. If using hostname, make sure the DNS works as expected. -- **username** (*Required*): The username for accessing your camera. 
-- **password** (*Required*): The password for accessing your camera. -- **name** (*Optional*): This parameter allows you to override the name of your camera. The default is "Amcrest Camera". -- **port** (*Optional*): The port that the camera is running on. The default is 80. -- **resolution** (*Optional*): This parameter allows you to specify the camera resolution. For a high resolution (1080/720p), specify the option `high`. For VGA resolution (640x480p), specify the option `low`. If omitted, it defaults to *high*. -- **stream_source** (*Optional*): The data source for the live stream. `mjpeg` will use the camera's native MJPEG stream, whereas `snapshot` will use the camera's snapshot API to create a stream from still images. You can also set the `rtsp` option to generate the streaming via RTSP protocol. If omitted, it defaults to *mjpeg*. -- **ffmpeg_arguments**: (*Optional*): Extra options to pass to ffmpeg, e.g. image quality or video filter options. - -**Note:** Amcrest cameras with newer firmwares no longer have the ability to stream `high` definition video with MJPEG encoding. You may need to use `low` resolution stream or the `snapshot` stream source instead. If the quality seems too poor, lower the `Frame Rate (FPS)` and max out the `Bit Rate` settings in your camera's configuration manager. - -**Note:** If you set the `stream_source` option to `rtsp`, make sure to follow the steps mentioned at -[FFMPEG](https://home-assistant.io/components/ffmpeg/) documentation to install the `ffmpeg`. - To check if your Amcrest camera is supported/tested, visit the [supportability matrix](https://github.com/tchellomello/python-amcrest#supportability-matrix) link from the `python-amcrest` project. diff --git a/source/_components/cover.template.markdown b/source/_components/cover.template.markdown index d8d51f36953..46873b6e94c 100644 --- a/source/_components/cover.template.markdown +++ b/source/_components/cover.template.markdown @@ -36,9 +36,9 @@ cover: Configuration variables: - **covers** array (*Required*): List of your coverss. - - **open_cover** (*Required*): Defines an [action](/getting-started/automation/) to run when the cover is opened. - - **close_cover** (*Required*): Defines an [action](/getting-started/automation/) to run when the cover is closed. - - **stop_cover** (*Required*): Defines an [action](/getting-started/automation/) to run when the cover is stopped. + - **open_cover** (*Optional*): Defines an [action](/getting-started/automation/) to run when the cover is opened. If `open_cover` is specified, `close_cover` must also be specified. At least one of `open_cover` and `set_cover_position` must be specified. + - **close_cover** (*Optional*): Defines an [action](/getting-started/automation/) to run when the cover is closed. + - **stop_cover** (*Optional*): Defines an [action](/getting-started/automation/) to run when the cover is stopped. - **set_cover_position** (*Optional*): Defines an [action](/getting-started/automation/) to run when the cover is set to a specific value (between 0 and 100). - **set_cover_tilt_position** (*Optional*): Defines an [action](/getting-started/automation/) to run when the cover tilt is set to a specific value (between 0 and 100). - **friendly_name** (*Optional*): Name to use in the frontend. 
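For illustration, the relaxed requirements above make a position-only template cover possible. A minimal sketch, assuming the requested position is passed to the action as `position` and using placeholder script and sensor names:

```yaml
# Sketch of a cover that only implements set_cover_position
cover:
  - platform: template
    covers:
      garage_door:
        friendly_name: "Garage Door"
        value_template: "{{ is_state('sensor.garage_door_contact', 'open') }}"
        set_cover_position:
          service: script.set_garage_door_position
          data_template:
            position: "{{ position }}"
```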
diff --git a/source/_components/device_tracker.upc_connect.markdown b/source/_components/device_tracker.upc_connect.markdown index bb5261269a5..d58a1878e93 100644 --- a/source/_components/device_tracker.upc_connect.markdown +++ b/source/_components/device_tracker.upc_connect.markdown @@ -21,12 +21,10 @@ To use a Connect Box in your installation, add the following to your `configurat # Example configuration.yaml entry device_tracker: - platform: upc_connect - password: YOUR_PASSWORD ``` Configuration variables: -- **password** (*Required*): The password for your Connect Box. - **host** (*Optional*): The IP address of your router. Set it if you are not using `192.168.0.1`. See the [device tracker component page](/components/device_tracker/) for instructions how to configure the people to be tracked. diff --git a/source/_components/frontend.markdown b/source/_components/frontend.markdown index 47027c322bf..785e59c7f8d 100644 --- a/source/_components/frontend.markdown +++ b/source/_components/frontend.markdown @@ -17,3 +17,22 @@ This offers the official frontend to control Home Assistant. # Example configuration.yaml entry frontend: ``` + +#### Themes +Starting with version 0.49 you can define themes: + +Example: +```yaml +frontend: + themes: + happy: + primary-color: pink + sad: + primary-color: blue +``` + +The example above defined two themes named `happy` and `sad`. For each theme you can set values for CSS variables. For a partial list of variables used by the main frontend see [ha-style.html](https://github.com/home-assistant/home-assistant-polymer/blob/master/src/resources/ha-style.html) + +There are 2 themes-related services: + - `frontend.reload_themes` - reloads theme configuration from yaml. + - `frontend.set_theme(name)` - sets backend-preferred theme name. diff --git a/source/_components/ha.markdown b/source/_components/ha.markdown index ba629a567f3..ec4168a82ce 100644 --- a/source/_components/ha.markdown +++ b/source/_components/ha.markdown @@ -1,6 +1,6 @@ --- layout: page -title: "Home Assistant 0.48" +title: "Home Assistant 0.49" description: "" date: 2016-12-16 17:00 sidebar: true @@ -9,7 +9,7 @@ sharing: true footer: true logo: home-assistant.png ha_category: Other -ha_release: 0.48 +ha_release: 0.49 --- Details about the latest release can always be found at: diff --git a/source/_components/lametric.markdown b/source/_components/lametric.markdown new file mode 100644 index 00000000000..36d6b6d4868 --- /dev/null +++ b/source/_components/lametric.markdown @@ -0,0 +1,19 @@ +--- +layout: page +title: "LaMetric" +description: "Instructions on how to integrate LaMetric with Home Assistant." +date: 2017-04-02 13:28 +sidebar: true +comments: false +sharing: true +footer: true +ha_category: Hub +ha_release: 0.49 +--- + +```yaml +# configuration.yaml example +lametric: + client_id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx + client_secret: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx +``` diff --git a/source/_components/light.lifx.markdown b/source/_components/light.lifx.markdown index f44445a186d..a8d5d3bb1ba 100644 --- a/source/_components/light.lifx.markdown +++ b/source/_components/light.lifx.markdown @@ -21,11 +21,11 @@ _Please note, the `lifx` platform does not support Windows. The `lifx_legacy` pl # Example configuration.yaml entry light: - platform: lifx - server: 192.168.1.10 ``` Configuration variables: -- **server** (*Optional*): Your server address. Only needed if using more than one network interface. Omit if you are unsure. 
+- **broadcast** (*Optional*): The broadcast address for discovering lights. Only needed if using more than one network interface. Omit if you are unsure. +- **server** (*Optional*): Your server address. Will listen on all interfaces if omitted. Omit if you are unsure. ## {% linkable_title Set state %} diff --git a/source/_components/light.rflink.markdown b/source/_components/light.rflink.markdown index 705d7073c66..4b24c6c1f84 100644 --- a/source/_components/light.rflink.markdown +++ b/source/_components/light.rflink.markdown @@ -49,18 +49,20 @@ Device configuration variables: - **name** (*Optional*): Name for the device, defaults to Rflink ID. - **type** (*Optional*): Override automatically detected type of the light device, can be: switchable, dimmable, hybrid or toggle. See 'Light Types' below. (default: Switchable) -- **aliasses** (*Optional*): Alternative Rflink ID's this device is known by. +- **aliases** (*Optional*): Alternative Rflink ID's this device is known by. - **fire_event** (*Optional*): Fire an `button_pressed` event if this device is turned on or off (default: False). - **signal_repetitions** (*Optional*): Repeat every Rflink command this number of times (default: 1). - **fire_event_** (*Optional*): Set default `fire_event` for RFLink switch devices (see below). - **signal_repetitions** (*Optional*): Set default `signal_repetitions` for RFLink switch devices (see below). - +- **group** (*Optional*): Allow light to respond to group commands (ALLON/ALLOFF). (default: yes) +- **group_aliases** (*Optional*): `aliases` which only respond to group commands. +- **no_group_aliases** (*Optional*): `aliases` which do not respond to group commands. ### {% linkable_title Light state %} Initially the state of a light is unknown. When the light is turned on or off (via frontend or remote) the state is known and will be shown in the frontend. -Sometimes a light is controlled by multiple remotes, each remote has its own code programmed in the light. To allow tracking of the state when switched via other remotes add the corresponding remote codes as aliasses: +Sometimes a light is controlled by multiple remotes, each remote has its own code programmed in the light. To allow tracking of the state when switched via other remotes add the corresponding remote codes as aliases: ```yaml # Example configuration.yaml entry @@ -69,7 +71,7 @@ light: devices: newkaku_0000c6c2_1: name: Living room - aliasses: + aliases: - newkaku_000000001_2 - kaku_000001_a Ansluta_ce30_0: diff --git a/source/_components/media_extractor.markdown b/source/_components/media_extractor.markdown new file mode 100644 index 00000000000..812a00f2e17 --- /dev/null +++ b/source/_components/media_extractor.markdown @@ -0,0 +1,37 @@ +--- +layout: page +title: "Media Extractor" +description: "Instructions how to integrate the Media Extractor into Home Assistant." +date: 2017-07-12 07:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: home-assistant.png +ha_category: Media Player +ha_release: 0.49 +--- + + +The `media_extractor` component gets a stream URL and sends it to a media player entity. + +To use the media extractor service in your installation, add the following to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +media_extractor: +``` + +### {% linkable_title Use the service %} + +Go to the "Developer Tools", then to "Call Service", and choose `media_extractor/play_media` from the list of available services. 
Fill the "Service Data" field as shown in the example below and hit "CALL SERVICE". + +This will download the file from the given URL. + +| Service data attribute | Optional | Description | +| ---------------------- | -------- | ----------- | +| `entity_id` | yes | Name(s) of entities to seek media on, eg. `media_player.living_room_chromecast`. Defaults to all. +| `media_content_id` | no | The ID of the content to play. Platform dependent. +| `media_content_type` | no | The type of the content to play. Must be one of MUSIC, TVSHOW, VIDEO, EPISODE, CHANNEL or PLAYLIST MUSIC. + + diff --git a/source/_components/media_player.onkyo.markdown b/source/_components/media_player.onkyo.markdown index b8ba67fa4d7..f0fb1580963 100644 --- a/source/_components/media_player.onkyo.markdown +++ b/source/_components/media_player.onkyo.markdown @@ -14,7 +14,7 @@ ha_iot_class: "Local Polling" --- -The `onkyo` platform allows you to control a [Onkyo receiver](http://www.onkyo.com/) from Home Assistant. +The `onkyo` platform allows you to control a [Onkyo receiver](http://www.onkyo.com/) from Home Assistant. Please be aware that you need to enable "Network Standby" for this component to work in your Hardware. To add an Onkyo receiver to your installation, add the following to your `configuration.yaml` file: diff --git a/source/_components/media_player.soundtouch.markdown b/source/_components/media_player.soundtouch.markdown index 897fa228399..fa2a550b024 100644 --- a/source/_components/media_player.soundtouch.markdown +++ b/source/_components/media_player.soundtouch.markdown @@ -52,7 +52,7 @@ Configuration variables: You can switch between one of your 6 pre-configured presets using ```media_player.play_media``` ```yaml -# Play media in configuration.yaml +# Play media preset - service: media_player.play_media data: entity_id: media_player.soundtouch_living_room @@ -60,6 +60,23 @@ You can switch between one of your 6 pre-configured presets using ```media_playe media_content_type: PLAYLIST ``` +You can also play HTTP (not HTTPS) URLs: + +```yaml +# Play media URL +- service: media_player.play_media + data: + entity_id: media_player.soundtouch_living_room + media_content_id: http://example.com/music.mp3 + media_content_type: MUSIC +``` + +### {% linkable_title Text-to-Speech services %} + +You can use TTS services like [Google Text-to-Speech](/components/tts.google/) or [Amazon Polly](/components/tts.amazon_polly) only if your Home Assistant is configured in HTTP and not HTTPS (current device limitation, a firmware upgrade is planned). + +A workaround if you want to publish your Home Assistant installation on Internet in SSL is to configure an HTTPS Web Server as a reverse proxy ([nginx](/docs/ecosystem/nginx/) for example) and let your Home Assistant configuration in HTTP on your local network. The Soundtouch devices will be available to access the TTS files in HTTP in local and your configuration will be in HTTPS on the Internet. + ### {% linkable_title Service `soundtouch_play_everywhere` %} Create a multi-room (zone) from a master and play same content on all other diff --git a/source/_components/media_player.vizio.markdown b/source/_components/media_player.vizio.markdown new file mode 100644 index 00000000000..b5bbaec40d6 --- /dev/null +++ b/source/_components/media_player.vizio.markdown @@ -0,0 +1,93 @@ +--- +layout: page +title: "Vizio SmartCast TV" +description: "Instructions how to integrate Vizio SmartCast TV into Home Assistant." 
+date: 2017-07-10 19:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: vizio-smartcast.png +ha_category: Media Player +featured: true +ha_release: 0.49 +ha_iot_class: "Local Polling" +--- + +The `vizio` component will allow you to control [SmartCast](https://www.vizio.com/smartcast-app) compatible TVs (2016+ models). + +## Pairing + +Before adding the TV to Home Assistant you'll need to pair it manually. To do so, follow these steps: + +Install the command-line tool using pip (you can choose to download it manually): + +```bash +$ pip3 install git+https://github.com/vkorn/pyvizio.git@master +$ pip3 install -I . +``` + +Make sure that your TV is on, as sometimes it won't show the PIN code if it wasn't on during pairing initialization. +If you don't know the IP address of your TV, run the following command: + +```bash +$ pyvizio --ip=0 --auth=0 discover +``` + +Initiate pairing: + +```bash +$ pyvizio --ip={ip} pair +``` + +Initiation will show you two different values: + +| Value | Description | +|:----------------|:---------------------| +| Challenge type | Usually this should be `"1"`; if that's not the case for you, use the additional parameter `--ch_type=your_type` in the next step | +| Challenge token | Token required to finalize pairing in the next step | + +Finally, at this point a PIN code should be displayed at the top of your TV. With all these values, you can now finish pairing: + +```bash +$ pyvizio --ip={ip} pair_finish --token={challenge_token} --pin={tv_pin} +``` + +You will need the authentication token returned by this command to configure Home Assistant. + +## Configuration + +To add your Vizio TV to your installation, add the following to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +media_player: + - platform: vizio + host: IP_ADDRESS + access_token: AUTH_TOKEN +``` + +Configuration variables: + +- **host** (*Required*): IP address of your TV. +- **access_token** (*Required*): Authentication token you received in the last step of the pairing process. + +## Notes and limitations + +### Turning TV on + +If the `Power Mode` of your TV is configured to be `Eco Mode`, turning the device on won't work. + +### Changing tracks + +Changing tracks works like channel switching. If the current source is something other than regular TV, it might do nothing. + +### Sources + +The source list shows all external devices connected to the TV through HDMI plus a list of internal devices (TV mode, Chromecast, etc.). + ++The Vizio SmartCast service is accessible through HTTPS with a self-signed certificate. This means that if you have a low LOGLEVEL in your Home Assistant configuration, you'll see a lot of warnings like this: `InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised.` + +As an option, you could proxy all calls through, for example, NGINX. +
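Once paired and configured, the TV shows up as a regular `media_player` entity, so the standard media player services can be used with it. A small sketch (the entity id below is an assumption; yours depends on the name the TV gets in Home Assistant):

```yaml
# Example script that switches the TV off
script:
  tv_off:
    sequence:
      - service: media_player.turn_off
        entity_id: media_player.vizio_smartcast
```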
diff --git a/source/_components/notify.discord.markdown b/source/_components/notify.discord.markdown index 44abb2deed4..2a102d073f7 100644 --- a/source/_components/notify.discord.markdown +++ b/source/_components/notify.discord.markdown @@ -14,7 +14,7 @@ ha_release: 0.37 The [Discord service](https://discordapp.com/) is a platform for the notify component. This allows components to send messages to the user using Discord. -In order to get a token you need to go to the [Discord My Apps page](https://discordapp.com/developers/applications/me) and create a new application. Once the application is ready, create a [bot](https://discordapp.com/developers/docs/topics/oauth2#bots) user (**Create a Bot User**) and don't activate **Require OAuth2 Code Grant**. Retreive the **Client ID** and the (hidden) **Token** of your bot for later. +In order to get a token you need to go to the [Discord My Apps page](https://discordapp.com/developers/applications/me) and create a new application. Once the application is ready, create a [bot](https://discordapp.com/developers/docs/topics/oauth2#bots) user (**Create a Bot User**) and activate **Require OAuth2 Code Grant**. Retreive the **Client ID** and the (hidden) **Token** of your bot for later. When setting up the application you can use this [icon](https://home-assistant.io/demo/favicon-192x192.png). diff --git a/source/_components/notify.lametric.markdown b/source/_components/notify.lametric.markdown new file mode 100644 index 00000000000..70ad2a4cdc0 --- /dev/null +++ b/source/_components/notify.lametric.markdown @@ -0,0 +1,20 @@ +--- +layout: page +title: "LaMetric Notify" +description: "Instructions on how to setup the LaMetric notify platform with Home Assistant." +date: 2017-04-02 13:28 +sidebar: true +comments: false +sharing: true +footer: true +ha_category: Notify +ha_release: 0.49 +--- + +```yaml +notify: + name: lametric1 + platform: lametric + display_time: 20 + icon: i555 +``` diff --git a/source/_components/notify.slack.markdown b/source/_components/notify.slack.markdown index d319a37438c..88cee6a70e9 100644 --- a/source/_components/notify.slack.markdown +++ b/source/_components/notify.slack.markdown @@ -40,5 +40,47 @@ Configuration variables: - **username** (*Optional*): Setting username will allow Home Assistant to post to Slack using the username specified. By default not setting this will post to Slack using the user account or botname that you generated the api_key as. - **icon** (*Optional*): Use one of the Slack emoji's as an Icon for the supplied username. Slack uses the standard emoji sets used [here](http://www.webpagefx.com/tools/emoji-cheat-sheet/). +### {% linkable_title Slack service data %} + +The following attributes can be placed `data` for extended functionality. + +| Service data attribute | Optional | Description | +| ---------------------- | -------- | ----------- | +| `file` | yes | Groups the attributes for file upload. If present, either `url` or `path` have to be provided. +| `path ` | yes | Local path of file, photo etc to post to slack. Is placed inside `file`. +| `url` | yes | URL of file, photo etc to post to slack. Is placed inside `file`. +| `username` | yes | Username if the url requires authentication. Is placed inside `file`. +| `password` | yes | Password if the url requires authentication. Is placed inside `file`. +| `auth` | yes | If set to `digest` HTTP-Digest-Authentication is used. If missing HTTP-BASIC-Authentication is used. Is placed inside `file`. 
+ +Example for posting file from URL +```json +{ + "message":"Message that will be added as a comment to the file.", + "title":"Title of the file.", + "data":{ + "file":{ + "url":"http://[url to file, photo, security camera etc]", + "username":"optional user, if necessary", + "password":"optional password, if necessary", + "auth":"digest" + } + } +} +``` +Example for posting file from local path +```json +{ + "message":"Message that will be added as a comment to the file.", + "title":"Title of the file.", + "data":{ + "file":{ + "path":"/path/to/file.ext" + } + } +} +``` +Please note that `path` is validated against the `whitelist_external_dirs` in the `configuration.yaml`. + To use notifications, please see the [getting started with automation page](/getting-started/automation/). diff --git a/source/_components/prometheus.markdown b/source/_components/prometheus.markdown new file mode 100644 index 00000000000..e0f701bcf37 --- /dev/null +++ b/source/_components/prometheus.markdown @@ -0,0 +1,38 @@ +--- +layout: page +title: "Prometheus" +description: "Record events in Prometheus." +date: 2017-06-25 08:00 +sidebar: true +comments: false +sharing: true +logo: prometheus.png +footer: true +ha_category: "History" +ha_release: 0.49 +--- + +The `prometheus` component exposes metrics in a format which [Prometheus](https://prometheus.io/) can read. + +To use the `prometheus` component in your installation, add the following to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +prometheus: +``` + +The Prometheus component has no configuration variables. + +You can then configure Prometheus to fetch metrics from Home Assistant by adding to its `scrape_configs` configuration. + +```yaml +# Example Prometheus scrape_configs entry + - job_name: 'hass' + scrape_interval: 60s + metrics_path: /api/prometheus + params: + api_password: ['PASSWORD'] + scheme: https + static_configs: + - targets: ['HOSTNAME:8123'] +``` diff --git a/source/_components/scene.velux.markdown b/source/_components/scene.velux.markdown new file mode 100644 index 00000000000..e94928d889a --- /dev/null +++ b/source/_components/scene.velux.markdown @@ -0,0 +1,19 @@ +--- +layout: page +title: "Velux Scene" +description: "Instructions on how to integrate Velux Scene with Home Assistant." +date: 2017-07-09 12:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: velux.png +ha_category: Scene +ha_release: 0.49 +ha_iot_class: "Local Polling" +--- + + +The `velux` scene platform allows you to control your [VELUX](http://www.velux.com/) windows. + +The requirement is that you have setup the [VELUX](/components/velux/) component. diff --git a/source/_components/sensor.amcrest.markdown b/source/_components/sensor.amcrest.markdown index dcdb919a014..538369de5a3 100644 --- a/source/_components/sensor.amcrest.markdown +++ b/source/_components/sensor.amcrest.markdown @@ -13,34 +13,13 @@ ha_release: 0.37 ha_iot_class: "Local Polling" --- -The `amcrest` sensor allows you to integrate your [Amcrest](https://amcrest.com/) IP camera in Home Assistant. +To get your [Amcrest](https://amcrest.com/) cameras working within Home Assistant, please follow the instructions for the general [Amcrest component](/components/amcrest). 
-To enable the `amcrest` sensors on your camera, add the following to your `configuration.yaml` file: +Once you have enabled the [Amcrest component](/components/amcrest), add the following to your `configuration.yaml` file: ```yaml # Example configuration.yaml entry sensor: - platform: amcrest - host: IP_ADDRESS - username: USERNAME - password: PASSWORD - monitored_conditions: - - motion_detector - - sdcard - - ptz_preset ``` - -Configuration variables: - -- **host** (*Required*): The IP address or hostname of your camera. If using hostname, make sure the DNS works as expected. -- **username** (*Required*): The username for accessing your camera. -- **password** (*Required*): The password for accessing your camera. -- **name** (*Optional*): This parameter allows you to override the name of your camera. The default is "Amcrest Camera". -- **port** (*Optional*): The port that the camera is running on. The default is 80. -- **scan_interval** (*Optional*): Defines the update interval of the sensor in seconds. The default is 10 seconds. -- **monitored_conditions** array (*Required*): Conditions to display in the frontend. The following conditions can be monitored. - - **motion_detector**: Return True/False when a motion is detected - - **sdcard**: Return the SD card usage by reporting the total and used space - - **ptz_preset**: Return the number of PTZ preset positions configured for the given camera - To check if your Amcrest camera is supported/tested, visit the [supportability matrix](https://github.com/tchellomello/python-amcrest#supportability-matrix) link from the `python-amcrest` project. diff --git a/source/_components/sensor.arwn.markdown b/source/_components/sensor.arwn.markdown index f7d76b8b6f8..fb66cc50bd0 100644 --- a/source/_components/sensor.arwn.markdown +++ b/source/_components/sensor.arwn.markdown @@ -22,4 +22,4 @@ sensor: - platform: arwn ``` -Currently all temperatures, barometers, and wind sensors will be displayed. Support for rain gauge sensors will happen in the future. +Currently all temperatures, barometers, moisture, rain, and wind sensors will be displayed. diff --git a/source/_components/sensor.buienradar.markdown b/source/_components/sensor.buienradar.markdown index 750a0ae7901..2bba6a25e68 100644 --- a/source/_components/sensor.buienradar.markdown +++ b/source/_components/sensor.buienradar.markdown @@ -15,7 +15,7 @@ ha_iot_class: "Cloud Polling" The `buienradar` platform uses [buienradar.nl](http://buienradar.nl/) as an source for current meteorological data for your location. The weather forecast is delivered by Buienradar, who provides a webservice that provides detailed weather information for users in The Netherlands. -The relevant weatherstation used will be automatically selected based on the location specified in the Home Assistant configuration (or in the buienradar weather/sensor component). +The relevant weatherstation used will be automatically selected based on the location specified in the Home Assistant configuration (or in the buienradar weather/sensor component). The selected weatherstation will provide all weather data, with the exception of the forecasted precipitaion. The forecasted precipitation data will be retrieved from buienradar using your actual gps-location (and not the location of the nearest weatherstation). To integrate `buienradar` with Home Assistant, add the following section to your `configuration.yaml` file: @@ -45,16 +45,16 @@ Configuration variables: - **groundtemperature**: The current ground temperature (in C). 
- **windspeed**: The wind speed in m/s. - **windforce**: The wind speed/force in Bft. - - **winddirection**: Where the wind is coming from in degrees, with true north at 0° and progressing clockwise. - - **windazimuth**: Where the wind is coming from: N (North),Z (south), NO (Noth-East), etc. + - **winddirection**: Where the wind is coming from: N (North),Z (south), NO (Noth-East), etc. + - **windazimuth**: Where the wind is coming from in degrees, with true north at 0° and progressing clockwise. - **pressure**: The sea-level air pressure in hPa. - **visibility**: Visibility in meters (m). - **windgust**: The windspeed of wind gusts (m/s). - **precipitation**: The amount of precipitation/rain in mm/h. - - **irradiance**: Sun intensity in Watt per square meter (W/m2). - **precipitation_forecast_average**: The average expected precipitation/rain in mm/h within the given timeframe. - - **precipitation_forecast_total**: The total expected precipitation/rain in mm/h within the given timeframe. - + - **precipitation_forecast_total**: The total expected precipitation/rain in mm within the given timeframe. The total expected rain in the configured timeframe will be equal to _precipitation_forecast_total_/_timeframe_ mm/min. So, with timeframe configured to 30 minutes and a value of 5, the expected rain is 5 mm in 30 minutes, which is the same as 10 mm/h. If timeframe is set to 90 minutes and a value of 5, the expected rain is 5 mm in 90 minutes, which is equal to 3.3 mm/h. + - **irradiance**: Sun intensity in Watt per square meter (W/m2). + Full configuration example where location is manually specified: ```yaml diff --git a/source/_components/sensor.citybikes.markdown b/source/_components/sensor.citybikes.markdown new file mode 100644 index 00000000000..44a222ddc7a --- /dev/null +++ b/source/_components/sensor.citybikes.markdown @@ -0,0 +1,46 @@ +--- +layout: page +title: "CityBikes API sensor" +description: "Instructions on how to integrate data from the CityBikes API into Home Assistant." +date: 2017-06-25 14:20 +sidebar: true +comments: false +sharing: true +footer: true +ha_category: Sensor +ha_release: 0.49 +--- + + +The `citybikes` sensor platform monitors bike availability at bike sharing stations in a chosen area. The data is provided by [CityBikes](https://citybik.es/#about), which supports bike sharing systems all around the world. + +To enable it, add the following lines to your `configuration.yaml`: + +```yaml +# Example configuration.yaml entry (using radius) +sensor: + - platform: citybikes + radius: 500 +``` + +Configuration options: + +- **name** (*Optional*): The base name of this group of monitored stations. The entity ID of every monitored station in this group will be prefixed with this base name, in addition to the network ID. +- **network** (*Optional*): The name of the bike sharing system to poll. Defaults to the system that operates in the monitored location. +- **latitude** (*Optional*): Latitude of the location, around which bike stations are monitored. Defaults to the latitude in your your `configuration.yaml` file. +- **longitude** (*Optional*): Longitude of the location, around which bike stations are monitored. Defaults to the longitude in your your `configuration.yaml` file. +- **radius** (*Optional*): The radius (in meters or feet, depending on the Home Assistant configuration) around the monitored location. Only stations closer than this distance will be monitored. +- **stations** array (*Optional*): A list of specific stations to monitor. 
The list should contain station `ID`s or `UID`s, which can be obtained from the CityBikes API. + +Additional configuration samples: + +```yaml +# Example configuration.yaml entry (using a list of stations) +sensor: + - platform: citybikes + name: Work Stations + stations: + - 123 + - 145 + - 436 +``` diff --git a/source/_components/sensor.london_underground.markdown b/source/_components/sensor.london_underground.markdown new file mode 100644 index 00000000000..a9139412989 --- /dev/null +++ b/source/_components/sensor.london_underground.markdown @@ -0,0 +1,45 @@ +--- +layout: page +title: "London Undergound" +description: "Display the current status of London underground & overground lines within Home Assistant." +date: 2017-07-30 18:45 +sidebar: true +comments: false +sharing: true +footer: true +logo: train.png +ha_category: Transport +ha_iot_class: "Cloud Polling" +ha_release: 0.49 +--- + + +The `london_underground` sensor will display the status of London underground lines, as well as the Overground, DLR and Tfl rail. + + +```yaml +# Example configuration.yaml entry +sensor: + - platform: london_underground + line: + - Bakerloo + - Central + - Circle + - District + - DLR + - Hammersmith & City + - Jubilee + - London Overground + - Metropolitan + - Northern + - Piccadilly + - TfL Rail + - Victoria + - Waterloo & City +``` + +Configuration variables: + +- **line** (*Required*): Enter the name of at least one line. + +Powered by TfL Open Data [TFL](https://api.tfl.gov.uk/). diff --git a/source/_components/sensor.otp.markdown b/source/_components/sensor.otp.markdown new file mode 100644 index 00000000000..9810cf0bc9f --- /dev/null +++ b/source/_components/sensor.otp.markdown @@ -0,0 +1,46 @@ +--- +layout: page +title: "OTP Sensor" +description: "Instructions how to add One-Time Password (OTP) sensors into Home Assistant." +date: 2017-07-04 07:00:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: home-assistant.png +ha_category: Sensor +ha_iot_class: "Local Polling" +ha_release: 0.49 +--- + +The `otp` sensor generates One-Time Passwords according to [RFC6238](https://tools.ietf.org/html/rfc6238) that is compatible with most OTP generators available, including Google Authenticator. You can use this when building custom security solutions and want to use "rolling codes", that change every 30 seconds. + +To enable the OTP sensor, add the following lines to your `configuration.yaml`: + +```yaml +# Example configuration.yaml entry +sensor: + - platform: otp + token: SHARED_SECRET_TOKEN +``` + +Configuration variables: + +- **name** (*Optional*): Name of the sensor to use in the frontend. Defaults to `OTP Sensor`. +- **token** (*Required*): The shared secret you use in your OTP generator (e.g. Google Authenticator on your phone) + +## Generating a token + +A simple way to generate a `token` for a new sensor is to run this snippet of python code in your Home Assistant virtual environment: + +```shell +$ pip install pyotp +$ python -c 'import pyotp; print("Token: " + pyotp.random_base32())' +Token: IHEDPEBEVA2WVHB7 +``` + +Copy and paste the token into your Home Assistant configuration and add it to your OTP generator. Verify that they generate the same code. + ++It is vital that your system clock is correct both on your Home Assistant server and on your OTP generator device (e.g. your phone). If not, the generated codes will not match! Make sure NTP is running and syncing your time correctly before creating an issue. +
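To double-check that both sides share the same secret, you can print the current code for a token from the Home Assistant virtual environment and compare it with what your authenticator app shows. The token below is the example one generated above:

```bash
$ python -c 'import pyotp; print(pyotp.TOTP("IHEDPEBEVA2WVHB7").now())'
```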
diff --git a/source/_components/sensor.rflink.markdown b/source/_components/sensor.rflink.markdown index c99e751de7c..38b529af807 100644 --- a/source/_components/sensor.rflink.markdown +++ b/source/_components/sensor.rflink.markdown @@ -44,7 +44,7 @@ Device configuration variables: - **name** (*Optional*): Name for the device, defaults to RFLink ID. - **sensor_type** (*Required*): Override automatically detected type of sensor. For list of values see below. - **unit_of_measurement** (*Optional*): Override automatically detected unit of sensor. -- **aliasses** (*Optional*): Alternative RFLink ID's this device is known by. +- **aliases** (*Optional*): Alternative RFLink ID's this device is known by. Sensor type values: diff --git a/source/_components/sensor.uber.markdown b/source/_components/sensor.uber.markdown index b742b849358..b0bb558d3c0 100644 --- a/source/_components/sensor.uber.markdown +++ b/source/_components/sensor.uber.markdown @@ -14,8 +14,7 @@ ha_release: 0.16 --- -The `uber` sensor will give you time and price estimates for all available [Uber](https://uber.com) products at the given `start_latitude` and `start_longitude`.The `ATTRIBUTES` are used to provide extra information about products, such as estimated trip duration, distance and vehicle capacity. By default, 2 sensors will be created for each product at the given `start` location, one for pickup time and one for current price. The sensor is powered by the official Uber [API](https://developer.uber.com/). - +The `uber` sensor will give you time and price estimates for all available [Uber](https://uber.com) products at the given location. The `ATTRIBUTES` are used to provide extra information about products, such as estimated trip duration, distance and vehicle capacity. By default, 2 sensors will be created for each product at the given `start` location, one for pickup time and one for current price. The sensor is powered by the official Uber [API](https://developer.uber.com/). You must create an application [here](https://developer.uber.com/dashboard/create) to obtain a `server_token`. @@ -26,15 +25,13 @@ To enable this sensor, add the following lines to your `configuration.yaml` file sensor: - platform: uber server_token: 'BeAPPTDsWZSHLf7fd9OWjZkIezweRw18Q8NltY27' - start_latitude: 37.8116380 - start_longitude: -122.2648050 ``` Configuration variables: - **server_token** (*Required*): A server token obtained from [developer.uber.com](https://developer.uber.com) after [creating an app](https://developer.uber.com/dashboard/create). -- **start_latitude** (*Required*): The starting latitude for a trip. -- **start_longitude** (*Required*): The starting longitude for a trip. +- **start_latitude** (*Optional*): The starting latitude for a trip. Defaults to the latitude in your your `configuration.yaml` file. +- **start_longitude** (*Optional*): The starting longitude for a trip. Defaults to the longitude in your `configuration.yaml` file. - **end_latitude** (*Optional*): The ending latitude for a trip. While `end_latitude` is optional, it is strongly recommended to provide an `end_latitude`/`end_longitude` when possible as you will get more accurate price and time estimates. - **end_longitude** (*Optional*): The ending longitude for a trip. While `end_longitude` is optional, it is strongly recommended to provide an `end_latitude`/`end_longitude` when possible as you will get more accurate price and time estimates. - **product_ids** (*Options*): A list of Uber product UUIDs. 
If provided, sensors will only be created for the given product IDs. Please note that product IDs are region and some times even more specific geographies based. The easiest way to find a UUID is to click on a sensor in the Home Assistant frontend and look for "Product ID" in the attributes. diff --git a/source/_components/switch.flux.markdown b/source/_components/switch.flux.markdown index a48b0367b13..3cdb2e82342 100644 --- a/source/_components/switch.flux.markdown +++ b/source/_components/switch.flux.markdown @@ -51,5 +51,5 @@ Configuration variables: - **stop_colortemp** (*Optional*): The color temperature at the end. Defaults to `1900`. - **brightness** (*Optional*): The brightness of the lights. Calculated with `RGB_to_xy` by default. - **disable_brightness_adjust** (*Optional*): If true, brightness will not be adjusted besides color temperature. Defaults to False. -- **mode** (*Optional*): Select how color temperature is passed to lights. Valid values are `xy` and `mired`. Defaults to `xy`. +- **mode** (*Optional*): Select how color temperature is passed to lights. Valid values are `xy`, `mired` and `rgb`. Defaults to `xy`. diff --git a/source/_components/switch.rflink.markdown b/source/_components/switch.rflink.markdown index c4b24be8a93..fc7ab2b7a9f 100644 --- a/source/_components/switch.rflink.markdown +++ b/source/_components/switch.rflink.markdown @@ -49,15 +49,18 @@ Configuration variables: Device configuration variables: - **name** (*Optional*): Name for the device, defaults to RFLink ID. -- **aliasses** (*Optional*): Alternative RFLink ID's this device is known by. +- **aliases** (*Optional*): Alternative RFLink ID's this device is known by. - **fire_event** (*Optional*): Fire an `button_pressed` event if this device is turned on or off (default: False). - **signal_repetitions** (*Optional*): Repeat every RFLink command this number of times (default: 1) +- **group** (*Optional*): Allow switch to respond to group commands (ALLON/ALLOFF). (default: yes) +- **group_aliases** (*Optional*): `aliases` which only respond to group commands. +- **no_group_aliases** (*Optional*): `aliases` which do not respond to group commands. ### {% linkable_title Switch state %} Initially the state of a switch is unknown. When the switch is turned on or off (via frontend or wireless remote) the state is known and will be shown in the frontend. -Sometimes a switch is controlled by multiple wireless remotes, each remote has its own code programmed in the switch. To allow tracking of the state when switched via other remotes add the corresponding remote codes as aliasses: +Sometimes a switch is controlled by multiple wireless remotes, each remote has its own code programmed in the switch. To allow tracking of the state when switched via other remotes add the corresponding remote codes as aliases: ```yaml # Example configuration.yaml entry @@ -67,7 +70,7 @@ switch: newkaku_0000c6c2_1: name: Ceiling fan icon: mdi:fan - aliasses: + aliases: - newkaku_000000001_2 - kaku_000001_a ``` diff --git a/source/_components/switch.xiaomi_vacuum.markdown b/source/_components/switch.xiaomi_vacuum.markdown new file mode 100644 index 00000000000..151a1ff47e3 --- /dev/null +++ b/source/_components/switch.xiaomi_vacuum.markdown @@ -0,0 +1,79 @@ +--- +layout: page +title: "Xiaomi Mi Robot Vacuum" +description: "Instructions how to integrate your Xiaomi Mi Robot Vacuum within Home Assistant." 
+date: 2017-05-05 18:11 +sidebar: true +comments: false +sharing: true +footer: true +logo: xiaomi_vacuum.png +ha_category: Switch +ha_release: 0.48 +--- + +The `xiaomi_vacuum` switch platform allows you to control the state of your [Xiaomi Mi Robot Vacuum](http://www.mi.com/roomrobot/). +Currently supported features are `start` and `stop` (goes to dock). + +{% linkable_title Getting started %} + +Follow the pairing process using your phone and the Mi-Home app. From here you will be able to retrieve the token from a SQLite file inside your phone. + ++If your Home Assistant installation is running in a [Virtualenv](/docs/installation/virtualenv/#upgrading-home-assistant), make sure you activate it by running the commands below.
+ +```bash +$ sudo su -s /bin/bash homeassistant +$ source /srv/homeassistant/bin/activate +``` + +In order to fetch the token follow these instructions depending on your mobile phone platform. + +### Windows and Android +1. Configure the robot with the Mi-Home app. +2. Enable developer mode and USB debugging on the Android phone and plug it into the computer. +3. Get ADB tool for Windows : https://developer.android.com/studio/releases/platform-tools.html +4. Create a backup of the application com.xiaomi.smarthome: +```bash +.\adb backup -noapk com.xiaomi.smarthome -f backup.ab +``` +5. If you have this message : "More than one device or emulator", use this command to list all devices: +```bash +.\adb devices +``` +and execute this command: +```bash +.\adb -s DEVICEID backup -noapk com.xiaomi.smarthome -f backup.ab # (with DEVICEID the device id from the previous command) +``` +6. On the phone, you must confirm the backup. DO NOT enter any password and press button to make the backup. +7. Get ADB Backup Extractor : https://sourceforge.net/projects/adbextractor/ +8. Extract All files from the backup: +```bash +java.exe -jar ../android-backup-extractor/abe.jar unpack backup.ab backup.tar "" +``` +9. Unzip the ".tar" file. +10. Open the sqlite DB miio2.db with a tool like SQLite Manager extension for FireFox. +11. Get token from "devicerecord" table. + + +### macOS and iOS +1. Setup iOS device with the Mi-Home app. +2. Create an unencrypted backup of the device using iTunes. +3. Install iBackup Viewer from here: http://www.imactools.com/iphonebackupviewer/ +4. Extract this file /raw data/com.xiami.mihome/_mihome.sqlite to your computer +5. Open the file extracted using notepad. You will then see the list of all the device in your account with their token. + +{% linkable_title Configuration %} + +```yaml +# Example configuration.yaml entry +- platform: xiaomi_vacuum + name: 'name of the robot' + host: 192.168.1.2 + token: your-token-here +``` + +Configuration variables: +- **name** (*Optional*): The name of your robot +- **host** (*Required*): The IP of your robot +- **token** (*Required*): The token of your robot. Go to Getting started section to read more about how to get it diff --git a/source/_components/upnp.markdown b/source/_components/upnp.markdown index 68cd9354ebc..a4211cf660e 100644 --- a/source/_components/upnp.markdown +++ b/source/_components/upnp.markdown @@ -14,22 +14,22 @@ ha_release: 0.18 The `upnp` component enables you to collect network statistics from your router such as bytes in/out and packets in/out. This information is provided by the Internet Gateway Device (IGD) Protocol if enabled on your router. -The IGD can also automatically create port forwarding mappings on your router for Home Assistant. +The IGD automatically creates port forwarding mappings on your router for Home Assistant, exposing your installation to the internet. The mapping will never automatically expire. Upon stopping Home Assistant, the mapping will be removed from your router. Please note that UPnP or NAT-PMP needs to be enabled on your router for this component to work. To integrate this into Home Assistant, add the following section to your `configuration.yaml` file: ```yaml -# Example configuration.yaml entry +# Example configuration.yaml entry with custom external portal upnp: + external_port: 80 ``` -A port mapping will be created using the IP address and port that Home Assistant is running on. The mapping will never automatically expire. 
Upon stopping Home Assistant, the mapping will be removed from your router. +If you wish to have the statistics without having port mapping done through IGD, add the option **port_mapping**. -If you which to have the statistics without having port mapping done through IGD, add the option: -```yaml -# Example configuration.yaml entry with port mapping disabled -upnp: - port_mapping: false -``` +Configuration variables: + +- **external_port** (*Optional*): Expose Home Assistant to the internet over this TCP port. Defaults to the configured Home Assistant port. +- **port_mapping** (*Optional*): Disables port mapping while keeping the network statistics sensors. +- **unit** (*Optional*): Unit for the UPnP sensors. Valid units are 'Bytes', 'KBytes', 'MBytes' and 'GBytes'. diff --git a/source/_components/velux.markdown b/source/_components/velux.markdown new file mode 100644 index 00000000000..f73ddfab26b --- /dev/null +++ b/source/_components/velux.markdown @@ -0,0 +1,30 @@ +--- +layout: page +title: "Velux" +description: "Instructions on how to integrate the Velux KLF 200 component with Home Assistant." +date: 2017-07-09 12:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: velux.png +ha_category: Hub +ha_release: 0.49 +ha_iot_class: "Local Polling" +--- + +[Velux](http://www.velux.com) integration for Home Assistant allows you to connect to a Velux KLF 200 interface, to control [io-homecontrol](http://www.io-homecontrol.com) devices like windows and blinds. The module allows you to start scenes configured within KLF 200. + +A `velux` section must be present in the `configuration.yaml` file and contain the following options as required: + +```yaml +# Example configuration.yaml entry +velux: + host: "192.168.1.23" + password: "velux123" +``` + +Configuration variables: + +- **host** (*Required*): The IP address or hostname of the KLF 200 to use. +- **password** (*Required*): The password of the KLF 200 interface. diff --git a/source/_components/wake_on_lan.markdown b/source/_components/wake_on_lan.markdown new file mode 100644 index 00000000000..e4a1e5a9f49 --- /dev/null +++ b/source/_components/wake_on_lan.markdown @@ -0,0 +1,36 @@ +--- +layout: page +title: "Wake on LAN" +description: "Instructions how to set up the Wake on LAN component in Home Assistant." +date: 2017-07-08 15:00 +sidebar: true +comments: false +sharing: true +footer: true +logo: ethernet.png +ha_category: Hub +ha_release: "0.49" +ha_iot_class: "Local Push" +--- + +The `wake_on_lan` component enables the ability to send _magic packets_ to [Wake on LAN](https://en.wikipedia.org/wiki/Wake-on-LAN) capable devices, in order to turn them on. + +To use this component in your installation, add the following to your `configuration.yaml` file: + +```yaml +# Example configuration.yaml entry +wake_on_lan: +``` + +### {% linkable_title Component services %} + +Available services: `send_magic_packet`. + +#### {% linkable_title Service `wake_on_lan/send_magic_packet` %} + +Send a _magic packet_ to wake up a device with 'Wake-On-LAN' capabilities. + +| Service data attribute | Optional | Description | +|---------------------------|----------|-------------------------------------------------------| +| `mac` | no | MAC address of the device to wake up. | +| `broadcast_address` | yes | Optional broadcast IP where to send the magic packet. 
| diff --git a/source/_faq/after-upgrading.markdown b/source/_faq/after-upgrading.markdown index 1c137fc726f..66e94bc8477 100644 --- a/source/_faq/after-upgrading.markdown +++ b/source/_faq/after-upgrading.markdown @@ -13,4 +13,5 @@ ha_category: Usage After upgrading to a new version, you may notice your browser gets stuck at the "loading data" login screen. Close the window/tab and go into your browser settings and delete all the cookies for your URL. You can then log back in and it should work. Android Chrome + chrome -> settings -> site settings -> storage -> search for your URL for home assistant-> "clear & reset" diff --git a/source/_posts/2017-07-16-release-49.markdown b/source/_posts/2017-07-16-release-49.markdown new file mode 100644 index 00000000000..821b818fa6b --- /dev/null +++ b/source/_posts/2017-07-16-release-49.markdown @@ -0,0 +1,462 @@ +--- +layout: post +title: "0.49: Themes 🎨, kiosk mode and Prometheus.io" +description: "Style the frontend the way you want it and present it in Kiosk mode without tabs." +date: 2017-07-15 00:02:05 +0000 +date_formatted: "July 15, 2017" +author: Paulus Schoutsen +author_twitter: balloob +comments: true +categories: Release-Notes +og_image: /images/blog/2017-07-0.49/components.png +--- + +
+Screenshot of a green dashboard