Merge branch 'current' into next

This commit is contained in:
Paulus Schoutsen 2019-04-03 14:44:04 -07:00
commit 484fe6bdf4
518 changed files with 2465 additions and 599 deletions

View File

@ -1,6 +1,8 @@
language: ruby
sudo: false
cache: bundler
before_install:
- gem install bundler:2.0.1
script: travis_wait bundle exec rake generate
after_success:
- '[ "${TRAVIS_BRANCH}" = "current" ] && [ "${TRAVIS_PULL_REQUEST}" = "false" ] && bundle exec rake deploy || false'
- '[ "${TRAVIS_BRANCH}" = "current" ] && [ "${TRAVIS_PULL_REQUEST}" = "false" ] && bundle exec rake deploy || false'

View File

@ -138,9 +138,9 @@ social:
# Home Assistant release details
current_major_version: 0
current_minor_version: 90
current_patch_version: 2
date_released: 2019-03-26
current_minor_version: 91
current_patch_version: 0
date_released: 2019-04-03
# Either # or the anchor link to latest release notes in the blog post.
# Must be prefixed with a # and have double quotes around it.

View File

@ -11,6 +11,8 @@ logo: acer.png
ha_category: Multimedia
ha_iot_class: Local Polling
ha_release: 0.19
redirect_from:
- /components/switch.acer_projector/
---
The `acer_projector` switch platform allows you to control the state of RS232 connected projectors from [Acer](http://www.acer.com).

View File

@ -10,6 +10,8 @@ footer: true
logo: actiontec.png
ha_category: Presence Detection
ha_release: 0.7
redirect_from:
- /components/device_tracker.actiontec/
---

View File

@ -11,6 +11,8 @@ logo: aftership.png
ha_category: Postal Service
ha_release: 0.85
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.aftership/
---
The `aftership` platform allows one to track deliveries by [AfterShip](https://www.aftership.com), a service that supports 490+ couriers worldwide. It is free to use for up to 100 tracked packages per month; after that there is a fee.

View File

@ -11,6 +11,8 @@ logo: airvisual.jpg
ha_category: Health
ha_release: 0.53
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.airvisual/
---
The `airvisual` sensor platform queries the [AirVisual](https://airvisual.com/) API for air quality data. Data can be collected via latitude/longitude or by city/state/country. The resulting information creates sensors for the Air Quality Index (AQI), the human-friendly air quality level, and the main pollutant of that area. Sensors that conform to either/both the [U.S. and Chinese air quality standards](http://www.clm.com/publication.cfm?ID=366) can be created.
@ -138,6 +140,8 @@ When configured, the platform will create three sensors for each configured air
- **Explanation:**
AQI | Status | Description
------- | :----------------: | ----------
0 - 50 | **Good** | Air quality is considered satisfactory, and air pollution poses little or no risk
51 - 100 | **Moderate** | Air quality is acceptable; however, for some pollutants there may be a moderate health concern for a very small number of people who are unusually sensitive to air pollution
@ -160,6 +164,8 @@ AQI | Status | Description
- **Explanation:**
Pollutant | Symbol | More Info
------- | :----------------: | ----------
Particulate (<= 2.5 μm) | PM2.5 | [EPA: Particulate Matter (PM) Pollution ](https://www.epa.gov/pm-pollution)
Particulate (<= 10 μm) | PM10 | [EPA: Particulate Matter (PM) Pollution ](https://www.epa.gov/pm-pollution)

View File

@ -11,6 +11,8 @@ logo: aladdin_connect.png
ha_category: Cover
ha_release: 0.75
ha_iot_class: Cloud Polling
redirect_from:
- /components/cover.aladdin_connect/
---
The `aladdin_connect` cover platform lets you control Genie Aladdin Connect garage doors through Home Assistant.

View File

@ -10,6 +10,8 @@ footer: true
logo: alarmdotcom.png
ha_category: Alarm
ha_release: 0.11
redirect_from:
- /components/alarm_control_panel.alarmdotcom/
---
The `alarmdotcom` platform is consuming the information provided by [Alarm.com](https://www.alarm.com/).

View File

@ -0,0 +1,88 @@
---
layout: page
title: "Amazon Alexa Flash Briefing"
description: "Instructions on how to create your Flash Briefing skills with Home Assistant."
date: 2019-03-14 00:00
sidebar: true
comments: false
sharing: true
footer: true
logo: amazon-alexa.png
ha_category: Voice
featured: false
ha_release: "0.31"
---
## {% linkable_title Flash Briefing Skills %}
As of version [0.31][zero-three-one] Home Assistant supports the new [Alexa Flash Briefing Skills API][flash-briefing-api]. A Flash Briefing Skill adds a new Flash Briefing source that is generated by Home Assistant.
### {% linkable_title Requirements %}
Amazon requires the endpoint of a skill to be hosted via SSL. Self-signed certificates are OK because our skills will only run in development mode. Read more on [our blog][blog-lets-encrypt] about how to set up encryption for Home Assistant. When running Hass.io, using the [Let's Encrypt](/addons/lets_encrypt/) and [Duck DNS](/addons/duckdns/) add-ons is the easiest method. If you are unable to get HTTPS up and running, consider using [this AWS Lambda proxy for Alexa skills](https://community.home-assistant.io/t/aws-lambda-proxy-custom-alexa-skill-when-you-dont-have-https/5230).
Additionally, note that at the time of this writing, your Alexa skill endpoint *must* accept requests over port 443 (Home Assistant defaults to 8123). There are two ways you can handle this:
1. In your router, forward external 443 to your Home Assistant serving port (defaults to 8123)
OR
2. Change your Home Assistant serving port to 443; this is done in the [`http`](/components/http/) section with the `server_port` entry in your `configuration.yaml` file (see the sketch below)
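A minimal sketch of option 2, assuming you also terminate SSL directly in Home Assistant (the certificate paths are placeholders):

```yaml
# Example configuration.yaml entry (paths are placeholders)
http:
  server_port: 443
  ssl_certificate: /path/to/fullchain.pem
  ssl_key: /path/to/privkey.pem
```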
[blog-lets-encrypt]: /blog/2015/12/13/setup-encryption-using-lets-encrypt/
### {% linkable_title Configuring a Flash Briefing skill in Home Assistant %}
You can use [templates] for the `title`, `audio`, `text` and `display_url` configuration parameters.
Here's an example configuration of a Flash briefing skill that will tell you who is at home:
```yaml
{% raw %}# Example configuration.yaml entry
alexa:
  flash_briefings:
    whoishome:
      - title: Who's at home?
        text: >
          {%- if is_state('device_tracker.paulus', 'home') and
             is_state('device_tracker.anne_therese', 'home') -%}
            You are both home, you silly
          {%- else -%}
            Anne Therese is at {{ states("device_tracker.anne_therese") }}
            and Paulus is at {{ states("device_tracker.paulus") }}
          {% endif %}{% endraw %}
```
You can add multiple items for a feed if you want. The Amazon-required UID and timestamp will be randomly generated at startup and change at every restart of Home Assistant.
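As a rough sketch (the second item and its values are purely illustrative), a feed with more than one item could look like this:

```yaml
{% raw %}alexa:
  flash_briefings:
    whoishome:
      - title: Who's at home?
        text: >
          Anne Therese is at {{ states("device_tracker.anne_therese") }}
          and Paulus is at {{ states("device_tracker.paulus") }}
      - title: A second briefing item
        text: This static text would be read after the first item.
        display_url: https://www.home-assistant.io{% endraw %}
```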
Please refer to the [Amazon documentation][flash-briefing-api-docs] for more information about allowed configuration parameters and formats.
### {% linkable_title Configuring your Flash Briefing skill %}
- Log in to [Amazon developer console][amazon-dev-console]
- Click the Alexa navigation tab at the top of the console
- Click on the "Get Started >" button under "Alexa Skills Kit"
- Click the yellow "Add a new skill" button in the top right
- Skill Information
- For Skill Type select "Flash Briefing Skill API"
- You can enter whatever name you want
- Hit "Next"
- Interaction Model
- Nothing to do here
- Configuration
- Add new feed
- For URL, enter `https://YOUR_HOST/api/alexa/flash_briefings/BRIEFING_ID?api_password=YOUR_API_PASSWORD` where `BRIEFING_ID` is the key you entered in your configuration (such as `whoishome` in the above example). **NOTE:** Do not use a non-standard HTTP or HTTPS port, AWS will not connect to it.
- You can use this [specially sized Home Assistant logo][large-icon] as the Feed Icon
- All other settings are up to you
- Hit "Next"
- Test
- Having passed all validations to reach this screen, you can now click on "< Back to All Skills", as your flash briefing is now available as a "Development" service.
- To invoke your flash briefing, open the Alexa app on your phone or go to the [Alexa Settings Site][alexa-settings-site], open the "Skills" configuration section, select "Your Skills", scroll to the bottom, tap on the Flash Briefing Skill you just created, enable it, then manage Flash Briefing and adjust ordering as necessary. Finally, ask your Echo for your "news", "flash briefing", or "briefing".
[amazon-dev-console]: https://developer.amazon.com
[flash-briefing-api]: https://developer.amazon.com/alexa-skills-kit/flash-briefing
[flash-briefing-api-docs]: https://developer.amazon.com/public/solutions/alexa/alexa-skills-kit/docs/flash-briefing-skill-api-feed-reference
[large-icon]: /images/components/alexa/alexa-512x512.png
[small-icon]: /images/components/alexa/alexa-108x108.png
[templates]: /topics/templating/
[zero-three-one]: /blog/2016/10/22/flash-briefing-updater-hacktoberfest/
[alexa-settings-site]: http://alexa.amazon.com/
[emulated-hue-component]: /components/emulated_hue/

View File

@ -0,0 +1,274 @@
---
layout: page
title: "Amazon Alexa Custom Skill"
description: "Instructions on how to build your Alexa/Amazon Echo custom commands to connect with Home Assistant."
date: 2015-12-13 13:02
sidebar: true
comments: false
sharing: true
footer: true
logo: amazon-alexa.png
ha_category: Voice
featured: false
ha_release: "0.10"
---
## {% linkable_title I want to build custom commands to use with Echo %}
The built-in Alexa component allows you to integrate Home Assistant into Alexa/Amazon Echo. This component will allow you to query information and call services within Home Assistant by using your voice. Home Assistant offers no built-in sentences but offers a framework for you to define your own.
<div class='videoWrapper'>
<iframe width="560" height="315" src="https://www.youtube.com/embed/1Ke3mtWd_cQ" frameborder="0" allowfullscreen></iframe>
</div>
### {% linkable_title Requirements %}
Amazon requires the endpoint of a skill to be hosted via SSL. Self-signed certificates are OK because our skills will only run in development mode. Read more on [our blog][blog-lets-encrypt] about how to set up encryption for Home Assistant. When running Hass.io, using the [Let's Encrypt](/addons/lets_encrypt/) and [Duck DNS](/addons/duckdns/) add-ons is the easiest method. If you are unable to get HTTPS up and running, consider using [this AWS Lambda proxy for Alexa skills](https://community.home-assistant.io/t/aws-lambda-proxy-custom-alexa-skill-when-you-dont-have-https/5230).
Additionally, note that at the time of this writing, your Alexa skill endpoint *must* accept requests over port 443 (Home Assistant defaults to 8123). There are two ways you can handle this:
1. In your router, forward external 443 to your Home Assistant serving port (defaults to 8123)
OR
2. Change your Home Assistant serving port to 443; this is done in the [`http`](/components/http/) section with the `server_port` entry in your `configuration.yaml` file
[blog-lets-encrypt]: /blog/2015/12/13/setup-encryption-using-lets-encrypt/
To get started with Alexa skills:
- Log in to [Amazon developer console][amazon-dev-console]
- Click the Alexa button at the top of the console
- Click the yellow "Add a new skill" button in the top right
- Skill Type: Custom Interaction Model (default)
- Name: Home Assistant
- Invocation name: home assistant (or be creative, up to you)
- Version: 1.0
- Endpoint:
- https
- `https://YOUR_HOST/api/alexa?api_password=YOUR_API_PASSWORD`
You can use this [specially sized Home Assistant logo][large-icon] as the large icon and [this one][small-icon] as the small one.
### {% linkable_title Configuring your Amazon Alexa skill %}
Alexa works based on intents. Each intent has a name and variable slots. For example, a `LocateIntent` with a slot that contains a `User`. Example intent schema:
```json
{
  "intents": [
    {
      "intent": "LocateIntent",
      "slots": [
        {
          "name": "User",
          "type": "AMAZON.US_FIRST_NAME"
        }
      ]
    },
    {
      "intent": "WhereAreWeIntent",
      "slots": []
    }
  ]
}
```
To bind these intents to sentences said by users you define utterances. Example utterances can look like this:
```text
LocateIntent Where is {User}
LocateIntent Where's {User}
LocateIntent Where {User} is
LocateIntent Where did {User} go
WhereAreWeIntent where we are
```
This means that we can now ask Alexa things like:
- Alexa, ask Home Assistant where Paul is
- Alexa, ask Home Assistant where we are
## {% linkable_title Configuring Home Assistant %}
When activated, the Alexa component will have Home Assistant's native intent support handle the incoming intents. If you want to run actions based on intents, use the [`intent_script`](/components/intent_script) component.
To enable Alexa, add the following entry to your `configuration.yaml` file:
```yaml
alexa:
```
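For example, here is a rough sketch of an `intent_script` handler for the `LocateIntent` defined above; it assumes your device tracker entities are named after the first names used in the `User` slot (e.g., `device_tracker.paulus`):

```yaml
{% raw %}intent_script:
  LocateIntent:
    speech:
      type: plain
      text: >
        {{ User }} is at {{ states("device_tracker." + User | lower) }}{% endraw %}
```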
### {% linkable_title Working With Scenes %}
One of the most useful applications of Alexa integrations is to call scenes directly. This is easily achieved with some simple setup on the Home Assistant side and by letting Alexa know which scenes you want to run.
First, we will configure Alexa. In the Amazon Interaction module add this to the intent schema:
```json
{
  "intent": "ActivateSceneIntent",
  "slots": [
    {
      "name": "Scene",
      "type": "Scenes"
    }
  ]
}
```
Then create a custom slot type called `Scenes` listing every scene you want to control:
<p class='img'>
<img src='/images/components/alexa/scene_slot.png' />
Custom slot type for scene support.
</p>
The names must exactly match the scene names (minus underscores - Amazon discards them anyway and we later map them back in with the template).
In the new Alexa Skills Kit, you can also create synonyms for slot type values, which can be used in place of the base value in utterances. Synonyms will be replaced with their associated slot value in the intent request sent to the Alexa API endpoint, but only if there are not multiple synonym matches. Otherwise, the value of the synonym that was spoken will be used.
<p class='img'>
<img src='/images/components/alexa/scene_slot_synonyms.png' />
Custom slot values with synonyms.
</p>
Add a sample utterance:
```text
ActivateSceneIntent activate {Scene}
```
Then add the intent to your intent_script section in your HA config file:
```yaml
intent_script:
  ActivateSceneIntent:
    action:
      service: scene.turn_on
      data_template:
        entity_id: scene.{% raw %}{{ Scene | replace(" ", "_") }}{% endraw %}
    speech:
      type: plain
      text: OK
```
Here we are using [templates] to take the name we gave to Alexa e.g., `downstairs on` and replace the space with an underscore so it becomes `downstairs_on` as Home Assistant expects.
Now say `Alexa ask Home Assistant to activate <some scene>` and Alexa will activate that scene for you.
### {% linkable_title Adding Scripts %}
We can easily extend the above idea to work with scripts as well. As before, add an intent for scripts:
```json
{
  "intent": "RunScriptIntent",
  "slots": [
    {
      "name": "Script",
      "type": "Scripts"
    }
  ]
}
```
Create a custom slot type called `Scripts` listing every script you want to run:
<p class='img'>
<img src='/images/components/alexa/script_slot.png' />
Custom slot type for script support.
</p>
Add a sample utterance:
```text
RunScriptIntent run {Script}
```
Then add the intent to your intent_script section in your HA config file:
```yaml
intent_script:
  RunScriptIntent:
    action:
      service: script.turn_on
      data_template:
        entity_id: script.{% raw %}{{ Script | replace(" ", "_") }}{% endraw %}
    speech:
      type: plain
      text: OK
```
Now say `Alexa ask Home Assistant to run <some script>` and Alexa will run that script for you.
### {% linkable_title Support for Launch Requests %}
There may be times when you want to respond to a launch request initiated from a command such as "Alexa, Red Alert!".
To start, you need to get the skill id:
- Log into [Amazon developer console][amazon-dev-console]
- Click the Alexa button at the top of the console
- Click the Alexa Skills Kit Get Started button
- Locate the skill for which you would like Launch Request support
- Click the "View Skill ID" link and copy the ID
The configuration is the same as an intent with the exception being you will use your skill ID instead of the intent name.
```yaml
intent_script:
  amzn1.ask.skill.08888888-7777-6666-5555-444444444444:
    action:
      service: script.turn_on
      entity_id: script.red_alert
    speech:
      type: plain
      text: OK
```
## {% linkable_title Giving Alexa Some Personality %}
In the examples above, we told Alexa to say `OK` when she successfully completed the task. This is effective but a little dull! We can again use [templates] to spice things up a little.
First create a file called `alexa_confirm.yaml` with something like the following in it (go on, be creative!):
```text
{% raw %} >
  {{ [
    "OK",
    "Sure",
    "If you insist",
    "Done",
    "No worries",
    "I can do that",
    "Leave it to me",
    "Consider it done",
    "As you wish",
    "By your command",
    "Affirmative",
    "Yes oh revered one",
    "I will",
    "As you decree, so shall it be",
    "No Problem"
  ] | random }} {% endraw %}
```
Then, wherever you would put some simple text for a response like `OK`, replace it with a reference to the file so that:
```yaml
text: OK
```
becomes:
```yaml
text: !include alexa_confirm.yaml
```
Alexa will now respond with a random phrase each time. You can use the include for as many different intents as you like so you only need to create the list once.
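Putting the two snippets together, the scene intent from earlier then becomes:

```yaml
intent_script:
  ActivateSceneIntent:
    action:
      service: scene.turn_on
      data_template:
        entity_id: scene.{% raw %}{{ Scene | replace(" ", "_") }}{% endraw %}
    speech:
      type: plain
      text: !include alexa_confirm.yaml
```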
[amazon-dev-console]: https://developer.amazon.com
[large-icon]: /images/components/alexa/alexa-512x512.png
[small-icon]: /images/components/alexa/alexa-108x108.png
[templates]: /topics/templating/

View File

@ -1,13 +1,13 @@
---
layout: page
title: "Alexa / Amazon Echo"
title: "Amazon Alexa"
description: "Instructions on how to connect Alexa/Amazon Echo to Home Assistant."
date: 2015-12-13 13:02
sidebar: true
comments: false
sharing: true
footer: true
logo: amazon-echo.png
logo: amazon-alexa.png
ha_category: Voice
featured: true
ha_release: "0.10"
@ -21,379 +21,21 @@ For Home Assistant Cloud Users, documentation can be found [here](https://www.na
## {% linkable_title Manual setup %}
There are a few ways that you can use Amazon Echo and Home Assistant together.
There are a few ways that you can use Amazon Alexa and Home Assistant together.
- [Build custom commands to use](#i-want-to-build-custom-commands-to-use-with-echo)
- [Create a new Flash Briefing source](#flash-briefing-skills)
- [Use the Smart Home API to control lights, etc](#smart-home)
- [Build custom commands to use](/components/alexa.intent/)
- [Create a new Flash Briefing source](/components/alexa.flash_briefings/)
- [Use the Smart Home API to control lights, etc](/components/alexa.smart_home/)
- Alternative: use the [Emulated Hue component][emulated-hue-component] to trick Alexa into thinking Home Assistant is a Philips Hue hub.
Amazon has released [Echosim], a website that simulates the Alexa service in your browser. That way it is easy to test your skills without having access to a physical Amazon Echo.
[Echosim]: https://echosim.io/
## {% linkable_title I want to build custom commands to use with Echo %}
The built-in Alexa component allows you to integrate Home Assistant into Alexa/Amazon Echo. This component will allow you to query information and call services within Home Assistant by using your voice. Home Assistant offers no built-in sentences but offers a framework for you to define your own.
<div class='videoWrapper'>
<iframe width="560" height="315" src="https://www.youtube.com/embed/1Ke3mtWd_cQ" frameborder="0" allowfullscreen></iframe>
</div>
### {% linkable_title Requirements %}
Amazon requires the endpoint of a skill to be hosted via SSL. Self-signed certificates are OK because our skills will only run in development mode. Read more on [our blog][blog-lets-encrypt] about how to set up encryption for Home Assistant. When running Hass.io, using the [Let's Encrypt](/addons/lets_encrypt/) and [Duck DNS](/addons/duckdns/) add-ons is the easiest method. If you are unable to get HTTPS up and running, consider using [this AWS Lambda proxy for Alexa skills](https://community.home-assistant.io/t/aws-lambda-proxy-custom-alexa-skill-when-you-dont-have-https/5230).
Manual setup of the integration with Amazon Alexa has several requirements.
Additionally, note that at the time of this writing, your Alexa skill endpoint *must* accept requests over port 443 (Home Assistant defaults to 8123). There are two ways you can handle this:
1. In your router, forward external 443 to your Home Assistant serving port (defaults to 8123)
OR
2. Change your Home Assistant serving port to 443; this is done in the [`http`](/components/http/) section with the `server_port` entry in your `configuration.yaml` file
[blog-lets-encrypt]: /blog/2015/12/13/setup-encryption-using-lets-encrypt/
To get started with Alexa skills:
- Log in to [Amazon developer console][amazon-dev-console]
- Click the Alexa button at the top of the console
- Click the yellow "Add a new skill" button in the top right
- Skill Type: Custom Interaction Model (default)
- Name: Home Assistant
- Invocation name: home assistant (or be creative, up to you)
- Version: 1.0
- Endpoint:
- https
- `https://YOUR_HOST/api/alexa?api_password=YOUR_API_PASSWORD`
You can use this [specially sized Home Assistant logo][large-icon] as the large icon and [this one][small-icon] as the small one.
### {% linkable_title Configuring your Amazon Alexa skill %}
Alexa works based on intents. Each intent has a name and variable slots. For example, a `LocateIntent` with a slot that contains a `User`. Example intent schema:
```json
{
  "intents": [
    {
      "intent": "LocateIntent",
      "slots": [
        {
          "name": "User",
          "type": "AMAZON.US_FIRST_NAME"
        }
      ]
    },
    {
      "intent": "WhereAreWeIntent",
      "slots": []
    }
  ]
}
```
To bind these intents to sentences said by users you define utterances. Example utterances can look like this:
```text
LocateIntent Where is {User}
LocateIntent Where's {User}
LocateIntent Where {User} is
LocateIntent Where did {User} go
WhereAreWeIntent where we are
```
This means that we can now ask Alexa things like:
- Alexa, ask Home Assistant where Paul is
- Alexa, ask Home Assistant where we are
## {% linkable_title Configuring Home Assistant %}
When activated, the Alexa component will have Home Assistant's native intent support handle the incoming intents. If you want to run actions based on intents, use the [`intent_script`](/components/intent_script) component.
To enable Alexa, add the following entry to your `configuration.yaml` file:
```yaml
alexa:
```
### {% linkable_title Working With Scenes %}
One of the most useful applications of Alexa integrations is to call scenes directly. This is easily achieved with some simple setup on the Home Assistant side and by letting Alexa know which scenes you want to run.
First, we will configure Alexa. In the Amazon Interaction module add this to the intent schema:
```json
{
  "intent": "ActivateSceneIntent",
  "slots": [
    {
      "name": "Scene",
      "type": "Scenes"
    }
  ]
}
```
Then create a custom slot type called `Scenes` listing every scene you want to control:
<p class='img'>
<img src='/images/components/alexa/scene_slot.png' />
Custom slot type for scene support.
</p>
The names must exactly match the scene names (minus underscores - Amazon discards them anyway and we later map them back in with the template).
In the new Alexa Skills Kit, you can also create synonyms for slot type values, which can be used in place of the base value in utterances. Synonyms will be replaced with their associated slot value in the intent request sent to the Alexa API endpoint, but only if there are not multiple synonym matches. Otherwise, the value of the synonym that was spoken will be used.
<p class='img'>
<img src='/images/components/alexa/scene_slot_synonyms.png' />
Custom slot values with synonyms.
</p>
Add a sample utterance:
```text
ActivateSceneIntent activate {Scene}
```
Then add the intent to your intent_script section in your HA config file:
```yaml
intent_script:
  ActivateSceneIntent:
    action:
      service: scene.turn_on
      data_template:
        entity_id: scene.{% raw %}{{ Scene | replace(" ", "_") }}{% endraw %}
    speech:
      type: plain
      text: OK
```
Here we are using [templates] to take the name we gave to Alexa e.g., `downstairs on` and replace the space with an underscore so it becomes `downstairs_on` as Home Assistant expects.
Now say `Alexa ask Home Assistant to activate <some scene>` and Alexa will activate that scene for you.
### {% linkable_title Adding Scripts %}
We can easily extend the above idea to work with scripts as well. As before, add an intent for scripts:
```json
{
  "intent": "RunScriptIntent",
  "slots": [
    {
      "name": "Script",
      "type": "Scripts"
    }
  ]
}
```
Create a custom slot type called `Scripts` listing every script you want to run:
<p class='img'>
<img src='/images/components/alexa/script_slot.png' />
Custom slot type for script support.
</p>
Add a sample utterance:
```text
RunScriptIntent run {Script}
```
Then add the intent to your intent_script section in your HA config file:
```yaml
intent_script:
  RunScriptIntent:
    action:
      service: script.turn_on
      data_template:
        entity_id: script.{% raw %}{{ Script | replace(" ", "_") }}{% endraw %}
    speech:
      type: plain
      text: OK
```
Now say `Alexa ask Home Assistant to run <some script>` and Alexa will run that script for you.
### {% linkable_title Support for Launch Requests %}
There may be times when you want to respond to a launch request initiated from a command such as "Alexa, Red Alert!".
To start, you need to get the skill id:
- Log into [Amazon developer console][amazon-dev-console]
- Click the Alexa button at the top of the console
- Click the Alexa Skills Kit Get Started button
- Locate the skill for which you would like Launch Request support
- Click the "View Skill ID" link and copy the ID
The configuration is the same as an intent with the exception being you will use your skill ID instead of the intent name.
```yaml
intent_script:
  amzn1.ask.skill.08888888-7777-6666-5555-444444444444:
    action:
      service: script.turn_on
      entity_id: script.red_alert
    speech:
      type: plain
      text: OK
```
## {% linkable_title Giving Alexa Some Personality %}
In the examples above, we told Alexa to say `OK` when she successfully completed the task. This is effective but a little dull! We can again use [templates] to spice things up a little.
First create a file called `alexa_confirm.yaml` with something like the following in it (go on, be creative!):
```text
{% raw %} >
  {{ [
    "OK",
    "Sure",
    "If you insist",
    "Done",
    "No worries",
    "I can do that",
    "Leave it to me",
    "Consider it done",
    "As you wish",
    "By your command",
    "Affirmative",
    "Yes oh revered one",
    "I will",
    "As you decree, so shall it be",
    "No Problem"
  ] | random }} {% endraw %}
```
Then, wherever you would put some simple text for a response like `OK`, replace it with a reference to the file so that:
```yaml
text: OK
```
becomes:
```yaml
text: !include alexa_confirm.yaml
```
Alexa will now respond with a random phrase each time. You can use the include for as many different intents as you like so you only need to create the list once.
## {% linkable_title Flash Briefing Skills %}
As of version [0.31][zero-three-one] Home Assistant supports the new [Alexa Flash Briefing Skills API][flash-briefing-api]. A Flash Briefing Skill adds a new Flash Briefing source that is generated by Home Assistant.
### {% linkable_title Configuring a Flash Briefing skill in Home Assistant %}
You can use [templates] for the `title`, `audio`, `text` and `display_url` configuration parameters.
Here's an example configuration of a Flash briefing skill that will tell you who is at home:
```yaml
{% raw %}# Example configuration.yaml entry
alexa:
  flash_briefings:
    whoishome:
      - title: Who's at home?
        text: >
          {%- if is_state('device_tracker.paulus', 'home') and
             is_state('device_tracker.anne_therese', 'home') -%}
            You are both home, you silly
          {%- else -%}
            Anne Therese is at {{ states("device_tracker.anne_therese") }}
            and Paulus is at {{ states("device_tracker.paulus") }}
          {% endif %}{% endraw %}
```
You can add multiple items for a feed if you want. The Amazon required UID and timestamp will be randomly generated at startup and change at every restart of Home Assistant.
Please refer to the [Amazon documentation][flash-briefing-api-docs] for more information about allowed configuration parameters and formats.
### {% linkable_title Configuring your Flash Briefing skill %}
- Log in to [Amazon developer console][amazon-dev-console]
- Click the Alexa navigation tab at the top of the console
- Click on the "Get Started >" button under "Alexa Skills Kit"
- Click the yellow "Add a new skill" button in the top right
- Skill Information
- For Skill Type select "Flash Briefing Skill API"
- You can enter whatever name you want
- Hit "Next"
- Interaction Model
- Nothing to do here
- Configuration
- Add new feed
- For URL, enter `https://YOUR_HOST/api/alexa/flash_briefings/BRIEFING_ID?api_password=YOUR_API_PASSWORD` where `BRIEFING_ID` is the key you entered in your configuration (such as `whoishome` in the above example). **NOTE:** Do not use a non-standard HTTP or HTTPS port, AWS will not connect to it.
- You can use this [specially sized Home Assistant logo][large-icon] as the Feed Icon
- All other settings are up to you
- Hit "Next"
- Test
- Having passed all validations to reach this screen, you can now click on "< Back to All Skills", as your flash briefing is now available as a "Development" service.
- To invoke your flash briefing, open the Alexa app on your phone or go to the [Alexa Settings Site][alexa-settings-site], open the "Skills" configuration section, select "Your Skills", scroll to the bottom, tap on the Flash Briefing Skill you just created, enable it, then manage Flash Briefing and adjust ordering as necessary. Finally, ask your Echo for your "news", "flash briefing", or "briefing".
## {% linkable_title Smart Home %}
While the Skills API described above allows for arbitrary intents, all
utterances must begin with "Alexa, tell $invocation_name ..."
The [Emulated Hue component][emulated-hue-component] provides a simpler
interface such as, "Alexa, turn on the kitchen light". However it has some
limitations since everything looks like a light bulb.
Amazon provides a Smart Home API for richer home automation control. It takes
considerable effort to configure. The easy solution is to use
[Home Assistant Cloud](/components/cloud/).
If you don't want to use Home Assistant Cloud and are willing to do the
integration work yourself, Home Assistant can expose an HTTP API which makes
the integration work easier. Example configuration:
```yaml
alexa:
  smart_home:
    endpoint: https://api.amazonalexa.com/v3/events
    client_id: !secret alexa_client_id
    client_secret: !secret alexa_client_secret
    filter:
      include_entities:
        - light.kitchen
        - light.kitchen_left
      include_domains:
        - switch
      exclude_entities:
        - switch.outside
    entity_config:
      light.kitchen:
        name: Custom Name for Alexa
        description: The light in the kitchen
      switch.stairs:
        display_categories: LIGHT
```
This exposes an HTTP POST endpoint at `http://your_hass_ip/api/alexa/smart_home`
which accepts and returns messages conforming to the
[Smart Home v3 payload](https://developer.amazon.com/docs/smarthome/smart-home-skill-api-message-reference.html).
You must then create an Amazon developer account with an Alexa skill and Lambda
function to integrate this endpoint. See
[Haaska](https://github.com/mike-grant/haaska) for an example.
The `endpoint`, `client_id` and `client_secret` are optional, and are only required if you want to enable Alexa's proactive mode. Please note the following if you want to enable proactive mode:
- There are different endpoint urls, depending on the region of your skill. Please check the available endpoints at <https://developer.amazon.com/docs/smarthome/send-events-to-the-alexa-event-gateway.html#endpoints>
- The `client_id` and `client_secret` are not the ones used by the skill that have been set up using "Login with Amazon" (in the Alexa Developer Console: Build > Account Linking), but rather from the "Alexa Skill Messaging" (in the Alexa Developer Console: Build > Permissions > Alexa Skill Messaging). To get them, you need to enable the "Send Alexa Events" permission.
- If the "Send Alexa Events" permission was not enabled previously, you need to unlink and relink the skill using the Alexa App, or else Home Assistant will show the following error: "Token invalid and no refresh token available."
- Amazon Developer Account. You can sign up [here][amazon-dev-console].
- Building custom commands and Flash Briefing requires your Home Assistant instance to be accessible from the Internet with HTTPS on port 443.
- An [AWS account](https://aws.amazon.com/free/) is needed if you want to use the Smart Home Skill API. A part of your Smart Home Skill will be hosted on [AWS Lambda](https://aws.amazon.com/lambda/pricing/). However, you don't need to worry about costs: AWS Lambda is free for up to 1 million requests and 1 GB of outbound data transfer per month.
- The Smart Home API also needs your Home Assistant instance to be accessible from the Internet.
[amazon-dev-console]: https://developer.amazon.com
[flash-briefing-api]: https://developer.amazon.com/alexa-skills-kit/flash-briefing
[flash-briefing-api-docs]: https://developer.amazon.com/public/solutions/alexa/alexa-skills-kit/docs/flash-briefing-skill-api-feed-reference
[large-icon]: /images/components/alexa/alexa-512x512.png
[small-icon]: /images/components/alexa/alexa-108x108.png
[templates]: /topics/templating/
[zero-three-one]: /blog/2016/10/22/flash-briefing-updater-hacktoberfest/
[alexa-settings-site]: http://alexa.amazon.com/
[emulated-hue-component]: /components/emulated_hue/

View File

@ -0,0 +1,74 @@
---
layout: page
title: "Amazon Alexa Smart Home Skill"
description: "Instructions on how to build Smart Home skill to connect Amazon Alexa with Home Assistant."
date: 2019-03-14 00:00
sidebar: true
comments: false
sharing: true
footer: true
logo: amazon-alexa.png
ha_category: Voice
featured: false
ha_release: "0.54"
---
## {% linkable_title Automatic setup via Home Assistant Cloud %}
With [Home Assistant Cloud](/cloud/), you can connect your Home Assistant instance in a few simple clicks to Amazon Alexa. With Home Assistant Cloud you don't have to deal with dynamic DNS, SSL certificates or opening ports on your router. Just log in via the user interface and a secure connection with the cloud will be established. Home Assistant Cloud requires a paid subscription after a 30-day free trial.
For Home Assistant Cloud Users, documentation can be found [here](https://www.nabucasa.com/config/amazon_alexa/).
## {% linkable_title Amazon Alexa Smart Home %}
While the Skills API described above allows for arbitrary intents, all
utterances must begin with "Alexa, tell $invocation_name ..."
The [Emulated Hue component][emulated-hue-component] provides a simpler
interface such as, "Alexa, turn on the kitchen light". However, it has some
limitations since everything looks like a light bulb.
Amazon provides a Smart Home API for richer home automation control. It takes
considerable effort to configure. The easy solution is to use
[Home Assistant Cloud](/components/cloud/).
If you don't want to use Home Assistant Cloud and are willing to do the
integration work yourself, Home Assistant can expose an HTTP API which makes
the integration work easier. Example configuration:
```yaml
alexa:
  smart_home:
    endpoint: https://api.amazonalexa.com/v3/events
    client_id: !secret alexa_client_id
    client_secret: !secret alexa_client_secret
    filter:
      include_entities:
        - light.kitchen
        - light.kitchen_left
      include_domains:
        - switch
      exclude_entities:
        - switch.outside
    entity_config:
      light.kitchen:
        name: Custom Name for Alexa
        description: The light in the kitchen
      switch.stairs:
        display_categories: LIGHT
```
This exposes an HTTP POST endpoint at `http://your_hass_ip/api/alexa/smart_home`
which accepts and returns messages conforming to the
[Smart Home v3 payload](https://developer.amazon.com/docs/smarthome/smart-home-skill-api-message-reference.html).
You must then create an Amazon developer account with an Alexa skill and Lambda function to integrate this endpoint.
[Haaska](https://github.com/mike-grant/haaska/wiki) provides a step-by-step guide and necessary assets to help you create the Alexa skill and AWS Lambda.
The `endpoint`, `client_id` and `client_secret` are optional, and are only required if you want to enable Alexa's proactive mode. Please note the following if you want to enable proactive mode:
- There are different endpoint URLs, depending on the region of your skill. Please check the available endpoints at <https://developer.amazon.com/docs/smarthome/send-events-to-the-alexa-event-gateway.html#endpoints>
- The `client_id` and `client_secret` are not the ones used by the skill that have been set up using "Login with Amazon" (in the [Alexa Developer Console][amazon-dev-console]: Build > Account Linking), but rather from the "Alexa Skill Messaging" (in the Alexa Developer Console: Build > Permissions > Alexa Skill Messaging). To get them, you need to enable the "Send Alexa Events" permission.
- If the "Send Alexa Events" permission was not enabled previously, you need to unlink and relink the skill using the Alexa App, or else Home Assistant will show the following error: "Token invalid and no refresh token available."
[amazon-dev-console]: https://developer.amazon.com
[emulated-hue-component]: /components/emulated_hue/

View File

@ -11,6 +11,8 @@ logo: alpha_vantage.png
ha_category: Finance
ha_iot_class: Cloud Polling
ha_release: "0.60"
redirect_from:
- /components/sensor.alpha_vantage/
---
The `alpha_vantage` sensor platform uses [Alpha Vantage](https://www.alphavantage.co) to monitor the stock market. This platform also provides detail about exchange rates.

View File

@ -10,6 +10,8 @@ footer: true
logo: polly.png
ha_category: Text-to-speech
ha_release: 0.37
redirect_from:
- /components/tts.amazon_polly/
---
The `amazon_polly` text-to-speech platform works with [Amazon Polly](https://aws.amazon.com/polly/) to create the spoken output.

View File

@ -11,6 +11,8 @@ logo: anel.png
ha_category: Switch
ha_iot_class: Local Polling
ha_release: "0.30"
redirect_from:
- /components/switch.anel_pwrctrl/
---
The `anel_pwrctrl` switch platform allows you to control [ANEL PwrCtrl](http://anel-elektronik.de/SITE/produkte/produkte.htm) devices.

View File

@ -11,6 +11,8 @@ logo: anthemav.png
ha_category: Media Player
ha_iot_class: Local Push
ha_release: 0.37
redirect_from:
- /components/media_player.anthemav/
---
Both [Anthem]'s current and last generation of A/V Receivers and Processors support IP-based, network control. This Home Assistant platform adds proper "local push" support for any of these receivers on your network.

View File

@ -10,6 +10,8 @@ footer: true
logo: apple.png
ha_category: Notifications
ha_release: 0.31
redirect_from:
- /components/notify.apns/
---
The `apns` platform uses the Apple Push Notification service (APNS) to deliver notifications from Home Assistant.

View File

@ -11,6 +11,8 @@ logo: sharp_aquos.png
ha_category: Media Player
ha_release: 0.35
ha_iot_class: Local Polling
redirect_from:
- /components/media_player.aquostv/
---
The `aquostv` platform allows you to control a [Sharp Aquos TV](http://www.sharp.ca/en-CA/ForHome/HomeEntertainment/LEDTV/QuattronPlus.aspx).

View File

@ -11,6 +11,8 @@ logo: arest.png
ha_category: DIY
ha_iot_class: Local Polling
ha_release: 0.9
redirect_from:
- /components/binary_sensor.arest/
---
The `arest` binary sensor platform allows you to get all data from your devices (like Arduinos with an ethernet/wifi connection, the ESP8266, and the Raspberry Pi) running the [aREST](http://arest.io/) RESTful framework.

View File

@ -10,6 +10,8 @@ footer: true
logo: aruba.png
ha_category: Presence Detection
ha_release: 0.7
redirect_from:
- /components/device_tracker.aruba/
---

View File

@ -10,6 +10,8 @@ footer: true
ha_category: Sensor
ha_release: 0.31
ha_iot_class: Local Polling
redirect_from:
- /components/sensor.arwn/
---
The `arwn` sensor platform is a client for the [Ambient Radio Weather Network](http://github.com/sdague/arwn) project. This collects weather station data and makes it available in an MQTT subtree.

View File

@ -10,6 +10,8 @@ footer: true
logo: asterisk.png
ha_category: Mailbox
ha_release: 0.79
redirect_from:
- /components/mailbox.asterisk_cdr/
---
The Asterisk Call Data Recorder provides access to Asterisk call logs on the Asterisk PBX server. This mailbox is enabled automatically through the [Asterisk Voicemail component](/components/asterisk_mbox/) configuration if the `asterisk_mbox_server` is configured to provide CDR data. More information on configuring the server can be found in the [Asterisk PBX configuration guide](/docs/asterisk_mbox/).

View File

@ -10,6 +10,8 @@ footer: true
ha_category: Environment
ha_release: 0.39
logo: noaa.png
redirect_from:
- /components/binary_sensor.aurora/
---
The `aurora` platform uses the [NOAA aurora forecast](http://www.swpc.noaa.gov/products/aurora-30-minute-forecast) service to let you know if an aurora might be visible at your home location in the next 30 minutes, based off of current solar flare activity.

View File

@ -11,6 +11,8 @@ logo: automatic.png
ha_category: Car
ha_release: 0.28
ha_iot_class: Cloud Push
redirect_from:
- /components/device_tracker.automatic/
---

View File

@ -11,6 +11,8 @@ ha_category: Light
ha_iot_class: Assumed State
logo: avi-on.png
ha_release: 0.37
redirect_from:
- /components/light.avion/
---
Support for the Avi-on Bluetooth dimmer switch [Avi-On](http://avi-on.com/).

View File

@ -11,6 +11,8 @@ logo: awair.jpg
ha_category: Health
ha_release: 0.84
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.awair/
---
The `awair` sensor platform will fetch data from your [Awair device(s)](https://getawair.com).

View File

@ -10,6 +10,8 @@ footer: true
logo: aws_lambda.png
ha_category: Notifications
ha_release: "0.20"
redirect_from:
- /components/notify.aws_lambda/
---
The `aws_lambda` notification platform enables invoking [AWS Lambda](https://aws.amazon.com/lambda/) functions.

View File

@ -10,6 +10,8 @@ footer: true
logo: aws_sns.png
ha_category: Notifications
ha_release: "0.20"
redirect_from:
- /components/notify.aws_sns/
---
The `aws_sns` notification platform enables publishing to an [AWS SNS](https://aws.amazon.com/sns/) topic or application.

View File

@ -10,6 +10,8 @@ footer: true
logo: aws_sqs.png
ha_category: Notifications
ha_release: "0.20"
redirect_from:
- /components/notify.aws_sqs/
---
The `aws_sqs` notification platform enables publishing to an [AWS SQS](https://aws.amazon.com/sqs/) message queue.

View File

@ -10,6 +10,8 @@ footer: true
logo: baiducloud.png
ha_category: Text-to-speech
ha_release: 0.59
redirect_from:
- /components/tts.baidu/
---
The `baidu` text-to-speech platform uses [Baidu TTS engine](https://cloud.baidu.com/product/speech/tts) to read a text with natural sounding voices.

View File

@ -12,6 +12,8 @@ ha_category: Utility
ha_iot_class: Local Polling
ha_release: 0.53
ha_qa_scale: internal
redirect_from:
- /components/binary_sensor.bayesian/
---
The `bayesian` binary sensor platform observes the state from multiple sensors and uses [Bayes' rule](https://en.wikipedia.org/wiki/Bayes%27_theorem) to estimate the probability that an event has occurred given the state of the observed sensors. If the estimated posterior probability is above the `probability_threshold`, the sensor is `on` otherwise it is `off`.

View File

@ -11,6 +11,8 @@ logo: bbox.png
ha_category: Network
ha_release: 0.31
ha_iot_class: Local Push
redirect_from:
- /components/sensor.bbox/
---
The `bbox` platform uses the [Bbox Modem Router](https://fr.wikipedia.org/wiki/Bbox/) from the French Internet provider Bouygues Telecom. Sensors are mainly bandwidth measures.

View File

@ -11,6 +11,8 @@ logo: raspberry-pi.png
ha_category: DIY
ha_release: 0.48
ha_iot_class: Local Push
redirect_from:
- /components/sensor.bh1750/
---
The `bh1750` sensor platform allows you to read the ambient light level in Lux from a [BH1750FVI sensor](http://cpre.kmutnb.ac.th/esl/learning/bh1750-light-sensor/bh1750fvi-e_datasheet.pdf) connected via [I2c](https://en.wikipedia.org/wiki/I²C) bus (SDA, SCL pins). It allows you to use all the resolution modes of the sensor described in its datasheet.

View File

@ -1,60 +0,0 @@
---
layout: page
title: "PiFace Digital I/O Binary Sensor"
description: "Instructions on how to integrate the PiFace Digital I/O module into Home Assistant as a binary sensor."
date: 2016-05-08 15:00
sidebar: true
comments: false
sharing: true
footer: true
logo: raspberry-pi.png
ha_category: DIY
ha_release: 0.45
ha_iot_class: Local Push
---
The `rpi_pfio` binary sensor platform allows you to read sensor values of the [PiFace Digital I/O](http://www.piface.org.uk/products/piface_digital/) .
## {% linkable_title Configuration %}
To use your PiFace Digital I/O module in your installation, add the following to your `configuration.yaml` file:
```yaml
# Example configuration.yaml entry
binary_sensor:
  - platform: rpi_pfio
    ports:
      0:
        name: PIR Office
        invert_logic: true
      1:
        name: Doorbell
        settle_time: 50
```
{% configuration %}
ports:
  description: List of used ports.
  required: true
  type: map
  keys:
    num:
      description: The port number.
      required: true
      type: map
      keys:
        name:
          description: The port name.
          required: true
          type: string
        settle_time:
          description: The time in milliseconds for port debouncing.
          required: false
          type: integer
          default: 20
        invert_logic:
          description: If `true`, inverts the output logic to ACTIVE LOW.
          required: false
          type: boolean
          default: "`false` (ACTIVE HIGH)"
{% endconfiguration %}

View File

@ -11,6 +11,8 @@ logo: bitcoin.png
ha_category: Finance
ha_release: pre 0.7
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.bitcoin/
---

View File

@ -11,6 +11,8 @@ logo: monoprice.svg
ha_category: Media Player
ha_release: 0.68
ha_iot_class: Local Polling
redirect_from:
- /components/media_player.blackbird/
---
The `blackbird` platform allows you to control [Monoprice Blackbird Matrix Switch](https://www.monoprice.com/product?p_id=21819) using a serial connection.

View File

@ -11,6 +11,8 @@ logo: blinkstick.png
ha_category: DIY
ha_release: 0.7.5
ha_iot_class: Local Polling
redirect_from:
- /components/light.blinksticklight/
---

View File

@ -11,6 +11,8 @@ logo: raspberry-pi.png
ha_category: DIY
ha_iot_class: Local Push
ha_release: 0.44
redirect_from:
- /components/light.blinkt/
---
The `blinkt` light platform lets you control the [Blinkt!](https://shop.pimoroni.com/products/blinkt) board, featuring eight super-bright RGB LEDs.

View File

@ -11,6 +11,8 @@ logo: blockchain.png
ha_category: Finance
ha_release: 0.47
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.blockchain/
---

View File

@ -11,6 +11,8 @@ logo: bluesound.png
ha_category: Media Player
ha_release: 0.51
ha_iot_class: Local Polling
redirect_from:
- /components/media_player.bluesound/
---
The `bluesound` platform allows you to control your [Bluesound](http://www.bluesound.com/) HiFi wireless speakers and audio components from Home Assistant.

View File

@ -11,6 +11,8 @@ logo: bluetooth.png
ha_category: Presence Detection
ha_iot_class: Local Polling
ha_release: 0.27
redirect_from:
- /components/device_tracker.bluetooth_le_tracker/
---
This tracker discovers new devices on boot and at regular intervals, and tracks Bluetooth low-energy devices periodically based on the `interval_seconds` value. It is not required to pair the devices with each other.

View File

@ -11,6 +11,8 @@ logo: bluetooth.png
ha_category: Presence Detection
ha_iot_class: Local Polling
ha_release: 0.18
redirect_from:
- /components/device_tracker.bluetooth_tracker/
---
This tracker discovers new devices on boot and tracks Bluetooth devices periodically based on `interval_seconds` value. It is not required to pair the devices with each other! Devices discovered are stored with 'bt_' as the prefix for device MAC addresses in `known_devices.yaml`.

View File

@ -11,6 +11,8 @@ logo: raspberry-pi.png
ha_category: DIY
ha_release: 0.48
ha_iot_class: Local Push
redirect_from:
- /components/sensor.bme280/
---
The `bme280` sensor platform allows you to read temperature, humidity and pressure values of a [Bosch BME280 Environmental sensor](https://cdn-shop.adafruit.com/datasheets/BST-BME280_DS001-10.pdf) connected via [I2c](https://en.wikipedia.org/wiki/I²C) bus (SDA, SCL pins). It allows you to use all the operation modes of the sensor described in its datasheet.

View File

@ -11,6 +11,8 @@ logo: raspberry-pi.png
ha_category: DIY
ha_release: 0.62
ha_iot_class: Local Push
redirect_from:
- /components/sensor.bme680/
---
The `bme680` sensor platform allows you to read temperature, humidity, pressure and gas resistance values of a [Bosch BME680 Environmental sensor](https://cdn-shop.adafruit.com/product-files/3660/BME680.pdf) connected via an [I2C](https://en.wikipedia.org/wiki/I²C) bus (SDA, SCL pins). It allows you to use all the operation modes of the sensor described in its datasheet. In addition, it includes a basic air quality calculation that uses gas resistance and humidity measurements to calculate a percentage based air quality measurement.

View File

@ -11,6 +11,8 @@ logo: bom.png
ha_category: Weather
ha_release: 0.36
ha_iot_class: Cloud Polling
redirect_from:
- /components/weather.bom/
---
The `bom` weather platform uses the [Australian Bureau of Meteorology (BOM)](http://www.bom.gov.au) as a source for current (half-hourly) meteorological data.

View File

@ -11,6 +11,8 @@ logo: bravia.png
ha_category: Media Player
ha_release: 0.23
ha_iot_class: Local Polling
redirect_from:
- /components/media_player.braviatv/
---
The `braviatv` platform allows you to control a [Sony Bravia TV](http://www.sony.com).

View File

@ -10,6 +10,8 @@ footer: true
logo: telegram.png
ha_category: Notifications
ha_release: 0.48
redirect_from:
- /components/telegram_bot.broadcast/
---
Telegram implementation to support **sending messages only**. Your Home Assistant instance does not have to be exposed to the Internet and there is no polling to receive messages sent to the bot.

View File

@ -11,6 +11,8 @@ logo: broadlink.png
ha_category: Switch
ha_release: 0.35
ha_iot_class: Local Polling
redirect_from:
- /components/switch.broadlink/
---
This `Broadlink` switch platform allows you to control Broadlink [devices](http://www.ibroadlink.com/).

View File

@ -11,6 +11,8 @@ ha_category: Social
logo: brottsplatskartan.png
ha_release: 0.85
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.brottsplatskartan/
---
The `brottsplatskartan` sensor allows one to track reported incidents occurring in a given area. Incidents include anything reported to [Brottsplatskartan](https://brottsplatskartan.se). The sensor only counts incidents from the current day.

View File

@ -11,6 +11,8 @@ logo: brunt.png
ha_category: Cover
ha_release: 0.75
ha_iot_class: Cloud Polling
redirect_from:
- /components/cover.brunt/
---
The `brunt` platform allows one to control Blind Engines by [Brunt](https://www.brunt.co). To use this sensor, you need a Brunt App Account. All Brunt Blind devices registered to your account are automatically added to your Home Assistant with the names given them through the Brunt app.

View File

@ -10,6 +10,8 @@ footer: true
logo: bt.png
ha_category: Presence Detection
ha_release: 0.22
redirect_from:
- /components/device_tracker.bt_home_hub_5/
---
This platform offers presence detection by looking at devices connected to a [BT Home Hub 5](https://en.wikipedia.org/wiki/BT_Home_Hub) based router.

View File

@ -11,6 +11,8 @@ logo: bt.png
ha_category: Presence Detection
ha_release: 0.82
ha_iot_class: Local Polling
redirect_from:
- /components/device_tracker.bt_smarthub/
---
This platform offers presence detection by looking at devices connected to a [BT Smart Hub](https://en.wikipedia.org/wiki/BT_Smart_Hub) based router.

View File

@ -11,6 +11,8 @@ logo: buienradar.png
ha_category: Weather
ha_release: 0.47
ha_iot_class: Cloud Polling
redirect_from:
- /components/weather.buienradar/
---
The `buienradar` platform uses [buienradar.nl](http://buienradar.nl/) as a source for current meteorological data for your location. The weather forecast is delivered by Buienradar, who provides a web service with detailed weather information for users in The Netherlands.

View File

@ -10,6 +10,8 @@ footer: true
ha_category: Calendar
ha_iot_class: Cloud Polling
ha_release: "0.60"
redirect_from:
- /components/calendar.caldav/
---
The `caldav` platform allows you to connect to your WebDav calendar and generate

View File

@ -11,6 +11,8 @@ logo: home-assistant.png
ha_category: Network
ha_release: 0.44
ha_iot_class: Configurable
redirect_from:
- /components/sensor.cert_expiry/
---
The `cert_expiry` sensor fetches information from a configured URL and displays the certificate expiry in days.

View File

@ -11,6 +11,8 @@ logo: channels.png
ha_category: Media Player
ha_release: 0.65
ha_iot_class: Local Polling
redirect_from:
- /components/media_player.channels/
---

View File

@ -10,6 +10,8 @@ footer: true
logo: cisco.png
ha_category: Presence Detection
ha_release: 0.33
redirect_from:
- /components/device_tracker.cisco_ios/
---
This is a presence detection scanner for [Cisco](http://www.cisco.com) IOS devices.

View File

@ -0,0 +1,53 @@
---
layout: page
title: "Cisco Spark"
description: "Instructions on how to add CiscoSpark notifications to Home Assistant."
date: 2017-02-20 15:00
sidebar: true
comments: false
sharing: true
footer: true
logo: ciscospark.png
ha_category: Notifications
ha_release: "0.40"
redirect_from:
- /components/notify.ciscospark/
---
The `ciscospark` notification platform allows you to deliver notifications from Home Assistant to [Cisco Spark](https://ciscospark.com/).
To use this notification platform, you need a developer token. To obtain a token, visit [Spark for Developers](https://developer.ciscospark.com/index.html).
At this time you also need to specify the Cisco Spark `roomid`. The `roomid` can also be found at [Spark for Developers](https://developer.ciscospark.com/index.html); just look in the documentation under Rooms.
In order to get notified of all new messages in the room, you will need to create a bot. Messages will then be posted by the bot and marked as new for you, which triggers the notification. If you use your own personal token, the messages are added to the room but no notification is triggered.
Once you have created the bot through the new App menu, you will also need to add the bot to the room you are a member of. Then use the bot access token in your configuration below.
To enable the Cisco Spark notification in your installation, add the following to your `configuration.yaml` file:
```yaml
# Example configuration.yaml entry
notify:
  - name: NOTIFIER_NAME
    platform: ciscospark
    token: YOUR_DEVELOPER_TOKEN
    roomid: CISCO_SPARK_ROOMID
```
{% configuration %}
name:
  description: Setting the optional parameter `name` allows multiple notifiers to be created. The notifier will bind to the service `notify.NOTIFIER_NAME`.
  required: false
  default: notify
  type: string
token:
  description: Your development token.
  required: true
  type: string
roomid:
  description: The Room ID.
  required: true
  type: string
{% endconfiguration %}
To use notifications, please see the [getting started with automation page](/getting-started/automation/).
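As a small sketch (the notifier name matches the example above and the message text is illustrative), an automation action calling this notifier could look like:

```yaml
action:
  - service: notify.NOTIFIER_NAME
    data:
      message: "Front door was opened"
```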

View File

@ -10,6 +10,8 @@ footer: true
logo: citybikes.png
ha_category: Transport
ha_release: 0.49
redirect_from:
- /components/sensor.citybikes/
---

View File

@ -11,6 +11,8 @@ logo: clementine.png
ha_category: Media Player
ha_release: 0.39
ha_iot_class: Local Polling
redirect_from:
- /components/media_player.clementine/
---
The `clementine` platform allows you to control a [Clementine Music Player](https://www.clementine-player.org).

View File

@ -10,6 +10,8 @@ footer: true
logo: clickatell.png
ha_category: Notifications
ha_release: 0.56
redirect_from:
- /components/notify.clickatell/
---
The `clickatell` platform uses [Clickatell](https://clickatell.com) to deliver SMS notifications from Home Assistant.

View File

@ -10,6 +10,8 @@ footer: true
logo: clicksend.png
ha_category: Notifications
ha_release: 0.48
redirect_from:
- /components/notify.clicksend/
---

View File

@ -13,6 +13,8 @@ ha_category:
ha_release: 0.55
redirect_from:
- /components/notify.clicksendaudio/
redirect_from:
- /components/notify.clicksend_tts/
---
The `clicksend_tts` platform uses [ClickSend](https://clicksend.com) to deliver text-to-speech (TTS) notifications from Home Assistant.

View File

@ -11,6 +11,8 @@ footer: true
ha_category: Media Player
ha_iot_class: Local Polling
ha_release: 0.23
redirect_from:
- /components/media_player.cmus/
---

View File

@ -11,6 +11,8 @@ logo: co2signal.png
ha_category: Environment
ha_release: 0.87
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.co2signal/
---
The `co2signal` sensor platform queries the [CO2Signal](https://www.co2signal.com/) API for the CO2 intensity of a specific region. Data can be collected via latitude/longitude or by country code. This API uses the same data as https://www.electricitymap.org/. Not all countries/regions in the world are supported so please consult this website to check local availability.

View File

@ -11,6 +11,8 @@ logo: coinmarketcap.png
ha_category: Finance
ha_release: 0.28
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.coinmarketcap/
---

View File

@ -11,6 +11,8 @@ logo: comed.png
ha_category: Energy
ha_release: "0.40"
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.comed_hourly_pricing/
---
The ComEd Hourly Pricing program is an optional program, available to ComEd electric subscribers, that charges customers a variable rate for electricity supply based on current demand rather than a traditional fixed rate. Live prices are published [here](https://hourlypricing.comed.com/live-prices/) and also via an [API](https://hourlypricing.comed.com/hp-api/), which we can integrate as a sensor in Home Assistant.

View File

@ -11,6 +11,8 @@ logo: command_line.png
ha_category: Utility
ha_release: 0.12
ha_iot_class: Local Polling
redirect_from:
- /components/binary_sensor.command_line/
---
The `command_line` binary sensor platform issues specific commands to get data.

View File

@ -10,6 +10,8 @@ footer: true
logo: interlogix.png
ha_category: Binary Sensor
ha_release: 0.31
redirect_from:
- /components/binary_sensor.concord232/
---
The `concord232` platform provides integration with GE, Interlogix (and other brands) alarm panels that support the RS-232 Automation Control Panel interface module (or have it built in). Supported panels include Concord 4.

View File

@ -11,6 +11,8 @@ logo: coolautomation.png
ha_category: Climate
ha_release: 0.88
ha_iot_class: Local Polling
redirect_from:
- /components/climate.coolmaster/
---

View File

@ -11,6 +11,8 @@ logo: cpu.png
ha_category: System Monitor
ha_release: pre 0.7
ha_iot_class: Local Push
redirect_from:
- /components/sensor.cpuspeed/
---

View File

@ -11,6 +11,8 @@ ha_category: Social
logo: crimereports.png
ha_release: 0.42
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.crimereports/
---
The `crimereports` sensor allows one to track reported incidents occurring in a given area. Incidents include anything reported to [Crime Reports](https://www.crimereports.com). Your regional emergency services may or may not report data. The sensor only counts incidents from the current day.

View File

@ -11,6 +11,8 @@ logo: cups.png
ha_category: System Monitor
ha_iot_class: Local Polling
ha_release: 0.32
redirect_from:
- /components/sensor.cups/
---

View File

@ -11,6 +11,8 @@ ha_category: Finance
logo: currencylayer.png
ha_iot_class: Cloud Polling
ha_release: 0.32
redirect_from:
- /components/sensor.currencylayer/
---

View File

@ -12,6 +12,8 @@ ha_category: Weather
ha_release: "0.30"
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.forecast/
- /components/sensor.darksky/
---
The `darksky` platform uses the [Dark Sky](https://darksky.net/) web service as a source for meteorological data for your location. The location is based on the `longitude` and `latitude` coordinates configured in your `configuration.yaml` file. The coordinates are auto-detected, but to take advantage of the hyper-local weather reported by Dark Sky, you can refine them down to your exact home address. GPS coordinates can be found by using [Google Maps](https://www.google.com/maps) or [OpenStreetMap](http://www.openstreetmap.org/) and clicking on your home.

View File

@ -10,6 +10,8 @@ footer: true
logo: ddwrt.png
ha_category: Presence Detection
ha_release: pre 0.7
redirect_from:
- /components/device_tracker.ddwrt/
---
This platform offers presence detection by looking at devices connected to a [DD-WRT](http://www.dd-wrt.com/site/index) based router.

View File

@ -11,6 +11,8 @@ ha_category: Light
ha_iot_class: Local Polling
logo: leviton.png
ha_release: 0.37
redirect_from:
- /components/light.decora/
---
Support for the Decora Bluetooth dimmer switch from [Leviton](https://www.leviton.com/en/products/residential/dimmers/automation-smart-home/decora-digital-with-bluetooth-dimmers#t=Products&sort=%40wcs_site_tree_rank%20ascending&layout=card).

View File

@ -11,6 +11,8 @@ ha_category: Light
ha_iot_class: Cloud Polling
logo: leviton.png
ha_release: 0.51
redirect_from:
- /components/light.decora_wifi/
---
Support for [Leviton Decora Wi-Fi](http://www.leviton.com/en/products/lighting-controls/decora-smart-with-wifi) dimmers/switches via the MyLeviton API.

View File

@ -11,6 +11,8 @@ logo: deluge.png
ha_category: Downloading
ha_release: 0.57
ha_iot_class: Local Polling
redirect_from:
- /components/switch.deluge/
---
The `deluge` switch platform allows you to control your [Deluge](http://deluge-torrent.org/) client from within Home Assistant. The platform enables you to pause all your torrents, and then unpause them all.

View File

@ -11,6 +11,8 @@ logo: denon.png
ha_category: Media Player
ha_iot_class: Local Polling
ha_release: 0.7.2
redirect_from:
- /components/media_player.denon/
---
The `denon` platform allows you to control a [Denon Network Receiver](http://www.denon.co.uk/chg/product/compactsystems/networkmusicsystems/ceolpiccolo) from Home Assistant. It might be that your device is supported by the [Denon AVR] platform.

View File

@ -11,6 +11,8 @@ logo: denon.png
ha_category: Media Player
ha_iot_class: Local Polling
ha_release: 0.7.2
redirect_from:
- /components/media_player.denonavr/
---
The `denonavr` platform allows you to control a [Denon Network Receiver](http://www.denon.co.uk/chg/product/compactsystems/networkmusicsystems/ceolpiccolo) from Home Assistant. It might be that your device is supported by the [Denon] platform.

View File

@ -11,6 +11,8 @@ ha_category: Transport
logo: db.png
ha_iot_class: Cloud Polling
ha_release: 0.14
redirect_from:
- /components/sensor.deutsche_bahn/
---

View File

@ -11,6 +11,8 @@ ha_category: DIY
ha_release: 0.7
logo: dht.png
ha_iot_class: Local Polling
redirect_from:
- /components/sensor.dht/
---

View File

@ -11,6 +11,8 @@ logo: digitalloggers.png
ha_category: Switch
ha_release: 0.35
ha_iot_class: Local Polling
redirect_from:
- /components/switch.digitalloggers/
---
The `digitalloggers` switch platform allows you to control the state of your [Digital Loggers](http://www.digital-loggers.com/dinfaqs.html) switches.

View File

@ -11,6 +11,8 @@ logo: directv.png
ha_category: Media Player
ha_release: 0.25
ha_iot_class: Local Polling
redirect_from:
- /components/media_player.directv/
---
Master [DirecTV](http://www.directv.com/) receivers (i.e., those that have tuners) will be automatically discovered if you enable the [discovery component](/components/discovery/) and the receiver is powered on. Slave/RVU client/Genie boxes will also be discovered, but only if they are also online at the time of discovery.

View File

@ -11,6 +11,8 @@ ha_category: Multimedia
ha_release: 0.61
logo: discogs.png
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.discogs/
---
The `discogs` platform allows you to see the current number of records in your [Discogs](https://discogs.com) collection.

View File

@ -10,6 +10,8 @@ footer: true
logo: discord.png
ha_category: Notifications
ha_release: 0.37
redirect_from:
- /components/notify.discord/
---
The [Discord service](https://discordapp.com/) is a platform for the notify component. This allows components to send messages to the user using Discord.

View File

@ -11,6 +11,8 @@ ha_category: Camera
logo: camcorder.png
ha_release: "0.40"
ha_iot_class: Configurable
redirect_from:
- /components/camera.dispatcher/
---
<p class='note'>

View File

@ -10,6 +10,8 @@ footer: true
logo: dlib.png
ha_category: Image Processing
ha_release: 0.44
redirect_from:
- /components/image_processing.dlib_face_detect/
---
The `dlib_face_detect` image processing platform allows you to use [Dlib](http://www.dlib.net/) through Home Assistant. This platform enables face detection from cameras and can fire events with attributes.

View File

@ -10,6 +10,8 @@ footer: true
logo: dlib.png
ha_category: Image Processing
ha_release: 0.44
redirect_from:
- /components/image_processing.dlib_face_identify/
---
The `dlib_face_identify` image processing platform allows you to use [Dlib](http://www.dlib.net/) through Home Assistant. This platform allows you to identify persons on camera and fire an event with the identified persons.

View File

@ -11,6 +11,8 @@ logo: dlink.png
ha_category: Switch
ha_iot_class: Local Polling
ha_release: 0.14
redirect_from:
- /components/switch.dlink/
---
The `dlink` switch platform allows you to control the state of your [D-Link Wi-Fi Smart Plugs](http://us.dlink.com/product-category/home-solutions/connected-home/smart-plugs/).

View File

@ -11,6 +11,8 @@ logo: dlna.png
ha_category: Media Player
ha_release: 0.76
ha_iot_class: Local Push
redirect_from:
- /components/media_player.dlna_dmr/
---
The `dlna_dmr` platform allows you to control a [DLNA Digital Media Renderer](https://www.dlna.org/), such as a DLNA-enabled TV or radio.

View File

@ -11,6 +11,8 @@ logo: home-assistant.png
ha_category: Network
ha_iot_class: Cloud Polling
ha_release: "0.40"
redirect_from:
- /components/sensor.dnsip/
---
The `dnsip` sensor will expose an IP address, fetched via DNS resolution, as its value. There are two operational modes:

View File

@ -11,6 +11,8 @@ logo: netbeheernederland.jpg
ha_category: Energy
ha_release: 0.34
ha_iot_class: Local Push
redirect_from:
- /components/sensor.dsmr/
---
A sensor platform for Dutch Smart Meters that comply with DSMR (Dutch Smart Meter Requirements), also known as 'Slimme meter' or 'P1 poort'.

View File

@ -11,6 +11,8 @@ logo: dte_energy.png
ha_category: Energy
ha_release: 0.21
ha_iot_class: Local Polling
redirect_from:
- /components/sensor.dte_energy_bridge/
---
A sensor platform for the [DTE](https://www.dteenergy.com/) Energy Bridge. To find out which version of the DTE Energy Bridge sensor you have, find the status LED on your box.

View File

@ -11,6 +11,8 @@ logo: dublin_bus.png
ha_category: Transport
ha_iot_class: Cloud Polling
ha_release: 0.36
redirect_from:
- /components/sensor.dublin_bus_transport/
---
The `dublin_bus_transport` sensor will give you the time until the next two departures from a Dublin bus stop using the RTPI information.

View File

@ -11,6 +11,8 @@ logo: duke_energy.png
ha_category: Energy
ha_release: 0.74
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.duke_energy/
---
The `duke_energy` sensor platform allows you to get the previous day's usage for all of your Duke Energy smart meters.

View File

@ -11,6 +11,8 @@ logo: dunehd.png
ha_category: Media Player
ha_iot_class: Local Polling
ha_release: 0.34
redirect_from:
- /components/media_player.dunehd/
---

View File

@ -11,6 +11,8 @@ footer: true
ha_category: Weather
ha_release: 0.51
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.dwd_weather_warnings/
---
The `dwd_weather_warnings` sensor platform uses the [Deutscher Wetterdienst (DWD)](https://www.dwd.de) as a source for current and advance warnings.

View File

@ -11,6 +11,8 @@ logo: ebox.png
ha_category: Network
ha_release: 0.39
ha_iot_class: Cloud Polling
redirect_from:
- /components/sensor.ebox/
---
Integrate your [EBox](https://client.ebox.ca/) account information into Home Assistant.

Some files were not shown because too many files have changed in this diff.